Similar articles
20 similar articles found
1.
This paper introduces a citation-based "systems approach" for analyzing the various institutional and cognitive dimensions of scientific excellence within national research systems. The methodology, covering several aggregate levels, focuses on the most highly cited research papers in the international journal literature. The distribution of these papers across institutions and disciplines enables objective comparisons of their (possible) international-level scientific excellence. By way of example, we present key results from a recent series of analyses of the research system in the Netherlands in the mid-1990s, focusing on the performance of the universities across the major scientific disciplines within the context of the entire system's scientific performance. Special attention is paid to the contribution to the world's top 1% and top 10% most highly cited research papers. The findings indicate that these high-performance papers provide a useful analytical framework - in terms of transparency, cognitive and institutional differentiation, and scope for domestic and international comparisons - providing new indicators for identifying "world class" scientific excellence at the aggregate level. The average citation scores of these academic "Centres of Scientific Excellence" appear to be an inadequate predictor of their production of highly cited papers. However, further critical reflection and in-depth validation studies are needed to establish the true potential of this approach for science policy analyses and evaluation of research performance. This revised version was published online in August 2006 with corrections to the Cover Date.
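The "top 1% / top 10% most highly cited" framing above can be made concrete with a small sketch. The percentile rule and the toy citation counts below are illustrative assumptions, not the authors' exact procedure.

```python
# Hedged sketch: count how many of an institution's papers clear the world
# top-N% citation bar. Toy data; the real analysis uses field-normalised sets.

def top_share(world_citations, inst_citations, top_fraction=0.10):
    """Fraction of an institution's papers that reach the world top-N% threshold."""
    ranked = sorted(world_citations, reverse=True)
    cutoff_index = max(1, int(len(ranked) * top_fraction))
    threshold = ranked[cutoff_index - 1]  # citations needed to enter the top slice
    hits = sum(1 for c in inst_citations if c >= threshold)
    return hits / len(inst_citations)

world = [1, 2, 2, 3, 4, 5, 6, 8, 12, 40]  # citation counts of a world paper set
inst = [12, 40, 3, 2]                     # one institution's papers
print(top_share(world, inst))             # -> 0.25 (only the 40-citation paper qualifies)
print(top_share(world, inst, 0.20))       # -> 0.5
```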

2.
Gangan Prathap 《Scientometrics》2017,110(3):1085-1097
In this paper we propose a three-dimensional framework for assessing how Indian universities and research-focused institutions fare in the world of high-end research, in terms of the excellence and diversity of their research base. At the country level, scholarly performance is broken down into three components: size, excellence, and balance (or evenness). We use a publicly available web application that visualizes scientific excellence worldwide in several subject areas. India has a presence in fifteen of the twenty-two subject areas in which at least 50 institutes globally have published more than 500 papers. It has no institution at this level of size and excellence in seven areas: Arts and Humanities; Business, Management and Accounting; Health Professions; Neuroscience; Nursing; Psychology; and Social Sciences. India's research base is heavily skewed towards the Physical Sciences and Engineering, with very little in the Biological Sciences and Medicine and virtually none in the Social Sciences or Arts and Humanities when excellence at the highest level is considered. Its performance is also benchmarked against three nations of similar size in terms of GDP and scientific output: Australia, the Netherlands and Taiwan. Although India has the highest GDP among the four countries, its performance lags considerably behind, and even in terms of diversity it is poor compared to the three comparator countries.
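The "balance or evenness" component above can be quantified in several ways. The abstract does not state the exact measure, so the sketch below uses Pielou's evenness (Shannon entropy normalised by its maximum) as one plausible choice, not necessarily the paper's formula.

```python
import math

# Hedged sketch: Pielou evenness of a country's papers across subject areas.
# 1.0 means a perfectly balanced research base; values near 0 mean the output
# is concentrated in a few areas (as the abstract describes for India).

def evenness(papers_per_area):
    total = sum(papers_per_area)
    shares = [n / total for n in papers_per_area if n > 0]
    if len(shares) <= 1:
        return 0.0
    entropy = -sum(p * math.log(p) for p in shares)
    return entropy / math.log(len(shares))

print(evenness([100, 100, 100, 100]))  # uniform spread -> 1.0
print(evenness([370, 20, 8, 2]))       # heavily skewed -> well below 0.3
```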

3.
This study of multinational publications (publications involving authors from more than one country) focuses on a viable method of fractionation that can be used in on-line bibliometric research. Fractionation means that the credit for co-authored papers is added only partially to the publication totals of countries or authors. We attempted to find an empirical relation between the share of a country's papers in a field that is multinationally co-authored and the degree of fractionation that results; a linear regression analysis yielded a significant correlation of –0.95. The fractionation method is the first that can be applied to publication data collected on-line. A comparison is made with fractionation by first-author (i.e., first-address) counting. Application of the method to British scientific output for 1984–1989 suggests that British output was stable. The fractionation method can be applied both to the natural and life sciences and to the social and behavioral sciences; findings suggest that similar processes of multinational publication are prevalent in both types of science. Implications of the model are discussed.

4.
This paper describes a new method for evaluating the scientific output of laboratories engaged in diverse fields of research. The method helps to evaluate outputs that are quite recent and therefore not yet amenable to citation analysis. For the analysis, the impact factors of the journals in which papers are published are considered. A method for normalising journal impact factors is described, and normalised impact factors are also used in the analysis; the normalised impact factor tends to give better results than the simple impact factor. The analysis generates numerous performance indicators: the average impact factor and normalised impact factor for each laboratory and for the research complex (such as CSIR) as a whole; the average impact factor and normalised impact factor for each scientist of a laboratory and of the research complex; and the spectral distribution of papers falling within various ranges of impact factor and normalised impact factor. By comparing performance over several years, the trend of research activity of each laboratory can also be obtained. Paper presented at the International Conference on Science Indicators for Developing Countries, Paris, 15–19 October, 1990.
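The abstract describes normalising journal impact factors before aggregating per laboratory, but not the exact formula. One common choice, assumed in the sketch below, is to divide each journal's IF by the mean IF of its field, so that laboratories publishing in fields with different citation habits become comparable.

```python
# Hedged sketch of field-normalised impact factors. The normalisation rule
# and the toy data are assumptions for illustration, not the paper's method.

def normalised_if(journal_if, field_ifs):
    """Journal IF rescaled by the mean IF of its field (1.0 = field average)."""
    return journal_if / (sum(field_ifs) / len(field_ifs))

def lab_scores(papers):
    """Average raw and normalised IF over a laboratory's papers.

    `papers` is a list of (journal_if, field_ifs) pairs -- toy data below."""
    raw = sum(jif for jif, _ in papers) / len(papers)
    norm = sum(normalised_if(jif, field) for jif, field in papers) / len(papers)
    return raw, norm

# A physics paper (high-IF field) and a maths paper (low-IF field): the raw
# average favours physics, while both papers are exactly field-average (1.0).
papers = [(3.0, [1.0, 3.0, 5.0]), (0.9, [0.3, 0.9, 1.5])]
print(lab_scores(papers))
```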

5.
Summary. For all rankings of countries' research output based on the number of publications or citations relative to population, GDP, R&D and public R&D expenditure, or other national characteristics, the counting method is decisive. Total counting (full credit to a country when at least one author is from that country) and fractional counting (a country receives a fraction of full credit equal to the fraction of authors from that country) give widely different results. Counting methods must be stated, rankings based on different counting methods cannot be compared, and fractional counting is to be preferred.
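The two counting methods contrasted above can be sketched directly; the toy data (one list of author countries per paper) is an illustrative assumption.

```python
from collections import defaultdict

# Total counting: each listed country gets full credit (1.0) per paper.
# Fractional counting: each paper's credit is split by the share of authors
# from each country, so credits sum to the number of papers.

def count_publications(papers, fractional=False):
    credit = defaultdict(float)
    for authors in papers:
        n = len(authors)
        for country in set(authors):
            if fractional:
                credit[country] += authors.count(country) / n
            else:
                credit[country] += 1.0
    return dict(credit)

papers = [["NL", "NL", "US"], ["US"], ["DK", "NL"]]
print(count_publications(papers))                   # total: NL 2.0, US 2.0, DK 1.0
print(count_publications(papers, fractional=True))  # NL 2/3 + 1/2, US 1/3 + 1, DK 1/2
```

Note how the two methods rank NL and US identically under total counting but differently under fractional counting, which is exactly why the abstract insists the method be stated.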

6.
Basu  Aparna  Aggarwal  Ritu 《Scientometrics》2001,52(3):379-394
In this paper, our objective is to delineate some of the problems that could arise in using research output for performance evaluation. Research performance in terms of the Impact Factor (IF) of papers, say of scientific institutions in a country, can depend critically on co-authored papers in a situation where internationally co-authored papers are known to have significantly higher impact factors than purely indigenous papers. Thus, international collaboration not only increases an institution's overall output of research papers; the contribution of such papers to the average Impact Factor of the institutional output can also be disproportionately high. To quantify this effect, an index of gain in impact through foreign collaboration (GIFCOL) is defined so as to ensure comparability between institutions with differing proportions of collaborative output. A case study of major Indian institutions is undertaken, in which Cluster Analysis is used to distinguish intrinsically high-performance institutions from those that gain disproportionately in perceived output quality as a result of international collaboration. This revised version was published online in June 2006 with corrections to the Cover Date.
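The abstract defines GIFCOL only verbally, so the sketch below is a hypothetical reading: the relative gain in an institution's average Impact Factor attributable to internationally co-authored papers. It is an assumed illustration, not necessarily Basu and Aggarwal's exact index.

```python
# Hedged sketch: relative IF gain from foreign collaboration, defined here
# (as an assumption) as the lift of the overall mean IF over the mean IF of
# purely indigenous papers. 0.0 means collaboration adds no impact gain.

def gain_from_collaboration(indigenous_ifs, collaborative_ifs):
    all_ifs = indigenous_ifs + collaborative_ifs
    mean_all = sum(all_ifs) / len(all_ifs)
    mean_home = sum(indigenous_ifs) / len(indigenous_ifs)
    return (mean_all - mean_home) / mean_home

# An institute whose single collaborative paper lands in a much higher-IF
# journal: the mean IF rises from 1.0 to 1.5, a 50% gain.
print(gain_from_collaboration([1.0, 1.0, 1.0], [3.0]))
```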

7.
This study analyzed journal articles published by authors from the G9 countries (Canada, China, France, Germany, Italy, Japan, Russia, the United Kingdom, and the United States) to identify the distribution of research funding and funding agencies in these countries. A total of 5,856,744 articles published between 2009 and 2014 were collected from the Web of Science database. The results showed that China had the highest proportion of funded papers among its overall scientific output, while Italy had the lowest. China and the United States were the leading sponsors of papers from other countries, with China having a sponsorship surplus with all the other G9 countries, and the United States having a sponsorship surplus with the seven other countries except China. Furthermore, governmental agencies were the major sponsors of funded papers in the G9 countries. The life sciences had the highest proportion of funded papers within the field's own paper output, while the natural sciences accounted for the highest proportion of a country's funded papers overall. Regarding funding agencies, the top three in each G9 country were primarily domestic, and a large proportion of the funding they provided was granted to domestic research projects.

8.
In this paper we analyze the evolution of China's growing importance in international scientific collaboration over the past 15 years. Using co-authored publications indexed in Clarivate Analytics' Web of Science Core Collection, we develop novel weighted and unweighted centrality measures to quantify China's emerging role in the global scientific research network. We analyze the networks formed by international co-authorship in three 5-year periods: 2001–2005, 2006–2010, and 2011–2015. This analysis highlights China's sharp increase in prominence in international scientific collaborations. The analysis of China's co-authored, highly cited papers also illustrates China's rising importance in scientific research and collaboration from a different perspective. The impact of multilaterally co-authored papers on the centrality measures is also analyzed, both theoretically and empirically; the results show that multilateral collaboration is a key factor influencing the centrality of a country, beyond the mere scale of international co-authorship. We further contextualize our work in a discussion of international scientific collaboration as both a key driver of China's economy and of its emerging perception as a first-world innovator and intellectual power. Finally, we suggest directions for further research, including more granular analysis by academic discipline and an alternative investigation based on the fractional counting method.
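The paper's centrality measures are more elaborate than this, but a minimal weighted degree centrality over an international co-authorship network illustrates the idea. Edge weights count co-authored papers between country pairs; the toy edge list is an assumption for illustration.

```python
# Hedged sketch: each country's share of the total edge weight incident to it
# in a co-authorship network. A rising share over successive 5-year windows
# would correspond to the growing prominence the abstract describes.

def weighted_degree_centrality(edges):
    strength = {}
    for a, b, w in edges:
        strength[a] = strength.get(a, 0) + w
        strength[b] = strength.get(b, 0) + w
    total = sum(strength.values())
    return {country: s / total for country, s in strength.items()}

edges = [("CN", "US", 5), ("CN", "DE", 2), ("US", "DE", 3)]
print(weighted_degree_centrality(edges))  # CN carries 7/20 = 0.35 of incident weight
```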

9.
This article evaluates the scientific research competitiveness of world universities in computer science. The data source is the Essential Science Indicators (ESI) database, covering more than 10 years, from 01/01/1996 to 08/31/2006. We establish a hierarchical indicator system with four primary indicators (scientific research production, influence, innovation and development) and six secondary indicators (number of papers, total citations, highly cited papers, hot papers, average citations per paper, and the ratio of highly cited papers to papers), and assign them appropriate weights. On this basis, we obtain rankings of university and country/territory competitiveness in computer science. We hope this paper can contribute to further study of the evaluation of a particular subject or of a whole university.
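A weighted hierarchical indicator system of this kind reduces, at the bottom level, to a weighted sum over normalised indicator values. The six secondary indicators below are from the abstract; the weights and toy values are assumptions, not the authors' calibration.

```python
# Hedged sketch of a weighted composite score. Weights are illustrative and
# sum to 1; real systems would calibrate them and normalise raw indicators.

WEIGHTS = {
    "papers": 0.15, "total_citations": 0.25, "highly_cited": 0.20,
    "hot_papers": 0.10, "citations_per_paper": 0.15, "highly_cited_ratio": 0.15,
}

def composite_score(indicators):
    """Weighted sum of indicator values, each assumed pre-normalised to [0, 1]."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items())

university = {
    "papers": 0.8, "total_citations": 0.6, "highly_cited": 0.5,
    "hot_papers": 0.2, "citations_per_paper": 0.4, "highly_cited_ratio": 0.3,
}
print(round(composite_score(university), 3))  # -> 0.495
```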

10.
An increasing number of researchers have recently shown interest in the relationship between the economic growth of a country and its research output, measured with scientometric indicators. The question is not only of theoretical interest; the answer can also influence specific policies aimed at improving a country's research performance. Our paper focuses on this relationship. We argue that research output is a manifestation of the improvement of human capital in the economy. We examine this relationship for South Africa over the period 1980–2008. Using the autoregressive distributed lag method, we investigate the relationship between GDP and the comparative research performance of the country relative to the rest of the world (the share of South African papers compared to the rest of the world). The relationship is confirmed for individual fields of science (biology and biochemistry, chemistry, materials science, physics, psychiatry and psychology). The results indicate that in South Africa over 1980–2008 the comparative performance of research output can be considered a factor affecting the country's economic growth; they also confirm the results of Vinkler (2008) and Lee et al. (2011). In contrast, economic growth did not influence the research output of the country over the same period. Policy implications are also discussed.

11.
A top journal is defined as a journal within the first 10% of journals ranked by impact factor in the SCI list, within a particular scientific subfield, for the year considered. Journals that were within the first 10% for 11 or more years were considered top journals for the whole period, even if they fell outside the first 10% in some of the years covered by this study. In the period from 1980 to 2000, Croatian scientists affiliated with research institutions within the Republic of Croatia published a total of 13,021 papers in journals covered by the Science Citation Index (SCI). Of these papers, only 2,720 were published in top journals. This amounts to 20.9% of the total, below the world average of 29.5% for the same scientific subfields. Of these 2,720 publications, 1,250 (46.0%) were published in international collaboration, and 335 (12.3%) were Meeting Abstracts. Croatian scientists were most productive in the main scientific fields: Physics (875 papers; 32.2%), Medicine (786 papers; 28.9%), and Chemistry (580 papers; 21.3%); all other fields taken together comprised 17.6% of the total scientific output. Of the 786 medical papers, 290 (36.9% of the output in Medicine) were Meeting Abstracts, and medical Meeting Abstracts represent 86.6% of the total number of abstracts (335). Articles (2,060) represent 75.7% of the total Croatian scientific output in top journals. This revised version was published online in June 2006 with corrections to the Cover Date.

12.
This paper is based on the Source Book in Astronomy and Astrophysics 1900–1975, which is considered representative of pioneering research work in the field. The distribution of important scientific achievements over a certain period, their distribution by subject area and source, single versus multiple authorship, and the age of techniques relevant to these areas are quantitatively examined. In some respects the results match those known from analyses of the overall output of the sciences (including astronomy). As regards the frequency of important published papers and the role of the latest techniques, however, pioneering achievements differ significantly from the total body of scientific publications.

13.
A desirable goal of scientific management is to introduce, if it exists, a simple and reliable way to measure the scientific excellence of publicly funded research institutions and universities, to serve as a basis for their ranking and financing. While citation-based indicators and metrics are easily accessible, they are far from being universally accepted as a way to automate or inform evaluation processes, or to replace evaluations based on peer review. Here we consider absolute measures of research excellence at an amalgamated, institutional level and specific measures of research excellence as performance per head. Using biology research institutions in the UK as a test case, we examine the correlations between peer-review-based and citation-based measures of research excellence on these two scales. We find that citation-based indicators are very highly correlated with peer-evaluated measures of group strength, but poorly correlated with group quality. Thus, almost paradoxically, our analysis indicates that citation counts could form a basis for deciding how to fund research institutions, but they should not be used as a basis for ranking them by quality.

14.
Except for papers with alphabetically ordered authorship, the citations of multi-authored papers are allocated to authors based on their contributions to the paper. For papers that do not state contribution proportions, a function of author number and author rank is presented to determine each author's credit share and allocated citations. Our citation-allocation scheme lies between equal fractional counting and counting by the inverse of author rank, with a parameter to adjust the credit distribution among the different authors. The allocated citations can be used alone to indicate an author's performance on a paper, or applied to modify the h-index and g-index to represent a scientist's overall achievement. The modified h-index and g-index of an author make use of more of the papers in which he or she played an important role. Our method is suitable for papers with a wide range of author numbers.
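The abstract says the allocation scheme interpolates between equal fractional counting and inverse-rank counting via one parameter, but does not print the function itself. A plausible form with exactly that property, assumed below, weights author rank r as r**(-alpha): alpha = 0 gives equal shares and alpha = 1 gives inverse-rank shares.

```python
# Hedged sketch of a rank-based citation allocation with one tuning parameter.
# This is an assumed functional form consistent with the abstract's description,
# not necessarily the authors' published function.

def credit_shares(n_authors, alpha):
    """Citation share for each author rank 1..n; shares always sum to 1."""
    weights = [rank ** (-alpha) for rank in range(1, n_authors + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def allocated_citations(citations, n_authors, alpha):
    return [citations * share for share in credit_shares(n_authors, alpha)]

print(credit_shares(3, 0))            # equal fractional: [1/3, 1/3, 1/3]
print(credit_shares(3, 1))            # inverse rank: [6/11, 3/11, 2/11]
print(allocated_citations(22, 3, 1))  # 22 citations -> [12.0, 6.0, 4.0]
```

Intermediate alpha values (e.g. 0.5) shift credit smoothly between the two extremes, which is the role the abstract assigns to its adjustment parameter.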

15.
This paper seeks to provide current indicators on Indian science and technology for measuring the country's progress in research. The study uses 11 years of publication data (1996 to 2006) on India and the top 20 productive countries, drawn from the Scopus database. It examines country performance on several measures, including India's publication share in world research output, its publication share across subjects in the national and global contexts, patterns of research communication in core Indian domestic and international journals, the geographical distribution of publications, the share of internationally collaborative papers at the national level and across subjects, and the characteristics of high-productivity institutions, scientists and cited papers. The paper also compares the similarity of the Indian research profile with that of the top 20 productive countries. The findings should be of special significance to planners and policy-makers, as they have implications for the country's long-term S&T planning.

16.
17.
A sample comprising three years of publication output (1976–1978) from 85 Hungarian research institutes was subjected to scientometric analysis. Values of, and correlations between, measures of publishing performance, scientific manpower, and citation impact were compared across the following research fields: mathematical and physical sciences, chemical sciences, biological and medical sciences, agricultural sciences, and engineering. A new quality measure of publishing performance, the total impact of the journal papers of individual institutes, is suggested.

18.
Brazilian scientific production has increased significantly over the last decade, and mental health has been a leading research field in the country, with a growing number of articles published in high-quality international journals. This article analyses the scientific output of mental health research between 2004 and 2006 and estimates individual research performance using four different strategies. A total of 106 mental health scientists were included in the analysis; together they published 1,209 articles indexed in Medline or ISI, with over 65% of the production in journals with impact factor ≥ 1. The median impact factor of publications was 2. The Spearman correlation coefficient showed a large positive correlation between all four measures used to estimate individual research output. Ten investigators together accounted for almost 30% of the articles published in the period, whereas 65% of the sample contributed fewer than 10 articles each.

19.
This paper presents and discusses a new bibliometric indicator of research performance, designed with the fundamental concern of enabling cross-disciplinary comparisons. The indicator, called x-index, compares a researcher's output to a reference set of research output from top researchers, identified in the journals where the researcher has published. It reflects publication quantity and quality, uses a moderately sized data set, and works with a more refined definition of scientific fields. x-index was developed to rank researchers in a scientific excellence award at the Faculty of Engineering of the University of Porto. The data set collected for the 2009 edition of the award is used to study the indicator's features and design choices, and provides the basis for a discussion of its advantages and limitations.

20.
Zhao  Dangzhi  Logan  Elisabeth 《Scientometrics》2002,54(3):449-472
With the primary goal of exploring whether citation analysis using scientific papers found on the Web as a data source is a worthwhile means of studying scholarly communication in the new digital environment, the present case study examines the scholarly communication patterns in XML research revealed by citation analysis of ResearchIndex data and SCI data. Results suggest that citation analysis using scientific papers found on the Web as a data source has both advantages and disadvantages when compared with citation analysis of SCI data, but is nonetheless a valid method for evaluating scholarly contributions and for studying the intellectual structure in XML research. This revised version was published online in August 2006 with corrections to the Cover Date.
