Similar Documents (20 results)
1.
I discuss the difficulties that I encountered in reproducing the results of the Shanghai ranking of world universities. In the Shanghai ranking, the dependence between the score for the SCI indicator and the weighted number of considered articles obeys a power law, instead of the proportional dependence that is suggested by the official methodology of the ranking. Discrepancies from proportionality are also found in some of the scores for the N&S and Size indicators. This shows that the results of the Shanghai ranking cannot be reproduced, given raw data and the public methodology of the ranking.
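The power-law-versus-proportionality check described in this abstract can be sketched as a log-log regression: if score = c·N^b, then b ≈ 1 indicates the proportional dependence claimed by the official methodology, while b ≠ 1 indicates a power law. The data below are synthetic, not the actual ARWU figures.

```python
import numpy as np

def fit_power_law(articles, scores):
    """Fit scores = c * articles**b by least squares in log-log space.

    An exponent b close to 1 would indicate proportional dependence;
    a different exponent indicates a power law.
    """
    log_n = np.log(np.asarray(articles, dtype=float))
    log_s = np.log(np.asarray(scores, dtype=float))
    b, log_c = np.polyfit(log_n, log_s, 1)  # slope, intercept
    return b, np.exp(log_c)

# Synthetic example: scores generated with exponent 0.83 (invented value)
rng = np.random.default_rng(0)
n = rng.uniform(100, 5000, size=200)
s = 2.0 * n**0.83
b, c = fit_power_law(n, s)
print(round(b, 2))  # → 0.83
```

With noisy real data the fitted exponent would carry uncertainty; a confidence interval on the slope would be needed before rejecting proportionality.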

2.
In this paper, we examine whether the quality of academic research can be accurately captured by a single aggregated measure such as a ranking. With Shanghai University’s Academic Ranking of World Universities as the basis for our study, we use robust principal component analysis to uncover the underlying factors measured by this ranking. Based on a sample containing the top 150 ranked universities, we find evidence that, for the majority of these institutions, the Shanghai rankings reflect not one but in fact two different and uncorrelated aspects of academic research: overall research output and top-notch researchers. Consequently, the relative weight placed upon these two factors determines to a large extent the final ranking.
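The idea of recovering two uncorrelated factors from ranking indicators can be illustrated with ordinary PCA (the paper uses a robust variant; this is only a plain sketch). The indicator matrix below is synthetic: two invented latent factors, "output" and "stars", each driving two noisy indicator columns.

```python
import numpy as np

def pca(X, n_components=2):
    """Ordinary PCA via SVD. Returns component loadings and the
    fraction of total variance each component explains."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    var = S**2 / np.sum(S**2)
    return Vt[:n_components], var[:n_components]

# Toy indicator matrix: rows = universities, columns = four
# indicator-style scores (all values are made up for illustration).
rng = np.random.default_rng(1)
output = rng.normal(size=50)   # latent "research output" factor
stars = rng.normal(size=50)    # latent "top researchers" factor
X = np.column_stack([
    output + 0.1 * rng.normal(size=50),
    output + 0.1 * rng.normal(size=50),
    stars + 0.1 * rng.normal(size=50),
    stars + 0.1 * rng.normal(size=50),
])
loadings, explained = pca(X)
print(explained.sum() > 0.9)  # two components capture most variance
```

Because the two latent factors are uncorrelated, the first two components absorb nearly all the variance, mirroring the paper's finding of a two-factor structure.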

3.
Although universities’ world rankings are popular, their design and methods still require considerable elaboration. The paper demonstrates some shortcomings in the ranking methods of the Academic Ranking of World Universities (ARWU, Shanghai Jiao Tong University). One deficiency is that universities’ scale differences are neglected because the whole input side is omitted. By resampling and reanalyzing the ARWU data, the paper proposes an input-output analysis for measuring universities’ scientific productivity, with special emphasis on those universities which meet the productivity threshold (i.e. share of output exceeds share of input) within a certain group of universities. The productivity analysis of Scandinavian universities evaluates multidisciplinary and specialized universities on their own terms; consequently the ranking based on scientific productivity deviates significantly from the ARWU.

4.
The Academic Ranking of World Universities (ARWU) published by researchers at Shanghai Jiao Tong University has become a major source of information for university administrators, country officials, students and the public at large. Recent discoveries regarding its internal dynamics allow the inversion of published ARWU indicator scores to reconstruct raw scores for 500 world-class universities. This paper explores raw scores in the ARWU and in other contests to contrast the dynamics of rank-driven and score-driven tables, and to explain why the ARWU ranking is a score-driven procedure. We show that the ARWU indicators constitute sub-scales of a single factor accounting for research performance, and provide an account of the system of gains and non-linearities used by the ARWU. The paper discusses the non-linearities selected by the ARWU, concluding that they are designed to represent the regressive character of indicators measuring research performance. We propose that the utility and usability of the ARWU could be greatly improved by replacing the annual re-scaling against the raw scores of the best performers, which introduces unwanted dynamical effects.

5.
This study describes the basic methodological approach and the results of URAP-TR, the first national ranking system for Turkish universities. URAP-TR is based on objective bibliometric data resources and includes both size-dependent and size-independent indicators that balance total academic performance with performance per capita measures. In the context of Turkish national university rankings, the paper discusses the implications of employing multiple size-independent and size-dependent indicators on national university rankings. Fine-grained ranking categories for Turkish universities are identified through an analysis of ranking results across multiple indicators.

6.
There is growing interest in university rankings. Annual rankings of world universities are published by QS for the Times Higher Education Supplement, by Shanghai Jiao Tong University, and by the Higher Education Evaluation and Accreditation Council of Taiwan (HEEACT), alongside rankings based on Web visibility by the Cybermetrics Lab at CSIC. In this paper we compare the rankings using a set of similarity measures. For the rankings that have been published for a number of years we also examine longitudinal patterns. The rankings limited to European universities are compared to the ranking of the Centre for Science and Technology Studies at Leiden University. The findings show that there are reasonable similarities between the rankings, even though each applies a different methodology. The biggest differences are between the rankings provided by the QS-Times Higher Education Supplement and the Ranking Web of the CSIC Cybermetrics Lab. The highest similarities were observed between the Taiwanese and the Leiden rankings of European universities. Overall, the similarities increase when the comparison is limited to European universities.
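Comparing ranking lists as this abstract describes typically combines set-overlap measures with rank correlation. A minimal sketch with two invented top-10 lists (the university names and orders are hypothetical, not taken from any actual ranking):

```python
def top_k_overlap(rank_a, rank_b, k=10):
    """Fraction of institutions appearing in the top k of both lists."""
    return len(set(rank_a[:k]) & set(rank_b[:k])) / k

def spearman(a_ranks, b_ranks):
    """Spearman rank correlation for tie-free rank vectors, via the
    classic 1 - 6*sum(d^2) / (n*(n^2 - 1)) formula."""
    n = len(a_ranks)
    d2 = sum((a - b) ** 2 for a, b in zip(a_ranks, b_ranks))
    return 1 - 6 * d2 / (n * (n * n - 1))

# Hypothetical top-10 lists from two ranking systems (names invented).
ranking_a = ["U1", "U2", "U3", "U4", "U5", "U6", "U7", "U8", "U9", "U10"]
ranking_b = ["U2", "U1", "U3", "U5", "U4", "U7", "U6", "U8", "U11", "U12"]

overlap = top_k_overlap(ranking_a, ranking_b)   # 8 of 10 shared -> 0.8

# Rank-correlate only the universities present in both lists
shared = [u for u in ranking_a if u in ranking_b]
rho = spearman([ranking_a.index(u) for u in shared],
               [ranking_b.index(u) for u in shared])
print(overlap, round(rho, 3))  # → 0.8 0.929
```

Restricting the correlation to the shared universities mirrors the coverage problem the comparison studies face: different rankings list different institutions, so similarity must be computed on the intersection.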

7.
South Africa has 23 universities, of which five are placed in one or more of the 2011 Shanghai Jiao Tong, Times Higher Education, and Quacquarelli Symonds world university rankings. The five are: Cape Town, Witwatersrand, KwaZulu-Natal, Stellenbosch and Pretoria. They are ranked above the other 18 universities, with Cape Town in top position, mainly because they have significantly higher publication and citation counts. In the Shanghai Jiao Tong ranking Cape Town’s Nobel Prize alumni and highly-cited researchers give it an additional lead over second-placed Witwatersrand, which has Nobel Prize alumni but no highly-cited researchers. KwaZulu-Natal, in third place, has no Nobel Prize alumni but one highly-cited researcher, which places it ahead of Stellenbosch and Pretoria despite the latter two having higher publication output. However, in the Times Higher Education ranking, which places Cape Town first and Witwatersrand second, Stellenbosch is ranked but not KwaZulu-Natal, presumably because the publication and citation counts of Stellenbosch are higher. The other 18 universities are ranked by the SCImago and Webometrics rankings in an order consistent with bibliometric indicators, and consistent with approximate simulations of the Shanghai Jiao Tong and Times Higher Education methods. If a South African university aspires to rise in the rankings, it needs to increase publications, citations, staff-student ratio, and proportions of postgraduate students, international students and international staff.

8.
Recently, many organizations have been conducting projects that rank world universities from different perspectives. These ranking activities have made an impact and caused controversy. This study neither favors nor opposes using bibliometric indicators to evaluate universities’ performance. We regard these ranking activities as important phenomena and aim to investigate the correlation of different ranking systems, taking a bibliometric approach. Four research questions are discussed: (1) the inter-correlation among different ranking systems; (2) the intra-correlation within ranking systems; (3) the correlation of indicators across ranking systems; and (4) the impact of different citation indexes on rankings. The preliminary results show that 55% of the top 200 universities are covered in all ranking systems. The rankings of ARWU and PRSPWU show stronger correlation. With the inclusion of another ranking, WRWU (2009–2010), these rankings tend to converge. In addition, the intra-correlation is significant, which means that it is possible to identify ranking indicators with a high degree of discriminativeness or representativeness. Finally, it is found that using different citation indexes has no significant impact on the ranking results for the top 200 universities.

9.

Research universities place research at the core of their academic mission. They are widely recognized for their excellence in research, which earns them the most renowned positions in the various worldwide university league tables. To examine the uniqueness of this group of universities, we analyze the scientific production of a sample of them over a 5-year period. On the one hand, we analyze their research preferences, measured as the relative percentage of publications in the different subject areas; on the other hand, we calculate the similarity between their research preferences. To select a set of research universities, we studied the leading university rankings: Shanghai, QS, Leiden, and Times Higher Education (THE). Although all four rankings have well-established methodologies and hold great prestige, we chose THE because its data were readily available for the study we had in mind. We then selected the twenty academic institutions with the highest scores in the THE World University Rankings 2020 and, to contrast their impact, compared them with the twenty institutions with the lowest scores in this ranking. We extracted publication data for each university from the Scopus database and applied bibliometric indicators from Elsevier’s SciVal. We applied cosine similarity and agglomerative hierarchical clustering to examine and compare affinities in research preferences among the universities. Moreover, a cluster analysis in VOSviewer was used to classify the total scientific production into the four major fields (health sciences, physical sciences, life sciences and social sciences).
As expected, the results showed that top universities have strong research profiles, making them world leaders in those areas, and cosine similarity indicated that some are more similar to one another than others. The results provide clues for enhancing existing collaboration, defining and re-directing lines of research, and seeking new partnerships to find ways to tackle the COVID-19 outbreak.
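The cosine-similarity comparison of research profiles used in this study can be sketched directly. The subject-area shares below are invented for illustration; the study's actual profiles come from Scopus/SciVal data.

```python
import math

def cosine_similarity(p, q):
    """Cosine similarity between two subject-area publication profiles
    (each a vector of publication shares per subject area)."""
    dot = sum(x * y for x, y in zip(p, q))
    norm = (math.sqrt(sum(x * x for x in p))
            * math.sqrt(sum(y * y for y in q)))
    return dot / norm

# Hypothetical profiles over four broad fields
# (health, physical, life, social sciences); all shares are made up.
uni_a = [0.40, 0.30, 0.20, 0.10]
uni_b = [0.35, 0.35, 0.20, 0.10]
uni_c = [0.10, 0.10, 0.10, 0.70]

# uni_a and uni_b have similar profiles; uni_c is social-science heavy
print(cosine_similarity(uni_a, uni_b) > cosine_similarity(uni_a, uni_c))
```

Feeding the pairwise similarity matrix into agglomerative hierarchical clustering, as the study does, would then group universities with affine research preferences.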


10.
With the growth of competition between nations in our knowledge-based world economy, excellence programs are becoming a national agenda item in developing as well as developed Asian countries. The main purpose of this paper is to compare the goals, funding policies and selection criteria of excellence programs in China, Japan, Korea and Taiwan, and to analyze the academic achievement of their top-ranked universities in three areas: research output, internationalization, and excellence, using data from the Shanghai Jiao Tong, QS, and HEEACT rankings. The effectiveness of Taiwan’s “Development Plan for World Class Universities and Research Centers of Excellence” was assessed as a case study via a survey of 138 top administrators from 11 Taiwanese universities and 30 reviewers. The study found that the more funding nations provided, the more outputs and outcomes they gained, China being a case in point. The Taiwan case demonstrates that world-class universities and research centers are needed in Asian nations despite the concerns about inequality that they raise.

11.
This paper proposes a critical analysis of the “Academic Ranking of World Universities”, published every year by the Institute of Higher Education of the Jiao Tong University in Shanghai and more commonly known as the Shanghai ranking. After having recalled how the ranking is built, we first discuss the relevance of the criteria and then analyze the proposed aggregation method. Our analysis uses tools and concepts from Multiple Criteria Decision Making (MCDM). Our main conclusions are that the criteria that are used are not relevant, that the aggregation methodology is plagued by a number of major problems and that the whole exercise suffers from an insufficient attention paid to fundamental structuring issues. Hence, our view is that the Shanghai ranking, in spite of the media coverage it receives, does not qualify as a useful and pertinent tool to discuss the “quality” of academic institutions, let alone to guide the choice of students and families or to promote reforms of higher education systems. We outline the type of work that should be undertaken to offer sound alternatives to the Shanghai ranking.

12.
Publication productivity during 2009–2011 was studied for physicists who teach in South African universities, using data from departmental websites and Thomson Reuters’ Web of Science. The objective was to find typical ranges of two measures of individual productivity: number of papers and sum of author share, where the author share per n-author paper is 1/n author units (AU). All values given below are average output per year. Median productivity was 1.33 papers (inter-quartile range 0.33–2.33) and 0.3 AU (inter-quartile range 0.1–0.5 AU). The lowest 10 % did not publish, and the top 10 % produced more than four papers and more than 1 AU. Productivity varied with rank, ranging from medians of 0.67 papers and 0.2 AU for lecturers to 1.67 papers and 0.4 AU for full professors. Productivity of South African professors was similar to that of a sample of USA professors in a comparable mid-ranked bracket in the Shanghai Jiao Tong world ranking of universities, and about half that of professors in the six top-ranked departments in the world, which had medians of four papers and 1 AU.
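The author-share measure defined in this abstract (1/n author units per n-author paper) is straightforward to compute. The paper counts below are an invented example, not data from the study.

```python
def author_units(paper_author_counts):
    """Sum of author share: each n-author paper contributes 1/n AU."""
    return sum(1.0 / n for n in paper_author_counts)

# A hypothetical physicist with three papers having 2, 4 and 10 authors:
papers = [2, 4, 10]
print(len(papers))                       # 3 papers
print(round(author_units(papers), 2))    # 0.5 + 0.25 + 0.1 = 0.85 AU
```

The contrast between the two measures is visible even here: a large collaboration paper raises the paper count by 1 but adds only a small fraction of an author unit.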

13.
Ranking is widely considered to be an important tool for evaluating the performance, competitiveness, and success of academic institutions. An appropriate ranking system should evaluate the key missions of the higher education system in a way that helps to improve the leadership goals and activities carried out by the universities. Based on the concepts derived from the Iranian Higher Education Upstream Documents and Measures used internationally for university ranking, this study identifies 21 key measures that can be used in the ranking of Iranian universities. The measures are grouped into five categories: scientific infrastructure, scientific effectiveness, socio-cultural effectiveness, international interactions, and sustainability. Then, using the Interpretative Structural Modeling approach, the researchers develop a coherent rubric for establishing the ranking. The proposed conceptual model focuses primarily on the universities’ contribution to technological and scientific infrastructure, then secondarily on their contribution to scientific advancement and international interactions, and finally at a tertiary level on their socio-cultural effectiveness and sustainability.

14.
This paper describes the results of a multi-level network analysis of web-citations among the 1,000 universities with the greatest presence on the world wide web. Using data from January 2011, it describes the web-citation network of the world’s universities and ascertains the antecedent factors that determine its structure. At the university level, the network is composed of ten groups, and the most central universities are mainly from the United States. The factors that predict the structure of the network are whether or not the universities are in the same country, the language of instruction, the size and excellence of the institution (university ranking and the number of Nobel Prizes received), whether they offer doctoral degrees, and the infrastructure of their country. Physical distance was not a determinant of the network’s structure. At the nation-state level, international connections among a nation’s universities form a single cluster with the United States, United Kingdom and Germany at the center. The structure of the international network may be predicted by the countries’ overall hyperlink connections, international co-authorships, student flows and the number of Nobel Prizes won by their citizens.

15.
Zhu  Xing  Wu  Qi  Zheng  Yingzi  Ma  Xin 《Scientometrics》2004,60(2):237-347
Academic level and scientific reputation are the most important merits of a research university. Publication of scientific achievements in the world’s leading scientific journals is key to assessing a university’s overall performance. Peking University is a leading university among Chinese research universities, and its number of papers published in Science Citation Index (SCI) indexed journals has been at the top of the national list. In this paper, based on our long-term experience and practice in scientific management, we use scientometric and informetric methods to analyze the academic performance of the researchers, departments and schools of Peking University, mainly using the citations of publications. Highly cited papers are especially important to the reputation of our university. We compare those data with selected well-known universities worldwide, from which important information can be deduced for the policy decisions of the university. The results presented here are not only an academic survey but also a guideline for the future strategic development of Peking University. This revised version was published online in June 2006 with corrections to the Cover Date.

16.
The problem of comparing academic institutions in terms of their research production is nowadays a priority issue. This paper proposes a relative bidimensional index that takes into account both the net production and its quality, as an attempt to provide a comprehensive and objective way to compare the research output of different institutions in a specific field, using journal contributions and citations. The proposed index is then applied, as a case study, to rank the top Spanish universities in the fields of Chemistry and Computer Science in the period from 2000 to 2009. A comparison with the top 50 universities in the ARWU ranking is also made, showing that the proposed index is better suited to distinguishing among non-elite universities.

17.
Most academic rankings attempt to measure the quality of university education and research. However, previous studies that examine the most influential rankings conclude that the variables they use could be an epiphenomenon of an X factor that has little to do with quality. The aim of this study is to investigate the existence of this hidden factor or profile in the two most influential global university rankings in the world: the Academic Ranking of World Universities (ARWU) of Shanghai Jiao Tong University, and the Times Higher Education (THE) ranking. Results support the existence of an underlying profile, characterized by institutions, normally from the US, that enjoy a high reputation. Results also support the idea that rankings lack the capacity to assess university quality in all its complexity, and two strategies are suggested in relation to the vicious circle created between institutional reputation and rankings.

18.
This paper measures the academic research performance of Chinese universities using the Scopus database from 2007 to 2010. We provide meaningful indicators to measure the research performance of Chinese universities as compared to world-class universities in the US and the European region. Using these indicators, we first measure the quantity and quality of the universities’ research outcomes and then examine the internationalization of research using international collaborations, international citations and international impact metrics. Using all of these data, we finally present an overall score, called the research performance point, to measure the comprehensive research strength of the universities for the selected subject categories. The comparison identifies the gap between Chinese universities and top-tier universities from the selected regions across various subject areas. We find that Chinese universities are doing well in terms of publication volume but receive fewer citations for their published work. We also find that Chinese universities have a relatively low percentage of publications in high-impact venues, which may be the reason they are not receiving more citations. Therefore, a careful selection of publication venues may help Chinese universities compete with world-class universities and increase their research internationalization.

19.
Jacek Pietrucha 《Scientometrics》2018,114(3):1129-1139
This paper examines country-specific factors that affect the three most influential world university rankings (the Academic Ranking of World Universities, the QS World University Ranking, and the Times Higher Education World University Ranking). We run a cross-sectional regression that covers 42–71 countries (depending on the ranking and data availability). We show that the position of a country’s universities in the rankings is determined by the following country-specific variables: economic potential of the country, research and development expenditure, long-term political stability (freedom from war, occupation, coups and major changes in the political system), and institutional variables, including government effectiveness.

20.
Bar-Ilan  Judit 《Scientometrics》2004,59(3):391-403
Link analysis has proved very fruitful on the Web. Google’s very successful ranking algorithm is based on link analysis. Only a few studies have analyzed links qualitatively; most studies are quantitative. Our purpose was to characterize these links in order to gain a better understanding of why links are created. We limited the study to the academic environment, and as a specific case we chose to characterize the interlinkage between the eight Israeli universities. This revised version was published online in June 2006 with corrections to the Cover Date.
