Moving beyond just ranking - The CWTS Leiden Ranking 2016



Today at CWTS we are releasing the 2016 edition of our Leiden Ranking. The CWTS Leiden Ranking is a web tool that provides a suite of bibliometric statistics for a large number of research-intensive universities worldwide. We try to design and construct our ranking in accordance with the most recent insights in the field of bibliometrics and scientometrics. New editions of the ranking therefore usually include improvements compared with earlier editions, for instance by refining the methodology, improving the presentation of the statistics, or increasing the number of universities included in the ranking.

In the 2016 edition of our ranking, the main improvement lies in the presentation of the bibliometric statistics. We emphasize a multidimensional perspective on the statistics and focus less on the traditional unidimensional notion of ranking universities. The approach that we have adopted is reflected by the motto of the Leiden Ranking 2016: Moving beyond just ranking.

The traditional approach to university ranking

A traditional ‘league table’ university ranking offers a list of universities ordered by a certain indicator, such as the percentage of highly cited publications produced by a university. The university at the top of the list is at rank 1, the next university in the list is at rank 2, and so on. When such a list-based presentation is used, the performance of a university is typically interpreted in terms of its rank relative to other universities. The performance of a university may for instance be summarized by observing that the university is at rank 250 worldwide and at rank 5 within its own country. Likewise, it may be claimed that the performance of a university has improved because its rank has improved relative to the previous edition of a ranking.
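To make the mechanics concrete, here is a minimal sketch of how a league-table rank follows from a single indicator. All university names and values below are hypothetical.

```python
# Minimal sketch of a league-table ranking: universities are ordered by a
# single indicator and assigned consecutive ranks. Data are hypothetical.
percent_highly_cited = {
    "University A": 12.3,
    "University B": 11.1,
    "University C": 9.4,
    "University D": 9.2,
}

ranking = sorted(percent_highly_cited.items(), key=lambda item: item[1], reverse=True)
for rank, (university, pct) in enumerate(ranking, start=1):
    print(f"{rank}. {university}: {pct:.1f}% highly cited publications")
```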

However, interpreting a university ranking exclusively in terms of the ranks of universities poses some serious problems of interpretation. One problem is that a quite large difference in ranks may correspond to a very minor difference in the indicator from which the ranks have been obtained. For instance, if we rank the universities in the 2016 edition of the Leiden Ranking based on their percentage of highly cited publications (where a publication is considered highly cited if it belongs to the top 10% most cited publications in its field), we find that the universities at ranks 400 and 425 differ only slightly: the university at rank 400 has 9.4% highly cited publications, while the university at rank 425 has 9.2%. The figure below shows that for many universities in the Leiden Ranking there is a substantial number of other universities with a comparable percentage of highly cited publications. Hence, it is doubtful whether a difference of, say, 25 ranking positions has any practical significance and, for that matter, whether a university at rank 400 can really claim to outperform a university at rank 425.

[Figure: for each university in the Leiden Ranking, the number of other universities with a comparable percentage of highly cited publications]
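The clustering effect can be illustrated with a rough simulation (purely illustrative; these are random numbers, not Leiden Ranking data):

```python
# Purely illustrative simulation: when indicator values cluster closely,
# a tiny difference in the indicator can span many ranking positions.
import random

random.seed(0)
# ~800 hypothetical universities with percentages of highly cited publications
percentages = sorted((random.gauss(10.0, 3.0) for _ in range(800)), reverse=True)

print(f"Rank 400: {percentages[399]:.2f}%   Rank 425: {percentages[424]:.2f}%")
print(f"Difference: {percentages[399] - percentages[424]:.2f} percentage points")
```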

We now turn to a second problem. Interpreting a university ranking exclusively in terms of the ranks of universities places a lot of emphasis on a single indicator, making the interpretation strongly sensitive to peculiarities of this indicator and to special characteristics of a university. This may lead to strange outcomes and confusion. Suppose for instance that we are interested in the scientific performance of universities in the field of physics. When universities are ranked according to their percentage of highly cited publications in the field of physics, a university without a physics department may actually be ranked first. This may sound like a purely hypothetical scenario, but it can actually occur when a university has produced just a few physics publications and these publications happen to be highly cited. Just think of an engineering university whose engineering departments occasionally produce a physics publication. However, if the university's absolute number of highly cited publications in the field of physics were taken into account in addition to its percentage of highly cited publications, it would immediately become clear that the university is actually a very minor contributor to the field. This illustrates how confusion may arise when too much emphasis is put on a single indicator. From this point of view, the traditional single-indicator focus of many university rankings is problematic. Obtaining a more balanced and deeper understanding of the performance of universities requires multiple indicators to be taken into account together.
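A small numerical sketch (with made-up figures) shows how reporting the absolute number alongside the percentage immediately exposes such cases:

```python
# Hypothetical figures: a university with a tiny physics output can top a
# percentage-based ranking while contributing almost nothing to the field.
physics = {
    # university: (highly cited physics publications, total physics publications)
    "Engineering University X": (2, 3),
    "Large Research University Y": (120, 1000),
}

for university, (highly_cited, total) in physics.items():
    print(f"{university}: {100 * highly_cited / total:.1f}% highly cited "
          f"({highly_cited} of {total} publications)")
```

On these numbers, Engineering University X tops a percentage-based ranking with 66.7%, yet its three physics publications make it a negligible contributor compared with Large Research University Y.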

Moving beyond just ranking

As already mentioned, the guiding rationale for implementing changes in the 2016 edition of the CWTS Leiden Ranking is reflected by the Moving beyond just ranking motto. Compared with the 2015 edition, the new edition offers essentially the same information, but the presentation of this information on our website has been significantly revised. In addition to a traditional list-based presentation (list view), the website now also offers a chart-based presentation (chart view) as well as a map-based presentation (map view). The chart view enables comparing the performance of universities using two indicators simultaneously. Moreover, this view emphasizes the actual values of the indicators rather than the ranks implied by these values. The map view facilitates easy comparisons between universities that are geographically close to each other.

The traditional list view has also been revised. Based on feedback from our user community, we have learned that the distinction between size-dependent and size-independent indicators sometimes causes problems. An example of a size-dependent indicator is the number of highly cited publications of a university. The corresponding size-independent indicator is a university’s percentage of highly cited publications. In the case of a size-dependent indicator, larger universities will typically have higher values than smaller universities. Size-independent indicators correct for the size of the publication output of a university and are therefore often used to compare universities that differ in research volume. In the 2016 edition of the Leiden Ranking, size-dependent and size-independent indicators are always reported together in the list view. This emphasizes that both types of indicators need to be taken into account to obtain a good understanding of the scientific performance of universities.
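The relation between the two types of indicators is simple: the size-independent indicator is the size-dependent one divided by the total publication output. A minimal sketch, with hypothetical numbers:

```python
# Sketch of the paired reporting in the list view: the size-dependent
# indicator (number of highly cited publications) together with the
# size-independent indicator (percentage). Numbers are hypothetical.
def highly_cited_indicators(highly_cited: int, total: int) -> tuple[int, float]:
    """Return the number and the percentage of highly cited publications."""
    return highly_cited, 100 * highly_cited / total

for university, (hc, total) in {"Small U": (90, 600), "Big U": (700, 9000)}.items():
    number, percentage = highly_cited_indicators(hc, total)
    print(f"{university}: {number} highly cited publications ({percentage:.1f}%)")
```

On these hypothetical numbers, Big U leads on the size-dependent indicator while Small U leads on the size-independent one, which is exactly why the two are shown side by side.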

Furthermore, unlike previous editions of our ranking, the 2016 edition no longer uses the percentage of highly cited publications as the default indicator for ranking universities. In the new edition, universities are by default ranked simply based on the size of their publication output. Users who would like to rank universities based on their highly cited publications need to explicitly choose between a size-dependent and a size-independent ranking, that is, a ranking based on either the number or the percentage of highly cited publications. We expect that this will induce users to reflect more carefully on the choice between the two types of rankings.

User feedback

We hope you will appreciate the changes we have made in the new edition of the CWTS Leiden Ranking. As always, we very much appreciate your feedback, which is essential for future improvements of the ranking. Suggestions and comments can be provided by responding to this blog post or by contacting us directly using this contact form. We look forward to hearing from you!

About Ludo Waltman

Ludo Waltman is professor of Quantitative Science Studies and scientific director at the Centre for Science and Technology Studies (CWTS) at Leiden University. He is a coordinator of the Information & Openness focal area and a member of the Evaluation & Culture focal area. Ludo is co-chair of the Research on Research Institute (RoRI).

About Nees Jan van Eck

Nees Jan van Eck is a senior researcher, head of data science, and coordinator of the Information & Openness focal area at CWTS. Nees Jan's research focuses on infrastructures and the development of tools and algorithms to support research assessment, science policy, and scholarly communication.

About Robert Tijssen

Robert Tijssen is emeritus professor of Science and Innovation Studies at Leiden University and a professor at Stellenbosch University (South Africa). Robert's current interests focus on the contributions of universities and higher education institutions to local or regional socio-economic development.

About Paul Wouters

Paul Wouters is emeritus professor of scientometrics. Paul is interested in how evaluation systems have developed and are creating new constraints for the development of knowledge. He is also interested in the history of science in general and in the role of information systems in these histories in particular.


4 comments

  • Harri Hankar February 19th, 2017 5:28 pm
    Looking at citation counts or number of publications, doesn't this completely bias the ranking towards larger universities? Wouldn't it make more sense to normalize by the number of PhD students? Otherwise one could simply merge two average universities and obtain a single highly-ranked one?
    I actually wonder why size-dependent indicators are still part of _any_ ranking system... Could anyone clarify this?
    Reply
    • Ludo Waltman February 19th, 2017 5:42 pm
      The perspective of the CWTS Leiden Ranking is that size-dependent and size-independent indicators both provide useful information. It depends on the purpose for which the ranking is used whether one should focus on the size-dependent or the size-independent indicators.
      In the CWTS Leiden Ranking, size-independent indicators are obtained by normalizing by the number of publications of a university. An alternative way of performing a size normalization could be based on the number of researchers of a university or based on the amount of research funding a university has available. However, it is hard to obtain the data required for these alternative normalizations.
      Reply
  • Heikki Holopainen May 27th, 2016 9:19 am
    Thanks for the new improved ranking! Will you be adding industry collaboration in this year's edition as in 2015?
    Reply
    • Ludo Waltman May 28th, 2016 7:52 am
      In this year's edition of the CWTS Leiden Ranking, we have chosen not to include indicators on collaboration with industry. These indicators will be published separately later this year.
      Reply