Evaluation Practices in Context

The working group Evaluation Practices in Context (EPIC) examines the politics and practices of research evaluation in connection with contemporary forms of governance of research and scholarship.


EPIC combines and contributes to theoretical frameworks and detailed empirical studies from Science and Technology Studies (STS) broadly defined (including scientometrics, and history, sociology and anthropology of science), political science, organizational studies, and higher education studies. The working group pays particular attention to the implications of research assessment, and the performance criteria applied, for scientific and scholarly communication and knowledge production. Important STS perspectives that we build on have demonstrated that ‘science’ and ‘politics’ or ‘knowledge’ and ‘power’ should not be seen as separate spheres of action, but are involved in a constant process of mutual embedding and stabilization. Accordingly, our work analyzes the co-constitution of knowledge in relation to specific epistemic cultures, evaluation systems, publication practices, and governance contexts.



Promoting Integrity as an Integral Dimension of Excellence in Research (PRINTEGER) (funded by the EC through an H2020 COST Action)

Coordinators: Hub Zwart and Willem Halffman (Radboud University Nijmegen)
Consortium partners: Free University Brussels, the University of Tartu (Estonia), Oslo and Akershus University College, Leiden University, and the Universities of Bonn, Bristol, and Trento
EPIC team members: Thed van Leeuwen, Paul Wouters, Sarah de Rijcke
The goal of the project is to encourage a research culture that treats integrity as an integral part of doing research, rather than as an externally driven steering mechanism. To stimulate integrity and responsible research, new forms of governance are needed that are firmly grounded in and informed by research practice. EPIC contributes to PRINTEGER with: a) a bibliometric analysis of ‘traces of fraud’ (e.g. retracted articles, manipulative editorials, non-existent authors and papers, fake journals, bogus conferences, non-existent universities), set against the background of general shifts in publication patterns; b) two in-depth case studies of research misconduct, focusing not on evident or spectacular cases but on the dilemmas and conflicts that occur in grey areas; c) task leadership on formulating advice for research support organisations, including on IT tools, drawing conclusions from the research on the operation of the research system, specifically publication infrastructures; and d) the organisation of local advisory panels consisting of five to ten key stakeholders of the project: policy makers, research leaders and managers, support organisations, and early career scientists.

Measuring science: The use of metrics in assessing impact, innovation and excellence in modern academia (funded by The Swedish Research Council, 2014-2017)

Björn Hammarfelt & Sarah de Rijcke

This project concerns the governance of science, with an emphasis on how research is made auditable through the use of performance indicators. Publication patterns as well as questionnaires are used to examine how evaluation influences the practices of scholars. Studies are also made of university rankings and their use in promoting the ‘excellent’ university. By tying the practical, institutional, and political levels together, the project aims to provide a unique perspective on the emergence of a ‘metrics culture’ in academia. Drawing on theories of the ‘audit’ and ‘risk’ society, it takes the analysis beyond discussions of different evaluation models or impact indicators and places it in the larger context of social and organizational change. This broader perspective is required in order to understand the co-evolution of governance systems and techniques of measurement: a perspective in which ‘metrics’ are not only a symptom but also a practice that itself contributes to these developments.

Excellent Science: 25 years of science policy for top research

Collaboration with Rathenau Institute (project coordination)
EPIC team members: Thomas Franssen & Sarah de Rijcke

How has the Dutch science policy landscape for excellent research taken shape over the last 25 years? How has this policy contributed to: a) a concentration of research funding in the hands of a limited number of researchers and/or research organisations; b) scientific innovation and breakthroughs? EPIC contributes to the study in a sub-project that analyses the research policies and knowledge production practices of four ‘excellent’ Dutch research groups in different disciplines (desk research, interviews, observations).

Quality and Relevance of Research in Law, Social Sciences and Humanities: Toward Evidence-Informed Policies (Sept. 2015 – Sept. 2019)
Consortium: University Library – EUR, Faculty of Social Sciences – EUR / Centre for Public Knowledge, Centre for Science and Technology Studies – Leiden University

EPIC team members co-supervise a PhD project with prof. Willem Schinkel at EUR (Paul Wouters, Sarah de Rijcke). Collaboration with CWTS working group leaders Thed van Leeuwen and Ingeborg Meijer.

How can different research fields be characterised in terms of their scientific quality and societal impact? Research fields within different domains (Law, Social Sciences and Humanities) will be analysed based on their drivers and barriers for knowledge production and societal impact, the different forms in which they arrive at research results and achieve societal relevance, their scientific environment, their group organisation and their ability to adopt multidisciplinary approaches. By including evidence from mixed-methods research, the research will lead to a broader understanding of the different forms of knowledge production within diverse domains and of the different ways in which research groups organize their socio-epistemic tasks.

Quantifying academic lives: Practices of (self-)evaluation under 'excellent' conditions (2014-2015)
Thomas Franssen & Sarah de Rijcke

Practices of (self-)evaluation among junior researchers have become increasingly based on quantified measures of worth such as publication counts, impact factors, and the h-index. Based on ethnographic and interview data gathered at an internationally acclaimed Dutch social-scientific research group, this project focuses on the evaluation practices junior researchers experience, and on the ways in which researchers relate to and use performance measures in academic identity formation and in positioning themselves on the academic labor market.

Defining standards of intellectual quality in Dutch legal scholarship (2014-2015)

Wolfgang Kaltenbrunner & Sarah de Rijcke

This project draws on a comparison of ongoing debates about research evaluation in three Dutch Law faculties. The project aims to analyse the implications of the need to create new alignments between research and science policy on the one hand (including the requirement of ‘societal relevance’), and the everyday articulation work legal scholars engage in to bring research to closure on the other.

The impact of indicators: How evaluation shapes biomedical knowledge production (2012-2015)

Alex Rushforth & Sarah de Rijcke

In this exploratory ethnographic study we have analyzed how performance indicators and research evaluation link to biomedical knowledge production in the Netherlands. Fieldwork was carried out at two University Medical Centres (UMCs) and at three research groups within each centre: a molecular cell biology laboratory, a surgical oncology laboratory, and a medical statistics group. Conducting fieldwork in two UMCs enabled a focus on how the institutional context shapes the dynamics of evaluation and indicator usage. In addition, we identified three broad registers of biomedical knowledge production, following institutionalized distinctions between basic, translational and applied research at each UMC. This sampling logic was also based on an assumption that different sub-fields of biomedicine pursue quite distinct patterns of knowledge production, for instance in terms of publication and citation practices. Theoretically, the project draws on and contributes to historical, organisational, and sociological studies of accountability practices in science.


To convey your interest, suggest a talk, or if you have any questions, please e-mail Sarah de Rijcke (working group leader EPIC) at s.de.rijcke@cwts.leidenuniv.nl.


Sarah de Rijcke

Associate professor and deputy director of CWTS, and coordinator of the Evaluation Practices in Context (EPIC) working group. Her research focuses on the growing use of assessment procedures and bibliometric indicators in scientific and scholarly research, and the effects on knowledge production.

Thomas Franssen

Postdoctoral researcher. Thomas is interested in evaluation and quantification practices. Inspired by cultural sociology as well as science and technology studies, Thomas analyzes the ways in which researchers are evaluated and evaluate themselves, and the (numerical) devices that are used to do so.

Björn Hammarfelt

Visiting postdoctoral researcher. Björn’s current research focuses on evaluation systems and university rankings as examples of an emerging “metric culture” in academia. He is also interested in the use, and misuse, of bibliometric measures in the humanities.

Thed van Leeuwen

Senior researcher and coordinator of the Social Sciences, Humanities & Law (WISSH) working group. Thed's research focuses on the development of indicators for the research assessment of scholarly activity in the social sciences, the humanities, and law. He is co-editor of the OUP journal Research Evaluation.

Alex Rushforth

Postdoctoral researcher. Alex's research concerns how changes in governance affect modern science. His current work explores the effects that formal evaluations and new metric indicators are having on research in academic institutions.

Clifford Tatum

Researcher and member of the Science and Evaluation Studies (SES) working group. Clifford’s research is focused on Open Science in relation to emerging evaluation practices and research information systems.

Paul Wouters

Director of CWTS and professor of Scientometrics. Paul is interested in how evaluation systems have developed and are creating new constraints for the development of knowledge. He is also interested in the history of science in general and the role of information systems in these histories in particular.
