Professional standards in bibliometric research evaluation? A meta-evaluation of European assessment practice 2005-2019.
Despite growing demand for practicable methods of research evaluation, the use of bibliometric indicators remains controversial. This paper examines performance assessment practice in Europe: first, identifying the most commonly used bibliometric methods and, second, identifying the actors who have defined widespread practices. The framework of this investigation is Abbott's theory of professions, and I argue that indicator-based research assessment constitutes a potential jurisdiction for both individual experts and expert organizations. The investigation used a search methodology that yielded 138 evaluation studies from 21 EU countries, covering the period 2005 to 2019. Structured content analysis revealed the following findings: (1) Bibliometric research assessment is performed most frequently in the Nordic countries, the Netherlands, Italy, and the United Kingdom. (2) The Web of Science (WoS) is the dominant database used for public research assessment in Europe. (3) Expert organizations invest in the improvement of WoS citation data and set technical standards with regard to data quality. (4) Citation impact is most frequently assessed with reference to international scientific fields. (5) The WoS classification of science fields has retained its function as a de facto reference standard for research performance assessment. A detailed comparison of assessment practices between five dedicated organizations and other individual bibliometric experts suggests that corporate ownership of, and limited access to, the most widely used citation databases have had a restraining effect on the development and diffusion of professional bibliometric methods during this period.
Main Author: | Arlette Jappe |
---|---|
Format: | Article |
Language: | English |
Published: | Public Library of Science (PLoS), 2020-01-01 |
Series: | PLoS ONE |
ISSN: | 1932-6203 |
Online Access: | https://doi.org/10.1371/journal.pone.0231735 |