In response to the findings published on 8 July 2015 from the Independent Review of the Role of Metrics in Research Assessment and Management, Professor Richard Jones, Pro-Vice-Chancellor for Research and Innovation at the University of Sheffield, said:
"Funding Bodies, Research Councils and Universities all have to make judgments about research quality, and it’s important that those judgments are made transparently and robustly on the basis of good evidence.
"The use of metrics such as article citation counts as indicators of research quality can be helpful, but they need to be used with thought and care. Academic colleagues are rightly concerned that metrics should not be used inappropriately, and prize the in-depth picture that qualitative judgements by experts allow us to paint of our research.
"This very thorough review, which consulted widely in the academic community, gives us a sound basis for the responsible use of metrics in research management. Metrics need to be transparent and carefully chosen, and should always supplement and support expert judgment, rather than replace it.
"Disciplinary differences need to be respected, and potential equality and diversity issues arising from careless use of metrics need to be borne in mind. Some metrics - such as the use of journal impact factors to indicate the quality of individual research articles - are deeply flawed and should not be used at all.
"It is important now to ensure that we don’t let the quest for metrics and measurement divert resources and focus away from achieving the impact that we’re all working towards - supporting the excellent researchers in all disciplines who contribute to our world-class UK research system."
Professor Richard Jones is one of the expert members of the independent steering group that supported the review.
The University of Sheffield welcomes the findings of the Metrics Review. The academic community will have more confidence in a system that uses appropriate metrics to inform peer review of research outputs, as some Research Excellence Framework (REF) sub-panels did in REF 2014, than in a wholly metrics-driven assessment system.
The University notes the recommendation to use ORCID as the preferred system of unique identifiers, and it is already taking steps to enable the creation and recording of ORCID identifiers within its internal systems. ORCID is already required by some publishers, such as the Nature Group, and by some funders, such as the Wellcome Trust, as a way to link individual authors unambiguously to their outputs and contributions.