You can use bibliometric indicators to monitor dissemination and engagement with your research output.
Key principles of responsible metrics
The key principles of responsible metrics, as defined by the UK Forum for Responsible Research Metrics, are:
- Robustness: basing metrics on the best possible data in terms of accuracy and scope
- Humility: recognising that quantitative evaluation should support, but not supplant, qualitative, expert assessment
- Transparency: ensuring that those being evaluated can test and verify the results
- Diversity: accounting for variation by research field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system
- Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response
How to assess articles for your research
- Assess metrics in relation to a research output, not the researcher or the publication it is part of.
- Consider metrics in relation to the context of the article, including factors such as career stage, gender, language of publication and date of publication.
- Be careful when comparing across disciplines, because publishing practices differ. Not all subjects publish research as journal articles, so citation counts and impact factors are less relevant. For example:
  - Some research in Computer Science and Economics is made available as conference papers
  - Much of the research in History and English is published in books
  - Research in the Social Sciences is sometimes made available via discussion papers and reports
- Be aware that bibliometrics are only as good as the databases they use:
  - No database is complete or 100% accurate
  - Providers make judgements when deciding what to include in their datasets
- Don’t assume that the very best articles only appear in high-impact journals. High-ranking journals contain low-performing articles, and vice versa.
Make sure metrics reflect the reach of your work
- Check the Kent Academic Repository to make sure all your research works are recorded there correctly.
- Maximise metrics by maximising the visibility of your research.
- Make your work Open Access as soon as possible.
- Include Open Data reporting/references in the article.
- Register for and use an ORCID iD to ensure consistent, reliable attribution of work.
- Use a mixture of metrics and qualitative evidence; metrics are not yet at the stage where they can replace peer review or expert analysis of an output. Using the two in conjunction presents a more accurate picture of your work.
- Provide quantitative data in context and, where possible, use appropriately normalised scores; these give a better picture of what a number reflects. For example:
  - a ‘3’ is meaningless on its own
  - a ‘3’ in a field of 100 ‘20s’ is a vastly different thing from a ‘3’ in a field of 100 ‘0.1s’
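The normalisation point above can be sketched numerically. This is a minimal illustration, not any database's actual indicator: the function name and the field averages are hypothetical, and it assumes the simplest form of field normalisation, raw citations divided by the field's average citations (a score of 1.0 means the article performs at the field average).

```python
def field_normalised_score(citations: float, field_average: float) -> float:
    """Raw citation count divided by the field's average citations.

    A score of 1.0 means the article performs at the field average;
    above 1.0 is above average, below 1.0 is below average.
    """
    if field_average <= 0:
        raise ValueError("field average must be positive")
    return citations / field_average

# The same raw count of 3 citations, in two very different fields:
low = field_normalised_score(3, 20.0)   # field averaging 20 citations: 0.15
high = field_normalised_score(3, 0.1)   # field averaging 0.1 citations: ~30

print(f"field averaging 20 citations:  {low:.2f}")   # well below average
print(f"field averaging 0.1 citations: {high:.0f}")  # far above average
```

Real field-normalised indicators are more involved (they match on publication year and document type as well as field), but the principle is the same: the raw number only means something relative to its context.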
More about responsible metrics
If you have any queries, email firstname.lastname@example.org
Find out all the ways you can get in touch.