Making your research count


There are a number of ways to make sure that your research is disseminated across the research community. Doing so will help to build your reputation as a researcher, promote your work and increase your citation rate.

  • Use a unique identifier to link all of your outputs. All of the tools that measure research output need a way to identify both items and people. Library Services advises that you register for an ORCID iD. Make sure that you mention the University of Worcester in your author affiliation.
  • Make all of your publications Open Access: contact Library Services for advice.
  • Use social media tools to promote your research; this can be a good way to engage in discussions with other researchers about your work, and to find collaborators and ideas for funding. It can also generate interest from non-academic quarters.
  • Consider carefully where you submit your articles; the getting published page will tell you more.

 

Measuring the impact of your publications

There are two main types of metrics for measuring the impact of your work. Bibliometrics are based on citation analysis, and altmetrics are based on online activity. Understanding these measures can help you decide where to publish and how to assess important research in your subject area. Care should always be taken when interpreting impact measures: results for the same article or author will vary depending on the application used, so always check how each metric is calculated, which should be explained on its website. Nothing as yet measures quality better than peer review and expert judgement.

 

Bibliometrics

Bibliometrics is the quantitative analysis of research, policy and organisational literature, based upon citations at either journal or article level.  Bibliometrics can be used to evaluate the impact of a research paper, an individual researcher, a research group or institution, or a journal. You can find out how and where your article has been cited by searching for it or yourself in Scopus, Web of Science and other databases.

Bibliometrics is a controversial area because its results are so open to interpretation. Remember that bibliometrics measure quantity rather than quality: a highly cited work may be cited critically rather than approvingly. It is also important to be aware of how a particular bibliometric is compiled, especially which journals are indexed, as results will vary.

Citation patterns also vary between subject areas, which makes comparison between disciplines very difficult:

  • Some disciplines cite the same output more frequently than others, and journal article citations favour scientific output over the arts and humanities.
  • Not all research areas are covered comprehensively, and some tools only index material published by related companies.
  • Gaming of citations can distort the data. Gaming includes inappropriate self-citation, citing colleagues and dividing output between multiple articles. Some tools now allow you to exclude self-citations.
  • Because citation data accumulates over lengthy timeframes, established researchers with multiple outputs are favoured. Some tools allow you to filter the results by year.

These bibliometrics are primarily based on journal articles and have a long lead time. The most commonly used measures are:

  • H-index: quantifies an individual's scientific research output. Devised by J. E. Hirsch, an h-index of 15 indicates that 15 of an author's papers have each been cited at least 15 times. It can be calculated for either an individual or a journal title; Google Scholar uses h-indexes to rank journals.
  • Journal Impact Factor: indicates the importance of a particular journal. It is the number of citations received in one year to papers published in that journal in the preceding two years, divided by the number of citable items published in those two years. Indexing is provided by Web of Science.
  • CiteScore: an alternative to the Journal Impact Factor, also indicating the importance of a particular journal. CiteScore is the number of citations received over a four-year window to documents the journal published in that same window, divided by the number of those documents indexed in Scopus.
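The h-index and impact-factor calculations described above are simple enough to sketch in code. The following Python snippet, using invented citation counts purely for illustration, shows how each is computed:

```python
def h_index(citation_counts):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports an h-index of `rank`
        else:
            break
    return h

def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Citations received this year to a journal's papers from the
    previous two years, divided by the number of those papers."""
    return citations_this_year / citable_items_prev_two_years

# Invented figures for illustration only.
my_papers = [48, 33, 30, 22, 16, 15, 15, 9, 7, 4, 2, 0]
print(h_index(my_papers))      # 8: eight papers have 8 or more citations each
print(impact_factor(200, 80))  # 2.5
```

CiteScore works the same way as the impact-factor ratio, but with a four-year window for both citations and documents.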

 

Altmetrics

Altmetrics are measures of scholarly impact based on activity in online tools and environments. They do not measure general web presence or activity, but rather the attention received in particular social media tools.

Altmetrics allow you to see online discussions about your research, to find similar research and to gauge a more immediate response to your research than might be obtained through bibliometrics.

There are several tools that display altmetrics, including some publishers' websites. The Altmetric bookmarklet lets you view the online shares and mentions of your articles. PlumX Metrics are found in Scopus and other databases, such as Academic Search Complete. ImpactStory, from the creators of Unpaywall, is a platform that makes research "more open, accessible, and reusable". Kudos allows you to explain your research in plain English and then monitor the activity it creates on the various social media platforms; registration is free for researchers.

As with bibliometrics, take care when interpreting the results: something can be mentioned many times on social media because it is seen as amusing or unusual rather than because it is of high quality.