
Research Impact: Use citation metrics to assess impact

Get strategies to increase the reach of your work and measure your research impact

Citation metrics track and measure citations to provide an indication of a publication's impact. A range of metrics is available, each measuring citations from different sources and calculated in different ways. However, all metrics have limitations: citations may be made in the context of either positive or negative attention, and each metric counts citations from only a limited range of sources. It is important to be aware of these limitations and to understand exactly what is being measured.

Some key things to note about citation metrics are:

  • Metrics are a complement to, not a replacement for, peer review and expert opinion when making research management decisions
  • Use multiple metrics and multiple sources where possible to help answer a question more confidently
  • Select whichever metrics will add value to your decision making in any particular situation (Elsevier, 2014)

Scopus or SciVal?

Scopus and SciVal use the same Elsevier data. Scopus has greater historical coverage, but SciVal offers advanced analytical functions.

Use Scopus to find up-to-date Elsevier metrics for a single publication, author or journal.

Use SciVal if you need to track citations from Elsevier at scale or over time.

Common metrics

Citations per output

Measures how many citations a single publication has received within a specified source.

Sources: Scopus, Web of Science and Google Scholar.

Differences between sources: different sources provide different citation counts, depending on their coverage. No single source lists all publications or all citations.

Limitations and things to note:

  • Measures how much a publication has been discussed in the academic literature, not its quality: citations can be accrued by poor-quality or controversial findings.
  • Researchers cite differently in different fields, so only compare raw citation counts for publications in similar fields.

Journal and conference level metrics

Consider the citation performance across all articles within a journal or conference, over a specified period of time, within a specified source. Journal and conference level metrics include:

  • SNIP, SJR and CiteScore in Scopus
  • Journal Impact Factor (JIF) in Web of Science
  • h5-index in Google Scholar

Sources: Scopus, Web of Science and Google Scholar.

Differences between sources: different sources provide different metrics, values and rankings. Not all journals and conferences are listed in all sources; conferences are usually better covered in Google Scholar.

Limitations and things to note:

  • Represent overall past citation performance and are poor predictors of the citation performance of future articles.
  • Based on mean citations, so a small number of highly cited papers can misleadingly inflate the metric.
  • Researchers cite differently in different fields, so check whether the metric you use is "field normalised", i.e. takes the field of the journal into account.
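To illustrate why mean-based journal metrics can be inflated by a few highly cited papers, here is a minimal sketch in Python. The citation counts are invented for illustration, and the function is a simplified stand-in for the general shape of mean-based metrics, not the exact formula of any named metric.

```python
import statistics

def mean_citations(citation_counts):
    """Mean citations per article: the basic shape of mean-based
    journal metrics such as the Journal Impact Factor."""
    return sum(citation_counts) / len(citation_counts)

# Hypothetical journal: most articles are cited a handful of times,
# but one review article attracts 200 citations.
counts = [0, 1, 1, 2, 2, 3, 3, 200]

print(mean_citations(counts))     # 26.5 -- dominated by one outlier
print(statistics.median(counts))  # 2.0  -- the typical article
```

The gap between the mean (26.5) and the median (2.0) shows how a single outlier can make the "average" article look far more cited than it is.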

Book metrics

Measure citations to books within a specified source. A minority of published books are indexed in Scopus and Web of Science, so use book metrics in complement with other measures of quality when assessing impact.

Sources: Scopus, Web of Science and Google Scholar.

Differences between sources: different sources provide a different citation count depending on their coverage.

Limitations and things to note:

  • Limited coverage of non-English language publications.
  • Very strong coverage of the sciences; weaker coverage of arts and humanities disciplines.


h-index

A researcher with an index of h has published h papers, each of which has been cited at least h times, over a specified period of time, within a specified source. Google Scholar also calculates the i10-index. We recommend building your profile via ORCID before attempting to calculate your h-index.

Sources: Scopus, Web of Science and Google Scholar.

Differences between sources: different sources provide a different h-index depending on their coverage.

Limitations and things to note:

  • Should not be used in isolation: expert opinion and peer review are critical to place values in context.
  • Researchers cite differently in different fields, so only compare researchers in similar fields and at similar stages of their careers.
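The h-index definition above translates directly into a short calculation. This is a minimal sketch in Python with invented citation counts; the i10-index shown alongside is Google Scholar's count of papers with at least ten citations.

```python
def h_index(citation_counts):
    """Largest h such that at least h papers have h or more citations."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

def i10_index(citation_counts):
    """Google Scholar's i10-index: number of papers with >= 10 citations."""
    return sum(1 for c in citation_counts if c >= 10)

# Invented citation counts for one researcher's publications.
papers = [25, 12, 10, 8, 5, 4, 1, 0]

print(h_index(papers))    # 5: five papers have at least 5 citations each
print(i10_index(papers))  # 3: three papers have 10 or more citations
```

Note that because different sources index different publications, running the same calculation over Scopus, Web of Science or Google Scholar citation counts will usually give different h-index values for the same researcher.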

Other metrics

See Compare metrics for a more detailed list.

Need help? Get in touch with your Academic Liaison Librarian or the Research Portfolio.


Read more

Cochran, A. (2017, February 8). How many grains of salt must we take when looking at metrics?

Davis, P. (2017, May 15). Citation performance indicators – a very short introduction.

Economic and Social Research Council. (2017). Impact toolkit.

Elsevier. (2014). Snowball metrics: Global standards for institutional benchmarking.

Elsevier. (2017). Snowball metrics: Standardized research metrics – by the sector for the sector.