
It’s the system that counts

You would expect it to be simple: when somebody downloads a book from the OAPEN Library, the system adds one to the total number of downloads. After a while you put the numbers in a report and share it with the world. Sadly, the reality is more complex. All the books and chapters can be downloaded by anybody, including automated processes (bots). Also, if you think of downloads as a measure of impact, it becomes tempting to inflate it by downloading a certain book again and again.

So, the raw download numbers need to be filtered in order to give a more realistic indication of the true impact. Many libraries use the COUNTER Code of Practice as a standard, which enables them to compare the data from different sources. However, many online platforms measure their visitors using Google Analytics. The OAPEN Library uses both, but we only report the COUNTER data. Together with the migration to a new platform, a new version of the COUNTER reporting (Release 5) was introduced. A good moment to compare Google Analytics (GA) with COUNTER Release 5 (R5).
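To give an idea of what such filtering involves, here is a minimal sketch in Python. It only illustrates two well-known COUNTER-style rules, excluding known bots and collapsing rapid repeat downloads, and it assumes a simplified log format of my own; the field names and bot list are illustrative, and the real Code of Practice is considerably more elaborate.

```python
# Illustrative sketch of COUNTER-style filtering over a simplified download log.
# The log format, field names and bot list are assumptions for this example.

from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative user-agent fragments treated as bot traffic.
BOT_SIGNATURES = ["bot", "crawler", "spider"]

@dataclass
class Download:
    timestamp: datetime
    user_id: str     # e.g. a hashed combination of IP address and user agent
    book_id: str
    user_agent: str

def counter_style_filter(raw: list[Download]) -> list[Download]:
    """Drop bot traffic and collapse rapid repeat downloads of the same title."""
    filtered: list[Download] = []
    last_seen: dict[tuple[str, str], datetime] = {}
    for d in sorted(raw, key=lambda d: d.timestamp):
        ua = d.user_agent.lower()
        if any(sig in ua for sig in BOT_SIGNATURES):
            continue  # known bot: not counted
        key = (d.user_id, d.book_id)
        previous = last_seen.get(key)
        # Double-click rule: the same user downloading the same book again
        # within 30 seconds is counted only once.
        if previous is not None and d.timestamp - previous <= timedelta(seconds=30):
            last_seen[key] = d.timestamp
            continue
        last_seen[key] = d.timestamp
        filtered.append(d)
    return filtered
```

Even this toy version shows why GA and R5 can diverge: the reported totals depend entirely on which requests a system decides to count.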

Comparing the monthly download totals is simple: where GA reports over 1 million downloads per month, R5's stricter filtering brings that down to around 400,000. Again, when we look at the details, the reality is more complex. For instance, comparing the number of downloads per country shows large differences for the USA, France, China and Russia. In contrast, the numbers for Australia, Canada and Austria are virtually the same. When we compare the usage data of each title, the differences are even harder to explain. You would expect that both GA and R5 more or less agree about the order of books: which book was downloaded the most, which one comes second, and so on. But that is very much not the case.
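One straightforward way to quantify how much two systems agree about the order of books is a Spearman rank correlation over the per-title counts. The sketch below uses made-up numbers purely for illustration; it is not the analysis from the article, just a way to make "agreeing about the order" concrete.

```python
# Compare how two measurement systems rank the same set of titles.
# The download counts below are invented for illustration only.

from scipy.stats import spearmanr

ga_downloads = {"book_a": 5200, "book_b": 3100, "book_c": 2900, "book_d": 800}
r5_downloads = {"book_a": 1400, "book_b": 1900, "book_c": 600, "book_d": 700}

titles = sorted(ga_downloads)  # align both series on the same titles
rho, p_value = spearmanr(
    [ga_downloads[t] for t in titles],
    [r5_downloads[t] for t in titles],
)
print(f"Spearman rank correlation: {rho:.2f} (p = {p_value:.2f})")
# A value close to 1 means the two systems largely agree on which books are
# downloaded most; a low value means the rankings diverge.
```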

GA and R5 each make their own choices about what is reported and what is not. One metric is not better than the other, but we should be open about the choices made. After all, open access book metrics are complicated, and we can only benefit from clarity.

More details about usage data and the two systems can be found in:

Ronald Snijder, “Open access book usage data – how close is COUNTER to the other kind?,” Insights 34 (1): 9 (2021), https://doi.org/10.1629/uksg.539.
Submitted on 11 November 2020 and published by UKSG in association with Ubiquity Press on 28 April 2021.

You might also be interested in the OAeBU DataTrust Pilot or this OBP blog. Things get even more complex when you try to compare different platforms…