FAQs and Usage Abnormalities with NC LIVE Usage Reports

Below are some common questions about NC LIVE usage reports. A full list of Data Definitions (i.e., what FT-Views means for each resource) is also available.

What resources don’t we have stats for?

ABC-CLIO, hoopla audiobooks and ebooks (consortial stats only, no library-level stats), and ProQuest Research Companion. No consortial reports are available for these resources. Statistics can be downloaded by individual libraries for all of these resources except hoopla. We are working with hoopla on consortial stats functionality.

ProQuest Statistics Updated in NC LIVE Reports from Jan 2020 to Nov 2022

  • Searches for ProQuest databases using the COUNTER 5 DR D1 report are now calculated as Searches_Federated + Searches_Regular; previously we reported only Searches_Regular. We have rerun the NC LIVE reports from January 2020 to present to include this new search data.
  • We now calculate usage for ProQuest databases in a parent-child relationship as follows (see the sketch after this list):
    • Parent databases: Searches_Federated + Searches_Regular = NC LIVE Searches
    • Child databases: Total Item Requests = NC LIVE Full-Text Views
    • Child databases: Total Item Investigations = NC LIVE Abs+Full-Text Views
  • Usage from child databases in ProQuest parent-child relationships is reported under the parent database name in our reports.
  • In the past, we were mistakenly collecting only search stats for parent databases, so several ProQuest databases showed zero full-text views in our previous reports. We have rerun the stats from January 2020 to present, and the following databases now have usage for Full-Text Views and Abs+Full-Text Views in NC LIVE usage reports:
    • ABI/INFORM Collection
    • Accounting Tax & Banking Collection
    • Agricultural & Environmental Science Collection
    • Asian & European Business Collection
    • Business Market Research Collection
    • Canadian Business & Current Affairs Database
    • Career & Technical Education Database
    • India Database
    • International Newsstream
    • U.S. Newsstream 
  • Because we reran ProQuest stats and started reporting full-text views for the parent-child databases listed above, full-text views have increased for ProQuest resources for all libraries from January 2020 to present in NC LIVE's reports.
  • We have started collecting usage stats for The Washington Post and The Wall Street Journal. Usage has been collected from January 2020 to present. 
  • We restarted collecting stats for the Statistical Abstract of the US. Usage has been collected from January 2022 to present. 
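The parent-child roll-up described above can be pictured with a short script. This is a minimal sketch under stated assumptions, not NC LIVE's actual tooling: the row layout, column names, and the PARENT_OF mapping are placeholders for illustration, and real COUNTER 5 DR D1 reports also break usage out by month, platform, and publisher.

```python
from collections import defaultdict

# Hypothetical child -> parent mapping; the real relationships come from ProQuest.
PARENT_OF = {
    "Example Child Database A": "ABI/INFORM Collection",
    "Example Child Database B": "ABI/INFORM Collection",
}

def roll_up_dr_d1(rows):
    """Aggregate simplified COUNTER 5 DR D1 rows into NC LIVE metrics,
    reporting child-database usage under the parent database name.

    Each row is assumed to look like:
    {"Database": "...", "Metric_Type": "Searches_Regular", "Count": 123}
    """
    totals = defaultdict(lambda: {"Searches": 0,
                                  "Full-Text Views": 0,
                                  "Abs+Full-Text Views": 0})
    for row in rows:
        database = PARENT_OF.get(row["Database"], row["Database"])
        metric = row["Metric_Type"]
        count = int(row["Count"])
        if metric in ("Searches_Regular", "Searches_Federated"):
            # Parent-level search metrics feed NC LIVE Searches.
            totals[database]["Searches"] += count
        elif metric == "Total_Item_Requests":
            # Child-level requests feed NC LIVE Full-Text Views.
            totals[database]["Full-Text Views"] += count
        elif metric == "Total_Item_Investigations":
            # Child-level investigations feed NC LIVE Abs+Full-Text Views.
            totals[database]["Abs+Full-Text Views"] += count
    return dict(totals)
```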

EBSCO lower-than-normal usage for August 2022

From EBSCO: "Beginning August 8th and extending until its resolution on August 19th, we experienced an issue impacting the recording of usage activity in our products, including EBSCOhost and EBSCO Discovery Service. Any report that you generate or receive will show a decrease in reported totals across all fields (i.e. sessions, searches, etc.) during that date window. This affects both our COUNTER reports, as well as our Standard Usage reports, for all profiles utilizing our classic experiences."

Why are abstract numbers lower with COUNTER 5 than COUNTER 4 reports?

NC LIVE used Result Clicks for Abstracts in COUNTER 4, and we now use Total Item Investigations for Abstracts Plus Full-Text Views in COUNTER 5. “Result clicks (COUNTER 4 metric) is defined as a click resulting from search results. This can include article links, ILL, catalog links in the search results. Total item investigations (COUNTER 5 metric) is defined as a retrieval (viewing, downloading, printing, emailing items – whether abstract or full-text).”

COUNTER 5 Usage Reports

Starting with January 2020 reports, NC LIVE is using COUNTER 5 reports for Gale and ProQuest, instead of COUNTER 4 reports. We are using the COUNTER 5 DR D1 (Database Search and Item Usage) Report.

Gale
We are using COUNTER 5 reports for all Gale resources.*
NC LIVE Searches = COUNTER 5 Regular Searches
NC LIVE Full-Text Views = COUNTER 5 Total Item Requests
NC LIVE Abs Plus Views = COUNTER 5 Total Item Investigations

*Gale resources not using COUNTER 5: TERC

ProQuest
We are using COUNTER 5 reports for all ProQuest resources.**
NC LIVE Searches = COUNTER 5 Regular Searches
NC LIVE Full-Text Views = COUNTER 5 Total Item Requests
NC LIVE Abs Plus Views = COUNTER 5 Total Item Investigations

**ProQuest resources not using COUNTER 5: Historic NC Digital Newspapers Collection, Ebook Central, Sanborn Maps
 

For more information about COUNTER 5 Item Requests vs. Total Item Investigations, please see: https://www.projectcounter.org/release-5-understanding-investigations-and-requests/
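As a simplified illustration of the Investigations/Requests distinction (this sketch uses made-up event names, not actual COUNTER fields): viewing an abstract counts as an Investigation only, while viewing or downloading full text counts as both an Investigation and a Request.

```python
# Simplified sketch of the COUNTER 5 distinction between Investigations
# and Requests; event names are illustrative assumptions.
ABSTRACT_EVENTS = {"abstract_view"}
FULL_TEXT_EVENTS = {"html_view", "pdf_download"}

def tally(events):
    """Count Total_Item_Investigations and Total_Item_Requests for a
    sequence of user actions on items."""
    investigations = 0
    requests = 0
    for event in events:
        if event in ABSTRACT_EVENTS | FULL_TEXT_EVENTS:
            investigations += 1  # any interaction with an item
        if event in FULL_TEXT_EVENTS:
            requests += 1        # full content was delivered
    return {"Total_Item_Investigations": investigations,
            "Total_Item_Requests": requests}

# A user views an abstract, then downloads the PDF:
# 2 Investigations, 1 Request.
print(tally(["abstract_view", "pdf_download"]))
```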

Can my library see our vendor reports?

Any library is welcome to access usage statistics directly from vendors, including title-level and other detailed reports not included in the NC LIVE usage reports. More information is available on the NC LIVE website.

How is NC LIVE tracking usage statistics to make decisions?

The Resource Advisory Committee (RAC) reviews usage data reports to make resource purchasing recommendations to the NC LIVE Librarians Council. Vendor-reported usage statistics are just one measure the RAC uses to make resource selection decisions. Other data sources are reviewed to get a holistic picture of member library needs, including member library feedback, discovery service statistics, available funding, and resource cost.

Why is usage for Data-Axle (ReferenceUSA) inflated during the years 2016-2020? 

Some libraries experienced high amounts of Data-Axle (formerly known as ReferenceUSA) usage from 2016-2020 due to illegal downloading. These high Data-Axle usage numbers inflate not only those libraries' usage during those years, but also consortial usage and COI usage. NC LIVE is working on normalizing the data from these years and updating usage statistics on the website.

Archived FAQs

Auto Repair Reference Center Statistics

Starting with February 2019 reports, EBSCO changed the way that Auto Repair Reference Center usage statistics are calculated. Through January 2019, data was calculated in a way that attributed multiple sessions to different vehicle makes and models within Auto Repair Reference Center. Now, EBSCO de-duplicates session IDs, which represent unique user sessions, in a more consistent manner. This inevitably results in a perceived decrease in Auto Repair Reference Center usage compared to previous months. However, the de-duplication ensures a true and consistent count of unique user sessions by allotting one "session" per user login, regardless of the amount of content accessed within the interface once logged in.

Ebook Central - Ebrary Comparison

Ebrary and Ebook Central usage statistics are most accurately compared by using Ebrary “Sessions” and Ebook Central “FT-Views” as reported on NC LIVE’s usage reports.

Ebrary data listed on NC LIVE reports under “Sessions” was reported on Ebrary reports as “User Sessions”. Ebrary data listed on NC LIVE reports as “FT-Views” and “AbsPlusViews” was reported on Ebrary reports as “Full-Title Downloads”.

The Ebook Central “Usage Report” lists one row for each time an ebook is accessed, regardless of what actions (reading, downloading, printing, etc.) were taken in the ebook. The sum of these rows, by library, shows how many times ebooks were accessed in Ebook Central and is reported on NC LIVE reports as “FT-Views” and “AbsPlusViews” (see the sketch below).

Ebook Central does not provide a user sessions report or a more accurate full-text metric. Ebook Central does provide COUNTER Book Report 2 reports; however, that report contains "the sum of Pages Viewed, Pages Printed, Pages Copied, Chapter Downloads, and Full Downloads". The inclusion of page metrics in this summation creates an inflated view of full-text usage, so NC LIVE does not use it.
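A rough sketch of that per-library roll-up follows. The column name "Library" and the CSV format are assumptions; the actual Ebook Central “Usage Report” layout may differ.

```python
import csv
from collections import Counter

def ft_views_by_library(usage_report_path):
    """Count rows per library in an Ebook Central-style usage report.
    Each row represents one ebook access, so the per-library row count
    is what appears on NC LIVE reports as FT-Views / AbsPlusViews."""
    with open(usage_report_path, newline="", encoding="utf-8") as report:
        rows = csv.DictReader(report)
        # "Library" is an assumed column name identifying the member library.
        return Counter(row["Library"] for row in rows)
```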

HeritageQuest Statistics

Due to a technical issue on Ancestry's side, the session count data in customer COUNTER Database reports is incorrect for the time period of December 10, 2018 through January 22, 2019. These incorrect session counts overestimate the number of sessions during that time period for both Ancestry Library Edition and HeritageQuest Online.

As of January 23, 2019, Ancestry has fixed this technical issue and the session counts from January 23 forward are correct and accurate. Unfortunately, Ancestry has advised that corrected session counts for the time period in question cannot be provided or recovered.

Effective November 2017, HeritageQuest changed the customer hierarchy reporting methodology used to measure library usage on HeritageQuest Online and Ancestry Library Edition. No change was made to the information displayed in the reports or the usage recorded. The new methodology uses a revised parent/child hierarchy to ensure usage is counted only once, which may result in lower usage counts for some libraries. Usage data for October 2017 and prior uses the old methodology, which may show higher usage counts; usage data from November 2017 to present uses the new methodology, which more accurately represents usage and avoids duplication.

Infobase - Ferguson’s Career Guidance Center and Films on Demand Statistics

Infobase underwent a COUNTER certification review with LibLynx that involved an extensive review of how usage is logged by Infobase. This resulted in several reporting changes that are reflected in NC LIVE reports for Ferguson’s Career Guidance Center and Films on Demand starting January 1, 2019. Overall, lower usage should be expected for Ferguson’s and a slight increase in usage for Films on Demand.

Searches in both platforms will be lower after January 1, 2019. Previously, going to additional pages of search results via pagination links or "load more" buttons registered additional searches. These actions are no longer recorded as searches.

For Ferguson’s, index pages and other browsable pages were inaccurately logging record views when these pages were accessed. No actual content is viewable on these pages, so these events should not have been counted. Additionally, page tool access was incorrectly logged as record views; only tools built around downloading or printing may, in certain circumstances, log record views or multimedia views. Lastly, repeated requests for the same article within specified time limits (10 seconds for HTML, 30 seconds for PDF) are now double-click filtered: the earlier request is removed and the most recent request is retained (see the sketch below). Our legacy reports had no mechanism for filtering out these types of requests. Starting January 1, 2019, these changes will likely result in noticeably lower ‘full-text views’ in NC LIVE reports for Ferguson’s.
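The double-click rule described above can be sketched as follows. This is a simplified illustration of COUNTER-style double-click filtering under an assumed event format, not Infobase's actual implementation.

```python
# Simplified sketch of double-click filtering: when the same user requests
# the same article again within the time limit, the earlier request is
# dropped and the later one is kept. The event tuple format is assumed.
DOUBLE_CLICK_WINDOW = {"HTML": 10, "PDF": 30}  # seconds

def filter_double_clicks(events):
    """events: iterable of (user_id, article_id, fmt, timestamp_seconds).
    Returns the list of requests that survive double-click filtering."""
    kept_last = {}   # (user_id, article_id, fmt) -> most recent retained event
    retained = []
    for event in sorted(events, key=lambda e: e[3]):
        user, article, fmt, ts = event
        key = (user, article, fmt)
        previous = kept_last.get(key)
        if previous is not None and ts - previous[3] <= DOUBLE_CLICK_WINDOW[fmt]:
            # Within the window: remove the earlier request, keep this one.
            retained.remove(previous)
        kept_last[key] = event
        retained.append(event)
    return retained

# Three PDF requests for the same article at 0s, 12s, and 25s collapse
# to a single counted request (the one at 25s).
events = [("u1", "a1", "PDF", 0), ("u1", "a1", "PDF", 12), ("u1", "a1", "PDF", 25)]
print(filter_double_clicks(events))
```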

For Films on Demand, Infobase discovered that usage was undercounted in some places. For example, watching a video via the video preview page tool was not registering record views, and now it does. Starting January 1, 2019, this may result in higher ‘full-text views’ in NC LIVE reports for Films on Demand.

ProQuest Search Statistics

Total ProQuest searches shown on NC LIVE reports are the deduplicated number of federated searches performed across the entire ProQuest platform, listed on NC LIVE reports as “ProQuest Unique Searches”, plus searches from SIRS Knowledge Source and the Statistical Abstract of the United States. NC LIVE uses the ProQuest COUNTER Database 1 Report to compile these statistics.

Neither SIRS Knowledge Source nor the Statistical Abstract of the United States is included in the ProQuest federated search, and they are therefore not included in “ProQuest Unique Searches”.

All other searches in ProQuest are performed across multiple databases as a federated search and are deduplicated for accuracy. For example, when a user performs a search in the Psychology Database, it also counts as a search in each of the individual databases across the ProQuest platform, such as the Science Database and U.S. Newsstream, as well as all other ProQuest databases. To avoid inflating statistics, search statistics across ProQuest are deduplicated, which removes the count of the same search being performed in each ProQuest database. These deduplicated searches are labeled “ProQuest Unique Searches” and are included in the ProQuest search totals.
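A tiny worked sketch of why the deduplication matters (the numbers and database names below are made up for illustration):

```python
# One user query federates across every ProQuest database, so each database
# logs a search. Summing the per-database counts overcounts the same queries.
per_database_searches = {
    "Psychology Database": 100,   # illustrative numbers only
    "Science Database": 100,
    "U.S. Newsstream": 100,
}
naive_total = sum(per_database_searches.values())   # 300: 100 queries counted 3 times

# Deduplicated platform-level count ("ProQuest Unique Searches"), plus the
# databases that sit outside the federated search.
proquest_unique_searches = 100
sirs_searches = 20
statistical_abstract_searches = 5

nclive_proquest_search_total = (proquest_unique_searches
                                + sirs_searches
                                + statistical_abstract_searches)   # 125
print(naive_total, nclive_proquest_search_total)
```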

The ProQuest administrative portal offers Database Activity reports in addition to COUNTER reports. Usage statistics may differ between these two reports because ProQuest measures searches differently than is required by COUNTER standards.

Testing & Education Reference Center - Learning Express Comparison

Given the disparity in full-text data reported by Learning Express and Testing & Education Reference Center (TERC), it is recommended to compare “sessions” between these two resources, even though “sessions” for either resource does not fully capture user activity within the resource.

TERC data listed on the NC LIVE reports under “FT-Views” and “AbsPlusViews” is measured on TERC reports as “retrievals”, the number of times a PDF book was opened within TERC. TERC data listed on the NC LIVE reports under “Sessions” is measured on TERC reports as “sessions”, the number of times users log in, regardless of what actions they take after logging in. Neither TERC metric, “retrievals” nor “sessions”, fully captures user activity, since many resources are accessed without downloading a PDF and users may view more than one resource during each session.

Learning Express reported “Recorded Sessions”, which appear in NC LIVE data under “FT-Views” and “AbsPlusViews”. “Recorded Sessions” is the sum of tests added, eBooks added, tutorials added, and computer tutorials added. Learning Express “User Sessions” are shown on NC LIVE reports as “Sessions”.
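A minimal sketch of the mapping and the “Recorded Sessions” sum described above (the metric labels come from the text; the dictionary and function names are illustrative assumptions):

```python
# How the two resources' vendor metrics map onto NC LIVE report columns,
# per the comparison above. Labels only; not an official crosswalk.
NCLIVE_COLUMN_SOURCES = {
    "TERC": {
        "FT-Views / AbsPlusViews": "retrievals (PDF book opens)",
        "Sessions": "sessions (logins)",
    },
    "Learning Express": {
        "FT-Views / AbsPlusViews": "Recorded Sessions",
        "Sessions": "User Sessions",
    },
}

def recorded_sessions(tests_added, ebooks_added, tutorials_added,
                      computer_tutorials_added):
    """Learning Express "Recorded Sessions" is the sum of the four "added" counts."""
    return tests_added + ebooks_added + tutorials_added + computer_tutorials_added
```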

Why do 2015 usage reports have very different numbers as compared to 2014?

Beginning in January 2015, NC LIVE began creating library usage reports using COUNTER-compliant vendor data when available. This change was made to allow for more consistent and comparable usage definitions across resource vendors. NC LIVE provided updated usage data definitions that detail which vendor reports are used and what constitutes a search, session, full-text view, and full-text view plus abstract view for each NC LIVE resource.

This change, combined with discovery service changes made in 2015, means that reliable direct comparisons cannot be made between resources NC LIVE licensed in 2012-2014 (such as EBSCOhost research databases), and different resources licensed in 2015-2017 (such as ProQuest research databases). This is because the usage definitions (i.e. what constitutes a search, session, full-text view, or full-text view plus abstract) for different vendors are not necessarily the same.

Why are my library’s search numbers extremely low compared to 2014?

Differences in the way that NC LIVE’s previous discovery service (EDS) and current discovery service (Summon) affect search totals in individual databases account for the very different search totals some libraries see for some resources in 2015. Where EDS searched every database in a profile every time a search or search refinement was performed, Summon’s structure does not result in as many instances of all databases being searched at once. For some libraries this greatly reduced the number of searches reported.

Because of the impact of discovery on search totals, libraries should be cautious about adding up searches across different databases to report as a single search total. Because one user’s query may register as a search for multiple databases, adding up those numbers will count one query multiple times.

Additionally, NC LIVE usage reports only show searches for individual libraries across our databases, but do not include Summon searches. We do not currently have a reliable method to differentiate Summon users by institution, so those search numbers are omitted from usage reports.

Does this mean usage reports are not useful for comparing ‘old’ and ‘new’ resources?

It is always difficult to compare usage of one vendor’s resource to another. Factors including discovery methods, vendor data definitions, and local integrations can mean a comparison between two resources from different vendors is not informative, like comparing “apples to oranges.”

However, comparing usage of a resource to itself over time with reliable and consistent data can be very useful. The change to COUNTER-compliant vendor data when available ensures that this type of comparison is reliable, and could even allow for more accurate cross-vendor comparisons in the future.

Which resources have the same data definitions for both 2014 and 2015?

The following resources use the same measures both before and after 2015:

  • ABC-CLIO

  • Alexander Street Press

  • Chadwyck-Healey

  • CQ Researcher and CQ Weekly

  • eBooks on EBSCOhost

  • HeritageQuest

  • LearningExpress Library

  • Morningstar

  • MyiLibrary

  • NC LIVE Video Collection

  • SimplyMap

The following resources were available before 2015, but the data definitions used to measure them were changed in 2015:

  • Gale Infotrac Newsstand - changed to COUNTER data

  • Gale Virtual Reference Library - changed to COUNTER data

  • Wall Street Journal - changed to COUNTER data (this resource is now a title in ABI/INFORM, and title-level data is available via the ProQuest Administrator Module)