From: JJE Esposito <[log in to unmask]>
Date: Tue, 19 Nov 2019 20:36:44 -0500

How are people taking into account the amount of usage that takes place at Sci-Hub and ResearchGate, or does that not matter? (Roger Schonfeld calls this "leakage.")

Joe Esposito

On Tue, Nov 19, 2019 at 7:48 PM LIBLICENSE <[log in to unmask]> wrote:
From: Dmitri Zaitsev <[log in to unmask]>
Date: Tue, 19 Nov 2019 00:21:49 +0000

Dear Ted,

If the faculty are the main users, what about letting them rate journals by importance and then subscribing to the most requested ones? Your faculty are probably best positioned to evaluate individual journals' quality and will appreciate being asked for their opinion, and the library will have more evidence to justify the use of its funds.

I would be curious to hear thoughts about it.

Dmitri

-- 
Dmitri Zaitsev
School of Mathematics
Trinity College Dublin

WWW:  http://www.maths.tcd.ie/~zaitsev/

On Tue, Nov 19, 2019 at 12:06 AM LIBLICENSE <[log in to unmask]> wrote:
From: Ted Bergstrom <[log in to unmask]>
Date: Sun, 17 Nov 2019 13:10:17 -0800

Negotiations between Elsevier and the University of California system over open access and pricing seem to have reached a stalemate, and the UC no longer has the Elsevier Big Deal. Currently, no UC campus subscribes to any Elsevier journals. If the UC chooses not to reenter the Big Deal, the UC campus libraries will probably find it worthwhile to subscribe to some Elsevier journals. Which ones should they choose?

A UCSB student, Zhiyao Ma, and I are developing a small tool that we hope will help UC librarians make cost-effective selections of Elsevier journals for subscription. The UC has download statistics for each Elsevier journal at each of its campuses, and Elsevier posts a la carte subscription prices for each of its journals. Our tool allows one to select a cost-per-download threshold and obtain a list of journals that meet this criterion, along with their total cost. It also allows separate thresholds for different disciplines. You can check out the current version at https://yaoma.shinyapps.io/Elsevier-Project/
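
In outline, the selection logic is roughly the following Python sketch (the actual tool is a Shiny app; all journal names, prices, download counts, and thresholds below are invented for illustration):

    import pandas as pd

    # Illustrative data only: these are not the actual UC statistics
    # or Elsevier list prices.
    journals = pd.DataFrame({
        "journal":    ["Journal A", "Journal B", "Journal C"],
        "discipline": ["economics", "biology", "mathematics"],
        "price":      [2500.0, 9800.0, 1200.0],  # a la carte USD/year
        "downloads":  [400, 3500, 150],          # downloads/year
    })

    # User-chosen cost-per-download thresholds, one per discipline (USD).
    thresholds = {"economics": 10.0, "biology": 5.0, "mathematics": 6.0}

    journals["cost_per_download"] = journals["price"] / journals["downloads"]
    limit = journals["discipline"].map(thresholds)

    # Keep every journal at or below its discipline's threshold,
    # and report the total cost of the resulting list.
    selected = journals[journals["cost_per_download"] <= limit]
    print(selected[["journal", "cost_per_download"]])
    print("Total cost of selected titles: $%.2f" % selected["price"].sum())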

Since this project is still under way, we would be interested in any suggestions from librarians about how to make this tool more broadly useful. Extending the tool to make comparisons among journals from multiple publishers is an obvious step. However, we are dubious about the value of download statistics for cross-publisher comparisons. There is evidence that download counts substantially overstate usage, because the same users repeatedly download the same articles, and that the amount of double-counting varies systematically by publisher. This is discussed in a couple of papers of which I am a coauthor:

"Looking under the Counter for Overcounted Downloads" (with Kristin Antelman and Richard Uhrig)

and

"Do Download counts reliably measure journal usage: Trusting the fox to count your hens". (with Alex Wood-Doughty and Doug Steigerwald)

Instead of using download data, we could construct a similar calculator using price per recent citation as the measure of cost-effectiveness. We have found that the ratio of downloads to citations differs significantly between disciplines, so it is probably appropriate for cost-per-citation thresholds to differ among disciplines as well.
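
The citation-based variant would require only a small change to the sketch above; in pandas terms, something like the following (column and threshold names are again invented):

    import pandas as pd

    def select_by_cost_per_citation(journals, thresholds):
        """Keep journals whose price per recent citation is at or below
        the threshold for their discipline. Expects illustrative columns
        'price', 'recent_citations', and 'discipline', and a dict mapping
        each discipline to its cost-per-citation threshold in USD."""
        cost_per_citation = journals["price"] / journals["recent_citations"]
        limit = journals["discipline"].map(thresholds)
        return journals[cost_per_citation <= limit]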

At any rate, we would value suggestions.

Ted Bergstrom