Re: Open access to research worth 1.5bn a year

From: Stevan Harnad <harnad_at_ecs.soton.ac.uk>
Date: Tue, 27 Sep 2005 19:44:44 EDT

On Mon, 26 Sep 2005, Donald Waters wrote:

> I can see why one might argue that citation of a research publication
> represents a kind of "return on investment" in research. However, the
> logic of Mr. Harnad's argument escapes me when he insists in Proposition
> #2 below that publication itself is not also a kind of return on
> investment.

Yes, the number of publications is a measure of return on the investment;
I should have said it is not the best measure. I did say that a piece of
research that is not used may as well not have been done. Hence usage is a
better measure of impact than mere publication.

> Mr. Harnad's use of "return on investment" is not at all rigorously
> defined, but if you accept his usage for the common sense assertion that
> it is, not only does the research publication surely represent another
> kind of "return," it is a primary and immediate return compared to a
> subsequent citation to the publication, which is secondary and would
> likely emerge only over time.

To repeat: a piece of research unused is a piece of research that may as
well not have been done, or funded.

Of course usage may come with time (and the citation counts were estimated
over a 10-year baseline). But the point was not that research accessible
only to researchers at institutions that can afford the journal in which
it happened to be published has no citation impact at all. The point was
that it does not have its full potential citation impact: it loses 50%-250%
of it. And that citation impact is the more accurate measure of the return
on the public's investment in the research, not the crude count of
articles published.

(If you don't like "return on investment," use "getting your money's
worth": research that is not self-archived gives the public 50%-250% less
than what it paid for.)
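For concreteness, here is the shape of that calculation in a few lines of
Python. The investment figure, the assumed share of research already
self-archived, and the choice of the 50% lower bound are illustrative
placeholders for this sketch, not necessarily the exact numbers from the
original article:

    # Back-of-the-envelope sketch of the "money's worth" argument.
    # All numeric inputs are illustrative assumptions.

    annual_investment_gbp = 3.5e9  # hypothetical UK annual research spend
    citation_advantage = 0.50      # lower bound of the 50%-250% OA advantage
    self_archived_share = 0.15     # hypothetical share already self-archived

    # Treating citation impact as a proportional proxy for return, the
    # advantage is applied directly to the unarchived portion:
    foregone_return = (annual_investment_gbp
                       * (1 - self_archived_share)
                       * citation_advantage)
    print(f"Foregone return: GBP {foregone_return:,.0f} per year")

With those inputs the sketch yields roughly GBP 1.5bn a year, the order of
magnitude in the subject line; the higher end of the 50%-250% range would
scale it accordingly.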

> If there are "returns on investment" in research, then we are faced with
> a proposition requiring careful analysis of multiple factors and their
> relative weighting over a base of investment, not simply an "either/or"
> proposition as Mr. Harnad so dismissively suggests.

Publication counts are a crude first approximation; citation counts are a
better approximation. There are of course even better measures of research
impact (downloads, post-publication peer evaluations, and many of the
scientometric measures of the future that an online full-text corpus will
spawn).

> Second, and more troubling, even if you allow for his black and white
> view of the world, Mr. Harnad fails to provide any sort of rationale for
> why it is reasonable and valid to impute a specific monetary value to a
> citation, much less as a "return on investment" in an accounting system
> that measures country-wide investment in research. He simply assumes
> that it has such value, takes a percentage difference in the quantity of
> citations to published articles that are distributed in different ways,
> and applies the difference to the assumed base value, which is the UK's
> annual investment in research. To identify a value for citations, why
> not apply the percentage difference to the UK's gross national product,
> the annual gross sales of research publications in the UK, the total
> salaries of university researchers in the UK, or their total
> expenditures on groceries? An arbitrary application of percentages to
> an arbitrary base value is hardly a disciplined way to calculate value.

Those alternative base values would be rather harder to apply in comparing
self-archived with non-self-archived articles, but by all means go ahead
and do it!

> In his original article, which appears at
> http://openaccess.eprints.org/index.php?/archives/28-guid.html, Mr.
> Harnad does try to be more specific and refers to a 1986 article by A.M.
> Diamond in the Journal of Human Resources entitled "What is a Citation
> Worth?" Mr. Harnad points to a reprint of the Diamond article
> introduced by Eugene Garfield at
> http://www.garfield.library.upenn.edu/essays/v11p354y1988.pdf, but his
> adoption of the Diamond article is slavish and uncritical despite the
> warnings contained both in Diamond's highly nuanced and qualified
> article and especially in Garfield's introduction.

Apart from the slavish adoption, please note that the Diamond measure was
used for another calculation: what the researcher is losing, per citation,
in salary and grant income. That is not the same measure as the one
concerning whether the UK public is getting its full money's worth.

> Diamond had examined the salary structures of university professors and
> roughly estimated that the marginal value of a citation fell at the time
> within the range of US$50-1,300 of additional salary. For his own
> purposes, Mr. Harnad simply takes this estimate out of context, converts
> it to pounds sterling, inflates it to current value, and multiplies it by
> the number of citations that authors presumably lose by failing to
> self-archive. However, in the introduction to the Diamond article,
> Garfield cautions emphatically against just such a usage.

Well, for what it's worth, I checked my article with both Diamond and
Garfield. Diamond replied with no objection to the application. Garfield
didn't reply.
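
For what the per-researcher application looked like, here is its shape in
the same Python style. The exchange rate, inflation adjustment, and
citations-lost figure below are illustrative placeholders, not the exact
values used in the article:

    # Sketch of the salary/grant-income calculation described above: take
    # Diamond's 1986 range, convert to pounds, inflate to current value,
    # and multiply by the citations an unarchived article forgoes.
    # All inputs below are illustrative assumptions.

    diamond_low_usd, diamond_high_usd = 50, 1300  # 1986 value per citation
    usd_to_gbp = 0.55      # hypothetical exchange rate
    inflation = 1.8        # hypothetical 1986 -> 2005 price adjustment
    citations_lost = 2     # hypothetical citations forgone per article

    low = diamond_low_usd * usd_to_gbp * inflation * citations_lost
    high = diamond_high_usd * usd_to_gbp * inflation * citations_lost
    print(f"Foregone salary/grant income per article: "
          f"GBP {low:,.0f} to GBP {high:,.0f}")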

> Garfield points out that Diamond "is not saying that every additional
> citation is worth 'X' amount of dollars" and then continues:
> "Economists are interested in the structure of wages and in its
> components, and they present their data to show that structure.
> Diamond does not claim that there is any simple, automatic
> connection between citations and salaries. There is no real evidence of
> such a causal connection. Rather, as Harriet Zuckerman, Department of
> Sociology, Columbia University points out, from Diamond's findings we
> can conclude that citations can be regarded 'as a kind of "proxy" for
> certain services for which scientists and scholars get paid.'"

A correlation is a correlation. Causality would require a much more
complicated study. But since promotion committees have, if anything, been
counting citations more explicitly since 1986, causality is likely. See
also the studies showing the correlation between the rankings (and
funding) made by the UK Research Assessment Exercise (RAE) and citation
counts. (Citations are not counted directly by the RAE, but the ranked
departments and institutions do emphasise the citation impact factor --
average citation ratios -- of the journals their researchers publish in,
in designing their submissions.)

    http://www.ariadne.ac.uk/issue35/harnad/

> Mr. Harnad may well be on to an important subject and line of argument
> in suggesting that citations are a kind of return on investment.
> However, close inspection of the concepts and logic of his argument
> suggests that he is quite a bit further from proving his case than he
> seems to have convinced himself that he is.

No proof here: just conservative estimates.

Stevan Harnad
Received on Wed Sep 28 2005 - 02:07:00 BST
