Re: Let's dumb-up (journal citation) impact factors

From: Colin Steele <Colin.Steele_at_anu.edu.au>
Date: Sun, 24 Oct 2004 21:18:24 +0100

I don't disagree with Stevan here, nor with his earlier comments that we
need to concentrate on filling the "empty space" of repositories. The
crucial issue is the incentive to do so: unless we change the reward
mechanisms and incentives, there will be no major movement from
academics in the short term.

I copy below the first section of a September first draft for the OACI
Working Group, which met in Leiden. To ignore ISI, and indeed Elsevier's
Scopus, would be unwise. What we need to do is to identify their metric
"deficiencies" in terms of bibliometric cleansing, coverage, etc.; to
lobby those bodies which are adopting ISI and other rankings
unreservedly; to promote the wider bibliometric web measurements listed
by Stevan and Tim Brody; and, most importantly, to get them incorporated
into the metrics of the research evaluation processes which are driving
the academic community. See the Guardian cartoon of October 23 (linked
below) which, while not complex, makes the point:
http://books.guardian.co.uk/posysimmonds/page/0,12694,1334031,00.html

Herewith the beginning of my original OACI draft:

"How is research excellence measured? Tijssen has analysed the issues
and challenges in utilising "measurable attributes" to establish
"scoreboards of research excellence". (Tijssen, 2004) This is an issue
which has increasingly interested governments, universities and funding
bodies as measures of accountability and quality are sought.
Professor Sir David King, the Chief Scientific Advisor and Head of the
UK Office of Science and Technology, has stated that the "ability to
judge a nation's scientific standing is vital for the governments,
businesses and trusts that must decide scientific priorities and
funding". (King, 2004). Using only ISI data, which King says "provides
metrics for judging achievement", he recently evaluated the UK's
performance in science and engineering.

Professor Tony Hey, Chair of the UK e-Science Programme, has stated in
an email to Professor Stevan Harnad (circulated to the OACI group on 9
September): "The Government is keen to measure the effectiveness of the
UK's research funding by comparing the citation impact of UK researchers
field by field. These citation counts come, of course, from ISI and the
US is top in almost all fields. The UK is second in some but only 3rd or
4th in others like engineering. I suspect that the ISI ratings may have
a bias towards US researchers and US journals and wondered if OA
citation measurements give the same or similar rankings. Obviously at
this stage of institutional repositories this is very early days for
such analyses."

Borgman and Furner in a comprehensive review article "Scholarly
Communication and Bibliometrics" provide references showing how
bibliometric methods are used to assess the research performance of
entire nations. (Borgman and Furner, 2002). Surprisingly, the United
States seems not to have engaged in such direct national dialogues,
perhaps because it has no national equivalent of the UK RAE for
universities, or perhaps because American universities usually figure
so prominently in global rankings that no problem is perceived.
However, King, in his Nature article, notes that US
scientific output has declined proportionally in comparison to that of
the UK. Dr Diana Hicks has also indicated an absolute decline in the
number of US papers in international peer reviewed literature. (Hicks,
2004).

Sir Gareth Roberts, Chair of the UK 2008 RAE, in a June 2004
presentation to the Australian Scholarly Communication Round Table,
indicated that the purpose of the RAE is to allow funding bodies to
assess the quality of research arising from the investment of public
money; to enable the academic sector to assess its success and inform
its future strategy; and, perhaps most importantly, to inform a funding
model. (Roberts, 2004).

The Berlin 2 Roadmap wishes to "discredit impact factor as appropriate
measure in career evaluation and tenure promotion"
(http://www.zim.mpg.de/openaccess-cern/presentation-oa2berlin-roadmap-pro),
but seems to ignore the political reality. Internationally, research
assessment and quality exercises have come to rely increasingly on
metrics such as the ISI citation indexes. The ISI citation data are
invaluable to such bodies because they allegedly "provide a quick and
easy yardstick for measuring research quality". (Adam, 2002)
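
For reference, the two-year journal impact factor on which these
yardsticks rest is a simple ratio:

    IF(Y) = citations received in year Y to items published in years
            Y-1 and Y-2, divided by the number of "citable items"
            published in years Y-1 and Y-2

Thus a journal whose 100 articles of 2002-2003 attract 250 citations in
2004 has a 2004 impact factor of 2.5. Every term in that ratio turns on
ISI's own coverage and cleansing decisions, which is exactly where the
"deficiencies" noted above arise.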

The Shanghai Jiao Tong Index and University League Tables
http://ed.sjtu.edu.cn/ranking.htm

The Shanghai Jiao Tong Index of the "Academic Ranking of World
Universities" is perhaps the prime example of the importance such
rankings have assumed for universities in a competitive environment.
Thus the latest release of the Shanghai rankings was picked up in
August 2004 by New Zealand, UK, Australian, Canadian, Taiwanese and
European newspapers, in terms of "boasting" about individual
universities' and thus national research rankings.

Professor Anthony F J van Raan of Leiden University has shown in a most
cogent paper, "Fatal Attraction", that the bibliometric underpinnings
of the Shanghai rankings are far from definitive. (Van Raan, 2004).
While there are significant issues to be addressed in the collection
and use of bibliometrics, there is an equally important issue to be
addressed: the simplistic acceptance by governments of citation
analyses as measures of "excellence".

The Australian National Academies Forum in July 2004
(http://www.science.org.au/conferences/researchexcellence/index.htm),
which was attended by the nation's top scientists, research council
heads, government administrators, etc., highlighted that most of the
OACI issues "we" are talking about are unfamiliar to such
groups. They are happy to accept what they believe are definitive
indicators of research excellence, unaware of the concerns expressed in
bibliometric and scholarly communication circles. The main issue in this
context is not the use of bibliometric indicators per se, but the
application of inadequate or flawed bibliometric measures. Again,
rather than deride this approach, "we" need to be aware of the
political realities and respond accordingly.

Van Raan says rankings such as the Shanghai one "are part of a larger
problem in the science evaluation circus. Quite often I am confronted
with the situation that responsible science administrators in national
governments and in institutions request the application of bibliometric
indicators that are not advanced enough. They ...want to have it 'fast',
in 'main lines', and not 'too expensive ... the fault of these leading
scientists and administrators is asking too much and offering too
little." (Van Raan, 2004).

Van Raan continues that "it is not so much the commercialization of the
monopolist data producer ISI that makes the problems. These heads of
institutions, government administrators and policy makers are the first
to blame that the intermediary research groups that hitherto cleaned the
crude ISI data, prepared the data for the construction of reliable
bibliometric indicators and developed the competence and skill to
interpret the indicators (Weingart 2004) are being squeezed out of the
market".

Since the purchase of ISI by Thomson Corporation in the early 1990s
there has been a marked change in the marketing and commercialisation of
the product. One commentator has called the pre-Thomson period ISI's
"Romantic Period", when specialists in the field were allowed greater
freedom with data manipulation. (Braun, quoted in Adam, 2002).
Since the purchase by Thomson there has undoubtedly been a significant
increase in the influence of ISI, due partly to increased market
dominance and partly to the acceptance, as evidenced above, by
administrators of what is to them a ready-made metric for performance
analysis.

Dr Paul Wouters has stated in the abstract to his seminal thesis, The
Citation Culture, that the "Science Citation Index is moreover not merely
a bibliographic instrument. It also creates a new picture of science via
bibliographic references found in scientific literature. In this way,
the SCI provides a fundamentally new representation of science. By
focussing on the seemingly most insignificant entity in scientific
communication, the inventors of the SCI have created a completely novel
set of signs and of a new symbolic universe." (Wouters, 1999).

It was quite clear from Sir Gareth Roberts's presentation in Australia
to the NSCF Roundtable that, for the UK RAE, in certain science
disciplines the existing citation data, i.e. the ISI data, could be
taken at face value. This reaffirms the power of the ISI citation
indexes: to administrators they are a ready-made tool for measurement,
although administrators are often unaware of the problems in the use of
the data, e.g. the need for bibliographical cleansing, the differences
in citation practices between disciplines, the lack of coverage of
certain subjects, author self-citation patterns, etc."
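
To make the "bibliographical cleansing" point concrete: before
citations can even be counted, variant renderings of journals and
authors must be reconciled. A minimal Python sketch (the records and
the synonym table below are invented purely for illustration) of the
kind of normalisation involved:

    from collections import Counter

    # Illustrative only: raw citation records as they might arrive from
    # a citation index, with variant renderings of one journal and one
    # author.
    raw_records = [
        {"journal": "J. Clin. Invest.", "author": "Smith, J."},
        {"journal": "Journal of Clinical Investigation", "author": "Smith, J.A."},
        {"journal": "J Clin Invest", "author": "SMITH J"},
    ]

    # A hand-built synonym table; real cleansing needs authority files
    # covering thousands of title variants per journal.
    JOURNAL_SYNONYMS = {
        "j. clin. invest.": "journal of clinical investigation",
        "j clin invest": "journal of clinical investigation",
    }

    def normalise(record):
        """Reduce one record to a canonical (journal, author) key."""
        journal = record["journal"].strip().lower()
        journal = JOURNAL_SYNONYMS.get(journal, journal)
        # Collapse the author to surname plus first initial, ignoring
        # case and punctuation, so all three "Smith" variants match.
        parts = record["author"].replace(",", " ").replace(".", " ").split()
        author = (parts[0] + " " + parts[1][0]).lower() if len(parts) > 1 else parts[0].lower()
        return journal, author

    counts = Counter(normalise(r) for r in raw_records)
    print(counts)  # all three records collapse to a single key

Counted naively, those three records would look like three different
journals and three different authors; at the scale of national research
assessment such artefacts distort every ranking built on the data.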

The OACI Working Group is pursuing this topic and will, it is hoped,
issue a "Leiden Declaration" incorporating an Open Access Impact
Factor.

Colin Steele