This article was first published on SciDev.net by Caroline Wagner
To make science truly useful to development, we need a new, inclusive system of tracking publications, says S&T policy expert Caroline Wagner.
Global comparisons of scientific output are commonplace. As non-experts, policymakers and administrators must rely on indexes of impact and recognition — counts of published papers and citations, and the prestige of source journals — to assess the impact of public spending and to allocate research funds.
The gold standard is the Science Citation Index Expanded (SCIE), owned by Thomson Reuters and formerly produced by the Institute for Scientific Information (ISI). SCIE is an excellent abstracting service, but it covers only a small percentage of all scientific literature.
And although the information revolution is making it easier to publish online, and therefore to access the results of scientific research, it also confounds efforts to monitor and compare the outputs as materials proliferate in many new venues.
The result is a rapidly growing, open system that is harder to evaluate than ever before.
Extent of the ‘unseen’
In a recent study we counted more than 15,000 scientific periodicals among the ‘BRIC’ countries (Brazil, Russia, India and China), of which just 495 — about 3 per cent — are listed in SCIE.
Amazingly, this is not an anomaly: we found that SCIE lists only about 3 per cent of journals for most scientifically advanced countries.
This means that decision makers anywhere in the world who rely on SCIE (or its cousins, Scopus or perhaps Google Scholar) cannot account for, access or compare as much as 90 per cent of scientific output: work we call 'unseen science'.
For scientifically advanced countries such as the United States, other abstracting services such as Index Medicus and the Chemical Abstracts Service provide additional access. But no single source provides common access or enables comparability.
For developing countries, the challenge of ‘unseen’ science is compounded by the language barrier. China publishes 6,596 scientific journals, of which only a handful are abstracted in English. Similarly, Russia and Brazil each have close to 2,000 journals in national languages that are not indexed in SCIE.
India is better represented in English and in SCIE, but unlike the other three BRICs, its national publications (about 550) are scattered among a number of databases that are difficult to track down. 
How would a researcher go about finding these works? Right now, there is no way to do this.
Quantity and quality
That science is growing worldwide is widely celebrated. It enriches knowledge, but it also adds to the challenges of assessing and comparing global scientific output.
Many more countries fund research and development (R&D) than at any time in history. In 1990, six countries were responsible for 90 per cent of R&D spending; by 2008 this elite group had grown to more than 13 countries. Since the beginning of this century, developing countries have more than doubled their R&D spending.
Growing numbers of journals in national languages (paper and electronic); open-licensing avenues for publication, such as Creative Commons; and global conferences (physical and online) are all signs of healthy science. Combined with the new possibilities for publication opened up by the Internet, scientific findings multiply daily.
It is certainly a good thing that new communications tools enable a vast new group to participate in the global network of science. But the trend also confounds assessments and raises questions about what is being measured in global comparative studies.
The proliferation of sources also raises questions about quality. Online archives such as arXiv and ResearchGate are growing in popularity, and they include pre-publication versions of articles that have not been reviewed yet are often read and cited by others. But no clear standard is emerging to account for a change in status (such as pre- or post-review), let alone to compare citations over time.
Electronic journals, newsletters and bulletins do not always apply rigorous quality controls such as peer review and editing, and it is often impossible to tell which ones are presenting reliable data.
This does not discount the usefulness of these contributions to the stock of global knowledge. Indeed, the seeds of future breakthroughs may very well be contained there. But sifting through expanding volumes of material to find them is becoming increasingly difficult.
Key step: taking stock
Recent calls for global standards of scientific assessment are well-intentioned, but the proliferation of scientific communications and sources makes the idea difficult to put into practice.
The first step towards any global assessment should be an inventory of the various types of scientific outputs and their sources. There are many possible configurations — for example, electronic only or electronic and paper; frequency of publication; how many places the same article is published (pre- and post-publication); links to supporting data; open source or subscription; editor or peer reviewed.
A move towards standard terms for types of outputs would help analysts to make accurate counts, and policymakers to use all available information in decision making.
Inclusion may require that regional or national governments, or perhaps academies of science, invest in good accounting and in a national library of all scientific periodicals — which must be open access — as Russia is attempting with elibrary.ru.
An inclusive approach to ensuring that all good science is ‘seen’ at the global level is a lofty goal that presents significant challenges to the scientific community. Global transparency can be achieved with new protocols for viewing, cataloguing and understanding the swath of activities that represent science in this collaborative era. But it requires commitment from governments and scientific societies.
As things stand, the vast majority of scientific publications remain unseen by most potential users, and this lack of access makes all of global science poorer.
Caroline Wagner is Wolf Chair in International Affairs and director at the Battelle Center for Science & Technology Policy, The Ohio State University, Columbus, Ohio. Caroline can be contacted at email@example.com.