Originally published 21 January 1991
More than half of scientific research contributes nothing to the growth of knowledge.
This astonishing statement just might be true, but before I give the evidence, let me explore a familiar metaphor.
Scientific knowledge grows organically, like a tree.
Every piece of published research is like a new bud on a twig. The bud is connected to every other bud on the tree. Two buds may be very close together, on the same twig, or very far apart, so that to trace the connection one would have to follow twigs and branches all the way back to the trunk and out again along other branches and twigs. Ultimately, all scientific knowledge is one. The connectedness of science gives us confidence in its integrity.
The amount of research being published today is so great that no one can claim to know the overall shape of the tree. There are presently nearly 75,000 scientific journals listed in the Bowker/Ulrich catalog of international periodicals, and thousands of new journals are added each year. It is virtually impossible for any one scientist to become familiar with more than a single branch of the tree.
Or to put it another way, we can’t see the tree of science for the forest of published words. However, help may be at hand.
The citation rule
A rule of science requires that every published paper cite all previously published work that bears upon the same subject. The citation rule assures that new research is firmly connected to the tree of science.
Computer-based citation indexes may make it possible to sketch out the outlines of the tree, and computer analysts have begun to do just that, by tracing networks of citations.
At the request of the journal Science, David Pendlebury of the Philadelphia-based Institute for Scientific Information (ISI) recently searched a citation database for papers published between 1981 and 1985. He discovered that 55 percent of papers surveyed had not been cited even once in the five years after publication.
The ISI database included only the top 4,500 science and social science journals, or about 6 percent of the total number listed in the Bowker/Ulrich catalog of periodicals. These are the journals that are most likely to be read and cited, so the citation rate for papers published in other journals is almost certainly lower.
Because of the citation rule, if a work is not cited, it presumably has no influence on subsequent research. Apparently, half of the buds on the tree of science are dead ends. They could be snipped away and the tree would grow as robustly as ever. Or at least that’s what the computer study seems to indicate.
At first glance, this might seem promising. Good horticultural practice suggests that a tree grows better with judicious pruning. Snip a branch here and there, and the other branches will grow more vigorously. This might be especially true when the available nutrients for growth — in the form of government funding — are becoming increasingly meager.
The problem is knowing which branches to prune.
Citation indexes, if widely available on an international computer network, might provide the answer. Funding, promotion, and academic tenure could be directed to those researchers whose works are cited. Uncited work would be allowed to wither away. The fecundity of a line of research, not a mere list of publications, would become the measure of quality.
There is, in fact, a strong correlation between citation rates and the public perception of quality. A typical Harvard scientific paper, for example, is cited 25 times over a 15-year period, which puts Harvard at the top of the Ivy League citation sweepstakes, with Yale and Princeton not far behind. The Ivy League’s citation rate is significantly higher than that of the average paper in ISI’s database.
It is tempting to think that by directing limited resources only to cited lines of research, waste would be eliminated and the growth of knowledge would not be impaired. But there are dangers in using citation indexes to direct the growth of science.
New branches and deadwood
Who can tell which scientific research being done today will be the start of a fruitful new branch of the tree and which is destined to be deadwood? Significant work may go unrecognized for years following its publication. Without some possibly wasteful funding support, promising ideas may fade before they have a chance to establish themselves in a citation index.
More worrisome, science might become a self-perpetuating aristocracy rather than an open meritocracy. It may be necessary to tolerate a substantial amount of utterly fruitless research in order to ensure that science remains open to the gifted young, and to others working outside of elite research establishments.
It may be, after all, that science grows best wastefully and wild, without the horticulturist’s tidying hand.