
A Nobel prizewinner and the science that's never been cited

2017-12-15 蝌蚪士

Special statement


All manuscripts published on this platform are released for non-commercial educational and research purposes, with the aim of disseminating academic research information and improving the environment for university education and research. The article represents only the original author's personal views; it does not imply that this public account endorses those views or vouches for the accuracy of its content. In the event of any objection or infringement, the platform will address it promptly. We hope readers will follow and support the public-interest work of 《蝌蚪士》: speaking up for hard-pressed researchers and contributing honest intellectual effort, and offering free popular science to the general public, so that everyone can approach science, pass it on and help it grow, and become a genuinely knowledgeable 蝌蚪士. (Editor-in-chief: 赛德夫)



ILLUSTRATION BY SERGE BLOCH


The geneticist and Nobel prizewinner Oliver Smithies, who died in January aged 91, was a modest, self-effacing inventor. It was typical of him to trot out the tale of one of his greatest flops: a paper1 about measuring osmotic pressure published in 1953, which, as he put it, had “the dubious distinction of never being cited”.


“Nobody ever quoted it, and nobody ever used the method,” he told students at a 2014 meeting in Lindau, Germany.


As it happens, Smithies’ paper drew more attention than he realized: nine articles referenced it within a decade of its publication. But his mistake is understandable. Many scientists harbour false impressions about uncited research — both its extent and its impact on scholarship.


One widely repeated estimate, reported in a controversial article in Science in 1990, suggests that more than half of all academic articles remain uncited five years after their publication2. Scientists genuinely fret about this issue, says Jevin West, an information scientist at the University of Washington in Seattle who studies large-scale patterns in research literature. After all, citations are widely recognized as a standard measure of academic influence: a marker that work not only has been read, but also has proved useful to later studies. Researchers worry that high rates of uncitedness point to a heap of useless or irrelevant research. “I can’t tell you how many people over dinner have asked me: ‘How much of the literature is never cited?’” West says.


In reality, uncited research isn’t always useless. What’s more, there isn’t really that much of it, says Vincent Larivière, an information scientist at the University of Montreal in Canada.


To get a better handle on this dark and forgotten corner of published research, Nature dug into the figures to find out how many papers actually do go uncited (explore the full data set and methods). It is impossible to know for sure, because citation databases are incomplete. But it’s clear that, at least for the core group of 12,000 or so journals in the Web of Science — a large database owned by Clarivate Analytics in Philadelphia, Pennsylvania — zero-citation papers are much less prevalent than is widely believed.


Web of Science records suggest that fewer than 10% of scientific articles are likely to remain uncited. But the true figure is probably even lower, because large numbers of papers that the database records as uncited have actually been cited somewhere by someone.


This doesn’t necessarily mean that there is less low-quality research to worry about: thousands of journals aren’t indexed by the Web of Science, and concerns that scientists pad out their CVs with pointless papers remain very real.


But the new figures may reassure those dismayed by reports of oceans of neglected work. And a closer look at some uncited papers shows that they have use — and are read — despite having apparently been ignored. “Lack of citation cannot be interpreted as meaning articles are useless or valueless,” says David Pendlebury, a senior citations analyst at Clarivate.  


Myths of uncitedness

The idea that the literature is awash with uncited research goes back to a pair of articles in Science: the 1990 report2 mentioned earlier and another3 published in 1991. The 1990 report noted that 55% of articles published between 1981 and 1985 hadn't been cited in the 5 years after their publication. But those analyses are misleading, mainly because the publications they counted included documents such as letters, corrections, meeting abstracts and other editorial material, which wouldn't usually get cited. If these are removed, leaving only research papers and review articles, rates of uncitedness plummet. Extending the cut-off past five years reduces the rates even more.


In 2008, Larivière and colleagues took a fresh look at the Web of Science and reported not only that uncitedness was lower than believed, but also that the percentage of uncited papers had been falling for decades4. Nature asked Larivière, together with Cassidy Sugimoto at Indiana University Bloomington, to update and elaborate on that analysis for this article.


The new figures — which count research articles and reviews — suggest that in most disciplines, the proportion of papers attracting zero citations levels off between five and ten years after publication, although the proportion is different in each discipline (see ‘Uncited science’). Of all biomedical-sciences papers published in 2006, just 4% are uncited today; in chemistry, that number is 8% and in physics, it is closer to 11%. (When cases of researchers citing their own papers are removed, these proportions rise — in some disciplines, by half as much again.) In engineering and technology, the uncitedness rate of the 2006 cohort of Web of Science-indexed papers is 24%, much higher than in the natural sciences. This higher figure may relate to the technical nature of many of these reports, which solve specific problems rather than provide work for others to cumulatively build on, Larivière suggests.



[Chart: Uncited science. Source: V. Larivière & C. Sugimoto / Web of Science]
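For readers curious how such a figure is assembled, here is a minimal sketch in Python of the kind of counting involved. The record fields, the ten-year citation window and the restriction to articles and reviews are illustrative assumptions for this example; they do not reproduce the Web of Science schema or Larivière and Sugimoto's actual code.

# Illustrative sketch: estimating the share of uncited papers per field.
# Field names and the ten-year window are assumptions for the example only.
from collections import defaultdict

CITATION_WINDOW_YEARS = 10  # assumed cut-off; the article notes rates level off at 5-10 years

def uncited_share(papers, citation_pairs, drop_self_citations=False):
    """papers: iterable of dicts with 'id', 'year', 'doc_type', 'discipline', 'authors'.
    citation_pairs: iterable of (citing, cited) paper dicts with the same fields.
    Returns {discipline: fraction of articles/reviews with zero qualifying citations}."""
    cited_ids = set()
    for citing, cited in citation_pairs:
        # Count only citations arriving within the chosen window after publication.
        if citing["year"] - cited["year"] > CITATION_WINDOW_YEARS:
            continue
        # Optionally ignore authors citing their own earlier work.
        if drop_self_citations and set(citing["authors"]) & set(cited["authors"]):
            continue
        cited_ids.add(cited["id"])

    totals, uncited = defaultdict(int), defaultdict(int)
    for paper in papers:
        # Letters, corrections and meeting abstracts inflate uncitedness,
        # so keep only research articles and reviews, as described above.
        if paper["doc_type"] not in {"article", "review"}:
            continue
        totals[paper["discipline"]] += 1
        if paper["id"] not in cited_ids:
            uncited[paper["discipline"]] += 1

    return {field: uncited[field] / totals[field] for field in totals}

Running the same data set with and without self-citations removed yields two different rates per discipline, which is the comparison behind the "half as much again" observation above.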


For the literature as a whole — 39 million research papers across all disciplines recorded in the Web of Science from 1900 to the end of 2015 — some 21% haven’t yet been cited. Unsurprisingly, most of these uncited papers appear in little-known journals; almost all papers in well-known journals do get cited.


An impossible measurement

These data give only a partial picture. But filling in the blanks across the literature is an impractical task.


It’s hard enough to check a handful of papers. In 2012, for instance, Petr Heneberg, a biologist at Charles University in Prague, decided to examine the Web of Science records of 13 Nobel prizewinners5, to scrutinize a preposterous-sounding paper that claimed that around 10% of Nobel laureates’ research was uncited6. His first glance at the Web of Science suggested a number closer to 1.6%. Then, checking on Google Scholar, Heneberg saw that many of the remaining papers actually had been referenced by other works indexed in the Web of Science, but had been missed because of data-entry errors or typos in the papers. And there were additional citations in journals and books that the Web of Science never indexed. By the time Heneberg gave up searching, after about 20 hours of work, he had reduced the proportion another fivefold, to a mere 0.3%.
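Heneberg's clean-up was, in effect, a manual fuzzy match: linking misspelled or mistyped reference strings back to the papers they actually point to. Below is a rough sketch of that idea using Python's standard difflib module; the similarity threshold and the example record are invented for illustration and are not his procedure.

# Illustrative sketch: catching citations that exact matching misses because of
# typos or data-entry errors. The 0.9 threshold and the example record are
# arbitrary assumptions, not Heneberg's actual method.
from difflib import SequenceMatcher

def normalize(text):
    return " ".join(text.lower().split())

def probably_cites(reference_string, record, threshold=0.9):
    """Return True if a possibly mistyped reference string likely points to the
    paper described by record, a dict with 'authors', 'title' and 'year'."""
    target = normalize(f"{record['authors']} {record['title']} {record['year']}")
    return SequenceMatcher(None, normalize(reference_string), target).ratio() >= threshold

# Hypothetical record and a reference with two spelling errors that a strict
# database lookup would treat as pointing to a different, non-existent paper.
record = {"authors": "O. Smithies", "title": "Measuring osmotic pressure in protein solutions", "year": "1953"}
print(probably_cites("O. Smythies, Measuring osmotic presure in protein solutions, 1953", record))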


Such flaws are why the true number of never-cited papers can’t be known: it would take too long to repeat Heneberg’s manual checks on such a grand scale. Disciplines also vary in how much they are affected by these flaws in measurement. The Web of Science records, for instance, that 65% of humanities papers published in 2006 have not yet been cited. It’s true that a lot of the humanities literature doesn’t get referenced — in part because, compared with the sciences, new research is less dependent on the cumulative knowledge of what goes before. But the Web of Science doesn’t accurately reflect the field, because it neglects many of its journals and books.


The same kinds of consideration bedevil comparisons between nations. The Web of Science shows that papers authored by scientists in China, India and Russia are more likely to be ignored than are those written in the United States and Europe. But the database doesn’t track many regional-language journals that, if taken into consideration, would narrow the gap, says Larivière.


Despite the caveats about absolute figures, the decline in uncitedness within the Web of Science is a robust pattern, Larivière says. The Internet has made it so much easier to find and cite relevant papers, he says. (It’s possible that a drive to make articles open-access is also helping.) But Larivière cautions against reading too much into the trend. He and others found in a 2009 study7 that rates of uncitedness are falling because scientists publish a larger volume of papers and stuff more references into their articles. Bibliometrics researcher Ludo Waltman, at Leiden University in the Netherlands, agrees. “I would not tend to interpret these figures as reassurance that more of our scientific work is providing a useful purpose.”





Waltman says that many papers escape uncitedness very narrowly: independent calculations from Waltman and Larivière show that papers on the Web of Science with only one or two citations outnumber those that have zero. “And we know that many citations are quite superficial or perfunctory,” he says. Or they could be a sign of academics scratching each other’s backs, says Dahlia Remler, a health economist at the Marxe School of Public and International Affairs in New York City. “Even highly cited research could be a game that academics play together that serves no one’s interest,” she says.


Not totally pointless

Some researchers might still be tempted to dismiss uncited papers as irrelevant. After all, if they mattered — even a little bit — wouldn’t someone have mentioned them?


Probably, but not always. Academics are influenced by many more papers than they actually cite, says Michael MacRoberts, a botanist at Louisiana State University in Shreveport. In a 2010 article8 on the shortcomings of citation analysis, MacRoberts referenced his own 1995 paper9 about the discovery of nodding club-moss (Palhinhaea cernua) in Texas. It was the first and only time the paper had been cited, but the information in it has been recorded in plant atlases and large online databases; those who use these databases are relying on this paper and thousands of botanical reports like it. “The information in these so-called uncited articles is used; it is just not being cited,” he says.


And uncited articles are still being read. In 2010, researchers at New York City’s Department of Health and Mental Hygiene published a study that used software to analyse performance glitches in a saliva-based HIV test10. A few years earlier, use of the kit had been suspended in clinics, although it was later reinstated. The authors wanted to use the clinics’ experience as a case study, to ask whether the software could have been used to analyse the performance of the kits when problems arose.


TALES OF THE UNCITED

The long wait
For any researcher who ever wished that one of their papers would eventually start gathering citations, there's hope in the tale of Albert Peck, whose 1926 paper13 characterizing a kind of defect in glass attracted its first citation in 2014. In the 1950s, the paper became superfluous because manufacturers worked out how to make smooth glass with no such defects. But in 2014, materials researcher Kevin Knowles at the University of Cambridge, UK, came across the paper on Google while doing a survey of the field for his work on using such defects as a way to diffuse light. He has now cited it in four articles. “I like writing papers where I can pick out obscure articles,” he says.

The missed wave
Doctoral student Francisco Pina-Martins at the University of Lisbon published a paper on interpreting genetic-sequence data in 2016 that he's pretty sure will never be cited, because the technology it refers to, made by the biotech firm 454 Life Sciences, is obsolete and has been phased out. He had uploaded his data-analysis software to the GitHub code-sharing website in 2012 — and that has been referenced in a few papers. But it took four years for the research to get published, largely, he says, because it relates to a rare problem that peer reviewers didn't understand.

The blind alley
Many stories of uncited papers are unhappy ones.


In 2010, neuroscientist Adriano Ceccarelli published a paper in PLoS ONE about gene regulation in Dictyostelium slime moulds. His applications for grants to follow up the research failed, and the paper has never been cited. “You know how research goes — it turns out this was a blind direction,” he says. “My ideas are not valuable in terms of funding. Now I’m just teaching and waiting for retirement. I would be very keen to do the work if I get funded tomorrow.”


The health department team's paper, published in the journal PLoS ONE, has never been cited. But it has been viewed more than 1,500 times and downloaded almost 500 times, notes Joe Egger, a co-author of the paper who is now at the Duke Global Health Institute in Durham, North Carolina. “The goal of the article was to improve public-health practice, not really to move a scientific field,” he says.


Still other articles might remain uncited because they close off unproductive avenues of research, says Niklaas Buurma, a chemist at Cardiff University, UK. In 2003, Buurma and colleagues published a paper about ‘the isochoric controversy’ — an argument about whether it would be useful to stop a solvent from contracting or expanding during a reaction, as usually occurs when temperatures change. In theory, this technically challenging experiment might offer insight into how solvents influence chemical reaction rates. But Buurma’s tests showed that chemists don’t learn new information from this type of experiment. “We set out to show that something was not worth doing — and we showed it,” he says. “I am quite proud of this as a fully uncitable paper,” he adds.


Oliver Smithies, speaking at the Lindau meeting, said that he recognized the value of his 1953 paper, even though he thought it hadn’t been cited. The work behind it, he told his audience, helped him to earn his PhD and become a fully fledged scientist. In essence, it represented a future Nobel prizewinner’s apprenticeship. “I enjoyed doing it,” he said, “and I learned to do good science.” Smithies does have at least one truly uncited paper in his back catalogue: a 1976 article which showed that a particular immune-system gene was located on human chromosome 15. But even that was important for other reasons, says geneticist Raju Kucherlapati, at Harvard Medical School in Boston, Massachusetts, who co-authored the paper. The article, he says, was the start of a long-lasting collaboration with Smithies’ lab, culminating in work on mouse genetics that would earn Smithies the 2007 Nobel Prize in Physiology or Medicine. “For me,” says Kucherlapati, “the significance of that paper was that I got to know Oliver.”




