Do Pressures to Publish Increase Scientists' Bias? An Empirical Support from US States Data
Open Access
- Published 21 April 2010
- Research article
- Published by Public Library of Science (PLoS) in PLOS ONE
- Vol. 5 (4), e10271
- https://doi.org/10.1371/journal.pone.0010271
Abstract
The growing competition and “publish or perish” culture in academia might conflict with the objectivity and integrity of research, because it forces scientists to produce “publishable” results at all costs. Papers are less likely to be published and to be cited if they report “negative” results (results that fail to support the tested hypothesis). Therefore, if publication pressures increase scientific bias, the frequency of “positive” results in the literature should be higher in the more competitive and “productive” academic environments. This study verified this hypothesis by measuring the frequency of positive results in a large random sample of papers with a corresponding author based in the US. Across all disciplines, papers were more likely to support a tested hypothesis if their corresponding authors were working in states that, according to NSF data, produced more academic papers per capita. The size of this effect increased when controlling for each state's per capita R&D expenditure and for study characteristics that previous research has shown to correlate with the frequency of positive results, including discipline and methodology. Although the confounding effect of institutions' prestige could not be excluded (researchers in the more productive universities could be the most clever and successful in their experiments), these results support the hypothesis that competitive academic environments increase not only scientists' productivity but also their bias. The same phenomenon might be observed in other countries where academic competition and pressures to publish are high.
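The study's central comparison can be sketched as follows: compare the proportion of hypothesis-supporting (“positive”) papers between high- and low-productivity states and express the difference as an odds ratio. This is a minimal illustration only; all counts below are hypothetical and are not the paper's data, and the actual analysis additionally controls for discipline, methodology, and R&D expenditure.

```python
def odds(p):
    """Odds corresponding to a proportion p (0 < p < 1)."""
    return p / (1.0 - p)

# Hypothetical counts: (papers reporting a positive result, papers sampled).
high_productivity = (88, 100)   # states producing many academic papers per capita
low_productivity = (78, 100)    # states producing few academic papers per capita

p_high = high_productivity[0] / high_productivity[1]
p_low = low_productivity[0] / low_productivity[1]

# An odds ratio > 1 would be consistent with the hypothesis that more
# competitive ("productive") environments yield more positive results.
odds_ratio = odds(p_high) / odds(p_low)
print(f"positive-result rate (high-productivity states): {p_high:.2f}")
print(f"positive-result rate (low-productivity states):  {p_low:.2f}")
print(f"odds ratio: {odds_ratio:.2f}")
```

In the paper itself this comparison is made with regression models rather than a single 2×2 split, which is what allows the per-capita-productivity effect to be estimated while holding the confounders fixed.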