The scientific literature paints a rosy picture of research. Observations from others are synthesized to form new hypotheses, which are subsequently tested in a linear fashion. It can appear as if the story were already written, and all that remains is performing the actual experiments. This rosy picture stems in part from positive publication bias, i.e., the selective publication of positive results over negative results.
Reality paints a different picture, one in which experimental methods and results more often find themselves tucked away in a file drawer than published in a peer-reviewed journal. Sometimes experiments produce negative results, and negative results make for boring stories. Other times, marginal conceptual or technical advances fail to meet the high threshold for publication in standard peer-reviewed journals. Even worse, some datasets remain completely untouched due to factors such as lack of time or technical capability.
One solution to the “file drawer problem” of science is to make it easier to publish orphan data and methods. This article summarizes the forms these data can take and outlines innovative publishing venues for sharing such data.
What is Orphan Data?
Orphan data refers to useful scientific products that are unpublishable in conventional, peer-reviewed journals. This includes negative data, confirmatory (i.e., not novel) data, single observations, method customizations, and large datasets.
Why Publish Orphan Data?
Two words: reproducibility crisis. Storytelling favors positive results and leads to the positive publication bias that permeates the scientific literature. Scientists are encouraged to publish results that show statistically significant differences between experimental conditions (i.e., positive results) over those that do not. This pressure drives conscious or subconscious efforts to manipulate data until they yield positive results, a practice known as p-hacking. Practices like p-hacking can increase the rate at which false claims are canonized as fact. On the other hand, relegating negative results to the file drawer wastes the time and resources of those chasing the same hypotheses.
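The inflationary effect of p-hacking is easy to see in a toy simulation (all function names and parameters below are our own, purely for illustration). When there is no real effect, an honest analyst who tests one pre-registered outcome is fooled about 5% of the time; an analyst who measures several outcomes and reports only the best-looking one is fooled far more often.

```python
import math
import random

def two_sample_p(a, b):
    """Two-sided p-value via a normal (z) approximation to the two-sample
    t-test -- adequate for illustration at ~30 samples per group."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    z = (ma - mb) / math.sqrt(va / na + vb / nb)
    return math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))

def false_positive_rates(n_studies=2000, n=30, k_outcomes=5, seed=1):
    """Simulate studies with NO true effect (both groups drawn from the
    same distribution). The honest analyst tests one pre-registered
    outcome; the p-hacker measures k outcomes and reports the smallest p."""
    rng = random.Random(seed)
    honest = hacked = 0
    for _ in range(n_studies):
        ps = []
        for _ in range(k_outcomes):
            a = [rng.gauss(0, 1) for _ in range(n)]
            b = [rng.gauss(0, 1) for _ in range(n)]
            ps.append(two_sample_p(a, b))
        honest += ps[0] < 0.05    # first (pre-registered) outcome only
        hacked += min(ps) < 0.05  # cherry-pick the best-looking outcome
    return honest / n_studies, hacked / n_studies

if __name__ == "__main__":
    h, p = false_positive_rates()
    print(f"Honest false positive rate:   {h:.3f}")  # ~0.05
    print(f"P-hacked false positive rate: {p:.3f}")  # ~0.23, roughly 1 - 0.95**5
```

With five independently analyzed outcomes, the chance that at least one clears p < 0.05 by luck alone is roughly 1 − 0.95⁵ ≈ 23% — more than four times the nominal error rate, without any deliberate fraud.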
In addition, time and resources might be saved by publishing method customizations, curated datasets, and single observations. These useful scientific products are too frequently buried in lab notebooks when they could prove valuable to other scientists. The more scientists share, the easier science is for everyone.
Journals and Other Venues for Orphan Data
The following list focuses on innovative publishing platforms not discussed in our previous article on where to publish negative results.
Pre-print Servers (arXiv, bioRxiv, medRxiv)
Pre-print servers are essentially free archives for unpublished scientific work. Scientists can submit their work to these servers at any stage, authors can update their submissions, and others can comment on them. Pre-prints can often still be published later in peer-reviewed journals. Originally limited to physics and mathematics (arXiv), pre-print servers now also exist for the biological (bioRxiv) and health sciences (medRxiv). Scientists often post to pre-print servers to establish the priority and novelty of their work when independent groups are working on the same problem.
Micro-Publications (BMC Research Notes, Science Matters)
Instead of focusing on storytelling, micro-publications provide a forum for individual scientific observations. These publications are typically limited to a single data figure, although supplemental data can be included as support. Examples include BMC Research Notes and the newer alternative Science Matters. Science Matters offers several features meant to solve problems associated with conventional publishing venues. For example, it uses a triple-blind peer-review process in which editors, authors, and reviewers remain anonymous to one another throughout the publication process.
Method Customizations (MethodsX, Nature Protocols Exchange)
There are many journals for methods papers, yet few accept the marginal methodological customizations that scientists use to streamline their experiments. MethodsX was created to provide such a forum. MethodsX publishes methods from any scientific discipline. However, method customizations must be benchmarked against existing techniques to demonstrate their usefulness. Alternatively, any protocol can be shared in Nature Protocols Exchange, an open repository that functions much like a pre-print server in that there is no peer review.
Datasets (Scientific Data)
Many scientists generate large datasets that have exceptional reuse potential. While data repositories such as the Gene Expression Omnibus (GEO) have existed for a while, these repositories are not peer-reviewed. Valuable datasets can now be published in peer-reviewed “data journals”, such as Scientific Data.
Megajournals (PLoS ONE, Scientific Reports)
Megajournals like PLoS ONE and Scientific Reports function roughly in the same manner as other conventional, peer-reviewed publications with one main exception. Instead of emphasizing novelty and conceptual advance, these journals focus on the technical soundness of submissions. They also lack minimum paper length requirements.
Read the Fine Print!
Before submitting your work to these venues, make sure you understand the fine print. Pre-print servers and protocol/data repositories are not peer-reviewed and technically do not count as bona fide “publications” (although there are many instances in which pre-prints are cited in peer-reviewed work). Other journals (e.g., Science Matters) lack consistent indexing on PubMed and other publication databases, making it more difficult for others to find your work. The age and popularity of a journal should also be considered, as some fledgling publication venues can fade into obscurity due to a lack of interest from the scientific community. Finally, because many of these venues are open access, the peer-reviewed ones charge (sometimes hefty) open-access fees for publication.
Overall, these innovative publishing forums fill the gaps left by conventional scientific publishing practices and offer scientists new solutions for sharing their painstaking work. Although these publications are unconventional, the principles of good manuscript writing still apply. For more advice on how to write a manuscript, follow these steps, and don’t forget the common sins of scientific publishing!
Do you know of any publication venues that did not make our list? Let us know in the comments below.
- Rosenthal R. The file drawer problem and tolerance for null results. Psychological Bulletin 86(3), 638–641 (1979). doi: 10.1037/0033-2909.86.3.638
- Nissen SB, Magidson T, Gross K, Bergstrom CT. Publication bias and the canonization of false facts. eLife 5, e21451 (2016). doi: 10.7554/eLife.21451