
How open science helps researchers succeed
Erin C McKiernan
Philip E Bourne
C Titus Brown
Stuart Buck
Amye Kenall
Jennifer Lin
Damon McDougall
Brian A Nosek
Karthik Ram
Courtney K Soderberg
Jeffrey R Spies
Kaitlin Thaney
Andrew Updegrove
Kara H Woo
Tal Yarkoni
Email: emckiernan@ciencias.unam.mx
Received 2016 Apr 8; Accepted 2016 Jul 4.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Abstract
Open access, open data, open source and other open scholarship practices are growing in popularity and necessity. However, widespread adoption of these practices has not yet been achieved. One reason is that researchers are uncertain about how sharing their work will affect their careers. We review literature demonstrating that open research is associated with increases in citations, media attention, potential collaborators, job opportunities and funding opportunities. These findings are evidence that open research practices bring significant benefits to researchers relative to more traditional closed practices.
DOI: http://dx.doi.org/10.7554/eLife.16800.001
Introduction
Recognition and adoption of open research practices is growing, including new policies that increase public access to the academic literature (open access; Björk et al., 2014; Swan et al., 2015) and encourage sharing of data (open data; Heimstädt et al., 2014; Michener, 2015; Stodden et al., 2013) and code (open source; Stodden et al., 2013; Shamir et al., 2013). Such policies are often motivated by ethical, moral or utilitarian arguments (Suber, 2012; Willinsky, 2006), such as the right of taxpayers to access literature arising from publicly-funded research (Suber, 2003), or the importance of public software and data deposition for reproducibility (Poline et al., 2012; Stodden, 2011; Ince et al., 2012). Meritorious as such arguments may be, however, they do not address the practical barriers involved in changing researchers’ behavior, such as the common perception that open practices could present a risk to career advancement. In the present article, we address such concerns and suggest that the benefits of open practices outweigh the potential costs.
We take a researcher-centric approach in outlining the benefits of open research practices. Researchers can use open practices to their advantage to gain more citations, media attention, potential collaborators, job opportunities and funding opportunities. We address common myths about open research, such as concerns about the rigor of peer review at open access journals, risks to funding and career advancement, and forfeiture of author rights. We recognize the current pressures on researchers, and offer advice on how to practice open science within the existing framework of academic evaluations and incentives. We discuss these issues with regard to four areas – publishing, funding, resource management and sharing, and career advancement – and conclude with a discussion of open questions.
Publishing
Open publications get more citations
There is evidence that publishing openly is associated with higher citation rates (Hitchcock, 2016). For example, Eysenbach reported that articles published in the Proceedings of the National Academy of Sciences (PNAS) under their open access (OA) option were twice as likely to be cited within 4–10 months, and nearly three times as likely to be cited 10–16 months after publication, as non-OA articles published in the same journal (Eysenbach, 2006). Hajjem and colleagues studied over 1.3 million articles published in 10 different disciplines over a 12-year period and found that OA articles had a 36–172% advantage in citations over non-OA articles (Hajjem et al., 2006). While some controlled studies have failed to find a difference in citations between OA and non-OA articles, or have attributed differences to factors other than access (Davis, 2011; Davis et al., 2008; Frandsen, 2009a; Gaulé and Maystre, 2011; Lansingh and Carter, 2009), a larger number of studies confirm the OA citation advantage. Of 70 studies registered as of June 2016 in the Scholarly Publishing and Academic Resources Coalition (SPARC) Europe database of citation studies, 46 (66%) found an OA citation advantage, 17 (24%) found no advantage, and 7 (10%) were inconclusive (SPARC Europe, 2016). Numerical estimates of the citation advantage in two reviews range from -5% to 600% (Swan, 2010) and 25% to 250% (Wagner, 2010). The size of the advantage observed often depends on discipline (Figure 1). Importantly, the OA citation advantage can be conferred regardless of whether articles are published in fully OA journals, in subscription journals with OA options (hybrid journals), or self-archived in open repositories (Eysenbach, 2006; Hajjem et al., 2006; Gargouri et al., 2010; Research Information Network, 2014; Wang et al., 2015; Swan, 2010; Wagner, 2010). Moreover, at least in some cases, the advantage is not explained by selection bias (i.e., authors deliberately posting their better work to open platforms), as openly archived articles receive a citation advantage regardless of whether archiving is initiated by the author or mandated by an institution or funder (Gargouri et al., 2010; Xia and Nakanishi, 2012).
Figure 1. Open access articles get more citations.
The relative citation rate (OA: non-OA) in 19 fields of research. This rate is defined as the mean citation rate of OA articles divided by the mean citation rate of non-OA articles. Multiple points for the same discipline indicate different estimates from the same study, or estimates from several studies. References by discipline: Agricultural studies (Kousha and Abdoli, 2010); Physics/astronomy (Gentil-Beccot et al., 2010; Harnad and Brody, 2004; Metcalfe, 2006); Medicine (Sahu et al., 2005; Xu et al., 2011); Computer science (Lawrence, 2001); Sociology/social sciences (Hajjem et al., 2006; Norris et al., 2008; Xu et al., 2011); Psychology (Hajjem et al., 2006); Political science (Hajjem et al., 2006; Antelman, 2004; Atchison and Bull, 2015); Management (Hajjem et al., 2006); Law (Donovan et al., 2015; Hajjem et al., 2006); Economics (Hajjem et al., 2006; McCabe and Snyder, 2015; Norris et al., 2008; Wohlrabe, 2014); Mathematics (Antelman, 2004; Davis and Fromerth, 2007; Norris et al., 2008); Health (Hajjem et al., 2006); Engineering (Antelman, 2004; Koler-Povh et al., 2014); Philosophy (Antelman, 2004); Education (Hajjem et al., 2006; Zawacki-Richter et al., 2010); Business (Hajjem et al., 2006; McCabe and Snyder, 2015); Communication studies (Zhang, 2006); Ecology (McCabe and Snyder, 2014; Norris et al., 2008); Biology (Frandsen, 2009b; Hajjem et al., 2006; McCabe and Snyder, 2014).
Open publications get more media coverage
One way for researchers to gain visibility is for their publications to be shared on social media and covered by mainstream media outlets. There is evidence that publishing articles openly can help researchers get noticed. A study of over 2,000 articles published in Nature Communications showed that those published openly attracted nearly twice as many unique tweeters and Mendeley readers as closed-access articles (Adie, 2014a). A similar study of over 1,700 Nature Communications articles found that OA articles received 2.5–4.4 times the number of page views, and garnered more social media attention via Twitter and Facebook, than non-OA articles (Wang et al., 2015). There is tentative evidence that news coverage confers a citation advantage. For example, a small quasi-experimental 1991 study found that articles covered by the New York Times received up to 73% more citations than those not covered (Phillips et al., 1991). A 2003 correlational study supported these results, reporting higher citation rates for articles covered by the media (Kiernan, 2003).
Prestige and journal impact factor
As Sydney Brenner wrote in 1995, "…what matters absolutely is the scientific content of a paper and…nothing will substitute for either knowing it or reading it" (Brenner, 1995). Unfortunately, academic institutions often rely on proxy metrics, like journal impact factor (IF), to quickly evaluate researchers’ work. The IF is a flawed metric that correlates poorly with the scientific quality of individual articles (Brembs et al., 2013; Neuberger and Counsell, 2002; PLOS Medicine Editors, 2006; Seglen, 1997). In fact, several of the present authors have signed the San Francisco Declaration on Research Assessment (SF-DORA), which recommends that IF not be used as a research evaluation metric (American Society for Cell Biology, 2013). However, until institutions cease using IF in evaluations, researchers will understandably be concerned about the IF of journals in which they publish. In author surveys, researchers repeatedly rank IF and associated journal reputation among the most important factors they consider when deciding where to publish (Nature Publishing Group, 2015; Solomon, 2014). Researchers are also aware of the prestige that can accompany publication in high-IF journals such as Nature or Science. Thus, OA advocates should recognize and respect the pressures on researchers to select publishing outlets based, at least in part, on IF.
Fortunately, concerns about IF need not prevent researchers from publishing openly. For one thing, the IFs of indexed OA journals are steadily approaching those of subscription journals (Björk and Solomon, 2012). In the 2012 Journal Citation Report, over 1,000 (13%) of the journals listed with IFs were OA (Gunasekaran and Arunachalam, 2014). Of these OA journals, thirty-nine had IFs over 5.0 and nine had IFs over 10.0. Examples of OA journals in the biological and medical sciences with moderate to high 2015 IFs include PLOS Medicine (13.6), Nature Communications (11.3), and BioMed Central’s Genome Biology (11.3). The Cofactor Journal Selector Tool allows authors to search for OA journals with an IF (Cofactor Ltd, 2016). We reiterate that our goal in providing such information is not to support IF as a valid measure of scholarly impact, but to demonstrate that researchers do not have to choose between IF and OA when making publishing decisions.
In addition, many subscription-based high-IF journals offer authors the option to pay to make their articles openly accessible. While one can debate the long-term viability and merits of a model that allows publishers to effectively reap both reader-paid and author-paid charges (Björk, 2012), in the short term, researchers who wish to publish their articles openly in traditional journals can do so. Researchers can also publish in high-IF subscription journals and self-archive openly (see section "Publish where you want and archive openly"). We hope that in the next few years, use of IF as a metric will diminish or cease entirely, but in the meantime, researchers have options to publish openly while still meeting any IF-related evaluation and career advancement criteria.
Rigorous and transparent peer review
Unlike most subscription journals, several OA journals have open and transparent peer review processes. Journals such as PeerJ and the Royal Society’s Open Science give reviewers the opportunity to sign their reviews and offer authors the option to publish the full peer review history alongside their articles. In 2014, PeerJ reported that 40% of reviewers sign their reports and 80% of authors choose to make their review history public (PeerJ Staff, 2014). BioMed Central’s GigaScience, all the journals in BMC’s medical series, Copernicus journals, F1000Research, and MDPI’s Life require that reviewer reports be published, either as part of a prepublication review process or subsequent to publication. Some studies suggest open peer review may produce reviews of higher quality, including better substantiated claims and more constructive criticisms, compared to closed review (Kowalczuk et al., 2013; Walsh et al., 2000). Additional studies have argued that transparent peer review processes are linked to measures of quality (Wicherts, 2016). Other studies have reported no differences in the quality of open versus closed reviews (van Rooyen et al., 1999; van Rooyen et al., 2010). More research in this area is needed.
Unfortunately, the myth that OA journals have poor or non-existent peer review persists. This leads many to believe that OA journals are low quality, and causes researchers to be concerned that publishing in these venues will be considered less prestigious in academic evaluations. To our knowledge, there has been no controlled study comparing peer review in OA versus subscription journals. Studies cited as evidence of weak peer review at OA journals, such as the John Bohannon ‘sting’ (Bohannon, 2013) in which a fake paper was accepted by several OA journals, have been widely criticized in the academic community for poor methodology, including the failure to submit to subscription journals for comparison (Joseph, 2013; Redhead, 2013). In fact, Bohannon admitted, "Some open-access journals that have been criticized for poor quality control provided the most rigorous peer review of all". He cites PLOS ONE as an example, saying it was the only journal to raise ethical concerns with his submitted work (Bohannon, 2013).
Subscription journals have not been immune to problems with peer review. In 2014, Springer and IEEE retracted over 100 published fake articles from several subscription journals (Van Noorden, 2014; Springer, 2014). Poor editorial practices at one SAGE journal opened the door to peer review fraud that eventually led to 60 articles being retracted (Bohannon, 2014; Journal of Vibration and Control, 2014). Similar issues in other subscription journals have been documented by Retraction Watch (Oransky and Marcus, 2016). Problems with peer review thus clearly exist, but are not exclusive to OA journals. Indeed, large-scale empirical analyses indicate that the reliability of the traditional peer review process itself leaves much to be desired. Bornmann and colleagues reviewed 48 studies of inter-reviewer agreement and found that the average level of agreement was low (mean ICC of .34 and Cohen’s kappa of .17), well below what would be considered adequate in psychometrics or other fields focused on quantitative assessment (Bornmann et al., 2010). Opening up peer review, including allowing for real-time discussions between authors and reviewers, could help address some of these issues.
Over time, we expect that transparency will help dispel the myth of poor peer review at OA journals, as researchers read reviews and confirm that the process is typically as rigorous as that of subscription journals. Authors can use open reviews to demonstrate to academic committees the rigor of the peer review process in the venues where they publish, and highlight reviewer comments on the importance of their work. Researchers in their capacity as reviewers can also benefit from an open approach, as it allows them to get credit for this valuable service. Platforms like Publons let researchers create reviewer profiles to showcase their work (Publons, 2016).
Publish where you want and archive openly
Some researchers may not see publishing in OA journals as a viable option, and may wish instead to publish in specific subscription journals seen as prestigious in their field. Importantly, there are ways to openly share work while still publishing in subscription journals.
Preprints: Authors may provide open access to their papers by posting them as preprints prior to formal peer review and journal publication. Preprint servers are free both for authors to post to and for readers. Several archival preprint servers exist covering different subject areas (Table 1). (Note: the list in Table 1 is not all-inclusive; there are many other servers and institutional repositories that also accept preprints.)
Table 1.
Preprint servers and general repositories accepting preprints.
| Preprint server or repository* | Subject areas | Repository open source? | Public API? | Can leave feedback?† | Third party persistent ID? |
|---|---|---|---|---|---|
| arXiv (arxiv.org) | physics, mathematics, computer science, quantitative biology, quantitative finance, statistics | No | Yes | No | No‡ |
| bioRxiv (biorxiv.org) | biology, life sciences | No | No | Yes | Yes (DOI) |
| CERN Document Server (cds.cern.ch) | high-energy physics | Yes (GPL) | Yes | No | No |
| Cogprints (cogprints.org) | psychology, neuroscience, linguistics, computer science, philosophy, biology | No | Yes | No | No |
| EconStor (econstor.eu) | economics | No | Yes | No | Yes (Handle) |
| e-LiS (eprints.rclis.org) | library and information sciences | No§ | Yes | No | Yes (Handle) |
| figshare (figshare.com) | general repository for all disciplines | No | Yes | Yes | Yes (DOI) |
| Munich Personal RePEc Archive (mpra.ub.uni-muenchen.de) | economics | No¶ | Yes | No | No |
| Open Science Framework (osf.io) | general repository for all disciplines | Yes (Apache 2) | Yes | Yes | Yes (DOI/ARK) |
| PeerJ Preprints (peerj.com/archives-preprints) | biological, life, medical, and computer sciences | No | Yes | Yes | Yes (DOI) |
| PhilSci Archive (philsci-archive.pitt.edu) | philosophy of science | No** | Yes | No | No |
| Self-Journal of Science (www.sjscience.org) | general repository for all disciplines | No | No | Yes | No |
| Social Science Research Network (ssrn.com) | social sciences and humanities | No | No | Yes | Yes (DOI) |
| The Winnower (thewinnower.com) | general repository for all disciplines | No | No | Yes | Yes (DOI)†† |
| Zenodo (zenodo.org) | general repository for all disciplines | Yes (GPLv2) | Yes | No | Yes (DOI) |
* All these servers and repositories are indexed by Google Scholar.
† Most, if not all, of those marked 'Yes' require some type of login or registration to leave comments.
‡ arXiv provides internally managed persistent identifiers.
§ e-LiS is built on open source software (EPrints), but the repository itself, including modifications to the code, plugins, etc. is not open source.
¶ MPRA is built on open source software (EPrints), but the repository itself, including modifications to the code, plugins, etc. is not open source.
** PhilSci Archive is built on open source software (EPrints), but the repository itself, including modifications to the code, plugins, etc. is not open source.
†† The Winnower charges a $25 fee to assign a DOI.
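Several of the servers in Table 1 expose a public API, which means preprint metadata can be harvested programmatically. As a brief illustration, the following sketch (Python, standard library only) queries arXiv's public query interface at export.arxiv.org/api/query; the search term and result count are arbitrary examples, not part of the original table.

```python
import urllib.request
import xml.etree.ElementTree as ET

# Query arXiv's public API for quantitative-biology preprints
# matching a search term (term and result count chosen arbitrarily).
url = ("http://export.arxiv.org/api/query?"
       "search_query=cat:q-bio.GN+AND+all:sequencing&max_results=5")
with urllib.request.urlopen(url) as response:
    feed = response.read()

# The API returns an Atom feed; pull each entry's title and link.
ns = {"atom": "http://www.w3.org/2005/Atom"}
for entry in ET.fromstring(feed).findall("atom:entry", ns):
    title = " ".join(entry.find("atom:title", ns).text.split())
    print(title)
    print("  " + entry.find("atom:id", ns).text)
```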
Many journals allow posting of preprints, including Science, Nature, and PNAS, as well as most OA journals. Journal preprint policies can be checked via Wikipedia (Wikipedia, 2016) and SHERPA/RoMEO (SHERPA/RoMEO, 2016). Of the over 2,000 publishers in the SHERPA/RoMEO database, 46% explicitly allow preprint posting. Preprints can be indexed in Google Scholar and cited in the literature, allowing authors to accrue citations while the paper is still in review. In one extreme case, one of the present authors (CTB) published a preprint that has received over 50 citations in three years (Brown et al., 2012), and was acknowledged in NIH grant reviews.
In some fields, preprints can establish scientific priority. In physics, astronomy, and mathematics, preprints have become an integral part of the research and publication workflow (Brown, 2001; Larivière et al., 2014; Gentil-Beccot et al., 2010). Physics articles posted as preprints prior to formal publication tend to receive more citations than those published only in traditional journals (Gentil-Beccot et al., 2010; Schwarz and Kennicutt Jr, 2004; Metcalfe, 2006). Unfortunately, because of the slow adoption of preprints in the biological and medical sciences, few if any studies have examined the citation advantage conferred by preprints in these fields. However, the growing number of submissions to the quantitative biology section of arXiv, as well as to dedicated biology preprint servers such as bioRxiv and PeerJ PrePrints, should make such studies feasible. Researchers have argued for increased use of preprints in biology (Desjardins-Proulx et al., 2013). The recent Accelerating Science and Publication in biology (ASAPbio) meeting demonstrated growing interest in, and support for, life science preprints from researchers, funders, and publishers (Berg et al., 2016; ASAPbio, 2016).
Postprints: Authors can also archive articles on open platforms after publication in traditional journals (postprints). SHERPA/RoMEO allows authors to check policies from over 2,200 publishers, 72% of which allow authors to archive postprints, either in the form of the author’s accepted manuscript post-peer review or the publisher’s formatted version, depending on the policy (SHERPA/RoMEO, 2016). A notable example is Science, which allows authors to immediately post the accepted version of their manuscript on their website, and to post to larger repositories like PubMed Central six months after publication. The journal Nature likewise allows archiving of the accepted article in open repositories six months after publication.
If the journal in which authors publish does not formally support self-archiving, authors can submit an author addendum that allows them to retain rights to post a copy of their article in an open repository. The Scholarly Publishing and Academic Resources Coalition (SPARC) provides a template addendum, as well as information on author rights (SPARC, 2016). The Scholar’s Copyright Addendum Engine helps authors generate a customized addendum to send to publishers (Science Commons, 2016). Not all publishers will accept author addenda, but some are willing to negotiate the terms of their publishing agreements.
Retain author rights and control reuse with open licenses
To make their findings known to the world, scientists have historically forfeited ownership of the products of their intellectual labor by signing over their copyrights or granting exclusive reuse rights to publishers. In contrast, authors publishing in OA journals retain nearly all rights to their manuscripts and materials. OA articles are typically published under Creative Commons (CC) licenses, which function within the legal framework of copyright law (Creative Commons, 2016). Under these licenses, authors retain copyright, and simply grant specific (non-exclusive) reuse rights to publishers, as well as other users. Moreover, CC licenses require attribution, which allows authors to receive credit for their work and accumulate citations. Licensors can specify that attribution include not just the name of the author(s) but also a link back to the original work. Authors submitting work to an OA journal should review its submission rules to learn what license(s) the journal permits authors to select.
If the terms of a CC license are violated by a user, the licensor can revoke the license and, if the revocation is not honored, take legal action to enforce their copyright. There are several legal precedents upholding CC licenses, including: (1) Adam Curry v. Audax Publishing (Court of Amsterdam, 2006; Garlick, 2006a); (2) Sociedad General de Autores y Editores (SGAE) v. Ricardo Andrés Utrera Fernández (Juzgado de Primera Instancia Número Seis de Badajoz, España, 2006; Garlick, 2006b); and (3) Gerlach v. Deutsche Volksunion (DVU) (Linksvayer, 2011). Through open licensing, researchers thus retain control over how their work is read, shared, and used by others.
An emerging and interesting development is the adoption of rights-retention open access policies (Harvard Open Access Project, 2016). To date, such policies have been adopted by at least 60 schools and institutions worldwide, including some in Canada, Iceland, Kenya, and Saudi Arabia, and U.S. universities like Harvard (Harvard Library, Office for Scholarly Communication, 2016) and MIT (MIT Libraries, Scholarly Publishing, 2016). These policies involve an agreement by the faculty to grant the university non-exclusive reuse rights on future published works. By putting such a policy in place prior to publication, faculty work can be openly archived without the need to negotiate with publishers to retain or recover rights; open is the default. We expect adoption of such policies to grow in coming years.
Publish for low-cost or no-cost
Researchers often cite high costs, primarily in the form of article processing charges (APCs), as a barrier to publishing in OA journals. While some publishers – subscription as well as OA – do charge steep fees (Lawson, 2016; Wellcome Trust, 2016c), many others charge nothing at all. In a 2014 study of 1,357 OA journals, 71% did not request any APC (West et al., 2014). A study of over 10,300 OA journals from 2011 to 2015 likewise found that 71% did not charge (Crawford, 2016). Eigenfactor.org maintains a list of hundreds of no-fee OA journals across fields (Eigenfactor Project, 2016). Researchers can also search for no-cost OA journals using the Cofactor Journal Selector tool (Cofactor Ltd, 2016). Notable examples of OA journals that do not currently charge authors to publish include eLife, the Royal Society’s Open Science, and all journals published by consortia like the Open Library of Humanities and SCOAP3. The Scientific Electronic Library Online (SciELO) and the Network of Scientific Journals in Latin America, the Caribbean, Spain, and Portugal (Redalyc) each host over 1,000 journals that provide free publishing for authors.
Many other OA journals charge minimal fees, with the average APC around $665 USD (Crawford, 2016). At PeerJ, for example, a one-time membership fee of $199 USD allows an author to publish one article per year for life, subject to peer review. (Note: since PeerJ requires the membership fee to be paid for each author, up to 12 authors, the maximum cost of an article would be $2,388 USD. However, this is a one-time fee, after which subsequent articles by the same authors would be free.) Most Pensoft OA journals charge around €100–400 ($115–460 USD), while a select few are free. Ubiquity Press OA journals charge an average APC of £300 ($500 USD), with their open data and software metajournals charging £100 ($140 USD). Cogent’s OA journals all operate on a flexible payment model, with authors paying only what they are able based on their financial resources. Importantly, most OA journals do not charge any additional fees for submission or color figures. Such charges, as levied by many subscription publishers, can easily sum to hundreds or thousands of dollars (e.g., in Elsevier’s Neuron, the first color figure costs $1,000 USD and each additional one $275). Thus, publishing in OA journals need not be any more expensive than publishing in traditional journals, and in some cases may cost less.
The majority of OA publishers charging higher publication fees (e.g., PLOS or Frontiers, which typically charge upwards of $1,000 USD per manuscript) offer fee waivers upon request for authors with financial constraints. Policies vary by publisher, but frequently include automatic full waivers for authors from low-income countries, and partial waivers for those in lower-middle-income countries. Researchers in any country can request a partial or full waiver if they cannot pay. Some publishers, such as BioMed Central, F1000, Hindawi, and PeerJ, have membership programs through which institutions pay part or all of the APC for affiliated authors. Some institutions also have discretionary funds for OA publication fees. Increasingly, funders are providing OA publishing funds, or allowing researchers to write these funds into their grants. PLOS maintains a searchable list of both institutions and funders that support OA publication costs (Public Library of Science, 2016). Finally, as discussed previously in the section "Publish where you want and archive openly", researchers can make their work openly available for free by self-archiving preprints or postprints.
Funding
Awards and special funding
For academics in many fields, securing funding is essential to career development and success of their research program. In the last three years, new fellowships and awards for open research have been created by multiple organizations (Table 2). While there is no guarantee that these particular funding mechanisms will be maintained, they are a reflection of the changing norms in science, and illustrate the increasing opportunities to gain recognition and resources by sharing one’s work openly.
Table 2.
Special funding opportunities for open research, training, and advocacy.
| Funding | Description | URL |
|---|---|---|
| Shuttleworth Foundation Fellowship Program | funding for researchers working openly on diverse problems | shuttleworthfoundation.org/fellows/ |
| Mozilla Fellowship for Science | funding for researchers interested in open data and open source | www.mozillascience.org/fellows |
| Leamer-Rosenthal Prizes for Open Social Science (UC Berkeley and John Templeton Foundation) | rewards social scientists for open research and education practices | www.bitss.org/prizes/leamer-rosenthal-prizes/ |
| OpenCon Travel Scholarship (Right to Research Coalition and SPARC) | funding for students and early-career researchers to attend OpenCon, and receive training in open practices and advocacy | www.opencon2016.org/ |
| Preregistration Challenge (Center for Open Science) | prizes for researchers who publish the results of a preregistered study | cos.io/prereg/ |
| Open Science Prize (Wellcome Trust, NIH, and HHMI) | funding to develop services, tools, and platforms that will increase openness in biomedical research | www.openscienceprize.org/ |
Funder mandates on article and data sharing
Increasingly, funders are not only encouraging but mandating open sharing of research. The United States National Institutes of Health (NIH) has been a leader in this respect. In 2008, the NIH implemented a public access policy requiring that all articles arising from NIH-funded projects be deposited in the National Library of Medicine’s open repository, PubMed Central, within one year of publication (Rockey, 2012). NIH also requires that projects receiving $500K or more per year in direct costs include a data management plan that specifies how researchers will share their data (National Institutes of Health, 2003). NIH intends to extend its data sharing policy to a broader segment of its portfolio in the near future. Since 2011, the United States National Science Foundation (NSF) has also encouraged sharing of data, software, and other research outputs (National Science Foundation, 2011). All NSF investigators are required to submit a plan specifying data management and availability. In 2015, U.S. government agencies including the NSF, the Centers for Disease Control and Prevention (CDC), the Department of Defense (DoD), and the National Aeronautics and Space Administration (NASA), among others, announced plans to implement article and data sharing requirements in response to the White House Office of Science and Technology Policy (OSTP) memo on public access (Holdren, 2013). A crowd-sourced effort has collected information on these agency policies and continues to be updated (Whitmire et al., 2015).
Several governmental agencies and charitable foundations around the world have implemented even stronger open access mandates. For example, the Wellcome Trust’s policy states that articles from funded projects must be made openly available within six months of publication, and where it provides publishing fee support, specifically requires publication under a Creative Commons Attribution (CC BY) license (Wellcome Trust, 2016b). The Netherlands Organization for Scientific Research (NWO) requires that all manuscripts reporting results produced using public funds must be made immediately available (NWO, 2016). Similar policies are in place at CERN (CERN, 2014), the United Nations Educational, Scientific and Cultural Organization (UNESCO, 2013), and the Bill & Melinda Gates Foundation (Bill & Melinda Gates Foundation, 2015) among others, and are increasingly covering data sharing. Funders recognize that certain types of data, such as clinical records, are sensitive and require special safeguards to permit sharing while protecting patient privacy. The Expert Advisory Group on Data Access (EAGDA) was recently established as a collaboration between the Wellcome Trust, Cancer Research UK, the Economic and Social Research Council, and the Medical Research Council to advise funders on best practices for creating data sharing policies for human research (Wellcome Trust, 2016a).
Researchers can check the article and data sharing policies of funders in their country via SHERPA/JULIET (SHERPA/JULIET, 2016). BioSharing also maintains a searchable database of data management and sharing policies from both funders and publishers worldwide (Biosharing.org, 2016). Internationally, the number of open access policies has been steadily increasing over the last decade (Figure 2). Some funders, including the NIH and the Wellcome Trust, have begun suspending or withholding funds if researchers do not meet their policy requirements (National Institutes of Health, 2012; Van Noorden, 2014; Wellcome Trust, 2012). Thus, researchers funded by a wide variety of sources will soon be not just encouraged but required to engage in open practices to receive and retain funding. Those already engaging in these practices will likely have a competitive advantage.
Figure 2. Increase in open access policies.
The number of open access policies registered in ROARMAP (roarmap.eprints.org) has increased over the last decade. Data are broken down by type of organization: research organization (e.g., a university or research institution); funder; subunit of research organization (e.g. a library within a university); funder and research organization; multiple research organizations (e.g., an organization with multiple research centers, such as Max Planck Society). Figure used with permission from Stevan Harnad.
Resource management and sharing
In our researcher-centric approach, the rationale for data sharing based on funder mandates could be understood simply as ‘funders want you to share, so it is in your interest to do so’. That may be a compelling but unsatisfying reason to practice openly. Fortunately, there are other compelling reasons to share.
Documentation and reproducibility benefits
First, submitting data and research materials to an independent repository ensures the preservation and accessibility of that content in the future, both for one’s own use and for others. This is particularly helpful when responding to requests for data or materials: preparing research materials for sharing during the active phase of a project is much easier than reconstructing them years later. Second, researchers who plan to release their data, software, and materials are likely to engage in practices that are easy to skip in the short term but have substantial benefits in the long term, such as clear documentation of the key products of the research. Besides the direct benefit of facilitating one’s own later reuse, such practices increase the reproducibility of published findings and the ease with which other researchers can use, extend, and cite that work (Gorgolewski and Poldrack, 2016). Finally, sharing data and materials signals that researchers value transparency and have confidence in their own research.
Gain more citations and visibility by sharing data
Data sharing also confers a citation advantage. Piwowar and Vision (2013) analyzed over 10,000 studies with gene expression microarray data published in 2001–2009 and found an overall 9% citation advantage for papers with shared data, with advantages around 30% for older studies. Henneken and Accomazzi (2011) found a 20% citation advantage for astronomy articles that linked to open datasets. Dorch et al. (2015) found a 28–50% citation advantage for astrophysics articles, while Sears (2011) reported a 35% advantage for paleoceanography articles with publicly available data. Similar positive effects of data sharing have been described in the social sciences. Gleditsch et al. (2003) found that articles in the Journal of Peace Research offering data in any form – either through appendices, URLs, or contact addresses – were cited twice as frequently on average as articles with no data but otherwise equivalent author credentials and article variables. Studies with openly published code are also more likely to be cited than those that do not open their code (Vandewalle, 2012). In addition to more citations, Pienta et al. (2010) found that data sharing is associated with higher publication productivity: across over 7,000 NSF and NIH awards, research projects with archived data produced a median of 10 publications, versus only 5 for projects without archived data.
Importantly, citation studies may underestimate the scientific contribution and resulting visibility associated with resource sharing, as many datasets and software packages are published as stand-alone outputs that are not associated with a paper but may be widely reused. Fortunately, new outlets for data and software papers allow researchers to describe new resources of interest without necessarily reporting novel findings (Chavan and Penev, 2011; Gorgolewski et al., 2013). There is also a growing awareness that data and software are independent, first-class scholarly outputs that need to be incorporated into the networked research ecosystem. Many open data and software repositories have mechanisms for assigning digital object identifiers (DOIs) to these products. The use of persistent, unique identifiers like DOIs has been recommended by the Joint Declaration of Data Citation Principles to facilitate data citation (Data Citation Synthesis Group, 2014). Researchers can register for a unique Open Researcher and Contributor ID (ORCID) (Haak et al., 2012) to track their research outputs, including datasets and software, and build a richer profile of their contributions. Together, these developments should support efforts to "make data count", further incentivize sharing, and ensure that data generators and software creators receive greater credit for their work (Kratz and Strasser, 2015).
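Because DOIs resolve to machine-readable metadata, the citation and reuse of shared outputs can also be tracked programmatically. As an illustrative sketch (not a prescription of any particular service), the following Python snippet retrieves metadata for a DOI from the public Crossref REST API; dataset DOIs registered with DataCite can be resolved analogously via api.datacite.org.

```python
import json
import urllib.request

# Fetch citation metadata for a DOI from the Crossref REST API.
# The DOI below is this article's own; any Crossref-registered DOI works.
doi = "10.7554/eLife.16800"
with urllib.request.urlopen(f"https://api.crossref.org/works/{doi}") as r:
    record = json.load(r)["message"]

print(record["title"][0])
print("Cited by", record.get("is-referenced-by-count", 0), "works")
```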
In summary, data and software sharing benefits researchers both because it is consistent with emerging mandates, and because it signals credibility and engenders good research practices that can reduce errors and promote reuse, extension, and citation.
Career advancement
Find new projects and collaborators
Research collaborations are essential to advancing knowledge, but identifying and connecting with appropriate collaborators is not trivial. Open practices can make it easier for researchers to connect with one another by increasing the discoverability and visibility of one’s work, facilitating rapid access to novel data and software resources, and creating new opportunities to interact with and contribute to ongoing communal projects. For example, in 2011, one of the present authors (BAN) initiated a project to replicate a sample of studies in order to estimate the reproducibility of psychological science (Open Science Collaboration, 2012; Open Science Collaboration, 2014). Completing a meaningful number of replications in a single laboratory would have been difficult. Instead, the project idea was posted to a listserv as an open collaboration. Ultimately, more than 350 people contributed, with 270 earning co-authorship on the publication (Open Science Collaboration, 2015). Open collaboration enabled distribution of work and expertise among many researchers, and was essential to the project’s success. Other projects have used similar approaches to successfully carry out large-scale collaborative research (Klein et al., 2014).
Similar principles are at the core of the thriving open-source scientific software ecosystem. In many scientific fields, widely used state-of-the-art data processing and analysis packages are hosted and developed openly, allowing virtually anyone to contribute. Perhaps the paradigmatic example is the scikit-learn Python package for machine learning (Pedregosa et al., 2011), which, in the space of just over five years, has attracted over 500 unique contributors, 20,000 individual code contributions, and 2,500 article citations. Producing a comparable package using a traditional closed-source approach would likely not be feasible, and would, at the very least, have required a budget of tens of millions of dollars. While scikit-learn is clearly an outlier, hundreds of other open-source scientific packages that support much more domain-specific needs depend in a similar fashion on unsolicited community contributions (e.g., the NIPY group of projects in neuroimaging; Gorgolewski et al., 2016). Importantly, such contributions not only result in new functionality from which the broader scientific community can benefit, but also regularly provide their authors with greater community recognition, leading to new project and employment opportunities.
Institutional support of open research practices
Institutions are increasingly recognizing the limitations of journal-level metrics and exploring the potential benefits of article-level and alternative metrics in evaluating the contributions of specific research outputs. In 2013, the American Society for Cell Biology, along with a group of diverse stakeholders in academia, released the San Francisco Declaration on Research Assessment (SF-DORA) (American Society for Cell Biology, 2013). The declaration recommends that institutions cease using journal-level metrics, including the journal impact factor (IF), to evaluate research for promotion and tenure decisions, and focus instead on research content. Additional recommendations include recognizing data and software as valuable research products. As of March 2016, over 12,000 individuals and more than 600 organizations had signed SF-DORA in support of the recommendations, including universities from all over the world. The 2015 Higher Education Funding Council for England (HEFCE) report for the Research Excellence Framework (REF), the UK’s system for assessing research quality in higher education institutions, also rejects the use of IF and other journal metrics to evaluate researchers for hiring and promotion, and recommends that institutions explore a variety of quantitative and qualitative indicators of research impact, as well as ways to recognize the sharing of diverse research outputs (Wilsdon et al., 2015).
Several U.S. institutions have passed resolutions explicitly recognizing open practices in promotion and tenure evaluations, including Virginia Commonwealth University (Virginia Commonwealth University Faculty Senate, 2010) and Indiana University-Purdue University Indianapolis (Indiana University-Purdue University Indianapolis, 2016). In 2014, Harvard’s School of Engineering and Applied Sciences launched a pilot program to encourage faculty to archive their articles in the university’s open repository as part of the promotion and tenure process (Harvard Library, Office for Scholarly Communication, 2014). The University of Liège has gone a step further and requires publications to be included in the university’s open access repository to be considered for promotion (University of Liège, 2016). Explicit statements of the importance of open practices are even starting to appear in faculty job advertisements, such as one from LMU München asking prospective candidates to describe their open research activities (Schönbrodt, 2016).
Discussion
Open questions
The emerging field of metascience provides some evidence about the value of open practices, but it is far from complete. There are many initiatives aimed at increasing open practices, and not yet enough published evidence about their effectiveness. For example, journals can offer badges to acknowledge open practices such as open data, open materials, and preregistration (Open Research Badges, 2016). Initial evidence from a single adopting journal, Psychological Science, and a sample of comparison journals suggests that this simple incentive increases data sharing rates from less than 3% to more than 38% (Kidwell et al., 2016). More research is needed across disciplines to follow up on this encouraging evidence. UCLA’s Knowledge Infrastructures project is an ongoing study that, among other objectives, is learning about data sharing practices and the factors that discourage or promote sharing across four collaborative scientific projects (Borgman et al., 2015; Darch et al., 2015).
Open research advocates often cite reproducibility as one of the benefits of data and code sharing (Gorgolewski and Poldrack, 2016). There is a logical argument that having access to the data, code, and materials makes it easier to reproduce the evidence that was derived from them. Papers with available data also contain fewer reporting errors than papers with unavailable data (Wicherts, 2016), which could reflect diligent data management practices. However, there is not yet direct evidence that open practices per se are a net benefit to research progress. As a first step, the University of California at Riverside and the Center for Open Science have initiated an NSF-supported randomized trial to evaluate the impact of receiving training to use the Open Science Framework for managing, archiving, and sharing lab research materials and data. Labs across the university will be randomly assigned to receive the training, and the outcomes of their research will be assessed across multiple years.
Preregistration of research designs and analysis plans is a proposed method to increase the credibility of reported research and a means to increase the transparency of the research workflow. However, preregistration is rarely practiced outside of clinical trials, for which it is required by law in the U.S. and is a condition of publication in most journals that publish them. Research suggests that preregistration may counter some questionable practices, such as flexibly defining analytic models and outcome variables in order to find positive results (Kaplan and Irvin, 2015). Public registration also makes it possible to compare publications and registrations of the same study to identify cases in which outcomes were changed or unreported, as is the focus of the COMPare project based at the University of Oxford (COMPare, 2016). Similar efforts include the AllTrials project, run by an international team (AllTrials, 2016), which extends beyond preregistration of planned studies to retroactive registration and transparent reporting of previously conducted clinical trials. Another example is the AsPredicted project, run by researchers at the University of Pennsylvania and the University of California, Berkeley, which offers preregistration services for any discipline (AsPredicted, 2016). To initiate similar research efforts in the basic and preclinical sciences, the Center for Open Science launched the Preregistration Challenge, offering one thousand $1,000 awards to researchers who publish the outcomes of preregistered research (Center for Open Science, 2016).
Openness as a continuum of practices
While there are clear definitions and best practices for open access (Chan et al., 2002), open data (Open Knowledge, 2005; Murray-Rust et al., 2010), and open source (Open Source Initiative, 2007), openness is not ‘all-or-nothing’. Not all researchers are comfortable with the same level of sharing, and there are a variety of ways to be open (see Box 1). Openness can thus be defined by a continuum of practices, starting perhaps at the most basic level with openly self-archiving postprints, and reaching perhaps the highest level with openly sharing grant proposals, research protocols, and data in real time. Fully open research is a long-term goal to strive towards, not a switch we should expect to flip overnight.
Box 1. What can I do right now?
Engaging in open science need not require a long-term commitment or intensive effort. There are a number of practices and resolutions that researchers can adopt with very little effort, helping advance the overall open science cause while simultaneously benefiting the individual researcher.
Post free copies of previously published articles in a public repository. Over 70% of publishers allow researchers to post an author version of their manuscript online, typically 6–12 months after publication (see section "Publish where you want and archive openly").
Deposit preprints of all manuscripts in publicly accessible repositories as soon as possible – ideally prior to, and no later than, the initial journal submission (see section "Preprints").
Publish in open access venues whenever possible. As discussed in the section "Prestige and journal impact factor", this need not mean forgoing traditional subscription-based journals, as many traditional journals offer the option to pay an additional charge to make one’s article openly accessible.
Publicly share data and materials via a trusted repository. Whenever feasible, the data, materials, and analysis code used to generate the findings reported in one’s manuscripts should be shared. Many journals already require authors to share data upon request as a condition of publication; proactively sharing data can be significantly more efficient, and offers a variety of other benefits (see section "Resource management and sharing", and the sketch following this box).
Preregister studies. Publicly preregistering one’s experimental design and analysis plan in advance of data collection is an effective means of minimizing bias and enhancing credibility (see section "Open questions"). Since the preregistration document(s) can be written in a form similar to a Methods section, the additional effort required for preregistration is often minimal.
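To make the mechanics of repository sharing concrete, here is a minimal sketch of depositing a dataset in one of the repositories from Table 1, Zenodo, through its REST API. It assumes a personal access token (the ZENODO_TOKEN value below is a hypothetical placeholder), an example file name, and the third-party requests package; consult the repository’s documentation before relying on any of it.

```python
import requests  # third-party; install with: pip install requests

# Sketch of a Zenodo deposit. ZENODO_TOKEN and the file name are
# hypothetical placeholders, not values from this article.
ZENODO_TOKEN = "your-personal-access-token"
params = {"access_token": ZENODO_TOKEN}

# Step 1: create an empty draft deposition.
r = requests.post("https://zenodo.org/api/deposit/depositions",
                  params=params, json={})
r.raise_for_status()
deposition = r.json()

# Step 2: upload the data file to the deposition's file bucket.
bucket_url = deposition["links"]["bucket"]
with open("experiment_data.csv", "rb") as fp:
    requests.put(f"{bucket_url}/experiment_data.csv",
                 params=params, data=fp).raise_for_status()

# Step 3: attach minimal descriptive metadata.
metadata = {"metadata": {
    "title": "Example dataset",
    "upload_type": "dataset",
    "description": "Data underlying Figure 1 (example).",
    "creators": [{"name": "Doe, Jane"}],
}}
requests.put(deposition["links"]["self"],
             params=params, json=metadata).raise_for_status()

# Publishing the draft (a POST to deposition["links"]["publish"])
# would mint the citable DOI discussed earlier in this article.
```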
Many discussions about openness center on the associated fears; researchers also need encouragement to explore the associated benefits. As researchers share their work and experience those benefits, they will likely become increasingly comfortable with sharing and willing to experiment with new open practices. Acknowledging and supporting incremental steps is a way to respect researchers’ present experience and comfort, and to produce a gradual culture change from closed to open research. Training researchers early in their careers is fundamental. Graduate programs can integrate open science and modern scientific computing practices into their existing curricula. Methods courses could incorporate training on publishing practices such as proper citation, author rights, and open access publishing options. Institutions and funders could provide skills training on self-archiving articles, data, and software to meet mandate requirements. Importantly, we recommend integrating such education and training with regular curricular and workshop activities so as not to increase the time burden on already-busy students and researchers.
Summary
The evidence that openly sharing articles, code, and data is beneficial for researchers is strong and building. Each year, more studies are published showing the open citation advantage; more funders announce policies encouraging, mandating, or specifically financing open research; and more employers are recognizing open practices in academic evaluations. In addition, a growing number of tools are making the process of sharing research outputs easier, faster, and more cost-effective. In his 2012 book Open Access, Peter Suber summed it up best: "[OA] increases a work’s visibility, retrievability, audience, usage, and citations, which all convert to career building. For publishing scholars, it would be a bargain even if it were costly, difficult, and time-consuming. But…it’s not costly, not difficult, and not time-consuming." (Suber, 2012)
Acknowledgements
This paper arose from the "Open Source, Open Science" meeting held March 19–20, 2015 at the Center for Open Science in collaboration with the Mozilla Science Lab. The meeting was supported by the National Institute on Aging (R24AG048124), the Laura and John Arnold Foundation, and the John Templeton Foundation (46545). The authors thank all those who responded to our public calls for comment – especially Virginia Barbour, Peter Binfield, Nazeefa Fatima, Daniel S. Katz, Sven Kochmann, Ehud Lamm, Alexei Lutay, Ben Marwick, Daniel Mietchen, Ian Mulvany, Cameron Neylon, Charles Oppenheim, Pandelis Perakakis, Richard Smith-Unna, Peter Suber, and Anne-Katharina Weilenmann – whose feedback helped us improve this manuscript.
Funding Statement
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Competing interests
ECM: Founder of the 'Why Open Research?' project, an open research advocacy and educational site funded by the Shuttleworth Foundation. She is also a figshare and PeerJ Preprints advisor, Center for Open Science ambassador, and OpenCon organizing committee member - all volunteer positions.
AK: Works at the open access publisher BioMed Central, a part of the larger SpringerNature company, where she leads initiatives around open data and research and oversees a portfolio of journals in the health sciences.
JL: Works for CrossRef and is involved in building infrastructure that supports open science research: Principles for Open Scholarly Research, open data initiatives, and open scholarly metadata.
BAN: Employed by the non-profit Center for Open Science, which runs the Open Science Framework, and includes in its mission "increasing openness, integrity, and reproducibility of scientific research".
CKS: Employed by the non-profit Center for Open Science, which runs the Open Science Framework, and includes in its mission "increasing openness, integrity, and reproducibility of scientific research".
JRS: Employed by the non-profit Center for Open Science, which runs the Open Science Framework, and includes in its mission "increasing openness, integrity, and reproducibility of scientific research".
KT: Employed by the Mozilla Foundation, where she leads the organization's open science program - the Mozilla Science Lab. The Science Lab supports fellowships, training and prototyping, including work on open research badges.
The other authors declare that no competing interests exist.
Author contributions
ECM, Conception and design, Drafting or revising the article.
PEB, Conception and design, Drafting or revising the article.
CTB, Conception and design, Drafting or revising the article.
SB, Conception and design, Drafting or revising the article.
AK, Conception and design, Drafting or revising the article.
JL, Conception and design, Drafting or revising the article.
DM, Conception and design, Drafting or revising the article.
BAN, Conception and design, Drafting or revising the article.
KR, Conception and design, Drafting or revising the article.
CKS, Conception and design, Drafting or revising the article.
JRS, Conception and design, Drafting or revising the article.
KT, Conception and design, Drafting or revising the article.
AU, Conception and design, Drafting or revising the article.
KHW, Conception and design, Drafting or revising the article.
TY, Conception and design, Drafting or revising the article.
Funding Information
This paper was supported by the following grants:
National Institute on Aging R24AG048124 to Brian A Nosek, Courtney K Soderberg.
Laura and John Arnold Foundation to Brian A Nosek, Jeffrey R Spies.
John Templeton Foundation 46545 to Brian A Nosek, Jeffrey R Spies.
References
- Adie E. Attention! A study of open access vs non-open access articles. Figshare. 2014. doi: 10.6084/m9.figshare.1213690.
- AllTrials. All trials registered, all results reported. http://www.alltrials.net/ 2016.
- American Society for Cell Biology. San Francisco declaration on research assessment. http://www.ascb.org/dora/ 2013.
- Antelman K. Do open-access articles have a greater research impact? College & Research Libraries. 2004;65:372–382. doi: 10.5860/crl.65.5.372.
- ASAPbio. Opinions on preprints in biology. http://asapbio.org/survey. Data: https://dx.doi.org/10.6084/m9.figshare.2247616.v1 2016.
- AsPredicted. Pre-registration made easy. https://aspredicted.org/ 2016.
- Atchison A, Bull J. Will open access get me cited? An analysis of the efficacy of open access publishing in political science. Political Science & Politics. 2015;48:129–137. doi: 10.1017/S1049096514001668.
- Berg JM, Bhalla N, Bourne PE, Chalfie M, Drubin DG, Fraser JS, Greider CW, Hendricks M, Jones C, Kiley R, King S, Kirschner MW, Krumholz HM, Lehmann R, Leptin M, Pulverer B, Rosenzweig B, Spiro JE, Stebbins M, Strasser C, Swaminathan S, Turner P, Vale RD, VijayRaghavan K, Wolberger C. Preprints for the life sciences. Science. 2016;352:899–901. doi: 10.1126/science.aaf9133.
- Bill & Melinda Gates Foundation. Open Access Policy. http://www.gatesfoundation.org/How-We-Work/General-Information/Open-Access-Policy 2015.
- Biosharing.org. A catalogue of data preservation, management and sharing policies from international funding agencies, regulators and journals. biosharing.org/policies 2016.
- Björk B-C. The hybrid model for open access publication of scholarly articles: a failed experiment? Journal of the American Society for Information Science and Technology. 2012;63:1496–1504. doi: 10.1002/asi.22709.
- Björk B-C, Laakso M, Welling P, Paetau P. Anatomy of green open access. Journal of the Association for Information Science and Technology. 2014;65:237–250. doi: 10.1002/asi.22963.
- Björk B-C, Solomon D. Open access versus subscription journals: a comparison of scientific impact. BMC Medicine. 2012;10:73. doi: 10.1186/1741-7015-10-73.
- Bohannon J. Who's afraid of peer review? Science. 2013;342:60–65. doi: 10.1126/science.342.6154.60.
- Bohannon J. Lax reviewing practice prompts 60 retractions at SAGE journal. Science Insider. 2014. http://www.sciencemag.org/news/2014/07/updated-lax-reviewing-practice-prompts-60-retractions-sage-journal
- Borgman CL, Darch PT, Sands AE, Pasquetto IV, Golshan MS, Wallis JC, Traweek S. Knowledge infrastructures in science: data, diversity, and digital libraries. International Journal on Digital Libraries. 2015;16:207–227. doi: 10.1007/s00799-015-0157-z.
- Bornmann L, Mutz R, Daniel HD. A reliability-generalization study of journal peer reviews: a multilevel meta-analysis of inter-rater reliability and its determinants. PLoS One. 2010;5:e14331. doi: 10.1371/journal.pone.0014331.
- Brembs B, Button K, Munafò M. Deep impact: unintended consequences of journal rank. Frontiers in Human Neuroscience. 2013;7:291. doi: 10.3389/fnhum.2013.00291.
- Brenner S. Loose end. Current Biology. 1995;5:568. doi: 10.1016/S0960-9822(95)00109-X.
- Brown C. The E-volution of preprints in the scholarly communication of physicists and astronomers. Journal of the American Society for Information Science and Technology. 2001;52:187–200. doi: 10.1002/1097-4571(2000)9999:9999<::AID-ASI1586>3.0.CO;2-D.
- Brown CT, Howe A, Zhang Q, Pyrkosz AB, Brom TH. A reference-free algorithm for computational normalization of shotgun sequencing data. 2012.
- Center for Open Science. The 1,000,000 Preregistration Challenge. https://cos.io/prereg/ 2016.
- CERN. Open Access Policy for CERN Physics Publications. http://cds.cern.ch/record/1955574/files/CERN-OPEN-2014-049.pdf 2014.
- Chan L, Cuplinskas D, Eisen M, Friend F, Genova Y, Guédon J-C, Hagemann M, Harnad S, Johnson R, Kupryte R, La Manna M, Rév I, Segbert M, de Souza S, Suber P, Velterop J. Budapest Open Access Initiative. http://www.budapestopenaccessinitiative.org/ 2002.
- Chavan V, Penev L. The data paper: a mechanism to incentivize data publishing in biodiversity science. BMC Bioinformatics. 2011;12:S2. doi: 10.1186/1471-2105-12-S15-S2.
- Cofactor Ltd. Cofactor Journal Selector Tool. http://cofactorscience.com/journal-selector 2016.
- COMPare. Tracking switched outcomes in clinical trials. http://compare-trials.org/ 2016.
- Court of Amsterdam. Adam Curry v. Audax Publishing. http://deeplink.rechtspraak.nl/uitspraak?id=ECLI:NL:RBAMS:2006:AV4204 2006.
- Crawford W. Gold Open Access Journals 2011-2015. Cites & Insights Books; 2016.
- Creative Commons. About The Licenses. https://creativecommons.org/licenses/ 2016.
- Darch PT, Borgman CL, Traweek S, Cummings RL, Wallis JC, Sands AE. What lies beneath? Knowledge infrastructures in the subseafloor biosphere and beyond. International Journal on Digital Libraries. 2015;16:61–77. doi: 10.1007/s00799-015-0137-3.
- Data Citation Synthesis Group. Joint declaration of data citation principles. https://www.force11.org/group/joint-declaration-data-citation-principles-final 2014.
- Davis P, Fromerth M. Does the arXiv lead to higher citations and reduced publisher downloads for mathematics articles? Scientometrics. 2007;71:203–215. doi: 10.1007/s11192-007-1661-8.
- Davis PM, Lewenstein BV, Simon DH, Booth JG, Connolly MJL. Open access publishing, article downloads, and citations: randomised controlled trial. BMJ. 2008;337:a568. doi: 10.1136/bmj.a568.
- Davis PM. Open access, readership, citations: a randomized controlled trial of scientific journal publishing. FASEB Journal. 2011;25:2129–2134. doi: 10.1096/fj.11-183988.
- Desjardins-Proulx P, White EP, Adamson JJ, Ram K, Poisot T, Gravel D. The case for open preprints in biology. PLOS Biology. 2013;11:e1001563. doi: 10.1371/journal.pbio.1001563.
- Donovan JM, Watson CA, Osborne C. The open access advantage for American law reviews. Edison: Law + Technology. 2015;2015:1–22.
- Dorch SBF, Drachen TM, Ellegaard O. The data sharing advantage in astrophysics. 2015.
- Eigenfactor Project. No-fee open access journals for all fields. www.eigenfactor.org/openaccess/fullfree.php 2016.
- Eysenbach G. Citation advantage of open access articles. PLoS Biology. 2006;4:e157. doi: 10.1371/journal.pbio.0040157.
- Frandsen TF. The effects of open access on un-published documents: a case study of economics working papers. Journal of Informetrics. 2009a;3:124–133. doi: 10.1016/j.joi.2008.12.002.
- Frandsen TF. The integration of open access journals in the scholarly communication system: three science fields. Information Processing & Management. 2009b;45:131–141. doi: 10.1016/j.ipm.2008.06.001.
- Gargouri Y, Hajjem C, Larivière V, Gingras Y, Carr L, Brody T, Harnad S. Self-selected or mandated, open access increases citation impact for higher quality research. PLoS One. 2010;5:e13636. doi: 10.1371/journal.pone.0013636.
- Garlick M. Creative Commons licenses upheld in Dutch court. https://creativecommons.org/2006/03/16/creativecommonslicensesupheldindutchcourt/ 2006a.
- Garlick M. Spanish Court Recognizes CC-Music. https://creativecommons.org/2006/03/23/spanishcourtrecognizesccmusic/ 2006b.
- Gaulé P, Maystre N. Getting cited: does open access help? Research Policy. 2011;40:1332–1338. doi: 10.1016/j.respol.2011.05.025.
- Gentil-Beccot A, Mele S, Brooks TC. Citing and reading behaviours in high-energy physics. Scientometrics. 2010;84:345–355. doi: 10.1007/s11192-009-0111-1.
- Gleditsch NP, Metelits C, Strand H. Posting your data: will you be scooped or will you be famous? International Studies Perspectives. 2003;4:89–97. doi: 10.1111/1528-3577.04105.
- Gorgolewski K, Esteban O, Burns C, Ziegler E, Pinsard B, Madison C, Waskom M, Ellis DG, Clark D, Dayan M, Manhães-Savio A, Notter MP, Johnson H, Dewey YO, Hamalainen C, Keshavan A, Clark D, Huntenburg JM, Hanke M, Nichols BN, Wassermann D, Eshaghi A, Markiewicz C, Varoquaux G, Acland B, Forbes J, Rokem A, Kong X-Z, Gramfort A, Kleesiek J, Schaefer A, Sikka S, Perez-Guevara MF, Glatard T, Iqbal S, Liu S, Welch D, Sharp P, Warner J, Kastman E, Lampe L, Perkins LN, Craddock RC, Küttner R, Bielievtsov D, Geisler D, Gerhard S, Liem F, Linkersdörfer J, Margulies DS, Andberg SK, Stadler J, Steele CJ, Broderick W, Cooper G, Floren A, Huang L, Gonzalez I, McNamee D, Papadopoulos Orfanos D, Pellman J, Triplett W, Ghosh S. Nipype: a flexible, lightweight and extensible neuroimaging data processing framework in python. Zenodo. 2016. doi: 10.5281/zenodo.50186.
- Gorgolewski KJ, Margulies DS, Milham MP. Making data sharing count: a publication-based solution. Frontiers in Neuroscience. 2013;7:9. doi: 10.3389/fnins.2013.00009.
- Gorgolewski KJ, Poldrack R. A practical guide for improving transparency and reproducibility in neuroimaging research. bioRxiv. 2016. doi: 10.1101/039354.
- Gunasekaran S, Arunachalam S. The impact factors of open access and subscription journals across fields. Current Science. 2014;107:380.
- Haak LL, Fenner M, Paglione L, Pentz E, Ratner H. ORCID: a system to uniquely identify researchers. Learned Publishing. 2012;25:259–264. doi: 10.1087/20120404.
- Hajjem C, Harnad S, Gingras Y. Ten-year cross-disciplinary comparison of the growth of open access and how it increases research citation impact. 2006.
- Harnad S, Brody T. Comparing the impact of open access (OA) vs. non-OA articles in the same journals. D-Lib Magazine. 2004;10.
- Harvard Library, Office for Scholarly Communication. Harvard’s School of Engineering and Applied Sciences Recommends Open-Access Deposit for Faculty Review Process. http://bit.ly/1X8cLob 2014.
- Harvard Library, Office for Scholarly Communication. Open access policies. https://osc.hul.harvard.edu/policies/ 2016.
- Harvard Open Access Project. Good practices for university open-access policies. http://bit.ly/goodoa 2016.
- Heimstädt M, Saunderson F, Heath T. From toddler to teen: growth of an open data ecosystem. eJournal of eDemocracy & Open Government. 2014;6:123–135.
- Henneken EA, Accomazzi A. Linking to data: effect on citation rates in astronomy. 2011.
- Hitchcock S. The effect of open access and downloads (’hits’) on citation impact: a bibliography of studies. http://opcit.eprints.org/oacitation-biblio.html 2016.
- Holdren JP. Increasing access to the results of federally funded scientific research. https://www.whitehouse.gov/sites/default/files/microsites/ostp/ostp_public_access_memo_2013.pdf 2013.
- Ince DC, Hatton L, Graham-Cumming J. The case for open computer programs. Nature. 2012;482:485–488. doi: 10.1038/nature10836.
- Indiana University-Purdue University Indianapolis. IUPUI Promotion & Tenure Guidelines. http://www.facultysenate.vcu.edu/tag/open-access-scholarship-promotion-and-tenure/ 2016.
- Joseph H. Science magazine’s open access sting. SPARC blog. 2013. http://www.sparc.arl.org/blog/science-magazine-open-access-sting
- Journal of Vibration and Control. Retraction notice. Journal of Vibration and Control. 2014;20:1601–1604. doi: 10.1177/1077546314541924.
- Juzgado de Primera Instancia Número Seis de Badajoz, España. Sociedad General de Autores y Editores v. Ricardo Andres Utrera Fernández. http://www.internautas.org/archivos/sentencia_metropoli.pdf 2006.
- Kaplan RM, Irvin VL. Likelihood of null effects of large NHLBI clinical trials has increased over time. PLoS One. 2015;10:e0132382. doi: 10.1371/journal.pone.0132382.
- Kidwell MC, Lazarević LB, Baranski E, Hardwicke TE, Piechowski S, Falkenberg LS, Kennett C, Slowik A, Sonnleitner C, Hess-Holden C, Errington TM, Fiedler S, Nosek BA. Badges to acknowledge open practices: a simple, low-cost, effective method for increasing transparency. PLOS Biology. 2016;14:e1002456. doi: 10.1371/journal.pbio.1002456.
- Kiernan V. Diffusion of news about research. Science Communication. 2003;25:3–13. doi: 10.1177/1075547003255297.
- Klein RA, Ratliff KA, Vianello M, Adams RB, Bahník Š, Bernstein MJ, Bocian K, Brandt MJ, Brooks B, Brumbaugh CC, Cemalcilar Z, Chandler J, Cheong W, Davis WE, Devos T, Eisner M, Frankowska N, Furrow D, Galliani EM, Hasselman F, Hicks JA, Hovermale JF, Hunt SJ, Huntsinger JR, IJzerman H, John M-S, Joy-Gaba JA, Barry Kappes H, Krueger LE, Kurtz J, Levitan CA, Mallett RK, Morris WL, Nelson AJ, Nier JA, Packard G, Pilati R, Rutchick AM, Schmidt K, Skorinko JL, Smith R, Steiner TG, Storbeck J, Van Swol LM, Thompson D, van ‘t Veer AE, Vaughn LA, Vranka M, Wichman AL, Woodzicka JA, Nosek BA. Investigating variation in replicability. Social Psychology. 2014;45:142–152. doi: 10.1027/1864-9335/a000178.
- Koler-Povh T, Južnič P, Turk G. Impact of open access on citation of scholarly publications in the field of civil engineering. Scientometrics. 2014;98:1033–1045. doi: 10.1007/s11192-013-1101-x.
- Kousha K, Abdoli M. The citation impact of open access agricultural research. Online Information Review. 2010;34:772–785. doi: 10.1108/14684521011084618.
- Kowalczuk MK, Dudbridge F, Nanda S, Harriman SL, Moylan EC. A comparison of the quality of reviewer reports from author-suggested reviewers and editor-suggested reviewers in journals operating on open or closed peer review models. F1000 Posters. 2013;4. doi: 10.1136/bmjopen-2015-008707.
- Kratz JE, Strasser C. Comment: Making data count. Scientific Data. 2015;2:150039. doi: 10.1038/sdata.2015.39.
- Lansingh VC, Carter MJ. Does open access in ophthalmology affect how articles are subsequently cited in research? Ophthalmology. 2009;116:1425–1431. doi: 10.1016/j.ophtha.2008.12.052.
- Larivière V, Sugimoto CR, Macaluso B, Milojević S, Cronin B, Thelwall M. arXiv E-prints and the journal of record: an analysis of roles and relationships. Journal of the Association for Information Science and Technology. 2014;65:1157–1169. doi: 10.1002/asi.23044.
- Lawrence S. Free online availability substantially increases a paper's impact. Nature. 2001;411:521. doi: 10.1038/35079151.
- Lawson S. APC data for 27 UK higher education institutions in 2015. Figshare. 2016. doi: 10.6084/m9.figshare.1507481.v4.
- Linksvayer M. Creative Commons Attribution-ShareAlike license enforced in Germany. Creative Commons Blog. 2011. https://creativecommons.org/2011/09/15/creative-commons-attribution-sharealike-license-enforced-in-germany/
- McCabe MJ, Snyder CM. Identifying the effect of open access on citations using a panel of science journals. Economic Inquiry. 2014;52:1284–1300. doi: 10.1111/ecin.12064.
- McCabe MJ, Snyder CM. Does online availability increase citations? Theory and evidence from a panel of economics and business journals. Review of Economics and Statistics. 2015;97:144–165. doi: 10.1162/REST_a_00437.
- Metcalfe TS. The citation impact of digital preprint archives for solar physics papers. Solar Physics. 2006;239:549–553. doi: 10.1007/s11207-006-0262-7.
- Michener WK. Ecological data sharing. Ecological Informatics. 2015;29:33–44. doi: 10.1016/j.ecoinf.2015.06.010.
- MIT Libraries, Scholarly Publishing. MIT Faculty Open Access Policy. http://libraries.mit.edu/scholarly/mit-open-access/open-access-at-mit/mit-open-access-policy/ 2016.
- Murray-Rust P, Neylon C, Pollock R, Wilbanks J. Panton Principles: principles for open data in science. http://pantonprinciples.org/ 2010.
- National Institutes of Health. NIH Data Sharing Policy and Implementation Guidance. http://grants.nih.gov/grants/policy/data_sharing/data_sharing_guidance.htm 2003.
- National Institutes of Health. Upcoming Changes to Public Access Policy Reporting Requirements and Related NIH Efforts to Enhance Compliance. http://grants.nih.gov/grants/guide/notice-files/NOT-OD-12-160.html 2012.
- National Science Foundation. Digital Research Data Sharing and Management. www.nsf.gov/nsb/publications/2011/nsb1124.pdf 2011.
- Nature Publishing Group. Author Insights 2015 Survey. Figshare. 2015. doi: 10.6084/m9.figshare.1425362.v7.
- NWO. Open Science. http://www.nwo.nl/en/policies/open+science 2016.
- Neuberger J, Counsell C. Impact factors: uses and abuses. European Journal of Gastroenterology & Hepatology. 2002;14:209–211. doi: 10.1097/00042737-200203000-00001.
- Norris M, Oppenheim C, Rowland F. The citation advantage of open-access articles. Journal of the American Society for Information Science and Technology. 2008;59:1963–1972. doi: 10.1002/asi.20898.
- Open Knowledge. The Open Definition. http://opendefinition.org/ 2005.
- Open Research Badges. http://openresearchbadges.org/ 2016.
- Open Science Collaboration. An open, large-scale, collaborative effort to estimate the reproducibility of psychological science. Perspectives on Psychological Science. 2012;7:657–660. doi: 10.1177/1745691612462588.
- Open Science Collaboration. The reproducibility project: a model of large-scale collaboration for empirical research on reproducibility. In: Stodden V, Leisch F, Peng RD, editors. Implementing Reproducible Research. CRC Press, Taylor & Francis Group; 2014. pp. 299–324.
- Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015;349:aac4716. doi: 10.1126/science.aac4716.
- Open Source Initiative. The Open Source Definition. https://opensource.org/osd 2007.
- Oransky I, Marcus A. Retraction Watch: tracking retractions as a window into the scientific process. http://retractionwatch.com/ 2016.
- Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B, Grisel O, Blondel M, Prettenhofer P, Weiss R, Dubourg V, Vanderplas J, Passos A, Cournapeau D, Brucher M, Perrot M, Duchesnay E. Scikit-learn: machine learning in Python. Journal of Machine Learning Research. 2011;12:2825–2830.
- PeerJ Staff. Who’s afraid of open peer review? PeerJ blog. 2014. https://peerj.com/blog/post/100580518238/whos-afraid-of-open-peer-review/
- Phillips DP, Kanter EJ, Bednarczyk B, Tastad PL. Importance of the lay press in the transmission of medical knowledge to the scientific community. New England Journal of Medicine. 1991;325:1180–1183. doi: 10.1056/NEJM199110173251620.
- Pienta AM, Alter GC, Lyle JA. The enduring value of social science research: the use and reuse of primary research data. https://deepblue.lib.umich.edu/handle/2027.42/78307 2010.
- Piwowar HA, Vision TJ. Data reuse and the open data citation advantage. PeerJ. 2013;1:e175. doi: 10.7717/peerj.175.
- PLOS Medicine Editors. The impact factor game. PLOS Medicine. 2006;3:e291. doi: 10.1371/journal.pmed.0030291.
- Poline J-B, Breeze JL, Ghosh S, Gorgolewski K, Halchenko YO, Hanke M, Haselgrove C, Helmer KG, Keator DB, Marcus DS, Poldrack RA, Schwartz Y, Ashburner J, Kennedy DN. Data sharing in neuroimaging research. Frontiers in Neuroinformatics. 2012;6:9. doi: 10.3389/fninf.2012.00009.
- Public Library of Science. Open Access Funds. www.plos.org/publications/publication-fees/open-access-funds/ 2016.
- Publons. Get credit for peer review. https://publons.com/ 2016.
- Redhead C. OASPA’s response to the recent article in Science entitled “Who’s Afraid of Peer Review?”. Open Access Scholarly Publishers Association. 2013. http://oaspa.org/response-to-the-recent-article-in-science/
- Research Information Network. Nature Communications: citation analysis. http://www.nature.com/press_releases/ncomms-report2014.pdf 2014.
- Rockey S. Revised Policy on Enhancing Public Access to Archived Publications Resulting from NIH-Funded Research. National Institutes of Health, Office of Extramural Research, Extramural Nexus. http://nexus.od.nih.gov/all/2012/11/16/improving-public-access-to-research-results/ 2012.
- Sahu DK, Gogtay NJ, Bavdekar SB. Effect of open access on citation rates for a small biomedical journal. https://web.archive.org/web/20121130165349/http://openmed.nic.in/1174/ 2005.
- Schwarz GJ, Kennicutt RC. Demographic and citation trends in astrophysical journal papers and preprints. 2004.
- Schönbrodt F. Changing hiring practices towards research transparency: the first open science statement in a professorship advertisement. http://www.nicebread.de/open-science-hiring-practices/ 2016.
- Science Commons. Scholar’s Copyright Addendum Engine. http://scholars.sciencecommons.org/ 2016.
- Sears JRL. Data sharing effect on article citation rate in paleoceanography. Presented at the Fall Meeting of the American Geophysical Union, 2011. doi: 10.6084/m9.figshare.1222998.v1.
- Seglen PO. Why the impact factor of journals should not be used for evaluating research. BMJ. 1997;314:497. doi: 10.1136/bmj.314.7079.497.
- Shamir L, Wallin JF, Allen A, Berriman B, Teuben P, Nemiroff RJ, Mink J, Hanisch RJ, DuPrie K. Practices in source code sharing in astrophysics. Astronomy and Computing. 2013;1:54–58. doi: 10.1016/j.ascom.2013.04.001.
- SHERPA/RoMEO. Publisher copyright policies and self-archiving. http://www.sherpa.ac.uk/romeo/index.php 2016.
- SHERPA/JULIET. Research funders’ open access policies. http://www.sherpa.ac.uk/juliet/index.php 2016.
- Solomon DJ. A survey of authors publishing in four megajournals. PeerJ. 2014;2:e365. doi: 10.7717/peerj.365.
- SPARC. Author Rights & the SPARC Author Addendum. http://sparcopen.org/our-work/author-rights/ 2016.
- SPARC Europe. The Open Access Citation Advantage Service. http://sparceurope.org/oaca/ 2016.
- Springer. Springer statement on SCIgen-generated papers in conference proceedings. http://www.springer.com/about+springer/media/statements?SGWID=0-1760813-6-1456249-0 2014.
- Stodden V, Guo P, Ma Z. Toward reproducible computational research: an empirical analysis of data and code policy adoption by journals. PLoS One. 2013;8:e67111. doi: 10.1371/journal.pone.0067111.
- Stodden VC. Trust your science? Open your data and code. Amstat News. 2011;409:21–22.
- Suber P. The taxpayer argument for open access. SPARC Open Access Newsletter. https://dash.harvard.edu/handle/1/4725013 2003.
- Suber P. Open Access. MIT Press; 2012.
- Swan A, Gargouri Y, Hunt M, Harnad S. Open access policy: numbers, analysis, effectiveness. 2015.
- Swan A. The Open Access citation advantage: studies and results to date. eprints. 2010. http://eprints.soton.ac.uk/268516/
- UNESCO. Open Access Policy concerning UNESCO publications. http://www.unesco.org/new/fileadmin/MULTIMEDIA/HQ/ERI/pdf/oa_policy_rev2.pdf 2013.
- University of Liège. Open Access at the ULg. Open Repository and Bibliography. https://orbi.ulg.ac.be/project?id=03 2016.
- Van Noorden R. Publishers withdraw more than 120 gibberish papers. Nature. 2014a. doi: 10.1038/nature.2014.14763.
- Van Noorden R. Funders punish open-access dodgers. Nature. 2014b;508:161. doi: 10.1038/508161a.
- van Rooyen S, Delamothe T, Evans SJ. Effect on peer review of telling reviewers that their signed reviews might be posted on the web: randomised controlled trial. BMJ. 2010;341:c5729. doi: 10.1136/bmj.c5729.
- van Rooyen S, Godlee F, Evans S, Black N, Smith R. Effect of open peer review on quality of reviews and on reviewers' recommendations: a randomised trial. BMJ. 1999;318:23–27. doi: 10.1136/bmj.318.7175.23.
- Vandewalle P. Code sharing is associated with research impact in image processing. Computing in Science & Engineering. 2012;14:42–47. doi: 10.1109/MCSE.2012.63.
- Virginia Commonwealth University Faculty Senate. VCU Faculty Senate Resolution on Open Access Publishing. http://www.facultysenate.vcu.edu/tag/open-access-scholarship-promotion-and-tenure/ 2010.
- Wagner B. Open access citation advantage: an annotated bibliography. Issues in Science and Technology Librarianship. 2010;60. doi: 10.5062/F4Q81B0W.
- Walsh E, Rooney M, Appleby L, Wilkinson G. Open peer review: a randomised controlled trial. The British Journal of Psychiatry. 2000;176:47–51. doi: 10.1192/bjp.176.1.47.
- Wang X, Liu C, Mao W, Fang Z. The open access advantage considering citation, article usage and social media attention. Scientometrics. 2015;103:555–564. doi: 10.1007/s11192-015-1547-0.
- Wellcome Trust. Wellcome Trust strengthens its open access policy. https://wellcome.ac.uk/press-release/wellcome-trust-strengthens-its-open-access-policy 2012.
- Wellcome Trust. Expert Advisory Group on Data Access. http://www.wellcome.ac.uk/About-us/Policy/Spotlight-issues/Data-sharing/EAGDA/ 2016a.
- Wellcome Trust. Position statement in support of open and unrestricted access to published research. http://www.wellcome.ac.uk/About-us/Policy/Policy-and-position-statements/WTD002766.htm 2016b.
- Wellcome Trust. Wellcome Trust and COAF Open Access Spend, 2014-15. https://blog.wellcome.ac.uk/2016/03/23/wellcome-trust-and-coaf-open-access-spend-2014-15/ Data: https://dx.doi.org/10.6084/m9.figshare.3118936.v1 2016c.
- West JD, Bergstrom T, Bergstrom CT. Cost effectiveness of open access publications. Economic Inquiry. 2014;52:1315–1321. doi: 10.1111/ecin.12117.
- Whitmire A, Briney K, Nurnberger A, Henderson M, Atwood T, Janz M, Kozlowski W, Lake S, Vandegrift M, Zilinski L. A table summarizing the Federal public access policies resulting from the US Office of Science and Technology Policy memorandum of February 2013. Figshare. 2015. doi: 10.6084/m9.figshare.1372041.
- Wicherts JM. Peer review quality and transparency of the peer-review process in open access and subscription journals. PLoS One. 2016;11:e0147913. doi: 10.1371/journal.pone.0147913.
- Wikipedia. List of academic journals by preprint policy. https://en.wikipedia.org/wiki/List_of_academic_journals_by_preprint_policy 2016.
- Willinsky J. The Access Principle: The Case for Open Access to Research and Scholarship. MIT Press; 2006.
- Wilsdon J, Allen L, Belfiore E, Campbell P, Curry S, Hill S, Jones R, Kain R, Kerridge S, Thelwall M, Tinkler J, Viney I, Wouters P, Hill J, Johnson B. The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management. 2015.
- Wohlrabe K, Birkmeier D. Do open access articles in economics have a citation advantage? Munich Personal RePEc Archive. 2014;56842. https://mpra.ub.uni-muenchen.de/id/eprint/56842
- Xia J, Nakanishi K. Self-selection and the citation advantage of open access articles. Online Information Review. 2012;36:40–51. doi: 10.1108/14684521211206953.
- Xu L, Liu J, Fang Q. Analysis on open access citation advantage: an empirical study based on Oxford Open journals. 2011. pp. 426–432.
- Zawacki-Richter O, Anderson T, Tuncay N. The growing impact of open access distance education journals: a bibliometric analysis. International Journal of E-Learning & Distance Education. 2010;24.
- Zhang Y. The effect of open access on citation impact: a comparison study based on web citation analysis. Libri. 2006;56:145–156. doi: 10.1515/LIBR.2006.145.
Decision letter
In the interests of transparency, eLife includes the editorial decision letter and accompanying author responses. A lightly edited version of the letter sent to the authors after peer review is shown, indicating the most substantive concerns; minor comments are not usually included.
Thank you for submitting your article "The benefits of open research: How sharing can help researchers succeed" to eLife for consideration as a Feature Article. Your article has been reviewed by three peer reviewers and the eLife Features Editor (Peter Rodgers), and this decision letter has been compiled to help you prepare a revised submission.
All three reviewers have agreed to reveal their identity: Robert Kiley; Chris Gorgolewski; Vincent Lariviere.
General assessment:
This paper is a much-needed overview of benefits and practical advice about open research. Instead of preaching and using moral arguments, the authors focus on benefits to the researcher of doing research openly, and provide evidence that researchers who practice open research (making articles OA, sharing data, publishing code under an open licence etc.) enjoy significant benefits (more citations, more media coverage) compared with researchers who don't practice open research.
Essential revisions:
1) (Section 2.1) Although the authors do provide evidence that OA leads to more citations, I think they need to recognise that there is significant disagreement within the community about this. The paper does cite one study from Davis (Davis, 2011) but he has published quite a few papers on this topic: please consider citing another one of these studies.
2) (Figure 1) It is misleading to plot only the maximum citation advantage: please plot the median advantage and/or the range of advantages (or, alternatively, drop the figure).
3) (Section 2.3) I found this a little self-contradictory. At the start the authors argue (rightly) that IF are a flawed measure, but then go on to quote the IF of OA journals. I know they go on to say "we reiterate that IF are flawed[…]" but if they really believe this, then arguing that some OA journals have high IF doesn't make sense. Please consider deleting the passage "In the 2012 Journal Citation Report...choose between IF and OA)."
4) (Section 2.4; first paragraph) This is debatable and should be toned down a bit. Results obtained with F1000 tend to show the opposite (https://scholarlykitchen.sspnet.org/2013/03/27/how-rigorous-is-the-post-publication-review-process-at-f1000-research/).
5) (Section 2.5.1) I think this section should mention the recent ASAPbio meeting – and subsequent researcher survey – which seem to suggest that researchers in the life sciences have woken up to the potential of preprints (albeit 25 years after physicists reached the same conclusion!)
6) (Section 3.2) Right at the start of the article (Introduction, first paragraph) the authors get to the nub of the problem – namely that open practices could present a risk to career advancement. This in my opinion is the big issue. Until researchers are persuaded that making their outputs open is not going to adversely impact them (and ultimately come to believe that it will benefit them) then changing behaviour is always going to be difficult. As such I was surprised that section 3.2 seemed to suggest that funder mandates are sufficient to bring about this change. Although I agree that some mandates are important, on their own they are not sufficient.
The article would be improved if it recognised that mandates are not enough and then set out a list of things funders could do to help move the needle in this space (e.g., maybe end-of-grant reports should actively recognise and reward behaviours like data sharing, undertaking peer review, publishing papers on preprint servers etc. – and be less fixated on counting journal article outputs.)
Further to this: the NIH requires grant applications to include a data-management plan; however, when the NIH is considering grant applications, it does not take into account whether applicants have a history of sharing data, and it does not penalize applicants if data from previous grants have not been shared. Until data sharing becomes an important part of review procedures, change will remain slow.
7) (Section 2.7) To be transparent, please mention the higher APCs of PLOS journals, as well as of for-profit publishers like Elsevier, Wiley, Springer, etc. When talking about high APCs the authors may also wish to cite the data the Wellcome Trust (and others) have published on this topic (e.g. blog.wellcome.ac.uk/2016/03/23/wellcome-trust-and-coaf-open-access-spend-2014-15/).
8) (Section 4.2) Being devil’s advocate here: the authors report that research projects that share data produce twice as many publications as those that do not share data. Isn't the more likely explanation of this relation that successful projects that have published a lot can afford to share data because the risk and consequences of scooping are much lower/smaller?
9) (Table 1) Please add the following columns: "Ability to leave feedback", "Provides DOI", and "Indexed by Google Scholar".
10) (Figure 2) Please expand the caption for this figure to better explain the five different organisations shown in the figure, maybe by giving examples of each type of organisation. Please also explain why there are categories for "research organisations", "funders" and "funders and research organisations". Also, please explain the category "multiple research organisations".
Essential revisions:
1) (Section 2.1) Although the authors do provide evidence that OA leads to more citations, I think they need to recognise that there is significant disagreement within the community about this. The paper does cite one study from Davis (Davis, 2011) but he has published quite a few papers on this topic: please consider citing another one of these studies.
To present this information in a more balanced way, we deleted the word ‘overwhelming’ from Section 2.1, first paragraph, and made other minor changes to wording of this section.
We added another paper from Davis (Davis, 2008):
P.M. Davis, B.V. Lewenstein, D.H. Simon, J.G. Booth, and M.J.L. Connolly. Open access publishing, article downloads, and citations: randomised controlled trial. BMJ, 337:a568, 2008.
We also added citations to 3 other studies which failed to find an OA citation advantage:
T.F. Frandsen. The effects of open access on un-published documents: A case study of economics working papers. Journal of Informetrics, 3(2):124–133, 2009.
P. Gaule and N. Maystre. Getting cited: does open access help? Research Policy, 40(10): 1332–1338, 2011.
V.C. Lansingh and M.J. Carter. Does open access in ophthalmology affect how articles are subsequently cited in research? Ophthalmology, 116(8):1425–1431, 2009.
We expanded Figure 1 to include some studies in which no OA citation advantage, or even a disadvantage, was found for certain disciplines (see below).
2) (Figure 1) It is misleading to plot only the maximum citation advantage: please plot the median advantage and/or the range of advantages (or, alternatively, drop the figure).
We revised Figure 1 to include mean citation advantages (medians are often not reported). These are now shown as a relative citation rate, instead of percentage. We also expanded the figure to include more disciplines and more studies, including some in which an OA citation advantage was not found.
3) (Section 2.3) I found this a little self-contradictory. At the start the authors argue (rightly) that IF are a flawed measure, but then go on to quote the IF of OA journals. I know they go on to say "we reiterate that IF are flawed[…]" but if they really believe this, then arguing that some OA journals have high IF doesn't make sense. Please consider deleting the passage "In the 2012 Journal Citation Report…choose between IF and OA)."
We appreciate the reviewer’s concern, but believe strongly that we have to discuss IF since it is often cited by researchers as a barrier to publishing openly. In numerous author surveys, researchers repeatedly rank impact factor and associated journal reputation as among the most important factors they consider when deciding where to publish. To support this, we added citations to the following author surveys:
Nature Publishing Group (2015): Author Insights 2015 survey. figshare. https://dx.doi.org/10.6084/m9.figshare.1425362.v7
Solomon DJ. (2014) A survey of authors publishing in four megajournals. PeerJ 2:e365. https://doi.org/10.7717/peerj.365
Given our researcher-centric approach, it is important to recognize concerns about IF as a practical, albeit regrettable, reality. We believe that ignoring this reality, and specifically removing the recommended passage with data on the IFs of OA journals, would weaken the paper. This information provides researchers with options that satisfy both their worry about publishing in high IF journals and their wish to do so openly. To clarify our goals with this section, we made several small changes in wording and sentence order, and added a closing statement, which reads:
“We hope that in the next few years, use of IF as a metric will diminish or cease entirely, but in the meantime, researchers have options to publish openly while still meeting any IF-related evaluation and career-advancement criteria.”
4) (Section 2.4; first paragraph) This is debatable and should be toned down a bit. Results obtained with F1000 tend to show the opposite (https://scholarlykitchen.sspnet.org/2013/03/27/how-rigorous-is-the-post-publication-review-process-at-f1000-research/).
We reworded this section to read:
“Some studies suggest open peer review may produce reviews of higher quality, including better substantiated claims and more constructive criticisms, compared to closed review [Donovan, Watson and Osborne, 2015; McCabe and Snyder, 2015]. Additional studies have also argued that transparent peer review processes are linked to measures of quality. Other studies have reported no differences in the quality of open versus closed reviews [Wicherts, 2016; Rooyen et al., 1999]. More research in this area is needed.”
We have added the following references to controlled studies finding no difference in quality of open versus closed reviews:
S. Van Rooyen, F. Godlee, S. Evans, N. Black, and R. Smith. Effect of open peer review on quality of reviews and on reviewers’ recommendations: a randomised trial. BMJ, 318(7175):23–27, 1999.
S. van Rooyen, T. Delamothe, and S.J.W. Evans. Effect on peer review of telling reviewers that their signed reviews might be posted on the web: randomised controlled trial. BMJ, 341:c5729, 2010.
5) (Section 2.5.1) I think this section should mention the recent ASAPbio meeting – and subsequent researcher survey – which seem to suggest that researchers in the life sciences have woken up to the potential of preprints (albeit 25 years after physicists reached the same conclusion!)
We added mention of the ASAPbio meeting (end of Section 2.5.1) and cited the following references on the meeting and survey:
J.M. Berg, N. Bhalla, P.E. Bourne, M. Chalfie, D.G. Drubin, J.S. Fraser, C.W. Greider, M. Hendricks, C. Jones, R. Kiley, S. King, M.W. Kirschner, H.M. Krumholz, R. Lehmann, M. Leptin, B. Pulverer, B. Rosenzweig, J.E. Spiro, M. Stebbins, C. Strasser, S. Swaminathan, P. Turner, R.D. Vale, K. VijayRaghavan, and C. Wolberger. Preprints for the life sciences. Science, 352(6288): 899–901, 2016.
ASAPbio. Opinions on preprints in biology. Accessed May, 2016 at http://asapbio.org/survey. Data available via figshare: https://dx.doi.org/10.6084/m9.figshare.2247616.v1.
6) (Section 3.2) Right at the start of the article (Introduction, first paragraph) the authors get to the nub of the problem – namely that open practices could present a risk to career advancement. This in my opinion is the big issue. Until researchers are persuaded that making their outputs open is not going to adversely impact them (and ultimately come to believe that it will benefit them) then changing behaviour is always going to be difficult. As such I was surprised that section 3.2 seemed to suggest that Funder mandates are sufficient to bring about this change. Although I agree that some mandates are important, on their own they are not sufficient.
While we agree that mandates are unlikely to bring about the culture change we would like to see, there is evidence that mandates are effective in increasing rates of article and data sharing (see work from Harnad and colleagues, especially). More importantly, our goal with this section is not to argue that mandates are sufficient, but rather that “[researchers] already engaging in [open] practices will likely have a competitive advantage”.
The article would be improved if it recognised that mandates are not enough and then set out a list of things funders could do to help move the needle in this space (e.g., maybe end-of-grant reports should actively recognise and reward behaviours like data sharing, undertaking peer review, publishing papers on preprint servers etc. – and be less fixated on counting journal article outputs.)
We recognize in the subsequent section (section 4) that “[funder mandates] may be a compelling but dissatisfying reason to practice openly”. However, our primary target audience for this article is researchers, so we have focused on outlining the steps they can take and showing them “there are other compelling reasons to share”.
Further to this: the NIH requires grant applications to include a data-management plan; however, when the NIH is considering grant applications, it does not take into account whether applicants have a history of sharing data, and it does not penalize applicants if data from previous grants have not been shared. Until data sharing becomes an important part of review procedures, change will remain slow.
We added mention of policy revisions implemented by NIH and Wellcome Trust, detailing how funds can be suspended or withheld if researchers do not comply with mandates (Section 3.2, last paragraph). We cited the following relevant references, one of which (van Noorden, 2014) discusses how both funders have already followed through on enforcement:
National Institutes of Health (NIH). Upcoming Changes to Public Access Policy Reporting Requirements and Related NIH Efforts to Enhance Compliance, 2012. Retrieved June, 2016 from http://grants.nih.gov/grants/guide/notice-files/NOT-OD-12-160.html. Last updated Feb., 2013.
Van Noorden, R. Funders punish open-access dodgers. Nature News, 2014. Retrieved June, 2016 from http://www.nature.com/news/funders-punish-open-access-dodgers-1.15007.
Wellcome Trust. Wellcome Trust strengthens its open access policy, 2012. Retrieved June, 2016 from https://wellcome.ac.uk/press-release/wellcome-trust-strengthens-its-open-access-policy.
7) (Section 2.7) To be transparent, please mention the higher APCs of PLOS journals, as well as of for-profit publishers like Elsevier, Wiley, Springer, etc. When talking about high APCs the authors may also wish to cite the data the Wellcome Trust (and others) have published on this topic (e.g. blog.wellcome.ac.uk/2016/03/23/wellcome-trust-and-coaf-open-access-spend-2014-15/).
We added mention of the higher APCs charged by some OA publishers, like PLOS and Frontiers (Section 2.7, last paragraph). We also felt the no-cost/low-cost examples here were numerous, so we have stricken two of them.
We added the suggested reference from Wellcome Trust, as well as one from Stuart Lawson documenting high APCs:
S. Lawson. APC data for 27 UK higher education institutions in 2015. figshare, 2016. Retrieved June, 2016 from https://dx.doi.org/10.6084/m9.figshare.1507481.v4.
Wellcome Trust. Wellcome Trust and COAF Open Access Spend, 2014-15. Retrieved June, 2016 from https://blog.wellcome.ac.uk/2016/03/23/wellcome-trust-and-coaf-open-access-spend-2014-15/. Data available via figshare doi: 10.6084/m9.figshare.3118936.v1.
We also added a citation to a new study of over 10,300 OA journals, showing that 71% do not charge an APC (Section 2.7, first paragraph) and that the average APC for OA journals is around $665 (Section 2.7, second paragraph):
W. Crawford. Gold Open Access Journals 2011-2015. Cites & Insights Books, 2016. Accessed June, 2016 via http://waltcrawford.name/goaj.html.
8) (Section 4.2) Being devil’s advocate here: the authors report that research projects that share data produce twice as many publications as those that do not share data. Isn't the more likely explanation of this relation that successful projects that have published a lot can afford to share data because the risk and consequences of scooping are much lower/smaller?
The authors of Pienta et al. admit that “It is unclear whether larger numbers of primary publications lead to data sharing or if sharing data leads to more primary publications”. However, the authors did control for factors such as Principal Investigator age, gender, career status, and funding history, as well as features of the grant such as duration as an indirect measure of grant size. None of these factors sufficiently explained the primary or secondary publication advantage conferred by data sharing.
9) (Table 1) Please add the following columns: "Ability to leave feedback", "Provides DOI", and "Indexed by Google Scholar".
We added the columns “Can leave feedback?” and “Third party persistent ID?”. The latter is broader and includes externally managed persistent identifiers such as DOIs, Handles, and ARKs. We added a footnote to the table saying that all the listed preprint servers and repositories are indexed by Google Scholar. After community feedback, we also added several relevant repositories.
10) (Figure 2) Please expand the caption for this figure to better explain the five different organisations shown in the figure, maybe by giving examples of each type of organization. Please also explain why there are categories for "research organisations", "funders" and "funders and research organisations". Also, please explain the category "multiple research organisations"
Based on the information provided by ROARMAP, we added the following explanation and examples to the figure caption:
“Data are broken down by policymaker type: funder (e.g. Wellcome Trust), joint funder and research organization (e.g. British Heart Foundation), multiple research organizations i.e. associations and consortia (e.g. Max Planck Society), research organization i.e. university or research institution (e.g. CERN), and subunit of research organization (e.g. Columbia University Libraries).”
Associated Data
This section collects any data citations, data availability statements, or supplementary materials included in this article.
Data Citations
- Biosharing.org. A catalogue of data preservation, management and sharing policies from international funding agencies, regulators and journals. biosharing.org/policies 2016.

