Are Funder Open Access Platforms a Good Idea?
Abstract
As open access (OA) to publications continues to gather momentum, we should continuously question whether it is moving in the right direction. A novel intervention in this space is the creation of OA publishing platforms commissioned by funding organizations. Examples include those of the Wellcome Trust and the Gates Foundation, as well as recently announced initiatives from public funders like the European Commission and the Irish Health Research Board. As the number of such platforms increases, it becomes urgently necessary to assess in which ways, for better or worse, this emergent phenomenon complements or disrupts the scholarly communications landscape. This article examines ethical, organizational, and economic strengths and weaknesses of such platforms, as well as usage and uptake to date, to scope the opportunities and threats presented by funder OA platforms in the ongoing transition to OA. The article is broadly supportive of the aims and current implementations of such platforms, finding them a novel intervention which stands to help increase OA uptake, control costs of OA, lower administrative burden on researchers, and demonstrate funders’ commitment to fostering open practices. However, the article identifies key areas of concern about the potential for unintended consequences, including the appearance of conflicts of interest, difficulties of scale, potential lock-in, and issues of the branding of research. The article ends with key recommendations for future consideration which include a focus on open scholarly infrastructure.
Introduction
In the age of open access (OA), research funding organizations have taken a more active interest in academic publishing. To increase access to research results stemming from their funding, they are increasingly directly funding publishing (via article processing charges), supporting infrastructures, and introducing policies to require their researchers to publish OA.
A step-change in this engagement is the recent phenomenon of OA publishing platforms commissioned by funding organizations. Examples include those of the Wellcome Trust and the Gates Foundation, as well as recently announced initiatives from public funders like the European Commission (EC) and the Irish Health Research Board (HRB). As the number of such platforms increases, it becomes urgently necessary to assess in which ways, for better or worse, this emergent phenomenon complements or disrupts the scholarly communications landscape.
This article examines ethical, organizational, and economic strengths and weaknesses of such platforms, as well as usage and uptake to date, to scope the opportunities and threats presented by funder OA platforms in the ongoing transition to OA.
Structural Conditions of Funder Engagement With Publishing
The relationship between research funding organizations and scholarly publishing seems to have entered a new, more active phase of engagement in the age of OA. Researchers’ ability to choose where to publish their results has long been taken to be a matter of fundamental academic freedom (American Association of University Professors [AAUP], 1940; United Nations Educational, Scientific and Cultural Organization [UNESCO], 1997). Funders in the second half of the 20th century certainly required acknowledgment of their funding in publications, disseminated commissioned studies via publication offices, and sometimes supported the payment of “colour charges” (Hall & Bembridge, 1986, p. 273) and “page charges” (National Science Foundation, 1977). Moreover, by setting expectations for publication records, they did indirectly influence publication patterns. However, they seem largely to have avoided policy prescriptions regarding where or how their fundees should publish, and to have avoided direct intervention in the manner of research institutions and researchers’ membership organizations such as scholarly societies and national academies, which often directly operated publication initiatives (e.g., journals, serials, and presses).
This has changed. Since the rise of the OA agenda at the end of the last century, given urgent and compelling voice in the 2002 declaration of the Budapest Open Access Initiative (Chan et al., 2002), funders have taken an increasingly active interest in matters of publication. The Wellcome Trust (2018) states that “Transforming human health will take longer if research outputs—like publications, data, software and biological materials such as cell lines—aren’t managed, shared and used in ways that unleash their full value.” The Gates Foundation sees OA publishing of peer-reviewed research as holding the potential for researchers from diverse backgrounds to come together and accelerate the research process, in turn leading to new ways of making people’s lives longer, healthier, and more productive (Morgan, 2017). As a final example of funder motivations to actively support OA publishing, the Open Research Funders Group (ORFG, 2018) has committed itself to the open sharing of research outputs, because it believes this will “benefit society by accelerating the pace of discovery, reducing information-sharing gaps, encouraging innovation, and promoting reproducibility” (ORFG, 2018). OA to publications means that research publications can be accessed online, free of charge by any user, with no technical obstacles. At the minimum, such publications can be read online, downloaded, and printed (ideally other rights to copy, distribute, remix, and mine would also be granted). Access can be either through author archiving in online repositories (“green OA”) or by publishing in OA journals (either full OA or hybrid) or other publication outlets (“gold OA”).
From the early 1990s, several initiatives have sought to harness the power of emergent digital networked technologies to provide access to research outputs. Often these have been driven by the research community, for example, the foundation of the arXiv.org preprint server in 1991 (Ginsparg, 2016). Several independent journals made their content freely available online, typically hosted by research institutes or departments. In the early 2000s, commercial (e.g., BioMed Central) and not-for-profit (e.g., Public Library of Science) publishers started to introduce and experiment with new OA business models, charging authors (rather than readers) for publication services. Observing these developments, and concerned both to increase access to their funded results and to find a solution to the spiraling costs of subscriptions in the early 2000s (the so-called serials crisis) (Kiley & Terry, 2006), funders worldwide began to implement measures to support a transition toward OA.
To this end, since the early to mid-2000s, major funders have increasingly introduced policies or mandates to encourage or prescribe OA for publications deriving from their research funding (Vincent-Lamarre, Boivin, Gargouri, Larivière, & Harnad, 2016; Xia et al., 2012). For example, each of the 30 Science Europe member institutions now either has an OA policy or is in the process of implementing one (Crowfoot, 2017). Perhaps mindful of the fact that such measures can be argued to infringe upon the academic freedom of researchers to choose where to publish (Johnston, 2017), funders remain keen to emphasize choice. Hence, funder OA policies, at least in the Global North, follow a broadly similar approach: they allow a mixture of green and gold OA options, fund article-processing charges, and impose restrictions on the maximum length of embargo periods (the publisher-prescribed length of time from publication until author-archived versions can be made openly available) for green OA. However, the nuances of these policies are often complex, with different legal, financial, disciplinary, and cultural contexts affecting factors like the extent to which gold or green is preferred and levels of funding for Article Processing Charges (APCs) (Science Europe, 2016). In Europe, for example, although many countries favor green OA, or a balanced approach, there is a preference for gold OA in the United Kingdom, the Netherlands, and Austria.
OA to publications is now a mainstream policy among major research funding organizations. Funders such as the EC have recently set the target that all European research articles should be available via OA from 2020 onward (Enserink, 2016; EC, 2017, 2018b).
But this commitment brings an increasing need for funders to engage with the economics and politics of the provision of awareness-raising and support measures, publication funds, and repository infrastructures.
The barriers to OA are diverse, but top-line factors include lack of funding for gold OA publication charges, perceptions of lower quality of OA journals, and the complexities of embargo and licensing policies (Dallmeier-Tiessen et al., 2011). Other potential barriers include insufficient training, copyright/licensing challenges related to third-party content, and lack of incentives within organizations and research communities to move away from publication in traditional, restricted access journals.
Availability of financial support for APCs is hence a major driver for OA in many countries (Dallmeier-Tiessen et al., 2011; Jubb, Goldstein, Amin, & Pinfield, 2015). However, as shown by a recent survey among former grantees in the context of the EC’s FP7 post-grant OA pilot, many report difficulties in accessing funds for OA publication charges. Overall, less than a fifth of the roughly 300 respondents reported having access to an institutional publication fund, and this share was particularly low for respondents from Eastern Europe (0%) and Northern Europe (5%). More commonly, respondents used or had access to research grants (about 50%), personal funds (about 45%), and/or institutional or departmental funds (less than 30%) (Johnson, Fosci, Chiarelli, Pinfield, & Jubb, 2017).
Hence, many funders support the costs of APCs, either by making them eligible grant costs or by making available earmarked funds. This constitutes a considerable new financial burden for funders, who obviously have an interest in keeping costs down. However, controlling costs can be in conflict with the aim of increasing uptake of OA. The APC market is still emergent, with unresolved questions about what costs are reasonable, most obviously with regard to so-called “hybrid” OA, where the market has been branded “dysfunctional” (Björk & Solomon, 2014). This is exacerbated by a lack of transparency on the actual costs of publishing, and a perceived “price of prestige”—where APCs in more prestigious journals tend to be higher than in cheaper, less prestigious venues offering similar levels of service (Van Noorden, 2013).
Data from the Open APC initiative from 2005 to 2018 showed that across all 158 participating research performing institutions and research funders (mainly from Germany, the United Kingdom, Norway, Sweden, and Austria), the average APC for fully OA journals was €1,481 (Mdn = €1,407), but substantially higher for hybrid journals (average €2,490, Mdn = €2,443) (data as of May 6, 2018) (“OpenAPC Dataset,” n.d.). There is hence concern that hybrid APCs often reflect traditional publishers’ desire to maintain existing profit margins and market position rather than the true costs of publishing (Laakso & Björk, 2016). Currently a large share of APC expenditure goes to hybrid OA. For example, over the period 2013-2016, the Wellcome Trust spent only around a fifth (about €1.8 million) of its total APC expenditure on articles in fully OA journals (1,015 articles, M costs = €1,756, SD = €819, Mdn = €1,604) and over €7.1 million on articles in hybrid journals (2,767 articles, M costs = €2,572, SD = €893, Mdn = €2,565) (data as of May 6, 2018) (“OpenAPC Dataset,” n.d.). Exacerbating this, publishers have been accused of “double dipping” through hybrid OA (Björk & Solomon, 2014), gaining extra income by charging APCs and subscription fees for the same content. Given this situation, it has been plausibly claimed that subscription journals lack incentives to move toward OA (Johnson et al., 2017).
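Summary figures of this kind can be approximated directly from the openly available Open APC dataset. The sketch below is illustrative only: the file location and the column names (euro for the APC paid, is_hybrid for the journal type) are assumptions based on the public openapc-de repository and should be checked against the current release.

    # Illustrative sketch: mean and median APCs for fully OA vs. hybrid journals,
    # computed from the public Open APC dataset. File location and column names
    # (euro, is_hybrid) are assumptions based on the openapc-de repository.
    import pandas as pd

    URL = "https://raw.githubusercontent.com/OpenAPC/openapc-de/master/data/apc_de.csv"
    apc = pd.read_csv(URL)

    # Group articles by journal type and summarize the APCs paid (in EUR).
    summary = apc.groupby("is_hybrid")["euro"].agg(["count", "mean", "median"])
    print(summary.round(0))

Figures computed this way will of course drift from those quoted above as further institutions report their expenditure to the initiative.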
Some funders have reacted by capping the levels of APCs they will pay or refusing to pay for hybrid publications (De Castro, 2015). In other cases, costs for (hybrid) OA are included in big deal negotiations, for example, in the Netherlands (Heijne & van Wezenbeek, 2018). However, non-disclosure clauses often make it impossible to assess the true financial implications of such agreements. A study from 2013 targeted 10 biomedical research funders and investigated their approaches to the implementation of OA policies and related cost issues. Several of the funders expressed worries about escalating costs as gold OA becomes more mainstream. In this context, they hoped or expected that OA would increasingly play a role in researchers’ decision-making processes about where to publish. Interviewees pointed out that researchers might currently be too insulated from the costs of publishing, and that an increased author awareness of costs would be a desirable outcome of the move toward OA. In addition, one interviewee believed that costs may become a factor when choosing between less prestigious journals (Collins, 2013).
To date, these decision-making processes have not been studied in detail, and it must be noted that OA continues to play a secondary role when it comes to the selection of where to publish. To a certain extent, it can be expected that awareness of OA publication costs is higher in projects where researchers have to cover these costs out of their own project budgets to secure compliance with a funder mandate. In turn, researchers will be less aware if these costs are directly covered by funders or institutions, or if deals with publishers are in place.
A number of efforts have been made to research the effect of “flipping” non-OA journals to OA (Solomon, Laakso, & Björk, 2016). There have been a few research institution–led initiatives to convert journals to OA at no cost to the author. A discipline-specific initiative is SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics), which involves redirecting subscription fees and instead paying for OA from a central fund (Romeu et al., 2014). At a much larger scale, the OA2020 initiative has been launched, led by the Max Planck Society based on a 2015 white paper (Schimmer, Geschuhn, & Vogler, 2015). It has many European national funders committed to a model of redirecting existing subscription fees into OA funds, at a large cross-disciplinary scale, with the aim of disrupting the existing subscription system. OA2020 has, however, been criticized for seeking to reproduce the current dependency on a very few large commercial publishers who have proven themselves to be expensive and resistant to change. The Confederation of Open Access Repositories (COAR) and UNESCO (2016), in a joint statement, indirectly referred to the OA2020 initiative and pointed out that a number of issues need to be addressed during the large-scale transition to OA; in particular, such a system needs to provide support for researchers from institutions with smaller budgets or in developing countries who may not be able to meet the fees; further concentration of the publishing market needs to be avoided; and mechanisms should be developed to ensure cost reductions (Fecher, Friesike, Peters, & Wagner, 2017).
Among researchers, positive sentiments toward OA have yet to be fully reflected in publishing choices. Researchers are very aware of OA, and the vast majority believe it beneficial (Ruiz-Perez, 2017). However, this does not seem to translate into practice. Dallmeier-Tiessen et al.’s study from 2011 for the SOAP project (http://project-soap.eu/) found that although almost 90% of respondents reported positive attitudes toward OA, only 52% had actually published via that route (Dallmeier-Tiessen et al., 2011). The lesson here is that researchers value OA in the abstract but are more reluctant to put it into practice. This can be attributed to a continuing lack of structural incentives to choose OA, especially in institutional promotion and tenure procedures (Xia, 2010), as well as lack of awareness about green OA “self-archiving” options, recurrent skepticism about the quality of OA journals, difficulties in accessing funds for OA publications, and general unease with novel workflows (Dallmeier-Tiessen et al., 2011). Hence, despite OA mandates, progress in the OA transition to date has been relatively slow. A very recent study, for instance, estimates 28% of the scholarly literature to be OA (either green or gold) as of 2017 (Piwowar et al., 2018). Other studies reach different conclusions, depending on methodology and OA definition, but reflect the general conclusion of relatively low uptake overall. Jubb calculated that 16.6% of all articles are published in gold OA (Jubb et al., 2015). An Organisation for Economic Co-Operation and Development (OECD) report by Boselli and Galindo-Rueda (2016) estimated, meanwhile, that around 30% of publications are OA, with around 20% of closed articles later made available via green OA. In total, Boselli and Galindo-Rueda believe “approximately 50-55% of documents are openly available 3-4 years after publication.” A report by Science Metrix (Archambault et al., 2014) for the EC found that as of April 2014, more than 50% of the scientific papers published between 2007 and 2012 could be downloaded for free on the Internet, although this included a large proportion of articles which do not meet the definition of OA (being of dubious copyright or hosted on proprietary platforms like ResearchGate). What is more, growth in the OA market seems to be slowing, or at least no longer accelerating (Johnson et al., 2017).
Given these conditions, it is clear that achieving the transition to OA within a reasonable time period requires continued intervention from stakeholders interested in achieving that goal, including research funding organizations. Yet funders are increasingly aware that their interventions can influence market development in unexpected and potentially undesirable ways (Björk & Solomon, 2014). For example, current evidence suggests that generous funding for hybrid publications may lead to a steep increase in OA publications in the short term, but at the expense of a long-term increase in average APC levels (Johnson et al., 2017). In the United Kingdom, for instance, the 2013 gold OA-focused RCUK Open Access Policy and its provisions for APC Block Grants resulted in a large increase in hybrid expenditure. The result was that by 2015, U.K. institutions’ “use of OA in hybrid journals and of delayed OA journals is more than twice the world average in both cases, while its take-up of fully OA journals with no APC (Gold-no APC) is less than half the world average and falling” (Jubb et al., 2015).
Other Funder Infrastructure Investments
In parallel to these direct investments in OA publishing, funders have a longer history of supporting publishing infrastructures and other supporting services to foster OA:
Publishing services: The Public Knowledge Project (PKP), which develops and maintains the open source Open Journal Systems (OJS), is financially supported by the Canada Foundation for Innovation, the Canadian Internet Registration Authority, the Laura and John Arnold Foundation, and the MacArthur Foundation. Another example of funders supporting publishing services is the Collaborative Knowledge Foundation (Coko), which is supported by the Laura and John Arnold Foundation, the Gordon and Betty Moore Foundation, and the Shuttleworth Foundation.
Preprint servers: Preprints are complete drafts of scientific documents, not yet peer-reviewed, that are made available online, often via dedicated repositories known as “preprint servers” (Bourne, Polka, Vale, & Kiley, 2017). arXiv, established in 1991, is by far the most used preprint server (for physics, mathematics, computer science, quantitative biology, quantitative finance, and statistics). Further preprint servers were established a few years later—for example, RePEc (Research Papers in Economics; which indexes several digital archives) and SSRN (the Social Sciences Research Network; which was acquired by Elsevier in 2016). Spurred by the creation of bioRxiv by Cold Spring Harbor Laboratory in 2013, and the advocacy efforts of ASAPbio, a scientist-driven initiative to promote the use of preprints in the life sciences, interest in preprints has grown sharply in recent years (Luther, 2017). A host of new preprint servers have since begun to appear, including, but not limited to, many hosted by the Center for Open Science: SocArXiv (social sciences, since 2016), PsyArXiv (psychology, since 2016), PaleorXiv (paleontology, since 2017), EarthArXiv (geosciences, since 2017), and LawArxiv (law, since 2017). SciELO, the Scientific Electronic Library Online, which provides OA to more than 1,200 journals from Latin America, Spain, Portugal, and South Africa, has also announced that it will launch a preprint service in 2018 (Packer, Santos, & Meneghini, 2017). Funders have played a role in fostering these developments. arXiv lists among its supporting members the European Research Council, the Austrian Science Foundation (FWF), and the Simons Foundation (arXiv, 2018); bioRxiv receives support from the Chan Zuckerberg Initiative (CZI) (Callaway, 2017); and the group of preprint servers hosted on the Open Science Framework is supported by the Center for Open Science, in turn funded by the Arnold Foundation (Center for Open Science, 2018).
Repositories: In 2000, the National Institutes of Health (NIH), through the National Library of Medicine (NLM), launched PubMed Central (PMC) as a full-text journal article repository. From 2005 onward, it has been the designated repository for research articles in the biomedical and life sciences funded by a number of U.S. government funders. In Europe, the Wellcome Trust, together with 27 other research funders, supports Europe PMC, where research articles resulting from their funding are deposited in parallel to PMC (The Europe PMC Consortium, 2015).
Repository aggregators and abstracting/indexing services: Institutional repositories receive coordination support via regional efforts like OpenAIRE (funded by the EC), SHARE (funded, in part, by the U.S. Institute of Museum and Library Services [IMLS] and the Alfred P. Sloan Foundation), and LA Referencia (funded by Latin American public science and technology agencies). Other services enable discovery of OA outputs by collecting, organizing, and systematizing information regarding OA publications from diverse platforms. Examples include the OAPEN Library of OA books, which provides a deposit service to the Wellcome Trust, the Austrian Science Foundation, and Knowledge Unlatched. In addition, OAPEN (Open Access Publishing in European Networks) is conducting projects with the Swiss National Science Foundation and the European Research Council (OAPEN, 2018).
Other enabling services: In addition, funders have supported a range of awareness-raising and capacity-building activities by providing information on OA at various levels, from the general (what OA is, its aims and objectives) to the specific (e.g., individual journal OA policies, registries of entities). The former can be exemplified by OpenAIRE’s network of 33 National Open Access Desks and the FOSTER Open Science training initiative, while examples of the latter include the SHERPA services RoMEO (journal policies) and JULIET (funder OA policies), as well as OpenDOAR (OA repositories)—services supported via the U.K. infrastructure funder JISC. Research funders have also supported several studies which investigated the relationship between OA policies and services, as well as the development of strategies for sustaining core services (Johnson & Fosci, 2015).
Funder OA Platforms
Faced with high APC costs while trying to foster change toward a sustainable OA ecology, funders have brought the idea of funder OA platforms to the fore.
Wellcome Open Research
The Wellcome Trust, one of the world’s largest biomedical charitable foundations, in July 2016 announced its plan to launch an OA publishing platform to be titled Wellcome Open Research (henceforth WOR) (Butler, 2016). The announcement specified that management of the platform would be contracted to the OA publishing platform F1000Research and follow that platform’s publishing model. In the F1000Research model, following only an initial light “sanity check” by a professional editor, research outputs are immediately published and then openly peer-reviewed, with review reports and reviewer names published alongside in real time (F1000, 2018).
Wellcome has traditionally been at the forefront of debates about OA and data sharing. It has supported APC payments since 2003 and, in 2006, introduced a strict OA mandate that all publications must be made available within 6 months of publication via PMC (Kiley & Terry, 2006; Walport & Kiley, 2006). In 2012, together with the Howard Hughes Medical Institute and the Max Planck Society, Wellcome launched eLife, a peer-reviewed OA journal for biomedical and life sciences that aimed to compete with the most prestigious journals like Nature, Cell, and Science (Schekman, Patterson, Watt, & Weigel, 2012). In so doing, Wellcome took a step beyond merely supporting OA to take a direct interest in publishing. eLife remained editorially independent from its funders, however, and committed to publishing all research on its merit regardless of funding organization. In 2017, the Wellcome Trust was even revealed to have been among a group of investors who invested $52.6 million in ResearchGate, the academic social network (ResearchGate, 2017).
The 2016 announcement of WOR, however, was a step-change in engagement in publishing. It was welcomed as such by OA advocates like arXiv founder Paul Ginsparg, who said, “This really is a potential game changer for a major funder to be taking control of the research output” (Bohannon, 2016). Robert Kiley explained Wellcome’s motivation for the platform as stemming from a wish to increase speed, transparency, and reproducibility in scholarly communications, by offering a venue with no author-facing charges and relative cost-effectiveness for the funder, that would allow its researchers to publish all their research outputs (from articles and datasets to case reports and protocols, to null and negative results). All Wellcome researchers would be able to use the platform but could still publish wherever else they wished. The stated APCs charged per article were not significantly different than those charged by the F1000Research platform. The platform was opened for submissions in October 2016 and the first group of articles were published a month later. The next section gives an analysis of the outputs from WOR’s first year.
Analysis of the First Year of WOR
In this section, we report some findings based on the publication metadata and related events on the WOR publication platform, considering all 192 publications (all versions included) submitted between October 17, 2016, and November 17, 2017. A more detailed version of this analysis is available online (Schmidt, 2018a, 2018b).
Over this period of 13 months, the submission rate to WOR was rather modest, with about 15 papers per month, and no acceleration of submissions could be observed (see Figure 1).
Several article types can be published on WOR. So far, about three out of five of all articles are research articles (88 articles, 62%), followed by method articles (13 articles, 9%), study protocols (10 articles, 7%), and several smaller categories.
Of the 142 papers published on WOR by the end of November 2017, 95 have only one version, 47 have two versions, and three have three versions. The share of papers with only one version seems rather high. This might be partly because, for some papers, the review-revise process had not yet concluded.
Overall, 1,110 authors have been involved in the writing of 142 publications. In addition, seven consortia contributed to the writing of seven papers. On average, about eight authors were involved in each paper (M = 7.9, SD = 5.5, minimum = 1, maximum = 31).
Regarding duration between publication events, there was some variation depending on publication type (see Figure 2). For research articles, the first review was typically received within about 43 days, and the second review within another 12 days. Indexing in PubMed and other bibliographic databases was accomplished by Day 65. The time until receiving the first review was somewhat longer for study protocols (Mdn = 57 days) and shortest for open letters (22 days) and data notes (28 days).
As the information in the WOR dataset was incomplete regarding the review outcomes (variable “review status” with the following possible values: accepted, accepted with reservations, rejected), we considered only those articles that were already indexed by Europe PMC. It must be noted that publications are indexed only after they have “passed” peer review. A paper is considered to have passed peer review if it has received at least two approved referee reviews, or one approved plus two approved with reservations reviews (WOR, 2018). In consequence, the review ratings for papers on Europe PMC will naturally be somewhat skewed toward more positive reviews.
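For illustration, this indexing criterion can be written down as a simple check. The sketch below is ours, not part of the WOR or F1000 codebase, and assumes review ratings are available as the plain strings used on the platform.

    # Minimal sketch of the WOR indexing criterion described above (WOR, 2018):
    # a paper passes peer review with at least two "approved" reviews, or with
    # one "approved" plus two "approved with reservations" reviews.
    def has_passed_peer_review(ratings):
        approved = ratings.count("approved")
        reservations = ratings.count("approved with reservations")
        return approved >= 2 or (approved >= 1 and reservations >= 2)

    # Example: one approval plus two approvals with reservations passes.
    print(has_passed_peer_review(
        ["approved", "approved with reservations", "approved with reservations"]))  # True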
For WOR articles, review status information was parsed from the Europe PMC website. Information on 354 review reports was retrieved for all 111 WOR articles available on Europe PMC. In addition, there were 100 author responses. The distribution is strongly skewed toward positive review ratings: over three fourths approved (267 reviews, 75.4%), nearly one fourth approved with reservations (84 reviews, 23.7%), and less than 1% rejected the article under review (three reviews, 0.9%).
This result is in line with the review ratings on the parent platform F1000Research. Based on a retrieval of all F1000Research research articles that have been indexed in Europe PMC, we consider 3,880 review reports related to 1,200 research articles. The distribution is very similar to the above: about three out of four reports approved (2,913 reviews, 75.1%), nearly one out of four approved with reservations (901 reviews, 23.2%), and only 1.7% (66 reviews) rejected the research article under review. Research articles received between two and eight reviews, with three reviews on average.
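The underlying records can also be retrieved programmatically via the Europe PMC REST API rather than parsed from the website. The sketch below is an approximation of such a retrieval, not the exact procedure used for the analysis above; the journal label in the query string is an assumption to be verified against the Europe PMC search syntax.

    # Illustrative retrieval of WOR records from the Europe PMC REST API.
    # The journal label in the query is an assumption; only the first page
    # of results is fetched here.
    import requests

    BASE = "https://www.ebi.ac.uk/europepmc/webservices/rest/search"
    params = {
        "query": 'JOURNAL:"Wellcome Open Res"',  # assumed journal label
        "format": "json",
        "pageSize": 100,
    }
    resp = requests.get(BASE, params=params)
    resp.raise_for_status()
    payload = resp.json()
    print(payload["hitCount"], "records indexed;",
          len(payload["resultList"]["result"]), "fetched on this page")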
From this analysis, although no prior indication was given by Wellcome as to what would constitute success, it must be admitted that WOR cannot yet be regarded as a full success. Operationally, the processing of submitted papers seems to work well, but the overall uptake can be argued to be low compared with the investment made by the Wellcome Trust (although the total cost of the platform is unknown). The 142 publications on WOR amount to a share of about 2% of all Wellcome-funded publications (estimate based on the average annual number of Wellcome-funded publications indexed by Europe PMC in 2013-2016; over 27,000 publications overall). Kiley points out that WOR has been the fifth most popular publication venue during this first year of operation, after Scientific Reports, PLoS ONE, Nature Communications, and eLife (Kiley, 2017). It should also be noted that while information on APCs is publicly available, further information about the cost breakdown of running WOR, including in comparison to F1000Research, is not publicly available, thus preventing further assessment of WOR from the point of view of cost-effectiveness.
The very low rejection rate on F1000Research has been strongly criticized by Vines (2013), who points to the very high rate of positive reviews (“approved” and “approved with reservations”) in comparison with a sample of papers from medical journals, for which reviews were substantially longer on average (464 vs. 254 words) and only 42% were positive. Vines even goes so far as to dismiss the reviews entirely, advising readers to treat papers on F1000Research as if they had never been through peer review. Although this view seems somewhat exaggerated, it seems reasonable that in the case of positive review ratings, the motivation for authors to revise a paper may be lower. In addition, the label “not approved” is not to be confused with “rejected” (see WOR FAQs). The notion that journals advertise high rejection rates as a measure of prestige has been criticized by several authors, not least because the most cited journals do not necessarily have the highest rejection rates (Schultz, 2010), and low rejection rates can actually be interpreted as a sign of self-regulation and high efficiency (Pöschl, 2012). Perhaps most importantly, when peer review focuses on assessing methodological quality rather than the perceived importance of the reported research, rejection rates are expected to be lower, as no artificial scarcity is created by selectivity.
Further Funder Platforms
Inspired by the Wellcome example, in March 2017, the Bill and Melinda Gates Foundation, another major philanthropic funder of biomedical research, announced it would also be launching a publishing platform based on the F1000Research model (Butler, 2017). The first Gates Open Research articles were published in November 2017. As of March 1, 2018, a DOI was available for a subset of 25 records. Since then, the number of publications has doubled: according to Crossref, there were 53 articles with registered DOIs on the Gates Open Research platform as of May 10, 2018. Regarding submitted article types, about three fifths were research articles, followed by study protocols (about one fifth), with data notes, open letters, method articles, and systematic reviews each ranging between 4% and 7%.
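Counts of this kind can be roughly cross-checked against the Crossref REST API. The sketch below is a rough check rather than the method used here: the container-title field query is fuzzy, so the total only approximates the figure reported above and matches would need manual verification.

    # Rough cross-check of registered DOIs via the Crossref REST API.
    # query.container-title is a fuzzy field query, so the total is approximate.
    import requests

    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.container-title": "Gates Open Research", "rows": 0},
    )
    resp.raise_for_status()
    total = resp.json()["message"]["total-results"]
    print(total, "works matching the container title (approximate)")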
The time from submission to publication across all publication types was about 19 days (Mdn), ranging from 10 days for method articles to 52 days for data notes. The first review typically arrived after 31.5 days, again shortest for method articles (21 days) and longest for data notes (65 days). The second review was available after another 9 days. Publications were indexed after about 41 days. Overall, these durations were slightly shorter than for submissions to the WOR platform. However, it must be noted that the dataset covers only the first 4 months of operation of the platform, and thus these findings are only indicative.
An increasingly long list of other funders, research organizations, and institutions have since followed the example of Wellcome and Gates, with F1000-powered publishing platforms announced by the HRB Ireland, the African Academy of Sciences, UCL Great Ormond Street Institute of Child Health, and the Montréal Neurological Institute and Hospital. These platforms remain in various states of development at the time of writing.
With the success of this model, in July 2017 F1000 announced Open Research Central, a centralized portal through which researchers will be able to submit work to any of these F1000-powered open research publishing platforms. This had been signaled in advance by Kiley at WOR’s announcement a year earlier, when he told Nature that “the expectation is that this, and other similar funder platforms that are expected to emerge, will ultimately combine into one central platform” (Butler, 2016). Of note here, however, is F1000’s stated intention to eventually transfer governance of this portal to the community:
While F1000 is currently maintaining Open Research Central and the publishing platforms, our longer-term plan is to transition Open Research Central to being owned and governed by the international research community with broad representation across research funding agencies, research institutions, and researchers themselves. We will assemble a governing board shortly to start this process. (F1000, 2017)
The case of the HRB Ireland gives us some indication of the behind-the-scenes workings of these deals, as it is F1000’s first agreement with a public funder. The public tender report (Office of Government Procurement Ireland, 2017) states that a sum of €400,000 was made available to “establish a single operator framework for the provision of an Open Research Publishing Service” for a total of 4 years. The tender seemed implicitly targeted toward a narrow range of possible providers by stipulating that the “platform should provide users with immediate publication followed by invited, transparent, post-publication peer review.” Only one tender application was received, and the contract was granted to F1000.
EC’s Open Research Europe
The EC in mid-2017 announced its intention to also provide such a platform for researchers funded via its framework program Horizon 2020 (Enserink, 2017). More details were given in an Information Note published in December 2017. That note made explicit that the Commission was following the example of Wellcome and Gates to raise the level of OA publications stemming from its funded research in a cost-effective manner. The note was also careful to emphasize the voluntary nature of the platform, which would be free to use for Horizon 2020 grantees. It foresaw the benefits of raising OA compliance rates, giving more flexibility to researchers, and demonstrating the EC’s position as a leader in Open Science implementation, as well as enabling competition through transparency regarding costs.
Horizon 2020 allocates almost €80 billion of funding over 7 years from 2014 to 2020 (EC, 2018c). As a public funder, the Commission faces different constraints and considerations from private funders, including more scrutiny and regulation. Also, the range of subjects covered by its funding is much larger than the more targeted approach of the Wellcome Trust, the Gates Foundation, and the HRB, which focus explicitly on the health/life sciences. Hence, for the EC to enter this space will be a huge step in legitimizing such platforms. In all, €6.4 million will be allocated for a maximum period of 4 years for the EC platform—dwarfing the €400,000 allocated for the HRB platform for the same amount of time. The Open Research Europe tender was published by the EC on March 31, 2018 (EC, 2018a).
The platform is intended for Horizon 2020 beneficiaries to publish “scientific articles” in all major fields of scholarship, including Social Sciences and Humanities (SSH). The publication model specified diverged somewhat from the other funder platforms established until that time, in that it should offer two options: (a) a standard option in which manuscripts are peer reviewed before publication and (b) a model in which manuscripts are uploaded to a preprint server in advance of peer review. Peer review would in both cases be “open peer review,” although there were no exact specifications as to what aspects (Ross-Hellauer, 2017) of open peer review should be included, nor whether the publication of reviewer names or reports should happen after publication or in real time. Both preprints and peer-reviewed articles should be licensed either Creative Commons CC0 or CC-BY “or equivalent,” and text- and data-mining should be offered “in accordance with existing practices as they evolve over time.”
The contract notice explicitly stated that the EC was looking for customization of an existing publishing solution. The tender specifications hence included a number of criteria which seemed designed to ensure that only very established providers could tender, including needing to guarantee uptime of greater than 99.999%, having experience in IT publishing infrastructure in at least three European Union (EU) countries, and having an annual turnover of more than €1 million for the last 2 financial years. Such strict terms drew criticism from innovative noncommercial providers, such as Martin Eve of the Open Library of Humanities (Eve, 2018) and Jean-Sébastien Caux of SciPost (Caux, 2018). The concern of both was that these stringent conditions would prevent an innovative and truly open but small-budget solution from competing for the platform.
The platform architecture was not required to be open source, but there was a stipulation that it should be portable (though not forkable), and that a planned handover to the Commission (or a party designated by the Commission) at the end of the 4-year period should be possible. As part of this handover, the contractor would need to provide whatever is necessary for the Commission or a designated third party to run and maintain the entire platform infrastructure and, if necessary, redeploy it in a new environment. This would imply the transfer of both the content of the system and the workflows.
Processes, policies, and operational costs (including price-per-article) should be fully transparent to the public. The €6.4 million budget was broken down into €1 million for implementing and maintaining the platform infrastructure, communications, and sustainability (preparation for handover), with the remaining budget to be used for the production of peer-reviewed articles, on a per-article cost basis (with preprints excluded from this budget calculation). The tender foresaw 5,600 peer-reviewed publications in 4 years, which would translate to an average publishing cost of €965 per article. A question mark should be raised about whether the platform will reach such levels of uptake, however. The projected 5,600 peer-reviewed publications in 4 years would represent 10% of the projected number of Horizon 2020 publications. Given the Wellcome example, where the first year saw only 2% of Wellcome publications published via WOR, this could be a difficult target to achieve.
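A quick back-of-the-envelope check of these figures, using only the numbers given in the tender and in the WOR analysis above, is sketched below.

    # Back-of-the-envelope check of the tender and uptake figures cited above.
    total_budget = 6_400_000      # EUR, total Open Research Europe budget
    platform_costs = 1_000_000    # EUR, infrastructure, communication, handover
    expected_articles = 5_600     # peer-reviewed articles foreseen over 4 years

    per_article = (total_budget - platform_costs) / expected_articles
    print(f"Implied average cost per article: EUR {per_article:.0f}")   # roughly the EUR 965 cited above

    # WOR comparison: 142 articles in the first year vs. roughly 27,000
    # Wellcome-funded publications indexed over 2013-2016 (~6,750 per year).
    wor_share = 142 / (27_000 / 4)
    print(f"WOR first-year share of Wellcome output: {wor_share:.1%}")  # ~2.1%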
Finally, the tender contained stipulations on governance and sustainability. A scientific advisory board (whose role and mandate were not made explicit) should be selected by the contractor and approved by the Commission, while the contractor would also be responsible for developing a sustainability strategy to plan for operation of the platform beyond the initial 4 years, exploring potential synergies, business scenarios, funding models, and potential additional streams of revenue.
A Review of Roles and Motivations
From funder public statements and other sources, we can discern the following purposes that funder OA platforms aim to serve: increase OA uptake, control costs of OA, lower administrative burden on researchers (including for post-grant publications), demonstrate commitment to fostering open practices, and increase funder branding of research.
The recent move of research funders toward providing their own funder-branded OA publishing platforms indicates that funders are claiming a new role in scholarly communication. This raises interesting questions regarding intentions and effects: What are the possible motivations of funders in pursuing this route? What effects will this have on the scholarly communication landscape? And will these effects match the funders’ intentions and ultimately serve the interest of the research community and society as a whole?
As stated by, for example, the Wellcome Trust, the Gates Foundation, and the EC, the primary intention of funders in providing their own publishing platforms is to make a larger proportion of research outputs which result from their funding available in OA. In principle, they can do so by stimulating researchers to use existing platforms, such as F1000Research, through a combination of OA mandates and the provision of financial support. The fact that an increasing number of funders decide to launch their own publishing platforms, so far all built on F1000Research, may have to do with costs, branding, and/or editorial control.
By commissioning publishing platforms themselves, funders exercise stronger control over the costs of OA publishing resulting from funded research. If funders are, for instance, able to negotiate a better APC rate for a branded platform, that will be advantageous to them. If they can then convince researchers to use the funder platform instead of other publication venues (e.g., those with higher APCs), these savings can be used to fund more research. Of course, F1000Research (or any other provider) will also charge for setting up and maintaining a bespoke publishing platform, so these costs must be taken into account as well. In any case, by commissioning a platform themselves, funders have control over the price of the service. Another aspect to consider here is a potentially lower administrative burden for researchers (or their institutions) and funders alike when publishing on a funder platform that does not involve a transfer of APCs. As such, a funder publishing platform can fill a gap, providing a service at a reasonable price for every funded researcher.
Another reason for funders to start their own publishing platforms could be branding. This may be as straightforward as giving funders the opportunity to display the output of their research in a central place, and to use this to increase their visibility and reputation as a funder. But branding might also make it easier for a platform to build a reputation as a valuable publication venue that authors will actually submit their publications to. For authors, three important aspects can be thought to influence their decision to publish on a platform (either new or existing): trust in the platform itself, expected reach of their publications, and the effect of the venue on the reputation of their research output and, by extension, their own reputation. Branding of a platform may help develop trust in its technical standards and guarantees for longevity, although this would of course need to be borne out by the actual functionalities and standards of the platform. Branding may also increase the visibility of the platform and by extension increase the reach of the research published on it. The increased network effects and community size surrounding the platform may convince more researchers to publish there. Regarding reputation, this is something a branded platform can influence through its editorial policies (e.g., scope and criteria for peer review and acceptance). However, the mere name attached to a platform could also influence its use and standing in the research community. As we discuss below, this could be a negative consequence: Will publications on the Wellcome or Gates platform be valued differently from publications on F1000Research itself, instead of being judged on their merits only? This might be an unintended consequence of having dedicated funder platforms instead of facilitating publication through existing, non-branded platforms.
Funder control of the publication process can take several forms. In its simplest form, as already mentioned above, a funder-specific publication platform allows funders to obtain (and display) a better overview of publications resulting from funded research, and to monitor usage and uptake of the platform more easily. A more direct form of control would arise if funders directly required research funded by them to be disseminated on the funder-specific publishing platform, either exclusively or in addition to publication elsewhere (e.g., by aggregating research outputs published elsewhere, enabled by open licenses). A similar scenario could be envisioned for preprint server platforms (partially) financed by funders (e.g., bioRxiv by CZI or OSF by the Arnold Foundation). While CZI (2018) does not require CZI-funded researchers to post their preprints on bioRxiv, the organization states in its approach to supporting scientific projects, “We strongly encourage, and in some cases, may require, researchers to deposit manuscripts as preprints before peer review.”
Whether a mandate might in future extend to the choice of platform remains to be seen. So far, all funders involved have emphasized that their publishing platforms should be seen as complementary to, not replacing, other publication venues for their authors, so these forms of control have not yet materialized. Clearly, though, these new developments can cause a shift in the balance between mandating OA, providing the platforms for such dissemination, and requiring authors to make use of these platforms.
Further steps could be envisioned in the area of editorial control. For existing funder-commissioned platforms, it could be envisioned that funders require further adaptation of the publishing model such that it better fits their needs. This would of course require (re)negotiation of the agreement with the platform provider, but in theory, such changes would be easier to implement on a bespoke version of a platform, be it F1000Research or another platform. One hypothetical example of such changes could be a decoupling of the preprint functionality and the formal publishing functionality, so that authors could post their research output as preprints on the platform, and either pursue further publication on the same platform or use other publication venues. Another example would be setting criteria on scope, type of research output, and peer review (if any).
In this sense, funders can accelerate OA through their own market interventions—not just by buying what is offered on the market but by actively encouraging the development of adapted and/or new models—and thus contribute their share to fostering and steering desirable innovation in the scholarly communication landscape.
Issues and Open Questions
Funder OA platforms, as with any top-down policy intervention, bring concomitant concerns about unintended or negative consequences. In this case, we can discern the following areas for concern:
Conflict of Interest: Potential control of the funder over the publication process (in the various ways described above) brings to light the possible conflict of interest that may be perceived when funders provide the publishing platform for the research they finance. This concern was vividly described by Kent Anderson (2016): “imagine if this were Pfizer Open Research teaming up with another commercial publisher. Would you believe that Pfizer Open Research—dedicated to Pfizer researchers—and the commercial publisher were making publication decisions in the same manner as a third-party journal run by an independent company? The motivations for Wellcome—to demonstrate value for funding, to have research outputs, and to show research throughput—may not be entirely commercial, but they are prone to the same conflicts of interest.” In our view, transparent editorial policies are imperative to address this perception: there should be a clear separation of editorial decision making from funder involvement, and all decisions regarding selection and peer review should be transparently documented to enable outside scrutiny. In addition, publication on a funder-specified platform should never be enforced but offered as a possibility alongside other publication venues meeting stated quality criteria. A separate concern may arise when funders set caps on article processing charges to be covered for publications resulting from their funding, and also provide financial support (other than meeting APC costs) to certain publication platforms, enabling them to charge APCs below these caps. This could be seen as unduly influencing market development. On the other hand, there are multiple business models for scholarly communication infrastructure, and using funder money (either grant money or more permanent financing, such as for Europe PMC) is but one of those that platforms can choose to pursue.
Scale: Another concern is that this approach may not be suitable for smaller funders, who may believe they do not have the name-brand recognition to carry such a platform or may be concerned about the costs of operation. However, there may in the future be options for funders to join up with existing platforms (this is explicitly mentioned as a possibility in the EC platform tender). If this involves only minor further adaptation, the earlier investments of funders may in turn benefit from economies of scale. On the other hand, it can be argued that such platforms, in striving to keep costs down, might de facto be limited to a model of post-publication peer review such as F1000’s. Imposing a system of expert editorial boards able to cover all the possible subjects on which H2020 researchers might want to publish, across all disciplines, not only science, technology, engineering, and mathematics (STEM) but also SSH, would greatly add to the cost of such platforms. These costs would be especially onerous in the beginning—who would find and select the boards, for example? Hence, funders embracing such platforms with the aim of fostering change could be incentivized to buy in to the post-publication model, although this model has not yet found wide-scale uptake at other publishing venues and its effects are as yet relatively little studied. This itself is an intervention, the effects of which are not yet properly understood.
Lock-in: Using private-sector infrastructure to support such platforms also brings with it an all-too-familiar concern, however: How to avoid vendor lock-in? Such concerns are particularly pressing in light of the fact that Wellcome’s Robert Kiley seems to foresee an ultimate merger of such funder platforms: “The expectation is that this, and other similar funder platforms that are expected to emerge, will ultimately combine into one central platform” (Butler, 2016). It is natural that funders might want to make use of service-ready, tested platforms, to ensure a quality product and smooth service so as to build trust. For example, the EC publishing platform tender specifically requires that the platform be built on existing technological infrastructure for scientific publishing. Hence, it is sensible that these platforms should make use of the best available technologies, whether in the private or public sector. However, such platforms should also be organized such that they do not become locked in to one specific organization for their technologies or workflows. At the very least, publishing workflows should be transparent and re-implementable on another platform. Going a step further would be requiring the use of open source software and making all data on the platform open for export and re-use. The aim must be to avoid becoming bound to any one platform or organization such that the cost of transferring to another platform/organization becomes prohibitive. Plans should be made for the migration of content should a platform prove too expensive or no longer fit for purpose, and/or to make sure that the content is not exclusively hosted on the funder OA platform.
Need to support wider OA initiatives: In addition, to support true innovation, funders should also continue to support wider initiatives in scholarly communication and seek to integrate them with their existing infrastructure on the basis of interoperability. A possible model for such support is SCOSS (scoss.org), the Global Sustainability Coalition for Open Science Services, a community-led effort to help maintain, and ultimately secure, vital infrastructure.
Branding issues: While the focus of publishing should be on the quality of the research itself, a venue also takes on its own value. There are two distinct dangers here. The first is that such funder OA platforms come to be seen as second-class venues for “the rest” of research: prestigious publications go to traditional, high Impact Factor journals, and the rest to these platforms. This may negatively affect the perceived value of the platform and its content. Part of the answer is to ensure and demonstrate (through transparency) high-quality editorial and peer review processes. However, building a reputation takes time, and users often rely on proxies, such as famous names attached to a platform, rather than on facts alone. This leads to the second concern: that, especially in the case of highly selective funders, the funder name becomes its own perceived badge of quality. This tension is visible in Robert Kiley’s explanation of the motivation for WOR. Although he makes the point that researcher assessment should be based on specific outputs, “rather than using the journal’s name as a proxy of quality,” Kiley nonetheless uses the funder brand as a potential selling point (albeit for a narrow reason): “We hope the Wellcome name and branding will encourage our researchers to publish on the platform, safe in the knowledge that their outputs will be considered in researcher assessment alongside more traditional outputs” (Kiley, 2016). The concern here must be that, for prestigious funders, the prominent branding of research as stemming from that funder could become a new erroneous proxy for the quality of the published research, much as journal brand has become a proxy for the quality of individual pieces of research. This would be harmful to the broader aim of evaluating the quality of research in itself.
Principles and Recommendations
Given their aims of increasing uptake of OA, lowering OA costs, decreasing administrative complexity, and signaling support for innovative Open Science systems, funder OA platforms are, in our view, a welcome step forward.
Based on the foregoing, we can begin to discern some guiding principles for the future development of such platforms. Assuming that the aim of funders is to create platforms for the sharing of research outputs that remain innovative, respond to the needs of scientific communities, avoid lock-in to particular providers, and enable research outputs to be assessed on their own terms rather than via proxies such as journal brand, we suggest the following. Many of these recommendations relate directly to the Principles of Open Scholarly Infrastructures proposed by Bilder, Lin, and Neylon, which can serve as a touchstone guiding decisions and developments (Bilder, Lin, & Neylon, 2015). Specific recommendations for publication venues to adhere to, and be assessed on, with regard to promoting openness in scholarly communication can also be found in the TOP Guidelines (Nosek et al., 2018) and the report “Opening Academic Publishing—Development and Application of Systematic Evaluation Criteria” (Björk, Paavola, Ropponen, Laakso, & Lahti, 2018).
Listen to stakeholders and respect diversity: Uptake by researchers requires that platforms reflect researcher needs and expectations in the present and evolve in response to emergent user needs and attitudes in the future. Unfortunately, there does not seem thus far to have been any large-scale engagement of researchers from the beginning of the planning for these platforms. Future co-evolution, however, can still be assured through concrete measures such as stakeholder governance, regular stakeholder feedback- and requirements-gathering, and active monitoring of use. In addition, such platforms should reflect genuine differences in attitudes among stakeholder groups and the need for situated openness as stressed in the OCSDnet Open Science Manifesto (OCSDnet, 2018). Statements about the need to avoid a “one-size-fits-all” approach could be dismissed as truisms or as a means of avoiding difficult decisions. Yet the reminder is crucial: scholarly communities are very diverse, not only in the methods they use but in their attitudes toward various aspects of scholarly communication. Ignoring these differences will impair uptake, particularly in those communities currently most resistant. To give two examples: (a) Martin Eve points out that the CC BY/CC0 licensing conditions for the EC’s Open Research Europe platform might harm uptake among researchers from disciplines where re-use of third-party material is common (Eve, 2018), and (b) attitudes toward open peer review remain highly variable across disciplines (Ross-Hellauer, Deppe, & Schmidt, 2017). Of course, there is a trade-off in reducing complexity: every option within a workflow increases the complexity of the process, and this complexity must be supported technologically and via support and training structures. Care should also be taken that disciplinary differences do not serve as an excuse not to pursue greater openness. Funders are pushing a new vision of scholarly communication, and of course some will be more receptive than others. Still, options tailored for different communities may allow a smoother transition and facilitate researcher uptake.
Maximize operational transparency and accountability: Given the potential for the appearance of conflicts of interest when a funder directly supports a platform for the dissemination of its research, it is imperative to build trust via openness and transparency of processes. This is obviously supported by the openness of peer review and editorial processes which such platforms have thus far employed. However, transparency should extend beyond individual editorial publishing decisions. To ensure trust in the development of the platform as a whole, higher structures of governance should also be maximally transparent: not only responsive to the community, as suggested above, but accountable to it. To ensure long-term commitment and trust, the independence of these higher structures of governance is also crucial. Broadly speaking, a wide community of experts should govern all aspects of the platform, from editorial boards to technical roll-out. This managed, consensual activity would have oversight of several important areas, including ownership of publishing-process assets (databases, code), oversight of transparent workflows between authors and editors, trustworthy terms and conditions for sharing and accessing articles, ownership of data, decisions on budget, and management of funds. Moreover, given the interest in controlling costs and in aiding understanding of the costs of publishing, transparency of revenue management should be expected. Finally, making data about publishing processes as open as possible will allow external researchers to evaluate the efficacy and value of the processes used.
Embrace interoperability: It perhaps goes without saying that, for maximum re-usability, reproducibility, and transparency, such platforms should publish all research objects (including data, software, and research protocols), with open standardized metadata to establish the links between them (see the sketch following this paragraph), and apply open licenses to maximize re-use by humans and machines. In addition, there is a question of the extent to which such platforms themselves should become interlinked, and interoperable with the wider open science landscape. We saw earlier that it is the aim of F1000 to establish Open Research Central as a central access point for funder platforms “owned and governed by the international research community with broad representation across research funding agencies, research institutions, and researchers themselves.” As many funders may lack the resources, scale, or brand awareness to commission their own platforms, collective action would also be wise. Coordination could be taken on by groups like Science Europe (https://www.scienceeurope.org/), an association of European research funding and performing organizations, or the ORFG (http://www.orfg.org/), a collective of philanthropic funders. At the same time, increased coordination also heightens concerns about control, highlighting the need for transparency in decision making and implementation.
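As an illustration of what “open standardized metadata to establish the links between them” could look like in practice, the following Python sketch builds a record loosely modeled on DataCite-style related identifiers. The DOIs, titles, and field selection are hypothetical and not drawn from any existing platform; the sketch only shows the general shape of typed, machine-readable links between an article, its dataset, and its software.

```python
# Minimal sketch of machine-readable links between research objects, loosely modeled on
# DataCite's relatedIdentifier scheme. All identifiers and titles below are illustrative
# placeholders, not real records from any funder platform.
import json

article_metadata = {
    "identifier": {"identifierType": "DOI", "identifier": "10.1234/example.article"},
    "titles": [{"title": "Example article published on a funder OA platform"}],
    "rightsList": [{"rights": "CC BY 4.0",
                    "rightsUri": "https://creativecommons.org/licenses/by/4.0/"}],
    # Explicit, typed links to the dataset and software underlying the article.
    "relatedIdentifiers": [
        {"relatedIdentifier": "10.1234/example.dataset",
         "relatedIdentifierType": "DOI", "relationType": "IsSupplementedBy"},
        {"relatedIdentifier": "10.1234/example.software",
         "relatedIdentifierType": "DOI", "relationType": "References"},
    ],
}

# Emitting the record as JSON makes the links harvestable by aggregators and mining tools.
print(json.dumps(article_metadata, indent=2))
```

Typed relations of this kind are the sort of machine-actionable links that aggregators can traverse automatically from an article to its underlying data and code, rather than relying on manual curation.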
Prefer open source: It is crucial that funder OA platforms avoid becoming bound to specific organizations, whether from the private or public sector, for technologies or workflows such that the cost of transferring to another platform/organization becomes prohibitive. At the very least, this implies portability of content and workflows, but ideally any platform should be open source to ensure that the system itself is forkable if required (Bilder et al., 2015).
Think bigger: The platforms commissioned thus far reflect the state of the art in established standards and technologies for Open Science publishing platforms. Such thinking, though, can also close the door from the start to more innovative developments. One solution could be to use such platforms, especially once established, as venues for experimentation with genuinely ground-breaking models and technologies. As suggested by Ross-Hellauer and Fecher (2017), one such approach would be to draw together ongoing efforts to find alternative models for scholarly publishing. Could we, for example, re-integrate the green and gold roads: public repositories, institutional publication models, and state-of-the-art publishing platforms? Could research funding and performing organizations, in collaboration with research infrastructure providers, pool their collective efforts into creating an innovative public publication infrastructure? Funders could also consider supporting more radical approaches to sharing the information that is used and generated in solving the scientific and societal issues they are concerned with as funders. These could include moving away from the article/paper paradigm, putting data first, integrating the information used and generated (and its review and assessment) per funded project, optimizing output for machine readability and mining, and so on. Envisioned here is a sustainable, truly interoperable Open Science commons. Many elements already exist, including for discovery (e.g., BASE, CORE), publishing (CoKo Foundation’s PubSweet, PKP’s OJS), archiving/sharing publications and preprints (OSF, OpenAIRE, arXiv), and archiving/sharing code and data (Zenodo, DRYAD). Decentralized paradigms like DAT (datproject.org) and blockchain could further bring distributed data ownership to the core of scholarly communication. The way ahead lies in linking up such efforts and coordinating them into an interoperable public infrastructure, sustainably funded by public institutions (e.g., research libraries, funders). Ultimately, this would offer researcher-centric, low-cost, innovative, and interoperable tools for research, superior to the present, largely closed system. The time for Open Science to think big is now, with the introduction of large-scale initiatives like the EU’s European Open Science Cloud (EC, 2016). There is plenty of money within the system; it needs only to be better directed to sustainably support open, interoperable infrastructure.
Acknowledgements
The authors thank Benedikt Fecher, Jon Tennant, Jeroen Bosman, and two anonymous reviewers for discussion and comments that led to the development and improvement of this article. They also thank Robert Kiley of Wellcome Trust and Ashley Farley of Gates Foundation for their willingness to share data regarding Wellcome Open Research and Gates Open Research, respectively.
Declaration of Conflicting Interests
The author(s) declared the following potential conflicts of interest with respect to the research, authorship, and/or publication of this article: T.R.H. is Editor-in-Chief of Publications (ISSN 2304-6775), an open access (OA) journal on scholarly publishing published quarterly by MDPI. T.R.H. is a senior researcher at Know-Center GmbH, Graz, Austria. The Know-Center is funded within the Austrian COMET program—Competence Centers for Excellent Technologies—under the auspices of the Austrian Federal Ministry of Transport, Innovation and Technology; the Austrian Federal Ministry of Economy, Family and Youth; and the State of Styria. COMET is managed by the Austrian Research Promotion Agency FFG. B.S. is affiliated with the OpenAIRE initiative at the University of Göttingen. OpenAIRE is an EC-funded initiative to implement and monitor Open Access and Open Science policies in Europe and beyond. OpenAIRE has contributed to a tender submission for the EC’s Open Research Europe platform. B.S. and B.K. are members of the current Horizon 2020 expert group on the Future of Scholarly Publishing and Scholarly Communication (E03463). B.K., at the time of writing the manuscript, was seconded to the Ministry of Education, Culture and Science of the Netherlands. All authors are committed advocates of Open Access and Open Science.
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was partially funded by European Commission H2020 project OpenUP (Grant agreement: 710722). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
ORCID iD
Tony Ross-Hellauer https://orcid.org/0000-0003-4470-7027
References
American Association of University Professors. (1940). 1940 statement of principles on academic freedom and tenure (Committee on Academic Freedom and Academic Tenure of the American Association of University Professors). Retrieved from https://www.aaup.org/report/-statement-principles-academic-freedom-and-tenure
Anderson, K. (2016). Wellcome money—Involvement with F1000 opens door on sketchy peer review, COIs, and spending decisions. The Scholarly Kitchen. Retrieved from https://web.archive.org/web/20180517221815/https://scholarlykitchen.sspnet.org/2016/07/20/wellcome-money-involvement-with-f1000-opens-door-on-sketchy-peer-review-cois-and-spending-decisions/
Archambault, E., Amyot, D., Deschamps, P., Nicol, A., Provencher, F., Rebout, L., & Roberge, G. (2014, October 22). Proportion of open access papers published in peer-reviewed journals at the European and world levels—1996–2013 (RTD-B6-PP-2011-2: Study to develop a set of indicators to measure open access). Science-Metrix. Retrieved from https://web.archive.org/web/20180922070456/http://science-metrix.com/sites/default/files/science-metrix/publications/d_1.8_sm_ec_dg-rtd_proportion_oa_1996-2013_v11p.pdf
Bilder, G., Lin, J., & Neylon, C. (2015). Principles for open scholarly infrastructures-V1. doi:10.6084/m9.figshare.1314859.v1
Björk, A., Paavola, J.-M., Ropponen, T., Laakso, M., & Lahti, L. (2018, January 31). Opening academic publishing: Development and application of systematic evaluation criteria. Open Science and Research Initiative. Retrieved from https://avointiede.fi/documents/10864/12232/OPENING+ACADEMIC+PUBLISHING+.pdf/a4358f81-88cf-4915-92db-88335092c992
Björk, B.-C., & Solomon, D. (2014). How research funders can finance APCs in full OA and hybrid journals. Learned Publishing, 27, 93-103. doi:10.1087/20140203
Bohannon, J. (2016). U.K. research charity will self-publish results from its grantees. Science. doi:10.1126/science.aag0636
Boselli, B., & Galindo-Rueda, F. (2016). Drivers and implications of scientific open access publishing: Findings from a pilot OECD international survey of scientific authors (OECD Science, Technology and Industry Policy Papers 33). OECD Publishing. Retrieved from https://ideas.repec.org/p/oec/stiaac/33-en.html
Bourne, P. E., Polka, J. K., Vale, R. D., & Kiley, R. (2017). Ten simple rules to consider regarding preprint submission. PLoS Computational Biology, 13(5), e1005473. doi:10.1371/journal.pcbi.1005473
Butler, D. (2016). Wellcome Trust launches open-access publishing venture. Nature News. doi:10.1038/nature.2016.20220
Butler, D. (2017). Gates Foundation announces open-access publishing venture. Nature News, 543(7647), 599. doi:10.1038/nature.2017.21700
Callaway, E. (2017). BioRxiv preprint server gets cash boost from Chan Zuckerberg Initiative. Nature News, 545(7652), 18. doi:10.1038/nature.2017.21894
Caux, J.-S. (2018). Thoughts on the call for tenders for the EC’s open research publishing platform. Jean-Sébastien Caux blog post. Retrieved from https://web.archive.org/web/20180514155902/https://jscaux.org/blog/post/2018/04/02/ectender/
Center for Open Science. (2018). Our sponsors. Retrieved from https://web.archive.org/web/20180129072118/https://cos.io/about/our-sponsors/
Chan, L., Cuplinskas, D., Eisen, M., Friend, F., Genova, Y., Guédon, J.-C., . . . Velterop, J. (2002). Budapest Open Access Initiative. Retrieved from https://web.archive.org/web/20171213093708/http://www.budapestopenaccessinitiative.org/read
Chan Zuckerberg Initiative. (2018). Chan Zuckerberg Science. Retrieved from https://web.archive.org/web/20180227194319/https://chanzuckerberg.com/science
Collins, E. (2013). Publishing priorities of biomedical research funders. BMJ Open, 3(10), e004171. doi:10.1136/bmjopen-2013-004171
Confederation of Open Access Repositories, & United Nations Educational, Scientific and Cultural Organization. (2016). Joint statement about open access by COAR and UNESCO. Retrieved from https://web.archive.org/web/20180225233452/https://www.coar-repositories.org/news-media/coar-and-unesco-joint-statement-about-open-access/
Crowfoot, A. (2017). Open access policies and Science Europe: State of play. Information Services & Use, 37, 271-274. doi:10.3233/ISU-170839
Dallmeier-Tiessen, S., Darby, R., Goerner, B., Hyppoelae, J., Igo-Kemenes, P., Kahn, D., . . . van der Stelt, W. (2011, January). Highlights from the SOAP project survey: What scientists think about open access publishing. arXiv:1101.5260 [cs]. Retrieved from http://arxiv.org/abs/1101.5260
De Castro, P. (2015). The OpenAIRE2020 FP7 post-grant open access pilot: Implementing a European-wide funding initiative for open access publishing costs. Information Services & Use, 35, 235-241. doi:10.3233/ISU-150786
Enserink, M. (2016). In dramatic statement, European leaders call for “immediate” open access to all scientific papers by 2020. Science. doi:10.1126/science.aag0577
Enserink, M. (2017). European Commission considering leap into open-access publishing. Science. doi:10.1126/science.aal0977
The Europe PMC Consortium. (2015). Europe PMC: A full-text literature database for the life sciences and platform for innovation. Nucleic Acids Research, 43(Database issue), D1042-D1048. doi:10.1093/nar/gku1061
European Commission. (2016). Realising the European open science cloud: First report and recommendations on the European open science cloud. Directorate-General for Research, European Commission. Retrieved from https://web.archive.org/save/https://ec.europa.eu/research/openscience/pdf/realising_the_european_open_science_cloud_2016.pdf
European Commission. (2017). Guidelines to the rules on open access to scientific publications and open access to research data in Horizon 2020 (Version 3.2). Retrieved from https://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf
European Commission. (2018a). Open Research Europe—The European Commission open research publishing platform, 2018/S 064-141558 [Tender document]. Retrieved from http://ted.europa.eu/TED/notice/udl?uri=TED:NOTICE:141558-2018:TEXT:EN:HTML
European Commission. (2018b). Recommendation on access to and preservation of scientific information (C [2018] 2375). Retrieved from http://ec.europa.eu/newsroom/dae/document.cfm?doc_id=51636
European Commission. (2018c). What is Horizon 2020? Retrieved from https://web.archive.org/web/20180128015939/https://ec.europa.eu/programmes/horizon2020/en/what-horizon-2020
Eve, M. P. (2018). The tender document for the European Commission’s open access platform asks for an awful lot for not very much. Retrieved from https://web.archive.org/web/20180514155642/https://www.martineve.com/2018/04/01/the-tender-document-for-the-european-commissions-open-access-platform-asks-for-an-awful-lot/
Fecher, B., Friesike, S., Peters, I., & Wagner, G. G. (2017). Rather than simply moving from “paying to read” to “paying to publish,” it’s time for a European open access platform. LSE Impact of Social Sciences. Retrieved from http://blogs.lse.ac.uk/impactofsocialsciences/2017/04/10/rather-than-simply-moving-from-paying-to-read-to-paying-to-publish-its-time-for-a-european-open-access-platform/
F1000. (2017). How it works—Open Research Central. Retrieved from https://web.archive.org/web/20170831173650/https://openresearchcentral.org/about
F1000. (2018). How it works—F1000Research. Retrieved from https://web.archive.org/web/20180102123424/https://f1000research.com/about
Ginsparg, P. (2016). Preprint Déjà Vu. The EMBO Journal, 35, 2620-2625. doi:10.15252/embj.201695531
Hall, A. R., & Bembridge, B. A. (1986). Physic and philanthropy: A history of the Wellcome Trust 1936-1986. Cambridge, UK: Cambridge University Press.
Heijne, M. A. M., & van Wezenbeek, W. J. S. M. (2018). The Dutch approach to achieving open access. Bibliothek: Forschung und Praxis, 42, 36-41. doi:10.18452/18646
Johnson, R., & Fosci, M. (2015). Putting down roots: Securing the future of open access policies. Retrieved from http://repository.jisc.ac.uk/6269/10/final-KE-Report-V5.1-20JAN2016.pdf
Johnson, R., Fosci, M., Chiarelli, A., Pinfield, S., & Jubb, M. (2017). Towards a competitive and sustainable OA market in Europe—A study of the open access market and policy environment (Deliverable 5.3 of OpenAIRE WP5, FP7 Post Grant Gold Open Access Pilot). Nottingham, UK: Research Consulting. doi:10.5281/zenodo.401029
Johnston, D. (2017). Open access policies and academic freedom: Understanding and addressing conflicts. Journal of Librarianship and Scholarly Communication, 5(1), eP2104. doi:10.7710/2162-3309.2104
Jubb, M., Goldstein, S., Amin, M., & Pinfield, S. (2015). Monitoring the transition to open access. Retrieved from http://www.researchconsulting.co.uk/monitoring-the-transition-to-open-access/
Kiley, R. (2016). Why we’re launching a new publishing platform. Wellcome Trust. Retrieved from https://web.archive.org/web/20171116115906/https://wellcome.ac.uk/news/why-were-launching-new-publishing-platform
Kiley, R. (2017). Wellcome Open Research—Publication data year 1 (Nov 2016-Nov 2017). Retrieved from https://figshare.com/articles/Wellcome_Open_Research_-_publication_data_year_1_Nov_2016-Nov_2017_/5639197/2
Kiley, R., & Terry, R. (2006). Open access to the research literature: A funder’s perspective. In Jacobs, N. (Ed.), Open access: Key strategic, technical and economic aspects. Chandos Publishing. Retrieved from http://eprints.rclis.org/7531/
Laakso, M., & Björk, B.-C. (2016). Hybrid open access—A longitudinal study. Journal of Informetrics, 10, 919-932. doi:10.1016/j.joi.2016.08.002
Luther, J. (2017). The stars are aligning for preprints. The Scholarly Kitchen. Retrieved from https://web.archive.org/web/0/https://scholarlykitchen.sspnet.org/2017/04/18/stars-aligning-preprints/
Morgan, L. (2017). Taking steps to expand access to high-quality scientific publishing. Medium. Retrieved from https://medium.com/bill-melinda-gates-foundation/taking-steps-to-expand-access-to-high-quality-scientific-publishing-6db7a6bfe9be
National Science Foundation. (1977). NSF grant policy manual. Arlington, VA: Author.
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., . . . Yarkoni, T. (2018, May 29). Transparency and Openness Promotion (TOP) guidelines. OSF Preprints. doi:10.1126/science.aab2374
OAPEN. (2018). Funders. Retrieved from http://oapen.org/content/join-funders
OCSDnet. (2018). Open Science Manifesto. Retrieved from https://ocsdnet.org/manifesto/open-science-manifesto/
Office of Government Procurement Ireland. (2017). Contract award notice information. eTenders. Retrieved from https://web.archive.org/web/20180220201040/https://irl.eu-supply.com/ctm/Supplier/PublicTenders/ViewNotice/194760
OpenAPC Dataset. (n.d.). Retrieved from https://github.com/OpenAPC/openapc-de
Open Research Funders Group. (2018). About—Open Research Funders Group. Author. Retrieved from http://www.orfg.org/about
Packer, A., Santos, S., & Meneghini, R. M. (2017). SciELO preprints on the way. SciELO in Perspective. Retrieved from https://web.archive.org/web/20170223042748/http://blog.scielo.org/en/2017/02/22/scielo-preprints-on-the-way/
Piwowar, H., Priem, J., Larivière, V., Alperin, J. P., Matthias, L., Norlander, B., . . . Haustein, S. (2018). The state of OA: A large-scale analysis of the prevalence and impact of open access articles. PeerJ, 6, e4375. doi:10.7717/peerj.4375
Pöschl, U. (2012). Multi-stage open peer review: Scientific evaluation integrating the strengths of traditional peer review with the virtues of transparency and self-regulation. Frontiers in Computational Neuroscience, 6, Article 33. doi:10.3389/fncom.2012.00033
ResearchGate. (2017). ResearchGate secures investments from Wellcome Trust, Goldman Sachs Investment Partners, and Four Rivers Group as the place where scientific progress happens. Retrieved from https://web.archive.org/web/20170805220642/https://www.researchgate.net/blog/post/researchgate-secures-investments-from-wellcome-trust-goldman-sachs-investment-partners-and-four-rivers-group-as-the-place-where-scientific-progress-happens
Romeu, C., Kohls, A., Gentil-Beccot, A., Mele, S., Vesper, M., & Mansuy, A. (2014). The SCOAP3 initiative and the open access article-processing-charge market: Global partnership and competition improve value in the dissemination of science. CERN Document Server preprint. doi:10.2314/CERN/C26P.W9DT
Ross-Hellauer, T. (2017). What is open peer review? A systematic review. F1000Research, 6, 588. doi:10.12688/f1000research.11369.2
Ross-Hellauer, T., Deppe, A., & Schmidt, B. (2017). Survey on open peer review: Attitudes and experience amongst editors, authors and reviewers. PLoS ONE, 12(12), e0189311. doi:10.1371/journal.pone.0189311
Ross-Hellauer, T., & Fecher, B. (2017). Journal flipping or a public open access infrastructure? What kind of open access future do we want? LSE Impact of Social Sciences Blog. Retrieved from http://blogs.lse.ac.uk/impactofsocialsciences/
Ruiz-Perez, S. (2017). Drivers and barriers for open access publishing: From SOAP 2010 to WOS 2016 (Doctoral thesis). Universidad de Granada, Granada, Spain. doi:10.5281/zenodo.842016
Schekman, R., Patterson, M., Watt, F., & Weigel, D. (2012). Scientific publishing: Launching eLife, Part 1. eLife, 1, e00270. doi:10.7554/eLife.00270
Schimmer, R., Geschuhn, K. K., & Vogler, A. (2015). Disrupting the subscription journals’ business model for the necessary large-scale transformation to open access. Max Planck Digital Library preprint. doi:10.17617/1.3
Schmidt, B. (2018a). WOR: Wellcome Open Research—Exploration of year one data. Retrieved from https://github.com/gitti1/WOR
Schmidt, B. (2018b). Gitti1/WOR: WOR_year1_expl. Göttingen, Germany: Zenodo. doi:10.5281/zenodo.1249402
Schultz, D. M. (2010). Rejection rates for journals publishing in the atmospheric sciences. Bulletin of the American Meteorological Society, 91, 231-244. doi:10.1175/2009BAMS2908.1
Science Europe. (2016). Open access publishing policies in Science Europe member organisations: Key results from Science Europe and Global Research Council surveys. Retrieved from https://web.archive.org/save/https://www.scienceeurope.org/wp-content/uploads/2016/10/SE_OpenAccess_SurveyReport.pdf
Solomon, D. J., Laakso, M., & Björk, B.-C. (2016). Converting scholarly journals to open access: A review of approaches and experiences. Cambridge, MA: Harvard Library. Retrieved from https://dash.harvard.edu/handle/1/27803834
United Nations Educational, Scientific and Cultural Organization. (1997). Recommendation concerning the status of higher-education teaching personnel. Retrieved from https://web.archive.org/web/20180227142802/http://portal.unesco.org/en/ev.php-URL_ID=13144&URL_DO=DO_TOPIC&URL_SECTION=201.html
Van Noorden, R. (2013). Open access: The true cost of science publishing. Nature News, 495, 426-429. doi:10.1038/495426a
Vincent-Lamarre, P., Boivin, J., Gargouri, Y., Larivière, V., & Harnad, S. (2016). Estimating open access mandate effectiveness: The MELIBEA score. Journal of the Association for Information Science and Technology, 67, 2815-2828. doi:10.1002/asi.23601
Vines, T. (2013). How rigorous is the post-publication review process at F1000 Research? The Scholarly Kitchen. Retrieved from https://scholarlykitchen.sspnet.org/2013/03/27/how-rigorous-is-the-post-publication-review-process-at-f1000-research/
Walport, M., & Kiley, R. (2006). Open access, UK PubMed Central and the Wellcome Trust. Journal of the Royal Society of Medicine, 99, 438-439. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1557892/
Wellcome Open Research. (2018). FAQs. Retrieved from https://web.archive.org/web/20180514165753/https://wellcomeopenresearch.org/faqs
Wellcome Trust. (2018). Open research. Retrieved from https://web.archive.org/web/20180928082745/https://wellcome.ac.uk/what-we-do/our-work/open-research
Xia, J. (2010). A longitudinal study of scholars’ attitudes and behaviors toward open-access journal publishing. Journal of the American Society for Information Science and Technology, 61, 615-624. doi:10.1002/asi.21283
Xia, J., Gilchrist, S. B., Smith, N. X. P., Kingery, J. A., Radecki, J. R., Wilhelm, M. L., . . . Mahn, A. J. (2012). A review of open access self-archiving mandate policies. portal: Libraries and the Academy, 12, 85-102. doi:10.1353/pla.2012.0000
Author Biographies
Tony Ross-Hellauer is a senior researcher in open science at Know-Center and Graz University of Technology. He is editor-in-chief of “Publications” (ISSN 2304-6775), an international peer-reviewed open access journal on scholarly publishing. His main research interests are open science models and infrastructures, science policy, alternative models for peer review, and philosophy of technology.
Birgit Schmidt coordinates international and national Open Science activities and projects and leads the unit Knowledge Commons at Göttingen State and University Library. Her activities focus on policies, e-infrastructures and training in support of the implementation of Open Access and Open Science. She co-chairs working groups on research data management (Association of European Research Libraries, Research Data Alliance) and contributes to several international committees, e.g. the European Commission’s Horizon 2020 expert group on the Future of Scholarly Publishing and Scholarly Communication, Knowledge Exchange’s Open Access Experts Group and formerly the Belmont Forum’s working group on Open Data. Previously, she acted as scientific manager of the European OpenAIRE project and as executive director of the Confederation of Open Access Repositories (COAR). She has a background in Mathematics and Philosophy, and a postgraduate degree in Library and Information Science.
Bianca Kramer (@MsPhelps) is a librarian for life sciences and medicine at Utrecht Library, with a strong focus on scholarly communication and Open Science. Through her work on the project ‘101 innovations in scholarly communication’ (including a worldwide survey of >20,000 researchers) she is investigating trends in innovations and tool usage across the research cycle. She regularly leads workshops on various aspects of scholarly communication and the openness aspects thereof for researchers, students and other stakeholders in scholarly communication. She is a member of the steering committee of the FORCE11 Scholarly Commons Working Group and the executive board of FORCE11, as well as a member of the European Commission’s Expert group on the Future of Scholarly Publishing and Scholarly Communication.