
On Sept. 18, the National Academies study committee examining how to move toward open science convened a symposium to gather input from publishers, data repositories, and federal science agencies.
At the symposium, representatives from several publishing companies emphasized their ongoing commitment to expanding open access to published research.
Holly Falk-Krzesinski, vice president of global strategic networks at Elsevier, discussed the differences between the open access models used in scientific publishing. In “gold” open access journals, all content is available for free online immediately after publication, and authors or the funders of their research must pay an article processing charge (APC). “Green” open access, sometimes called “delayed gold,” allows articles to be released publicly in their final form, but only after a specified embargo period. Journals may also permit authors to make articles freely available on preprint servers prior to publication, or as manuscript versions on the author’s website or in institutional archives. Michael Forster, managing director of IEEE Publications, described how some non-open access IEEE journals also give authors a “hybrid” option of paying an APC to make individual articles immediately available.
Several speakers expressed concern that APCs could become a significant financial burden on researchers and institutions. Ivy Anderson, director of collection development and management for the California Digital Library, and Tyler Walters, dean of university libraries at Virginia Tech, suggested that universities could designate or contribute funds to help cover APCs. Jennifer Hansen, senior officer at the Bill & Melinda Gates Foundation, noted that her organization covers all open access charges and requires articles from the studies it funds to be open access immediately upon publication.
Federal agencies typically require publications resulting from research grants they sponsor to be made open access after a specified embargo period. Howard Ratner, executive director of the Clearinghouse for the Open Research of the United States (CHORUS), explained that his organization works with the agencies to facilitate open access to publications within the existing infrastructure for scholarly communication. Ratner noted that CHORUS has developed its partnerships agency by agency and has aimed to ease the administrative burden of researcher compliance with open access requirements.
Joerg Heber, editor-in-chief of the Public Library of Science (PLOS) ONE, and John Inglis, executive director of Cold Spring Harbor Laboratory Press and co-founder of bioRxiv, highlighted the importance of preprint servers as a way for researchers to share and evaluate data and findings while the lengthy peer review and publishing process is underway. However, Inglis indicated that there is still some anxiety in the research community over preprint results being “scooped” by other researchers without attribution.
Inglis also said that publishers are becoming more accepting of preprint servers, observing that the tool is “gaining momentum” across scientific disciplines. The American Geophysical Union (AGU) recently announced plans to launch a preprint server for the Earth and space sciences.
Several publishers emphasized the importance of adopting community guidelines that would require or encourage researchers to make the data underlying publications available whenever possible. While publishers make exceptions to open data requirements for data related to national security or patient privacy, for example, some have been hesitant to adopt overarching data policies. Both Falk-Krzesinski and Forster expressed concern about any potential implementation of “one-size-fits-all solutions.”
Kenton McHenry, technical coordinator of the National Data Service Consortium, and Daniel Goroff, a vice president at the Alfred P. Sloan Foundation, indicated that uncertainty around data security is a serious concern that erodes trust in repositories. Goroff suggested that repositories could address the issue by limiting access to data and creating a network to share standards and best practices for generating data.
Another significant challenge highlighted in many presentations was the need for common guidelines and adherence to the “FAIR principles” — standards for making data findable, accessible, interoperable, and reusable. Shelly Stall, assistant director of AGU’s data management assessment program, pointed to a recent survey that identified the lack of data standards and exchange standards as one of the top challenges researchers face in using data.
A 2017 Belmont Forum survey shows that “lack of data standards and exchange standards” is one of the top researcher challenges with data use.
(Image credit - Belmont Forum Skills Gap Analysis)
Stall explained that publishers and data repositories are working to meet the FAIR principles through actions such as adopting the Center for Open Science’s Transparency and Openness Promotion (TOP) Guidelines.
Kerstin Lehnert, director of the Interdisciplinary Earth Data Alliance at Columbia University, stressed the importance of creating domain- or discipline-specific data facilities, as they tend to be more trusted by researchers. She said that budgets supporting repositories compete with core science budgets, even as repositories must continually update their technologies to meet evolving requirements. She emphasized that the long-term sustainability of these repositories must be addressed.
Lehnert also emphasized that partnerships among data facilities and their users “are essential to make protocols and policies more effective and the landscape manageable for all stakeholders.” As examples, Lehnert and Stall highlighted the Coalition for Publishing Data in the Earth and Space Sciences (COPDESS) and the National Science Foundation’s EarthCube as two successful collaborations between data facilities and publishers in the Earth and space sciences.
Representatives from federal science agencies, including the U.S. Geological Survey and the National Institutes of Health, explained that their data management plans already address the need for data repositories that are trusted and comply with open access standards. Agencies developed these plans in response to a 2013 White House Office of Science and Technology Policy memorandum directing them to expand public access to the results of federally funded research.
There is no certainty as to which points discussed at the meeting will be reflected in the committee’s final report. However, it is apparent that many stakeholders are eager to stave off “one-size-fits-all solutions” for moving the research enterprise toward open science. The committee will continue its discussions later this year.