APE 2021: The New Face of Trust – A Virtual Conference on Academic Publishing in Europe

DAY ONE: Tuesday, 12 January 2021

Prof. Dr. Christoph Markschies (President of the Berlin-Brandenburg Academy of Sciences and Humanities, Berlin) opened the 16th APE. He stressed he would have preferred to meet face to face and regretted this was not possible. He said that trust acts as the foundation on which the publisher-author relationship is built and that the academic field is a network of relationships connected by trust; he stressed that trust is mutual. He then took us back to the year 1944. June 29 used to be an annual day of celebration for the Academy. That year, of course, was different. Nevertheless, special academic achievements were still honored, and the golden Leibniz Medal was awarded to Leopold Glotz. Prof. Markschies said that even during the Nazi regime, this showed that academic independence should always be safeguarded, and that academic publishing can preserve the freedom of academic endeavor in difficult times.

The two Keynotes were introduced by Dr. Georg W. Botz (Coordination Open Access Policy of the Max Planck Society, Munich).

Prof. Dr. Dorothea Wagner (Chairwoman of the German Council of Science and Humanities, Cologne) started her keynote presentation, Open and Autonomous: The Basis for Trust in Science, by saying that it is too early to assess the historic dimension of COVID-19 and that future historians will distinguish between before and after COVID-19. She said the pandemic has revealed the role of science in society; it has shown that scientific research is essential for decision making. The pandemic laid bare the weaknesses and strengths of the academic system and the sciences, and we should learn from that.

She posed the question: how can the science system be made more resilient in times of crisis, and how can science contribute to the resilience of society? These questions are connected and form a starting point for thinking about the future role of science in society.

She said that it was encouraging to see that trust in science persisted during 2020, although there were differences around the globe; the level of trust seemed to be correlated with the level of infections. Lack of trust seemed to stem not from the fear that scientists were incompetent but from the fear that they were not acting independently. She said trust in science is connected with scientific publishing: the collective knowledge production system. Publications are the primary medium through which science influences society (e.g. through journalists). Therefore, the published record of science must always be reliable and trustworthy.

Trust does not only depend on access to publications but also on the transparency of the publishing process. It should be an effective system whereby transparency enables quality assurance. However, transparency alone is not sufficient: politicians and the general public should also know how science works. Trust depends not only on understanding the system; it also depends on the motives of the people working in the system and on interpersonal relationships. She added: trust is the expectation that the other will do you no harm.

We should also look at the motives within the scientific publishing system: the danger of predatory publishers, publishing as an end in itself, a focus on mere quantity, profit-maximizing behavior and taking advantage of scientists, as well as the question of trust with regard to plagiarism. We should not tolerate this; it undermines what we consider good scientific practice. She said that the transformation to Open Science is an opportunity to change the discourse about science: to create new knowledge and make this knowledge available to the public.

Lauren Kane (President, Society for Scholarly Publishing (SSP) and Chief Strategy Officer, Morressier, Washington, DC) started her Keynote, Reinvention or Return to ‘normal’? Scholarly Communications at a Crossroads, with the question whether the events of 2020 would catapult us forward with innovation or send us backwards seeking the security of the known. She said that before 2020, there was no shortage of new ideas and business models to better disseminate knowledge; however, change could be incredibly slow. Everybody agreed that Open Access (OA) was of widespread benefit, but there was still a lot of debate on how it should be achieved. Should OA be mandated, as under Plan S? More DEAL agreements? The awareness that OA is not synonymous with equitable access had also grown, and global inclusiveness remains an important issue. Collaboration, e.g. with funders and across disciplines, is important too.

Kane noted a general concern that 2020 would be a lost year, but said the community refused to treat it as one. Many societies and organizations used this period to push things forward. According to Kane, this was somewhat surprising for the publishing industry, which had usually been an industry of conservatism and slow change. But 2020 saw swift action from various organizations to ensure that the show would go on and collaboration would continue. In 2020, outputs exploded across STM, and the community also used this as a time to experiment, e.g. with virtual events and innovations in the Peer Review process. Kane added that perhaps the biggest impact was that the pandemic put science in the spotlight. As a result, there is a better understanding of the critical role that societies, funders and publishers play in supporting this research output.

Kane did not think this time will take us backwards. There are challenges, but she believed it will provide a roadmap for creating opportunities and collaborations for a more inclusive research process. Broadening the research workflow will speed up scientific advancement. Kane concluded that she still felt very optimistic that we will end up with a healthier, stronger and better industry.

Frank Vrancken Peeters (CEO, Springer Nature, Berlin) explained in his Keynote, Opening Doors to Discovery: How Partnerships are Key to advancing Open Science, why Springer Nature believes that Open Science (OS) is key to accelerating knowledge and the way forward for research. Vrancken Peeters stated that sharing protocols, communicating positive and negative results and making underlying data available and reusable bring many visible benefits to society. He added that there is still a long way to go to full OS and that the transition is moving slowly. He said this is not just caused by publishers, but by researchers and funders as well.

The pandemic has shown that the transition can be faster: preprints of COVID-19 research are published three times faster than other preprints. Springer Nature made relevant research on COVID-19 freely available and collaborated on this effort with other stakeholders. In addition, many teaching resources were made freely available to help a quick transition to remote teaching and learning. He said this demonstrated the very responsive nature of the publishing sector, and that this speed showed what can be achieved if we work together in partnerships to move to OS; we need to be open and experiment with researchers and with new players in the market. He added that Open Access is the key building block of OS.

Vrancken Peeters is convinced that gold OA is the preferred way forward (over green OA), even though a full transition may be difficult; the alternative green route takes away incentives for gold. Publishers should collaborate with researchers and funders to publish gold OA. Researchers should be able to continue publishing in the known and trusted journals that disseminate their content best. Under transformative agreements, the number of OA articles in the Humanities and Social Sciences more than tripled. He said that transformative agreements are great examples of partnerships, although there is no blueprint for such agreements and sometimes it is necessary to take risks; you have to build trust with the funders, as well as the research community, first. The key benefit of transformative agreements is that they enable all researchers to publish OA irrespective of background. Content that is published OA immediately benefits non-academic readers as well.

Publishing is only a small component of total research spending, and the benefits of OS are huge compared with the costs of publishing. Vrancken Peeters concluded that there are still a lot of challenges to address, but that it is an exciting opportunity for publishers to play a bigger role in the research process and increase the benefits for science.

The panel discussion Restoring Trust in Published Research was introduced by Prof. Dr. Ulrich Dirnagl, (Quest - BIH Center for Transforming Biomedical Research, Berlin Institute for Health Research, Charité, Berlin, also representing the Berlin University Alliance (BUA)). He stated there is a pressing need to restore trust in published research.

Prof. Dr. Malcolm MacLeod (Professor of Neurology and Translational Neuroscience, Centre for Clinical Brain Sciences, Edinburgh) started his Keynote, Restoring Trust in Published Research, with an example demonstrating the risk of bias in preclinical research. He continued with the impact of the NPG checklist, which can be used to check whether a study used randomization, blinding and sample size calculations and reported exclusions. He said that a study in collaboration with Nature showed improvement for each of these criteria over time, but that the improvement took a lot of input and effort from Nature’s editorial team.

MacLeod continued on compliance with the ARRIVE guidelines, as there are issues hampering full compliance. Therefore, new guidelines (the new ARRIVE) were introduced last year to make the process easier and simpler. He said that restoring trust in science has traditionally focused on researchers’ integrity and getting rid of plagiarism. But since researchers are very different, restoring trust in science should be system-focused instead of individual-focused. He shared a list in which universities are ranked by their level of improvement instead of their actual performance, so the focus is on the university’s research improvement strategy. MacLeod believed that in this way biomedical laboratory research can continue to improve what is considered best scientific practice.

Dr. Nathalie Percie du Sert (Head of Experimental Design and Reporting, National Centre for the 3Rs, London) said in her talk Author Checklists (ARRIVE, CONSORT, et al.): Quality Assurance or Box-Ticking Exercises? that the mere presence of checklists does not provide quality assurance. Some journals request authors to complete checklists, but there are still discrepancies between the checklist and the submitted manuscript. To ensure full compliance, checklists should be kept short, with prioritized key items, and be checked carefully by reviewers. It is also important to make sure the checklists are understood properly, so examples should be included. Dr. Percie du Sert stressed that to enhance acceptability and understanding of author checklists, it should be kept in mind that the checklist is just one tool to improve quality, part of a larger toolbox.

The question Does Peer Review live up to its Promise? was answered by two opposing panelists. Dr. Remco Heesen (The University of Western Australia, Perth) did not think so. He said that the promise is twofold: firstly, it is a promise to authors that Peer Review offers a source for improving their manuscript; secondly, it is a promise to readers of quality assurance, with Peer Review acting as a selection mechanism. Dr. Heesen stated that Peer Review fails to live up to the second promise for several reasons. There is little evidence to support the claim that high-profile journals publish better-quality manuscripts, and no proof of sound methodologies. Also, reviewers tend to disagree with each other a lot, which undermines the claim of selecting the highest-quality research. Instead, traditional Peer Review seems to prevent rapid dissemination of knowledge. So instead of traditional Peer Review, he would favor post-publication Peer Review because it is more transparent.

Dr. Deborah J. Sweet (VP of Editorial, Cell Press, Cambridge, MA) explained why she thought Peer Review does live up to its promise. In general, Peer Review does make the manuscript better, builds trust and helps to filter the sheer amount of research that is produced around the world. She acknowledged that Peer Review can be a frustrating and delaying process, but surveys have shown that papers usually do get better. A 2019 study of researchers’ satisfaction with current Peer Review systems showed that satisfaction had gone up compared to a similar study done in 2009, and trust in research depended on whether it was peer reviewed or not. She admitted there are flaws in the Peer Review system, but the alternatives - professional peer reviewers? no Peer Review at all? an algorithm? - all have their flaws as well, so the community would probably come back to Peer Review anyway.

Anne Scheel, M.Sc. (University of Eindhoven) shared her opinion on Preprints and the Crowdsourcing of Peer Review from the Early Career Researcher’s perspective. She criticized traditional pre-publication Peer Review as a closed-off system. The development of having preprints accessible to everyone offers opportunities for a shift towards post-publication Peer Review. This has advantages: anyone can comment on these preprints, and it is detached from the publication process in a specific journal. Scheel stated that it is misplaced trust if we only go by the label that a manuscript has been peer reviewed; there is a vast amount of peer-reviewed research that is not trustworthy. Peer reviewers are under a lot of time pressure and do not have many incentives, and Peer Review is not transparent because the reviews are not made public. Recognizing that merely having something published does not mean it is automatically of high quality is important. With post-publication Peer Review, anyone can criticize, which Scheel thought to be a huge advantage. Initiatives such as the Peer Community In platform and a new eLife model whereby only preprints are reviewed will improve the quality of published research.

Dr. Stavroula Kousta (Chief Editor, Nature Human Behaviour, Nature Research, London) shared her insights on Preregistration: A Silver Bullet to increase Research Quality? She stated that preregistration in itself has no value and can even be harmful if it is just about ticking boxes for a funder, for example. Preregistration can help to improve research quality when authors adhere to the preregistration protocol; otherwise, it is meaningless. She added that journals can play a key role in ensuring adherence to protocols. The protocols should incorporate an examination of whether authors actually did what they said they would do and whether they are clear about what they did not do or did differently, and they should include methodology and analysis. There could be a shift in the research process towards registered report formats, with the emphasis moving to data collection and registered research methodology. This rigorous research will improve research quality.

Anton Bespalov (Founder, Partnership for Assessment and Accreditation of Scientific Practice (PAASP), Heidelberg) explained why Safeguarding Research Quality before the Horse is out of the Barn, that is, before a paper is submitted, will lead to good research practices. The current system creates a large burden for scientists and is too much about just ticking boxes to get results published. Why talk about a burden? Scientists are asked to do things for which they are not trained and which they do not fully understand. They are worried that if they do not tick all the checkboxes it may have a negative effect on the assessment of the manuscript. So the checklist only acts as a gate to publication, and certain criteria put strong pressure on scientists to find workarounds and cut corners. To prevent these practices, clear guidelines and quality practices should already be in place and introduced before the research is conducted. Looking at research quality before the results are obtained and prior to publication will lead to good research practices and good publications.

The subsequent panel discussion was moderated by Prof. MacLeod and focused on the pros and cons of Peer Review. Sweet stressed that the current system is not perfect but it is better than nothing; papers really do improve with Peer Review. Kousta added that choosing between pre-publication and post-publication Peer Review should not be necessary; the two systems could coexist. Percie du Sert stressed that the transparency of this process should be improved. Kousta added that standardization is extremely important too, and Sweet agreed that developing automated systems would be useful and, if done with other stakeholders, could be implemented before the publication process. Scheel pointed out that Early Career Researchers can struggle with peer review processes. Disentangling the different goals of the review process would be helpful.

The panel dove a bit deeper into alternative models of Peer Review and ways to create more connections between prepublication review and ‘official’ Peer Review. Heesen said that a journal can be run on different models: with traditional Peer Review, with algorithms, or in a broader ecosystem. The panel agreed that the community cannot stay static with the Peer Review system, that alternative models should be explored, and that issues like finding enough reviewers and the sheer number of submitted papers should be tackled. Scheel added that the current system is overloaded and, with the hierarchy in journals, some papers are reviewed at multiple journals, which is inefficient. The panel agreed that such duplication of effort should be avoided.

The panel discussion OA: Creating a Level Playing Field for the Global South was moderated by Anne Kitson (SVP Cell Press and The Lancet, Elsevier, London).

Andrea Powell (Outreach Director and Publisher Coordinator, Research4Life, STM) stated that for 20 years now, Research4Life (R4L) has worked to reduce the knowledge gap between the Global North and South, with currently 160 publishers, 5 UN agencies and registered institutes participating. She explained that equitable access is not a development objective in itself; it is a catalyst for development, underpinning the UN Sustainable Development Goals (SDGs). True equity is not just about access; it is about participation in the whole research process. There is a risk that OA will increase inequity: without funds, the exclusion barrier shifts from access to content to access to publishing opportunities. The R4L board discussed this problem and researched the uptake of OA in R4L countries, which resulted in a white paper.

Key takeaways from the white paper: over the last decade there has been a steady increase in output from authors in the Global South (GS), the transition to OA is not happening rapidly in the GS, and most OA publications there are gold OA. Researchers are not aware of the existence of OA waivers. Powell added that there are of course differences between LMICs, depending on funding mechanisms, national infrastructures and other factors. Generally, there is still a preference to publish in internationally recognized journals, one reason being that many national journals are not indexed. Waiver policies cause confusion and should be communicated better.

Recommendations are to create best practice guidelines, to have more inclusive Editorial Boards and peer reviewer networks, and to create more capacity building resources. Powell concluded by stating that it is evident that a business model built around subscriptions is unsustainable in the long run. Diversifying the publishing system needs further dialogue with funders and ongoing commitment of all R4L partners.

Dr. Haseeb Md. Irfanullah (Independent Consultant - Environment, Climate Change, & Research System, Bangladesh) shared four key issues on OA publishing in the Global South. Firstly, every country has its own story. He said that in Bangladesh one of the problems in the research system is that researchers are only recruited and promoted when they publish in indexed journals. Dr. Irfanullah suggested these systemic barriers could be overcome if funders did not differentiate on whether a journal is indexed, gold or green OA, local or international. The second issue followed on from the R4L white paper recommendation that research should be linked to the SDGs. He said that this is already changing how researchers from the GS are conducting research. Since 2019 a new ranking system, the Impact Ranking, has been in place; it evaluates how a university contributes to the SDGs, not only through research but through teaching, outreach and stewardship as well. Many universities are joining, and this makes researchers more aware of OA publishing issues. The third issue is related to the costs of OA publication: Dr. Irfanullah pointed out that APCs can be a multiple of a researcher’s salary and that funding to publish is not available. The fourth issue is to continue capacity building with researchers, authors and editors. He suggested taking this further and building the capacity of local, smaller society journals as well, with the contribution of global networks like R4L and INASP.

Dr. Uduak Okomo (Clinical Assistant Professor in The Gambia, London School of Hygiene and Tropical Medicine) started by saying that all researchers would like to have their work published and read by their peers on a global scale. As the R4L white paper has shown, many researchers prefer to publish in international journals. But in The Gambia, it is also important to publish locally, to make content more readily accessible in the country. There are many funding issues: there is no government funding for publishing. High Article Processing Charges (APCs) are a problem too; they prevent researchers from investing money in doing research. Money should be invested in research, not in publishing just one paper.

Dr. Okomo continued that it is not just an issue of high APCs or limited funding; it is about the entire research process and communication. Reducing APCs will not be a solution. It should be more about collaboration: instead of just the “Global North’s prestigious journals” providing a high-quality context, journals from the GS should be indexed too. It is about working together and ensuring that all journals provide a high-quality context, to remove such a strong division. This division has given predatory journals the opportunity to aggressively market their services in the GS, and researchers do not have time to check these journals carefully. Dr. Okomo concluded by saying that the current publishing system is a vicious circle: when researchers from the GS are not published in international journals, they are consequently not invited to sit on international Editorial Boards.

Prof. Yap Boum (Regional Africa Representative, Epicentre (MSF), Cameroon) shared some insights into the inequality in authorship between GS and GN researchers. This is not only because of high APCs but due to other imbalances as well. He said that of all research done in Africa, only 30% of first authors are African, and that when researchers collaborate with researchers from top universities, it is less likely that the African author is the first or last author. The same applies to grants. To move to more equitable partnerships, better communication is extremely important. With regard to global health issues, more inclusive, truly global research should be done, and multilingual journals could support more diversity. Additionally, there should be funding available for manuscript writing, because it is very challenging for an African researcher to get published. The problem is twofold: it is not only difficult to publish, it is also a challenge to gain recognition once research is published. Prof. Boum said he felt that this issue is also related to the conference theme: to bring trust and better communication between the GS and GN. He recommended strengthening partnerships with universities and building willingness and know-how by extending workshops and outreach. He added that the relationships built during the Elsevier Foundation’s workshops proved their importance and effect afterwards. Powell added that continuing the engagement and sustaining relationships is key.

The panel discussion started around the problems of predatory journals. Dr. Okomo stated that it is a very big problem; researchers receive 3–4 e-mails a day. More awareness should be raised around this issue, and researchers should always check carefully whether a journal is on a list of predatory journals. The discussion continued on how publishers can help create a more inclusive publishing field. As funding issues differ from country to country, it is very difficult to give a straightforward answer. Dr. Okomo stated that creating an inclusive body of publishers to discuss challenges in the GS, whereby editors from the GS meet up with their counterparts in the GN, would be helpful. Powell added that innovative business models for the GS should be explored too, and that the models should take into account the differences between countries; publishers should work collectively on that. Twinning GS and GN journals may not be that easy; there is the risk that the more visible journals are chosen, so it might not be very inclusive. Dr. Irfanullah added that finding champions within countries is helpful; they can influence the system. Creating incentives for young researchers is important too. When recruiting Editorial Boards and reviewers, publishers should also stimulate diversity; Powell added that a target number for Editorial Board membership could be set. The publishing industry should think about the long-term sustainability of gold OA, as waiving APCs is not sustainable.

Powell added that R4L is not just about access but is also increasingly focused on authorship. With the SDG Publishers Compact, more awareness has been raised, and the publishing community is heading in the right direction.

The session: OA and the Value of Selectivity was moderated by Liz Ferguson (VP, Open Research, Wiley, Oxford).

Dr. Shina Caroline Lynn Kamerlin (Professor of Structural Biology, Uppsala University) started her presentation Publishing in OA Journals and Researcher Choice by sharing her personal experiences with OA and OS in general. Over the past decade she has seen a big evolution in the field, with different funders imposing different OA mandates. According to Kamerlin, researchers do not choose a journal because of a high Impact Factor; the usual selection criteria are what kind of researchers read the journal, the impact on the community, and rigorous peer review. She added that selective journals have high standards for Peer Review, which is important, also for researchers moving into another research field.

She said that many researchers have limited knowledge of OA or of the implications of publishing preprints. They have not heard of Plan S, for example, and do not have a thorough understanding of the different licenses. The Taylor and Francis Researcher Survey 2019 showed that the least favorite license was CC BY; CC BY-NC-ND was preferred.

Kamerlin and colleagues responded to Plan S via an open letter. Researchers should have the freedom to choose a publication venue, and ideally a diverse publication landscape is preferred. Publish-and-Read (PAR) agreements seem helpful for researchers; however, they do create inequity between researchers, for example between those with access to big publishing budgets and those without. The costs should not become a barrier to publication: paywalls to read should not become paywalls to publish. She emphasized the role preprints can have in a diversified publishing landscape. Publishing preprints is great for researchers, especially when the journal allows preprint updates; however, this is financially challenging for journals, as there is currently no sustainable financial model. Looking forward, the need for a diverse publishing landscape with sustainable funding models that respect both researcher and journal needs is evident.

In his talk Dimensions of Selectivity and how they add Value, Dr. Bernd Pulverer (Chief Editor of The EMBO Journal and Head of Scientific Publications, EMBO, Heidelberg) shared his insights from an editorial viewpoint: the EMBO journals select on scope, scientific quality, interest, advancement and discovery. The dimensions of selectivity are administered by different levels (editors/referees) and the focus is on factors such as depth and breadth of the dataset, claims supported by the data and more. EMBO uses a very transparent process.

Dr. Pulverer continued on the different steps in the editorial process and why selectivity is important: it provides certification of science, highlights important scientific discoveries, and inspires across disciplines. EMBO feels it is an obligation of selective journals to optimize the efficacy of selective publishing. They try to address serial submissions through inter-journal/inter-manuscript transfers and refereed preprints (which allow authors to post the referee reports with the preprint version). This concept was extended to pre-journal peer review (Review Commons). Reproducibility is a big issue in biomedical research, and a lot of money is wasted because research cannot be reproduced.

Because of this selectivity and added value, the costs per article in the biomedical sciences range between 1,000 and 10,000 Euro. The costs cover editorial processes, peer review, quality control, curation, editing services and publicity, but also investment in publishing and OS infrastructures. Pulverer said there is a need to find separate payment mechanisms for journals and for specific services, e.g. peer review services. Full OA is only possible when there is fair access to journals for authors and readers irrespective of field, geography, seniority and funding support. Certain papers in the journals are not equally well supported by OA, e.g. the commentary and journalistic sections versus the research papers and reviews.

He closed by saying that short reviews and commentaries should be included in PAR deals and submission fees should be considered in funder mandates so the industry could move to an OA system that spreads costs fairly.

Alison Mudditt (CEO, Public Library of Science (PLOS), San Francisco, CA) continued on Price Transparency and the Cost of Selectivity. She said that the community should look at the criteria that are used to select content and then redefine selectivity. There is a lack of trust between publishers and other stakeholders. At PLOS, transparency has always been a core value; the relationship between price and value should be fair. The conversation about this should be more open, with all stakeholders involved.

PLOS launched the Community Action Publishing (CAP) program, and price transparency is an essential part of it. The actual costs are 50% higher than the charged APCs, but raising APCs is not an option, as PLOS is committed to an equitable ecosystem. PLOS and libraries are partnering in this community-led model to make selectivity work without charging high APCs. The goal is to ensure that costs are balanced across stakeholders and communities. With CAP, APCs shift to institutional flat fees that distribute the costs of publishing across institutions based on their publishing output; this will minimize barriers to publishing and make the costs transparent.

Mudditt continued with the question of whether enough revenue could be generated through this partnership model. Are we able to move away from APCs? Over time, the model will evolve and PLOS can develop a new model for OA. Transparency is a fundamental requirement for this new model to focus on shared value. Mudditt concluded by saying that there is money in the system. The community should focus on how to make selectivity more affordable and ensure the maximum amount of money is spent on research. It is not about maximizing profit; it is about building an equitable, sustainable OA system, and transparency has to be the cornerstone for rebuilding trust.

Dr. James Butcher (VP Journals, Nature Portfolio and BMC, Springer Nature, London) talked about the Crisis in Communication: the Functions and Future of selective Journals. In 1965 Sir Theodore “Robbie” Fox published a book called ‘Crisis in Communication’, which outlined the functions and future of medical journals. Over 50 years later this book still has lessons for scholarly publishing. According to Sir Robbie there were two types of journals: recorder journals (primarily for authors) and newspaper journals (primarily for readers, the consumers). Butcher said that the Nature journals can in many ways be seen as ‘newspaper journals’.

Dr. Butcher continued on the functions of Nature journals. There is the filtering element: about 8% of submitted papers are accepted. Enhancement is important too, through value added during the peer review process and value added by editors along the way. There are a lot of different editors and vital support functions, not only for specific journals but also more general ones such as legal, HR and web development. Nature journals also play an important role in amplification. Butcher mentioned the new journal Nature Aging, to be launched shortly: it does not formally exist yet, but its online-first papers have already been accessed 60,000 times because of the work done by journalists.

Looking at the future of publishing OA in Nature journals, there are three routes: firstly, transformative agreements; secondly, author-chosen OA. The APC for both paths is 9,500 Euro, and these costs are explained by the high quality standards and the number of editors involved. The third route is the guided OA pilot. The essence is that authors submit to the Nature journal portfolio and work with the editorial team to come up with the best possible publication scenario; the idea is that researchers can trust the Nature Portfolio editors to identify the best home for their paper. When the best possible publishing option is selected and the author takes up the editor’s offer, the APC is 4,790 Euro, around 50% of the list price. Butcher added that this does not lower the Nature standards; papers will still be rejected when they are not scientifically sound.

The panel discussion started with a question about the panelists’ experience of collaborating with institutes and other stakeholders to further develop and support highly selective journals. Mudditt stated that working with libraries and other consortia was very important in the development of the new PLOS CAP model. Transparency on costs and fee structures was important to build understanding and trust. After constructing policies, PLOS got participant feedback and could adapt when necessary.

Butcher added that when Springer Nature developed the guided OA model in collaboration with both researchers and funders, they got feedback on the development process and on whether they were moving in the right direction. Funders embraced this concept of thinking differently, and it was also a good process for the editorial team to think in different ways. Pulverer added that this way of looking from the ‘consumer’s’ point of view is an important new development, without losing sight of quality. The development of Review Commons was also an iterative process of feedback from the community. It is a positive development to have informal discussions and translate these into working groups that set standards across publishers.

Kamerlin stressed that highly selective journals are not just the big, high-impact journals; smaller society journals can be selective too. One of the challenges is that research is becoming much more interdisciplinary; reviewers cannot keep up with developments in all fields when they step outside their own, so rigorous Peer Review is important as a trust metric. Pulverer added that publishers have to do a better job in reducing wastage and in quality control with regard to reproducibility. Checking tools are essential; it should be a combination of ethics, quality control and peer review altogether. It is an obligation for selective journals to capture high-quality research papers and select on quality, and steps to increase efficacy should be taken. Butcher agreed that common standards are crucial, and Mudditt added that the role of selective journals is also to look at possibilities to bring Peer Review in at the front end, at the stage of preregistration. This is beneficial for researchers as well, and it takes the pressure off the current Peer Review system.

David Crotty (Editorial Director, Journals Policy, Oxford University Press, New York) started the session: Beyond the Paper, the Data, and then a bit further – Capturing more of the Research Workflow, by saying that even though the research paper is still an efficient summary of a research project, the community could do better and capture more of the results to increase transparency and reliability. In the print era it was not possible to make additional materials available. How to maximize the value from the entire research workflow? A simple solution would be to get all results out there but it is not that easy. Cultural issues, like the current career path system built on competition and funding systems, whereby researchers only show their most successful work, prevent this. Researchers should be given rewards for not embracing secrecy. Crotty said not to underestimate the need for training to make this cultural shift. A combination of strategies is needed. He proposed to start with parts of the research workflow that will quickly provide value instead of opening up the complete research cycle. The Open Data movement is a good starting point, providing valuable resources both for reuse and reproducibility purposes. To make this happen, Crotty has worked on a policy proposal to get researchers motivated to share their research data.

In his talk Registered Reports, Dr. David Mellor (Director of Policy Initiatives, Center for Open Science (COS), Charlottesville, VA) explained how we can capture value and bring more quality through pre-registration and registered reports. He explained how the Center for Open Science increases trust and credibility in scientific research through three main activities: Reproducibility projects; Advocacy and policy work to align incentives; and building tools and infrastructures for processes like data sharing, OS methods and pre-registration collaboration.

He continued on the need for protocol transparency. A publication can be a summary of work of many years but does not necessarily cover everything, like early drafts, protocols and experiments that did not work. With preregistration you can capture that early work and get it disseminated while still maintaining credibility and interpretability. A pre-registration is a time-stamped research plan before the study is conducted, so before the data are collected. It is submitted to a public and searchable registry. A registered report is an article type, a proposed study submitted for peer review. All registered reports should include a pre-registration but not all preregistrations will be published as registered reports. Both of them make a clear distinction between planned hypotheses and unplanned discoveries. This helps to address the publication bias against null results.

A two-stage peer review process is included, so that feedback from reviewers is received before the study is conducted and protocols can be modified. At Stage 1, peer reviewers discuss the statistical analysis; by having that discussion without looking at final results, they can assess whether the methods are sound in an unbiased way. Then the work is conducted, and authors can submit the final results, possibly years later. The methods and protocols should not have changed; the new parts are the results and discussion. At Stage 2, reviewers check whether the conclusions are justified by the data.

The advantages of the preregistration format: it is reproducible; it adds detail to the protocols; it is more transparent, with Open Data and open materials; and it is more credible, because there is no hindsight or publication bias.

He said that research shows that hypotheses are at least three times more likely to be disconfirmed in registered reports than in regular articles, which means that conventionally published results give a biased representation of our knowledge. There is no evidence that registered reports are cited less, and more and more journals have adopted registered report formats. A database of resources for authors, reviewers and editors is available on the COS website and will help incentivize preregistrations.

Maryann Martone (Professor Emerita of Neuroscience, University of California at San Diego) started her talk Reporting Research Methodologies and Reagents with the spectrum of reproducibility; COVID-19 has shown us how important this is. One aspect is methods reproducibility: obtaining the same results. But how do you do that? The data, tools and operating system all matter, but a lot of this information is not included in the article, as current publishing platforms are not optimized for this. And they should not be. These data can live on other platforms, where physical research objects are given digital persistent identifiers, analogous to an ORCID iD for people. How do we do that for materials and methods?

One of the current problems with the methods section is that there is not enough detail; information is missing. This is not the researchers’ fault, as you cannot go into much detail: the methods section is about establishing credibility for your research; the paper is for communication purposes, not an instruction guideline. Martone said that it would be good if the methods section could take the form of a recipe for science: the materials you need and the method described as a list of steps. These steps can be shared, reviewed and questioned. The research objects, like data, workflow specifications and complete results, can be hosted on separate platforms. You could include a PID to link to a specific antibody, for example, and add information that you do not need to include in your paper. A 2013 study showed that up to 50% of research objects are not identifiable from the information provided in the article. Research Resource Identifiers (RRIDs) were introduced in 2014 to identify resources and track who else has published with a given resource. RRIDs are having an impact: it is possible to aggregate data in the research information network and pull in citations.
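
To make the RRID mechanism concrete (this example is not from the talk, and the specific identifiers are illustrative), a resource citation is simply embedded in the methods text and follows a fixed pattern, so it can be found and validated automatically. A minimal Python sketch, assuming the common RRID prefixes (AB_ for antibodies, SCR_ for software tools, CVCL_ for cell lines):

    import re

    # Methods-section sentence citing resources with RRIDs (identifiers are illustrative only)
    methods_text = (
        "Cells were stained with an anti-GFP antibody (RRID:AB_0000001) "
        "and images were analyzed in an open-source tool (RRID:SCR_0000001)."
    )

    # Common RRID prefixes: AB_ (antibodies), SCR_ (software/tools), CVCL_ (cell lines)
    rrid_pattern = re.compile(r"RRID:(AB|SCR|CVCL)_[A-Za-z0-9]+")

    for match in rrid_pattern.finditer(methods_text):
        print("Found resource identifier:", match.group(0))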

Protocols.io is a community platform where each protocol receives a DOI. It takes a lot less time than writing an entire article, it manages versions, and people can ask questions, which is not possible in a research paper. Researchers should be encouraged more to use these protocol services.

Prof. Martone concluded by saying that it is good to see that methods sections and reproducibility are receiving more attention. Methods reporting should be made more FAIR and more structured through linking of research objects.

Scott Fraser (Provost Professor, Director of Science Initiatives, University of Southern California, Los Angeles) talked about Publishing a complete Record of a Research Project (Capturing the details; Enabling reproduction). He said there are challenges in how we can do this: in numbers, in size and meta-analysis of data and in making data available and useful for others. The Open Microscopy Environment (OME) produces open tools to support data management for microscopy, such as OMERO, an image data management platform.

Another problem is the explosion of different standards being used; the trap of yet another image standard should be avoided. Data management right from the start is key, so the research value can be preserved, and scientific data management should empower people to do meta-analyses (increasing the value of previously curated data). The challenge is how to incentivize researchers; just saying it is required or good for the field does not work.

Fraser said that capturing the entire workflow should be part of daily life; it will make researchers’ own experiments easier to do. The focus should be on transparency, context and long-term accessibility. His group experienced the benefits of good data management in an experiment on a “Synapse” organization and analysis pipeline (SOAP), where they documented right from the start what they were doing. Fraser shared best practices for transparency and reproducibility, including: keep track of how results are produced, avoid manual (meta-)data manipulation, record intermediate results (using standardized formats), connect text and figures to the underlying results, provide public access to scripts and results, and add DOIs. In this way you follow the FAIR principles, the process becomes transparent (all the way from raw to polished), you can design better experiments, and the data and tools are reusable and interoperable. Fraser concluded: this provides the carrot as a motivation for good data management, and there is no need for a stick.
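
As a minimal sketch of what these practices can look like in code (the file names, fields and the trivial “analysis” step are illustrative, not from the talk), each processing step can write its output together with a small provenance record, so a figure or result can later be traced back to the exact input data and script that produced it:

    import hashlib
    import json
    import platform
    import sys
    from datetime import datetime, timezone
    from pathlib import Path

    def sha256(path: Path) -> str:
        # Hash the input file so the exact data version is recorded
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def run_step(input_file: Path, output_file: Path) -> None:
        # Placeholder "analysis": copy the data; a real step would transform it
        output_file.write_bytes(input_file.read_bytes())

        # Provenance record linking the output to its input, script and environment
        record = {
            "output": str(output_file),
            "input": str(input_file),
            "input_sha256": sha256(input_file),
            "script": Path(__file__).name,
            "python": sys.version.split()[0],
            "platform": platform.platform(),
            "created": datetime.now(timezone.utc).isoformat(),
        }
        Path(f"{output_file}.provenance.json").write_text(json.dumps(record, indent=2))

    if __name__ == "__main__":
        raw = Path("raw_measurements.csv")
        raw.write_text("sample,value\nA,1.0\nB,2.0\n")  # toy data so the sketch is runnable
        run_step(raw, Path("intermediate_results.csv"))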

At the start of the panel discussion, Crotty said that all panelists had talked about the importance of thinking in advance; it is about the start. How do we change the mindset of researchers to make this an integral part of their regular experimental workflows? Fraser said that most labs finish a study, then figure out where to store the data, and then never use it again; that is wasted money and effort on storage. Imaging protocols should be made an integral part from the start, as this will result in better science and better archives. Mellor agreed: it takes better planning early on and better organization to change the workflows. He advised taking additional, different steps one at a time, trying them out, e.g. planning datasets, data sharing, preregistration, and seeing how these changes benefit the researcher’s workflow.

Martone added: you have to prepare to share, and prepare to share FAIR. It is a mindset that has to change, and data management plans have to be followed. Labs should organize data infrastructures, as this has not yet happened on a large scale; it should be an investment. Mellor said there is already growing evidence of citation benefits for papers with datasets, and funders could also value plans that embrace transparency and promote OS practices.

Crotty asked whether it would be a negative thing if registered reports led to more negative results. Mellor agreed that more negative results will be published with the registered reports workflow, as it is a more effective way of getting null results out there. Martone said that not everything should be forced into the paradigm of research articles; you can open up venues that are more suitable for negative results, e.g. data repositories. With new data science practices there will be a rise in the value of negative results. Fraser added that the paper should be a pointer to the database. When data management is part of the workflow, it should not be extra work to publish negative results, because you already have the data.

Crotty concluded that it is good to see so much work actually being done and that the industry can continue to think about ways to better connect the dots in the research workflow, such as pulling value from subsequent works, e.g. grant applications, post-publication peer review, etc.

DAY TWO: Wednesday, 13 January 2021

Yvonne Campfens (Executive Director, OA Switchboard, The Hague) spoke about From Complexity to Transparency: How the OA Switchboard is building a cost-effective collaborative Infrastructure Solution for an OA-driven scholarly Communications Landscape. Campfens started by saying there are practical and emotional trust issues in the way of a faster transition to Open Access. These are related to issues like the redistribution of money in the system and a lack of transparency. Transparency is needed for trust and accountability, for assessing and comparing value, and to prevent money from being wasted. In addition, everybody is reinventing the wheel: there is a myriad of workflows and systems. The publishing industry could learn from other industries that collaborate and use shared infrastructures. According to Campfens these are heated topics, preventing a faster transition to OA.

The OA Switchboard was launched to contribute to the solution as a neutral, independent intermediary, providing shared infrastructure, standards and back-office services for funders, institutions and publishers. It is a central information exchange hub for multilateral, publication-level OA arrangements. It does not get involved in invoicing, but it enables the exchange of information about OA-related publication-level arrangements and ensures that a financial settlement can be made. The OA Switchboard is a message hub in the middle, between the different stakeholders; it is not building a database, it is a communication hub. It validates the messages and enables and streamlines communication. This simple technology solution tackles multiple use cases in different steps of different workflows, covering not just manuscript submissions but also data. It is flexible and Open Source, which Campfens said is essential for building trust and transparency. To deal with the heated topics, the project had to be inclusive from the start, have clear and common goals, include all stakeholders and report regularly to keep everybody involved.
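
As a purely illustrative sketch of the kind of information such a hub passes around (the structure and field names below are hypothetical and simplified, not the actual OA Switchboard message schema), a publication-level notification from a publisher to an institution and a funder might look like this:

    # Hypothetical, simplified publication-level message; field names are illustrative only
    example_message = {
        "message_type": "publication_notification",
        "sender": {"role": "publisher", "name": "Example Publishing House"},
        "recipients": [
            {"role": "institution", "name": "Example University"},
            {"role": "funder", "name": "Example Research Council"},
        ],
        "article": {
            "doi": "10.9999/example.2021.0001",            # fictitious DOI
            "title": "An Example Article",
            "corresponding_author_orcid": "0000-0000-0000-0000",
            "license": "CC BY",
        },
        "oa_arrangement": {
            "type": "transformative_agreement",
            "agreement_id": "EXAMPLE-AGREEMENT-2021",
            "settlement_required": False,
        },
    }

    print(example_message["article"]["doi"])

Each party receiving such a message can match the article against its own records (agreements, grants, author affiliations) without a central database holding the data, which is consistent with the hub-not-database design described above.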

For futureproof governance, it was decided to set up a new foundation, the Stichting (Foundation) OA Switchboard. This legal structure was chosen because it preserves neutrality and independence.

Campfens concluded by saying that the OA Switchboard will continue to build trust through transparency, efficiency and cost-effectiveness. It is about making the ecosystem work better and letting researchers focus on research, and it will work best if all stakeholders participate. The years 2021/2022 are regarded as the ‘launch phase’, to achieve wide adoption and allow time for (technical) integration, implementation and continuous improvement.

The session: New Dotcoms to Watch, was introduced by Drs. Eefke Smit (STM Director of Standards and Technology, Amsterdam). The following start-ups in scholarly communication were presented:

Sciscore.com by Martijn Roelandse (Lead Business Development)

The reproducibility crisis causes wasted money and delays. To address this, Sciscore has introduced resource identifiers; using them makes for better, more reproducible papers. Sciscore can evaluate methods sections and measure the reproducibility of research in a few seconds: 6 automated screening tools and 55 algorithms check the methods section of an article, and you get a complete report. The report shows whether rigor criteria like blinding, randomization of groups and power analysis are met. Sciscore Version 2 includes features such as code and data information. The Sciscore report includes a rigor table and a resource table, and the article receives a score on a scale of 1–10 that publishers can use.

When the average Sciscore increases, it means more reproducible papers are being published. Sciscore not only creates dashboards for journals to check performance; institutions and funders can also have dashboards to help them take better-informed policy decisions. In conclusion: the reproducibility of science is made measurable, publishers and editors are able to assess the reproducibility of their journals, and Sciscore advises editors where to focus their efforts to improve their journal.

Labforward.io by Florian Hauer (Founder/CPO)

Hauer stated that there are different challenges in science: it has become more difficult to make groundbreaking discoveries – productivity (the internal rate of return (IRR) on R&D spending) has declined fivefold over the last decade, which has led to a $15 billion productivity gap. In addition, there is a reproducibility crisis: money is wasted each year on basic biomedical research that cannot be repeated successfully. Where do these problems come from? Different steps of the production workflow happen in silos; there is no connection between different cycles and no integration across data silos. Labforward offers software to manage data across the scientific life cycle: Labfolder, Labregister and Laboperator. Different tools and platforms enable horizontal integration to make more out of the data that is generated, while vertical integration provides the required depth at all stages of the R&D process. The software is developed for scientists, scientific organizations and STM publishers.

iris.ai by Anita Schjøll Brede (CEO/Co-Founder)

Schjøll Brede said we live in a world of exponential growth of (scientific) knowledge. How do we navigate all of this to find the information we need? When you use a citation system, there is a risk of citation bias, so it is not a system that can be trusted. You can search with keyword queries, but how do you formulate good keyword queries for interdisciplinary searches, for example? When using Artificial Intelligence, you need human-machine collaboration and trust; this leads to demonstrably improved results.

Iris.ai is a set of tools to semi-automate literature review with AI. It is an exploration tool that finds matching scientific papers from across the globe. The tool takes the text you input, builds a contextual fingerprint to get an understanding of your problem, and then finds results; human-machine collaboration is essential. The tool has been shown to work through a series of experiments, large-scale reviews and hackathons in which teams compared performance. It reduces time and increases quality, and it connects to a variety of content outputs. It is a tool that can be trusted because it enables machine-human collaboration in an efficient way, better than existing paradigms.

MagmaLearning.com by Maxime Gabella (CEO/Founder)

A lot of what we learn and read will be forgotten very quickly. Therefore, MagmaLearning has developed ARI 9000: a personal AI tutor app powered by machine learning that helps to consolidate what you learn via personalized learning algorithms able to self-improve with experience. ARI learns from the learner and vice versa in order to further personalize the learning process. It uses Natural Language Processing and automatically generates micro-learning components like puzzles and summaries. It ensures long-term retention with the least effort and is the best way to consolidate knowledge.

Learning is very personal, so to get it right, AI and machine learning are used so that every person gets the right content at the right moment. Experiments have shown significant benefits for groups using the app. Knowledge visualization and continuous feedback are used so learners can see any knowledge gaps, which keeps them motivated. The app is used in many sectors: schools, universities, hospitals, companies and non-profits. The personalized learning experience adapts to diversity and promotes inclusion, for example for children with dyslexia or for an international company with people from different backgrounds.

Scholarcy.com by Emma Warren-Jones (Co-Founder)

The problem is that there are too many papers and there is not enough time to read them all. In 2018 over 3M articles were published. It is a universal challenge for researchers; they are overwhelmed with the volume of research, trying to skim read to keep up but they might quickly forget what they have read.

Scholarcy offers the solution: articles distilled into summary flashcards. With deep learning technology, Scholarcy does the skim reading so researchers can determine if the paper is relevant for their research.

Scholarcy distills a paper into a flashcard with a headline summary, a set of significant terms linked to Wikipedia, and bullet points. Translations are handled by Google Translate.
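
Such a flashcard can be pictured roughly as the following data structure; the field names and example content are illustrative assumptions, not Scholarcy's actual output schema or API:

    # Illustrative representation of a summary flashcard; field names are
    # assumptions and do not reflect Scholarcy's real data model or API.
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class Flashcard:
        headline: str                                             # one-sentence summary of the paper
        key_terms: Dict[str, str] = field(default_factory=dict)   # term -> Wikipedia URL
        highlights: List[str] = field(default_factory=list)       # bullet-point findings

    card = Flashcard(
        headline="Open access monographs reach a wider and more diverse readership.",
        key_terms={"open access": "https://en.wikipedia.org/wiki/Open_access"},
        highlights=[
            "Downloads of OA books exceed those of closed books.",
            "Citations to OA books are higher on average.",
        ],
    )
    print(card.headline)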

Research shows that Scholarcy cuts reading time by over 60% and makes complex information easier to digest. Researchers and students, libraries and research communications departments and content and discovery services are using Scholarcy in different ways, for example to share their own research or to generate lay summaries from complex articles to promote their research output. Publishers and aggregators use the Scholarcy API to create rich XML metadata from documents in any format, and to support the publishing process.

Researcher-app by Oliver Cooper (CEO)

With so much content published each year it is difficult for researchers to stay up to date: they don't want to miss anything. Publishers, on the other hand, face the challenge of reaching and engaging audiences. Researcher-app is designed to solve problems on both sides of the scholarly cycle.

In 2017, less than 10% of visits to journal websites came from mobile devices, so the Researcher app was built to solve the problem of discoverability on mobile and to make sure researchers will not miss anything.

Publications from different disciplines are indexed and content is pushed to scientists and researchers. The content offering is expanding all the time. Users also get to see additional relevant content, e.g. calls for papers, newsletter campaigns, etcetera. The journey so far: almost 2 million users generate 3.8 million content impressions per week.

After the 6 presentations, the audience was asked to vote for the different dotcoms by answering questions such as: which of these dotcoms would you like to collaborate with, in which dotcom would you invest your money, and which dotcom will be the most successful in 3 years? At the end Smit declared Schjøll Brede (Iris.ai) the winner.

The Session Collaborations built on Trust was chaired by Dr. Manuela Gerlof (VP Publishing Humanities & Social Sciences, De Gruyter, Berlin) and Prof. Dr. Andreas Degkwitz (Director, Humboldt University Library, Berlin).

Gerlof started the session by saying that this year, De Gruyter celebrates its 10-year OA book anniversary. In collaboration with partners, De Gruyter publishes 3000 OA book titles and 600 OA journals. In 2020, 10% of the book frontlist was published gold OA, which means that 90% is not freely accessible, so there is still a long way to go to a full OA transition. This can be done through a collaborative approach and relevant partnerships. Gerlof added that the OA transition is challenging in HSS for various reasons, such as the diversity of research areas, the different types of content and HSS researchers’ affinity for print.

In his talk ‘Subscribe to Open’ as an alternative path to OA Transition, Dr. Kamran Naim (Head of Open Science, CERN, Geneva) explained why, in recent years, diversification of the underlying business models was needed to support OA publishing. He stated that APC-funded models don’t work across the board: there is a lack of funding in some disciplines and the model is incompatible with some types of research outputs. APCs are a challenge for authors in the Global South and the model continues to create new barriers.

He said that some alternative non-APC OA models have focused on a collaborative approach such as the consortium approach (SCOAP3 Model), the publisher driven PLOS Community Action Publishing (CAP) Model and the Subscribe to Open Model. The S2O model has been built on the basis of collective action theory to motivate participation, as opposed to models which rely on altruism or pro-group behavior. The model can support publishers to transition their subscription journals sustainably to OA.

He continued with the points underlying the model: 1: Selecting S2O is in an institution’s economic self-interest; it is offered at a discount relative to the subscription price. 2: It targets current, existing subscribers with expressed demand for the content. 3: It does not require collective coordination and leverages economic self-interest. 4: It uses existing procurement processes and maintains the existing relationship between publishers and libraries. 5: Openness is guaranteed only with full participation, to reinforce that S2O is a subscription; control of the decision to publish OA remains with the publisher. 6: It is designed to recur annually; the stability of the model requires the continued participation of sufficient subscribers to meet revenue requirements.
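
The collective-action logic of point 5, openness only with sufficient participation, can be illustrated with a toy calculation; the decision rule and all figures below are invented for illustration and do not represent any publisher's actual S2O accounting:

    # Toy illustration of a Subscribe to Open decision: if enough existing
    # subscribers renew at the discounted S2O rate, the volume year is opened;
    # otherwise it stays subscription-only. All figures are invented.
    def s2o_decision(renewals: int, s2o_price: float, revenue_target: float) -> str:
        revenue = renewals * s2o_price
        if revenue >= revenue_target:
            return f"Open: {revenue:,.0f} meets target {revenue_target:,.0f}"
        return f"Gated: {revenue:,.0f} falls short of {revenue_target:,.0f}"

    subscription_price = 1000.0
    s2o_price = subscription_price * 0.95   # offered at a discount (point 1)
    print(s2o_decision(renewals=220, s2o_price=s2o_price, revenue_target=200_000))
    print(s2o_decision(renewals=180, s2o_price=s2o_price, revenue_target=200_000))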

Naim concluded with the S2O community of practice: this platform allows publishers to share experiences with the model and to familiarize libraries with it, and it is open to all so that everyone can see to what extent individual offerings align with the principles of the model, in order to avoid confusion around it.

Dirk Pieper (Deputy Director, University Library Bielefeld and Head of National Contact Point) spoke about Collaborative Open Access: The National Contact Point Open Access Germany OA2020-DE.

He started off with the concept of trust: trust is relevant in organizational change processes, and the success of the OA transformation is strongly related to trust. The National Open Access Contact Point in Germany was launched in 2017 as a complement to the DEAL project, with the aim of creating the conditions for large-scale OA transformation. The project will end in the summer of 2021. Pieper said that an OA monitor has been created to get better insight into publications and the costs of OA transformation. Complementary business models for OA transformation in HSS and for books have been developed, and workshops have been organized with an emphasis on community building to carry on the OA transformation when the project ends; reports are available on OA2020-DE.

Pieper continued with 3 examples of what the OA Contact point has achieved:

1: Cooperative financing for transforming books to OA; this resulted in the participation of more than 40 libraries and academic institutions and the transformation of more than 180 books to OA.

2: Supporting editors and publishers in implementing Subscribe to Open in Germany; this is particularly relevant for smaller publishers publishing content in German.

3: Creating the Enable! Community for OA transformation in HSS. This community aims to develop an inclusive, stakeholder-driven OA culture, enables the exchange of experience, and promotes fair OA business models for books and journals in HSS.

Trust in the large-scale OA transformation is extremely relevant. Librarians need to trust the stakeholders, the financial mechanisms, the workflows and the outcomes of the OA transition. Pieper concluded with a quote from Luhmann to illustrate that change processes are very complex and that we have to expand from personal trust into system trust.

In his talk Data-driven Publishing and scalable Reading: co-designing the ‘Journal of Digital History’, Prof. Dr. Andreas Fickers (Director, Luxembourg Centre for Contemporary and Digital History, University of Luxembourg) spoke about the new Journal of Digital History (JDH), launched by the Luxembourg Centre for Contemporary and Digital History and De Gruyter. The mass digitization of sources offers new possibilities for digital history research. So far there have been few opportunities for digital historians, as most platforms are still paper-based. The journal offers an innovative publication platform for historians to demonstrate what they do.

The JDH will set new standards in history publishing based on the principle of multilayered articles: the first layer enables authors to produce transmedia narratives (narration layer); the second layer explores the authors’ reflections on the methodological implications of using digital tools and data (hermeneutic layer); the third layer gives access to data and code through a professional infrastructure (data layer). Fickers said that the JDH aims to build trust with historians in order to set new standards in history publishing.

The journal team is driven by the idea of co-designing the journal with the publishing world and the world of historians. The core of the editorial process is based on Jupyter Notebooks: this open source research tool is used to build an interface that bridges the different article layers. The source files of the notebooks are translated into specific elements of the journal article, such as the abstract, methods, etc. To create trust and optimal adoption, the platform is co-designed with authors. This is done through workshops, ongoing communication and agile development (writing the first issue and building the platform are linked processes). Fickers said that this process differs from ‘classical’ submissions to traditional journals; it is true co-development.
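
A minimal sketch of how tagged notebook cells could be sorted into the three layers is shown below; it uses the real nbformat library, but the tag names, the mapping and the file name are assumptions for illustration rather than the JDH's actual production pipeline:

    # Sketch: read a Jupyter notebook and sort its cells into three JDH-style
    # layers based on cell tags. Tag names ("narrative", "hermeneutics", "data")
    # and the file name are assumed for illustration.
    import nbformat

    def split_into_layers(path: str) -> dict:
        nb = nbformat.read(path, as_version=4)
        layers = {"narration": [], "hermeneutic": [], "data": []}
        for cell in nb.cells:
            tags = cell.metadata.get("tags", [])
            if "narrative" in tags:
                layers["narration"].append(cell.source)
            elif "hermeneutics" in tags:
                layers["hermeneutic"].append(cell.source)
            elif cell.cell_type == "code" or "data" in tags:
                layers["data"].append(cell.source)
        return layers

    if __name__ == "__main__":
        for layer, cells in split_into_layers("article.ipynb").items():
            print(layer, len(cells), "cells")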

Fickers concluded with an example of what an article in the journal will look like. The article format will allow scalable reading: when the article refers to data, you can follow a direct link to the data and interact with it in real time. Scalable reading, combining close and distant reading, is the future for the digital humanities. Readers will be able to interact with the text, the code and data visualizations. The first issue will be online in autumn 2021.

Ros Pyne (Director, Open Access Books and Book Policies, Springer Nature, London) spoke about The Future of the Monograph: long-form Scholarship in the digital Age. She stated that the monograph still plays an important role in the HSS field. Over the last decade, there have been changes in the format: journal-like features have been introduced and digital formats allow, for example, linking in references. Electronic Supplementary Material (ESM) can be added, and it is also possible to add ESM to printed monographs and access that material via an app. Additionally, there has been a rise in audio books, although this does not have a huge impact on core scholarly publications. Other innovations include books written by AI.

Pyne said that despite the digital developments the changes are not that radical, because evidently the monograph already serves its purpose very well. The print monograph persists for both practical and sentimental reasons. In 2020 there was greater demand for e-books because physical libraries were closed. Pyne stated that even though more can be done to improve the e-book reading experience in this digital age, the focus should be on the transformation to OA to support the widest possible dissemination of HSS in a format valuable for researchers and audiences.

A study has shown that SN OA books have higher usage and a more diverse readership. SN OA books are downloaded 10 times more than non-OA books, and OA books are cited 2.4 times more. OA books reach more readers, in more countries. Pyne considered this a very compelling argument for more OA books. But in order to accomplish this, challenges have to be dealt with. A 2019 survey showed that the main reasons for not publishing a book OA are: inability to find funding for OA books, low awareness, and concerns about quality. Pyne said this cannot be solved at once and the community should work together to achieve it.

A recent collaborative initiative was the development of the OAPEN OA Books Toolkit. It is a trusted source of information, providing answers to help authors publish an OA book. Workshops were organized to understand the community’s needs, and an editorial advisory board with a wide range of perspectives was established. The toolkit was written collaboratively with different stakeholders and launched at the end of 2020. It includes a wide range of information, with introductions, information on OA licensing, funding, etc. It is a living resource, and the aim is to broaden the toolkit further, for example with more information and in other languages.

The Session: Climate Action. Influencing Policy and tackling real-world Challenges – how can scholarly Collaboration support rapid Action? was introduced by Dr. Liz Marchant (Global Journals Portfolio Director - Life, Earth & Environmental Sciences, Taylor & Francis Group, Abingdon). She said that it is good to see that over the past few years, the United Nations Sustainable Development Goals (SDGs) are increasingly becoming part of the discussion in the community.

Dr. Lewis Collins (Editor in Chief, One Earth, Cell Press, Cambridge, MA) spoke about Climate Change – what does the data tell academic publishers and how should you respond? The journal One Earth addresses grand challenges: complex socio-environmental issues that cannot be solved by one discipline; they cannot be treated in isolation and require a cross-disciplinary approach. The problems need multiple perspectives and publishers have a key role to play here. Agreements such as the SDGs interlink overarching challenges.

Collins continued on the role of journals and publishers: to advance the understanding of climate change. Despite the huge number of articles being published on climate change, nothing has happened, so it is a public duty to make climate change science heard. Journals should also encourage co-creation to establish trust in science and break down barriers between and across disciplines. By providing additional context around articles, publishers have the power and responsibility to impact the world.

Last year Elsevier published a report on SDG-related research to verify whether publishers are accumulating the right knowledge. Climate research is growing rapidly, not only in volume but also in number of citations. So the knowledge is there and is being cited, but it needs to be converted into action. Climate research is dominated by high-income countries; this is a concern because climate change is felt most in low- and middle-income countries. The research agenda cannot be dictated by the Global North; there should be more collaboration, and publishers can play a role in this too.

Collins said that climate research lacks a social science focus. Climate change is a societal issue and addressing it will require shifts in society and behavior, so that is a challenge; we need to see an increase of research in this area. There is also little consideration of sex and gender in climate research, which is concerning given the growing evidence that climate change affects genders differently. Collins concluded by emphasizing that journals can have an impact and added that the response to COVID-19 has shown that the publishing industry can contribute to positive change.

Dr. Joanna Depledge (Former Editor, Climate Policy journal, and Research Fellow, Centre for Environment, Energy and Natural Resource Governance (CEENRG), University of Cambridge) spoke about Advancing SDG 13: maximizing the policy impact of academic publishing on climate change. The journal Climate Policy was founded 20 years ago, when research on climate policy was in its infancy. Climate policy research and publishing have grown rapidly over the past years. Depledge said that publishing on climate policy is unique: the authors are mission-oriented and highly motivated to reach policymakers, practitioners and negotiators, the field is inherently interdisciplinary, and there is an active policy environment.

She continued on the barriers limiting policy impact and how to overcome them. Firstly, there is a culture clash between academia and policymakers; this can be overcome by simplifying the message or, for example, through collaborative authorship between policymakers and academics. Secondly, there is information overload; this can be addressed by making papers stand out through free access or promotion, or by grouping them in thematic areas. Thirdly, it is difficult to measure, assess and therefore value policy impact; the dominance of the impact factor is a disincentive to taking on papers that might have policy importance but only limited citation prospects, and changing this would require a change in the whole community. Additionally, there is a failure to break out from the Global North’s dominance of the academic publishing world. At Climate Policy they have tried to tackle this by providing more support to authors, actively encouraging submissions and working with the Editorial Board to increase awareness, but these structural problems cannot be addressed by individual journals alone.

Depledge concluded with her key recommendation: the publishing industry should seriously commit to widening the geographical reach of academic publishing by setting up a mentoring programme providing training on academic writing and publishing expectations for researchers from the Global South, considering free or low-cost academic translation services, translating abstracts and key messages, and compiling and disseminating strategic special issues on and by under-researched regions.

Dr. Andrew Kelly (Portfolio Manager at Taylor & Francis Group, Abingdon) stated in his talk What does the Research Community need from its Stakeholders? that there is a drive for researchers to solve grand challenges. Taylor & Francis ran a survey to understand why and how authors choose their topics and journals. Questions included, for example: What benefits do our journals offer? To what extent do global challenges shape research ambitions? Are the current frameworks shaping researchers’ ambitions?

90% of respondents indicated that their work either currently contributes to tackling real-world problems such as the SDGs or that doing so would be a priority for them in the future. In the Earth and Environmental Sciences, this is a real incentive. But why? For researchers, it is important that their research contributes to tackling big real-world problems, and they care about the impact of their work, such as citations from within their field, contribution to the advancement of science, and readership downloads. A comparison between the results in the US, China, the UK & Europe and India showed that global priorities are changing and that there are issues hampering researchers in achieving their desired impact.

Kelly said that publishers can address this and bridge the gap by communicating the link between highly focused projects and live policy issues, including “lay summaries” and “policy highlights”, and fostering interdisciplinarity. Additionally, publishers can build a pipeline to policy through more robust feedback mechanisms, education for advocacy and for speaking the policymaker’s language, improved translation and dissemination beyond personal networks, and interoperable standards to link research outcomes. Publishers can lean into the role of bridge-builder, within and beyond academia, and foster public engagement. The academic currency could be changed by clarifying policy applications and rewarding and recognizing policy impact. Kelly concluded that in order to fully support researchers’ ambitions to address real-world problems, a collective ambition and multiple solutions from all stakeholders are required.

Sherri Aldis (Chief of UN Publishing at United Nations, New York) and Dr. Michiel Kolman (Chair of the Inclusive Publishing and Literacy Committee, International Publishers Association (IPA), Geneva) spoke about The SDG Publishers Compact. The 17 Sustainable Development Goals (SDGs) were negotiated by all UN member states and finalized in 2015 to transform the world. Publishing contributes directly to SDG 4: education, and influences SDG 1: eradicating poverty.

Kolman said that COVID-19 has worsened the prospects of achieving the goals. Progress on SDG 4 (education) was already lagging, and because of school closures the situation has deteriorated further. The lockdowns have also led to an increase in domestic abuse and violence against girls and women, which makes achieving SDG 5 (gender equality) more problematic too.

To achieve the SDGs by 2030, raising awareness should be a continuous effort, with a focus on accelerating progress during the “decade of action” (2020–2030). Publishers can be agents of change through their publications and accessible content. Publishers should also get their own house in order, for example by travelling less, and in the area of gender equality there is room for improvement too.

The SDG Publishers Compact was launched at FBF 2020 in collaboration with the International Publishers Association. The Compact is designed to inspire action among publishers and aims to accelerate progress towards achieving the Sustainable Development Goals (SDGs) by 2030. There are already over 60 signatories, many of them from STM. Aldis concluded by saying that publishing is already very much aligned with the SDGs and that many publishers are already contributing to them without realizing it.

The panel Balancing the Need for rapid Sharing with the Need for rigorous Evaluation – the Role of Preprints and Peer Review was moderated by Magdalena Skipper (Editor-in-Chief, Nature, London). She stated that the adoption of preprints has been rising for the last couple of years and has seen an astronomic increase during the COVID-19 pandemic. Preprint sharing is not new; it goes back to at least the 1960s and has evolved over the years. The benefit is clear, and it is crucial within the health and clinical sciences that scientific findings are appropriately scrutinised before being made public.

Prof. Dr. Christopher Aiden-Lee Jackson (Equinor Professor of Basin Analysis, Imperial College, London) shared his views on preprints from a (geoscience) researcher’s perspective. He said that there are more than 50 preprint services; these can be generic or discipline- or region-specific, and they span a huge range of topics. He emphasized that preprints are not just valuable in the biomedical and health sciences; it is important to look at other disciplines too, such as the geosciences. Peer Review and the translation into a publication or to the general public are also important. He felt that this part is somewhat lost in the discussion: it is also about researchers and the human drive to get research out. The discussion should shift to how we can optimize preprints for this drive and increase transparency. He added that transparency is a core element of academic practice and in that respect Open Peer Review is valid.

Dr. Theodora Bloom (Executive Editor, The BMJ, London) spoke about Growing a clinical preprint server during a pandemic, from a journal editor’s perspective. She said the case for preprints is clear: they can speed up science (faster dissemination in the research community), improve feedback from the community, and they are freely available. However, the worry in clinical medicine has always been that they may present information incorrectly. When medRxiv was launched in 2019, a lot of time was spent on how to do this. medRxiv is a not-for-profit, publisher-neutral service, similar to bioRxiv. Submissions should be original research articles following community norms, and when these requirements are met they undergo a series of risk mitigations and checks. The key question asked during this process is: is there a benefit to sharing now versus after Peer Review? Over the past year the service has grown considerably; more than 15,000 preprints have now been posted. Bloom added that the growth is not just because of COVID-19. She concluded by saying that issues around ethics will continue to be persistent. Cautions (for journalists, for example) are included with the publication of preprints, stating that they are preliminary results.

Dr. Sowmya Swaminathan (Head of Editorial Policy & Research Integrity, Nature Research, Springer Nature USA) spoke on Integrating preprints with journal Peer Review from a global publisher’s perspective. She said that Peer Review is at the heart of what Springer Nature does and different types of peer review (single/double anonymized, transparent or open) are possible at Springer Nature. All imprints have supported preprints throughout the years and in 2019 a unified policy on preprint sharing, citation and licensing was developed to maximize the benefits of early sharing and to complement the journal peer review process.

In 2019, In Review was launched: a service integrating preprint posting with journal Peer Review. This provides a way for authors to share their research as a preprint while it is under review. The preprint is citable and offers real-time updates. This makes it possible to extend transparency in Peer Review from submission through publication: updates on the peer review process are released in real time, and preprint versions are directly linked to the article on the journal platform. When Open Peer Review is offered, the reports are available and connected. The service is now available for 476 Springer Nature journals.

She concluded by saying that preprints are clearly shifting the status quo. Elements of registration, certification, dissemination and archiving that were originally focused on research articles, are now more uncoupled, with preprints allowing the community to explore new ways of Peer Review. With this shift, publishers are also forced to innovate.

Dr. Thomas Lemberger (Deputy Head of Scientific Publications, EMBO, Heidelberg) spoke about Review Commons, a new platform that organizes independent peer review before journal submission, working with a consortium of publishers and two preprint servers. The aim is to improve three aspects of the Peer Review process. 1: Efficiency: the cross-publisher consortium has agreed to avoid repeated cycles of Peer Review, which means that the process does not need to start from scratch when a manuscript is resubmitted to another journal. 2: Quality: the journal-independent Peer Review focuses on the science. 3: Transparency: authors have the option of depositing the referee reports alongside the publication.

Lemberger explained the workflow: after authors have submitted their research to Review Commons, it is sent out for peer review, and the full set of review reports is circulated among the referees. Review Commons does not make any editorial decisions, but after peer review the authors can submit their work to journals of the consortium, up to 4 times. When the work is published in a journal, the full revision process can be posted alongside the article. He said that all 17 journals participating in the experiment have accepted manuscripts with Review Commons reviews; in 95% of cases no new peer review was done. Only 30% of authors posted reviews alongside the preprint, so more work needs to be done to improve transparency.

He concluded that the audience for review reports now also includes readers, not just the authors and editors. Clarity and transparency are crucial for the adoption of new peer review models, and expert peer review can be combined with automated curation methods that, for example, build knowledge graphs and show emerging topics. He added that human expertise is still needed to meet consistent scientific and editorial standards, which is why journals will continue to play an important role.

Rebecca Lawrence (Managing Director, F1000 Research, London) spoke about post-publication Peer Review and preprints. She stressed that one of the advantages of preprints is that they minimize editorial bias. She continued on the outstanding challenges around preprints: there are often limited checks before posting, there is a lack of clarity about underlying source data, links between the preprint and the peer-reviewed version are not always available, the subsequent Peer Review status is often opaque, and there is inadequate understanding of preprints by the media and the public.

She continued on the changing publication models: when preprints are published you get access early on, but you still miss the review process and have to wait until the end of the process to view the revised, published article. It would be more transparent to have the peer review reports published and linked to the publication. F1000 tries to address this by being completely transparent and allowing versioning. She said there are many different forms of pre-review publication, and hence of post-publication peer review.

She compared different publisher approaches (Elsevier’s First Look, Springer Nature’s In Review and Wiley’s Under Review) with F1000 on key elements of the services, such as editorial review and commercial model: are the review status, the peer review commentary and the revision life cycle transparent? The key difference concerns what happens after rejection: with F1000 the version of record (VOR) remains publicly available, so flaws are transparent. This is a broader view that takes into account the benefits for society. She concluded with the key benefits of post-publication Peer Review: accountability and credit for reviewers, reproducibility, one connected process, all articles going through Peer Review, and a living publication.

The subsequent panel discussion kicked off with the question of whether preprints are here to stay. Jackson thought this depended very much on the individual and their respective research field. Bloom added that the benefits of preprints go beyond the research community and that making research available as rapidly as possible is beneficial for all disciplines, even though preprint adoption may vary between them. The benefits are more evident for certain research areas, as the pandemic has shown: clinical trials to get vaccines out there and the need for transparency in underlying data.

Lemberger added that every domain has to find its own set of rules; there are different cultures of research assessment. Lawrence said some communities already tend to be more collaborative in the first place: they are used to sharing and having open discussions, and in medicine, for example, there is an obvious need and benefit for society. Swaminathan said that it is already happening organically, and it will help when journals provide infrastructures, e.g. around data sharing.

Skipper said that even though it is a positive development that research is publicly available, there is a risk of confusing the public. Bloom said that educating the public about how science is done is crucial, and journalists are crucial translators; the community should build a general awareness that scientists build trustworthiness together. Lawrence agreed that it is important to educate the public that there is not just one view. One way to do this is by publishing peer review reports: there are different views, and it shows how research evolves. She added that the community needs to change the way retractions are perceived; we need to change the research culture. Swaminathan added that openness around preprints could help realize this.

Another question focused on mandating preprint deposition. Bloom said she was never in favor of a journal imposing something before the community is ready. Lemberger agreed; they have seen that the community is not yet ready for posting review reports, for example. However, he felt that a clear vision for the future is needed, but that mandating it now would be too early. Swaminathan added that when thinking ahead, the community should not just keep ‘dominant’ researchers in mind, but also early career researchers and researchers from the Global South. Lawrence said the broader scholarly system should be addressed if we want to move to preprints more broadly, in conjunction with the research culture and rewarding system. Jackson agreed that tools for culture change are needed; researchers also need to trust their own communities. Skipper said that culture is changing over time and maybe mandating is not necessary: it will just happen. Bloom concluded: posting preprints should not be a goal in itself; the goal should be to advance science.

Skipper said that peer review models are already changing, e.g. with Open Peer Review, and asked whether it would be easy to change the culture. Lemberger said it has to be done carefully: it has to be made useful for the readers too, and the review process should not become too complicated. Lawrence thought that for research with immediate impact, publishing the review reports is immediately beneficial; for COVID-19 papers, for example, it provides context for the media and brings richness to the discussion. Comparing transparent peer review in 2013 with the current situation, researchers have changed and reviewers are much more comfortable; the key is giving peer reviewers credit. Bloom added that the main difference they saw was in the tone of the reports. (Financial) conflicts of interest should also be disclosed, and transparency is also key when dealing with inappropriate referee reports, for example. Jackson said that the discussion should be open, otherwise it is a loss to science. Skipper concluded the session by emphasizing that if we want trust in science, we need transparency about how it is done, evaluated and grows over time.

The Final Session, The Fourth Paradigm: Data-Intensive Scientific Discovery: More than 10 Years Later, was moderated by Dr. Irina Sens (Deputy Director, TIB, Hannover). She said the current fourth paradigm of data-intensive science follows on the first paradigm, empirical science; the second paradigm, theoretical science; and the third paradigm, computational science.

Prof. Dr. Claudia Draxl (Physics Department, Humboldt-Universität Berlin) stated in her speech Recycle the Waste! that we are just at the beginning of the fourth paradigm. New tools for data sharing and analysis should be developed; tools are available, but more out-of-the-box thinking is required. A lot of data, with different properties and different functions, is collected, but the right data is needed too: “the pearl on the ground”.

Today’s physicists can never publish all the information that has been collected; that would be too much for writers and readers. But the community should do better and collect the ‘trash’ that is currently not included in publications. Synthesis recipes, for example the synthesis of perfect crystals, may not be included for human reasons: people keep secrets for competitive advantage, or the metadata is incomplete because of hand-written notes. She said that publishing only success stories leads to bias in research; failures should also be published. A lot of data is thrown into the ‘trash’ because it is not needed for the current research, but it may be useful for something else.

The Novel Materials Discovery (NOMAD) Lab has been created for the community to enable data sharing. NOMAD is an internationally embedded FAIR data infrastructure and has collected more than 100 million calculations. A human-accessible form of the archive is available.

She said interoperability is the biggest issue and FAIR data infrastructures are a must. The German initiative Nationale Forschungsdateninfrastruktur (NFDI) is an important step in this direction. Changing our publication culture is key: only if we build a comprehensive FAIR data infrastructure, capturing all aspects of data, from synthesis to experiment and theory, and interweaving data, metadata and processing tools with novel concepts of AI, will materials science reach a new level.

She closed with her vision for the library of tomorrow: it will still include books, as well as research journals connected to data. The library will be enhanced by interactive tools for data sharing, metadata will be centralized so the community has access.

Dr. James Milne (President, ACS Publications, Oxford, and Chairman of the Board of STM, the International Association of Scientific, Technical and Medical Publishers) presented on the STM Research Data Year 2020 - A Review. He said that sharing data is one of the most fundamental aspects of Open Science and that data has been essential to science throughout history. Powered by AI, data can offer science a lot of benefits: create new hypotheses, design new theories, explore new connections and causes etc. He added that STM Publishers have held a long-standing commitment to and a crucial role in FAIR data.

The number of articles linked with datasets is growing by 21% annually. Even though this growth is substantial, compared to the total output of research articles it is still a small share. There are areas where publishers can support data sharing, and The State of Open Data Report 2020 confirms that publishers still have a key role in FAIR data sharing. Researchers would be more motivated to share data if they received full data citation or co-authorship on papers; at present they do not get sufficient credit for sharing data.

Milne said a lot has been done already and STM wants to keep the momentum going with SHARE-LINK-CITE. Activities to encourage authors are scheduled. Milne added that publishers cannot work alone and must start earlier in the process, before the manuscript is submitted.

The STM 2020 research data year, launched at last year’s APE, started with 6 and ended with 21 participating publishers. The dashboard reports on progress as the initiative to promote data sharing has not finished yet. The whole research cycle and additional topics, such as Data Peer Review, need to be addressed. The sharing, linking and citing of research data should be an integral part of scholarly communication.

Prof. Dr. Karel Luyben (Chair of the Executive Board of the European Open Science Cloud (EOSC), Brussels) spoke about the European Open Science Cloud. He said that Open Science, with increased Open Access, FAIR data and citizen science, is the direction in which we are headed. Data skills and the reward and metrics system, applied to universities and countries and not just to individuals, underpin the Open Science movement. He said it is also important to distinguish Open from FAIR data: FAIR is the direction we have to move in to stimulate interoperability, making data as FAIR and as open as possible. He added that data includes any digital research output, including identifiers, standards and code, and metadata.

The EOSC cannot live without the European e-infrastructure organizations; they belong together. Together they form a web of scientific insight: a federation of existing services for storing data and making it interoperable, carrying the quality mark “data made in Europe”. The boundary conditions for EOSC are: the funding comes from the EU; it has to be inclusive for all stakeholders; and the core follows the subsidiarity principle and should provide a shared purpose. In addition, we have to realize that countries have different structures and accommodate those; it should be self-inclusive and hardware-agnostic, and the focus is on FAIR data.

The core functions of the EOSC Association 2020+ are to develop and govern a federating core that manages a compliance framework, trusted certification, and the Authentication and Authorization Infrastructure (AAI). Other functions are: PID policies and outreach to stakeholders, monitoring services and transactions, managing the EOSC trademark(s) and contributing to Horizon Europe policy.

The overarching principle for developing EOSC is that research should be at the heart of EOSC, with a multi-stakeholder approach and openness. The EOSC Association has been established to govern EOSC; it is a European co-programmed partnership with 187 member organizations, including research funding, research performing and service providing organizations. All member countries are represented.

The draft MoU between the EOSC Association and the EU is a contractual arrangement, not legally binding, with a duration until the end of 2030; partners have to commit to openness and transparency and to setting up and implementing an effective reporting and monitoring system. The minimum viable EOSC is a federated set of data sources connected by a core that makes it possible to exchange data. Looking to the future, the aim is to include not just publicly funded research but other public sectors as well, such as education, and the private sector.

Sens asked the panelists what they think 2031 will look like. Draxl thought that full interoperability could be reached within the materials sciences, and that with AI we could be a big step further. Luyben thought it would take 20 years before 50% of all relevant research data is as FAIR as possible. Milne emphasized that the behavior of scientists towards data sharing has to change too; it should be kept simple and be harmonized. To motivate scientists to make data FAIR, tools that save time have to be provided, and this also requires a change of culture. Luyben added that data stewards are important and that there should be an optimal mix of expertise available in both libraries and labs; one size does not fit all. To avoid the risk of scientists getting confused about where to deposit data, it was emphasized that infrastructures should be fully interoperable.

The Academic Research Enterprise: Structure, Leadership, Challenges, and Adaptation, a pre-recorded session presented to CNI on 15 December 2020, was added as an extra session.

Speakers: Jane Radecki, Analyst, Oya Y. Rieger, Senior Strategist, and Roger C. Schonfeld, Director of Libraries, Scholarly Communication, and Museums, Ithaka S+R, New York. Introduction by Clifford Lynch, Director of the Coalition for Networked Information (CNI).

Universities are becoming more sophisticated in their management of the research enterprise, a significant element of their mission and also a major source of revenue that remains strong even during this year’s disruptions. This year, at Ithaka S+R we have been examining the state of the academic research enterprise — how it is managed, the strategic priorities that universities are pursuing for it, and the disruptions caused by the pandemic.

You can also read the two papers from which we presented. Our landscape review of the pandemic’s disruptions to the research enterprise emphasized financial and budgetary impacts, research project impacts, and the human impacts. This project, sponsored by Springer Nature, was principally conducted by Jane Radecki, and Roger C. Schonfeld contributed to it.

Additionally, Oya Y. Rieger and Roger C. Schonfeld examined the role of the senior research officer, a generic title for the vice provost, vice president, or vice chancellor of research. We interviewed 44 of these research leaders from the largest research universities in the US, examining the nature of the role itself, key responsibilities and collaborators, and strategic priorities and challenges. Ex Libris sponsored this project.

Auf Wiedersehen! Goodbye!

Eric Merkel-Sobotta (Berlin Institute for Scholarly Publishing) and Arnoud de Kemp (Founder of APE and Chairman of the Program Committee) closed this year’s APE by thanking the program committee and the speakers for realizing the program. They apologized for some of the technical challenges and hoped that next year’s conference can take place in person again. Merkel-Sobotta added that the Berlin Institute for Scholarly Publishing will start running workshops and seminars this year.

Please note: APE 2022, 11–12 January 2022.