
Summary Report APE 2023 (10–11 January 2023, Berlin, Germany): Berlin Re-Visited: Building Technological Support for Scholarship and Scientific Publishing

Abstract

This paper summarizes the 18th Academic Publishing in Europe (APE) Conference: Berlin Re-Visited: Building Technological Support for Scholarship and Scientific Publishing, held as a hybrid event on 10 and 11 January 2023 and organized by the Berlin Institute for Scholarly Publishing (BISP), a not-for-profit organization dedicated to bringing publishers, researchers, funders, and policymakers together. This year’s conference theme, “Out with the old, in with the new!”, was discussed in keynote speeches, the APE Lecture, and several panel discussions. Current challenges within scholarly publishing, e.g. with research integrity, trust, and research assessment, have much to do with the old ways of doing things. To move science forward, new technologies and innovations, like the decentralized web, FAIR digital objects, and blockchain technology, are needed to shape new paradigms. Many sessions stressed that moving science forward requires not just new technologies but human collaboration and partnerships as well. The changing role of the journal, and the importance of recognizing more diverse research outputs beyond the journal article, was another prominent topic. Addressing it requires not only research assessment reforms and improved collaboration amongst different stakeholder groups, but also new publishing systems, better metadata, and open infrastructures. A session presenting start-ups showcased how Artificial Intelligence, Natural Language Processing, and software technology can be used to tackle problems such as finding relevant funders and peer reviewers and detecting image plagiarism. The conference also discussed what publishers can do to help achieve the Sustainable Development Goals: collaboration, data transparency, and sharing best practices amongst researchers, funders, and policymakers are key. Another important topic at this year’s APE was how publishers can support Early Career Researchers (ECRs): establishing new workflows and infrastructures that enable publication of a wider range of research outputs, and broader recognition of these outputs, will incentivize ECRs. This year’s APE concluded with the APE Award Ceremony. The winner, Vsevolod Solovyov from Prophy, has used AI to enhance the current peer review system and made an important contribution to improving the scholarly communication system.

The 18th Academic Publishing in Europe (APE) Conference, and first hybrid APE, was opened by Marta Dossi (CEO, Berlin Institute for Scholarly Publishing (BISP)) and moderated by Yasmin Lange (Brill).

Prof. Dr. Ulrich Dirnagl (Founding Director, QUEST Center for Responsible Research, Berlin Institute of Health at Charité) started his opening speech with APE’s mission: to encourage debate about the future of value-added scientific publishing, information dissemination, and access to scientific results, and to offer an independent forum for open discourse. He said that academic publishing has changed a lot and that it faces challenges at the economic, legal/criminal, conceptual/procedural, and academic levels.

Prof. Dirnagl focused on a few of these challenges, e.g. the current academic reward system. Due to the exponential growth of science inputs and outputs, metrics have become very important and content has been ‘downgraded’. He stressed that academia is responsible for creating a ‘reputation economy’. The ‘wave of academic publishing’ consists of many different outputs, such as preprints, data, and blog posts. And there is an additional mass of ‘dark matter’: unpublished outputs such as negative results and unreported clinical trial results. He added that if all these outputs were published, the wave would be even bigger.

He continued with another problem: Peer Review. He mentioned a few of the reasons why Peer Review is overrated as a quality control mechanism: methods and approaches are too complex, and it is time-consuming, not scalable, and opaque. Some of these problems can be fixed by technology, but others cannot. For Prof. Dirnagl, the root problem is the academic reward and incentive system. He said the system needs to go back to content, with fewer metrics, more narratives, and rewards for Open Science and Team Science. This would mean fewer papers, more time for research, and more quality, diversity, and innovation. He said that change is happening: there are initiatives like the Coalition for Advancing Research Assessment [1], and there are many other developments such as registered reports, Review Commons, open peer review, and Open Lab Notebooks.

Prof. Dirnagl finished with what he sees as the future of scholarly communication: heavily preprint-based, fewer classical journals, disentanglement of the publishing and review processes, modular, automated, less commercial, and integrating the perspectives and approaches of non-WEIRD (Western, Educated, Industrialized, Rich and Democratic) countries. And additionally: look at what Early Career Researchers (ECRs) expect of the scholarly publishing system.

The Leadership Panel: Embracing the New was chaired by Dr. Caroline Sutton (CEO, The International Association of Scientific, Technical and Medical Publishers (STM), Oxford/The Hague). Panelists were Judy Verses (President, Academic and Government Markets, Elsevier), Rachel Burley (CPO, American Physical Society), Dr. Guido Herrmann (Sr Vice President, Wiley Partner Solutions), Harsh Jegadeesan (CSO, Springer Nature) and Dr. Damian Pattinson (Executive Director, eLife).

Dr. Sutton opened the session by asking the panelists what they saw as opportunities for the publishing industry. Several panelists stressed that the industry should make better use of technology and that collaboration with researchers, funders, and policymakers should be enhanced, e.g. when tackling climate issues. Dr. Pattinson saw the rise of preprints as an opportunity, and Dr. Herrmann added that the key opportunity for publishers is to support researchers in the transition to Open Science. Burley mentioned the changing role of the journal: will the next generation still be very journal-focused? How can we make it more intuitive and easier for researchers?

The discussion continued on innovation in the publishing industry. Verses said that for the so-called TikTok generation, the industry is never innovative enough; the primary artifact in scholarly communication is still a flat PDF: ‘We should push towards a reproducible knowledge stack’. Jegadeesan said that Open Science will enable a new world of connecting different stakeholders. Dr. Herrmann said to be mindful of the complex ecosystem: the PDF can mean different things to e.g. an author, a reviewer, or a leader. Dr. Pattinson added that innovation on tools for e.g. code, data, and dynamic figures should continue, even though this can be challenging.

The panelists agreed that a lot is already being done to drive innovation, e.g. fully digital workflows, but not all innovations are visible to researchers. They saw an additional need to exploit the full potential of new technologies, e.g. AI, and innovation potential in making the literature more trustworthy. From a biomedical perspective, preprints are the future. Dr. Pattinson added there are also concerns and barriers, e.g., authors need to get used to public criticism. Verses said that one result of the Elsevier Confidence in Research report [2] was that authors are anxious about misinformation getting out there. Dr. Herrmann added that the shift to Open Access and Open Science has an impact not only on authors but on readers as well; there is additional innovation potential in getting information to readers. Jegadeesan said that because of growing content, more needs to be done to help researchers keep track: the systems cause a lot of extra work, and publishing needs to become much easier. Publishers should help researchers communicate their research, and this should be done in an equitable way.

Additional challenges mentioned by the panelists were the article-based economy, duplication of efforts, and a lack of standards, e.g. on metadata. Efficiency with regard to Peer Review and research integrity should be increased, and tracking and measuring inequities in scholarly publishing should also be tackled. Dr. Sutton warned to be aware of OA business models creating new inequities, not just North-South but also between well-funded and less-funded research.

Questions from the audience focused on quality control, trust in science, and publishers’ responsibility. Dr. Herrmann thought publishers could be more transparent, e.g. regarding image and data checks. Dr. Pattinson said that everyone should get used to the idea that science can be messy and wrong; by educating the audience, creating lay summaries, and making better use of AI, publishers can do more. Additionally, quality control beyond the PDF format should be improved through the development of research infrastructures that allow for combined research outputs. This cannot be done in silos; e.g., the NFDI [3] is being co-created with multiple stakeholders. It is also about incentivizing researchers and involving different communities, such as ECRs and different disciplines. Within the academic system there are fundamental challenges that cut across all disciplines, and innovations that address them will benefit science as a whole.

Prof. Jörg Rocholl (President, ESMT Berlin) started his Keynote: Sustainable Transformation: The Key to Thriving in Disruptive Times by saying that transformation and disruption take place in all industries, at all company levels, and are part of history as well. There are multifaceted challenges, such as climate change, biodiversity loss, and the war in Ukraine, that prevent sustainable transformation. However, innovative technologies and digitalization create new opportunities, and transformational leadership can provide a vision for sustainable transformation.

He explained how ESMT looked at how their current expertise, internal processes, and interactions with different stakeholders affected transformation, and how leadership needed to change to lead through the digital transformation. He said that for the first time in history, the majority of MBA students are enrolled in online programs. Dissemination of knowledge and interaction have changed completely, and experimenting, e.g. through social innovation labs and enriched learning experiences, is necessary to optimize interaction. He added that digital fatigue amongst the younger generation should also be taken into account, as well as the fact that digital learning environments will never completely replace in-person classrooms.

He said that for sustainable transformation, an interdisciplinary approach is needed. Collaboration and wide dissemination of insights are key to sustainability, and it is crucial that scientific insights reach society. He added that the life cycle from invention to innovation needs to be optimized, and research and technology need to be better integrated; this can be done by empowering individuals. Even though there are challenges, Prof. Rocholl was convinced science can overcome them. He concluded his presentation on an optimistic note: the fall of the Berlin Wall has shown that disruption offers opportunities for fundamental change.

The Session: Innovation, Technology & Infrastructure was chaired by Dr. Daniel Hook (CEO, Digital Science, London) and focused on challenges around digital transformation from three different perspectives.

The first speaker, Dr. Nandita Quaderi (Editor-in-Chief and VP Editorial, Web of Science, Clarivate, London), explained in her talk Misuse of bibliometrics in research assessment infrastructures how metrics do have value in Research Assessment when they are used responsibly. She said there is still demand for quantitative performance indicators, even though they have led to perverse incentives and unintended consequences in the past, such as scores or ranks becoming goals in themselves. She added that new forms of manipulation are rising, e.g. salami slicing (splitting data from a single study across multiple articles), inappropriate self-citation, and fraudulent enterprises such as paper mills, predatory journals, hijacked journals, and author/citation marketplaces. These problems are recognized in the community, and initiatives like DORA [4], the Metric Tide [5], the Leiden Manifesto [6], and Harnessing the Metric Tide [7] have been formed.

She continued on the reasons behind the continued misuse of bibliometrics. One example is how the Journal Impact Factor (JIF), a journal-level metric, is used as a proxy for evaluating individual researchers and articles. The JIF is only used responsibly when measuring journal performance; it carries no information about why an individual article is cited. Another example is the H-index, which was originally designed for theoretical physics but is now applied in other research fields. In addition, it is based on total citations, which build over time, so younger researchers are disadvantaged. She concluded that the future of responsible Research Assessment should rest on a balance between technology, metrics, and expert quality review. With increased use of data profiles and the introduction of measures for real-world impact on policy and practice, indicators of trust can be created.
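
To make concrete how mechanical this metric is, here is a minimal sketch of the standard H-index definition (the largest h such that h of a researcher’s papers have at least h citations each); the citation counts are invented for illustration.

```python
def h_index(citations: list[int]) -> int:
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports an h of 'rank'
        else:
            break
    return h

# Two careers with similar per-paper impact: the younger researcher,
# with fewer accumulated papers and citations, scores lower.
print(h_index([50, 40, 30, 20, 10, 5, 5, 2]))  # 5
print(h_index([12, 8, 3]))                     # 3
```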

The next speaker, Dr. Bianca Kramer (Sesame Open Science, Utrecht), said in her presentation Research infrastructure as a social norm that changing the research culture involves multiple, connected layers: infrastructure, user interface/experience, communities, incentives, and policy. Changing Research Assessment needs multiple perspectives and should include looking at a wider range of research outputs and measures of impact. In addition, the community needs to look at who makes the decisions: governments, funders, universities? There needs to be balance, and different disciplines and researchers should be taken into account. The community also needs to ask: Is there a need for standardized indicators? For specific research groups? How do we obtain the data we need? Who controls information on research outputs? How open are the infrastructures we use? Several initiatives stress the importance of transparency and long-term sustainable solutions, such as CoARA and the UNESCO Recommendation on Open Science [8].

In order to achieve a change in research culture, Dr. Kramer stressed the importance of open data and open infrastructures: to prevent vendor lock-in and control, and to prevent selective coverage and exclusion. An open infrastructure enables full transparency and allows institutional control over data and over the stories institutes want to tell, enabling a more equal global research ecosystem. Dr. Kramer concluded that vendor lock-in through commercial providers can be prevented by data that is open at the source; several such initiatives exist: the Initiative for Open Citations [9], the Initiative for Open Abstracts [10], and OpenAlex [11].
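
As a small illustration of “open at the source”, the sketch below queries the public OpenAlex API [11] for a work’s citation count; no API key is required, though the exact endpoint and field names are assumptions to verify against the current OpenAlex documentation.

```python
import requests

def openalex_citations(doi: str) -> int:
    """Look up a work by DOI in OpenAlex and return its citation count."""
    # OpenAlex accepts a DOI URL as the work identifier.
    url = f"https://api.openalex.org/works/https://doi.org/{doi}"
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    return response.json()["cited_by_count"]

# Example DOI: the FAIR Guiding Principles paper, cited later in this report [33].
print(openalex_citations("10.1038/sdata.2016.18"))
```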

The third speaker, Prof. Dr. Philipp Koellinger (School of Business and Economics, Vrije Universiteit, Amsterdam), explained in Decentralizing web tools for scientific publishing how tools can be combined to help build an Open Science platform for digital research objects. A decentralized web consists of a new set of technologies and protocols that reorganize the current centralized internet. In a decentralized web, persistent identifiers based on hashes are used for content-addressed data storage, which is more reliable and cheaper. The decentralized web also offers opportunities for scientific publishing: new tools for Peer Review, content creation (e.g. gateways), and new business models.

Prof. Koellinger explained how the current internet suffers from problems such as link rot, content drift, and poor version control. Identifiers that are currently used, like DOIs, do help but are often neither unique nor persistent. He introduced cryptographic hashes, which convert strings of arbitrary length into strings of fixed length: if you change anything in the input (even a single word), you get a different hash. Hashes are unique and immune to link rot and content drift. With this, a publication protocol can be constructed in which rich research objects consist of hashes indexed on a blockchain. When you add or update even a small component of, for example, a manuscript, you get a new hash and thus a new research object; the old version of the manuscript is not overwritten, and both versions remain accessible. The result is a public registry of research objects on a blockchain, stored on different servers.
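
A minimal sketch of the property Prof. Koellinger described, using Python’s standard hashlib (with SHA-256 as one example of a cryptographic hash): any change to the input, however small, yields a completely different fixed-length digest, so the hash can serve as a persistent, content-derived identifier.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Fixed-length, content-derived identifier: same bytes, same hash."""
    return hashlib.sha256(data).hexdigest()

v1 = b"Results: the treatment effect was 0.42 (p < 0.05)."
v2 = b"Results: the treatment effect was 0.43 (p < 0.05)."  # one character changed

print(content_id(v1))  # always 64 hex characters, regardless of input length
print(content_id(v2))  # entirely different digest: both versions stay addressable
```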

The decentralized web will also enable interoperability and granular, version-controlled citations. Citations can be implemented as function calls, which are the building blocks for FAIR, interoperable research objects. Prof. Koellinger added that the decentralized design prevents vendor lock-in and enables gateways for different stakeholder groups, such as publishers, universities, and funders. He ended his presentation with a question for the audience: why is this not going to work?

In the subsequent discussion with the audience, this question was not answered; instead, the feasibility of reproducing code in a decentralized web was discussed: how would it work in a research system that is so large and vast? Prof. Koellinger said the blockchain architecture of the decentralized web has the advantage of track records: when a compute job takes months, it only has to be done once.

Other questions focused on research assessment and peer review, for example how better use of AI could lead to more automated Peer Review. Right now there is too much reliance on journal articles and citation performance; other research outputs should be rewarded. Data publication, reproducibility, and pre-registered reports should be incentivized and rewarded. It was also discussed how to develop indicators for impact and quality. Dr. Kramer said that impact can mean different things and is very context-dependent; perhaps more could be done with the help of technology, but this is not clear yet.

The Panel: (Re?)Building Trust in Research Integrity was chaired by Alice Meadows (MoreBrains Cooperative). Panelists were Rachel Burley (CPO, American Physical Society, College Park, MD), Chris Graf (Research Integrity Director, Springer Nature, London), Dr. Damian Pattinson (Executive Director, eLife, Cambridge, UK), Dr. Sabina Alam (Director of Publishing Ethics and Integrity, Taylor & Francis Group, Oxford), and Gabriela Mejias (Community Manager, DataCite, Hannover/Berlin).

Meadows started off by explaining the question mark in the title: do we have a problem with trust in Research Integrity? Some would say yes: e.g., article retractions are increasing faster than the number of papers being published. Others say no: retractions can be a sign of a healthy research process. She added that there are differences amongst disciplines and that publishers are collaborating to address the problems. Technology is both helping (identifying issues) and hindering (e.g., continued citation of retracted papers, and biases being replicated and amplified). She added there are possible solutions: the NISO CREC Group [12], Peer Review Terminology Standardization [13], increased adoption, implementation, and validation of PIDs and associated metadata, and improvements to the Peer Review process. Meadows stressed the importance of a community approach: “We will fail if we don’t collaborate”.

Meadows continued the session by asking the panelists: do we have a problem with Research Integrity? Dr. Pattinson said the pressure to publish is getting worse. Dr. Alam believed the problems are broad and that issues vary between disciplines; she considered it the responsibility of publishers to raise awareness through education and training. Mejias agreed that research integrity is an issue and that the COVID pandemic showed how fragile trust in science is. She said it is important for the general public to know that research is not flawless and that there are mechanisms in place to address these issues. Burley added there is a role for publishers in improving trust in science, e.g. with AI tools. Graf mentioned the STM Integrity Hub [14] and added that quality control is the publisher’s core business.

The discussion continued on improving trust. Dr. Alam said not enough has been done to crosslink different research outputs: “We live in a digital world but don’t really use it”. Meadows said that PIDs could be used for linking outputs. Mejias agreed that transparency needs to be improved and that this can be done with PIDs and better metadata; for this to work and add real value, global adoption of PIDs is needed and inequities should be addressed. Burley agreed that a multistakeholder solution is needed and that a way to discourage bad actors should be found. Graf added that the transition to OA has taken a lot of effort from all involved; it has also generated goodwill, which should be used to address e.g. paper mills.

One question from the audience focused on the opportunity publishers have to strengthen research integrity by inviting reviewers and other groups to reproduce results. Would this be feasible? The panelists agreed that this would be an enormous amount of work, and different for the various disciplines. In theory, publishers could enable reproducibility for some disciplines by making sure methods and code are described and by validating completeness. Another question focused on tracking the quality of authors, e.g. sanctions for authors who have been caught submitting fake papers. Mejias said that PIDs and metadata help build rigor into the scholarly record, for example with ORCID iDs, e.g. by selecting reviewers with at least five records in their ORCID iD. Burley said that awareness amongst publishers has to be raised; perhaps publishers feel a sense of shame when doing a retraction? The panelists agreed that a more nuanced set of mechanisms is needed to move beyond the binary system of retraction versus non-retraction, paper mills versus good papers.
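
The reviewer-vetting heuristic Mejias mentioned (at least five records in an ORCID iD) could be checked against ORCID’s public API, roughly as sketched below; the endpoint shown is the documented public v3.0 works path, but treat the details, and the placeholder iD, as assumptions to verify.

```python
import requests

def orcid_work_count(orcid_id: str) -> int:
    """Count the works listed on a public ORCID record."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    response = requests.get(url, headers={"Accept": "application/json"}, timeout=10)
    response.raise_for_status()
    # Works are returned grouped by preferred version.
    return len(response.json().get("group", []))

def passes_reviewer_heuristic(orcid_id: str, minimum: int = 5) -> bool:
    return orcid_work_count(orcid_id) >= minimum

# Placeholder iD, for illustration only.
print(passes_reviewer_heuristic("0000-0002-1825-0097"))
```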

The session concluded with the panelists’ views on how technology could help with Research Integrity. Graf stressed the importance of identity validation; Meadows added this could be ORCID. Burley noted that authors can also make mistakes innocently: authentication is important, and tools to identify issues upfront should be used. Mejias stressed that associated metadata and open infrastructures can provide context on how research was produced and better enable reproducibility. Dr. Alam added that linking outputs through identifiers will help stop bad actors; we still need humans to make judgments, but tools definitely help with scalability. Graf thought we do not yet have the balance between human effort and technology right: we are still relying too much on Peer Review. Dr. Alam said that doing integrity checks could be a full-time job for an academic editor; more division of labor is needed, and publishers cannot be held exclusively responsible. Meadows concluded that addressing these challenges requires increased collaboration between research institutions and publishers.

The APE Lecture: The Journal of Impossible Results was presented by Prof. Dr. Günter M. Ziegler (President, Free University Berlin). He said that what was thought impossible in the past has turned out to be possible today and should be published Open Access in the Journal of Impossible Results, published by the Royal Society of Magic. With three examples that used to be science fiction, he showed how the impossible has been made possible with the use of mathematics.

The first example was the Babel Fish from the film The Hitchhiker’s Guide to the Galaxy: automated translation was long considered impossible, but tools like DeepL now make it possible, building on the knowledge of mathematicians and computational scientists. The second example was protein folding: this too was considered impossible, but with mathematics and AI, gigantic leaps in protein folding research have been made. Research continues, and Prof. Ziegler thought it a shame that not all of it is published OA. The third example was heart CT scans: it was considered impossible to visualize the coronary arteries noninvasively, but thanks to mathematics, heart CT scans are now possible and work even better than invasive scans.

The three examples showed that with AI, a lot of things have become possible that were thought impossible before. He added, however, to be mindful of AI: Joseph Weizenbaum already warned about possibly dangerous implications of AI. And recently the Tagesspiegel [15], a German newspaper, reported that robots are being used in the war in Ukraine and can kill on their own, without human intervention. Prof. Ziegler concluded: do we really want that?

The second day of the 18th APE started with the session: STM’s New Dotcoms to Watch, presented by Todd Toler (VP Digital Product Strategy and Partnerships, Wiley).

Dustin Smith introduced Hum [16]. He said that publishers sit on a goldmine of data that is mostly siloed, unstructured, and inconsistent, which makes it difficult to use. Hum developed CueBERT, a content understanding engine, to create structured behavioral data. They combine data to create a record and to understand users, which can also be done with data from anonymous users. This is valuable because it lets publishers better engage with their audiences and generate content recommendations. He said there are many ways to use Hum, such as targeted emails and advertising; it could also be used for targeted author-recruitment marketing, e.g. when customers are searching for instructions for authors.

Judy Mielke started off with the ScientifyRESEARCH [17] mission: to empower researchers, connect them with research funding information, and enable fair funding competition. She said there are so many funders that it is impossible for researchers to keep track; it takes up a lot of their time, and the focus ends up on well-known funders. ScientifyRESEARCH has created an openly curated database of funders, available in different versions with content alerts, which makes it easier to find relevant funding. Mielke concluded: we have broad experience and expertise in various research fields, work with research influencers, and want to apply openness to the whole research cycle, allowing equitable access to funding information for all.

David Harvey from Prophy [18] said that Peer Review is at the heart of the scientific process. But how do you find reviewers? This can be done through author suggestions, editors’ networks, and/or automated tools that search journal databases. However, research shows that reviews are biased. Prophy would like to change the current review system: from a database of over 150 million articles, they extract scientific concepts and use AI to relate these concepts and create a map that matches papers to other papers. In seconds, thousands of candidates are found, with conflicts of interest immediately flagged. The scientific concepts can be grouped to help find interdisciplinary reviewers. Funding institutions, journals, universities, and researchers across different disciplines already use Prophy.
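
Prophy’s actual pipeline is proprietary; the toy sketch below only illustrates the general idea of ranking candidates by overlap of extracted concepts while filtering out conflicts of interest (here, shared co-authorship). All names and concept sets are invented.

```python
def match_reviewers(paper_concepts, candidates, paper_authors):
    """Rank candidate reviewers by concept overlap, skipping conflicts.

    candidates: list of (name, set_of_concepts, set_of_coauthors) tuples.
    """
    scored = []
    for name, concepts, coauthors in candidates:
        if coauthors & set(paper_authors):
            continue  # conflict of interest: shared co-author
        overlap = len(paper_concepts & concepts)
        if overlap:
            scored.append((overlap, name))
    return [name for _, name in sorted(scored, reverse=True)]

paper = {"graph neural networks", "protein folding", "benchmarks"}
candidates = [
    ("Reviewer A", {"protein folding", "benchmarks"}, {"Author X"}),
    ("Reviewer B", {"graph neural networks", "benchmarks"}, set()),
]
print(match_reviewers(paper, candidates, ["Author X"]))  # ['Reviewer B']
```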

Patrick Starke said that with Imagetwin [19], they want to increase quality and trust in science. He explained that image plagiarism is not easy to identify: the image may have different colors or be rotated, making manual detection practically impossible. The scientific community has an image integrity problem: around 100 million papers are reviewed each year, and every 20th research paper has an image problem. Imagetwin has developed software to detect image manipulation and duplication: after a PDF is uploaded, a record of potential issues is created, and all kinds of duplications are found. Imagetwin launched in 2022 and already has more than 150 users. In 2023, more manipulation types and new image types will be added.
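
Imagetwin’s detector is proprietary, but a classic baseline for finding duplicates that survive recoloring, recompression, or resizing is a perceptual “average hash”, sketched here with Pillow; the rotation-invariant matching Starke mentioned needs more than this (e.g. hashing each of the four rotations). File names are placeholders.

```python
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Perceptual hash: shrink to 8x8 grayscale, threshold at the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hashes within a few bits of each other suggest a (possibly re-colored or
# resized) duplicate worth flagging for human review.
h1 = average_hash("figure_1a.png")
h2 = average_hash("figure_3c.png")
print("possible duplicate" if hamming(h1, h2) <= 5 else "distinct")
```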

Alex Garcia introduced Audemic [20], which aims to make knowledge more accessible by creating new and friendlier ways to digest academic research. He said it is impossible for a researcher to keep track of and read all relevant papers, especially for young researchers. Audemic makes research more innovative and enjoyable through design and audio. With NLP technology, the rise of audio, and the growth of podcasts, much more is now possible. Audemic was founded in 2022 and aims to make content accessible to a wider audience; it already has partnerships with publishers, universities, societies, and higher education institutions.

Martijn Roelandse introduced SciScore [21], which aims to enhance rigor and reproducibility in scientific research. SciScore introduced research resource identifiers (RRIDs) and has built an API that is currently being integrated into a number of platforms, e.g. Editorial Manager. Roelandse said that during the pandemic there was a huge increase in preprints; the problem was that there was no Peer Review and no editorial check, and preprints are not as good as papers when it comes to rigor and transparency. SciScore built a pipeline that combines a number of checks, the results of which are pushed to a Hypothesis report. With several partners, SciScore aims to further increase the visibility of research resources and make them part of preprints.
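
RRIDs follow a simple citable syntax (e.g. RRID:AB_2138153 for an antibody, RRID:SCR_003070 for software), so a first-pass scan of the kind SciScore automates can be sketched as a pattern match over a methods section; the pattern below covers the common underscore form only and is an illustration, not SciScore’s implementation.

```python
import re

# Common RRID shape: 'RRID:' + authority prefix + underscore + identifier.
RRID_PATTERN = re.compile(r"RRID:\s*([A-Za-z]+_[A-Za-z0-9-]+)")

def find_rrids(methods_text: str) -> list[str]:
    """Return all RRID-style identifiers cited in a block of text."""
    return RRID_PATTERN.findall(methods_text)

methods = (
    "Cells were stained with anti-GFAP (RRID:AB_2138153) and analyzed "
    "in ImageJ (RRID:SCR_003070)."
)
print(find_rrids(methods))  # ['AB_2138153', 'SCR_003070']
```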

After the presentations, the audience was asked to vote for the best Dotcom when looking at the solutions from an investor perspective. Prophy was voted as the best Dotcom.

The Panel: Stepping up on Climate Action was chaired by Thea Sherer (Director of Sustainability and Climate Action Officer, Springer Nature, London) and panelists were Rhea Kraft (Senior Manager, Ernst & Young, Berlin), Dr. Hakim Meskine (Data Scientist, Wiley VCH, Berlin), and Dr. Jessica Sänger (Director for European and International Affairs, German Publishers and Booksellers Association, Frankfurt am Main). Sherer opened the session by saying that this was the first time this topic was discussed at the APE. She said the publishing sector can make an impact, and many publishers have already been stepping up on climate action, e.g. towards reaching the Sustainable Development Goals [22].

Kraft spoke about the transformation of European sustainability reporting requirements. The main target is to be climate neutral by 2050, with a strong focus on biodiversity and the establishment of a circular economy. With the introduction of the EU’s Corporate Sustainability Reporting Directive [23], the EU wants to encourage businesses to change and transform. Companies subject to the CSRD will have to report according to the European Sustainability Reporting Standards (ESRS). With regard to climate change, each company must define an individual pathway for how it can contribute. Kraft stressed that companies can define relevant topics and that the directive also allows them to focus on opportunities.

Dr. Meskine introduced Wiley’s climate goal: to be carbon net zero by 2040, in line with science-based targets. He explained Wiley’s first step: look at where the emissions are. It turned out that most of Wiley’s emissions were in scope 3: vendors, purchased goods and services, as well as capital goods. Wiley put in place a variety of policies, developed tools to start collecting information from vendors, and established a multi-year pathway with specific targets.

Dr. Sänger explained how associations can support the publishing sector in reducing CO2 emissions. She said that umbrella organizations like IPA, STM, and EIBF can support collaboration, raise awareness, and share best practices. There are several initiatives focused on the SDGs, e.g. the SDG Book Club [24] and the SDG Publishers Compact [25]. Other initiatives collaborate on further topics, such as ecological, economic, and social sustainability, and share best practices on e.g. returns policies (the RISE Bookselling project [26]). The German Börsenverein sustainability working group [27] set up task forces on production & logistics, reporting requirements, and sustainable operations, and collaborates to obtain funding.

Sherer said that the UK Publishers Association has also set up an SDG pact, and she urged publishers to use the platforms they have to raise awareness, for example the IPA Publisher 2030 Accelerator [28]. This platform goes beyond the academic publishing sector; topics like print on demand, accounting and financial systems, and book returns are discussed.

The discussion continued with the question whether the aim should be “carbon-zero” rather than “carbon-neutral”. Kraft was of the opinion that offsetting is still better than doing nothing: it is a good starting point, and it is easier to start when you can present data. Sherer added that collecting data for big companies like Springer Nature can be very complex, as there is a lot of data, e.g. on traveling and events, but they aim to be transparent about data collection. Meskine agreed that transparency is important to show that publishers really care.

Sherer asked the panelists whether publishers should collaborate more on tangible measures such as reducing print. Kraft said that when it really comes down to making an impact, it is very complicated; no one can do this alone, and sometimes the most direct impact might not be the best climate impact in the long run. Sänger agreed that the industry needs to be objective: it is not just print, but digital emissions as well. Meskine said that Wiley has been moving away from print for a while now, but print will not go away completely; he added that digital carbon emission data is also part of Wiley’s calculations.

It was concluded that capturing data is very complicated, that sharing experiences is important, and that engaging colleagues and working collaboratively on innovations that account for sustainability are key.

The session: Bold new Paradigms – in it for the long-term? was chaired by Dr. Liz Marchant (Global Portfolio Director, Life, Earth & Environmental Science Journals at Taylor & Francis Group, Oxford) and discussed the changing scholarly publishing landscape and emerging new models. Dr. Marchant said that for new models to become new paradigms, they should be valuable to all people, financially sustainable, and adoptable at scale across subjects, geographies, and different types of researchers and institutions.

Dr. Hylke Koers (CIO, STM Solutions, Oxford/The Hague) explained in his presentation New Paradigms in Access & Identity that the current state of access and identity is fragmented: there are many types of logins and accounts, and digital identities are disconnected. No single solution is best: a balance between privacy, user experience, and data security is needed. Federated access/authentication allows users to log in to a Service Provider (e.g. a publisher platform) with credentials from an Identity Provider (e.g. a university library). Information about the user may be exchanged (with the user’s permission), and this requires interoperability standards and trust. Dr. Koers added that it was not surprising that federated authentication usage increased by 60% during the pandemic, when many more people were working from home.
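
Stripped of any specific protocol (SAML, OpenID Connect), the trust relationship Dr. Koers described can be sketched as follows: the Service Provider never sees the user’s credentials, only a signed assertion of attributes from an Identity Provider it already trusts. All names, attributes, and the shared-key trust model below are illustrative simplifications.

```python
from dataclasses import dataclass, field
import hashlib, hmac

SHARED_KEY = b"sp-idp-federation-key"  # stands in for real federation trust (e.g. PKI)

@dataclass
class Assertion:
    """What an Identity Provider sends a Service Provider after login."""
    institution: str
    attributes: dict       # e.g. affiliation, entitlement; never the password
    signature: bytes = field(default=b"")

def idp_login(institution: str, username: str, password: str) -> Assertion:
    # The IdP checks credentials locally (elided) and releases only the
    # attributes the user has consented to share.
    attrs = {"affiliation": "member", "entitlement": "common-lib-terms"}
    payload = f"{institution}|{sorted(attrs.items())}".encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return Assertion(institution, attrs, sig)

def sp_accepts(a: Assertion) -> bool:
    """The Service Provider verifies the assertion, not the credentials."""
    payload = f"{a.institution}|{sorted(a.attributes.items())}".encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, a.signature)

assertion = idp_login("Example University", "jdoe", "secret")
print(sp_accepts(assertion))  # True: access granted without sharing credentials
```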

SeamlessAccess [29] is an initiative that works on top of federated authentication and makes it easier for researchers to use, adding three elements of value. The first is recognition: the button is a standard visual cue that researchers recognize and trust. The second is a service that helps researchers find their institution. The third is a persistence layer that remembers the user’s chosen institution and works across different websites. Data shows that SeamlessAccess boosts usage of federated authentication, and adoption is growing.

Dr. Koers continued on attributes: they contain information about a user and are passed on to a publisher or service provider after authentication. He explained how Project GetFTR [30] is an example of how knowledge of attributes can help smooth the user journey. GetFTR offers infrastructure and flexible integration options for different contexts and use cases; it offers, e.g., access to the best version of an article and maximizes the number of accessible articles.

At the end of his presentation, Dr. Koers highlighted two important changes: browser changes and decentralized identities. There are ongoing developments at big browser vendors like Google and Apple; e.g., Apple has started to hide IP addresses. Publishers should inform themselves about these changes, e.g. through the Federated Identity Community Group [31], and discuss them with library customers. Decentralized identities have the potential to be the new paradigm: the next big thing in how we think about digital identity. With a decentralized identity, the user is truly at the heart and in control, whereas in federated authentication the user is a conduit.

Dr. Erik Schultes (FAIR Implementation Lead, GO FAIR Foundation, Leiden) started his presentation When papers become FAIR digital objects with the idea of the FAIR digital object itself. He said the core technology of the internet was invented in 1969. In 1995, Robert Kahn and Robert Wilensky started working on the paper A Framework for Distributed Digital Object Services [32]: because of the vast amount of data, manual manipulation of data on the internet was no longer scalable, and the paper described a way to identify objects on the internet. Twenty years later, the paper The FAIR Guiding Principles for scientific data management and stewardship [33] was published. It describes the FAIR principles (Findability, Accessibility, Interoperability, Reusability) and has been cited an enormous number of times. The FAIR principles are about enhancing the ability of machines and making digital objects machine-readable. In October 2022, the first conference on FAIR digital objects was organized; Dr. Schultes noted that Kahn was not able to attend in person but, fittingly, could join online because the internet exists.

Dr. Schultes explained the FAIR digital object in more detail. There is a digital resource (which could be anything: an abstract, a paper, a dataset, an author, etc.), to which a GUPRI (Globally Unique Persistent Resolvable Identifier), metadata about the resource, the metadata schema, and a link to the location are added. A computer can then find out what the object is, what can be done with it, and what one is allowed to do with it. Dr. Schultes continued on nanopublications, which make large datasets more manageable: a nanopublication is an assertion with provenance and publication info, which enables a type of blockchain representation. You can make assertions about scientific content, administrative matters, etc. Dr. Schultes’ research group saw how FAIR digital objects and nanopublications are moving along the same track, which resulted in The Comparative Anatomy of Nanopublications and FAIR Digital Objects [34].
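
A minimal sketch of the anatomy Dr. Schultes described: a FAIR digital object bundles a GUPRI, typed metadata, and a location, while a nanopublication wraps a single assertion together with its provenance and publication info. All identifiers and vocabulary below are placeholders, not a normative schema.

```python
# A FAIR digital object: the GUPRI resolves to metadata that tells a
# machine what the resource is and what may be done with it.
fdo = {
    "gupri": "https://example.org/id/21.T11148/abc123",      # placeholder PID
    "type": "dataset",
    "metadata_schema": "https://example.org/schema/dataset-v1",
    "metadata": {"title": "Trial X raw measurements", "license": "CC-BY-4.0"},
    "location": "https://repo.example.org/objects/abc123",
}

# A nanopublication: one assertion, plus who made it and how it was published.
nanopub = {
    "assertion": ("ex:GeneA", "ex:isAssociatedWith", "ex:DiseaseB"),
    "provenance": {"derived_from": fdo["gupri"], "method": "GWAS"},
    "publication_info": {"author": "orcid:0000-0000-0000-0000",  # placeholder
                         "published": "2023-01-11"},
}
print(nanopub["assertion"])
```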

How do nanopublications and FDOs relate to the publishing industry? The GO FAIR Foundation has suggested integrating FAIR into academic publishing processes. With IOS Press, they started developing FAIR Connect [35]: an OA publishing platform for the development and dissemination of good practices for FAIR data stewardship, which also offers search engine services for FDOs and the journal FAIR Connect [36]. With FAIR Connect they try to FAIRify the academic publishing process, seeing a paradigm shift from narrative articles that serve human users, with siloed publications as the units of science, to FDOs that serve human and machine users, enable fine granularity of content, and are published through an open, decentralized infrastructure. Dr. Schultes concluded: the value will be found less in data storage and more in data services that comply with the emerging standards enabling the principles of FAIR.

Dr. Irina Sens (Deputy Director, Head of Library Operations, German National Library of Science and Technology (TIB), Hannover) said in her presentation After THE DEAL that besides the DEAL agreements, there are other OA agreements at TIB, such as Diamond OA agreements (e.g. KOALA), memberships (e.g. SCOAP3), and OA framework contracts (e.g. PLOS). TIB participates in the DEAL agreements to ensure long-term archiving and to represent smaller organizations. Dr. Sens asked the publishers in the audience to think about what DEAL means for them.

She continued on the current status of the DEAL agreements: Wiley and Springer Nature are negotiating DEAL II, and talks with Elsevier have restarted. DEAL was initiated by an alliance of German scientific organizations to support the OA transformation; it is steered by the German scientific community and aims to replace excessive cost increases with a comprehensible pricing system. She added that DEAL has been a Publish and Read success story for institutions in Germany. The German Wissenschaftsrat stated in January 2022 that Gold OA should be the standard for all publications, and published its Recommendations on the Transformation of Academic Publishing: Towards Open Access [37].

With regard to costs, DEAL has paid attention to a fair balance of interests, e.g. between small and big publishers, and to specific support for research-strong and publication-strong organizations. The DEAL group checks publication figures, forecasts publication increases, and establishes simple conditions for participation, also for smaller organizations. They also look at what other countries do; e.g., JISC in the UK works comparably. Recommendations from the German Science and Humanities Council (Wissenschaftsrat) include, among others, supporting the development of transparent information budgets in libraries and including publication costs as part of the research budget.

Dr. Sens said the lessons learned include, among others: financial solidarity has its limits, libraries need to get university presidents on board, there is no one-size-fits-all solution, and SLAs with publishers are necessary. She concluded by looking to the future: what comes next? What will Open Science developments, author-based publishing, the growth rate in articles, and new business models mean? We need good deals with publishers, attractive cost-distribution models for participating institutions, and to trust the data; there is no way back. She ended her presentation with a question: what would a DEAL contract look like if there were no past?

In the panel session: At the Crossroads, the results of a virtual brainstorming event, held January 4–5, 2023, and of the APE 2023 Satellite Symposium, held on January 9, 2023, were discussed. The Symposium was jointly organized by BISP and the QUEST Center for Responsible Research, BIH Charité, Berlin, with the aim of bringing Early Career Researchers and scholarly publishers together. The panel discussion was introduced by Dr. Anke Beck (Head of Public Affairs and Advocacy Europe, Frontiers), who said that the virtual brainstorming event organized by QUEST in the week before the conference was predominantly attended by ECRs, whereas the Symposium on 9 January was dominated by publishers.

Dr. Tracey Weissgerber (QUEST Center for Responsible Research, BIH Charité, Berlin), one of the organizers of the pre-event, moderated the panel discussion. She said the online pre-event drew participants from all over the world, which allowed the organizers to gather many different perspectives. She asked the panelists what they felt could be improved for ECRs in the publishing system. Johannes Wagner (Copernicus Publishing) felt the need for more inclusion and recognition of diverse research outputs: ‘now the only thing that matters is papers’. Jo Havemann (Access 2 Perspectives) said that a more equitable approach and more diverse global participation are necessary; there is also pressure on ECRs on other continents, so a mix of infrastructures, such as AfriXiv, is needed. Maia Salholz-Hillel (QUEST Center for Responsible Research, BIH Charité, Berlin) felt that publishers should invite ECRs for input and feedback much more; ECRs contribute to published papers but are not sufficiently recognized. Chris Hartgerink (Liberate Science) felt that publishers are not very transparent, e.g. about the data they collect from users. He said publishers are reluctant to do something about these power dynamics, but they should be aware that researchers will not accept these power imbalances indefinitely. Bernd Pulverer (EMBO) saw enthusiasm among ECRs for contributing to the publishing process on a strategic level, for example in rethinking Research Assessment, whereby Peer Review is seen as a formal research output.

The discussion continued on how to involve ECRs more in the Peer Review process. Salholz-Hillel said that ECRs face a lot of time pressure and no recognition, and that there are mixed views when it comes to Open Peer Review. Weissgerber said there is a need to increase diversity: for ECRs, Peer Review is seen as an entry into publishing, but more innovation is needed to include them, e.g. ECR advisory boards, preprint review programs, and ECRs as innovation editors. Pulverer said it should be clear to ECRs what they get out of it. Havemann added that train-the-trainer and capacity-building programs are needed, as well as monetary incentives, as salaries in the Global South are low; additionally, language editing in multiple languages is needed.

When asked for ideas to improve scholarly publishing, the panelists agreed that the scholarly infrastructure should increase its focus on outputs beyond journal articles. There should be more incentives for that, and better systems are needed, e.g. better linking opportunities for research data. Hartgerink said to look at the broad scope of other research outputs: this changes quickly, and publishing systems often look backwards. Weissgerber said that getting new and better systems that allow ECRs to do the science of the future is really important. Pulverer added that much research is not shared because it does not fit within the system. Wagner felt that publishers are working on this, as they collaborate with societies to hear the needs of scientists; however, he has heard editors say that they do not want to add DOIs to their datasets because they fear citations of the journal article will decrease. So from a publisher’s point of view, the uptake of new technology and publishing systems can be very slow amongst the research community. Pulverer agreed, but felt that publishers still have a responsibility to provide mechanisms to publish other outputs and to valorize them.

Weissgerber asked Salholz-Hillel whether there is one thing that is different for ECRs compared to more senior researchers. Salholz-Hillel felt that the pressure to publish is stronger for ECRs and that publishers should make it safe for ECRs to take risks; eLife, for example, has done this with preprint publishing. In conclusion, Weissgerber encouraged publishers to engage in continuous discussion with ECRs, and she pointed to the publication in PLOS Biology: Recommendations for empowering early career researchers to improve research culture and practice [38].

Before the 18th APE was closed by Marta Dossi (BISP), the APE Award for Innovation in Scholarly Communication, supported by Digital Science, was handed out by Daniel Hook. There were two honorable mentions: Dr. Joshua Nicholson, for his work on extending citations to become more multidimensional and valuable, and Dr. Vivienne Bachelet, for her work to create a fully multilingual end-to-end platform for scholarly publishing. Vsevolod Solovyov (Prophy) won the APE Award for his work on making reviewers more discoverable, helping to ensure that research is correctly and efficiently reviewed.

Please note: APE 2024 will be held on 9–10 January 2024.

References

[1] https://coara.eu/.
[2] https://impact.economist.com/projects/confidence-in-research/.
[3] https://www.nfdi.de/.
[4] https://sfdora.org/.
[5] https://www.ukri.org/publications/review-of-metrics-in-research-assessment-and-management/.
[6] http://www.leidenmanifesto.org/.
[7] S. Curry, E. Gadd and J. Wilsdon, Harnessing the Metric Tide: indicators, infrastructures & priorities for UK responsible research assessment, Research on Research Institute, Report, 2022. doi:10.6084/m9.figshare.21701624.v2.
[8] https://en.unesco.org/science-sustainable-future/open-science/recommendation.
[9] https://i4oc.org/.
[10] https://i4oa.org/.
[11] https://openalex.org/.
[12] https://www.niso.org/standards-committees/crec.
[13] https://www.niso.org/standards-committees/peer-review-terminology.
[14] https://www.stm-assoc.org/stm-integrity-hub/.
[15] https://www.tagesspiegel.de/gesellschaft/kunstliche-intelligenz-im-ukrainekrieg-wenn-roboter-eigenstandig-toten-9122236.html.
[16] https://www.hum.works/.
[17] https://www.scientifyresearch.org/.
[18] https://www.prophy.science/.
[19] https://imagetwin.ai/.
[20] https://audemic.io/.
[21] https://www.sciscore.com/.
[22] https://www.undp.org/sustainable-development-goals.
[23] https://finance.ec.europa.eu/capital-markets-union-and-financial-markets/company-reporting-and-auditing/company-reporting/corporate-sustainability-reporting_en.
[24] https://www.un.org/sustainabledevelopment/sdgbookclub/.
[25] https://www.un.org/sustainabledevelopment/sdg-publishers-compact/.
[26] https://risebookselling.eu/.
[27] https://www.boersenverein.de/interessengruppen/ig-nachhaltigkeit/.
[28] https://sdg.internationalpublishers.org/cop26-accelerator/.
[29] https://seamlessaccess.org/.
[30] https://www.getfulltextresearch.com/.
[31] https://www.w3.org/community/fed-id/.
[32] doi:10.1007/s00799-005-0128-x.
[33] doi:10.1038/sdata.2016.18.
[34] doi:10.3897/rio.8.e94150.
[35] https://fairconnect.pro/.
[36] https://www.iospress.com/catalog/journals/fair-connect.
[37] doi:10.57674/0gtq-b603.
[38] doi:10.1371/journal.pbio.3001680.