When I first talked about open access and the decline of the scientific journal, 20 years ago, it was fortunate that I had Dirk Haank available to tell the world not to listen to demented consultants with no skin in the game. When I spoke, some 15 years ago, about the inevitable decline of the subscription science journal, it was pleasing to hear Kent Anderson reassuring us all that I was simply a mad dog out on license. Now, as I read the revision of their open access policy published by the Gates Foundation on April 7, I am very happy to indulge the Panglossian philosophers of the scholarly communications marketplace once again, and while I wait for them to tell us that nothing has really changed and everything will go on just as before in the best of all journal publishing worlds, I am heading down to the marketplace to link arms with Cassandra. We shall chant: “O woe! O woe! The day of the open access journal is nearly over, and its end can be told with confidence!”

Of course, this might take another 15 years. I’ve reached an age myself when time is not a very worrying factor. In the 57 years that have passed since I started work in the educational and academic publishing sector, I have been acutely aware that commercial publishers, while being politely prepared to entertain speculation about the future, have necessarily to attend to this year’s financial results and the expectations of investors. When my speculations were deemed too far-fetched, my clients in the boardroom tended to say “our strategies are clear – follow the money!” Today, my response to them would be quick and immediate: “Watch what the funders are doing with the money, and then follow the data!”

Many will argue that Gates is a small funder in terms of article contributions. Its work creates around 4,000 articles a year, and through its payment of APCs it contributes a mere $6 million per annum to the coffers of scholarly publishers. But it is an influential player, and in its revised open access strategy it may have detected something which is present in the minds of the larger funders, and eventually of governments themselves. What is the duty of the funder in terms of ensuring that articles detailing research results are available to the community at large? In the time of Henry Oldenburg in the 1660s, the answer would have been to get them into the Transactions of the Royal Society. Today, it is to get them onto an authorised pre-print server with a CC-BY license as soon as possible after the research is completed and the article is ready, and to accompany them with linked datasets of the evidential material, on a similar license, on a similarly approved site. Speed is of the essence; access for all is key and critical. Subsequent reuse of the material in a journal, subsequent acts of peer review and downstream reuse are not the key concerns of the funding foundation. By this fresh twist in its open access policy, the Gates Foundation has saved $6 million, which can now go back into the research fund. And by using F1000, who already supply the Gates Foundation’s internal publishing systems, to create VeriXiv, the pre-print server of choice, they have provided tools which researchers can use (or not) to fulfil the mandate.

If other funders follow this route, then the scholarly communications research community in science faces a choice. For many, more pressurised by getting the next research programme underway than by anything else, it will be simple to leave things there, and not necessarily press forward to eventual journal publication. For others, given the needs of institutions for publication to secure tenure or satisfy other funders’ requirements, publication will remain essential until the way in which science results are assessed begins to change. One of the things that I recall from conversations with Eugene Garfield in the 1980s was his repeated assertion that better ways than citation indexing would be found to assess the worth of science research articles. Like Winston Churchill on democracy, he maintained vigorously that what he had created was the “best worst way” of doing the job. The challenge now, I would suggest, is whether some latter-day Garfield can repeat his 1956 breakthrough and create a way of indexing and illuminating what is good science for a modern world. That measurement and indexation has to be available as soon as possible after the first appearance of the claim, wherever it appears in digital form. In the meanwhile, getting the knowledge immediately into the marketplace, and getting the data available to aid reproducibility, supports other research in progress and supports integrity. And that is critical for funders and researchers alike.

Such new systems will emerge in their own time. In the meanwhile, the ways we measure achievement, which have been gamed and manipulated endlessly and need in any case to be renewed or replaced, experience increasing pressure. This applies as much to peer review as to anything else. If publishers are to stay in the loop, then they need to change their relationships as well. As the relationship between Gates and F1000 shows, whatever takes place in terms of “publication”, and where it takes place in the ecosystem, may become more important to the institution or the funder than to the researcher or the research lab. In terms of attracting sponsorships, investment and industrial research cooperation, universities may have more interest in publication than most, especially if the research community sorts out a better way of ranking science than by citation indexing. (Footnote: what a clever man that Vitek Tracz was! The Tesla of science publishing! Long after his retirement, we shall be using the tools he created for white-label sponsored publishing!)

So there it is! Cassandra and I have now done a full lap of the forum, and I can feel that the rotten vegetables are getting ready to fly through the air! Next time, if I survive, I plan to “follow the data” myself, and look at the role of publishers as data aggregators, data curators and data traders. And we shall remember the old saying: “How do you know if the searcher is a person or a machine? Well, only machines read the full article!”

Scientific Communication as a Volume Business

Last week’s OAI 12, the Geneva Workshop on Innovation in Scholarly Communication, hosted as always by CERN and Geneva University, was a delight. Real scientists talking with real passion. Genuine case studies that underlined some critical issues where science can do better. A good sample of “citizen science” involvement to remind us that real people, not just scientists, can perform science as well as experience it and benefit from it. Once again the meeting was truly international, and once again it featured not just the performance of Open Access but the much wider implications, seen through the wider lens of Open Science. And it was immediately clear that Open Science is not just a lens but a prism, and those who look through it experience some very different emotions.

There is, for example, a chasm of intent between those who embrace Open Science as the democratisation of science, and those who worry about the purity of scientific performance. There is now a strong and practical demand to open up a wider understanding in the general public about scientific conclusions and what they mean. This has been given sharper point by the pandemic, but it is worth noting that while professorships in the public understanding of science go back 30 years in many countries, and we have had many very distinguished science journalists, politicians and the press have real (and sometimes deliberate) difficulty in explaining what science means – and admitting its limitations. People who rally under this banner tend also to believe that all research funded by the state should have its results published by the state, so that all citizens and taxpayers can have access to it. They are met with voices who hold that too much science is published, too little selectivity is exercised, and too much duplication of identical experimental results is permitted in an Open Access context.

This tribe is confronted in turn by a fervent lobby who believe that the publishing of research results is notoriously incomplete. Where, they ask, is the data, evidential or not, that surrounds the scientific process? Certainly not lodged with the article, too often not even linked to it, too often not even available, just because commercial publishers never found a way to monetise it. And even when it is in a repository or linked to the article, it is often not presented in a way that makes it usable, either to another scientist using different analytics, or to another computer trying to reproduce the experiment. What hope, they then say, for “Open Science” when so much science is closed even to the re-use of other scientists?

Beyond these knowledgeable Geneva conference attendees are the worried ranks of working researchers who have a suspicion that not everyone is following the same basic rules. Does evidence sometimes get distorted to meet the claims of the hypothesis? Is someone gaming the citations in order to get tenure or preferment? Is someone distorting what was actually put on record to create panic and discord? (It is hard to attend these meetings without being given a case history of anti-vaxxer conspiracy.) As in any community, rumour takes flight, and while it is impossible to talk about the extent of malpractice, Open Science now also means “open up science” – shining a more public light on retractions, on plagiarism, and on the claims of experimentation that defy reproducibility.

One striking conference session featured the very vigorous crop of small presses developing OA books programmes and sharing infrastructure to do so (ScholarLed, COPIM). They may point to the increasingly comprehensive and available workflow software for publishing, which may serve the desired democratisation by enabling every research team to report results and data to open platforms, subject to automated primary peer review, leaving the eventual status of the work in the hands of its readers and users across time. The proponents of “too much” will be appalled, but this has been the drift since Open Access itself began – a fee-based business fuelled by APCs can only be a volume business. And if the future really is Diamond OA, it may cease to be a business at all. This will please some and not others, and the fault lines became clear in the final session. Under the chairmanship of Tracey Brown, the Director of Sense about Science, Geoffrey Boulton of Edinburgh University and Kent Anderson of Caldera Publishing debated what had gone wrong. Would that they had debated what to do about it, because there is now a tendency in these sessions to search for a villain. For Professor Boulton the universities are to blame. They created the “publish or perish” world and cannot retreat from it fast enough. It is they who have “lined the pockets” of major publishers with profits from articles read by “about 0.5 readers” per article. For Kent Anderson it is the techno-utopians (a term he has kindly used on me in the past!) and the Tech companies. Academic publishers are the virtuous providers of journals “whose focus on rigour and quality” is so lacking elsewhere. He points especially at the irresponsible pre-print servers (“MedRxiv and ArXiv are funded by Facebook essentially”).
At times, while condemning Google and Facebook for amplifying conspiracy theories, it almost sounded as if, by connecting Steve Bannon, anti-vaxxers, CERN and predatory journals in the same context, we were knitting a few of our own. A sad lapse in a very interesting session.

It was a really interesting and informative five days, with many voices heard that are normally silent or ignored. Our urgent needs – finding more effective means of evaluating research and researchers, giving scope to the ongoing evaluation of science research as it changes over time, ensuring and recording its reproducibility, safeguarding its accessibility, and getting the evidential data in place and reusable by both man and machine – engage all of us in scholarly communications: publishers and software developers and data and analytics companies as much as researchers, funders, institutions and librarians. Our path to a new consensus will be eased if we concentrate on the debate and avoid the smears. (https://oai.events)