Mar 13
Pandemics Change Publishing
The current coronavirus crisis – for it is a crisis whether complacent politicians decide we are in “delay” mode or in a “contain” strategy – also asks questions of scholarly communications that have needed answers for at least 20 years. The best of my publishing friends in the industry have always espoused a service ethic – “we are there to ensure quality control and effect dissemination” – while for many of them that could only be done by a peer-reviewed article in a subscription journal. As virologists scramble for a vaccine, and every form of current media examines the impact of what is happening to us, those who work in scholarly communications have questions of their own to consider.
And I am not just thinking about speed, though I did meet a publisher recently who spoke candidly of his article backlog and his two-year voyage from acceptance to publication. If market forces really did work in science publishing, he said, he should be out of business, but reputation and prestige kept him in place. A couple of weeks ago, the Wall Street Journal (22-23 February) devoted its review page to the question of speed (“Sharing Data Faster to Fight an Epidemic”). It pointed to the increased use of pre-print servers to speed the transmission of new virology findings, citing virological.org, medRxiv and bioRxiv. The latter two are reportedly receiving about ten papers a day on the coronavirus: medRxiv has so far published 105 and bioRxiv 67. A contrast is then drawn with the New England Journal of Medicine, which at that point was receiving between 20 and 45 coronavirus papers a day and had so far published 10.
But this is not an argument about the relative qualities of peer-reviewed journals and pre-prints, whatever the WSJ might imagine. It is about the speed and efficiency with which the available science gets to the researcher. It points to the need to automate the editorial workflow as fully as possible. While this is easy in, say, plagiarism checking, it is harder in checking references and citations. Getting the data associated with the article and ensuring the ability to at least attempt reproducibility are more complex still. But here we are, in a race for information yet again, and the publisher, of all people, must appear not as the delaying factor but as the facilitating factor. This is the time to search the archives and bring together historical collections of apposite precedent material. Now is the time for collaborative action by market competitors to ensure the completion of collections.
Once information, of whatever type, is publicly available, how soon does it get to the right places? In an increasingly Open Access world, it is even easier for things to get overlooked. And what if the vital communication is not even an article? We could be moving towards the day when specialist researchers register to receive alerts. The social media function and the publishing function become one. Posting something with this server or that journal will at least mean that every researcher who wants to be alerted has been.
Finally, what if the vital information is not an article? Maybe it’s a procedural correction. Or a reproducibility attempt that failed, or simply a description of something noted. Almost 20 years ago I started writing about the Alliance for Cellular Signaling and its huge datasets held on the supercomputer at UC San Diego. Scientists looking at the chemical and electrical communications between cells are on the front line in cancer research, immunology and, of course, viral and bacterial infection. The teams involved, from many different institutions, were, when I last looked, in the habit of recording observations, findings and data as elaborated data sheets which they called Molecular Pages (http://www.signaling-gateway.org/aboutus/). They rightly regarded these, once placed in the relational database built for them by Nature, as a new publishing form. Each page had a DOI. Peer review was community-based and ongoing. And we have to remember our place: we are facilitators serving communities.
We seem to be in great haste to write off innovation before it has failed enough times to have a chance of succeeding. The saying attributed to Mark Twain – that successful decisions are the product of experience, and the best experience is gained by failing at something – seems appropriate. This applies to pre-prints and many other things. If the scholarly communications community, including its publishers, is to respond to the pandemic challenge, then we have to really understand the urgent needs of research teams for appropriate information of assured quality at the right time, and persist until we have it right, regardless of forms or conventions. And as the results of that flow back through our processes, we will stop talking about trivia like “digital transformation” and start gripping the real customer needs with the more than adequate skills and technologies at our disposal.
Mar 10
Why Innovation beats Format every time!
I am back! Three months of infections and operations are over and I am again upright on a brand new knee. Apologies to those who expected a continuous word stream. Even greater apologies to those who were enjoying the silence.
Lying on my back and staring at the ceiling should have been a moment of zen-inspired rehabilitation. Instead it was punctuated by moments of intense annoyance when I read of people self-styled as “publishers” sallying out to defend the “book” and the “article” and the “journal”. These things need no defence. Nor does the codex or the Sumerian clay tablet. Scholarship does not and never has lived by format. Knowledge transmission will always find the appropriate channel, like water round a dam. So eventually I got better, got up, and went to an international publishing day organised by a leading software supplier in three cities simultaneously. But then a representative of an ancient university press got up and used the privilege of a hearing in this gathering to treat us all to a discourse on the advantages and disadvantages of publishing in… books or journals!
I had to take a firm grip. All of a sudden I had been robbed of 35 years of my life. But then, as I made my way home, I remembered another strand of my bedridden reading. The growing strength of the pre-registration movement in science research began to dawn on me when I read that PLoS was adopting pre-registration (https://plos.org/open-science/preregistration/). Then I recalled the eminently sensible investment by EBSCO in protocols.io (https://www.ebsco.com/products/protocols). So here we have a publisher like PLoS recognising that a post-Open reality may be an urge amongst funders and researchers to improve the credibility and reproducibility of scientific findings and results. And the way to ensure that aims and objectives do not distort during the process is to preregister the research objectives, together with a description of the methodologies that will be employed to explore the hypothesis.
If this catches on it will be important. Either the journal publisher or an independent site like protocols.io can then become the repository of comparative research methodologies. At the moment this material normally appears in the front half of most articles. It is variously treated in metadata terms by different publishers and is often inadequately edited (I am told that “apply to authors for full details of techniques employed” is still quite common). And the real point for publishers is the time-lag. The preregistration occurs before the research phase, and before any findings or evidential data are available (a period of years). Thus the “article” effectively appears in two parts, at different times. And in different, but linked, places? It is quite possible, of course, to link the preregistration site to the site where the findings are described and to the repository where the evidential data is held, but this does not sound, to me at least, very much like the journal as operated by journal publishers today.
We have long known of the heavy, intensive search usage by researchers seeking methodological models and templates in order to pursue a specific enquiry. The gradual removal of that activity into specialist service sites could affect usage levels generally, and thus library journal-buying pressures. And another factor comes into play here as well. Methodological improvement is poorly communicated and seldom recognised. If a researcher, in reproducing an experiment, finds a quicker, easier, cheaper way to the same result, be it by a tiny fix or a larger short cut, this does not normally lead to a new article. Often communication is by hearsay, blog or email, and it is not necessarily attached to the searchable corpus of knowledge. Nor is the researcher who has made the improvement normally recognised. Preregistration lends itself to annotating the protocols and acknowledging the source of suggested improvements. protocols.io has that vision: an annotated library of experimental methodology, with acknowledgement of suggested changes and their source.
During the 1990s and the early years of this century I took part as an evaluator in several rounds of Article of the Future discussions. Elsevier, to their great credit, were prominent in these. Structural improvements were made, metadata improved, and huge flexibility introduced around the use of graphs and their manipulation by readers; video was used for the first time but constrained by package size, likewise audio and slide presentations. By the end I had begun to feel that the “article” was becoming as much of a size constraint digitally as it had been, for many years, in print. The format words of the print world seemed, back then, to have outgrown their usefulness. Innovation in end-user performance and expectation was making traditional format terms redundant. It was just that we were too lazy to find new ones.