It was inevitable that some readers of my piece on Gold OA earlier this week would come back and say that I have grown too fond of defining what won’t work, and should be more proactive about stating new directions. Fair cop. Here, then, are two assertions about that future for science journal publishers. Both concern areas in which “traditional” text-based publishing has only the slightest current experience and skills base, yet which will be vitally significant for the industry five years out. Both fit a vision of scholarly communication in which the publisher’s role evolves away from primary publishing (which will become the prerogative of librarians and repositories) and towards workflow management solutions and the quality of content within process.

My two candidates for step-change status are:

1.  The evolution of video from an accompanying feature into the core medium for reporting scientific research.

2.  The development of robust, auditable techniques for evaluating the impact of content on research, creating measures of ROI in content purchasing and fresh, searchable data derived from the evaluation of usage. This data, along with content metadata, will become more valuable to players in this market than the underlying content on which it rests.

Let’s start with the first issue. I am fed up with being told that science and scientists are too dull or too complex for video. Too dull? Just go and play these two minutes of an interview with John Maynard Smith, the great biologist, on Vitek Tracz’s pioneering site Web of Stories (http://www.webofstories.com/play/7277?o=MS) and try to maintain that view. And this site has excellent metadata, as does the video-based Journal of Visualized Experiments (JoVE), which announces this month the extension of its coverage to experimental reporting in physics and engineering as well as the life sciences (www.jove.com/about/press-releases). Note that both of these sites place a premium on narrative, and recall the narrative argument in my recent piece on next-generation learning (After the Textbook is over... 3 June 2012), which was demonstrated in some wonderful transmedia software (http://www.inthetelling.com/tellit.html). Once again this shows that video is quite capable of forming the narrative stem onto which links, citations, indexation, abstracts and other aids to discovery and navigation can be attached. Indeed, the text itself can be attached, along with demos, lectures and evidential data. Video file sharing is now trivially easy in the broadband world. Some academics will complain that they lack video storytelling skills, and this in turn may be something that publishers can add to the APC – as long as they acquire those skills themselves in time!

And then there is data. I have thundered on about evidential data and the importance of using the article as the routefinder that leads researchers to the statistics, the findings, or the software used in the analysis. And we have all talked endlessly about metadata searching, about applications of Big Data in science, and about data analytics. But I am moving to the view that we are crucially underplaying the importance of another sort of data, which we used to characterize as “usage data” while wondering whether it would ever become significantly exploitable. The CIBER team have long warned about the underuse of usage logs, but the force of the point has been brought home to me by an appreciation of what excellent data output can be derived from interfaces like Mendeley or ReadCube. We are beginning to appreciate, almost for the first time, what usage patterns can be mapped – and what they mean. This is important for researchers, and vital for publishers.

Users will rightly demand this form of data analysis, and will become increasingly interested in what, of the mass of data that they buy access to, is effective and cost-effective. This will start at the sharp end, in areas like drug discovery, but will grow into a habit of mind as data overload becomes ever more daunting. Concentrating purchasing policies on content that can be demonstrated to support improved decision making, better compliance or increased productivity will drive us to collect and analyse our user data to demonstrate that what we have makes a difference. And as we are driven this way we will get deeper and deeper into examining what users do with our data, and we will be surprised by how much we can track and know. That knowledge will form another layer in our content stack, alongside metadata itself.
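To make the ROI point concrete: the simplest of these measures is cost per use, which libraries already compute from COUNTER-style usage reports. The sketch below is purely illustrative – the function name, fee and download figures are my own invented assumptions, not any vendor’s actual method or data.

```python
# Hypothetical sketch: cost-per-use over monthly usage logs for a
# licensed resource. All names and numbers are illustrative assumptions.

def cost_per_use(annual_fee, monthly_uses):
    """Return the fee divided by total recorded uses, or None if unused."""
    total = sum(monthly_uses)
    if total == 0:
        return None  # no usage: ROI cannot be computed, flag for review
    return annual_fee / total

# Invented monthly full-text downloads for two subscribed journals
journal_a = [120, 98, 143, 110, 95, 130, 88, 102, 115, 140, 97, 122]
journal_b = [3, 1, 0, 2, 4, 1, 0, 0, 2, 1, 3, 0]

print(cost_per_use(9000, journal_a))  # heavily used title: low cost per use
print(cost_per_use(9000, journal_b))  # barely used: candidate for review
```

Even a crude figure like this, aggregated across a portfolio, is the kind of evidence purchasers will increasingly demand before renewing.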

This game is already afoot in the biopharma sector. Eight weeks ago Relay Technology Management (http://relaytm.com) launched a “real-time Business Intelligence and Data Visualization Solution” for life sciences. Building on their RVI (Relative Value Index) formula, this new BD Live! (Business Development Live!) offering demonstrates some of the ways in which scientists and researchers will want to have their data assets assessed in future – and the ROI of their purchases demonstrated. It is probably no accident, then, that Nature Publishing Group made an investment in Relay at the end of last year.

