Last week’s blog on growing STM sparked some debate, mostly around the realization that if the powerful publishers of research articles drive business development towards the workflow of researchers and towards data analysis and visualization, then current business models may collapse in time – and much else with them. For many publishers this will be expressed as a struggle to “buy time”: propping up the existing business model while nurturing the new one. And one part of that culture change is almost never discussed publicly in STM circles: the change in skills that will have to take place as publishers (vendors of articles and journals) move from satisfying the generalized, “Big Deal”-based procurement requirements of powerful research librarians, library network tsars and institutional information managers to coping with the market of many – the precise needs of a researcher, a project or a department. While I have not been to a library-based meeting in over a decade without someone raising the plight of disintermediated librarians, no one seems worried that the front half of the publishing pantomime horse is developing a sequence of product and market shifts that the sales and marketing back legs have no clue how to sell.

This thought struck particularly forcefully this week when I read an article (http://editorsupdate.elsevier.com/issue-39-june-2013/how-to-handle-digital-content/) posted on the Elsevier website on 26 June. Ten years ago I played a brief part as a judge in a competition called The Article of the Future (AotF), organized by Elsevier’s David Marques. Our aspiration then was simply to make the article a born-digital environment, not a digitized print artifact. A decade later that has been triumphantly achieved, and now the question runs the other way: the article is something which can only exist digitally, and may never again be satisfactorily “printed”. The value in an article, for Elsevier, can only be revealed inside ScienceDirect. The data cursor and interactive plot viewer that let you examine the author’s data points will only be available there. A presentation which puts the article in a central pane, with a column of navigation on the left and references and tools on the right, will be the way to view an article there. Here Kitware SAS has installed a 3D molecular viewer and a 3D archaeological viewer – authors upload the model as a supplementary file, and the service then copes with both ribbon and “ball-and-stick” modelling. A neuroscience 3D imaging package follows.

Then there are the Executable Papers. Just as f1000 has been insisting that data files must accompany articles where relevant, so Elsevier has been experimenting with the journal “Computers & Graphics”. One of the things an article was always intended to do, but never managed on paper, was to “achieve the full reproducibility of key scientific findings”. Here is a dream of scholarly communication coming closer. Then add some tools: Elsevier shows an interactive (Google) map viewer, a chemical compound viewer, interactive phylogenetic trees, and MATLAB figures. And here at last are simple links connecting articles with data held in data repositories, and alongside them links to a PubChem Compound viewer built jointly with the National Center for Biotechnology Information. Finally, for authors publishing in this AotF format, why not add some AudioSlides? Here, in a voice file with some slides, you can introduce the concepts and add your own view, outside the article itself but attached to it, on why the work may be important. If article publishing is researcher marketing, this must be a great advance.

So here we have scholarly communication back in the hands of scholars, in the context of wholly digital networked exchanges. With f1000 now creating a logic for post-publication peer review, we can envisage the complete disappearance of the second- and third-tier journals, with the high-brand top journals selecting their articles as edited versions after initial publication, reflecting some of the feedback and adding more data and supplementary information. In some fields the data, its modelling and the researcher conclusions will stand alone as citable “papers”. The Big Deal argument collapses, as it already threatens to do, into a discussion on database access and Open Access (more of a threat to librarians than publishers). While publishers use the added-value digital article game as a way of bridging the move into workflow markets, they need to know that this is a temporary bridge: in less than five years, what seems futuristic today about the Article of the Future will be part of the desktop toolset of every scientist preparing an article for initial publication in his own repository. The emphasis then, and the business of many who call themselves publishers now, will be on selling those tools, creating the services that integrate content into the context of the research enquiry, enabling the retention and cross-referencing of knowledge, and tracking the benefits – and costs – of lines of research. The race to the Electronic Lab Manual and its successors has never been more apparent.

And in this world, where are the Research Librarians? One cannot argue with those who point out that important roles of preservation need to be tackled, or that research teams, departments and individuals will all need support and advice. But if those tasks are information management roles within the research team, supported and funded just like the publication of articles, then the infrastructure of buildings and people and budgets surrounding the word “library” may become an anachronism. Publishers who see this as a major release of resources may be tempted to rejoice. Those in sales and marketing who loved the years of brokering “Big Deals” may cry, for the world that beckons requires them to do what every other digital marketplace has had to do, often with limited success: understand the working lives of ultimate end-user customers, grasping how they might save time and trouble in the search for greater productivity, better decision-making, and improved compliance with research benchmarks and good practice.

Two events this week turn us back towards this perennial question. One is the purchase of Springer by BC Partners after a prolonged affair, and a lover’s tiff which forced the price up a notch to $4.4 billion but left 10% of the equity in the hands of the sellers. The other is the latest set of results from Wiley, covering the fourth quarter and thus completing the picture for 2013. While Wiley is the larger company, by virtue of its major presence in education markets, the two are very comparable in size in the science, technology and medical sectors. Both have STM units of around a billion dollars, and both have STM market shares of around 3%. And they have another shared characteristic: neither is showing much by way of organic topline growth, and there are some very good reasons for this. Global recession and library budget cuts do not suggest growth, and nor do the consequent falls in book and journal purchasing. But both companies have gone digital to the extent that print declines are largely offset by eBook and eJournal supply, though often at lower revenues (and greater margin). This again does not indicate growth, but confirms a view of settled publishing environments in fairly stable markets with high margins: the impression that they like to give, and which analysts and investors like to believe. But underneath the surface, I believe that these markets are now boiling over with activity, and that both of these companies, and all of their peers, now face challenging growth targets if they are to deliver real growth in returns to private equity investors and shareholders in recovering economies, while also investing in retooling for a digital data age.

In the first instance, the digital transition from print is now over. Nothing marks the point more clearly than the news this month that Elsevier, the sector leader in journals, is to outsource its eJournal transaction completion to Atypon. This is now just a cost, and each player will seek opportunities to drive it as low as possible. Journal articles are becoming commoditized and will be universally available before long, so this is not a growth area. Wiley’s growth in its Research sector – its new name for STM – was -1% in FY13. It is hard to imagine that Springer managed more than low single digits, and indeed it is possible that the industry average is less than 2%. Given current constraints on price increases (between 5 and 9% across the sector), it is easy to imagine that we are suffering a market contraction. Yet private equity investors cannot do financial restructuring all the time, and shareholders expect dividend growth as markets come back. So what are the growth strategies that will deliver that?

It seems to me that there are two current hopes for sustainable long-term growth. Neither will be new to these two companies, since in a number of ways both of them are experimenting here. Both involve investment, but in both cases the investments will lead directly to productivity gains, and to the possibility of very rapid new product development. Neither is a long stretch beyond the current managerial capacity of these players, since both have strong technology capabilities. The key question is whether they are flexible enough in managerial terms to embrace a future beyond the formats and business models on which they were reared and upon which they have grown comfortable over the years. The two directions both rely upon the data they already hold, and data which they can obtain by alliance and joint venturing with third parties. They can be described as the development of workflow tools for the processes of research on the one hand, and the building of analytical tools and datasets/knowledge stores for researchers on the other. In order to play here, the publishers will need a data platform which allows the cross-filing and searching of content-as-data, and ways of developing search across both structured and unstructured files. They will be pushing the envelope on metadata development, imposing text-enrichment disciplines to increase the value of their content, building extensive triple stores, and using their expertise as a draw for researchers to deposit experimental/evidential data with them, as well as publishing their articles. And, having chosen their niches, they will be collaborating with other publisher data-holders, sourcing Open Data deposits and turning themselves into a part of the research value chain itself.
When peer review gives way to PPPR (post-publication peer review) their grip on the “barrier to publication” cycle, in which the publisher-managed peer review is necessary for the researcher to enter the market, will be broken anyway.
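For readers unfamiliar with the “triple stores” mentioned above, the idea can be sketched in a few lines: every fact is a subject–predicate–object statement, and queries are pattern matches over those statements. The identifiers below (the DOI, author, dataset and repository names) are invented purely for illustration; a production system would use a dedicated RDF engine rather than this toy.

```python
# Minimal sketch of a triple store: facts held as (subject, predicate, object)
# tuples, queried by pattern matching. All identifiers are hypothetical.

triples = {
    ("doi:10.1000/xyz123", "hasAuthor", "author:smith"),
    ("doi:10.1000/xyz123", "hasDataset", "dataset:crystallography-42"),
    ("dataset:crystallography-42", "depositedIn", "repo:pangaea"),
}

def match(pattern):
    """Return all triples matching a (s, p, o) pattern; None is a wildcard."""
    return [t for t in triples
            if all(p is None or p == v for p, v in zip(pattern, t))]

# Which datasets underpin this article, and where are they deposited?
for _, _, dataset in match(("doi:10.1000/xyz123", "hasDataset", None)):
    deposits = match((dataset, "depositedIn", None))
```

Linking an article to its evidential data, and that data to a repository, is exactly the kind of cross-referencing the platform investment described above would have to support at scale.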

So what will these new products and services look like? Well, both of these players already know something about that. Springer has successfully re-platformed on MarkLogic, which creates a completely different data-handling opportunity and is widely used in the sector. And Springer has form in the researcher workflow market through its recent purchase of Papers, the Dutch article production software (Mekentosj BV). Likewise, Wiley has made real strides in developing knowledge stores in support of its chemistry browser project and in response to the strength of its chemistry list (as noted here already). But these instances are swallows, not summer. There has to be consistent and sustained development to create batteries of data services in chosen sectors, and the data enrichment must be widespread, not experimental. The workflow tools will include some acquisitions, but will reflect a great deal of home-grown learning as many publishers discover, for the first time, what the eventual user (not the library intermediary) does for a living – and how he can be helped and supported by data-charged service modules which will become as essential to his view of research as, well, journals once were. The real issue, then, is not technology: it is the mindset to forge a new business out of the old, with end-users, not buyers, and with data, not pre-formatted reporting, at its core. It sounds like a choice, but it isn’t really. Growth in real terms is the key to survival. It is time to start thinking again about how we satisfy markets, and investors.
