If you are an STM publisher reading this, then it may already be too late for you to act decisively enough to put yourself in the vanguard of change. For I am not the first to say what I am about to say, and there is now a good literature based around the idea that the network is a world of small beginnings, followed by mass change at unprecedented rates that catch whole industries unawares. We are coming to one of those points, and my growing realization was triggered into certainty by being sent a link to a Harvard Business Review article from November 2010 (thank you, Alexander van Boetzelaar, for making sure I saw this). Since HBR, as an old-world publisher, makes a business of paid-for reprints, I cannot give the link, but it is reprint R1011B.

The article is called “The Next Scientific Revolution”, by Tony Hey, a director of Microsoft Research and one of the Fourth Paradigm people who made such an impact in 2009. Their arguments, pioneered by the late Jim Gray, saw scientific enquiry gathering force as the experimental methods of early Greece and China were subsumed into the modern theoretical science of the Newtonian age, and then carried forward through computation and simulation into the age of high-performance computing in the last century. So now we stand on the verge of a fourth step: the ability to concentrate unprecedented quantities of data and apply to it data mining and analytics that, unlike the rule-based enquiries of the previous period, are able to throw out unsuspected relationships and connections which in turn are the source of further enquiry.

All of this reminds me of Timo Hannay of Nature and his work with the Signalling Gateway consortium of cell science researchers based in San Diego. I am not sure how successful that was for all parties involved, and to an extent it does not matter (especially given the lead time in experience given to Nature by this work). To me this was a signal of something else: on the network the user will decide and make the revolutionary progress, and we “publishers” will have to be ready in an instant to follow, developing the service envelope in which users will be able to do what they need to do. At the moment we are all sitting around in STM talking about overpublishing, the impossibility of bench science absorbing the soaring output of research articles, or of libraries keeping up on restricted budgets. The real underlying problem we are not seeing is that the evidence behind those articles is “unpublished” and unconcentrated, and that as the advanced data mining and analytics tools become increasingly available they have insufficient scale targets in terms of collected data.

Of course, there are big data collections available. And their usage and profitability are significant. Many are non-profit and some are quasi-monopolistic. But I see huge growth in this area, especially in physics, chemistry and the life sciences, to the point where “evidence aggregation and access management and quality control” is the name of the business, not journal publishing. Mr Hey comments in his article: “Critically, too, most of us believe scientific publishing will change dramatically in the future. We foresee the end product today – papers that discuss an experiment and its findings and just refer to datasets – morphing into a wrapper for the data themselves, which other researchers will be able to access directly over the internet, probe with their own questions, or even mash into their own datasets in creative ways that yield insights that the first researcher may never have dreamed of.”

What does “access directly” mean in this context? Well, it could mean that universities and researchers allow outside access to evidential data, but this poses other problems. Security and vetting loom large. Then again, evidential peer review may be a requirement – was the evidence created accurately, ethically and using reliable methodologies? Plenty of tasks for publishers here. Then again, can I hire tools to play in this sandpit? Is the unstructured content searchable, and is metadata consistent and reliable? These are all services “publishers” can offer, in a business model that attracts deposit fees for incoming data as well as usage fees. But there will be natural monopolies. It may be true, as Mr Hey claims, that “through data analysis scientists are zeroing in on a way to stop HIV in its tracks”, but how many human immunodeficiency virus data stores can there be? Right, only one.

So the new high ground will have fewer players. A few of those will be survivors from the journal publishing years, and I hope one at least will have the decency to blush when recalling the pressure put on people like me, in my EPS days, to remove the ever-growing revenues of the science database industry (human genomics, geospatial, environmental, for the most part) from the STM definition since it was not “real” science publishing – and so reduce their share-of-market figures! But then again, maybe they should look around them. Isn’t what is being described here exactly what LexisNexis are doing with Seisint and ChoicePoint, or Thomson Reuters with ClearForest? And why? Because their users dictate that this shall be so. For the same reason this is endemic in patent enquiry: see my erstwhile colleague David Bousfield anatomizing this fascinatingly only last week (https://clients.outsellinc.com/insights/index.php?p=11416). And why have market-leading technology companies in this space – think of MarkLogic and their work on XML and the problems of unstructured data – made such an impact in recent years in media and government (aka intelligence)? I see a pattern, and if I am right, or even half right, it poses problems for those who do not see it.

I rest my case. Next Friday I shall do the Rupert Murdoch 80th birthday edition, for which I plan to bake a special cake!

“Crouch! Touch! Pause! Engage!” This invocation, spoken by the referee to initiate a scrummage in the English game of Rugby Union, has been echoing through what remains of my mind in a weekend where, as ever in my place at Twickenham, I have watched England eke out a gritty victory against the French. American readers may jump a paragraph if they will at this point, or join with me in wonderment at the glories of time transfer (my son and I came home to analyse the game in painstaking detail using slow-mo on a previously recorded version, and then spent this afternoon joyously alternating between coverage of Scotland v Ireland from Edinburgh and India v England at cricket, live from Bangalore). Masters of Time and Distance, and beholden only to the Laws of Commerce (I was unable to see previous games in the US since ESPN hold the rights for excerpts and summaries!). But when this wonderful sporting escapism shrugs off the constraints of territoriality and becomes a live factor on my Tablet, the dynamics of daily life change again (phone call from my daughter at university this evening: unable to complete essay because too much distraction on time-lapse internet TV). We must bear in mind that it was sports coverage (in the soon-to-end days of geographical exclusivity) as much as anything else that built the House of Murdoch, so this is no trivial subject matter. Nor is it a trivial concern that my children may never qualify for anything at all if they have to shrug off a barrage of media possibilities and temptations never made available to me.

And this is a futuristic conversation in another sense, and perhaps I should now make an alarming confession. I do not own an iPad. I can defend this, and increasingly often have to do so, by saying that “I never buy before 6.0” (makes one sound smugly superior instead of poor and outdated) or, even more often, “I am waiting for the HTC Flyer”! This usually throws the inquisitor off the scent – either he does not recognize the Taiwanese industry wunderkind, whose smartphone is so readily promoted by Google at present, or he has heard of the Flyer, due to launch later this year, and can debate with me on the merits of having a 420-gramme machine (same weight as a paperback, half the weight of an iPad) on which you can draw or write as well as use touch-screen access. By this time I have covered my tracks on the ownership issue, and we part agreeing on how clunky the iPad really is. Until next Wednesday, that is, when Apple unveil iPad 2.0 and the pressure mounts again for me to come aboard.

I have been having some excellent debates in recent weeks about this unrefereed scrummage which is technology innovation, and its impact on the rapidly moving world of business and professional information. At the moment so many of our preconceptions are built around the consumer uses of the tablet world, and around the access advantages that the devices provide in business and elsewhere for travellers, that we are not yet tuned into the impact that this mobile computing power could have on our workflow activities and the integration of still-separate elements of our intranet and extranet worlds. In my view, carrying your connectivity on a Tablet will place renewed pressure on improvements in voice-text transliteration, and at last begin to move machine language translation from the esoteric to the standard. Words spoken will need to be stored and subject to textual analysis, as well as being copied to third parties. Documents exchanged will need to exist multi-lingually where necessary. And nothing will be stored that cannot also be heard. All of this will ready the tablet for its eventual role, portended by the laptop releasing us from the desk, as the complete personal assistant – the PDA comes to fruition at long last. Then I will discard my Blackberry, throw out the Netbook that loyally travels thousands of miles with me, and embrace the Future. But, since you ask, I am currently at Pause, and not yet Engage.

Finally, some updating of previous efforts here. In the first instance it is always good to remind ourselves of past worlds and where we came from, and the trading statement of Yell, the yellow pages publisher which named itself after its online service but never really invested in it, does that splendidly. Pre-tax profits in the nine months to December were halved and revenue was 11.8% off target. In its UK businesses print revenue went down 22.3% and online went up 1.8%. Recovery is proving worse than recession. Like much of the newspaper world, this advertising sector is now dead wood, in my view. While it remains interesting to see who can recreate in digital services the hyperlocal environments that once gave rise to local newspaper publishing, the heirs to classified advertising directories are now fully entrenched in network marketplaces. Time to write the history here.

And can the same be said of consumer book publishers? Not quite yet, perhaps, but since I wrote about Ms Amanda Hocking (26-year-old care assistant from Minnesota with four paranormal romances in the USA Today bestseller list last month), other evidence of successful self-publishing has come to light. This time it is British thriller writer Stephen Leather. Although an established conventionally published author, his latest novellas were rejected by Hodder and Stoughton (Hachette) as being too short. So he published them, like Ms Hocking, on Amazon. One, a gritty everyday tale of a serial killer in New York, has topped the Amazon bestseller download list for seven weeks, and his other works have been at the top, he estimates, for 90% of the past three months. He claims in interviews to sell 2,000 books a day, and to be earning £11,000 a month from this activity, but this is not what interests me most. Like Ms Hocking, his works are short, and sell for $0.99/£0.70. I smell a trend – short enough and cheap enough to read on the train! I don’t commute and don’t have an iPad, but I do see that survival as a publisher may mean moving one’s focus to where the buyers are going. Or is that just old-fashioned consultancy talk?
