No, I did not go to STM in Frankfurt. Or ToC, for that matter. I spent the day in bed instead, trying to clear an infection and raging at hotels and their procedures. Big American chain hotels. Run by Germans. “We cannot check you in without you paying for all of the room occupancy in advance.” “But I am a long-standing member of your loyalty scheme…” “It doesn’t matter – the new rule is cash in advance before you get a key…” “Now my key has stopped working…” “You can get a new one if you show me photo ID…” “But it is locked in the room…” “No excuse, all Germans are expected to carry photo ID at all times…” “Mr Manager, I want to be catered for as an individual with a good credit record in your hotels…” “Sir, I am simply following our rules and policies.” The sooner this man is replaced by some workflow software the better, in my view – since he cannot vary any procedure, this would be a logical step. In the meantime I vent my spleen on TripAdvisor and ponder the wonder of Elsevier and the Article of the Future.

Yesterday Elsevier unveiled a new web HTML version of the articles held in Science Direct. And having once played a tiny role as a judge in the unfolding Article of the Future story, I have sympathy for what they are about, and a conviction that they are as close to the money as anyone in taking scholarly communication in science to the next step. Here is what they say their new article format will achieve:

Now please read back through that again carefully. The first is a no-brainer. Good market research companies like Euromonitor have been facilitating re-use in this way for some time, and I saw a great application in a different discipline only last week. We should be asking why only now, and why Elsevier are the first. The second is more specific, and removes a long-standing annoyance. But it is the third and fourth items which really get me going. Searching horizontally across articles in several different disciplines and moving seamlessly between references and citations is becoming key to the dream of a rational discovery system for science. Science will have its Big Data solutions all right (though very few current publishers will be participants, it seems). But the survival hope for the so-called “journal publishers” is surely that the article will come to be seen as a viewer, or a Vorspeise – an appetizer – as we would say in this hotel room. In this vision the article becomes the way of entering the enquiry at one point and then moving horizontally, using references, citations and contextual information, across sub-disciplines and into fields of conjectural interest. Yes, we shall have more effectiveness from semantic search, and, yes, our advanced taxonomies and ontological structures will do much of the back-breaking stuff. But where the quirky human mind of the researcher is the search engine, we shall want the article to be linked to the relevant evidential data, to the blogs and the posters and the proceedings and the PowerPoints and the minutiae of scholarly research, just as we shall want the invaluable navigational aid of A&I to stop us from wasting time with what does not merit attention, and review articles to help us see what others have seen before us.

So we need the Article of the Future. And as the article adds more diverse content by way of embedded video, more images, manipulable graphs, links to evidential data or attached programming, we need it urgently. But it is an emerging standard, so how many do we need? Well, only one. As Elsevier remark, this is and always will be work in progress. But it is work in everyone’s progress. If I have subscriptions with Springer, Sage, Wiley Blackwell and Elsevier, to name but a few, will I not want all articles to be manipulable and cross-searchable in every way, not just cross-referenced in Google? So is this not a point where industry collaboration in the face of overwhelming user dissonance might for once pay dividends?

I have just reread that sentence with a sinking feeling. This industry seldom collaborates. What could happen instead, if Elsevier are strategically adroit and make their article format open, is that users adopt it for their own repositories, loaded onto vehicles like Digital Science’s Figshare. Then users can launch a search from Science Direct and get everything. Of course, as long as “publishing” equals “tenure and research rating”, people will still seek journal publication, but those links are fragile and may collapse under their own weight. Research through searching threatens to become another matter, divorced from the publication cycle. If that happens there will be few survivors in current publishing, but Elsevier at least should be one. Everyone else needs to think hard about collaboration, or plan for business diversification for all they are worth.

My quote of the week? “The solutions will come when science goes mobile in research terms – then someone will have to step in to configure the device and all these publishers will simply be rewarded with royalties for their content contribution to a solution they did not make and cannot control.”

Sounds a bit like Apple meets STM!

As I read the headline my head filled with Frank Sinatra and Celeste Holm. “Who wants to ride behind a liveried chauffeur / Do I want it? No, sir.” But the headline covered a story that every attendee at last week’s PPA (Periodical Publishers Association) Business Media Digital Publishing Conference in London should have been forced to read and repeat before taking a seat. It would have been the antidote to the head-nodding, comfort-rag-clutching, industry-at-prayer stuff that went on at the beginning. With an agenda stuffed with new business models, change agents, and catalytic and galvanizing case studies, we had to approach the subject of digital publishing by bleating, in chorus, that we were safe from “migration” and its perils if we had “really good content”, and that nothing bad would happen to us if this really good content was what our clients really, really wanted. And to this I can only say, in my normal and mild-mannered way: “Rubbish!”

So let’s take a deep breath and go back a step. The article on earning a million was sent to me by my daughter as a follow-up to both of us quoting the performance of the web service www.teacherspayteachers.com. Here US teachers deposit their lesson plans and receive royalties on their re-use. Deanna Jump, a kindergarten teacher from Georgia, has become the first teacher to earn a million dollars in royalties. The site has had 50 million page views in the last 30 days, and teachers post over 800 resources a day. The site has a rival (not mentioned in the article) in the shape of the UK periodical the Times Educational Supplement, which has adopted this business model and teamed up with the leading US teachers’ union in a joint venture, while exploiting a global market from London. As advertising retreats, the TES has executed a wonderful transition: not migrating so much as re-inventing itself in close alignment with what its readers needed to be better teachers. Indeed, in some ways this is re-inventing the textbook as much as the magazine, but whatever it is, the outcome is the same: understanding how users work and supplying (in this case user-generated) content in the right context and with the right interface is the new publishing.

So what got my goat? That morning I had flown in from a wonderful cultural and artistic exploration of Georgia and Armenia, landed at dawn and rushed straight to the conference centre to chair my session. Slightly light-headed with the joy of having thought about new things for a couple of weeks, and happy at being on time, I settled into my seat to listen to Duncan Painter, CEO at Top Right Group (aka EMAP), being interviewed. I have always understood him to be a shrewd and intelligent man, and if I was a critic of dividing EMAP horizontally into three so that events and publishing could be sold, that was purely because I thought that GMG and Apax could make more by selling the verticals. Here he skirted the fact that two-thirds of the outfit was for sale, praised his publishers (no doubt deservedly), and launched into the old “as long as we have our great content we are safe” nostalgia of the last century.

Duncan gave a view of publishing that would have been a safe compliment to his hosts around 1991: everything will be OK if you have quality content, because people will always want that. For a dreadful moment I thought he would say “content is king”, but he checked himself. Now I know that he was once at Experian, a superbly successful company who add value to what is becoming increasingly commoditized content – company information. So much so that a later speaker, Damian Kimmelman of Duedil (the PPA’s Newcomer Company of the Year from the previous night’s awards ceremony!), values it so highly that he gives it away. He then adds value with his data connections and his contextualization of third-party content – and will no doubt do so more effectively as semantic search intensifies the re-exploitation of this material in new contexts and through new connections.

Duncan’s theme was echoed in the next session by Rupert Turnbull, the publisher of Wired UK. Again we had the classic publisher stance: the world will beat a path to your door if the content is good enough. Ironically, here he put his finger on a critical problem of our times: “good enough” content very often supplants “the best”. The story of Google could be written in these terms. In the Q&A I protested: I use Wired not because of its articles but because I can flick through and find new leads, using it like a checklist of change and noting which trends are now seen as popular enough to get this sort of treatment. Do I value it as content? Not really: like the Economist and the Tatler, it is consumer publishing created to support high-price display advertising by parading what a certain social grouping all think they should know.

Then Chris Flook of ICIS spoke, quietly but to great effect, and I was at once back in the world of real understanding of how users behave and how solutions can be grafted together for them. In my own session Hilary Lambert of Thomson Reuters spoke convincingly about how you re-frame access for lawyers on mobile, and Sean Howe of IHS Jane’s pointed to a revolution in visualization – summarizing and linking content in a way that presents it more effectively within the user requirement. Greg Kilminster of Gambling Compliance demonstrated that the compliance environment wraps around gaming as tightly as it does financial services, and Jonathan Morris, fresh from the triumphs of DataExplorers, ran through a whole spate of interesting start-ups that I shall come to anon. In short, this was a really interesting conference, with some fine speakers and some genuinely fresh ideas about business models and about understanding customers. It was summed up for me by Jan Reichelt, co-founder of Mendeley, who has built a brilliant business by helping scientists understand the content they already have, and learn what they need by understanding how their peers behave. In other words, his content is content about the performance of users in the science labs of the world – content created by the network, which may be more valuable than the underlying content itself.

Once upon a time, content in this industry was the reworked press releases that kept the advertising apart on the printed page. It was never valuable, and it isn’t now. What is valuable is a deep understanding of what users need in order to accomplish their work better – and a determination to build technology and content into contexts that make improvements people will pay for, and where they will deposit their own content as well. So please, organizers, can we at last stop beginning conferences with the Hymn to Content? And ban the word “migration”. Re-invention is what we need!
