Oct 13
The Road to Vathek
Filed Under B2B, Blog, internet, news media, online advertising, Publishing, STM, Uncategorized | 3 Comments
The road to anywhere starts from where you are, so I was not a bit surprised to find that the first item on the agenda of the International STM conference last week, alongside the Frankfurt Book Fair, was fully aligned with the themes of the Outsell meeting the previous week. And as I ranted (now an over-used term) in my last blog about neutral platforms and the architecture of content delivery to a multiplicity of devices, so in turn I found myself listening to an organization that aspires to do just that. Accordingly, Stephen Dunn, Head of Technology Strategy at Guardian News and Media, is my Player of the Week, and I shall watch for his slides coming up on the STM site (http://www.stm-assoc.org/event.php?event_id=56) with great interest. He gave a clear description of how you use a neutral platform to respond to very different customer expectations, and I reflected that the Guardian’s problem is common to most of us: different customer types in different cultures and geographies, whose intensity of use varies widely and whose connection to the brand ranges from committed to peripheral. So when we got to Q&A I asked Stephen who gave the Guardian its brand mandate – the 300 thousand or so who read the paper, or the 36 million online registrants who identify with something of the aura of moderate liberal politics exuded by its communications? In a sense this was a cheat, since I had once asked the editor-in-chief, Alan Rusbridger, the same question. He ducked and weaved but eventually came back to the newspaper and its mission. Stephen was much smarter: he coolly said that he didn’t know, leaving his audience with the feeling that it might have been divine, or at least manifest destiny. The God of the Nineteenth Century sent the Scott family to earth with a purpose, and a right to perpetual existence to deliver it.
I wish, however, that I had asked the next question. When he was asked about the size and margins of the business he was building in the ruins of a once profitable Manchester newspaper, the answer came back that it was a profitable business, though its margins were not dramatic, and that in revenue terms it was a smaller business than the newspaper at its apogee. This is a secret fear that we all feel. While we can demonstrate explosive examples of growth businesses created in the network, with very high margins, these start-ups disguise the fact that it remains very hard to point to real-world media businesses that have fully migrated to the Web and grown at both top and bottom line. Now the Guardian is owned by a charitable trust, and its equivalent of the “Don’t be evil” incantation is “Make no losses”, since it is only the impact of long-term losses that can sever the trust’s mission to keep publishing, and in print. But for other, commercial players, the thought of being smaller in revenue terms, and not nearly as profitable as print was in its heyday, is a very dubious sort of salvation.
I had not meant to go on at length about this, but I sat there reflecting that this problem is widely faced in B2B, in professional publishing fields like STM, and everywhere that someone is trying to extend brands from print to digital. It arises from users’ lower cost expectations around digital, whether in advertising or access, and it is fundamentally formed by the unspoken idea that content on a screen is somehow ephemeral unless it does something – enables an action, or a transaction. By contrast, print feels permanent, meaty, residual in value, even if it enables nothing. The problem is chronic in STM, and is present every time a major publishing player negotiates with a university librarian.
So it was not a problem for my interviewee, Bob Massie, the President of Chemical Abstracts (American Chemical Society), when we reached the public interview item at the end of the agenda. Chemical Abstracts (CAS) is an example of a service born to be digital, but born a little early and briefly encased in print. Here was an example of service values which could produce growth, margins for re-investment and an unchallenged position in the industry through being the instrument of user willpower. No question about who wanted the comprehensive certainty of the CAS Registry of every searchable molecule known to Man. No doubt about who mandated the workflow modelling of SciFinder. As Bob put it, the industrial and research chemists had driven solutions to their content-handling dilemma in a world of content overload, and “there is chemistry in all scientific endeavour”, which explains the growth (as does the geography, since Bob strongly reminded us that Asia is now the critical centre of research and usage).
Avoiding more of the Fair, I pushed on to Lisbon and a conference in the misty hills of Sintra. I had no copy of Vathek, William Beckford’s Gothic tale (actually an Arab horror story) written in these woods in 1787 and one of the key influences on the romanticism of the next century. But I was reading William Gibson’s new thriller, Zero History, which in its way is an exact echo of Beckford. One of its protagonists wants a piece of information that no one has ever been able to have. He calls it the “order flow”: “It’s the aggregate of all the orders in the market. Everything anyone is about to buy or sell, all of it. Stocks, bonds, gold, anything. If I understood him, that information exists, at any given moment, but there is no aggregator. It exists, constantly, but is unknowable. If someone were able to aggregate that, the market would cease to be real”.
But maybe that is just it. The networks are not Publishing, an abstraction; they are Life. Only when we are able to replicate the forms and processes of our real lives in our virtual worlds will we get anywhere near the return that we seek.
Jul 8
Gribbling in the Dark
Filed Under Blog, Education, Industry Analysis, internet, Publishing, semantic web, STM, Thomson | Leave a Comment
So there was a word for it after all. Some kindly soul at a conference last week, seeing that I was unable to describe the strange digital burbling that took place when you dialled up a database in 1979 and inserted the telephone handset into the acoustic coupler, shouted out the correct expression – the noise was “gribbling” – and I was delighted to be reunited with a term which should never have been lost. And it allows me to remark, if I have not lost you already, that it is a mature industry whose terms of art, invented for a purpose, have now fallen into disuse because the processes they describe have become redundant. I expect to have to explain to my children how my typographer’s ruler works, or what slug setting, or galleys, or heavy leading, or hot metal meant. The fact that the first-generation digital expressions are already themselves redundant (who last saw an acoustic coupler?) tells an important story.
And that story is particularly relevant to the fascinating conference I was attending. Last week’s seminar, “Ready for Web 3.0?”, organized by ALPSP and chaired by Louise Tutton of Publishing Technology, was just what the doctor ordered in terms of curing us of the idea that we still have time to consider whether or not to embrace the semantic web. It is here, and in scholarly publishing terms it is becoming the default embedded value, the new plateau onto which we must all struggle in order to catch our breath while building the next level of value-add expected by users coming to grips with a networked information society today. And from the scholarly world it will spread everywhere. I will put my own slides from the introductory scene-setting on this site, but if you can find any of the meaty exemplar presentations from ALPSP (it is worth joining them if they are going to do more sessions of this quality) or elsewhere, then please review them carefully. They are worth it.
Particularly noteworthy was a talk by Professor Terri Attwood and Dr Steve Pettifer from the University of Manchester (how good to see a biochemistry informatician and a computer scientist sharing the same platform!). They spoke about Utopia Documents, a next-generation document reader developed for the Biochemical Journal, which identifies features in PDFs and semantically annotates them, seamlessly connecting documents to online data. All of a sudden we are emerging onto the semantic web stage with very practical and pragmatic demonstrations of the virtues of Linked Data. The message was very clear: go home and mark up everything you have, for no one now knows what content will need to link to what in a web of increasing linkage universality and complexity. At the very least, everyone who considers themselves a publisher, and especially a science publisher, should read the review article by Attwood, Pettifer and their colleagues in the Biochemical Journal (“Calling International Rescue: Knowledge Lost in Literature and Data Landslide”, http://www.biochemj.org/bj/424/0317/bj4240317.htm). Incidentally, they cite Amos Bairoch and his reflections on annotation in Nature Precedings (http://precedings.nature.com/documents/3092/version/1), and this is hugely useful if you can generalize from the problems of biocuration to the chaos that each of us faces in our own domains.
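By way of illustration only (and not a description of how Utopia Documents itself works), here is a minimal sketch of what “marking up” a paper as Linked Data triples can look like, written in Python with the rdflib library; the DOI, the protein accession and the choice of vocabularies are placeholders of my own choosing.

# A minimal Linked Data sketch in Python using rdflib. The DOI, the
# UniProt accession and the vocabularies below are illustrative
# assumptions, not details taken from Utopia Documents itself.
from rdflib import Graph, URIRef, Literal, Namespace
from rdflib.namespace import RDF, DCTERMS

BIBO = Namespace("http://purl.org/ontology/bibo/")          # bibliographic ontology
article = URIRef("http://dx.doi.org/10.1042/EXAMPLE")       # placeholder DOI
protein = URIRef("http://purl.uniprot.org/uniprot/P00000")  # placeholder accession

g = Graph()
g.bind("dcterms", DCTERMS)
g.bind("bibo", BIBO)

# Type the article, give it a title, and link an entity recognised in
# its text to the external data record that describes it.
g.add((article, RDF.type, BIBO.AcademicArticle))
g.add((article, DCTERMS.title, Literal("An example annotated paper")))
g.add((article, DCTERMS.references, protein))

# Serialise as Turtle: three triples that any other system can link to.
print(g.serialize(format="turtle"))

The point is not the particular vocabulary but the shape: once a statement is a triple built from resolvable identifiers, anyone else’s data can link to it, which is precisely why no one can now predict what will need to link to what.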
Two other aspects were intriguing. Utopia Documents had some funding from the European Commission, the EPSRC, the BBSRC, the University of Manchester and, above all, the BJ’s publisher, Portland Press. One expects the public bodies to do what they should be doing with the taxpayer’s cash: one respects a small publisher putting its money where its value is. And in another session, on the semantic web collaboration between the European Respiratory Society and the American Thoracic Society, felicitously called “Breathing Space”, we heard that the collaborators created some 30% of the citations in respiratory medicine, and that their work had the effect of “helping their authors towards greater visibility”. Since that is why the industry exists, it would seem that the semantic promise underpins the original publication promise. Publishers should be creating altars for the veneration of St Tim Berners-Lee and dedicating devotions to the works of Shadbolt and Hall, scholars of Southampton.
Sadly they are not, but coming out of this day of intense knowledge sharing one could not doubt that the semantic web, aka Linked Data, had arrived and taken up residence these several years in scientific academe. Now if it will only bite government information and B2B, then we shall be on our way. And, as Leigh Dodds of Talis reminded us, we shall have to learn a new language on that way. Alongside new friends like ontologies and entity recognition and RDF, add RDFa, SKOS (the Simple Knowledge Organization System, to you!), XCRI education mark-up, OpenCalais (go to Thomson Reuters for more), triples, Facebook Open Graph, and Google Rich Snippets. Even that wonderful old hypertext heretic Ted Nelson got quoted later in the day: “Everything is deeply intertwingled”. And let’s remember, this is not a “let’s tackle these issues at our own pace when we think the market is ready” sort of problem: it is a “we are sinking under the weight of our own data and the lifeboat was needed yesterday” sort of problem. Publishers must tackle it: if we learn how to resolve it without intermediaries, then we certainly shall not need publishers.
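And for those meeting SKOS for the first time, a toy fragment in the same vein, again in Python with rdflib; the concept names and URIs are invented for illustration, not anything the “Breathing Space” collaborators actually published.

# A toy SKOS fragment: two concepts and a broader/narrower relation,
# the core of any SKOS vocabulary. Names and URIs are hypothetical.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/concepts/")  # invented vocabulary base

g = Graph()
g.bind("skos", SKOS)

# Two concepts with human-readable labels...
g.add((EX.RespiratoryMedicine, RDF.type, SKOS.Concept))
g.add((EX.RespiratoryMedicine, SKOS.prefLabel, Literal("Respiratory medicine", lang="en")))
g.add((EX.Asthma, RDF.type, SKOS.Concept))
g.add((EX.Asthma, SKOS.prefLabel, Literal("Asthma", lang="en")))

# ...and the hierarchical relation that makes them a knowledge system.
g.add((EX.Asthma, SKOS.broader, EX.RespiratoryMedicine))

print(g.serialize(format="turtle"))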