May 4
A Stroll in the Tiergarten
Filed Under Big Data, Blog, eBook, eLearning, Industry Analysis, internet, mobile content, Publishing, social media, STM, Uncategorized
It was almost May: the asparagus just arriving and the rhubarb at its best. This can only be the backdrop for the annual Publishers Forum in Berlin, now celebrating its 12th year and consistently performing as the focus for publishing discussion in central Europe, celebrating too the global view Europeans now take of publishing in all its forms and marketplaces. This show is put on by Klopotek for the industry it serves, a service that its industry should appreciate. With some 260 delegates from Germany and central Europe, that appreciation certainly seems to be in place. This year's theme, "How to Reconstruct Publishing: Competing Visions, Channels and Audiences", was the first under the direction of Dr Ruediger Wischenbart, but was as typically challenging as ever. A real debate about where we are going is still hard to find.
In a typically stirring piece in the Scholarly Kitchen this week (http://scholarlykitchen.sspnet.org/2015/05/04/the-half-life-of-print/), Joe Esposito made the point that whenever we debate the future of publishing someone stands up and asks about the future of the book. I agree with him, and I find this as annoying and pointless as he does. Quite apart from the fact that print has disappeared in very many contexts in society, the digitally networked world releases us from this fruitless debate with the promise of being able to deliver anything to anyone at the point of use in their preferred medium. Ergo, print will survive where people value it and disappear entirely where they do not – yellow pages, trade magazines, academic journals, newspapers…? Well, you see what I mean. Joe makes the point that digital publishing has not yet been kind to coffee-table art books, so I was interested to hear Rolf Grisebach, CEO at Thames and Hudson, give one of the opening keynotes in Berlin.
His not-unreasonable argument turned on the large file sizes involved and the lack of any decisive advantage in image viewing that digital currently offers users of art books. In last week's piece in this place I pointed to the virtual reality benefits of displaying architecture online, as practised by the New York Times. I would like art publishing that allowed me to focus on the eyes of the artist and then move through a slide show of Rembrandt's self-portraits in chronological order. I would like a virtual reality tour of Christopher Wren's architecture. I have the Waste Land app on my iPad, and I am a customer for new approaches to valuing art, literature, architecture and music in a digital age. Here I think we can do more, though I was very grateful to Rolf for re-awakening memories of his company's founder, Walter Neurath, and for reminding me that the company is named for the rivers of its two founding cities, London and New York.
In some ways there was more comfort for the progressives in the next keynote, from Jacob Dalborg, the CEO of Bonnier Books. Here was an integrated vision which sounded like an investible business plan on the one hand, while stressing the way the digital world makes marketing to niches potentially more profitable than ever before. Any session that hammers home the need to build and exploit metadata, and to expand metadata values, must be of prime importance today. With global standards expertise on the agenda (Graham Bell, Director of EDItEUR), this conference could hardly be accused of ducking the issue, but I still feel that we see this as "marketing utilities", and it always gets sidelined when we talk "creativity". Well, if you want to create markets there is no more important subject, and it was good to see Jacob Dalborg underlining it.
This conference does bilingual brilliantly, but it also does breakout sessions that create wonderful debate while meaning I lose some agenda items. Thus I really wanted to hear "Publishing goes Pop": instead I moderated a session with a small group in which a very valuable discussion took place. Across the table were an Open Access STM publisher from Poland and a consumer publishing marketing executive from Germany. The others at the table were left to listen as these two set out to demonstrate the parallels in their very different specialities, effectively drawing together the themes of the conference. This was the antidote to any idea that publishing is pulling apart. Indeed, at the end of this I was convinced that the digital network is helping publishing of all types re-focus on the user, and on services to the user, in a way to which, in the world of physically formatted publishing, we could only pay lip service.
And of course we had some technology, but it is now noticeable that we do not talk "tech" to these audiences at all. Matt Turner, CTO at MarkLogic, talked about flexibility, about the speed of new product generation and, in this agenda, about putting content and context into action. It remains a surprise to many of us that publishers set so much value, understandably, on creative content, while according such reduced value to the contextual data about customers and how they use content in general, and their own content in particular. Meanwhile, Steve Odart of IXXUS moved us into a consideration of how we run our businesses and how we innovate, taking the Agile project management philosophy away from tech and into the business as a way of working creatively in digital marketplaces.
Two days and we did not even get a stroll in the park – though perhaps that was what we enjoyed in the sort of company which is thinking seriously, not about the book, but about where publishing goes now.
Mar 16
Data is a Commodity, Analytics is not a Solution
Filed Under B2B, Big Data, Blog, data analytics, Financial services, Industry Analysis, internet, Publishing, Search, Thomson, Uncategorized, Workflow
This will only get worse. The latest announcement from the Thomson Reuters GFMS service, the premier data analytics environment around gold and silver, indicates that their copper commodity service on Eikon now moves from mining-company to mine-by-mine performance. "It all adds another data-rich layer of fundamental research to our customers' copper market analyses", says their head of research. And there, in that line, we have a "fundamental" issue that lies behind the torrent of announcements we see in the B2B sector at the moment. Think only of Verisk buying Wood Mackenzie last week at a price (17x EBITDA) which went well beyond the expectations of counter-bidders like McGraw Hill, and which shocked private equity players who relish the data sector but find it hard to imagine a multiple beyond 12x. The question is this: risk management and due diligence are vital market drivers, but they are data-insatiable; any and all data that casts light on risk must be included in the process; it is the analysis, especially predictive analytics, which adds the value. So who will own the analytics – the data companies, the market intermediaries (Thomson Reuters, Bloomberg etc.), or the end-user customers?
Those of us who come from the content-driven world – out in force at Briefing Media's splendid Digital Media Strategies event last week in London – find this understandably hard to accept, but our biggest single threat is commoditization. Even more than technology disruption, to which it is closely related, data commoditization expresses the antithesis of those things upon which the content world's values were built. When I first began developing information services, in pre-internet dial-up Britain, we spoke lovingly of "proprietary data", and value was expressed in intellectual property that we owned and which no one else had. For five years I fought alongside colleagues to obtain an EU directive on the "Legal Protection of Databases", so it is in a sense discouraging to see the way things have gone. But it is now becoming very clear, to me at least, that the value does not lie in the accumulation of the data: it lies in the analytics derived from it, and even more in the application of those analytics within the workflow of a user company as a solution. Thus if I have the largest database of cowhide availability and quality on the planet, I now face clear and present danger. However near-comprehensive my data may be, and whatever price I can command now in the leather industry, I am going to be under attack in value terms from two directions: from very small suppliers of marginal data on things like the effect of insect pests on animal hides, whose data is capable of rocking prices in markets that rely on my data as their base commodity; and from the analytics players who buy my data under licence but resell the meaning of my data to third parties, my former end users, at a price level that I can only dream about.
And those data analytics players, be they Bloomberg (who in some ways kicked off this acquisition frenzy five years ago when they bought Michael Liebreich's New Energy Finance company) or others, must look over their shoulders in fear of the day when the analytics solutions become an end-user app.
So can the data-holding company fight back? Yes, of course: the market is littered with examples. In some ways the entire game of indexation, whereby the data company creates an indicative index as a benchmark for pricing or other data movement (and as a brand statement), was an attempt to do just that. Some data companies have invested heavily in their own sophisticated analytics, though there are real difficulties here: moving from that type of indicative analytics to predictive analysis shaped as a solution to a specific trader's needs has proved very hard. Much easier was the game of supplying analysed data back to the markets from which it originated. Thus the data created by Platts or Argus Media, and the indexation applied to it, have wonderful value to Aramco when pricing or assessing competitive risk. But in the oil trading markets themselves, where the risk is missing something that someone else noted, analysts have to look at everything and tune it to their own dealing positions. Solutions are changing all the time, and rapid customization is the order of the day.
Back out on the blasted heath which once was B2B magazine publishing, I kept meeting publishers at DMS who said “Well, we are data publishers now”. I wonder if they really understand quite what has happened. Most of their “data” can be collected in half an hour on the Open Web. There is more data in their domains free on DBpedia or Open Data sources than they have collected in a lifetime of magazine production. And even if they come up with a “must have” file that everyone needs, that market is now closing into a licensing opportunity, with prices effectively controlled, for the moment, by those people who control the analytics engines and the solution vending. Which brings me back to Verisk and the huge mystery of that extravagant pricing. Verisk obviously felt that its analytics would be improved in market appearance by the highly respectable Wood Mackenzie brand. Yet if a data corner shop, let alone Platts or Argus Media, were to produce reporting and data that contradicted Wood Mackenzie, anyone doing due diligence on their due diligence would surely demand that Verisk acquire the dissenting data and add that to the mix? If data really is a commodity business, far better to be a user than an owner.