Do you hear that slightly soggy, slushy sound, followed by a low moan? That is the sound of a whole industry falling on its sword. The signs are everywhere. Here is W H Smith in the UK, banning self-published books because it cannot see which ones are pornographic and which are not (http://goodereader.com/blog/electronic-readers/whsmith-boycotts-self-published-authors). Remember when booksellers used to be able to read? And over here, in my favoured B2B zone, I came across a suggestion in the building and construction information marketplace last week that, to quote my correspondent, “it is not the publisher’s job to produce data for automated building processes; builders have to learn that for themselves”!

As you may imagine, this lit my blue touch paper. In the user-centric world of the network, it will probably become a capital crime to tell the user what he should do – but for me it will always be a criminal offence to stop short, or define the publishing role so that it stops short, of going to the very last point of user satisfaction. Our obligation in every instance must surely be to create the ultimate in end user satisfaction in order to prevent third parties from inserting themselves between us and our customers, and removing our value-add leverage. I do not much mind if in the process we become a content-to-software company – or even a software company – since I see less risk in this than in the removal of our direct-to-client relationships and our ability to raise prices and margins through value add and enhanced client satisfaction. And I refuse to believe that the construction industry, as it slowly unfreezes from recession in Europe, is any different from any other vertical market. I have talked in these columns about the aircraft industry, the workflow of science, medical diagnostic systems, legal practice work-engines and similar developments in the auto industry and elsewhere.

So you cannot find a vertical market without finding automation of basic functions, workflow modelling, data analysis and predictive analytics, and machine-to-machine communications. And when I made my phone call I was looking at the German and Nordic construction markets, since there has been interest recently in Docu Group, a major player combining the former Bau Verlag companies of Springer with the ByggFacta companies that I well recall from Thomson. And I was surprised by how little reaction there has been amongst the European construction information services to the arrival of BIM (Building Information Modelling). And my call was intended to check out whether this was true as well in the UK. It appears that it was, and that no one was very willing to invest in doing anything about it.

This may be linked to the deep recession in the building industry, and it certainly does not mean that the companies concerned do not have the data. While, for some curious reason, they remain publishers of magazines, all the players I looked at – EMAP, Bau, (former) UBM, Byggfacta – had familiar collections of data on new building starts, on materials and labour pricing, and on regulation, and good services for bid-monitoring. They showed no sign of collecting public or third party data, enriching it or building knowledge stores of any sort. And yet if you look at the US websites of Hanley Wood, Reed Construction or the McGraw-Hill services you see an acute awareness of the importance of predictive data modelling and an anxiety to help customers use it. This I find almost completely absent in Europe. Yet in a networked society the architect, the builder, the contractor, the engineer, and the property developer are working cheek-by-jowl. It is in everyone’s interest to ensure, through predictive data modelling, that what is planned will work, within its budget, before physical work begins. And we are not short in Europe of institutions like the Institute for Applied Building Informatics in Munich to tell us how to do this. So why not?
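To make the idea concrete, here is a minimal sketch in Python, with entirely invented figures, of the simplest form of predictive check described above: estimating, before any physical work begins, how likely a planned build is to come in within its budget when the individual cost items are uncertain.

```python
# A minimal sketch (hypothetical figures) of pre-construction budget risk:
# treat each line item as an uncertain cost and simulate the total many times.
import random

# Each line item: (name, expected cost in EUR, relative uncertainty)
line_items = [
    ("groundworks", 250_000, 0.20),
    ("structure",   600_000, 0.15),
    ("envelope",    350_000, 0.25),
    ("services",    400_000, 0.30),
    ("fit-out",     300_000, 0.35),
]
budget = 2_000_000
runs = 10_000

def simulate_total() -> float:
    """Draw one plausible total cost, treating each item as normally distributed."""
    return sum(max(0.0, random.gauss(mu, mu * sigma)) for _, mu, sigma in line_items)

overruns = sum(1 for _ in range(runs) if simulate_total() > budget)
print(f"Estimated probability of exceeding budget: {overruns / runs:.1%}")
```

Real BIM-driven modelling is of course far richer than this toy Monte Carlo run, but even at this level the value lies in the data: someone has to supply the cost, materials and regulatory inputs that make the prediction worth trusting.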

My conclusion at the end of my phone call was that many in the information industry went as far as “research” and then stopped dead. It is almost as if by assembling data and allowing users to search it, the publishers satisfied all processing requirements, and that the role of the industry, at least in Europe, stops there. Well, said my colleague, AutoCAD will do all that when it’s needed, and the big global players will build their own, and the small builders aren’t interested, and anyway our events are where we make our money and we are going to concentrate on those. And much of this may be true (indeed, Bau runs excellent technical conferences that train builders to use innovations that they themselves play no part in providing). But to me it all sounds like elements in a collective death wish. If AutoCAD are to do it (and I have not a clue as to where their ambitions lie), who will be their partners and marketeers in European countries? Who can supply the data attached to trusted brands?

There can be no doubt that the European construction industry will revive. Negotiating the streets in central London closed by construction work last week, it was hard not to think that it had done so already. But will the European construction information industry recover in time to be any use other than as a legacy data provider? And while this appears to me to be acute in construction, is it also the case in other vertical markets? Recovery depends upon re-investment and new partnerships. Where are they? And if they are not in place, then those trying to sell untransformed B2B players with ancient lineage but no leverage may find the prices offered hugely disappointing.

The great Book Messe is the autumn opener for me. As showers and brown leaves gust across the huge fairground, I seized the hour for bier and knackwurst, and contemplated the future in the light of a kindly young woman having stood up on the Pendelbus to allow my ancient self, lame of leg and rheumy of eye, to sit down. Having concluded that this was evidence that the human race has hope, I had another bier.

Yet despite this blip, the highlight of the fair for me was not a book or a party, though plenty of both were in evidence, but an interesting conversation with a group who really knew what they were talking about at one of the International Friday conference sessions. Here, in the Production stream (how very strange it now seems to call data part of “Production”!) we did a session on Discoverability and Metadata. As speakers we had Jason Markos, Director of Knowledge Management at Wiley, to get us started, followed by Timo Hannay, CEO of Macmillan Digital Science; Dr Sven Fund, CEO of De Gruyter; and, to keep us technologically honest, Andreas Blumauer, CEO of Vienna’s Semantic Web Company. So, a mass of talent from whom came massive elucidation of what I take to be a critical developmental issue for STM today and the rest of the information marketplace tomorrow: the problem of knowledge, and the question of whether, once we have solved the knowledge problem, we ex-publishing groundlings will still be needed.

Jason got us afloat in a very admirable way. As we move from a world of documents and segments of former documents (book, journal and article moving to lower levels of granularity – abstract, reference, citation) – so we eventually recognize that entity extraction and text enrichment become ways of interconnecting thoughts and ideas horizontally in knowledge structures that represent the discovery of new insights that were not effectively available in the years when we were searching documents for word matches. Once we are underlining meaning with a range of possibilities and allowing semantic web analysis to use knowledge of meaning in context to illuminate underlying thinking (along with what is not on the page but is implied by what we have read or written), then we are into a Knowledge game which moves past content and beyond data and into some very new territory.
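A toy illustration may help here. The sketch below, in Python and with an invented mini-vocabulary, shows the crudest version of the move Jason described: instead of matching words in documents, we extract known entities and record their relationships as machine-readable statements (subject, predicate, object) that a knowledge structure could then build on. Real pipelines would of course use trained entity recognizers, ontologies and genuine semantic enrichment rather than this dictionary lookup.

```python
# A deliberately crude sketch of entity extraction and knowledge statements.
# The vocabulary and sentences are invented for illustration only.
from itertools import combinations

ENTITIES = {
    "aspirin": "Compound",
    "cyclooxygenase": "Protein",
    "inflammation": "Condition",
}

def extract_entities(text: str):
    """Return the known entities mentioned in a piece of text."""
    return [(name, kind) for name, kind in ENTITIES.items() if name in text.lower()]

def co_occurrence_triples(text: str):
    """Link every pair of entities found in the same sentence with a weak
    'mentioned_with' relation - the simplest possible knowledge statement."""
    triples = []
    for sentence in text.split("."):
        names = [name for name, _ in extract_entities(sentence)]
        for a, b in combinations(names, 2):
            triples.append((a, "mentioned_with", b))
    return triples

abstract = "Aspirin inhibits cyclooxygenase. The effect reduces inflammation."
for triple in co_occurrence_triples(abstract):
    print(triple)
```

Even at this trivial level the output is no longer a document to be read but a set of assertions to be connected, queried and reasoned over – which is precisely where the new value, and the new anxiety, lies.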

Companies like Wiley and Macmillan and Elsevier and Springer will exploit this very effectively using their own content. In disciplines like chemistry, building knowledge stores and allowing researchers to archive both effective and failed discovery work will become commonplace. Extended metadata will take us beyond the descriptive towards recording analytics and following knowledge pathways. People like Timo will create the knowledge-based infrastructure that allows this to become a part of the workflow of science. Sven will keep our feet on the ground by ensuring that we do not try to sell concepts before users are ready, and Andreas will help us to master the disciplines of the semantic web – and then, just as I was padding round the audience with a microphone picking up some really interesting questions, our little theatre was over, we could strut and fret no more and the audience could escape from Frankfurt’s economy drive of the year – no wifi in conference spaces!

So I was left on the Pendelbus and under the biergarten tarpaulin to ponder the impact of all of this. In the self-publishing future, when scholars publish straight to figshare and F1000 does post-publication peer review, the data to be organized will have to be collected. Indeed, current Open Access has already begun the fragmentation. As knowledge structures grow, some scholars will demand that, except in extreme circumstances, they never see primary text, but work only on advanced, knowledge-infused metadata. Further, that metadata will have to cover everything relevant to the topic. Will the lions and lambs of Elsevier and Wiley, Springer and Macmillan, lie down with each other and co-operate, or will external metadata harvesting become a new business, over the heads of the primary content players? And will it be professional bodies like the ACS or the RSC who do this – or technology players? Knowing where everything is, being able to visualize its relationships with everything else, and being able to organize universal knowledge searching may be a different business from the historical publishing/information model. And the metadata that matters in this context? Who owns it may be a matter of the context in which it is collected. Certainly owning the underlying content would seem to give that owner very few rights to ownership of metadata derived from enquiries and research that included it, yet here I predict future Sturm und Drang as publishers seek to own and control the extension of metadata into the knowledge domain. And if these are autumnal topics, what can winter be like when it comes?
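As a closing footnote on what “external metadata harvesting over the heads of primary content players” might look like in its simplest form, here is a hedged sketch against the public Crossref REST API, which already aggregates basic article metadata across publishers. The endpoint and field names reflect Crossref’s JSON as I understand it and should be treated as assumptions to verify, not as an endorsement of one source over another.

```python
# A hedged sketch of cross-publisher metadata harvesting via the public
# Crossref REST API. Field names ("DOI", "title", "publisher") are assumptions
# about Crossref's JSON schema and should be checked against its documentation.
import requests

def harvest(query: str, rows: int = 5):
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    for item in resp.json()["message"]["items"]:
        yield {
            "doi": item.get("DOI"),
            "title": (item.get("title") or ["(untitled)"])[0],
            "publisher": item.get("publisher"),
        }

if __name__ == "__main__":
    for record in harvest("building information modelling"):
        print(record["publisher"], "|", record["title"], "|", record["doi"])
```

Nothing in a harvest like this depends on owning the underlying articles, which is precisely why the ownership of extended metadata will be fought over.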
