I blinked at today’s announcement with incredulity. Nielsen Expositions sold to private equity for $950m? (http://www.followmag.com/2013). Where does this madness end? Since 2008 we have been living, in traditional B2B markets, with the reality of the network. We have all talked increasingly confidently about the irreversible decline of advertising in print, and about our inability to replace it satisfactorily online. We have talked of companies getting smaller – but more profitable – and we have talked about the future in terms of creating workflow solutions for our customers, using our data to create those service solutions for them, and using our metadata as the sandbox of new product development to build applications that really bind customers to us. The opportunity is now open to us to lead our markets into the future, basing our claim to our clients squarely on the proposition that we can improve their productivity (and thus cut costs) and enhance their decision-making by getting all the salient knowledge into the right framework at the right time, while protecting their backs against the thorn hedge of re-regulation that encroaches on the post-recession world. This is a wonderful opportunity, and how good it is to see Thomson Reuters, Reed Business, Lexis Risk and others getting fully to grips with it.

Meanwhile, how sad it is to see the old B2B players in Europe dodging the inevitable. While Schibsted and Axel Springer, in the declining newspaper market, now make a fetish of collecting B2B classifieds services (Reed sold Total Jobs to the latter very shrewdly), mainstream B2B in the UK, outside of the market leaders mentioned, seems to have something of a collective death wish at the moment. Like Gaul, EMAP is in three parts, each of them unsaleable as it stands: the data business is too diverse, the exhibitions business too small, and the magazines too unprofitable. Over at UBM, they now talk the language of exhibitions and conferences as if these were the golden hope. B2B at Informa remains a collection of fragmented and unrelated businesses, which was how management historically wanted things, but which now ignores the need to centre on data and to play the combined strength of all that data into the key markets it wants to grow. And if Datamonitor does not provide a rich way of enhancing service values across the group, then what does? Meanwhile Incisive and Haymarket seem to groan for solutions, while only Centaur among the smaller players seems to have woken up and smelt the coffee.

I am reciting this doleful catalogue as a way of steeling myself for this week’s PPA Conference in London. What would make me most happy is hearing someone say: “Yes, we are reinvesting in our events portfolio through a transformative agreement with a software partner. The object is to build readership into virtual events, extending our conferences and exhibitions into year-long happenings, open 24/7. Yes, we know we have to give attendees at real events more: find out what they want, research and book meetings for them, and so on, while giving exhibitors a better deal, client introductions and profiles, and a year-long follow-up with new product releases and regular contact. Yes, we know that, even if it is almost too late, we need to build community urgently before we finally lose the chance, and we know that conference delegates, exhibition attendees and exhibitors all want a better deal. Not several better deals – just the one will be good enough.”

I was once, briefly, non-executive chairman of an events software company. I know that rapid development has taken place to assemble data, match buyers and sellers, set up itineraries and update core data holdings with key changes year by year. And I go to about 15 conferences and exhibitions each year, yet I have never been asked whom I wanted to meet, or what I wanted to get out of the experience. Afterwards, however, I am deluged with surveys about what I accomplished and how good the show was. This seems to me quite upside down. Like most of my fellow citizens, I am well known in the network: find me on LinkedIn or Twitter and you could even guess, from my friends and contacts, who else I might like to meet. UBM bought the rights to the reality-failed COMDEX and launched a virtual exhibition in November 2012. It attracted an audience that seemed to please UBM, but on the website I see no mention of a 2013 edition, or even of a web presence continuing from the last effort. And last year’s registration asked none of the questions that might be thought relevant to using the meeting effectively. Yet, as I have mentioned here before, if virtual reality is cheap enough to teach language learners spoken English proficiency (www.rendezvu.com), then it will surely sustain the 5,000 visitors and 50 exhibitors who came last year. Or will it simply slip away, as London’s Online show has slipped back into a library conference in the hands of Incisive?
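The matchmaking itself is hardly rocket science. Here is a minimal sketch, in Python, of what registration-time matchmaking might look like: attendee records carrying declared interests and public contacts (gathered, with permission, from a LinkedIn or Twitter profile, say), ranked by overlap. Every name, field and weighting below is an illustrative assumption of mine, not any event vendor’s actual system.

```python
# Illustrative sketch only: ranking fellow attendees by shared interests
# and mutual contacts. Data model and scoring are invented for this example.
from dataclasses import dataclass, field


@dataclass
class Attendee:
    name: str
    interests: set[str] = field(default_factory=set)   # declared at registration
    contacts: set[str] = field(default_factory=set)    # public network, with permission


def suggest_meetings(me: Attendee, others: list[Attendee], top_n: int = 5):
    """Rank other attendees by overlap with my interests and contacts."""
    def score(other: Attendee) -> int:
        shared_interests = len(me.interests & other.interests)
        mutual_contacts = len(me.contacts & other.contacts)
        return 2 * shared_interests + mutual_contacts   # interests weighted higher

    ranked = sorted(others, key=score, reverse=True)
    return [(o.name, score(o)) for o in ranked[:top_n] if score(o) > 0]


if __name__ == "__main__":
    me = Attendee("Visitor", {"open access", "metadata"}, {"alice", "bob"})
    exhibitors = [
        Attendee("Exhibitor A", {"metadata", "workflow tools"}, {"bob"}),
        Attendee("Exhibitor B", {"print advertising"}, set()),
    ]
    print(suggest_meetings(me, exhibitors))   # Exhibitor A scores highest
```

Nothing here is difficult; the point is simply that the questions have to be asked before the show, not after it.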

So I am worried by what I will find at the PPA. Meanwhile, virtual reality is being used intensively in other places – particularly in the cash-starved museums and art galleries of Europe. Maybe our publishing directors should organise an outing to the local resource to see how it’s done!

A sudden thought. Doing an interview with some consultants yesterday (we are fast approaching the season when some major STM assets will come back onto the market), I was asked where I had estimated Open Access would be by now when I advised the House of Commons Science and Technology Committee back in 2007 on the likely penetration of this form of article publishing. Around 25%, I answered. Well, responded the gleeful young PhD student on the end of the telephone, our research shows it to be between 5% and 7%. Now, I am not afraid of being wrong (like most forecasters, I have plenty of experience of it!). But it is good to know why, and I suspect that I have been writing about those reasons for the last two years. Open Access, defined around the historic debate twixt Green and Gold, when Quixote Harnad tilted at publishers waving their arms like windmills, is most definitely over. Open is not, if by that we begin to define what we mean by Open Data, or indeed Open Science. But Open Access is now simply open access.

In part this reflects the changing role of the Article. Once a place of publisher solace as the importance of low-impact journals declined, it is now the vital source of the things that make science tick: metadata, data, abstracting, cross-referencing, citation, and the rest. It is in danger of becoming the rapid act at the beginning of the process which initiates the absorption of new findings into the body of science. Indeed some scientists (Signalling Gateway provided examples years ago) prefer simply to have their findings cited, or to release their data for scrutiny by their colleagues. Dr Donald Cooper of the University of Colorado, Boulder, used F1000Research to publish a summary of data collected in a study that investigated the effect of ion channels on reward behavior in mice. In response to public referee comments he emphasised that he published his data set in F1000Research “to quickly share some of our ongoing behavioral data sets in order to encourage collaboration with others in the field”. (http://f1000.com/resources/Open-Science-Announcement.pdf)

I have already indicated how important I think post-publication peer review will be in all of this. So let me now propose a four-stage Open Science “publication process” for your consideration:

1. The research team assembles the paper, using EndNote or another process tool of choice, but working in XML. They then make it available on the research programme’s or university’s repository, alongside the evidential data derived from the work.

2. They then submit it to F1000 or one of its nascent competitors for peer review, at a fee of $1000. This review, conducted over a period defined by the authors, will throw up queries, even corrections and edits, as well as opinions rating the worth of the work as a contribution to science.

3. Depending upon the worth of the work, it will be submitted or selected for inclusion in Nature, Cell, Science or one of the other top-flight branded journals. These will form an Athenaeum of top science, and will continue to confer all of the career-enhancing prestige that they do today. There will be no other journals.

4. However, the people we used to call publishers and the academics we used to call their reviewers will continue to collect articles from open sources for inclusion in their database collections. Here they will perform entity extraction and other semantic analysis to build what they will claim are the classic environments each specialist researcher needs to have online. They will provide search tools that let users search the collection alone; the collection plus all of the linked data held on the repositories where the original articles were published; or the collection, the data, and every other post-publication-reviewed article and dataset anywhere (a toy sketch of this enrichment step follows below). They will become the Masters of Metadata, or they will become extinct. This is where, I feel, the entity or knowledge stores I recently described at Wiley are headed. This is where old-style publishing gets embedded into the workflow of science.
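If I were to sketch what the enrichment in stage 4 might look like in code, it would be something like the toy below: a dictionary-based entity extractor bundling its findings into a metadata record that links back to the repository copy of the article and its data. The entity names, types and record fields are my own illustrative assumptions; a real pipeline (from TEMIS, AlchemyAPI or anyone else) would use trained models and curated vocabularies, not a hand-made lookup table.

```python
# Toy entity extraction and metadata enrichment, illustrating stage 4 above.
# The dictionary, entity types and record fields are invented for this sketch.
import re

ENTITY_DICTIONARY = {
    "ion channel": "MolecularComponent",
    "reward behavior": "Phenotype",
    "mice": "ModelOrganism",
}


def extract_entities(text: str) -> list[dict]:
    """Return entity mentions found in the text, with their assigned types."""
    found = []
    for term, entity_type in ENTITY_DICTIONARY.items():
        if re.search(re.escape(term), text, flags=re.IGNORECASE):
            found.append({"term": term, "type": entity_type})
    return found


def build_metadata_record(article_id: str, abstract: str, data_ref: str) -> dict:
    """Bundle extracted entities with links back to the source repository."""
    return {
        "article_id": article_id,                 # repository copy from stage 1
        "entities": extract_entities(abstract),
        "linked_data": [data_ref],                # evidential data from stage 1
        "review_status": "post-publication",      # outcome of stage 2
    }


record = build_metadata_record(
    "repo:2013/0042",                             # placeholder identifier
    "Effect of ion channels on reward behavior in mice.",
    "data:example-dataset",                       # placeholder data reference
)
print(record)
```

The value, of course, lies not in the extraction itself but in doing it consistently across millions of articles and linking the results back to the underlying data.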

So here is a model for Open Science that removes copyright in favour of CC licences, gives scope for “publishers” to move upstream in the value chain, and lets them compete increasingly in the data and enhanced-workflow environments where their end-users now live. The collaboration and investment announced two months ago between Nature and Frontiers (www.frontiersin.org), the very fast-growing Swiss open access publisher, seems to me to offer clues about the collaborative nature of this future. And Macmillan Digital Science’s deal on data with SciBite is another collaborative environment heading in this direction. In all truth, we are now surrounded by experimentation and by the tools to create more. TEMIS, the French data analytics practice, has an established base in STM (interestingly, its US competitor, AlchemyAPI, seems to work mostly in press and PR analysis). But if you need evidence of what is happening here, then go to www.programmableweb.com and look at the listings of science research APIs. A new one this month is the BioMortar API, offering “standardized packages of genetic patterns encoded to generate disparate biological functions”. We are at the edge of my knowledge here, but I bet this is a metadata game. Or take ScholarlyIQ, a package (endorsed by AIP) to help publishers and librarians sort out what their COUNTER stats mean; or the ReegleTagging API, designed for the auto-tagging of clean energy research; or, indeed, the OpenScience API, Nature Publishing’s own open access point for searching its own data.
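For what it is worth, consuming one of these research APIs usually looks much the same from the developer’s side: an HTTP call that returns tags or entities as JSON. The endpoint, parameters and response fields in the sketch below are placeholders of my own invention, not the documented interface of ReegleTagging, ScholarlyIQ, BioMortar or Nature’s OpenScience API.

```python
# Generic illustration of calling an auto-tagging web API; the URL and the
# JSON shapes are placeholders, not any real provider's documented interface.
import requests


def tag_document(text: str) -> list[str]:
    """Send a document to a (hypothetical) tagging endpoint and return its tags."""
    response = requests.post(
        "https://api.example.org/v1/tag",          # placeholder endpoint
        json={"text": text, "vocabulary": "clean-energy"},
        timeout=30,
    )
    response.raise_for_status()
    return [tag["label"] for tag in response.json().get("tags", [])]


if __name__ == "__main__":
    # Would only succeed against a real service; shown for shape, not for running.
    print(tag_document("Grid-scale storage for intermittent solar generation."))
```

The interesting competition is not in this plumbing, which is trivial, but in the vocabularies and the metadata that sit behind it.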

And one thing I forgot. Some decades ago, I was privileged to watch one of the great STM publishers of this or any age, Dr Ivan Klimes, as he constructed Rapid Communications of Oxford. Then our theme was speed. In a world where conventional article publishing could take two years, by using a revolutionary technology called fax to work with remote reviewers, he could do it in four months. Dr Sam Gandy, an Alzheimer’s researcher, is quoted by F1000 as saying that his paper was published in 32 hours, and they point out that 35% of their articles take less than 4 days from submission to publication. As I prepare to stop writing this and press “publish” to instantly release it, I cannot fail to note that immediacy may be just as important as anything else for some researchers – and their readers.
