We who dare to call ourselves “analysts” of information marketplaces must bear a heavy burden of guilt. As we analyse we are forced to categorize, as we categorize we over-simplify, and as we over-simplify we construct truly skewed pictures of market activity, sometimes seized upon by sectoral groups to make sectional points. One frustrating exemplar of this for me was always academic spending: industry strategists sought to get a grip on what content access universities bought, while many publishers sought to narrow the focus to books and journals, thus omitting entirely the huge sums spent on data and flattering their own share of a smaller “marketplace”.

Similar contortions exist around education and training. Almost every publisher I know is definably “educational” (a context, not a category), and, of course, every vertical marketplace devotes considerable amounts to education in formal and informal contexts. This implies a huge marketplace which is really hard to categorize, so we define it by identifying a sub-group of players whose interests are wholly or mostly educational, and let them stand as a proxy. This may serve for some trend-analysis purposes, but it is essentially arbitrary. I was very amused yesterday when one of our finest UK educational publishers told me that she could discern market-share moves “from the EPC figures”. The EPC (Educational Publishers Council), an offshoot of the UK’s Publishers Association, collects data from its members, and thus records the sales of school books and software from sources that define themselves as publishers. It therefore captures only a small proportion of school spending on educational content, and one whose trend line may or may not reflect buying patterns in schools.
Turn to training and the problem is magnified many times: trainers and corporates buy what suits them, whatever the source, and increasingly training and assessment environments are built into information products and services that might ostensibly be about a wholly different process or workflow.

For these reasons amongst others I read Joe Esposito’s two pieces on “Creating a New University Press” on Scholarly Kitchen (http://scholarlykitchen.sspnet.org/author/jesposito/) this week with great relish. Much of what he says applies to starting any form of new “press” (glorious anachronism there!). He clearly feels that what he describes as a “library publishing” model is viable, and I can see that this would make sense of things like institutional repositories, which are surely currently collections without purpose for the most part. But I am wholly on his side when he says that the New University Press “sits outside of any particular institutions, is born-digital, avoids areas where established publishers have staked out territory, experiments with publishing forms and distribution channels and is NFP (not for profit).” And his examples – PLoS, JSTOR, the Humanities eBook programme at ACLS, and OCLC – while having market impact on spending, are very hard to categorize and may or may not be in anyone’s categorization scheme. Yet they can certainly be categorized amongst the “university presses of the future”, and in some instances, like PLoSPlus, could have a real trend impact on market development and the future of cherished publishing shibboleths like peer review.

And then again, Joe’s piece caught my attention in another sense. I have been reading an excellent review article on Linked Data co-authored by Tom Heath of Talis. When Joe pinged Morgan and Claypool (http://www.morganclaypool.com/) as an example of what he was talking about, I found that they were the publishers of this piece. I also found that what I had was part of a really interesting born-digital publishing programme, creating sector reviews in a wide range of academic disciplines as Synergies and Colloquium, and bundling them for license to faculties or individual students. This does not need to be peer-reviewed – the publisher’s process and the reputation of the contributors do that – so the expense is mostly in formatting and editing, and overwhelmingly in marketing. So, having smugly discovered something new to me, courtesy of Joe, I commenced this blog, and my first research established that this company was founded in 2002, in the aftermath of the Dot Com Bust, so they are coming up to 10 years old. That’s a millennium in Gutenberg terms, and I have no doubt that they are still not inside anyone’s statistical net. We may be approaching the time when the new markets that we do not measure are as large as the old ones that we do.

Which is not to say for a moment that old publishers are not innovative. A favourite of mine over the years has been Nature (Macmillan), an offspring of the Victorian love affair with Science. Their event this week was the launch (http://www.nature.com/nature_education/biology.html) of a $45 Principles of Biology digital “textbook”. Real innovation here: this is a rental access model, they will keep it updated, and the pricing is entirely different from the existing print market. And this venture has California State University as both strategic partner and anchor client (so at least one Californian university group still loves Nature!). Downside (in my view): we still have to call this wonderful product built for an iPad world a “textbook” when it is really a learning experience – and one which also allows teacher monitoring. Cal State will block-purchase it and include the cost in tuition fees – and will that get into the market statistics at any point?

So, said my fine educational publisher friend, digital still doesn’t seem to be having much impact on educational marketplaces. To which I must respond, “It depends what you are measuring, but I am certain that wherever you look there is unmeasured digital education – in everything.”

A wise man said that “Content without Technology is lame; Technology without Content is blind”. Einstein was working his way towards this conclusion, but it was in fact Timo Hannay of Macmillan/Digital Science who came out with this formulation during this week’s ePublishing Innovations Forum at the IET in London (http://www.epublishing-forum.com/). Incisive Media, who also do the Online conference and exhibition at Olympia in December, have been doing this Spring meeting for four years, and I have been their privileged chairman for each. So I know the sea change of the past half decade, I know that change just gets quicker, and I know that Timo is fundamentally right and is one of only a handful who are doing something about it. I also see that “publishing”, if it is useful to retain the term, is almost redefined every time we hold this meeting, and that the players making strides in solutioning (ugly term), collaboration and community seem to be mining the seams that have revenues and margins embedded in them.

The conference contained several beautifully worked case studies. Take Timo as an example. His themes are knowledge discovery, research management and software tools (http://www.digital-science.com/). The ability today to read chemical names, turn them into chemical structures and use them to cross-search literature and patent databases is a beautiful expression of what we mean when we say that we have to produce solutions that reduce costs and increase productivity. Tomorrow we will want to take this, and his ability to track and map research patterns and structures, and his investments in experiment and project management systems, and roll them into career-duration, compliance-ready Electronic Lab Notebooks (ELNs). Then a few of us will sit down over a beer and reflect that Elsevier sold the ELN market leader, MDL, almost a decade ago. The circularity of markets is only a wonder to those who have been swept full circle several times!
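That name-to-structure trick can be pictured in miniature. What follows is purely an illustrative sketch – the name-to-SMILES table and the toy literature and patent record sets are invented for the example, and none of it reflects Digital Science’s actual systems – but it shows the principle: once chemical names are normalized to a canonical structure identifier, a single query can cross-search corpora that never shared a vocabulary.

```python
# Illustrative sketch: normalize chemical names to a canonical structure
# identifier (a SMILES string here), then query several corpora at once.
# All data below is invented for the example.

NAME_TO_SMILES = {
    "aspirin": "CC(=O)OC1=CC=CC=C1C(=O)O",
    "acetylsalicylic acid": "CC(=O)OC1=CC=CC=C1C(=O)O",  # synonym, same structure
    "caffeine": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C",
}

LITERATURE = [
    {"title": "Aspirin and cardiovascular outcomes",
     "smiles": "CC(=O)OC1=CC=CC=C1C(=O)O"},
    {"title": "Caffeine metabolism review",
     "smiles": "CN1C=NC2=C1C(=O)N(C)C(=O)N2C"},
]
PATENTS = [
    {"id": "US-1234567", "smiles": "CC(=O)OC1=CC=CC=C1C(=O)O"},
]

def cross_search(name):
    """Resolve a chemical name to its structure, then query both corpora."""
    smiles = NAME_TO_SMILES.get(name.lower())
    if smiles is None:
        return {"literature": [], "patents": []}
    return {
        "literature": [r["title"] for r in LITERATURE if r["smiles"] == smiles],
        "patents": [r["id"] for r in PATENTS if r["smiles"] == smiles],
    }

# Two different names for the same compound hit the same records:
hits = cross_search("acetylsalicylic acid")
```

The productivity gain is in the normalization step: synonyms like “aspirin” and “acetylsalicylic acid” resolve to the same structure, so one search spans sources that index the compound under different names.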

Then let’s take David Craig, who came to the microphone to announce that his Thomson Reuters GRC (Governance, Risk and Compliance) division (http://accelus.thomsonreuters.com/) had, the day previously, finalized the acquisition of World-Check (said on the New York grapevine to be a $530m deal), and was now pushing hard towards the content integration and software services needed to flesh out the complete solutioning picture around regulatory compliance in all its phases. He too speaks the language of collaboration, and now appears to prefer the term “community” to “workflow”. The distinction is interesting and not an idle one. He does not want to build content-injected process models for the individual corporate units that severally and separately do compliance. He wants to build corporate engines that unite functions to get results, so that he is not tied to the future fortunes of compliance officers or finance departments or auditors or corporate counsel or tax advisers, but provides structures in which they all participate, share content and create outcomes. And if that argues for a different culture in the fully networked corporation, he also sees content creation and sharing between corporates, professionals and other participants (especially regulators) which allows risk information to be shared rapidly in the network. Again, the high ground is becoming a universal solution which is so widely plugged in that unplugging threatens the health of the participants themselves.

And then take Donal Smith. The CEO of Data Explorers (http://www.dataexplorers.com/) defined what happens to this type of process in the completely satisfying niche. He showed us how certain types of unregulated content must be collected and analysed to keep markets safe from themselves. In this case the content concerns contracts to “borrow” equity against future equity movements – the activity known as “shorting”. Markets must know what proportion of a company’s equity is already committed, so Data Explorers is a venture of necessity, using user-generated content to create indices which allow markets to work efficiently. Its operating principles are ubiquity and non-exclusivity. Process? Collaboration? It’s all here.

I could go on. I loved the energy in the education sector, with Cambridge University Press and Global Grid for Learning using similar models in the workload of teachers, and Microsoft, in the guise of David Langridge, their education partnerships director, coming from the other direction to position the new Office 365 as the vehicle for content integration in schools. And I am aware that by stopping here I ignore many excellent presentations that followed parallel themes. We did interviews and panels which enabled participants to see these trends at work. We looked at the future of the newspaper with Julian Sambles of the Telegraph and the future of the eBook with Tim Cooper of Harlequin (Mills and Boon). Adriana Lukas, coming from the user side as an advisor to major players like Johnson and Johnson, caused a run on the bar by exploring the powerful virtues of five widely used ad-blockers during the opening of her examination of social media as marketing. Elsewhere we discussed the importance of metadata and even paradata (could be my new word!), and finally Geoff Metzger of Superdu brought us down to earth by revealing marketing technology in a box – how to create instant web presence (without waiting for the IT department) to promote books and services. Back to earth, and back to books, in a voyage that began with Kate Worlock, for Outsell, defining the global marketplace, its growth, strengths and weaknesses and some of these key trends. I can now tell you how it feels to introduce one’s own daughter as a keynote speaker (Wonderful!!).

And there was so much more that I must apologize to those whom I have omitted. I wandered away from the IET (the Institution of Engineering and Technology, appropriately enough) no longer wondering why they changed their name from the Institution of Electrical Engineers. It’s the technology, stupid. And now we cannot do without it.
