Since I last wrote a piece here I am older by three conferences and an exhibition. And no wiser for having spoken twice on cyber-security, a subject that baffles me every time I stand up to talk about it. The simple truth is that the networked world is changing at a bewildering pace, yet our destinations hang before us like a tantalising but currently unattainable vision. Thus, if you ask me about the future of education, I can spin you a glowing tale of individuals learning individually, at their own pace, yet guided along the learning journey laid out by their teachers, who have now become their mentors. The journey is self-diagnostic and self-assessing, examinations have become redundant, and we know what everyone knows and where their primary skills lie. Or, in academic or industrial research, projects are driven by results, research teams are recruited on that basis, and their reputation is scored in terms of the value their peers set on their accomplishments. The results of research are logged and cited in ways that make them accessible to fellow researchers in aligned fields – by loading and pointing to evidential data, by noting results and referencing them on specialized or community sites, or by conventional research reporting. Peer review is continual, since research remains valid until it is invalidated and may rise and fall in popularity more than once. And so on through business domains, medicine and healthcare, agriculture and the whole range of human activity…

But at this point, when I talk about the growing commonality of vision, the role of workflow analysis, RPA, what happens next with machine learning, the eventual promise of AI, a hand shoots up and I find myself answering questions from the ex-CFO/now CEO about next year’s budget, and when the existing IT investment will pay back, and whether this can all be outsourced, and surely we don’t need to do any more than buy the future when it arrives? And of course these questions are all very pertinent. We all need to assure revenues and margins next year if we are to see any part of this future. And next year’s revenues will come from products and services which will look more like last year’s than like the things we shall be doing in 2025, even if we had an idea of what those might be. It is one thing to know something about the horizons, quite another to design a map to get there. So at every point we seek every means we can to buttress future-proofing, and at the moment I am seeing a spate of that in acquisitions. Just as last year putting the word “Analytics” at the end of your name (Clarivate, Trevino) added a billion to the exit valuation, so this year the .ai suffix has proved to be a real M&A draw.

But those big Analytics sales were made, and will be on-sold, to people who want to expand their data and services holdings. The .ai sales are transplants from the seedbed, and far earlier stages of transplantation are involved. Having worked for some years as an advisor to Quayle Munro (now, as an element of Houlihan Lokey, part of one of the largest global M&A outfits) I realise that smaller and smaller sales may not be considered a good thing, but I cannot resist the idea that seeding some future tech developments into your incubator environment is going to have some really beneficial long-term effects. It already has at Digital Science. As Clarivate learns from what Kopernio knows, it will help. As the magic of Wizdom.ai rubs off on T&F, it will help there too.

But, again, we are begging a hundred questions. Can you really future-proof by buying innovation? Well, only to a limited extent, but by having innovators inside you can learn a lot, at least from their different perspective on your existing customers. Don’t you need to keep them from being crushed by the managerial bureaucracy of the rest of the business? Yes, but why not try to free up the arthritic bits rather than treating the flexible bits? What if you have bought the wrong future tech? Even the act of misbuying will give you useful pointers the next time round, and if you have bought the right people they will be able to change direction. What if software people and text publishing people do not get on? They will need to be managed – this is your test – since if we fail, the future will be conditioned entirely by software giants licensing data from residual fixed-income publishers.

Are there any conditioning steps I should be taking to ease into this future? Yes: forget ease and go faster. Look first at your own workflow. To what extent is it automated? Do you have optimum ways of processing text? Are people or machines taking the big burdens of proofreading, desk editing or manuscript preparation? Is your marketing as digital as it could be? Are you talking the language of services, and designing solutions for your users, or are you giving your users reference sources and expecting them to find the answers? Indeed, do you talk the language of solutions, or the ritual language of format – book, journal, page, article? Are we part of the world our users are entering, or are we stuck in the world they are exiting?

The exhibition I attended this month was the London Book Fair. I love it in all its inward-looking entrancement with itself, and its love affair with the title Publisher, the profession for which no qualification other than skill at explaining away unsuccess has ever been required. I can only take one day, since I rapidly become depressed. But still there were very sparky moments – an impromptu discussion with the Chennai computer typesetter TNQ (www.tnq.co.in) about their ProofControl 3.0 service told me that these guys are on the ball. But moments like this were rare. More often I felt I was watching the future – of the industry in 1945!

I am a passionate fan of the World of Open. Without a network ethic of Open we will strangle the world of connectivity with restriction, and force ourselves into re-regulating just those things which we have so recently decongested. Yet not all the things we want to designate as Open will be open just because we put the word in front of the previous descriptor. In most instances being Open does not do the trick on its own – someone has to add another level of value to change Open from meaning “available” to Open meaning “useful”. So the argument that Open Access journals are an appropriate response to the needs of researchers, and of a wider public who should be enjoying unfettered access to the fruits of taxpayer-funded research, would seem to be a given. But creating real benefits beyond such general statements of Good seems very hard, and the fact that researchers cannot see tangible benefits beyond occupying the moral high ground probably connects with the grindingly slow advance of Open Access to around a quarter of the market in a decade.

I feel much the same about Open Citations. Reading the latest Initiative for Open Citations report at i4oc.org, I find really good things:

“The present scholarly communication system inadequately exposes the knowledge networks that already exist within our literature. Citation data are not usually freely available to access, they are often subject to inconsistent, hard-to-parse licenses, and they are usually not machine-readable.”

And I see an impressive number of members, naturally including neither Clarivate nor Elsevier. Yet by the argument above I would say that either of these is the most likely to add real value to Open Citations, and certainly more likely than most of the members of the club. What we have here, all the time, is a constant effort to emulate a world which has now largely passed by, and we do it by trying to build value propositions from wholly owned artifacts or data elements, thus turning them into proprietary systems. This urge to monopoly is clearly being superseded: what has happened is that the valuation systems by which markets and investors measure value have not caught up with the way users acknowledge value.

Outside the worlds of research and scholarly communication it seems clear that the most impressive thing you can say about the world of data is “Use it and lose it”. The commoditization of data as content is evident everywhere. The point about data is not Big Data – a once-prominent slogan that has now diminished into extinction – but actionable data. The point is not collecting all the data into one place – it can stay wherever it rests as long as we can analyse and utilise it. The point is the level of analytical insight we can achieve from the data available, and this has much to do with our analytics, which is where the real value lies. Apply those proprietary analytics back into the workflow of a particular sector – the launch music around Artifacts in Healthcare in Cambridge MA was very noticeable last week – and value is achieved for an information system. And one day, outside of copyright and patents, and before we get to the P&L account, someone will work out how we index those values and measure the worth of a great deal of the start-up activity around us.

So from this viewpoint the press release of the week came from Clarivate Analytics, and did not directly concern Open at all. It concerned a very old-fashioned value indeed – Brand. If the world of scholarly communication is really about creating a reputation marketplace, then ISI, Eugene Garfield’s original vehicle for establishing citation indexing and promulgating the mighty Impact Factor, is the historical definition point of the market in scholarly reputation. By refounding and relaunching it, Clarivate seem to me not just to be saying how much the market needs that sort of research right now, but to be aiming at the critical value-adding role: using the myriad of data available to redefine the measurement of reputation in research. In many ways Open Citations will assist that, but the future will be multi-metric, the balance of elements in the analytics will be under greater scrutiny than ever before, and ISI will need to carry the whole marketplace with them to achieve a result. That is why you need a research institute, not just a scoring system. And even then the work will need a research institute to keep it in constant revision – unlike the Impact Factor, the next measure will have to be developed over time, and keep developing, so that it cannot be influenced or “gamed”. In the sense I have been using it here, ISI becomes the analytical engine sitting on top of all the available but rapidly commoditising research data.

We have come very quickly from a world where owning copyrights in articles, and defending that ownership, was important, to this position of commoditized content and data as a precursor to analysis. But we must still be prepared for further shortening of the links and readjustment of cycles. Citations of evidential data, and citations of Findings-as-data without article publishing, will become a flood rather than the trickle they are now. Add in the vast swathes of second-tier data from article publishing in India, China or Brazil. Use analytics not just for reputational assessment, but also for trend analysis, repeat-experiment verification and clinical trials validation. We stand in a new place, and a re-engineered ISI is just what we need beside us.
