It has been a month of contrasts. Good solid results at Reed Elsevier have the market analysts demanding the sale of Lexis Legal: the chief break-up irritant at Bernstein forecasts a 20% increase in value as soon as it is done. Reed Elsevier now trades at a discount to last year's valuation as well as to the wider quoted marketplace. And how does this come about? Simply by representing the company to analysts as a diversified investment portfolio, and then disappointing them with the results, which always prompts a demand for the sale of the weakest bit and the purchase of something stronger.

Meanwhile, over at Thomson Reuters, it has been a dynamic July in terms of forward progress. You can measure that in acquisitions if you like, but to me the key element is the strategic positioning of these purchases and what they do to pursue the goal of market leadership in services and solutions for corporate finance, tax and regulatory work: from banking and equity trading, law and tax/accountancy in practitioner terms at one end, right across to the desktop of the corporate finance, tax and legal department at the other. If the Thomson Reuters vision works out, it will connect all of these functions and activities into a series of solutions which will compel big, and then medium and small, corporates into easier methods of information handling, methods that get easier the more reliant they become on inter-related services and solutions from Thomson Reuters. This is about integration in the face of user need, about recognizing the primacy of the network, and about bringing one huge company with many specializations into focus on the issues of service and solution. This is not a diverse portfolio of disparate elements: the company buys the bits it needs and sells the bits that do not fit, but the test of an acquisition is whether these global aims are satisfied as well as whether the purchase makes financial sense and delivers the required return.

Let's take a few July examples. The headline purchase was the acquisition of FX Alliance for $625m. Here, then, is Thomson Reuters, a leading player in the sell-side interbank foreign exchange market, one of its traditional strengths, pulling in a major player from the bank-focused currency trading business serving corporates, asset managers and hedge funds. Foreign exchange is a huge diversified marketplace, involving some $5 trillion of transactions per day, and this deal gives Thomson Reuters the ability to work in both the internal institutional markets and the corporate-facing external market, using electronic platforms and high-speed trading techniques all the while.

By comparison, my next two examples are smaller in scale, but they demonstrate other aspects of the process that is going on. Having written extensively about the launch of the Thomson Reuters GRC division – bringing legal and tax into focus with financial services in the areas of governance, risk and compliance – I want to mark the arrival of “Eikon for Compliance Management” with a special commendation. It seems to me that this closes a huge loop, and provides a service environment that was never more urgently needed. It is said that there are now 60 new regulatory announcements a day from some 230 regulators and exchanges in financial markets, yet fewer than a third of traders report having had any compliance training or update in the last three months. But to join up solutions for your customers you need to start with a joined-up company yourself.

My last example draws on some 25 years of experience as an external director on the international side of the Bureau of National Affairs, which has now disappeared into Bloomberg. An issue that intrigued me there, which attracted a great deal of attention and was crying out for a service solution, was Transfer Pricing. Boring? More likely stupefying! Here was an area that always demanded a software-based solution, since most tax lawyers and finance specialists were deeply reluctant to get entrapped in its intricacies, and access to someone who knew what they were talking about was rare and expensive. BNA produced great books on the subject, and so did WK and others. But Thomson Reuters has produced ONESOURCE Transfer Pricing, with an Analyser to keep track of the compliance requirements facing corporates who trade across borders, and a Documenter and Benchmarking solution, ensuring that users have the right forms and, vitally, that they are benchmarking against corporations whose solutions have already been accepted by the authorities. Here then is a vital but expensively neglected field of corporate activity, which reflects on much that Thomson Reuters is now about.

The final reflection is upon “platform”. At all levels Thomson Reuters is on a multiplicity of platforms, and while content integration and re-use has led to access being eased and common metadata standards evolving, this still clearly has a good way to go. And there is strength in this multiplicity – no one wants to interrupt the steady absorption of Eikon, now beginning to fulfil expectations, or damage the market primacy of WestlawNext. I expect, however, in the age of data, to see the back-end integration of this very large player’s systems remain a continuing theme. At the moment its greatest rival is a company, Bloomberg, which is swaddled in the limitations of a Victorian corset – the Terminal. That too will have to go, as Bloomberg limits its own future through an inability to get its new plays in law and government to sell to end users who do not want even a flat-priced, all-you-can-eat deal on a box originally built for traders. There is a midpoint, and Thomson Reuters’ migration looks like getting them there first.

It was inevitable that some readers of my piece on Gold OA earlier this week would come back and say that I have grown too fond of defining what won’t work and should become more proactive about stating the new directions. Fair cop. Here then are two “assertions” about that future for science journal publishers, covering areas in which “traditional” text-based “publishing” has only the slightest current experience and skills base, yet which will be vitally significant for the industry five years out. Both fit into a vision of scholarly communication, and of the evolution of the publisher’s role away from primary publishing (which will become the prerogative of librarians and repositories) and into workflow management solutions and the quality of content within process.

My two candidates for step-change status are:

1.  The evolution of video from an accompanying feature into the vital core medium for reporting scientific research.

2.  The development of robust and auditable techniques for evaluating the impacts of content on research, creating measures for ROI in content purchasing, and fresh, searchable data derived from the evaluation of usage. This data, along with content metadata, will be more valuable to players in this market than the underlying content on which it rests.

Let’s start with the first issue. I am fed up with being told that science and scientists are too dull or too complex for video. Too dull? Just go and play these two minutes of an interview with John Maynard Smith, the great biologist, on Vitek Tracz’s pioneering site Web of Stories (http://www.webofstories.com/play/7277?o=MS) and try to maintain that view. And this site has excellent metadata, as does the video-based Journal of Visualized Experiments (JoVE), which announced this month that it is extending its experimental reporting to physics and engineering as well as the life sciences (www.jove.com/about/press-releases). Note that both of these sites set a premium upon narrative, and recall the narrative argument in my recent piece on next-generation learning (After the Textbook is over... 3 June 2012), which was demonstrated in some wonderful transmedia software (http://www.inthetelling.com/tellit.html). Once again this demonstrates that video is quite capable of forming the narrative stem onto which links, citations, indexation, abstracts and other aids to discovery and navigation can be attached. Indeed, the text itself can be attached, along with demos, lectures and evidential data. Video file sharing is notoriously easy in the broadband world. Some academics will complain that they lack video story-telling skills, and this in turn may be something that publishers can add to the APC – as long as they acquire those skills themselves in time!

And then there is data. I have thundered on about evidential data and the importance of using the article as the routefinder that leads researchers to the statistics, or the findings, or the software used in the analysis. And we have all talked endlessly about metadata searching, about applications of Big Data in science and about data analytics. But I am moving to the view that we are crucially underplaying the importance of another sort of data, which we used to characterize as “usage data” and wonder whether it was going to become significantly exploitable. The CIBER team have long warned about the underuse of usage logs, but the force of the point has been increasingly brought home to me by an appreciation of what excellent data output can be derived from interfaces like Mendeley or ReadCube. We now begin to appreciate almost for the first time what usage patterns can be mapped – and what they mean. This is important for researchers, and vital for publishers. Users will rightly demand this form of data analysis, and will become increasingly interested in what, of the mass of data that they buy access to, is effective and cost-effective. This will start at the sharp end, in areas like drug discovery, but will grow into a habit of mind as data overload becomes ever more daunting. Concentrating purchasing policies on data that can be demonstrated to support improved decision making or better compliance or increased productivity will drive us to collect and analyse our user data to demonstrate that what we have makes a difference. And as we are driven this way we will get deeper and deeper into examining what users do with our data, and we will be surprised by how much we can track and know. And that knowledge will form another layer in our content stack, alongside metadata itself.
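To make the argument about usage data a little more concrete, here is a minimal sketch of the kind of analysis it implies: folding raw usage events together with subscription costs to produce simple cost-per-use and reach measures per title. Every title, figure, field name and the log format below is a hypothetical assumption for illustration only, not anything Mendeley, ReadCube or any publisher actually exposes.

```python
# Illustrative sketch: turning hypothetical usage events into simple
# value-for-money measures per subscribed title.
from collections import Counter

# Hypothetical usage log: one event per full-text download or read.
usage_events = [
    {"title": "Journal of Hypothetical Chemistry", "user": "u1"},
    {"title": "Journal of Hypothetical Chemistry", "user": "u2"},
    {"title": "Annals of Illustrative Biology", "user": "u1"},
]

# Hypothetical annual subscription costs per title.
subscription_costs = {
    "Journal of Hypothetical Chemistry": 4200.0,
    "Annals of Illustrative Biology": 6800.0,
    "Review of Unread Studies": 3100.0,
}

uses_per_title = Counter(event["title"] for event in usage_events)

for title, cost in sorted(subscription_costs.items()):
    uses = uses_per_title.get(title, 0)
    unique_users = len({e["user"] for e in usage_events if e["title"] == title})
    # Titles with no recorded use surface immediately as infinite cost per use.
    cost_per_use = cost / uses if uses else float("inf")
    print(f"{title}: {uses} uses by {unique_users} users, "
          f"cost per use = {cost_per_use:.2f}")
```

Even at this toy scale the point shows through: the titles that never appear in the log are the ones a purchasing policy driven by demonstrated decision support would question first, and richer event data (what was read, cited, re-used) simply deepens the same analysis.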

This game is already afoot in the biopharma sector. Eight weeks ago Relay Technology Management (http://relaytm.com) launched a “real-time Business Intelligence and Data Visualization Solution” for life sciences. Building on their RVI (Relative Value Index) formula, this new BD Live! (Business Development Live!) construction demonstrates some of the ways in which scientists and researchers in the future will want to have their data assets assessed – and the ROI of their purchases demonstrated. It is probably no accident then that Nature Publishing Group made an investment in Relay at the end of last year.
