We live in fevered times. What happens at the top cascades. This must be the explanation for why revered colleagues like Richard Poynder and Kent Anderson are conducting Mueller-style enquiries into OA (Open Access). And they do make a splendidly contrasting pair of prosecutors, like some Alice in Wonderland trial where off-with-his-head is a paragraph summary, not a judgement. Richard (https://poynder.blogspot.com/2018/11/the-oa-interviews-frances-pinter.html) wants to get for-profit out of OA, presumably not planning to be around when the foundation money dries up and new technology investment is needed. Kent vigorously defends the right of academic authors to make money from their work for people other than themselves, and is busy, in the wonderful Geyser (thegeyser@substack.com) journal, sniffing the dustbins of Zurich to find “collusion” between the Swiss Frontiers and the EU. Take a dash of Brexit, add some Trumpian bitters, the zest of rumour, shake well and pour into a scholarly-communications-sized glass. Perfect cocktail for the long winter nights. We should be grateful to them both.

But perhaps we should not be too distracted. For me, the month since I last blogged on Plan S and got a full postbag of polite dissension has been one of penitent reflection on the state of our new data-driven information marketplace as a whole. In the midst of this, Wellcome announced its Data Re-Use prize, which seems to me to exemplify much of the problem (https://wellcome.ac.uk/news/new-wellcome-data-re-use-prizes-help-unlock-value-research?utm_source=linkedin&utm_medium=o-wellcome&utm_campaign=). Our recognition of data has not properly moved on from our content years. The opportunities to merge, overlap, drill down through, and mine together related data sets are huge. The ability to create new knowledge as a result has profound implications. But we are still on the nursery slopes when it comes to making real inroads into the issues, and while data and text mining techniques are evolving at speed, the licensing of access and the ownership of outcomes still pose real problems. We will not be a data-driven society until sector data sources have agreed protocols on these issues. Too much data behind paywalls creates ongoing issues for owners as well as users. Unexploited data is valueless.

It’s not as if we have collected all the data in the marketplace anyway. At this year’s NOAH conference in London at the beginning of the month I watched a trio of start-ups in the HR space present, and then realised that they were all using the same data, collected differently. There has to be an easier way of pooling data in our society, ensuring privacy protection but also aligning clean resources for re-use with different analytics and market targets to create different service entities. Let’s hope the Wellcome thinking is pervasive, but then my NOAH attention went elsewhere, as I found myself in a fascinating conversation about a project which is re-utilising a line of content as data that has been gratuitously ignored, and that in scholarly communication, one of the best-ploughed fields on the data farm.

Morressier, co-founded in Berlin by Sami Benchekroun, with whom I had the conversation, is a startling example of the cross-over utility of neglected data. With Justus Weweler, Sami has concerned himself with the indicative data you would need to evaluate progress in early-stage science. Posters, conference agendas, seminar announcements, links to slide sets: Morressier is exploring the hinterland of emerging science, enabling researchers and funders to gauge how advanced work programmes are and to map the emerging terrain in which they work. Just when we imagined that every centimetre of the scholarly communication workflow had been fully covered, here comes a further chapter, full of real promise, whose angels include four of the smartest minds in scholarly information. Morressier.com is clearly one to watch.

And one to give us heart. There really are no sectors where data has been so eked out that no further possibilities remain, especially of adding value through recombination with other data; in fact, in my daily rounds, I usually find that the opposite is true. Marketing feedback data is still often held aloof from service data, and few can get an object-based view of how data is being consumed. And if this is true at the micro level in terms of feedback, events companies have been particularly profligate with data collection, assessment and re-use. And while this is changing, it still does not have the priority it needs. Calling user data “exhaust” does not help: we need a catalytic converter to make it effective when used with other data in a different context.

When we have all the data and we are re-combining it effectively, we shall begin to see the real problems emerge. And they will not be the access and re-use issues of today, but the quality, disambiguation and “fake” data problems we are all beginning to experience now and which will not go away. Industry co-operation will be even more needed, and some players will have to build a business model around quality control. The arrival of the data-driven marketplace is not a press release, but a complex and difficult birth process.