…or the capital of civilisation reaches the capital of Flanders. For those of us who have been many times through Lille by train but never stopped to look, this was a very pleasant surprise on many fronts. And if, like me, you were checking into the 19th Fiesole Retreat, a unique conference which brings librarians, academics and publishers together to communicate in a group small enough to allow that to happen and large enough to be representative, there was double pleasure. Vieux Lille is fascinating, and the city has the second greatest art collection in France, laid out with huge imagination at the Beaux-Arts. This edition of Fiesole, as ever meticulously managed by the Casalini team working with the Charleston Conference, was hosted by Julien Roche, director of the brilliant LILLIAD learning centre and innovation hub on the new university campus, which housed the retreat.

As soon as I arrive at a Fiesole Retreat I wonder why other conferences do not have this feel. During the days of the Retreat it really does feel like peers explaining to peers how all this new digital stuff is working out in academic life. The opening session was named “Linked (Open) Data – Big Data” and reminded me at once of why I really enjoy these meetings: whatever the questions raised, there is a chance here to develop your own agenda and pursue it in discussion at breaks and lunches with people who are unlikely to share your background or the limitations of your experience. With experts from both the French and German national libraries on the roster, we were bound to get differences of approach. What I found rather unexpected was the unanimity around the basic concepts of a data-driven research world, and the underlying, central importance of text and data mining in sustaining that world. And as the concepts build from the experience in the room, one realises the gulf between the world into which Retreat members are emerging and the one from which they are departing: between a world where licensing text and data mining is still non-standard, and where barriers of ownership, control and location frustrate at every turn, and one where the corpus of knowledge can be searched in a single sweep.

I have found this regularly happens to me at Retreat meetings. Once a theme has become apparent to my mind, I find it recurring in every subsequent session. The agenda went on to consider Reshaping Collection Development for 2025, but the issues that grabbed me came from Laurent Romary from Switzerland discussing “How to Open Up Digital Libraries for Digital Scholars”. Similarly, when the session on “The Changing Scholarly Communication Ecosystem” came along, absorbing sessions from Jayne Marks (Sage) and Bas Straub (Konvertus) began to sharpen my view on the sustainability of current academic publishing practice. Anna Lunden from the Swedish National Library described the huge effort they have made to accommodate Open Access in one country alone, and then Frank Smith of JSTOR addressed the comparative poverty of the Open Books effort, despite Knowledge Unlatched. And then Michael Keller, librarian at Stanford, summed up in his crisp and masterful way. If Stanford spends $2.1 million on APCs this year, then the argument about Open Access begins to collapse as cheap, effective publishing software turns every researcher and his librarian into the publisher of record. As Charles Watkinson reminded us, the growth of US (and UK, I would add) university presses has been remarkable. While the traditional journals market players have tried to defend their branded journals, their requirement for copyright, and their control of the market through peer review, the smoke seems to me to be clearing, revealing a very different picture.

So when you can submit an article with reviews, ready for publication on a preprint server or a university repository or figshare, will we be too concerned about the publisher of record as long as the metadata is in place? As long as the metadata is properly organized by libraries working together, will we worry about brand or journal? Will today’s publishers become tomorrow’s organisers of reputation, ranking scholars and reviewers and contributions to the scholarly communication chain in terms of what other researchers did as a result – cited, blogged, downloaded, annotated and so on? And will this turn into a rating system that helps to guide investors in governments and the private sector, or universities making appointments? And will the article cease to exist in the new workflow of scholarship, at least as something read only by machines, or will it be replaced by conclusions directly annotated on the data and cited? And obviously, while every discipline and geography is different, where will the first movers be?

No one knows, of course, which is why a Retreat, particularly one focussed on what we are going to collect, store and search in the future, is so valuable. I clearly see now that Open Access is not the answer, but part of a journey, and part of the next stage will be the emergence of funders (Gates and Wellcome are there already) as publishers. But I am hooked – and will be at the 20th Fiesole in Barcelona to debate the issues with colleagues I have come to trust.

Are we seeing the emergence here of a new truth? Or just an old lie tarted up? The sale of BvD (Bureau van Dijk) by EQT to Moody’s for $3 billion is either a great celebration of the need for data in a data analytics business (aka IBM Watson and Truven Analytics in the healthcare sector, or Verisk and Wood Mackenzie in energy), or the need to persuade wary initial users that the analytics they are getting are backed by the familiar brands of the research databases on which they were formerly reliant. And if it is the latter, then large database companies would be well advised to sell their data to emerging analytics companies now, because sometime soon users are going to discover that data searched on the open web is often equal in value, and sometimes throws up startling new insights, when compared with the hoarded and highly curated stuff we have been relying on for years. But it is the stuff from the research databases that has the total credibility.

Think of it this way. BvD have been building databases since the early 1980s. As the Belgian member of my board in the European Information Providers Association, Marcel van Dijk was openly sceptical about the future for research databases, a sideshow in his computer bureau business and a hobby of his colleague, the luminary Professor Bernard van Ommeslaghe. The latter built a business that was bigger than the Bureau at Marcel’s retirement, and started the chain of wealth enhancement that led through Candover and Charterhouse and ended with EQT, in successive deals that have grown from $600m to $3 billion over a ten-year period. And has BvD grown commensurately with that value? Well, it is a highly profitable $280m company, well plugged into corporate research, especially around credit, risk, compliance, M&A, and that great money earner of our generation, transfer pricing. By the time we entered the age of compliance the company was already in PE hands and getting expensive, but much of its data was available from public sources, and its much-vaunted private company data was as good as private company data can be – patchy, increasingly easy to research, and, in the markets where you really wanted to know (China), fiercely defended by someone more powerful (the People’s Bank of China).

So they did the right and obvious thing. While van Ommeslaghe had tuned the search engine a decade or so earlier as his response, they now went for the “new wave” and started an analytics-based solutions business, launched in each of their sectors and branded “Catalyst”. I have never seen the figures, though I have constantly asked market analysts who know everything about one of the most intensively researched companies in its sector, and they change the subject. No mention of this was made in the sale release either, where analytics was concentrated around the ability of the Moody’s Analytics division to transform BvD. I draw my own, possibly erroneous, conclusion: not for the first time, internal re-invention failed to convert a successful team to a new sales pitch and a new business model.

Which would be a good point at which to sell, especially if Moody’s Analytics division is as hot to trot as its press releases suggest. And do not forget that these are critical days for the rating agencies. While performance at Moody’s, S&P and Fitch has returned – almost – to pre-recession levels, there are still critical regulatory issues and continuing disquiet about the role these agencies played, or didn’t play, in that crisis. And for the first time in years there are competitive threats: governments and regulators wonder if there is a better way, while start-ups like Credit Benchmark in the banking sector suggest that aggregating the research and decisions made by all banks can produce valid choices and rating decisions for individual players. In short, we are now removed from the glory days when this market was a “license to print money” and we are back to the struggle for survival. Will Analytics be the Get Out of Jail card?

Let’s then go back to Marcel and Bernard and the risk decision they made in or around 1985. Suppose Marcel says, “Look, are you sure that there will ever be enough librarians and information managers to justify renting them space on our computers? The IBM PC means that our outsourced bureau business, running payrolls and utilities, is in decline, but we are competitive, we know how to sell, and we can still hold our own.” And Bernard responds, “No, the new business is just like the old one, except we are storing content for the individual use of the end users of our clients. The business model (then) is the same – time-based access – all we need to do is learn how to sell access.” Life is not perfect – as it turned out there was not enough bandwidth in 1980s phone lines, so they ended up succeeding on CD-ROM. But there were enough intermediary users. Today, cost-conscious employers want to cut out the intermediary librarians and deliver solutions directly into the workflow of the ultimate user. Do Moody’s Analytics, or any of us for that matter, yet know enough about pricing, selling or distributing these solutions? The gap between the decisions made in the 1980s and the decisions that need to be made now is much greater. It is no good retaining the Marcel belief that somehow the good old business will just go on; but investors and customers will only appreciate the “Change” badge on your lapel if you spend a very great deal of money on it (and how did they come by that valuation…!). The purchase of BvD takes Moody’s revenues over the billion mark and adds to its margins, so it ticks some analysts’ boxes, but the horse that lived in the BvD stable has long since bolted, and it is hard to believe that a new incumbent is ready to graze that data – or find some better stuff in the open pastures of the internet.