In many parts of the scholarly communications world we have moved long past the milieu that Gene Garfield ushered in more than half a century ago with the establishment of ISI and the impact factor. But conservative marketplaces have a long tail: much of the academic world is still firmly tied to the evaluative gold standard of the impact factor, and no one will be persuaded to do anything differently until they are also persuaded that the alternative works. And it will take a long time to persuade people that a new governing metric is available, a platinum standard for distinguishing the highest quality of research over time, because a great deal of data has to be assembled and analysed before that case can be made. Yet leading voices in the sector, from funders and governments to researchers and librarians, continue to seek evaluation that includes all the available metrics, that makes room for network usage appraisal as well as the impact of blogs and tweets and posters and demonstrations, and that tries to draw a more holistic picture of the impact of a researcher or a research team.

And while we need tools like this for the benefit of funders and governments in making funding decisions, we are equally desperate to help researchers manage and discriminate amongst the present tsunami of research publishing, and the wall of knowledge building up behind it, as access to second-tier research from a host of countries outside Europe, the US and Japan supplements the very considerable research contribution of China, India, Brazil and other countries in recent years. And since we long ago lost sight of a specialist being able to read everything published in their sector in a year, we vitally need the data, and especially the metadata, that will enable us to summarise what we cannot read, apply standards where we cannot form judgements, de-risk the danger of missing something significant, and ensure that our literature reviews and our management of prior knowledge do not sink the research ship before we set off on the real voyage of discovery, which takes place after the discovery of the known.

And we have been preparing ourselves for this for years, as commercial market moves over the last few years make clear. When Clarivate Analytics, owner of Web of Science and thus of the citation-based world we have inherited, bought Publons, it did so, I have always imagined, because using peer review data in this context will be very important. Equally, Elsevier began the whole long development track behind SciVal in order to use vital data from Scopus and elsewhere to begin this immense evaluative task. And Informa bought Colwiz and Wizdom.ai in order to get into the same boat. As of today, the boat is getting just a bit more crowded. The launch this morning of Dimensions, based on the funding data work of the existing Dimensions site at Digital Science, but now integrating data from six Digital Science companies, is a dramatic step and raises the bar for all of its competitors. It teaches us something important about bringing data together from a variety of sources in a way that gives users a new view with enhanced utility, and it shows how a publisher service like ReadCube can also be turned round to become a discovery engine in its own right. And with a marketing plan that starts from free access for researchers and moves through institutional fees at different levels of value, it accommodates itself readily to the prevailing culture around the costs of service provision.

This is a great start from which to build, and there is a long build yet to go. There is both geographical data and altmetrics data still to bring in. Then there are the evaluative tools that will support research management in its urgent evaluation needs. Impressive as this aggregation is, there is more in front than there is behind. But Dimensions certainly puts Digital Science in position. It is the first service development that makes Digital Science a company and not an incubator, and this may be very important to its next steps. Like Clarivate, Digital Science is not a publisher. It therefore has Clarivate’s vital neutrality from the things that it is rating. Above all, it is a data and analytics company and was created to work in this research management space, a space into which many publishers will try to struggle as journal publishing, the cash cow of the 1990s, becomes a more difficult and less predictable marketplace.

Yet all of these players face one sobering difficulty. Building the foremost evaluative research engine is not like other competitive parts of this marketplace. A new standard is just that. Four competing standards help no one. Winner takes all, and after a time the prevailing system becomes supported by other people’s tools, and competition is reduced. This is not to say that there is no good business in providing researchers with good analytical tools, just that the successor to the impact factor will most likely be a single entity managed for the market by one of these players, or by one we do not yet see coming, perhaps from the non-profit sector. In the meanwhile, “new” dimensions should be celebrated, and not least by Holtzbrinck. If they do need to suck debt from Springer Nature post-IPO, then Digital Science is becoming a very valuable token for doing that and ensuring their majority position. If that does not happen, then they have an asset with fast-growing value in the group.