Apr 11
The Play’s the Thing…
Filed Under Big Data, Blog, Industry Analysis, internet, Publishing, Reed Elsevier, Search, semantic web, social media, STM, Thomson, Uncategorized, Workflow
…by which to test the seriousness of the industry. (Yes, I went to the new Hamlet production at Stratford last week.) And this week’s play, acted out to a packed house of industry watchers and market analysts, has been the seduction and vanquishment of the fair Mendeley by all-powerful Elsevier, so rudely forced. Or, if you prefer, the seduction of barbarous Elsevier by maidenly Mendeley. Whatever the casting, here was a deal done for a company with negligible revenues at a price, with earn-outs, of up to $100m, according to those ever-present “people familiar with the deal”. And since I have seldom had more requests to explain, here is my take.

Mendeley represents the greatest leap forward since Eugene Garfield in representing the worth of a science research article. If it went to Thomson Reuters, it would put them back into a game where Elsevier have spent a huge amount, culminating in SciVal, in competitive efforts to diminish them. As in days of yore (who remembers BioMedNet?), the competitive threat potentially posed by Mendeley proved greater than the price misgivings. If it went to Macmillan, who already have an investment in ReadCube, Mendeley’s competitor, it would create another axis of competition which would be unwelcome, given the strides that Macmillan Digital Science has made by investing in Altmetric, as well as figshare. Since every article is unique and not a competitor with other articles, the true point of competition in science research publishing now lies in workflow tools which make researchers more productive – and help them to decide what to actually read, and what to reference and visualize. So, continuing my Danish theme, this is a pre-emptive strike, like Nelson destroying the fleet at Copenhagen. Do we know whether Mendeley is the ultimate social tool for tracking who buys and reads what? No, any more than we know whether Facebook is the player in place for life in social networking.
But we do know that more than 2 million active researchers value it immensely, and so it posed a question – and one that for 20 years Elsevier have been adroit in answering.
This raises a few questions. Will Elsevier be able to run it independently enough to reassure those critics who regard it as more like Caliban than Caesar? And are we being distracted by watching the wrong part of the game with too much intensity? I am a strong supporter of what the Mendeley team have done, but they were let into the marketplace by a chronic publishing failure: the inability of producers to sell researchers adequately identified PDFs that obeyed agreed industry standards and would allow a researcher to auto-index his hard disk and find what he had bought. As ever, publishers were complacent about the downstream problems they caused their users. But the real question here is about metadata, and it is a timely reminder of other problems we have never fully solved. When we adopted DOI/Handle technology the publishing community worked, as always, at the lowest common denominator of agreement. The result is a world in which articles are effectively numbered, and CrossRef expresses that industry cohesion, but we still cannot offer researchers the ability to search consistently over the full range of articles for which they have permissions cleared, using their own or even semi-standardized taxonomies. Nature (http://www.nature.com/news/the-future-of-publishing-a-new-page-1.12665) has done sterling work in the last month on the future of publishing, but it simply illustrates to me how inadequately we tackle the last steps – the ones that lead to collaboration and to each player moving forward to create knowledge stores which reflect the real research needs of their users.
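If articles did carry reliable, standard identifiers, the hard-disk indexing described above becomes almost trivial. A minimal sketch in Python of indexing documents by the DOIs found in their text – the file names and document text here are invented for illustration, and the pattern is a simplification, not a complete DOI validator:

```python
import re

# A simplified DOI pattern covering most modern DOIs;
# a sketch, not a complete validator.
DOI_RE = re.compile(r'\b10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

def index_by_doi(documents):
    """Map each DOI found in a document's text to the documents
    that mention it - the kind of hard-disk auto-index that
    well-identified PDFs would make trivial."""
    index = {}
    for name, text in documents.items():
        for doi in DOI_RE.findall(text):
            index.setdefault(doi, []).append(name)
    return index

# Hypothetical extracted PDF text, for illustration only.
docs = {
    "paper1.pdf": "See doi:10.1000/xyz123 for details.",
    "paper2.pdf": "Builds on 10.1000/xyz123 and 10.1234/abc.456 as well.",
}
print(index_by_doi(docs))
```

The point of the sketch is how little machinery is needed once identifiers are consistent – which is exactly what the industry has never quite delivered to the researcher’s desktop.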
I do not mean to say that publishers do not collaborate. They increasingly do, and recent press coverage of Springer and CAS, or the case study of Wiley’s work with the AGU, demonstrates this. I have been involved with the TEMIS work on collaboration and have learnt a lot from it. And publicly, industry leaders do point to data-led strategies, which I was interested to hear acknowledged in a talk by Steve Smith (CEO, John Wiley) to the AAP/PSP in February, which I moderated. So I was very interested indeed to spend some time with Jason Markos, Director of Knowledge Management and Planning at Wiley, and get a current view on the enrichment picture. The contrast over the past five years is, to someone used to the sometimes somnolent complacency of publishing, quite startling. Now you can have conversations about content enrichment that do begin to embrace both the narrow/deep and the broad/shallow needs of users. The capacity now available in publishing comes, it must be said, from people who entered from outside and have a real technical grasp of knowledge engineering that was not prefaced by life in linear publishing workflow processes. If that capacity can be applied to turning away from content architectures predicated on the structure of the article, and towards creating entity stores or “knowledge” stores which allow data items from article databases to be searched in conjunction with data drawn from other sources, like evidential data, then we may indeed be on the way towards a user-driven networked vision of the future of publishing. Learning how to work with knowledge models as a way of expressing the taxonomic values of all of this shows me that we are on a route march along a track that has been obvious for a little while now: adopting RDF as a basis, and creating triples to anchor our texts in semantically searchable environments.
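To make the triple idea concrete: the RDF model the new knowledge engineers are adopting reduces, at its simplest, to (subject, predicate, object) statements that can be pattern-matched. A toy sketch follows – every identifier and vocabulary prefix in it is invented for illustration, and a real knowledge store would use SPARQL over shared ontologies rather than a Python set:

```python
# A toy triple store: RDF reduced to (subject, predicate, object)
# tuples. All URIs and prefixes here are invented for illustration.
TRIPLES = {
    ("article:123", "dc:title", "Gene expression in yeast"),
    ("article:123", "schema:about", "topic:gene-expression"),
    ("article:123", "cito:cites", "article:99"),
    ("article:99", "schema:about", "topic:yeast"),
}

def query(s=None, p=None, o=None):
    """Pattern-match against the store; None is a wildcard.
    This is the primitive from which SPARQL-style search over
    'knowledge stores' is built."""
    return [t for t in TRIPLES
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# Everything asserted about one article
print(query(s="article:123"))
# All articles tagged with a given topic
print(query(p="schema:about", o="topic:gene-expression"))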
So our new Knowledge engineers will be able to spin out new service environments for increasingly demanding users, and the publishing game will not peter out with the commoditization of the article…
…I left Wiley the other day full of hope, and I still am. But this context is necessary to see that the Mendeley deal, lovely though it is, remains symptomatic of the need to scratch yesterday’s itch. I suspect that the real struggle, already underway, is to persuade researchers that publishers really can add value to data, and that they really do know how to analyse it, structure it, create smart research tools around it and extract real value from users as a reward for this investment and effort. This will need smart industry suppliers as well, and I have learnt a lot from working with MarkLogic and TEMIS in the past year. And most of all it needs the support of CEOs who see beyond maximizing PDF downloads to the strategic crossroads this part of the industry now faces – and beyond.
Apr 4
Editorial Views and Viewers
Filed Under Blog, data protection, eBook, Industry Analysis, internet, mobile content, news media, online advertising, Publishing, social media, Uncategorized
So thanks so much for all the comments on the last piece. I am now truly troubled and confused. Some of you seem to think that the regional press never will re-create itself, so it must be left to innovators to innovate and old brands to die. Yet those who are deeply involved in the regional press claim that the tools and utilities required for innovative services are simply not available. I decided to look at the latter this week, to look only in the UK, and to follow up some of the leads I received as a result of the first piece.
Since the premier brand of Johnston Press is the Scotsman, I started my search in Edinburgh. There I found the Edinburgh Reporter, and a small team of local reporters working around an editor (Phyllis Stephens) to create, at www.theedinburghreporter.co.uk, a start-up local event site. This is powered by the n0tice web software currently being trialled (progress report awaited) by Guardian Media Group, which is itself an innovative newspaper project, given that it is an invite-only beta run by a group that sold its regional division and exited hyperlocal. But have a look at the intentions listed 18 months ago (http://gigaom.com/2011/10/28/guardians-n0tice-puts-a-new-twist-on-hyperlocal/) and look at http://n0tice.com to see what is being trialled. Clearly the effort here is to find ways in which users can post content and link to existing content. If you look at The Edinburgh Reporter, then you can see how they use Bambuser and YouTube to provide the image and video upload elements. The principle of hyperlocal is emerging here: we are all our own reporters and freelances, and whether or not we need an editorial hub depends more on brand and business model than on anything else. But the vital feature of this model is the need for someone to be able to signal what sort of news update they need so that those “closest” to the news can create media – video, audio, text – and send it back to be accessed and posted.
Which is when I was introduced to CivicBoom (http://civicboom.wordpress.com/about/). A great deal of the communication around n0tice is Twitter-based. CivicBoom, from Canterbury in Kent, has its own mobile app, Boomlly, which acts as the reporting link, and the means of signalling when stories are needed. As far as I know, CivicBoom is not currently in active use in the newsroom of any UK regional, yet it is the perfect tool for experimenting with hyperlocal on the smartphones of a local youth (or mature) audience. As we move into the 4G world, the image and video output of an observant community can be turned into hyperlocal news and information, and the news organization/newspaper can tune and frame this content flow by using apps like Boomlly to request coverage, seek other views, repeat coverage or request similar coverage from different places. In other words, citizen journalism can now be organized on a far better basis than ever before, to the extent that it really does become the answer to “how can we do all that we need to do for hyperlocal on a reduced and reducing journalist workforce?”. The first examples of citizen journalism at any scale that I mapped were in Florida in the early years of this century, and they plainly lacked the tools to do the job. Well, the answer is now clear – recognize and adopt your own community as participants and partners, follow the interests of that community, seek quality and set standards, but do not forget who is doing what for whom. In order to succeed at hyperlocal, the regional press may need to retire its editors, turn its remaining reporters into Boomlly operators and forget what was learnt in the age of print about editorial power. But since the alternative is a slow road to extinction, I strongly recommend anyone interested in survival to contact Lizzie Hodgson (e.hodgson@civicboom.com) at CivicBoom and discuss the re-integration of community around news. Now.
And if the partial answer to “there is no hyperlocal news” is a community answer, the other important reminder is to underline the fact that local news organizations do not mine existing database resources for hyperlocal news. Thus, for example, the national database of roadworks (www.elgin.com) is not used in any active news environment, as far as I know. Yet one of the most important pieces of information you can give anyone on a hyperlocal smartphone service would be where delays and disruption can be expected, when they start and when they are due to end. Much more data of this type is now becoming available, both through Open Data mandates and commercial efforts. You do not need Big Data in a local context to find it, and it blends beautifully with citizen journalism.
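As a sketch of how simply such open data blends into a hyperlocal feed – the record fields below are invented for illustration, not the actual Elgin schema:

```python
from datetime import date

# Hypothetical roadworks records - NOT the real Elgin schema,
# just a sketch of the shape such open data tends to take.
ROADWORKS = [
    {"road": "Princes Street", "ward": "City Centre",
     "starts": date(2013, 4, 8), "ends": date(2013, 4, 19)},
    {"road": "Leith Walk", "ward": "Leith",
     "starts": date(2013, 4, 15), "ends": date(2013, 5, 3)},
]

def local_disruptions(ward, today):
    """Roadworks active today in one ward - the kind of hyperlocal
    feed item no news organization currently generates from the data."""
    return [f"{r['road']}: roadworks until {r['ends']:%d %b}"
            for r in ROADWORKS
            if r["ward"] == ward and r["starts"] <= today <= r["ends"]]

print(local_disruptions("Leith", date(2013, 4, 16)))
```

A few lines of filtering, not Big Data – which is exactly the point about the local context.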
If the regional press is to hold its post-print position and move forward, then the size and shape of the local opportunity must be re-defined. Successful citizen journalism can point the way to entrepreneurial activity in local eBooks or local eLearning. The media organization as a single-delivery, advertising-driven, wholly broadcast news operation in the locality may finally be dead; visions of its future are experimental, but it is only by iteration that we drive networked publishing forward. There are plenty of US models, and even if Patch.com does not eventually succeed, then the broadcast media work around EveryBlock (MSNBC) and iReport (CNN) should provide some clues – and TownSquareBuzz and The Batavian suggest others. This is the eleventh hour – there is not a minute to spare!