We need to talk seriously about Futures Literacy. And we need to do it now, before it is too late. The decisions being taken in our boardrooms are getting bigger and bigger, and if they are not, then we should be very worried indeed. This month brings COP26, exposing once again the need to take urgent steps to address climate change. The Board cannot simply leave all of this to the politicians, who will always be guided by what will get them elected. The decisions on climate, on investment in change and most of all on speed of deployment will be critical in meeting targets and, eventually, in escaping the worst effects of hundreds of years of exploitation and neglect. Yet for many of us, as we steam towards the Metaverse at ever-increasing speed, it seems as if we have a parallel set of concerns. We know that we have to think about investing in the technologies that surround content and data in the information industry. We also know that next year our customers will have different and enhanced expectations of us. We are sophisticated now as businesses, handling online service functions, raising fresh capital and working cohesively with stakeholders. Why then, oh why, is the pygmy in the room the way that we discuss the Future?

I am now past fifty years of working as a manager, a Director, a CEO and an advisor to many boards. My experience of experience is that you do not really learn very much from it in periods of rapid change. When I started, it did not matter much if a senior manager could not distinguish Linotype from Monotype. Today it does not matter much if a manager cannot discuss Digital Twins or tell you how a GAN operates. What concerns me is the nature of the dialogue, the discipline of the approach, the "empirical rigour" of the discussion, since these are the necessary supports for planning, and above all for the timing of planning, which we need if we are to sustain any hope of making sense of what we must do beyond Q2.

All too often, even at board level, discussion devolves to the anecdotal brilliance of someone's daughter and the app she found on Google, or the son who downloaded a course and passed Math without needing a tutor, or a visionary someone has seen speaking on YouTube, or a book someone had heard of but never actually read. This anecdotalisation of the Future makes me want to scream. I take it that we sit on Boards because we are charged by the stakeholders, beyond our governance duties, with the maintenance and growth of Value through Time. The Future is thus our mandate, not something to obfuscate around. We need to talk frankly about how we anticipate change, and just as we should now be watchful for bias in data, we need to start with a careful self-audit of our own biases about the future.

The most valuable work that I know in this area comes from UNESCO, and from Riel Miller, its head of Futures Literacy. The case he makes is impressive, and has the huge merit of moving us away from an extrapolation-based thought process in which we all try to second-guess future trends from what we have experienced in our own lives. In the first instance, our own experiences are collected randomly. In the second, this method gives us no way of testing probability or timing. Far better, then, to develop strategies about the Future by creating, or reframing, our thinking: forming hypotheses, altering all of the variables and testing our assumptions. This sounds to me like a managerial version of the scientific method, and a discipline devoutly to be wished for when we consider the lazy thinking around much of the Futurism that we read and hear. In the information industry, after all, we say that we are driven by data science. Some attempt to think scientifically may well be overdue.

So how do we go about the business of reframing our corporate thinking about the future? Riel Miller's suggestion is the Futures Literacy Labs concept, though I would not recommend this as a board-level activity in some of our industry's corporate frameworks. However, the opportunity to put some senior directors, key managers and some younger fast-track recruits into a regular meeting context, where a discussion discipline is maintained around forming and testing concepts, could both inform board decision-making and spark small-scale experimentation to test the ideas developed. This would be especially valuable if the primary concentration was on our users and how they will work, since that forces us to think hard about how we continue to add value for them. It could stop the low-level assumptive discussion of generalities ("Of course, AI is the future of everything") and ground our arguments in the vital qualities such discussion seems to lack: Context and Timing. Above all, it widens the responsibility for the future, which does not rest with the CEO, the CSO or the CTO. It rests with all of us.

The Man Who Mistook Open Access for a Windmill

An Open Letter to Richard Charkin, in response to his column in Publishing Perspectives (https://publishingperspectives.com/2021/09/richard-charkin-an-heretical-view-of-academic-publishing/)

Dear Richard, you know well the warmth and affection that I feel for your work and for you personally. But just at the moment, having read this piece (doubtless written to irritate!), I feel like Sancho Panza. I am sitting heavily on my mule behind you, Master. I see you applying the spurs to Rocinante's lean flanks, I see the direction of your lance, and I must cry out, though in your enthusiasm you will not be able to hear me, "Those be windmills, sire." Sixteen long years have passed since I, as Chief Researcher on the House of Commons enquiry, invited you to give the evidence that you cite here. In that time Open Access has ceased to be an innovation and has become a norm. This is not a battleground any more. In the five-day Geneva Workshop on Innovation in Scholarly Communications, organised by CERN and the University of Geneva and attended by 1,400 scholars last week, I heard no voice that even questioned the hegemony of Open Access.

The battleground is elsewhere. Let's stable Rocinante, give him a good feed of corn, and listen to some market voices. Like the ScholarLed consortium and the COPIM partners, who spoke in Geneva of pooling publishing software solutions online to create infrastructure and scale for scholarly self-publishing of Open Access monographs. Or like Knowledge Unlatched in Berlin, using the subscription business model you so love to "open" books subscribed by libraries. Or the micropublishing work sponsored by Caltech, which publishes short evidence-based articles, many by postgrads and early-career researchers, and which addresses one of the problems of the day: how do young scientists get recognition and build up a portfolio of work when the great branded journals are barred to them by elitism and economics?

Or we could go and talk about Open Science, really the subject of which OA is but a tiny sub-section. As publishers we always shrank from understanding how scientists worked, but since all the processes of that work are now contained in seamless digital networks, we cannot avoid it. The Professor of BioSemantics at the University of Leiden is very clear: the Data is now more important than the Article. His peers elected him President of CODATA, the international standing committee on research data, and he chairs the High Level Expert Group of the European Open Science Cloud. One of his problems, as he works to proliferate the FAIR protocols and the Global Open FAIR mandates, is that publishers rushed to publish articles but ignored the Data. There is no business model for Data, yet its metadata and mark-up are urgent publishing problems. In a world where more machines than people are reading both articles and data, it is no good just marking up the narrative part so that a machine can serve it up to a human. Machines do not do narrative. They do RDF. They understand triples. Publishers really do have a long way to go before anyone, man or machine, reading an article can find the evidence and vice versa, and before both humans and machines can find and fully interact with both.
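For readers outside the data world, the point about triples can be made concrete in a few lines. An RDF triple is simply a (subject, predicate, object) statement, and linking an article to its evidence is just adding two such statements that machines can follow in either direction. The sketch below uses plain Python tuples rather than a full RDF toolkit, and every identifier in it is hypothetical, chosen for illustration only:

```python
# A minimal sketch of "machines understand triples".
# All identifiers (ex:article-123, ex:dataset-456) are invented examples.

# An RDF triple is just (subject, predicate, object).
triples = {
    ("ex:article-123", "rdf:type", "schema:ScholarlyArticle"),
    ("ex:dataset-456", "rdf:type", "schema:Dataset"),
    ("ex:article-123", "schema:isBasedOn", "ex:dataset-456"),  # article -> evidence
    ("ex:dataset-456", "schema:subjectOf", "ex:article-123"),  # evidence -> article
}

def find_evidence(article, graph):
    """Follow isBasedOn links from an article to its underlying data."""
    return [o for (s, p, o) in graph
            if s == article and p == "schema:isBasedOn"]

def find_articles(dataset, graph):
    """Walk the link the other way: from the data back to articles."""
    return [s for (s, p, o) in graph
            if o == dataset and p == "schema:isBasedOn"]

print(find_evidence("ex:article-123", triples))  # ['ex:dataset-456']
print(find_articles("ex:dataset-456", triples))  # ['ex:article-123']
```

Once both article and dataset carry such machine-readable links, "finding the evidence and vice versa" stops being an act of human detective work and becomes a trivial graph query.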

And then, of course, Open Science would restructure the article. Ethical considerations may yet demand that the hypothesis and the methodology be openly available before experimentation commences. Are publishers generally good, do we think, on the ethical side? Are retracted articles clearly marked as such in databases, so that no one would ever mistake one in a search? Are articles marked to show where work has been done to reproduce their results, and is that work linked to the original paper? Publishers really do need to understand how science is changing and work with it to provide the process tools it needs in terms of analytics, discoverability and reproducibility. Shifting to Open Access while postponing the real impact through transitional deals buys time, but that time has to be used to re-invent and re-invest in the future. Above all, we need to recognise the scale of what has changed. The 450,000 Covid-related research articles of the past two years defy human analysis. There is no time left for a decent tilt at a windmill, dear friend!

