During this period of enforced convalescence I have had to come to grips with the idea that my brain only works effectively when supported by the memory in all the devices around me. And that this state of dependency is now global. Without our membership of a globally networked society we would become slow and inefficient: with it we become dependent. And it is this dependency which seems to me the first stop on a mental route march which we need to make. I am far from the first to try to examine what the Internet of Things (IoT) or, as some will say, the Internet of Everything (IoE) will mean for social, industrial or commercial aspects of society. But I do not yet hear much examination of this phenomenon in terms of the information industry, let alone the businesses we insist on still calling “publishing” or the “media”.

Let’s start at a point of common agreement. We are in the middle of a new industrial revolution. For evidence, check the websites of IEEE or IET: the latter have just published a splendid “Ones to Watch” report (http://www.theiet.org/policy/media/campaigns/ones-to-watch.cfm?utm_source=redirect&utm_medium=any&utm_campaign=onestowatch#.VHt6PmB0imw.mailto).
They see the vanguard industries in this fundamental change in the nature of commerce and society – think what happened in the UK between 1780 and 1830 – as driven by space exploration, robotics, 3D printing (I would rather they had spoken of additive manufacturing), new energy networks, food manufacturing and cyber-security. I buy all of those, but would add drug manufacture driven by individual DNA analysis.

Underlying this social and industrial revolution is the revolution that makes it all possible: the global connectivity of network-attached computing power, and its ability to exploit intelligence and data generated in the network. Only this week Professor Stephen Hawking has pointed out the dangers of AI outside of man’s control. My feelings run the other way: it amazes me that while we have spoken of machine intelligence for 30 years we have so little to show for it. Only in the past few years, as we have learnt to harvest data more effectively and to cross-search it without restructuring it, has the impact of the advances in data analytics (“Big Data”) really struck home. While we will always be seduced by thrills and tricks (Google Glass?), we can now see machine intelligence built into most common workflows and at a variety of levels.

Here is a list, posted by Vincent Granville at DataScienceCentral, of impact areas for data analysis in the next ten years:

(http://www.datasciencecentral.com/profiles/blogs/17-areas-to-benefit-from-big-data-analytics-in-next-10-years)

Just look at how many of these impact the information industry marketplace. As our world of work changes, so the very survival of information market players will depend upon how easily we are able to track change and react to it. But what part of this struggle to survive can we lay at the door of IoT/IoE? And can we picture an IoT world which is less trivial than sports wearables or more useful than a car that turns on the house lights at home when you are still a mile away? Well, obviously we can, but the unacceptable passengers riding on the back of IoT must then be taken into account. Yes, it does mean that we shall move from the age of privacy into the age of transparency – and we are halfway there already. And, yes, it does mean that employment is going to be very different. We will lose millions of jobs, and we are surprisingly far down this track as well. The UK public sector will lose a further million jobs in the next five years, we learnt this week. Some of those will be outsourced, but governments do not give up governing lightly – and many of those jobs will become automated systems roles in the outsourcing process. And it may well mean that, at last, we have to properly rethink what Capitalism means. After all, a zero marginal production cost society will ask questions about how the profit mechanism works.

For a good review of many of these questions see Sue Halpern’s review article in the New York Review of Books (vol. LXI, Number 18, 3 December 2014). Cisco famously predicts that all of this adds up to a 14.4 trillion dollar boost to the global economy between now and 2022. The 10 million sensors that measured our world in 2007 will number 100 trillion by 2030. In Rotterdam docks all containers will be engineered for auto drive by 2018. Uber, a precursor of the automated driving world, was as valuable as Time Warner this very month. For better or worse, this world is with us now. This is not 1780 in the original British experience, but 1820 and the railway boom is just beginning. And for information companies of every type there is a corresponding possibility of mega growth, as long as we read change accurately. Wherein lies a problem that I want to address later.
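Those sensor numbers are worth pausing over. As a quick back-of-envelope check, here is a minimal Python sketch using only the 2007 and 2030 figures quoted above; everything else is simple arithmetic, not a forecast of mine:

```python
# Back-of-envelope arithmetic: what annual growth rate takes us from
# 10 million sensors in 2007 to 100 trillion by 2030 (figures quoted above)?
start_count = 10_000_000             # sensors measuring our world in 2007
end_count = 100_000_000_000_000      # sensors forecast for 2030
years = 2030 - 2007

cagr = (end_count / start_count) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # roughly 100% per year
```

In other words, the sensor population has to roughly double every single year, for twenty-three years running, for that forecast to come true.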

When J P Rangaswami, Chief Scientist at Salesforce, speaks, as he did last year at the Outsell Signature Event, he often alludes to his childhood in a small Bengal village. When he went to the shop for cigarettes for his father or groceries needed by his mother, these were handed over to a child on trust, and with the justified expectation that payment would be made by an adult in due course. I had similar experiences in a Gloucestershire farming village, which demonstrates to me that trust and privacy form part of the lattice-work which underpins a society in which it feels good to exist. In such a society there is a balance between everyone knowing enough about everyone else to live at peace with them and trust them, and not knowing those things which are unnecessary to such trust and which the individuals concerned regard as private.

We lost that balance in civil, real society in most western democracies in the second half of the twentieth century, so it is a little surprising that one of the most widespread criticisms of the virtual world of the networks is that these features are not in place. Yet the frequency with which both Trust and Privacy occur on conference agendas and in newspaper and blogosphere commentary reminds me all the time that we talk a great deal about the loss of these things, but do very little about them, even to the point of not consistently monitoring what is going on. And I was fascinated to hear, at a recent conference of librarians and academic researchers, one distinguished critic of publishing businesses point out that aggregators and publishers are now creating new and valuable information about the patterns of usage, the contexts in which certain types of usage take place, and the preferences of users where content was available unwrapped from its original – and that all of this great information was private to publishers and not available to authors. The paradox, he noted, lay in the levels of sometimes unjustified trust given to academic peer review (he cited the Tamiflu case, of course), where those being trusted were denied full sight of the evidential data, in a milieu where peer reviewers are not even required to try to replicate the results of the experiment which they are asked to referee.

And these comments were made about science and medicine, areas where public trust must be blind because it cannot be fully informed. The other area where this same consideration applies is security, the flipside of Trust. When we are told that something is “secure” in the network, few of us have the ability to make a judgement. Thus few of us had any doubts about SSL until Heartbleed broke last week. I have been reading about this in a new online newssheet called Cyber Security Intelligence (info@cybersecurity-intelligence.com), founded by my friends Alfred Rolington and Tim Heath. And I take it that the success of this venture – I have certainly become an avid reader – reflects the fact that Trust is now front and centre of our concerns on the network, and so Security, whether it is the Snowden leaks or security being compromised at our bank, becomes critical to all of us.

In these difficult circumstances we will probably behave as we usually behave. We will ignore warnings (who changed their passwords because of Heartbleed?) and only seek control of the Trust question by increasing security close to us when it is easy to do so and enhances our Privacy at the same time. To an extent, everything we do in the network is competitive. I must have better Security and enhanced Privacy because I need it/deserve it/can pay for it. You do not need it, since I need to evaluate you as a credit risk/market-research you as a target/use you as part of my data analytics sweep. In order, then, even to think about the question of balance with which I began this piece, we need to be able to decide, each of us for ourselves, our own settings in terms of Trust (Security) and Privacy.

The systems to help us do this are now becoming available, just as more and more of our personal information becomes network available/vulnerable. Users of the new Galaxy S5 smartphone, just launched by Samsung, may or may not be keen on the readings from its built-in heart rate monitor being available to insurers or employers. So the market at last seems interesting for services like Paogo (info@paogo.com), long in the wings (2008), or new developments like the French/American Tego (http://www.tegobox.com), in beta now for launch later this year. Here is something of what Tego says it will deliver (a rough sketch of the principle follows the quote):

“Simple
Simply tag a file and it will be encrypted and kept safe. Control what others can see and for how long.
Clear
No central server collecting your data, no tracking. Data never leaves your devices. Surf completely anonymously.
Personal
Build different personal environments with trusted contacts. Then share without risk”
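I have no sight of how Tego actually builds any of this, but the principle it describes, encrypting on the device before anything is shared and attaching your own rules about who may see it and for how long, can be sketched in a few lines. The sketch below is purely illustrative and assumes nothing about Tego's real design: the envelope format, the expiry rule and both function names are my own invention.

```python
# Illustrative sketch only: client-side encryption with a sender-imposed
# expiry, in the spirit of Tego's description. Not Tego's actual code or API.
import json
import time
from cryptography.fernet import Fernet  # pip install cryptography

def tag_and_encrypt(path: str, visible_for_days: int) -> tuple[bytes, bytes]:
    """Encrypt a file on the local device and wrap it with an expiry rule."""
    key = Fernet.generate_key()          # the key never leaves the device
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    envelope = json.dumps({
        "expires_at": time.time() + visible_for_days * 86400,
        "payload": ciphertext.decode(),
    }).encode()
    return envelope, key                 # hand the key only to trusted contacts

def open_envelope(envelope: bytes, key: bytes) -> bytes:
    """Refuse to decrypt once the sender's sharing window has passed."""
    record = json.loads(envelope)
    if time.time() > record["expires_at"]:
        raise PermissionError("The sender's sharing window has expired")
    return Fernet(key).decrypt(record["payload"].encode())
```

An expiry enforced in the recipient's software is, of course, only as strong as that software; the point of the sketch is simply that the content and the key can stay under the owner's control, with no central server in the loop.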

Tego (“I protect” in Latin) will secure you against market analysts and hackers and your own government – but what about that issue of balance? When everything is all locked up we still won’t have the levels of Trust of JP’s Bengali village. Maybe I am looking for something completely different. Is anyone out there building a Trust machine, which does data analytics on your avatar, on your writing, on your facial expressions on Skype, and compares them with Trust models, so that I really know whether to trust anyone out there? And whether I am rated Trustworthy? I doubt it, since it would undermine elective politics completely, but if the answer to the machine is in the machine, as my dear friend Charles Clark was wont to say, then we should start now to engineer the network to find us that vital point of balance between Trust and Privacy.
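If we did start that engineering, the first crude step might look less exotic than the phrase “Trust machine” suggests: gather a handful of behavioural signals, weight them against some model of trustworthy behaviour, and produce a score. The toy below is entirely hypothetical; the signals, the weights and the trust_score function are all invented for illustration, and real analytics would be vastly more subtle.

```python
# A deliberately naive "Trust machine": weight a few behavioural signals
# against a hand-made model. Entirely hypothetical and for illustration only.
SIGNAL_WEIGHTS = {
    "writing_consistency": 0.4,  # does what they write match what they have written before?
    "profile_longevity":   0.3,  # how long has the avatar or account existed?
    "peer_vouching":       0.3,  # how many already-trusted contacts vouch for them?
}

def trust_score(signals: dict[str, float]) -> float:
    """Combine 0..1 signals into a single 0..1 score using fixed weights."""
    return sum(weight * signals.get(name, 0.0)
               for name, weight in SIGNAL_WEIGHTS.items())

# Example: a long-standing contact whose writing checks out, but whom
# nobody I already trust will vouch for.
example = {"writing_consistency": 0.9, "profile_longevity": 0.8, "peer_vouching": 0.1}
print(f"{trust_score(example):.2f}")  # 0.63: suggestive rather than decisive
```

Even a toy like this makes the difficulty plain: the same analytics that might tell me whom to trust would also be busy rating my own Trustworthiness.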
