When J P Rangaswami, Chief Scientist at Salesforce, speaks, as he did last year at the Outsell Signature Event, he often alludes to his childhood in a small Bengal village. When he went to the shop for cigarettes for his father or groceries needed by his mother, these were handed over to a child on trust, with the justified expectation that payment would be made by an adult in due course. I had similar experiences in a Gloucestershire farming village. They demonstrate to me that trust and privacy form part of the lattice-work which underpins a society in which it feels good to exist: one which strikes a balance between everyone knowing enough about everyone else to live at peace with them and trust them, and not knowing those things which are unnecessary to such trust and regarded as private by the individuals concerned.
We lost that balance in civil, real society in most western democracies in the second half of the twentieth century, so it is a little surprising that one of the most widespread criticisms of the virtual world of the networks is that these features are not in place. Yet the frequency with which both Trust and Privacy occur on conference agendas and in newspaper and blogosphere commentary reminds me that we talk a great deal about the loss of these things but do very little about them, even to the point of not consistently monitoring what is going on. And I was fascinated to hear, at a recent conference of librarians and academic researchers, one distinguished critic of publishing businesses point out that aggregators and publishers are now creating new and valuable information about the patterns of usage, the contexts in which certain types of usage take place, and the preferences of users where content was available unwrapped from its original – and that all of this great information was private to publishers and not available to authors. The paradox, he argued, lay in the sometimes unjustified trust given to academic peer review (he cited the Tamiflu case, of course), where those being trusted were denied full sight of the evidential data, in a milieu where peer reviewers are not even required to try to replicate the results of the experiment they are refereeing.
And these comments were made about science and medicine, areas where public trust must be blind because it cannot be fully informed. The other area where this same consideration applies is security, the flipside of Trust. When we are told that something is “secure” in the network, few of us have the ability to make a judgement. Thus few of us had doubts about SSL – until Heartbleed broke last week. I have been reading about this in a new online news-sheet called Cyber Security Intelligence (firstname.lastname@example.org), founded by my friends Alfred Rolington and Tim Heath. And I take it that the success of this venture – I have certainly become an avid reader – reflects the fact that Trust is now front and centre of our concerns on the network, and so Security, whether it is the Snowden leaks or a compromise at our bank, becomes critical to all of us.
In these difficult circumstances we will probably behave as we usually do. We will ignore warnings (who changed their passwords because of Heartbleed?) and only seek control of the Trust question by increasing security close to us when it is easy to do so, and enhances our Privacy at the same time. To an extent, everything we do in the network is competitive. I must have better Security and enhanced Privacy because I need it/deserve it/can pay for it. You do not need it, since I need to evaluate you as a credit risk/market research you as a target/use you as part of my data analytics sweep. In order, then, even to think about the question of balance with which I began this piece, we need to be able to decide, each for himself, our own settings in terms of Trust (Security) and Privacy.
The systems to help us do this are now becoming available, just as more and more of our personal information becomes network-available/vulnerable. Users of the new Galaxy S5 smartphone, just launched by Samsung, may or may not be keen on the data from its built-in heart rate monitor being available to insurers or employers. So the market at last seems interesting for services like Paogo (email@example.com), long in the wings (since 2008), or new developments like the French/American Tego (http://www.tegobox.com), in beta now for launch later this year. Here is something of what Tego says it will deliver:
“Simply tag a file and it will be encrypted and kept safe. Control what others can see and for how long.
No central server collecting your data, no tracking. Data never leaves your devices. Surf completely anonymously.
Build different personal environments with trusted contacts. Then share without risk”
Tego (“I protect” in Latin) will secure you against market analysts, hackers, and your own government – but what about that issue of balance? When everything is locked up we still won’t have the levels of Trust of JP’s Bengali village. Maybe I am looking for something completely different. Is anyone out there building a Trust machine which does data analytics on your avatar, your writing, and your facial expressions on Skype, and compares them with Trust models, so that I really know whether to trust anyone out there? And whether I am rated Trustworthy? I doubt it, since it would undermine elective politics completely; but if the answer to the machine is in the machine, as my dear friend Charles Clark was wont to say, then we should start now to engineer the network to find us that vital point of balance between Trust and Privacy.