I thought it might be a restful day. After the conferences, after Frankfurt, try another conference, in a different field, and attend as a spectator – NO Speaking. Add the fact that this one was to be held in the Royal Institution (Faraday’s lecture theatre is a favourite place) and that I was invited by an old friend, and I anticipated a light day catching up with the good folks in data marketing and what we used to call “list cleansing” years ago. The hosts were DQM (Data Quality Management), whom I have known for a long time in different manifestations, and the meeting was organized by their magazine, DataIQ, which has developed ways of benchmarking and replicating best practice in data management (www.dqmgroup.com). I settled into my seat prepared to be mildly re-educated: by mid-morning I was in a state of shock, and by the end of the day I was sure that another significant convergence was now locking into place.
Just think back a few years. For the information industry, preposterously concerned with their own proprietary content, data management of this type was simple housekeeping. Keeping lists of customers and prospects in good order was a non-strategic mid-level task. But now we live in a Big Data world, and while speakers urged us to see this as developmental as much as revolutionary, and pointed out how old the concepts are, there was still a feeling in the air that we could know so much more about customers and prospects that our chances of selling them something they wanted were increasing all the time. John Belchambers from Telefonica and Colin Grieves from Experian both hammered home the lesson, but as they were talking about global corporations and global marketing, my head was working furiously in the information products and services space. Think through the future of marketing information and the future of “publishing” (or creating information services and solutions) and you come out at the same place: the Market of One.
Noel Penrose, the former COO of Interbrand, made a vital intervention when he reminded us that Brand can be valued, and so can Data. We are not living in a place where data becomes commoditized if we are perpetually developing market-leading techniques for qualifying and comparing it. It was around this time that I saw why the agenda was dovetailed between speeches by two eminent futurologists. Melanie Howard, the chair of the Future Foundation advisory service, began the day in fine form, and Dr Ian Pearson, formerly BT’s futures man, ended it. Both gave excellent value in predictive terms, but their effect on me, in terms of the information marketplace, was to drive me towards the Market of One prediction and what it means for all of us.
They reminded us of some things that we should know already. Demographic change should be the first stop in every strategic enquiry. In a sensor-dominated society, each of us will be in receipt of 2 gigabytes a day of unsolicited content, and equally, as we walk and talk and move around the networks, we shall create as much again each day. While we talk about “mass customization” in a hopeful way, the drive to personalization, both in terms of marketing and services, is surely inexorable. Having just come from Frankfurt, where I moderated a panel on what we called “network publishing”, I can testify to the willingness of producers to think about “customized textbooks” (CourseSmart) or custom workflow for lawyers (Wolters Kluwer Germany), but can we really see beyond that?
Christine Andrews, DQM’s Managing Director, certainly sees the regulatory issues (another round of European Community privacy law belt-tightening is due in 2015), but sees beyond them as well. One of the criteria for value may well become the quality of data governance in a business, and its ability to audit and report its own performance. But she is very right to point to the barrier that consumer-based legislation creates at the moment – and will increasingly create in the US as that market catches up with European concerns. So turn that upside down for a moment. I could well predict, from what I heard this week, that we shall see a market where the power of the customer steadily increases to the point where powerful consumers are able to save and make private all aspects of their performance as network users, enabling them to sell it back to suppliers and marketeers in return for coupons, discounts, and customized products and services. In this permissioned world we shall have different levels or strata of market optimization – I can make this service fit a class of people who behave like you, or fit your behaviours specifically.
So what classes of data will have most value: objective data, derived from observation of what happens on the networks and commonly usable by all, or subjective data, derived from individual transactions and owned by the individuals themselves? The latter, I imagine, but by gaining permission to use the latter, or enough of it, we could add real value to the former. This is what the Financial Times do, I hope, when they assess the reading habits of 300,000 recordable readers every day using Deep View. The inestimable Chuck Richards at Outsell took us down this track in his note this week (October 15) on 1-to-1 marketing, which also indexed companies like IDG Techsignals and Scout Analytics (www.outsellinc.com).
I came down the road from the Royal Institution grateful to my hosts for a reminder that the future is part of the present, and that marketing data and content data are all data in the context of an individual customer’s requirement.