Nov 28
Credit Intelligence where Credit is Due
Filed Under B2B, Big Data, Blog, Financial services, Industry Analysis, internet, Search, semantic web, social media, Thomson, Uncategorized, Workflow | 1 Comment
The best network marketplace ideas are simple. And inexpensive in terms of user adoption. And productivity-enhancing. And regulator-pleasing. And very, very clever. So we need to give Credit Benchmark, the next business created by Mark Faulkner and Donal Smith, who successfully sold DataExplorers to Markit earlier this year, a double-starred AAA for ticking all these boxes from the start. And doing so in the white-hot heat of critical market and regulatory attention currently focused on the three great ratings businesses: S&P, Moody's and Fitch. Here is a sample from the US (taken from BIIA News, the best source of industry summaries these days at www.biia.com):
“Without specifying names, the U.S. regulator said on Nov. 15 that ratings agencies in the country experienced problems such as the failure to follow policies, keep records, and disclose conflicts of interest. Moody’s and Standard & Poor’s Corp. accounted for around 83% of all credit ratings, the SEC said. Each of the larger agencies did not appear to follow their policies in determining certain credit ratings, the SEC found, among other things. The regulator also said all the agencies could strengthen their internal supervisory controls.
The SEC noted that Moody’s has 128 credit analyst supervisors and 1,124 credit analysts, in contrast with S&P’s 244 supervisors and 1,172 credit analysts. The regulator also examined the function of board supervision at ratings agencies, and implied in its report that directors should be “generally involved” in oversight, make records of their recommendations to managers, and follow corporate codes of conduct. Source: Seeking Alpha”.
Well, in a global financial crisis, someone had to be to blame. It was the credit rating agencies who let us all down! The French government and the EU have them in their sights. They have a business worth some $5 billion with excellent margins (up to 50% in some instances). They are still growing by some 20% per annum because they are a regulatory necessity. They have become a natural target for disruptive innovation, and small wonder, because this combination of success and embedded market positioning attracts anger and envy in equal parts. Yet no one, least of all the critical regulators, wants disruptive change. It is easy enough to point to the problems of the current system, illustrate the conflicts inherent in the issuer-pays model, bemoan the diminished credibility of the ratings, or criticize the way in which multiple-notch revisions can suddenly bring crisis recognition where steady alerting over time would have been more useful, but at present no one has a better mousetrap.
At this point look to Credit Benchmark (http://creditbenchmark.org/about-us). Having successfully persuaded the marketplace, and especially the hedge funds, to contribute data on equity loans to a common market information service at DataExplorers (a prime example of UGC, user-generated content, more normally seen in less fevered and more prosaic market contexts), the team there have a prize quality to bring to the marketplace. They have been once, and can be again, a trusted intermediary for handling hugely sensitive content in a common framework which allows value to be released to the contributors, which gives regulators and users better market information, and which does not disadvantage any of the contributors in their trading activities. So what happens when we apply the DataExplorers principle to credit rating? All of a sudden there is the possibility of investment banks and other financial services firms sharing their own ratings and research via a neutral third party. At present the combined weight of the banks' own research, in manpower terms, dwarfs the publicly available services: there are perhaps as many as 8,000 credit analysts at work in the banks in this sector globally, covering some 74% of the risks. If all members of the data-sharing group were able to chart their own position on risks in relation to the way in which their counterparts across a very competitive industry rated the same risk using the same data, in other words see the consensus, see their own position, and identify the outliers, then the misinformation risk is reduced but the emphasis on judgement in investment is increased.
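To make the contributory idea concrete, here is a minimal, purely hypothetical sketch in Python. The bank names, the numeric mapping of rating grades and the outlier rule are all my own assumptions for illustration, not Credit Benchmark's methodology; the point is simply that pooled internal ratings yield a consensus, a spread, and a view of who sits outside it.

```python
# Hypothetical illustration only: not Credit Benchmark's actual method.
from statistics import mean, stdev

# Map an agency-style scale onto numbers for aggregation (assumed scale).
SCALE = {"AAA": 1, "AA": 2, "A": 3, "BBB": 4, "BB": 5, "B": 6, "CCC": 7}

# Internal ratings contributed by different banks for the same obligor
# (bank names and grades invented for the example).
contributed = {
    "Bank A": "A", "Bank B": "BBB", "Bank C": "A",
    "Bank D": "BBB", "Bank E": "BB",
}

scores = {bank: SCALE[grade] for bank, grade in contributed.items()}
consensus = mean(scores.values())
spread = stdev(scores.values())

print(f"Consensus score: {consensus:.2f}, spread: {spread:.2f}")
for bank, score in scores.items():
    # Flag a contributor as an outlier if it sits more than one
    # standard deviation away from the pooled consensus.
    flag = "outlier" if abs(score - consensus) > spread else "in line"
    print(f"{bank}: {contributed[bank]} ({flag})")
```

Each contributor sees where it stands relative to the pool without any bank's individual rating being attributed to it publicly, which is the essence of the trusted-intermediary model described above.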
And of course the Big Three credit agencies would still be there, and would still retain their "external" value, though maybe their growth might be dented and their ability to force up prices diminished if there were a greater plurality of information in the marketplace, and if banks and investors were not so wholly reliant upon them. The direction in which Credit Benchmark seem to be going is also markedly aligned with the networked world of financial services: user-generated content; data analytics in a "Big Data" context; the intermediary owning the analysis and the service value, but not the underlying data; the users perpetually refreshing the environment with new information at near real-time update. And these are not just internet business characteristics: they also reflect values that regulators want to see in systems that produce better-informed results. A good outcome of Credit Benchmark's contributory data model would be better visibility into thematic trends for investment instrument issuers and their advisors, as well as sharper perception and ongoing monitoring of their own, their clients' and their peers' ratings. In market risk management terms, regulators will be better satisfied if players in the market are seen to be benchmarking effectively, and analysts and researchers who want to track the direction and volatility of ratings at issuer, instrument, sector or regional level will have a hugely improved resource. And something else will become clear as well: the spread of risk, and where consensus and disagreement lie. Both issuers and owners get a major capital injection of that magic ingredient: risk-reducing information.
None of this will happen overnight. Credit Benchmark are currently working on proof of concept with a group of major investment banks, and the data analytics demand (in a marketplace which is not short of innovative analytical software at present) is yet to be fully analysed. Yet money markets are the purest exemplars of information theory and practice, and it would be satisfying to be able to report that one outcome of the global recession had been vast improvements in the efficacy of risk management and the credit rating of investments. Indeed, in this blog this year alone we have reported on crowd-sourcing and behavioural analysis for small personal loans (Kreditech), open data modelling for corporate credit (Duedil) and now, with Credit Benchmark, UGC and Big Data for investment rating. These are indicators, should we need them, of an industrial revolution in information as a source of certainty and risk reduction. Markets may never (hopefully) be the same again.
Nov 4
Beware: Lawyers at Work
Filed Under B2B, Big Data, Blog, eBook, Industry Analysis, internet, Reed Elsevier, Search, semantic web, social media, Uncategorized, Workflow | 3 Comments
Writing a piece here in September (The Way Lawyers Work Now) drove me back to the sustaining works of Richard Susskind: "The Future of Law" (1996), "Transforming the Law" (2000), and "The End of Lawyers?" (2008). They remain a most impressive achievement, and a rare effort to forecast the future of work in a particular vertical market sector. The trends that are apparent now align closely with the Susskind theses, especially in terms of the moves into practice solutions, where Lexis in the UK now pursue PLC much more closely, with the benefit of being able to support their solutions by invoking the whole research environment as well. Whether these moves support ideas of the democratization of access to the law (Richard quotes Shaw's dictum that "all professions are a conspiracy against the laity") is not the question for this blog. However, they certainly deliver a vision of deskilling and cost erosion, and suggest that many corporate and individual clients may in future have a very different procedural access to the law and its requirements.
I was encouraged in this thinking by discovering that Lexis UK last month published some of their own research survey findings, under the title "Practice Points". This was a very worthwhile exercise, though not so that we could learn that 66% of respondents forecast 10% growth per annum over the next two years. With so many UK law practices currently debating their status after the last government's liberalization measures, no one contemplating incorporation or flotation would say anything else. What impressed me more was the high score that lawyers gave to increased competition associated with the ABS (Alternative Business Structures) legislation, and the increase in M&A activity that this foretold. In order to hold costs, and even reduce them (those surveyed saw fixed fees, not hourly rates, as the future business model), the gearing had to change: they needed to recruit more support staff who were not going to share profits or become partners. The way in which many would do this was by outsourcing to a fixed-fee legal outsourcing company, often in the UK but sometimes offshore as well. And IT was the critical element: 60% looked to process automation to reduce costs and create the communications with clients and third-party suppliers which will make this work.
This plays well with the line on practice solutions now being taken by Lexis and long held by PLC in the UK. PLC's US expansion still appears on course, though moving more slowly in the recession. But I wondered about continental Europe, especially given the traditional positioning of German lawyers between clients, provincial regulation, Federal law and EU requirements. Do not forget that both Thomson Reuters and Lexis, in various ways, quit this difficult marketplace in the last decade. So I was delighted at Frankfurt to find Christian Dirschl of Wolters Kluwer Germany on my panel, and to be able to ask him whether German law publishers were having to adjust their positioning and move towards new access models alongside their existing commitment to research tools. And, since I have always found WK Deutschland very difficult to understand as an outsider, given that it has eight constituent law companies and another four tax imprints, I was hugely impressed by the answer: WK Germany has fully embraced semantic technologies by launching the Jurion interface (www.jurion.de) to make much of its own content, and growing amounts of third-party content, accessible in a contextualizable environment.
There are a number of very striking points about Jurion. In the first instance, WK have gone back and re-engineered their content acquisition, enrichment and bundling cycle. With their metadata ducks all in a row, and fundamental problems of delivery format and functionality solved, they have been able to invite third parties onto the platform to work through the same interface. So here you can get your Haufe content as well as your Luchterhand WK content, and if you are not a subscriber to the particular Haufe service, you can join up in 20 seconds. Then again, they are members of the EU-supported LOD2 project (http://lod2.eu), with 15 other companies in 12 countries. This lets Jurion swim in the world of EU Open Government (via the publicdata.eu platform), and provides not just another layer of content accessibility, but a context in which open source semantic technologies (DBpedia, Virtuoso, Sindice, Silk) can work jointly. Add to this rich stew a few more ingredients: their facility with semantic analysis and the LOD (linked open data) environment has propelled them into the development of major taxonomic instruments, with the legal thesauri now covering a large range of public and private content and WK becoming the effective gateway and standards setter for legal access. Then consider that at the same time they have integrated document construction and document location, using the same metadata. And then search all of this on legal terms and legal concepts. And then add, from the end of this year, web data as well as web content (look at the Wikipedia-style work accomplished here). Very impressive.
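By way of illustration only (this is not Jurion's actual data model or API, and every URI and property name below is invented), the sketch that follows shows the kind of linked-data pattern the technologies above make possible: a piece of commentary modelled as RDF, tagged with a thesaurus concept and linked to the statute it interprets, then retrieved with a SPARQL query using the rdflib Python library.

```python
# Purely illustrative sketch of a linked-data legal metadata pattern.
# All URIs and property names are hypothetical, not Jurion's vocabulary.
from rdflib import Graph, Namespace, URIRef, Literal, RDF, RDFS

LEX = Namespace("http://example.org/legal/")   # invented vocabulary
g = Graph()

commentary = URIRef("http://example.org/doc/commentary-123")
statute = URIRef("http://example.org/statute/bgb-433")

# Describe the commentary: its type, label, the statute it interprets,
# and the thesaurus concept it is tagged with.
g.add((commentary, RDF.type, LEX.Commentary))
g.add((commentary, RDFS.label, Literal("Commentary on sales contracts")))
g.add((commentary, LEX.interprets, statute))
g.add((commentary, LEX.subjectConcept, LEX.SalesLaw))

# Retrieve every document tagged with the 'SalesLaw' concept.
results = g.query("""
    PREFIX lex: <http://example.org/legal/>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?doc ?label WHERE {
        ?doc lex:subjectConcept lex:SalesLaw ;
             rdfs:label ?label .
    }
""")
for doc, label in results:
    print(doc, label)
```

The attraction of this approach is exactly the one described above: once content is enriched with shared concepts and identifiers, a publisher's own material, partner content and open government data can all be traversed through the same queries.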
But what does it look like from the user's screen? When I open my Jurion desktop I have options. jSearch is a normal law database environment with semantic search. jStore has the WK products, its partners' products, fast purchase and, almost inevitably, a recommendation system which is likely to be very important. jLink will allow annotation sharing and thus becomes a gateway to social media. jBook allows personalization and rebundling of content, and you can have it as an eBook or a print copy too. jCreate allows content creation, metadata allocation and sharing, via jStore for a fee if necessary. And jDesk subsumes the lawyer's desktop, providing indexation and coverage of the whole or parts of the firm's network; here clients have OCR, citation recognition, topic classification, and document creation. This is not yet fully complete, but it is a startling step forward. It potentially transforms the competitive structure of the German law and tax market, and it is based on vital ideas of collaboration which have to underlie all of these developments in future.
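As a small, hypothetical illustration of one jDesk-style capability (this is not the actual implementation), citation recognition can begin with something as simple as a regular expression that spots German statute references such as "§ 433 Abs. 2 BGB" in a document:

```python
# Illustrative sketch only: a naive recogniser for German statute citations.
import re

CITATION = re.compile(
    r"§\s*(?P<section>\d+[a-z]?)"             # section number, e.g. 433 or 312b
    r"(?:\s+Abs\.\s*(?P<para>\d+))?"          # optional paragraph ("Absatz")
    r"\s+(?P<code>[A-ZÄÖÜ][A-Za-zÄÖÜäöü]*)"   # code abbreviation, e.g. BGB, StGB
)

text = "Der Anspruch folgt aus § 433 Abs. 2 BGB; siehe auch § 280 BGB."
for m in CITATION.finditer(text):
    # Print code, section and (if present) paragraph for each match.
    print(m.group("code"), m.group("section"), m.group("para") or "-")
```

Recognised citations can then be resolved against the same metadata that drives search, which is what turns a static document into a linked working environment.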
WK Germany have gone horizontal in their effort to supply the lawyer in Germany with a complete access point. Lexis in the UK have gone vertical in their bid for practice solutions. Both of these legitimate approaches will one day end in the same place, with comprehensive and collaborative service environments that eventually begin to democratize access to the law.