Don’t you love the way that financial analysts run for the cover of the Big Generalization? So Thomson Reuters buying PLC (Practical Law Company) on 3 January is Consolidation. Big getting Bigger. More market share. Problems of law markets in the recession years need to be addressed by bigger content units. Simples? No, not at all. And this form of analysis entirely misses the point. Why did Thomson Reuters need to buy PLC now? Where does it place them in the evolving story of professional services? And what does this acquisition do to their existing services and their positioning in the place where there is growth – small and medium-sized law practices? In fact, this is a story which may be superficially easy to categorize, but which actually tells us a very great deal about what is happening to networked services in the professional sphere of activity.

I have written about this in several pieces in the last three months (“Beware: Lawyers at Work”, 4 November 2012 and “The Way Lawyers Work Now”, 13 September 2012). I have tried to underline there that this is not a new process. Robert Dow and Chris Millerchip, who founded PLC, left Slaughter and May to do so in 1990. As I recall the story, their very first impetus was to start a magazine which would advise lawyers on practical processing issues in dispatching routine legal matters; only later did they turn to devising and implementing those pieces of process – precedents, practice notes, checklists, document templates etc – which would dig down deep in key sectors like commercial, corporate, employment, intellectual property, competition or finance law. They now have what the press release describes as a “comprehensive suite”, and they do this in the US as well as in the UK. They are universally respected, used by 96% of the UK’s top law firms and 80% of the AmLaw 200, yet at around £50m in revenue in 2011, surprisingly small. However, they are exceedingly profitable, running subscription services which few ever leave (they become part of the way your law firm works), and they are often quoted as running EBITDA returns in the high 30% range. Estimates of their sale price this week were around £300m, arguing 12x a forward EBITDA of £25m. We shall never know, but even these estimates indicate a very valuable company that Reed Elsevier’s Lexis and Thomson Reuters’ Westlaw have sought to buy for years. But they would never pay the founders’ idea of a full price. So why Thomson Reuters, and why now?

I have tried to indicate in those previous pieces that Publishers (aka Butterworths or Sweet & Maxwell in 1990) would not have seen what PLC do as “publishing”. And, from the 1970s onwards, big law publishing had invested in the world of Research (which in lawyer terms means that they were mostly concerned with litigation, always a bigger game in the US than in the UK). As a result Westlaw and Lexis dominated law library budgets in major law firms globally, but their revenue base was very dependent upon a small base of litigators, and upon the ability of their costs to be charged through the system to the ultimate client. However, the practice of law is mostly not like that, but rather more like the patient game of form completion and document filing where PLC sought to introduce productivity gains. It took a global recession, but now the big law publishers get it too. The impressive attempts by Lexis in London to build practice tools and sell more use of research through them bear witness to that: strategy turns through 180 degrees when we realise that we are not in business simply to support and then replace the library, but that we are there to handle the whole business of the law office. This is about productivity gain, better decision-making and cheaper and more effective compliance, this “business of law” thing, and if we can do it for lawyers we can do it for any professionals. As the largest player in Law as Research, Thomson Reuters were the most vulnerable player as the market began to move towards these Business of Law considerations.

But, just a minute: a lot of those future customers in the law office context will not be lawyers. Even lawyers, as polled by Lexis in the UK, see the majority of routine work getting parcelled out to legal services and paralegal services players, both onshore and offshore. And there will also be expert systems doing some of this work. Law offices will get smaller and more expert, and will sell on their expertise alongside and within the workflow that they place with contractors. But how do you ensure quality results – unless the outsourcers use standard precedents and proven workflow modelling from verifiable sources? And what happens when these tools reach medium and smaller practices? Quality gets improved and cost competition grows. It is not hard to see the law office and the corporate law/counsel office of the future. It runs on the network, uses work processed by a variety of hands in different places, employs standardized and compliance-approved workflow tools, and allows users to collaborate in alerting each other to threats or reversals (in the courts) which may inhibit the utility of some of those processes. Thomson Reuters just joined this world, and not a moment too soon. Some of their thinking, and some Lexis minds, were there already. But now it is official: Business of Law is the Future of Law.

Two points remain to be made. We have to recall that Messrs Dow and Millerchip left Slaughter and May, where they had been working lawyers in search of efficiencies. In other words, they were not the editorial/academic lawyers normally employed by publishers. This says something about the sort of people Thomson Reuters and Lexis will need to employ to get this huge transition right. Then again, one major player is yet to shift. Bloomberg is a private company and what it does is its own business, yet the maintenance of the infant Bloomberg Law separate from Bloomberg BNA is an enigma, as is the apparent indifference in the 12 months since the BNA acquisition towards global markets or these Business of Law issues. Perhaps having to have everything on the Bloomberg box, rather than in a cloud/network configuration, has something to do with it. As it is, in contrast to Thomson Reuters and Lexis, Bloomberg’s offering looks a bit off the pace of change. Enough reason, perhaps, for Thomson Reuters to buy PLC in the first place?

Thomson Reuters Press Release: http://www.prnewswire.co.uk/news-releases/thomson-reuters-to-acquire-practical-law-company-185535352.html

So have we all got it now? When our industry (the information services and content provision businesses, sometimes erroneously known as the data industry) started talking about something called Big Data, it was self-consciously re-inventing something that Big Science and Big Government had known about and practised for years. Known about and practised (especially in Big Secret Service; for SIGINT see the foot of this article), but worked upon in a “finding a needle in a haystack” context. The importance of this only revealed itself when I found myself at a UK Government Science and Technology Facilities Council event at the Daresbury Laboratory in the north of England earlier this month. I went because my friends at MarkLogic were one of the sponsors, and spending a day with 70 or so research scientists gives more insight into customer behaviour than going to any great STM conference you may care to name. I went because you cannot see the centre until you get to the edge, and sitting amongst perfectly regular, normal folk who spoke of computing in yottaflops (processing speeds of 10 to the power of 24 operations per second) as if they were sitting in a laundromat watching the wash go round is fairly edgy for me.

We (they) spoke of data in terms of Volume, Velocity and Variety, sourced from the full gamut of output from sensor to social. And we (I) learnt a lot about the problems of storage, which went well beyond the problems of a Google and a Facebook. The first speaker, from the University of Illinois, at least came from my world: Kalev Leetaru is an expert in text analytics and a member of the Heartbeat of the World Project team. The Great Twitter Heartbeat ingests Twitter traffic, sorts and codes it so that US citizens going to vote, or Hurricane Sandy respondents, can appear as geographical heatmaps trending in seconds across the geography of the USA. The SGI UV which did this work (it can ingest the printed resources of the Library of Congress in 3 seconds) linked him to the last speaker, the luminous Dr Eng Lim Goh, SVP and CTO at SGI, who gave a magnificent tour d’horizon of current computing science. His YouTube videos are as wonderful as the man himself (a good example is his address for the 70th birthday of his teacher, Stephen Hawking, but also look at http://www.youtube.com/watch?v=zs6Add_-BKY). And he focussed us all on a topic not publicly addressed by the information industry as a whole: the immense distance we have travelled from “needle in a haystack” searching to our current preoccupation with analysing the differences between two pieces of hay – and mapping the rest of the haystack in terms of those differences. For Dr Goh this resolves to the difference between arranging stored data as a cluster of nodes and working in shared memory (he spoke of 16 terabyte supernodes). As the man with the very big machine, his problems lie in energy consumption as much as anything else. In a workflow that goes Ingest > Store and Organize > Analytics > Visualize (in text and graphics, like the heatmaps), the information service players seem to me to be involved at every point, not just the front end.
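For the non-technical reader, a minimal sketch may make that four-stage workflow concrete. The toy Python below is purely illustrative – all names and sample data are invented, and a real heartbeat-style system ingests a firehose of millions of messages per minute – but it walks the same path: ingest geotagged messages, organize them into coarse map cells, count keyword mentions per cell, and print the raw material of a heatmap.

```python
# A hypothetical, miniature version of Ingest > Store and Organize >
# Analytics > Visualize. Everything here is invented for illustration.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Message:
    text: str
    lat: float
    lon: float

def ingest():
    # Ingest: in reality a continuous stream of sensor/social output;
    # here, a static handful of made-up geotagged messages.
    return [
        Message("Voting today!", 40.7, -74.0),
        Message("Long lines at the polls", 41.8, -87.6),
        Message("Polling station opened late", 40.6, -73.9),
        Message("Storm damage everywhere", 29.7, -95.4),
    ]

def grid_cell(msg, cell_size=1.0):
    # Store and organize: bucket each message into a coarse map cell.
    return (int(msg.lat // cell_size), int(msg.lon // cell_size))

def heat_counts(messages, keyword):
    # Analytics: count keyword mentions per cell -- the raw numbers
    # behind a geographic heatmap.
    return Counter(
        grid_cell(m) for m in messages if keyword in m.text.lower()
    )

if __name__ == "__main__":
    # Visualize: a textual stand-in for the trending maps described above.
    for cell, n in heat_counts(ingest(), "poll").most_common():
        print(f"cell {cell}: {n} mention(s)")
```

The point of the sketch is simply that information service skills – sourcing, structuring, analysing, presenting – are needed at all four stages, not just the front end.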

The largest data sourcing project on the planet was represented in the room (the SKA, or Square Kilometre Array, a remote sensing telemetry experiment with major sites in Australia and South Africa). Of course, NASA is up there with the big players, and so are the major participants in cancer research and human genomics. But I was surprised by how Big the Big Data held by WETA Data (look at all the revolutionary special effects research at http://www.wetafx.co.nz/research) in New Zealand was, until I realised that this is a major film archive (and NBA Entertainment is up there too on the data A-list). This reflects the intensity of data stored from film frame images and their associated metadata, now multiplied many times over in computer graphics-driven production. But maybe it is time now to stop talking about Big Data, the term which has enabled us to open up this discussion, and begin to reflect that everyone is a potential Big Data player. However small our core data holding may be compared to these mighty ingestors, if we put proprietary data alongside publicly sourced Open Data and customer-supplied third party data, then even very small players can experience the problems that induced the Big Data fad. Credit Benchmark, which I mentioned two weeks ago, has little data of its own: everything will be built from third party data. The great news aggregators face similar data concentration issues, as their data has to be matched with third party data.
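To see why matching third party data turns a small player into a Big Data player, consider a deliberately tiny sketch (all datasets, entities and field names below are invented): merging even one proprietary record with open and customer-supplied data already raises the reconciliation problems that scale so painfully.

```python
# Hypothetical illustration: a small proprietary dataset enriched with
# open data and customer-supplied third party data, keyed by entity name.
# Real systems must also reconcile conflicting identifiers, formats,
# duplicates and update frequencies across sources.

proprietary = {"ACME": {"internal_score": 7.2}}
open_data = {"ACME": {"public_filings": 14}, "GLOBEX": {"public_filings": 3}}
third_party = {"ACME": {"client_rating": "B+"}}

def merge(*sources):
    """Outer-join records by entity key; later sources win per field."""
    merged = {}
    for source in sources:
        for entity, fields in source.items():
            merged.setdefault(entity, {}).update(fields)
    return merged

print(merge(proprietary, open_data, third_party))
# {'ACME': {'internal_score': 7.2, 'public_filings': 14,
#           'client_rating': 'B+'}, 'GLOBEX': {'public_filings': 3}}
```

The volume may stay modest, but the variety – and the matching and provenance questions it brings – is exactly the Big Data problem in miniature.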

And I was still thinking this through when news came of an agreement signed by MarkLogic (www.marklogic.com) with Dow Jones on behalf of News International this week. The story was covered in interesting depth at http://semanticweb.com/with-marklogic-search-technology-factiva-enables-standardized-search-and-improved-experiences-across-dow-jones-digital-network_b33988, but the element that interested me, and which highlights the theme of this note, concerns the requirement not just to find the right article, but to compare articles and demonstrate relevance in a way which only a few years ago would have left us gasping. Improved taxonomic control, better ontologies and more effective search across structured and unstructured data lie at the root of this, of course, but do not forget that good results at Factiva now depend on effective Twitter and blog retrieval, and on effective ways of pulling back more and more video content, starting with YouTube. The variety of forms takes us well beyond the good old days of newsprint, and underlines the fact that we are all Big Data players now.
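Comparing two pieces of hay, rather than finding the needle, has a simple core that is worth seeing once. The sketch below is only a caricature of what the Factiva work described above involves – the sample headlines are invented, and plain cosine similarity over raw word counts is far cruder than taxonomy- and ontology-driven matching – but it shows the shift from retrieving one document to scoring how alike two documents are.

```python
# A toy comparison of articles: score similarity rather than just retrieve.
# Invented sample data; production systems layer taxonomies, entity
# extraction and weighting (e.g. TF-IDF) on top of this skeleton.
import math
from collections import Counter

def vectorize(text):
    # Bag-of-words term counts for one document.
    return Counter(text.lower().split())

def cosine_similarity(a, b):
    # Cosine of the angle between two term-count vectors: 1.0 means
    # identical word profiles, 0.0 means no words in common.
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

doc1 = "court upholds ruling on media merger"
doc2 = "media merger ruling upheld by appeals court"
doc3 = "quarterly earnings beat analyst forecasts"

print(cosine_similarity(vectorize(doc1), vectorize(doc2)))  # relatively high
print(cosine_similarity(vectorize(doc1), vectorize(doc3)))  # 0.0: unrelated
```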

Note: Alfred Rolington, formerly CEO at Jane’s, will publish a long-awaited book with OUP, “Strategic Intelligence in the Twenty-First Century”, in January; it can be pre-ordered on Amazon at http://www.amazon.co.uk/Strategic-Intelligence-21st-Century-Mosaic/dp/0199654328/ref=sr_1_1?s=books&ie=UTF8&qid=1355519331&sr=1-1. And I should declare, as usual, that I do work from time to time with the MarkLogic team, and thank them for all they have done to try to educate me.
