Dec 16
Metadata Memento Mori
Filed Under B2B, Blog, data protection, Education, Financial services, healthcare, Industry Analysis, internet, news media, Publishing, Uncategorized, Workflow
Content was once valuable. Then content about content, the metadata that identified our content values and made them accessible, became a greater and more powerful value. Soon we stood at the edge of a universe where no searching would take place which did not involve machine interrogation of metadata. We evolved ever more complex systems of symbology to ensure that customers who used our content were locked into accepting our view of the content universe by virtue of accepting our coding and metadata, and using it in relation to third-party content. Further, we passed into European law, in the provisions of the so-called directive on the legal protection of databases, the notion that our metadata was itself a protectable database. Now content is less valuable, more commoditized, and inevitably widely copied. So our fallback position is that our metadata contains the unique intellectual property, and as long as we still have that in a place of safety we are secure. And can sleep easily in our beds.
Until the day before yesterday, that is. For on 14 December the European Union’s Official Journal published a settlement offer from Thomson Reuters in a competition enquiry which has run for two years (http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=OJ:C:2011:364:0021:0024:EN:PDF). The case concerns Thomson Reuters’ use of its RIC codes. Insofar as these have become the standard way in which traded equities are described in datafeeds, the fact that the market bought the Reuters solution as a surrogate for standardization did give Thomson Reuters competitive advantage – and this is made clear by the fact that the Commission investigation was prompted by its commercial rivals. But that advantage was not unearned, and the standardization that resulted from it brought benefits across the market. Now Thomson Reuters, to end the process, offers licensing deals and increased access to its metadata. This may turn out to be a momentous moment for the industry.
I have no interest here in examining whether Thomson Reuters are right or wrong to seek a deal. From Microsoft to Google to Apple, the frustrations of enquiries by the competition commissioner’s office in Brussels have worn down the best and most resilient. But I do want to comment on what may be happening here. If you accept my thesis that content is becoming increasingly commoditized and that systems for describing it are increasingly valuable, we may have to recalibrate our picture of what is happening as a result of this news. What if, in fact, the commoditization involved here spreads slowly up the entire information value chain over time? In this model, the famous value pyramid which we have all used to subjugate our audiences and colleagues is under commoditization water at its base, which is where raw data and published works are kept. Now the next level is becoming slightly damp from this rising tide, as descriptive modalities get prised off and become part of the common property of all information users. So information vendors scramble further up the pyramid, seeking dry land where ownership can be re-asserted. Maybe more advanced metadata will offer protection and enhance asset value. The SCORM dataset in an educational product can annotate learning outcomes and allow objects and assessments to be associated. Or, following the financial services theme here, maybe we add Celerity-style intelligence to content which allows a news release to be “read” in machine-to-machine dialogue, and trading actions sparked by the understanding created. We will certainly do all these things, because no one will buy our services if they do not accord with the most appropriate descriptive norms available. But will they protect our intellectual property in content or data? No, I am increasingly afraid that they will not.
It will take many years to happen. And it will happen at a very different pace in different marketplaces. But the days when you valued a company by its content IP, by its copyrights and its unique ownership value have been over for some time. And now we can see that the higher order values are themselves becoming susceptible to competition regulation which seems, in this age, to over-ride IP rights in every instance. So what are we actually doing when we say we are building value? Normally, it seems to me, we are combining content with operational software systems to create value represented by utility. From the app to the workflow system, content retains its importance in the network because we shape it not just for research, but for action, for process, for communication. And that, after all, is where the definition of a networked society with a networked economy lies.
And if we were in doubt about this, reflect on the current preoccupation with Big Data. Is our society going to be willing to hold up the vital release of “new” scientific knowledge from the ossified files of journal publishers just because some of this stuff is owned by Elsevier and some by Wiley? The water of analytic progress is already flowing around the dams of copyright ownership, and this week surged past a major player protecting his coding, though the proposed licensing scheme does leave a finger in the hole in the dyke. We seem to me to be running at ever greater speed towards a service economy in professional information where the only sustaining value is the customer appreciation of service given, measured in terms of productivity, process improvement, and compliance. These benefits will be created from content largely available on the open web, and increasingly using metadata standards which have gone generic and are now, like RICs, part of the common parlance of the networked marketplace. The language of IP in the information economy is getting to sound a bit old-fashioned.
Dec 3
Accelerated Departures Confront Reality Shock
Filed Under B2B, Blog, Financial services, healthcare, internet, news media, Publishing, Reed Elsevier, Thomson, Workflow
You can tell when even major corporates are embarrassed. Their use of language deteriorates to the point where meaning (hopefully) vanishes and we hacks are left to put our own, corporately deniable, slant on their gnomic pronouncements. Thus it is with the “accelerated departure” of Tom Glocer, CEO of Thomson Reuters. What exactly does that mean? Did he leave before his time, or was he unexpectedly ejected? The rumour mill had it that he was going in April 2012, so was the acceleration to be found there (his fourth anniversary hardly amounts to long senior service at such a stable outfit as Thomson Reuters), or in his contract, or elsewhere? And did he know, or was he pushed?
Certainly it is always alleged that his predecessor, Dick Harrington, did not know that a discreet negotiation continued behind the scenes bringing Thomson and Reuters together with no place in it for him. That, if true, must have been a surprise. Did Tom Glocer come by a similar “confronts reality” shock, as the FT termed it? And what was the reality that was being confronted? I can think of at least three realities that must needs be in the minds of Thomson Reuters CEOs, and none of them relate to the decline in market value which is widely blamed for triggering these changes. The first, and most important, is the nature of the company’s ownership. Wherever a big player is really 55% controlled by the family of its original founders, confidence issues will come into play. This is real control, not the artificial dominance of voting shares practised by Murdochs or Harmsworths in defiance of market views of good practice. And this real control means that, as in the eighteenth century, once the incumbent first minister loses the confidence of the King and his closest advisor, it is impossible to continue in office. That rule applied to the reign of Ken Thomson and John Tory, as it does in the Woodbridge Trust of David Thomson and Geoffrey Beattie. It is simple and natural; you go when the owners no longer believe you can deliver.
And since Thomson Reuters are the largest professional player in the marketplace, it is worth asking what these men need to have confidence about. As far as the press commentary is concerned, one would think that the only issue is the Eikon terminal and its slow start. Well, the history of Reuters is littered with slow starts, one of which let Bloomberg into the marketplace to begin with, and several of which accumulated to create this peculiar position where the smartest and most modern application is also the cheapest and has lost market share in the recession to Bloomberg’s older and more expensive option. In each of these cycles the market for trading systems has returned to rough parity. Over at the professional side of Thomson they know about these cycles, having sometimes been up and sometimes down, but in that market they are currently in the Bloomberg position and Lexis are in the Reuters position. So did Tom Glocer’s acceleration towards the swing doors relate to all this?
Certainly this may have been the symptom, but perhaps it was not the underlying problem. The mandate that Tom Glocer accepted was to build an integrated company and it is possible, as the company became wracked by the issue of combining the parts to create new growth as a whole, that the Woodbridge owners began to doubt whether this aim was ever going to be achieved through these policies. Certainly the sacrificial slaughter of a layer of Reuters management and the balkanization of the company into an unmanageable number of operating units did not allay any misgivings in Toronto, though they may have given rise to rejoicing in old Thomson management circles, where the attitudes of their new Reuters colleagues had been met with all of the enthusiasm that the Anglo-Saxons showed to their new Norman rulers. In the new dispensation we are back down to five divisions, with former Reuters strategy chief (latterly running GRC) David Craig taking the old Markets division, Legal going to Mike Suchsland, Tax and Accounting to Brian Peccarelli, and Global Growth to Shanker Ramamurthy. Jon Robson gets the Business Development role. What factor is common to all of these? None of them comes from a very long-term Reuters and/or Thomson background. A generation has effectively passed.
And what of Jim Smith, the new CEO? Some commentators have him as a caretaker, awaiting the new strategic leader to be found and installed. Others, and I incline to this view, see him as chairman and arbiter of resource and manpower development and deployment to support and drive the integration of these two companies. So not a traditional Thomson CEO, any more than Erik Engstrom is a traditional Reed Elsevier CEO. In the latter case one has a feeling of a profoundly numerate portfolio owner looking to encourage the growth points with acquisition investment, dispose of underperformers and reward successful managers who reliably produce results. It is almost as if Reed Elsevier does not see a need anymore for an informing central strategy about its market positioning, other than “we will invest in anything that works and avoid the bits that don’t”. By contrast, Thomson Reuters is built around a distinctive market positioning, a “big niche” strategy and definite ideas about what it needed to buy, sell or grow to make the aspiration work. And yet… once you have the strategy in place, here too market strategic thinking devolves to the operating unit quite quickly. Hopefully that means that in both of these market-leading players, the doors will soon stop revolving at the speed of light and we can get back to the real problems of addressing the needs of global information markets in times of scarcity.
PS. One of the items on Jim Smith’s agenda must surely be the finalization of the sale of Healthcare, whose projected disposal was an early agenda item for his predecessor. It is hard to remember, but this move has now been projected for almost four years!