The news (BBC, 29 December) that orangutans in Milwaukee are using iPads to watch David Attenborough while covertly observing each other’s behaviour reminds me at once of how “early cycle” our experience of tablet technology still is, and of how little we yet extract from the experience we have of all digital technologies. So, by way of apologizing for missing last week (minor knee procedure, but the medical authorities advised that no reader of mine could possibly deserve my last thoughts before going under the anaesthetic…) and wishing you all (both…?) a belated happy Christmas, I am going to sort through the December in-tray.

The key trends of 2011 will always be, for me, the landmark strides made towards really incorporating content into the workflow of professionals, and the progress made in associating previously unthinkable data collections (not linked by metadata, structure and/or location) in ways that allowed us to draw out fresh analytical conclusions not otherwise available to us. These are the beginnings of very long processes, but already I think that they have redefined “digital publishing”, or whatever it is that we name the post-format (book, chapter, article, database, file) world we have been living in now for a few years and are at last beginning to recognize. Elsevier recognized it all right with their LIPID MAPS lipid structures App (http://bit.ly/LipidsApp) earlier this month, and I should have been quicker to see this. This App on SciVerse does all of the workflow around lipid metabolism and is thus integral to research into lipid-based diseases (stroke, cancer, diabetes, Alzheimer’s and arthritis, to name a few). The LIPID MAPS consortium is a multi-institutional, research-based organization which has marshalled into its mapping all of the metadata and nomenclature available – common and systematic names, formulae, exact masses, InChIKeys, classification hierarchies and links to relevant public databases. Elsevier adds the entity searching that allows the full text and abstracts to support the mapping, and in data analysis terms draws the sting from a huge amount of researcher process effort. Whenever I hear the old Newtonian saw about “standing on the shoulders of giants”, I now replace “shoulders” with “platforms”.
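To make the mapping concrete, here is a minimal sketch of what a record in such a collection implies, together with the name-to-entity index that entity searching rests on: every known name variant resolves back to a single structure, so a mention in an abstract or a full-text article can be tied to the mapped entity. The schema and field names are my own and the values merely illustrative; this is emphatically not Elsevier’s or the consortium’s actual data model:

```python
from dataclasses import dataclass

@dataclass
class LipidRecord:
    """One entry in a LIPID MAPS-style structure collection (illustrative schema)."""
    lm_id: str                   # consortium identifier
    common_name: str
    systematic_name: str
    formula: str
    exact_mass: float            # monoisotopic mass
    inchikey: str
    classification: list[str]    # category > class > subclass hierarchy
    xrefs: dict[str, str]        # links out to relevant public databases

# Illustrative entry: cholesterol (identifiers shown for illustration only).
CHOLESTEROL = LipidRecord(
    lm_id="LMST01010001",
    common_name="cholesterol",
    systematic_name="cholest-5-en-3beta-ol",
    formula="C27H46O",
    exact_mass=386.3549,
    inchikey="HVYWMOMLDIMFJA-DPAQBDIFSA-N",
    classification=["Sterol Lipids", "Sterols", "Cholesterol and derivatives"],
    xrefs={"PubChem": "5997", "ChEBI": "16113"},
)

def build_entity_index(records: list[LipidRecord]) -> dict[str, LipidRecord]:
    """Map every known name variant to its record, so that an entity match
    found in full text or an abstract resolves to exactly one structure."""
    index: dict[str, LipidRecord] = {}
    for rec in records:
        for name in (rec.common_name, rec.systematic_name, rec.inchikey):
            index[name.lower()] = rec
    return index

index = build_entity_index([CHOLESTEROL])
hit = index["cholesterol"]
print(hit.lm_id, hit.formula, hit.exact_mass)
```

The real systems do far more – synonym handling, disambiguation, chemistry-aware matching – but the principle is the one above: nomenclature marshalled once, then leveraged everywhere the literature mentions the entity.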

So how do Elsevier pull off a trick like this? By being ready, and by spending years in the preparatory stages. Elsevier, in my view, has become two companies: alongside a traditional, conservative journal publisher has evolved a high-tech science data handling company, conceived in ScienceDirect and reaching, via Scirus and Scopus, a sort of adolescence in SciVerse. This effort now moves beyond pure data into the worktool App, driven by SciVerse Applications (www.applications.sciverse.com) and the network of collaborating third-party developers which is increasingly driving these developments (http://developers.sciverse.com). This is and will be a vital component: not even Elsevier can do all these things alone. The future is collaborative, and here is the market leader showing it understands that, and knows that science goes forward by many players, large and small, acting together. And if developers can find, under the Elsevier technology umbrella, a way of exposing their talents and earning from them (as authors were wont to do with publishers), then another business model extension has been made. There is much evidence here of the future of science “publishing” – and while it may be doubted that many (two?) companies can accomplish these mutations successfully, Elsevier are making their bid to be one of them.

And there is always a nagging Google story somewhere left un-analysed, usually because one could either write a book on the implications or ignore them, on the grounds that they may never happen. But Google is the birthplace of so much that has happened in Big Data that I am loath to neglect BigQuery. With an ordinary sized and shaped company this would all be different. I could say, for example, that LexisNexis is taking its Big Data solution, HPCC (www.hpccsystems.com), Open Source because it wants to get its product implemented in many vertical market solutions without having to go head to head with IBM, Oracle or SAP. But Google clearly relishes the thought of taking on the major analytics players on the enterprise solutions platforms, and clearly has that in mind with this SQL-based service, which has been around for about a year and now enters beta with a waitlist of major corporate users anxious to test it. And yet, wait a minute: Google, Facebook and Twitter led us into the NoSQL world because the data types, particularly mapping, and the sheer size of the databases involved pushed us into the Big Data age and past the successful solutions created in the previous decade around SQL enquiry. So is what Google is doing here driven mostly by its analysis of the data and capabilities of major corporates (Google doing market research and not giving the market what Google thinks is good for it!), or is this something else – a low-level service environment that may splutter into life and take off, or may beta and burn like so many of its predecessors? Hard to tell, but worth asking the question of the Google Man Near You – and worth glancing at the sketch below to see what the service actually looks like.

Meanwhile, the closest thing to a Big Data play in publishing markets remains MarkLogic 5. Coming back to where I started on Big Data, one of the most significant announcements in a crowded December had LexisNexis – law this time, not Risk Solutions – using MarkLogic 5 as the way to bring its huge legal holdings together, search them in conjunction with third-party content and mine previously unrecognized connections. Except that I should not have said “mine”. Apparently “mining” and “scraping” are now out of favour: now we “extract” as we analyse and abstract!
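For those who want the flavour of BigQuery itself, here is a minimal sketch using the Python client library against Google’s public Shakespeare word-count sample table. Treat it as an illustration under stated assumptions – the table name and columns are as Google publishes them in its sample dataset, and your own project credentials are assumed to be configured – not a definitive guide to the service:

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumes Google Cloud credentials are already configured in the environment.
client = bigquery.Client()

# Ordinary SQL, executed over a hosted table of (in principle) arbitrary size:
# the ten most frequently used words across the Shakespeare corpus.
sql = """
    SELECT word, SUM(word_count) AS total
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY total DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(f"{row.word}: {row.total}")
```

The point of contention above is visible even in ten lines: this is the familiar relational idiom, not the key-value world the NoSQL pioneers built, offered back to corporates who never stopped thinking in SQL.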

Whatever the terminology, I wish every scraper and miner seeking a way forward all the best for 2012. And me? Well, I am going to check out those orangutans. They may have rewritten Shakespeare by now.

You can tell when even major corporates are embarrassed. Their use of language deteriorates to the point where meaning (hopefully) vanishes and we hacks are left to put our own, corporately deniable, slant on their gnomic pronouncements. Thus it is with the “accelerated departure” of Tom Glocer, CEO of Thomson Reuters. What exactly does that mean? Did he leave before his time, or was he unexpectedly ejected? The rumour mill had it that he was going in April 2012, so was the acceleration to be found there (a fourth anniversary hardly amounts to long service at the top of an outfit as stable as Thomson Reuters), or in his contract, or elsewhere? And did he know, or was he pushed?

Certainly it is always alleged that his predecessor, Dick Harrington, did not know that a discreet negotiation was continuing behind the scenes to bring Thomson and Reuters together with no place in it for him. That, if true, must have been a surprise. Did Tom Glocer come by a similar “confronts reality” shock, as the FT termed it? And what was the reality that was being confronted? I can think of at least three realities that must needs be in the minds of Thomson Reuters CEOs, and none of them relates to the decline in market value which is widely blamed for triggering these changes. The first, and most important, is the nature of the company’s ownership. Where a big player is really 55% controlled by the family of its original founders, confidence issues will come into play. This is real control, not the artificial dominance of voting shares practised by Murdochs or Harmsworths in defiance of market views of good practice. And this real control means that, as in the eighteenth century, once the incumbent first minister loses the confidence of the King and his closest adviser, it is impossible to continue in office. That rule applied in the reign of Ken Thomson and John Tory, as it does in the Woodbridge Trust of David Thomson and Geoffrey Beattie. It is simple and natural: you go when the owners no longer believe you can deliver.

And since Thomson Reuters are the largest professional player in the marketplace, it is worth asking what these men need to have confidence about. As far as the press commentary is concerned, one would think that the only issue is the Eikon terminal and its slow start. Well, the history of Reuters is littered with slow starts, one of which let Bloomberg into the marketplace in the first place, and several of which accumulated to create this peculiar position where the smartest and most modern application is also the cheapest, and has lost market share in the recession to Bloomberg’s older and more expensive option. In each of these cycles the market for trading systems has returned to rough parity. Over on the professional side of Thomson they know about these cycles, having sometimes been up and sometimes down, but in that market they are currently in the Bloomberg position and Lexis are in the Reuters position. So did Tom Glocer’s acceleration towards the swing doors relate to all this?

Certainly this may have been the symptom, but perhaps it was not the underlying problem. The mandate that Tom Glocer accepted was to build an integrated company, and it is possible, as the company became racked by the issue of combining the parts to create new growth as a whole, that the Woodbridge owners began to doubt whether this aim was ever going to be achieved through these policies. Certainly the sacrificial slaughter of a layer of Reuters management and the balkanization of the company into an unmanageable number of operating units did not allay any misgivings in Toronto, though they may have given rise to rejoicing in old Thomson management circles, where the attitudes of their new Reuters colleagues had been met with all the enthusiasm that the Anglo-Saxons showed to their new Norman rulers. In the new dispensation we are back down to five divisions, with former Reuters strategy chief (latterly running GRC) David Craig taking the old Markets division, Legal going to Mike Suchsland, Tax and Accounting to Brian Peccarelli, and Global Growth to Shanker Ramamurthy. Jon Robson gets the Business Development role. What factor is common to all of these? None of them comes from a very long-term Reuters and/or Thomson background. A generation has effectively passed.

And what of Jim Smith, the new CEO? Some commentators have him as a caretaker, awaiting the new strategic leader to be found and installed. Others, and I incline to this view, see him as chairman and arbiter of resource and manpower development and deployment, there to support and drive the integration of these two companies. So not a traditional Thomson CEO, any more than Erik Engstrom is a traditional Reed Elsevier CEO. In the latter case one has the feeling of a profoundly numerate portfolio owner looking to encourage the growth points with acquisition investment, dispose of underperformers and reward successful managers who reliably produce results. It is almost as if Reed Elsevier no longer sees the need for an informing central strategy about its market positioning, other than “we will invest in anything that works and avoid the bits that don’t”. By contrast, Thomson Reuters is built around a distinctive market positioning, a “big niche” strategy and definite ideas about what it needed to buy, sell or grow to make the aspiration work. And yet… once the strategy is in place, here too market strategic thinking devolves to the operating unit quite quickly. Hopefully that means that in both of these market-leading players the doors will soon stop revolving at the speed of light, and we can get back to the real problems of addressing the needs of global information markets in times of scarcity.

 

PS. One of the items on Jim Smith’s agenda must surely be the finalization of the sale of Healthcare, whose projected disposal was an early agenda item for his predecessor. It is hard to remember, but this move has now been projected for almost four years!
