Mar 9
“RELX, not Relicts!”
Filed Under B2B, Big Data, Blog, healthcare, Industry Analysis, internet, Publishing, Reed Elsevier, STM, Thomson, Uncategorized
It was a simple enough mistake to make. As I read my screen and saw the announcement two weeks ago I passed on the news to an American friend on the other side of the room. The response prompted the correction from me at the top of this page: Reed Elsevier have renamed themselves after their new stock exchange identity, and are not actually inviting us to call them also-rans. Everyone will appreciate the logic of removing the dual identity and the double quote and the difficult accounting exercise to keep these two trading identities in the market together. But I am left wondering if there is not another, almost subliminal, market message being left here, one to which perhaps even the senior management of RELX are oblivious. Coming a month after the death of Ian Irvine, one of the architects of the deal that brought Reed and Elsevier together, it made me wonder whether the real meaning here is a totally different orientation for the new group, one which would have invited the snarling displeasure of Dr Pierre Vinken, at whose insistence the Dual Monarchy was originally created and launched in January 1993.
Whatever else it is, an “Elsevier” is a book, and indeed in the nineteenth century became the term of use for a pocket book, the contemporary version of a paperback. Reed was a nineteenth century paper company. In some ways therefore these resonances should definitely go, especially since underlying brands like LexisNexis or Elsevier Science remain in place. Perhaps then we are being told that RELX is a bundle of brands held by a quoted umbrella organization. That would be consistent enough with practice in recent years, and one can imagine that the new structure, the single quote, and the name are designed purely as a play to investors, and have been sold to employees on that basis. After all, one of the lamentations of successive generations of Reed Elsevier management over the years since 1993 has been that the European markets have consistently under-rated the company. After a couple of years of share buy-backs and a consistent dividend policy, now is the time, one can almost hear them saying, to move away from Reed Elsevier as the sluggish market benchmark in Europe, and re-align RELX as a more dynamic growth vehicle with a much improved rating. And all those with bonus scheme equity holdings not yet vested should cheer that relaunch!
And yet… there are some real risks. The views of investors are always short term and their analysts are as often wrong as right. Does Claudio Aspesi* know more than the professional management of Elsevier Science about what happens next in Open Access? I doubt it, but his views, from his influential desk at Sanford Bernstein, have certainly driven the share price of old Reed Elsevier more than many management announcements in recent years. Further, if you are a bundle of brands represented by a stock market ticker symbol, it is open to everyone in the market to rate you on that brand composition for short term interests. Thus it is now possible to read RELX critics who find the stock “unbuyable” until Lexis Law is divested, or who think Elsevier Health Science is too small in market share terms and should be merged with WK Health and then “IPO-ed”. I wonder who would profit most immediately from that, if not the market-makers themselves? I am not here concerned with whether either of these moves is feasible or desirable: just with the idea that if your focus is unbalanced in the direction of the market, you tend to be driven to appease market sentiment. And market sentiment is a quicksand.
Will it matter to lawyers or scientists that they now buy, ultimately, from RELX? Probably not at all. So what then is the issue? Really one of short versus long term. RELX has a history in science and law and some key business sectors that gives them two advantages. They have experienced management who have shown themselves close enough to ultimate users of information to allow them to judge likely outcomes. Timing is everything. When to press the button is just as important as all the other decisions in new product development. And new product development is going faster in these sectors than ever before. Is that a good time to swap areas of expertise within the portfolio, bringing in areas to which senior management have not been previously exposed and forsaking areas of traditional strength? Or is it a time for long term investment, active acquisition and development programmes, such as the ones that built Elsevier Science, which reposition the brand in the forefront of the marketplace but which take correspondingly long periods to pay back? Whatever choices are made, they surely begin in the market place and end by being packaged for potential investors. It is hard to believe that successful schemes can be created that begin with assessing what investors will swallow, and end with creating market interventions that fit that paradigm.
Fortunately I can end with a suggestion which will please all parties. In 1997-8 the then Reed Elsevier attempted to merge with Wolters Kluwer. Why not bring it on again? WK is said to be selling its transport B2B division at the moment, just another of the long list of market exits since the European Commission made its competition opposition clear in 1998. There are now no education or STM assets at WK to get in the way. In the US there would be Health sector competition issues (though there are now other very large content players), but with Bloomberg BNA swarming into the tax market alongside Thomson, combining WK and Lexis on the tax side would make sense. In Europe, Lexis-WK would be powerful in France, though Lexis left Germany to WK, so no competition issues there. Long term bets on the Eurozone would not make the analysts happy, but lawyers in France and Germany are likely to be busy whichever direction the currency takes. And above all, for all of those investors who have boosted the WK share price for 17 years in the hopes of just such a denouement – a payoff!
RELX is not the only player to feel these tensions. Every quoted company is subject to them in one way or another. It is what management do to make these tensions creative and not negative that makes the difference. RELX? RELaX, not RELICTS!
Feb 24
Outsourcing My Brain II
Filed Under B2B, Big Data, Blog, data analytics, Industry Analysis, internet, Publishing, Search, semantic web, STM, Uncategorized, Workflow
Now, are you ready for this? I am not sure that I am, but I feel honour bound, having started a discussion last month under this heading about Internet of Things/Internet of Everything (IoT/IoE), to finish it by relating it back to the information marketplace and the media and publishing world. It is easy enough to think of the universal tagging of the world around us as a revolution in logistics, but surely its only effect cannot be to speed the Amazon drone ever more rapidly to our door? Or to create a moving map of a battlefield which relates what we are reading about in a book to all of the places being mentioned as we turn the pages? Or to create digital catalogues as every book is tagged and can respond by position and availability!
You are right: there must be more to all of this. So let us start where we are now and move forward with the usual improbable claims that you expect to read here. Let’s begin with automated journalism and authorship, which, when I wrote here about the early work of Narrative Science and the Hanley Wood deal, was in its infancy, and then came Automated Insights and the Wordsmith package (automatedinsights.com). Here, it seemed to me, were the first steps in replacing the reporter who quarries the story from the press release with a flow of standardised analytics which could format the story and reproduce it in the journal in question just as if it had been laboriously crafted by Man. The end result is a rapid change in the newspaper or magazine cost base (and an extension to life on Earth for the traditional media?).
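The mechanics of that first generation can be illustrated with a trivial sketch: structured data goes in, a readable sentence comes out. This is a hypothetical template-filling example, not Wordsmith’s actual method; real natural-language-generation systems are far richer.

```python
# A toy illustration of template-driven automated journalism:
# an earnings data point is turned into a publishable sentence.
# Company name and figures here are invented for the example.

def write_earnings_story(company, quarter, revenue_m, prior_revenue_m):
    """Render one structured earnings record as a news sentence."""
    change_pct = (revenue_m - prior_revenue_m) / prior_revenue_m * 100
    direction = "up" if change_pct >= 0 else "down"
    return (f"{company} reported {quarter} revenue of ${revenue_m:,.0f}m, "
            f"{direction} {abs(change_pct):.1f}% from the prior quarter.")

story = write_earnings_story("Acme Corp", "Q3", 1250, 1100)
print(story)
# → Acme Corp reported Q3 revenue of $1,250m, up 13.6% from the prior quarter.
```

Scale that one template to thousands of data feeds and you have the “flow of standardised analytics” described above, with no reporter in the loop.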
I no longer think this will be the case. As with the long history of the postponed glories of Artificial Intelligence itself, by the time fully automated journalism arrives, most readers will be machines as well as most writers, in fields as diverse as business news and sports reporting and legal informatics and diagnostic medicine and science research reporting. Machine 2 Me will be rapidly followed by real M2M – Machine to Machine. The question then sharpens crudely: if the reporting and analysis is data driven and machine moderated, will “publishing” be an intermediary role at all? Or will it simply become a data analysis service, directed by the needs of each user organisation and eventually each user? So the idea of holding content and generalizing it for users becomes less relevant, and is replaced by what I am told is called “Actionable Personalization”. In other words, we move rapidly from machine driven journalism to personalised reporting which drives user workflows and produces solutions.
Let’s stumble a little further along this track. In such a deeply automated world, most things that retain a human touch will assume a high value. Because of their rarity, perhaps, or sometimes because of the eccentric ability of the human brain to retain a detail that fails the jigsaw test until it can be fitted into a later picture. We may need few analysts of this type, but their input will have critical value. Indeed, the distinguishing factors in discriminating between suppliers may not be the speed or capacity or power of their machinery, but the value of their retained humans who have the erratic capacity to disrupt the smooth flow of analytical conclusion – retrospectively. Because we must remember that the share price or the research finding or the analytic comparison has been folded into the composite picture and adjustments made long before any human has had time to actually read it.
Is all this just futurizing? Is there any evidence that the world is beginning to identify objects consistently with markers which will enable a genuine convergence of the real and the virtual? I think that the geolocation people can point to just that happening in a number of instances, and not just to speed the path of driverless cars. The so-called BD2K initiatives feature all sorts of data-driven development around projects like the Neuroscience Information Framework. Also funded by the U.S. government, the GenBank initiatives and the development of the International Nucleotide Sequence Database Collaboration point to a willingness to identify objects in ways that combine processes on the lab workbench with the knowledge systems that surround them. As so often, the STM world becomes a harbinger of change, creating another dimension to the ontologies that already exist in biomedicine and the wider life sciences. With the speed of change steadily increasing these things will not be long in leaving the research bench for a wider world.
Some of the AI companies that will make these changes happen are already in movement, as the recent dealings around Sentient (www.sentient.ai) make clear. Others are still pacing the paddock, though new players like Context Relevant (www.contextrelevant.com) and Scaled Inference (scaledinference.com) already have investment and valuations which are comparable to Narrative Science. Then look at the small fast growth players – MetaMind, Vicarious, Nara or Kensho – or even Mastodon C in the UK – to see how quickly generation is now lapping generation. For a decade it has been high fashion for leading market players in information marketplaces to set up incubators to grow new market presence. We who have content will build tools, they said. We will invest in value add in the market and be ready for the inevitable commoditization of our content when it occurs. They were very right to take this view, of course, and it is very satisfying to see investments like ReadCube in the Holtzbrinck/Digital Science greenhouse, or figshare in the same place, beginning to accelerate. But if, as we must by now suspect, the next wave to crash on the digital beach is bigger than the last, then some of these incubations will get flooded out before they reach maturity. Perhaps there has never been a time when it was more important to keep a fixed focus on both six months and three years ahead. The result will be a cross-eyed generation, but that may be the price for knowing when to disinvest in interim technology that may never have time to flower.