Feb 24
Outsourcing My Brain II
Filed Under B2B, Big Data, Blog, data analytics, Industry Analysis, internet, Publishing, Search, semantic web, STM, Uncategorized, Workflow
Now, are you ready for this? I am not sure that I am, but I feel honour-bound, having started a discussion last month under this heading about the Internet of Things/Internet of Everything (IoT/IoE), to finish it by relating it back to the information marketplace and the media and publishing world. It is easy enough to think of the universal tagging of the world around us as a revolution in logistics, but surely its only effect cannot be to speed the Amazon drone ever more rapidly to our door? Or to create a moving map of a battlefield which relates what we are reading about in a book to all of the places mentioned as we turn the pages? Or to create digital catalogues as every book is tagged and can respond with its position and availability?
You are right: there must be more to all of this. So let us start where we are now and move forward with the usual improbable claims that you expect to read here. Let's begin with automated journalism and authorship, which was in its infancy when I wrote here about the early work of Narrative Science and the Hanley Wood deal; then came Automated Insights and the Wordsmith package (automatedinsights.com). Here, it seemed to me, were the first steps in replacing the reporter who quarries the story from the press release with a flow of standardised analytics which could format the story and reproduce it in the journal in question just as if it had been laboriously crafted by Man. The end result is a rapid change in the newspaper or magazine cost base (and an extension to life on Earth for the traditional media?).
I no longer think this will be the case. As with the long history of the postponed glories of Artificial Intelligence itself, by the time fully automated journalism arrives, most readers will be machines, as well as most writers, in fields as diverse as business news, sports reporting, legal informatics, diagnostic medicine and science research reporting. Machine 2 Me will be rapidly followed by real M2M – Machine to Machine. The question then sharpens crudely: if the reporting and analysis is data-driven and machine-moderated, will "publishing" be an intermediary role at all? Or will it simply become a data analysis service, directed by the needs of each user organisation and eventually each user? The idea of holding content and generalizing it for users then becomes less relevant, and is replaced by what I am told is called "Actionable Personalization". In other words, we move rapidly from machine-driven journalism to personalised reporting which drives user workflows and produces solutions.
Let’s stumble a little further along this track. In such a deeply automated world, most things that retain a human touch will assume a high value. Because of their rarity, perhaps, or sometimes because of the eccentric ability of the human brain to retain a detail that fails the jigsaw test until it can be fitted into a later picture. We may need few analysts of this type, but their input will have critical value. Indeed, the distinguishing factors in discriminating between suppliers may not be the speed or capacity or power of their machinery, but the value of their retained humans who have the erratic capacity to disrupt the smooth flow of analytical conclusion – retrospectively. Because we must remember that the share price or the research finding or the analytic comparison has been folded into the composite picture and adjustments made long before any human has had time to actually read it.
Is all this just futurizing? Is there any evidence that the world is beginning to identify objects consistently with markers which will enable a genuine convergence of the real and the virtual? I think that the geolocation people can point to just that happening in a number of instances, and not just to speed the path of driverless cars. The so-called BD2K initiatives feature all sorts of data-driven development around projects like the Neuroscience Information Framework. Also funded by the U.S. government, the GenBank initiatives and the development of the International Nucleotide Sequence Database Collaboration point to a willingness to identify objects in ways that combine processes on the lab workbench with the knowledge systems that surround them. As so often, the STM world becomes a harbinger of change, creating another dimension to the ontologies that already exist in biomedicine and the wider life sciences. With the speed of change steadily increasing, these things will not be long in leaving the research bench for a wider world.
Some of the AI companies that will make these changes happen are already in movement, as the recent dealings around Sentient (www.sentient.ai) make clear. Others are still pacing the paddock, though new players like Context Relevant (www.contextrelevant.com) and Scaled Inference (https://scaledinference.com) already have investment and valuations comparable to Narrative Science. Then look at the small fast-growth players – MetaMind, Vicarious, Nara or Kensho – or even Mastodon C in the UK – to see how quickly generation is now lapping generation. For a decade it has been high fashion for leading players in information marketplaces to set up incubators to grow new market presence. We who have content will build tools, they said. We will invest in value-add in the market and be ready for the inevitable commoditization of our content when it occurs. They were very right to take this view, of course, and it is very satisfying to see investments like ReadCube in the Holtzbrinck/Digital Science greenhouse, or figshare in the same place, beginning to accelerate. But if, as we must by now suspect, the next wave to crash on the digital beach is bigger than the last, then some of these incubations will get flooded out before they reach maturity. Perhaps there has never been a time at which it was more important to keep a fixed focus on both six months ahead and three years ahead. The result will be a cross-eyed generation, but that may be the price for knowing when to disinvest in interim technology that may never have time to flower.
Jan 15
The Ultimate Question is Scale
Filed Under Big Data, Blog, Education, eLearning, Industry Analysis, internet, Publishing, Reed Elsevier, STM, Workflow
How big do you need to be to succeed? In this age of internet service and content consolidation the urge to be large seems almost irresistible. You have to be big enough to be a one-stop shop, or a big percentage thereof. You have to be big enough to sustain the technology spend, and get its paybacks. As content gets increasingly commoditized, you have to be big enough to move up the value chain with your users, and to buy into innovative smaller players at the right time. Above all, if consolidation is, as I have long maintained, leading to information market sectors with two or three big players and a host of smaller ones, you need to be in the First Division if you aim to influence market behaviour, pricing, access and discoverability rather than be driven by them. These thoughts come immediately to mind while thinking about today's news of the "merger" between Springer and Macmillan.
I have put "merger" in inverted commas because you could also describe this as a German dynastic marriage, or indeed as an acquisition, since the architect of the deal, Stefan von Holtzbrinck, ends up holding 53% of the equity. Holtzbrinck, of course, remains a family company and started as a major player in German national and regional newspapers. Like another family company, DMGT, this generation has seen the instability of basing the family wealth solely in newsprint. DMGT, through diversification supported and encouraged by Vere and then Jonathan Harmsworth, is now a B2B company with a minority proportion of its activity in newspapers. The von Holtzbrinck route was different, but ends in the same place: the majority of its interests are now in scientific information, academic publishing and education. The critical threat that the demise of newspapers would sink the family ship is now over.
And over in a very clever way. Keep "merger" in quotes. While Macmillan always had to get bigger to become a rival to Wiley in a market dominated by Elsevier, Springer always had to sell. It has had so many suitors over the years that it qualified for a place on Parship, the Holtzbrinck dating site. The current relationship with BC Partners is a tertiary private equity deal, something unheard of before this century. But the result of Cinven and Candover buying the decaying hulk of Springer from Bertelsmann was a clean-up, followed by a sale to EQT and GIC. That was followed by more streamlining and margin improvement and a sale to BC Partners for 3.3 billion euros. There could have been little improvement left to be made this time round. Springer had recreated its SpringerLink online platform and the company is undoubtedly back amongst the market leaders in terms of profitability, so the only way to go was a trade sale. This deal is just that, staged to the benefit of both parties. BC get to exit their 47%, possibly via an IPO, in the next three years, at an enhanced valuation secured through the Macmillan assets, and especially Nature Publishing. Holtzbrinck get a satisfying revaluation of their Macmillan purchase when the IPO goes through, and probably an opportunity to grow their stake. So both can go happily hand in hand to the German regulator, and get a big tick for accomplishing one of the prized national objectives – keeping Springer, the historical home of German chemistry as it reshaped late nineteenth-century science, as a German company. Finally, as you look at this deal, do the maths. Holtzbrinck have merged their Macmillan assets into this deal to form a company worth 5 billion euros. Their partner put in a company worth 3.3 billion euros two years ago. Holtzbrinck get 53%, depending on how much debt is left in 2–4 years' time, and how much of this the partners decide to turn into equity. Sounds good to me!
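If you do want to do the maths, here is a back-of-envelope sketch of what those quoted figures imply. Note the assumption is mine, not the deal's: splitting the 5 billion euro valuation strictly by equity stake ignores whatever debt is left in the structure, which, as noted above, is still an open question.

```python
# Back-of-envelope implied valuations for the Springer-Macmillan "merger".
# Figures are those quoted in the post; the pro-rata split by equity stake
# is a simplifying assumption (it ignores residual debt in the structure).

DEAL_VALUE_EUR_BN = 5.0       # quoted value of the combined company
HOLTZBRINCK_STAKE = 0.53      # Holtzbrinck's share of the equity
BC_PURCHASE_EUR_BN = 3.3      # BC Partners' 2013 purchase price for Springer

implied_macmillan = DEAL_VALUE_EUR_BN * HOLTZBRINCK_STAKE        # Macmillan side
implied_springer = DEAL_VALUE_EUR_BN * (1 - HOLTZBRINCK_STAKE)   # BC's 47%

print(f"Implied Macmillan contribution: {implied_macmillan:.2f}bn EUR")
print(f"Implied value of BC's stake:    {implied_springer:.2f}bn EUR")
print(f"Gap vs BC's 2013 price:         {implied_springer - BC_PURCHASE_EUR_BN:+.2f}bn EUR")
```

On this crude reading, BC's 47% is worth less on paper today than the 3.3 billion euros they paid, which is exactly why the staged exit via an IPO, at a valuation enhanced by the Macmillan assets, matters so much to them.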
It could have been so much worse for Springer, though. The perpetual arranged marriage for Springer was always going to be Informa's Taylor and Francis. It almost came off twice, but the in-laws came to blows at the altar rail. Then people like me bet on Springer, always underexposed in the US, being merged with Thomson Reuters Healthcare (now in PE hands as Truven) or Thomson Reuters Science (still oddly outside the parent's finance-law corporate vertical). Wiley was even mentioned as a possible deal, though this always seemed unlikely. But the new marriage, with a market cap, remember, of around 5 billion euros, has desirable scale, and both players together make a powerful force in Open Access, able to use their joint capacity to operate effectively as data publishers as science wants more and more experimental evidence linked to articles and made available on time alongside them.
And of course there is more than science in this deal. The education interests of Macmillan, with some exceptions, are in the mix, as are the now much diminished B2B interests of Springer. Rather more interesting is what is left out on the Macmillan side. No private equity player looking at a forthcoming marriage of convenience would want to see assets included that were under a cloud or had yet to yield a margin. And Springer's margins, which the current management have recovered from their previously deeply unimpressive levels, are now above the industry average and almost certainly better than Macmillan's. The whole US Higher Education market is fairly cloudy, which may explain the exclusion of Bedford from the deal. Macmillan consumer publishing is simply irrelevant to all this. And the seed investment areas are just too far from profitability, so they stay with Holtzbrinck, giving that company another bonus. There are some great growth points in these seed beds. Just imagine, looking at the ReadCube venture in Macmillan Digital Science, the effect of using that platform, already in Wiley and Nature, in Springer. That is the good thing about scale – you can build quickly.
The final question we need to ask is how all this can be managed. Annette Thomas goes on to the Springer board as Chief Science Officer, joining Derk Haank, CEO, Martin Mos (COO) and the Springer CFO. As indicated in the last blog here, Annette's style has been innovation and adventure. Her Dutch and German colleagues on this board have built through more conservative policies. A big priority has been securing the management team pay-outs that three rewarding deals in 15 years can secure. By some estimates those rewards would now buy a small European country, let alone a farm at Groningen! Such things are not secured by high-risk investment. As a team these people are the most experienced STM players anywhere: what we now need to see is how well they perform as a management team. This is not the least interesting part of this deal.