Aug 17
A Month in the STM Country
Filed Under Big Data, Blog, data analytics, eBook, eLearning, healthcare, Industry Analysis, internet, Publishing, Reed Elsevier, Search, semantic web, STM, Thomson, Uncategorized, Workflow
Ah, the slow-moving waters of academic publishing! Take your eye away from them for a day or a week, let alone a month, and everything is irretrievably changed. Last month it was the sale of Thomson Reuters IP and Science that attracted the headlines: this month we see the waves crashing on the shores of academic institutional and Funder-based publishing, as well as a really serious attempt to supplant Web of Science as the metrics standard for good research and critically influential science. And, as always, all these apparently unrelated events are really closely connected.
So let’s start with the wonderful world of citation indexes. Inspired by Vannevar Bush (who wasn’t in that generation?), the 30-year-old Eugene Garfield laid out his ideas on creating a science citation index and a journal impact factor in 1955. His Institute for Scientific Information was bought by the Thomson Corporation in 1992, and I am pleased to record that in my daily note to the then EPS clients (we were all testing the concept “internet” at the time), I wrote “It is widely thought that in a networked environment ISI will be a vital information resource”! Full marks then for prescience! As Web of Science, the ISI-branded service, became the dominant technique for distinguishing good science, and funding good science, so Thomson Reuters found they had a cash cow of impressive proportions on their hands.
But this history is only significant in light of the time scale. While there have been updates and improvements, we are using a 60-year-old algorithm despite knowing that its imperfections become more obvious year by year, mostly because the whole marketplace uses it and it was very inconvenient for anyone to stop. Although altmetrics of all sorts have long made citation indexes look odd, no move to rebase them or separate them from a journal-centric view took place. Yet that may be exactly what is happening now. The inclusion of the RCR (Relative Citation Ratio) in the National Institutes of Health iCite suite fits the requirement that change is effected by a major Funder/official body and can then percolate downwards. RCR (I do hope they call it iCite – to many US researchers RCR means the Responsible Conduct of Research) now needs widespread public-facing adoption and use, so its implementation across the face of Digital Science is good news. I once thought that Digital Science, in its Nature days, should acquire Web of Science and recreate it; it is now becoming clear that this is happening without such an investment, and companies like figshare, ÜberResearch and ReadCube will be in the front line exploiting this.
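For readers who want the mechanics behind the argument, here is a minimal sketch contrasting the two measures. The two-year journal impact factor follows Garfield’s published definition; the RCR line is only a simplified illustration of the idea of benchmarking an article’s citation rate against its field, not the exact NIH iCite methodology, which derives the field benchmark from co-citation networks.

```python
def journal_impact_factor(citations_in_year: int,
                          citable_items_prev_two_years: int) -> float:
    """Garfield's two-year impact factor: citations received in year Y
    by items a journal published in Y-1 and Y-2, divided by the number
    of citable items it published in those two years."""
    return citations_in_year / citable_items_prev_two_years


def relative_citation_ratio(article_citations_per_year: float,
                            field_citations_per_year: float) -> float:
    """Simplified RCR-style ratio (illustrative only): an article's own
    citation rate relative to the expected rate for its field, so 1.0
    means 'cited about as often as a typical paper in that field'."""
    return article_citations_per_year / field_citations_per_year


# Journal-level view: 1,200 citations in 2016 to 400 papers from 2014-15.
print(journal_impact_factor(1200, 400))        # 3.0
# Article-level view: 12 citations a year in a field averaging 4 a year.
print(relative_citation_ratio(12.0, 4.0))      # 3.0
```

The point of the contrast is the unit of analysis: the first number describes a journal, the second a single article, which is why an RCR-style measure sits more comfortably with funder-level evaluation than a journal-centric impact factor.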
And then, at a recent meeting someone said that there would be 48 new university presses created this year for Open Access publishing of both articles and monographs. I cannot verify the number – more than Team GB’s initial expectation of Olympic medals! – but the emerging trend is obvious. Look only at the resplendent UCL Press, decked out in Armadillo software and producing some very impressive BOOCs (Books as Open Online Content). In September they launch the AHRC-British Library Academic Book of the Future BOOC, if that is not a contradiction. Free, research-orientated and designed to high standards.
Just up the road in London’s Knowledge Quarter is Wellcome, and it is interesting to see the first manifestation of the predictable (well, in this arrondissement anyway) move by funders into self-publishing. As author publication fees mount (one major Funder already spends over a billion US dollars on publishing) there has to be a cheaper way. And at the same time, if you could actually improve the quality of scholarly communication by bringing together all of a grant holder’s research outputs in one place, that would seem to make sense. It simplifies peer review, which fundamentally becomes a function of the funder’s project selection – saying in effect that if we thought it right to fund the work then we should publish the results. It does have some objective checks, presumably like PLOS, but the object is to publish very quickly whatever is available: research articles, evidential data, case reports, protocols, and, interestingly, null and negative results. This latter is the stuff that never gets into journals, yet, as they say at Wellcome, “Publishing null and negative results is good for both science and society. It means researchers don’t waste time on hypotheses that have already been proved wrong, and clinicians can make decisions with more evidence”. The platform Wellcome are using is effectively F1000, and so is designed for speed of process – 100 days is Wellcome’s aspiration – and for post-publication peer review, allowing full critical attention to be paid after materials are made available. And the emphasis on data very much reflects the F1000 dynamic, and the increasing demand for repeatability and reproducibility in research results.
So, what a month for demonstrating trends – towards more refined metrics of research impact, towards the emergence of universities and research funders as publishers, towards another successful development from the Vitek Tracz stable, and towards a further justification of the Digital Science positioning at Macmillan. In an age of powerful users focussed on productivity and reputation management, these developments reflect that power shift, with implications for the commercial sector and the content-centric world of books and journals.
Jul 11
Canadian Hunter bags Bull Moose
Filed Under B2B, Big Data, Blog, Industry Analysis, internet, Publishing, Search, Thomson, Uncategorized, Workflow
Canada is a large and wonderful country where things do tend to come in large packets. Land mass. Tar sands. Forests. And now information market deals. Toronto-based Onex Corporation (in company with Baring Private Equity Asia) have paid $3.55 billion for Thomson Reuters Science and Intellectual Property. This represents a premium over the market estimates of around $3.2 billion being bruited about in London and New York last week, and the fact that it is a cash offer, unconditional, and requiring nothing but routine regulatory clearance anywhere will come as a great satisfaction to its Canadian sellers as well as, presumably, its Canadian buyers. At least there could be no reasonable criticism from shareholders that cash had been left on the table, a thought that obviously occurred to some when the last divestment, Thomson Reuters Healthcare, went to a PE firm for $1.25 billion, only to be resold four years later to IBM Watson Health for roughly double that sum.
And so Thomson Reuters is now a svelte and streamlined non-portfolio operation organized around offering services across the whole corporate workflow, from capital and equity markets through the spectrum of global corporate investment environments to tax and corporate law and risk, governance and compliance. The whole angst and agony, in other words, of corporate markets in a sluggish post-recession corporate world where growth is hard to find. Yet putting the capital raised by divestment to work in pursuit of growth is just what Thomson Reuters must strive to do. They now have the right players in the orchestra – now they need the score and an able conductor. In the siloed world of a large portfolio player this was not so important, since each investment justified itself – or didn’t. If growth is now a necessity, and it is, then the whole corporate body needs a new energy: to fill the gaps with new product development, to embrace customer participation, to innovate across the old divisional structures in response to emerging needs, and to be agile and iterative in bringing innovation to their markets.
Easier to type than to do. And, to be fair, Thomson Reuters do many good things already. But the most noticeable feature of post-portfolio players desperately regrouping after divestment is their difficulty in concentrating data in the right places. All those siloed data empires and all those CTOs and their defensive strategies and their unique database configurations. This is hard to break down: no one sees data strategically, or at least strategically enough. There is no more important decision than deciding on the interfacing systems which will enable product development teams in any part of a large niche-focussed group like Thomson Reuters to bring data from any silo in the group and mash or remix it with other data types or with end-user data. By rights this is a board decision and must carry the stamp of the CEO and COO. Thomson Reuters now have a chance to get this right, or lapse back into post-imperial stagnation, where powerful operating-company barons can shield and block the sharing of data, which, wherever it comes from, is the lifeblood of the company and the key to growth.
And what about Onex and its shiny new toys? Well, it has bought the Thomson Reuters cash cows in the hope that the cash will continue to flow while it breaks the business up and sells the parts. Aging though it is, and desperately in need of a facelift that folds in altmetrics and the revolutionary changes arising from usage data for the measurement of “good science”, Web of Science is still a necessary component of every university library worth its salt. On the other side of the acquisition, IP Advisor, Derwent and services in the trademark registration area are wonderful long-term assets and should hold up during the two years it will take to separate the two parts, making savings in overheads which will be unpleasant but helpful in getting the margins to an even more desirable pitch in both parts of the former business. The former owner neglected, in the last decade, to invest in this cash cow what was needed to refresh its product offering and undertake the M&A work it needed to do. This was the company that needed to buy Mendeley, not Elsevier. This was the company that needed to buy Altmetric, not Nature. And so on… But one other thing is certain. The new buyer, a few cosmetic deals aside, will not make those investments either. That is not the name of the new game plan in Toronto.
The cost which the new owners do have to bear is the cost of the break-up of the two parts, each of which is destined for a different ultimate buyer in the next two years. Here is the dream scenario. Cinven are the PE owners of CPA Global, one of several challengers to the Thomson Reuters position in patent and trademark information. They might have been more serious bidders in this round but for the fact that they did not want Science. They may face some regulatory pressures, but they could always disgorge some of the current Thomson Reuters holding in this area to Lexis, who are proving hungrier in this sector lately. On the other flank, Thomson Reuters Science is believed to have been a target of Springer Nature for many years. That company is owned by BC Partners (who may indeed have been early-stage bidders but probably did not like the IP side if they were) and by Holtzbrinck. It would be strange if these partners did not eye the Science division as a natural add-on, either before or after the IPO due in 15 months’ time. Would that deal add a final touch to their valuation or not?
So will the cash cow go on producing for two more years without being fed? And will this enable the Toronto PE men to exit at $5.2 billion plus after debt and loans are taken into account? We will all have to wait for those answers. All we know today is that a very big tree has just been felled in the Canadian information forest.