Nov 22
The Rise of the Funders
Filed Under Big Data, Blog, data analytics, Industry Analysis, internet, Publishing, Reed Elsevier, STM, Thomson, Workflow
The STM Association's London seminar on Publication Impact last week (19 November 2015) seemed oddly like two events struggling under a single skin. It is not hard, perhaps, to see why the organizers decided to put the whole immense subject of new techniques and technologies for measuring the reputation of researchers and the worth of science research into the same skin as the measurement of impact and case studies in new techniques; but since I think these subjects are more important than journal publishing itself, the modest number of attendees enjoyed a double bounty. Maybe next year STM will give these subjects the space they deserve. There really is too much going on here that is not only important to science research, but vital to the future of science publishing.
I started this debate by introducing the recent Outsell report, written by Deni Auclair (The Impact of Research Funders on Scholarly Communication; August 2015; www.outsellinc.com). Deni's hugely informative paper outlines the sea change that has taken place in measurement since the advent of altmetrics, and I wanted to add my own feeling that we are about to see another power surge in the unstable structure of publishing's middleman role in the transfer of knowledge in research-influenced marketplaces. If the Gates Foundation can grant The New Media Corporation of Austin, Texas, $3.2 million to create personalized learning tools, as it did on 12 November 2015, what is to prevent it from creating article publishing software and circulating it to grant-holding researchers, so that they could prepare and upload articles and data to Dryad or figshare or F1000 for a fraction of the APC cost currently charged by conventional publishing houses? The answer, written large in this seminar for me, is that this happens when funders realize that the accumulating costs of APCs are unsustainable, and that sufficient mechanisms now exist in the marketplace to measure reputation post-publication, ensuring proper scrutiny of their output and its reasonably accurate ranking against its peers.
The meeting did not address the first of those pre-conditions, but it covered the second in very considerable detail. Stephane Bergmans of Elsevier showed how the European Union is moving from a conservative starting point to a more wide-ranging approach. Kevin Dolby of the Wellcome Trust convincingly argued the case for the deep interest of funders in reputation. But, as ever, it was Dave Nicholas who plunged us into the unpleasant realities. When it came to compiling your h-index, it really did make a difference whether you used Google Scholar or Web of Science or Scopus. Different sourcing did matter in a marketplace where he now counts 25 different emerging reputation platforms. Thirteen of those concerned research (and he rightly bemoaned the fact that only three were concerned with teaching quality). And he noted the new mystery being born, especially around the use of blogging and social media in altmetrics. Why doesn't ResearchGate publish its ranking formula? Because it is afraid that academics seeking a swift promotion will "game" the system? Or because (my thought) it wants to preserve the magic until it gets an offer from Springer Nature or Wiley?
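Nicholas's point about sourcing is easy to demonstrate. The h-index is simply the largest h such that h of a researcher's papers have each been cited at least h times, so the figure moves with whatever citation counts a given database happens to hold. A minimal sketch (the citation counts below are invented for illustration):

    def h_index(citations):
        # Largest h such that h papers each have at least h citations
        ranked = sorted(citations, reverse=True)
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # The same hypothetical researcher as counted by two different sources:
    # broader coverage means higher counts, and so a higher h-index.
    print(h_index([48, 33, 30, 12, 9, 7, 4]))  # broader index: h = 6
    print(h_index([40, 25, 20, 8, 5, 3]))      # narrower index: h = 5

The algorithm is trivial; the inputs are not, which is exactly why the choice of Google Scholar, Web of Science or Scopus matters.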
How do you optimize without cheating? Charlie Rapple of Kudos had answers on how to explain what the research was about and why it was relevant. Noting time-based correlations between communications and reactions helped you measure whether scholarly peer group communication worked, and Fiona Murphy, who followed her, was able to bang the drum for data deposits as a route to reputation enhancement. It struck me how slow we have been to give data its due: only now are we creating data citation principles (the DC), using the Resource Identification initiative and, above all, developing some good practice standards in altmetrics usage and evaluation (NISO). PlosONE showed how they have wobbled (sorry, I mean "developed") over the years, and how post-publication review and evaluation becomes a critical concern if peer review is reduced to a simple checklist of technical compliance. But while some publishers in the audience may have sniggered, as I did, we must recognize that this really is the future: a few branded high-visibility lead journals, and large databases of articles and data, branded but subsuming the current forest of small second- and third-level titles, often created to pursue a line of enquiry that nobody followed and inappropriate in times of cross-disciplinary emphasis.
And at the end of the day came a presentation of progressive good sense from Inez van Korlaar, the Director of Product Strategy at Elsevier working in this area. She is managing the soft launch of Mendeley Stats, and she is clearly continuing the line of thinking that has taken her company from ScienceDirect to SciVal. If publishers are to remain in the game they have to provide value at the point of use to all participants. Moving against the ResearchGates and Academia.edus of the world is one thing: finding a greater utility edge by turning Mendeley Stats into a social network is another. It must be right to look at the economic consequences of research, via patent analysis for instance, just as it must be right to use Elsevier's Newsflo toolset, as well as content from Altmetric, to flesh out a multi-faceted reputation analysis. They have experimented with Elsevier's citation and usage alerting, via the 65,000 users of MyResearchDashboard. Mendeley Stats is now two weeks old, and it will be fascinating to see whether it provides a way of keeping publishers in position as key intermediaries, or whether the rise of the funders erodes that positioning fatally.
To the indefatigable Anthony Watkinson, who orchestrated and moderated this event, should go the last word. He pointed out that the Watson-Crick paper on the double helix was never peer reviewed at all: it was simply sent in by Sir Lawrence Bragg with a covering note suggesting that Nature should publish it – which they did. Who needs reputation management when that is the role of your head of department?
Nov 11
Get Smaller to Grow Bigger
Filed Under Big Data, Blog, data analytics, Education, Financial services, Industry Analysis, internet, Pearson, Publishing, Reed Elsevier, STM, Thomson, Uncategorized
Thomson Reuters' announcement today that it is "exploring strategic options", as the dreadful euphemism for selling has it, for its Science and Intellectual Property businesses gives new meaning to another hoary old industry expression, "waiting for the other shoe to drop"! In this instance the other shoe has been hovering for about a decade, and might have dropped at any point after the Reuters acquisition. Later, the move might well have followed the sale of Thomson Health to private equity (now Truven). The logic of the Thomson Reuters merger, after all, was concentration on the markets, corporate and legal concerns of global corporates and their advisers. While IP had a sort of logic, in that patent activity is a measure of value, it seemed more important to build out into areas that bridge financial and legal – compliance, governance, regulatory – than to fully absorb IP into the mix. So IP stayed with Science, and now the pair are on the block. Meanwhile a new business has grown up between the merged entities, worth some $600m per annum in revenues. And other investment areas of opportunity are emerging, so the parent company can reasonably say it wants to concentrate its investments on its core concerns. As it could have done ten years ago.
So just what is being offered for sale? In the first place, two very different businesses, but together they form a $1 bn revenue block, with an EBITDA margin of 32% (contributing some 10% of the parent's EBITDA). The bit that is being sold is slightly more profitable than the group to which it belonged. But many bidders and advisers will see it as two businesses. On the one hand, IP is the market leader in patent information, with a slew of services that run from instant updating through to the fully analysed and technically abstracted Derwent World Patent Index. IP Manager was one of the first convincing "solutions" to manage workflow effectively – in this case for in-house patent counsel. On the Science side, alongside a raft of article preparation and management systems, lies Web of Science: the market has looked to this division for the market standard in assessing the importance of articles and their journals to users and peers. The ISI index, acquired by Thomson, remains vitally important to global science research through its definition and measurement of "impact" via citations. The service which incorporates it, Web of Science, is still key to the assessment and management of scientific research – and the grants that enable it.
In recent years both divisions have attracted, partly as a consequence of their success, a great deal more competitive attention. Gone now, for example, are the days when Thomson's citation indexing totally ruled the roost when it came to measuring the success or otherwise of universities holding grants for science research. This is the age of altmetrics, and we can not only measure more things than citations but analyse the multiplicity of factors more effectively. Elsevier entered the market with SciVal, for example, and there is a feeling now that rapid progress is being made in developing new styles of analysis. Has Thomson Science kept up? Could it be a platform for a new "services to science and research" business at some future point in different hands? In patents, CPA offers a guide to valuations, and also an indication that there is competition in depth, not least from state-owned national and international patent offices. Yet Thomson's offerings would be at the top of the market, both in terms of data held and revenue generated.
Will these high-value entities sell separately or together? We may now at last have reached the point where they are more valuable apart. There had been a tacit assumption that the long delay in divesting them was in part about making them more separable as businesses, since it is hard to think of a strategic buyer ideally suited to buy both. Taken together they will fetch over $3.6 bn, a big reach for sector trade buyers as well as for those private equity players actively interested in this area. There will be some competition considerations as well, in that it may be hard for RELX to buy Science, though other major STM players would have less difficulty, and for some it may suggest a way of diversifying away from a pure reliance on increasingly tough journals-only markets. This would have been an ideal buy for Springer before Nature, or for Bertelsmann while they were looking at B2B: they seem from this month's deals to have decided that education is a better bet. One thing is certain about this divestment, however: where buyers are looking for data-based businesses with a high emphasis on analytics and on solutions which add real value to the workloads of users through productivity gain, cost saving and compliance certainty – these two outfits have it in spades!
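Back-of-envelope, the arithmetic behind that "big reach" is simple (my own calculation from the figures quoted above, not from any announcement):

    # Implied valuation multiple, using the figures quoted in this post
    revenue = 1_000_000_000          # ~$1bn combined revenues
    ebitda_margin = 0.32             # 32% EBITDA margin
    price = 3_600_000_000            # ~$3.6bn assumed price tag

    ebitda = revenue * ebitda_margin  # ~$320m
    multiple = price / ebitda         # roughly 11x EBITDA
    print(f"Implied multiple: {multiple:.1f}x EBITDA")

At something over 11x EBITDA, it is easy to see why only the largest trade buyers and the most determined private equity players would be in the hunt.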
But what does all this activity mean? For one thing, it suggests that the portfolio strategy may have had its day. The sale of the FT and the Economist at Pearson, the divestment of part of Datamonitor at Informa, the trimming down of Penton, the divestment of non-event assets at UBM – all these and many more point to a determination to shed non-core assets in order to put investment heft behind growth in what are seen as more strategically important areas. This is in part a digital effect – networks create full-service needs and solutions, and tend towards duopoly. It is also part of a cycle, and there can be no doubt that portfolio will be back one day; but in the meanwhile there can be little doubt that investors like "slim down to grow bigger", as long as you slim by getting the right price for the assets. Which means that the real question for Thomson is – did they leave it too late?