“When the Spin slips, change the name!” as British Spin Meister Alastair Campbell almost said, but didn’t until I put the words into his ever-open mouth. When I look back over the past 15 years of science publishing, I see more spin and less change than I would ever have believed possible. Yet when I try to look forward 10 years I see a wave of fundamental change more threatening than the games we have been playing in these Open spaces. For me, a good proof of the failure of the almost political campaigning around Open Access to carry the day beyond some 12-15% of users (check the latest Outsell market report, Professor Harnad) is the name-switch game – with PLoS now talking “Open Evaluation” and Academia.edu being used by 5 million scientists who believe in Open Science. The fundamental change is about self-publishing and post-publication peer review: this will upset the applecart of commercial publishing, if it does not adjust in time, and of the ersatz Fundamentalists of the Open Access movement of a decade ago, who wanted to preserve peer review as much as they wanted to destroy commercial ownership and restriction.

Since we are talking Science, let’s try an experiment. Take any other broken, misused, meaningless and hackneyed term and place it where “Open” now sits in front of Science and Access. For example, take “Socialist”. Or “Community”. Or even “Public”. See what I mean? All meaningless, or, like those eye tests, you see the same thing through each lens that the optometrist puts into the frame before your eye, and end up lying about the difference between this one and that – because there is no discernible difference but you do not want to disappoint. Real change is not to be described by this means. It concerns the wish of young scientists to be noticed in the network as soon as possible on completion of their work – and before that, where conferences, posters, blogs and other mentions begin to build anticipation. Real scholarly communication is now available in several different flavours, from Mendeley to Academia.edu. Since I have been solemnly assured for 30 years, by senior scientists and publishers alike, that scientists will not share, I have to be amazed by the size of these activities. These newcomers are no less worried about attaining research grants or tenure than their predecessors, but they live in a networked scientific world where, if you are not quickly present in the network, you are not referenced in debate – and being part of the argument is becoming as critical to getting grants and tenure as a solid succession of unread papers, published two years after the research ended, used to be.

These convictions are much strengthened by this week’s announcements. The announcement from F1000Research (December 12) that their articles are now visible in PubMed and PubMed Central gives a complete clue to what this is all about. Users want to publish in five days, but they want to be visible everywhere a researcher/peer would expect to look. And increasingly they will expect that the article will collect into post-publication peer review all those earlier references in conference proceedings, blogs and elsewhere. So while people like F1000Research will handle “formal” post-publication peer review, informal debate and commentary will not be lost. And the metrics of usage and impact will not be lost either, as we look so much more widely than traditional article impact to discern what influence this author’s or team’s findings and ideas have had. “Open Evaluation” from PLoS aims just there, as it recently launched its second evaluation phase from PLoSLabs (http://www.ploslabs.org/openevaluation/). This post-publication article rating system reminds me very precisely that PLoS One was not in any sense a traditional peer review process. It was a simple methodological check for scientific adequacy (“well-performed” science), and while the volume of processing solved a multitude of financial issues, the fuller rating of these articles still rests with the user. We shall see PLoS One as the turning point to self-publishing when the history is written.

And so we move towards a world where original publication of science articles is no longer the prerogative of the journal publisher. While review systems will flourish and abstracting and indexing will remain vital, that tangled mass of second- and third-tier journals, the most profitable end of traditional STM, will slowly begin to disperse. Some databases will adopt journal brands, of course, and the great brands will survive as rating systems themselves. “Selected by Cell as one of the 50 most influential research articles of the year”, or “Endorsed by Nature as a key contribution to science”, will be enviable re-publishing, increasingly with data links, improved access to image and video and other advantages. This is where semantic enrichment and data analysis will first become important – before they become the norm. But these selections will be made from what is published, not what is submitted for publication. And a clue to what the future offers was indicated by a Knovel (Elsevier) announcement this week. Six publishers with either small, high-quality holdings in engineering research, or activities in engineering that can use the Knovel platform, entered into collaboration agreements to make their content available via the Knovel portal. Amongst these were Taylor and Francis (CRC Press), as well as specialists like ICE (the Institution of Civil Engineers) and the American Geosciences Institute. As Knovel is in a directly competitive position with IHS GlobalSpec, it is relevant to ask how many engineering research portals that marketspace will need. It now has two – and I seriously doubt that there will ever be more than two aimed at both research and process workflow, though their identities may change (see Thomson Reuters/Bloomberg/Lexis in law). Increasingly, then, small science publishing will be re-intermediated – and we do not need a business degree to imagine what that will do to their margins, as well as to their direct contact with their users. “Open”, whatever else it means, connotes “contraction” for some people.

There are some major similarities and differences between the giant market players in the information/publishing media sector which are not all about markets and competition. For example, Wolters Kluwer and Reed Elsevier have clearly become portfolio strategists. If rumours in New York a month ago about leaving the law market, and rumours in London about a major entry into credit rating are anything to go by, markets clearly see Reed as a player who now has the scope to restructure the portfolio. But Pearson and Thomson Reuters, both in the $6-9 billion USD range in annual revenues, are not at all like that. The transformation they seek is about global markets and building bigger sectoral presence in order to dominate the workflow of professionals with solutions that become a requirement in markets which are duopolistic at most. Perhaps it is time to catch the flavour of “transformation”.

Pearson and Thomson Reuters, despite the differences in their marketplaces, are thus an important comparison in the Transformation Game. T-R have appointed a Chief Transformation Officer, and when they announced third-quarter results this week they pointed to 3,000 job losses as a first transformational step. Let Jim Smith, CEO of Thomson Reuters, have the first word (from his quarterly results press release):

“Our improving track record on execution gives me the confidence to now move even faster in our transformation work,” said Smith. “We will pick up the pace of efforts to simplify and streamline our organization, to shift resources behind the most promising growth opportunities and to use every tool at our disposal to drive value creation for all our stakeholders.”

And then again in a leaked memo to staff (www.jimromanesko.com):

“The answer is to accelerate our evolution into a platform company – one that delivers to customers not just a portfolio of products, but the power of our entire enterprise. We have made progress on that front, but there’s still much to be done. To take the next step, today we set aside funding to further accelerate the transformation of our Financial business and to better align resources to our most promising opportunities.”

Here then is a strategy that remains wedded to the idea of cross-selling to, and building cross-cutting solutions for, financial services, law and tax professionals, and then moving outwards to the clients of those professionals. It uses the word “platform” in both a technology and a marketing sense, and the word becomes a metaphor suggesting that when all the content and all of the customer knowledge is in one place, T-R can use its skills to quickly generate agile services that fit local needs in a global context. To do this you need to eliminate the turf wars which have been such a feature of these great corporations: in my Thomson years it was easier to partner with a complete stranger than to share a venture with another Thomson division. Mr Smith has indicated that the politicking must stop, and be replaced by an ethic that is mindful of the overall gain, but changing cultures is one of the toughest elements of transformation, and there are few records of success to use for guidance.

But some things are swinging in Mr Smith’s direction. Eikon sales, sluggish early on, are now moving forward and have passed the 100,000-installation mark. Job losses will enable the re-organization of skills and assets needed to permit the transformation, as well as improve margins. The share price has risen by a third in the last year, a welcome sign that markets see what is happening and support it, and the latest results seem to underline that. But problems remain. The platform technology architecture is far from in place, and indeed the historical divisions seem locked into legacy technology solutions that have real problems talking to each other. This is surely a frontline issue for a Chief Transformation Officer.

Over at Pearson, John Fallon, chief executive, said: “In trading terms, 2013 has begun much as we expected. In general, good growth in our digital, services and developing-market businesses continues to offset tough conditions for traditional publishing. Our strategy is to transform Pearson into a single operating company that is sharply focussed on the biggest needs in global education and on measurable learning outcomes. With our restructuring programme on track and the reorganisation of the company under way, we are making significant progress towards that goal.”

In other words, investors are invited to see a picture of a new CEO trying to get a global strategy in place (as against the big US core of 10 years ago, plus some good but small geographically dispersed education assets). Today the balance is much more equal, the US is clearly much less influential in the revenue and margin analysis, and the company is recognized as the sector’s global market leader. Yet everyone still gets worried when college textbook sales are described as “soft”. The share price goes off by 6%, even though Pearson is a company that makes most of its progress in the second half of its financial year. But this year sets challenging targets if they are to end up within reach of their goal.

The big investor question at Pearson was always “can all this globalization, re-platforming and occupancy of the whole education services and solutions niche, not just the learning content bit, be done without a huge debt burden?” So far, this miracle has happened, and with more non-core assets yet to be sold there is great scope for further acquisition-led growth. And education markets in some sectors outside of US College are picking up, so there is a warm reception to the idea of restructuring the company managerially, reducing duplication and unnecessary cost and getting the right technology in place to re-platform for rapid product development.

Nothing in the managerial changes was a surprise except that technology leadership seems to have been dissipated. Investors expect that when markets give Pearson the signal, the tech environment will allow product developers anywhere to have access to the whole corpus of Pearson data/content/knowledge to produce very rapid iterations of innovative services and solutions which can be redeveloped and re-iterated in flight. As with Thomson Reuters, the fluent use of the whole data environment, of data analytics and of what we might have called Big Data six months ago, becomes crucial to the way in which both players relate to their major customers. Does the Pearson divisional structure allow for this, and do the tech unity and architecture exist to permit it? We do not know yet, because markets are not quite warm enough to try out a lot of things, but having been sold the idea of Pearson as a global growth vehicle by John Fallon, there will be an expectation of performance over the next 18 months – and a greater degree of immunity to the shivers that old sectors like college textbooks give everyone.

The important fact that investors must now reluctantly accept is that the repeat-order, edition-based, price-elastic textbook markets have gone forever, and that Pearson are clearing the decks for what comes next, even if no one is ever quite sure what that is! But for both companies there is a certainty that it is not just change, but “transformation”. And that the market and technology philosophy around “platform” lies at the heart of it. Markets will of course exercise a great deal of concern at the periphery of these momentous changes. They will want to know which assets are core and which are non-core. John Fallon is obviously fed up with being asked when he is going to sell the FT to Bloomberg/Thomson Reuters, even while the MergerMarket side of the FT Group is being broken out and prepared for sale. Jim Smith and his Chief Transformer will no doubt get the same treatment around Thomson Reuters’ IP and Science activities. But the future of both players is not decided there, and for them this is no longer a portfolio game.
