From Olympic Exile on the splendid South Shore of Nova Scotia, I can observe that the banking crisis continues apace, and that the original Swedish solution – put all the smelly bits into a special container called a Bad Bank and cut it free from the Mother Ship – still holds great appeal. I can also see that the financial market analysts’ demand to cut media companies up into “high growth, strong margins” companies and “low growth, declining margins” companies has equal appeal. We have seen it with McGraw-Hill and now with News Corp. The equity market analysts’ view (and media markets are almost always at their most dangerous when those who lead companies feel forced to follow the views of those ultimate exemplars of power without responsibility – or experience) seems at the moment to be that the assets which have responded least well to the digital revolution, or have been slowest to react, should be cordoned off and cut free. Very strange: I thought the whole idea of “portfolio” in media ownership was that assets developed at different speeds, and the fast-growth ones thus gave “cover” – time and capital – to allow low-growth assets to become fast growth again, perhaps with the help of judicious bolt-on acquisitions along the way.

And then there is the question of cycles. Some of us apparently work in mini-cycles – the turn of markets within an 18-month period, according to an analyst friend – while others are “macro-cycle minded”, which is apparently where I belong. So if I thought that the reason for McGraw-Hill to hold onto its Education division was that education, alongside Healthcare, is the most enduring long-term growth market we have, and that the portfolio duty of Standard & Poor’s was to enable McGraw’s education unit to get back on its feet, challenge Pearson’s leadership and buy the right catalytic add-ons, then I was clearly wrong. Yet it seems clear to me that the future of rating agencies is quite as murky, from both a regulatory and a digital standpoint, as any other market. And is McGraw’s B2B, despite some distinguished work, really in the forefront of digital services and solutions in its verticals? Yet these are Good Bank assets, and Education is Bad Bank.

I could write the same about News Corp, television and newspapers. I am certain that no broadcast media have really absorbed the meaning of a networked society, and this is as true of the world of TV stations and cable companies as it is of newspapers. Of course, one way around the problem is to sell while the going is good, as DMGT so signally failed to do in 2008 when it refused an offer of £1 billion for Northcliffe (regional press), an asset worth around £250m today. Sentiment forbade such a move, as it once did at News Corp, so are players like DMGT destined to split to please investors? Quite apart from my respect for the bravery and capacity for change involved in creating the new B2B-orientated DMGT out of the old newspaper DMGT, who is to say that a local digital manifestation cannot be created here which will replace traditional local newspapers? And how valuable, since DMGT already owns them, would those local brands and franchises become in the new local – especially in helping B2B2C plays in markets like property reach ultimate consumers?

And where does the splitting end? The arguments that apply here apply equally to the Guardian Media Group, and are complicated by the fact that one investment made to give cover for the newspapers, EMAP, has faded faster than the newspapers themselves. Hopefully selling its half share of this and of Auto Trader will offset the losses, and digital revenues (now up to £14.7m and growing by 26% this year) will do the rest. But here we hit another problem: digital businesses may be more profitable, but they are also smaller. Digital newspaper advertising models are small (Mail Online now stands at a forecast of £32.7m, with a target of £45m in 2013), as are paywall models (Times Online now reaches £27m pa after a price hike). And the story of digital books is “less revenue, more margin, cannibalising customers to create a slightly smaller, slightly more profitable company”. What happens when we finish that short cycle?

Maybe the answer to the scale problem is that scale is becoming less important anyway. In a digital world, if you have 50% of the workflow and solutions business in agriculture, why should you be in the same group as a content provider to the oil industry? Certainly our current ideas of scale came directly from the print world – you needed to be big enough to finance print runs that took a day, a week or a year to sell. The cash flow model demanded scale. This is not so today, though I can well imagine a world where deploying common (and very expensive) technologies, and having sufficient internal know-how to do so, becomes a scale argument. Few B2B players “re-platforming” these days, even at quite a modest scale, can be doing so for less than $1.5m, even if their content is already in good XML order. Larger players face bigger bills, and these will be ongoing as we all go semantic web and Big Data. Then again, you may need to be big to finance this as well as to invest in collaboration with third parties – content-sharing, delivery-mechanism-sharing, solution-sharing. And you may need to be big and diversified to fight off the next round of investors in this sector – the enterprise software vendors who will want to add your B2B solutions to their architecture (or maybe you will need to be big enough to attract them: it can be hard to tell).

So settle back for summer and await the next wave of split rumours. Back to splitting up Informa? EMAP is already, like Gaul, divided into three parts and ready for resale. Pearson should certainly, in the analysts’ view, sell Penguin and the FT (despite the fact that they are appreciating nicely now, and that they will only be needed as a votive offering to the markets when their sale can finance the next big education push or acquisition). Surely Wolters Kluwer should be subject to this one too – financial analysts sought the sale of its education and academic publishing assets, and, having succeeded, still hunger for the news that Health is being sold away from law and tax.

Or maybe we should say that it is customer markets that change the size and scale of assets, not investment analysts, who have a vested interest in the outcomes they recommend. Maybe we would get richer listening to our customers than listening to these back-seat drivers?

It was inevitable that some readers of my piece on Gold OA earlier this week would come back and say that I have grown too fond of defining what won’t work, and should become more proactive about stating the new directions. Fair cop. Here, then, are two “assertions” about that future for science journal publishers, covering areas in which “traditional” text-based “publishing” has only the slightest current experience and skills base, yet which will be vitally significant for the industry five years out. Both fit into a vision of scholarly communication in which the publisher’s role evolves away from primary publishing (which will become the prerogative of librarians and repositories) and into workflow management solutions and the quality of content within process.

My two candidates for step-change status are:

1.  The evolution of video first into an accompanying feature and then into the vital core medium for reporting scientific research.

2.  The development of robust and auditable techniques for evaluating the impacts of content on research, creating measures for ROI in content purchasing, and fresh, searchable data derived from the evaluation of usage. This data, along with content metadata, will be more valuable to players in this market than the underlying content on which it rests.

Let’s start with the first issue. I am fed up with being told that science and scientists are too dull or too complex for video. Too dull? Just go and play these two minutes of an interview with John Maynard Smith, the great biologist, on Vitek Tracz’s pioneering site Web of Stories (http://www.webofstories.com/play/7277?o=MS) and try to maintain that view. And that site has excellent metadata, as does the video-based Journal of Visualized Experiments (JoVE), which this month announces its extension to covering experimental reporting in physics and engineering as well as the life sciences (www.jove.com/about/press-releases). Note that both of these sites place a premium upon narrative, and recall the narrative argument in my recent piece on next-generation learning (After the Textbook is over... 3 June 2012), which was demonstrated in some wonderful transmedia software (http://www.inthetelling.com/tellit.html). Once again this demonstrates that video is quite capable of forming the narrative stem onto which links, citations, indexation, abstracts and other aids to discovery and navigation can be attached. Indeed, the text itself can be attached, along with demos and lectures and evidential data. Video file sharing is notoriously easy in the broadband world. Some academics will complain that they lack video storytelling skills, and this in turn may be something that publishers can add to the APC (article processing charge) – as long as they acquire those skills themselves in time!
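To make the “narrative stem” idea concrete, here is a minimal sketch of what video-anchored scholarly apparatus could look like as a data structure. Everything in it – the class names, fields and placeholder DOIs – is my own illustrative assumption, not the actual schema of Web of Stories, JoVE or any other platform.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """One piece of scholarly apparatus pinned to a moment on the video stem."""
    start_seconds: float   # where on the narrative stem it attaches
    kind: str              # e.g. "citation", "dataset", "abstract", "demo"
    label: str             # human-readable description
    uri: str               # DOI, repository link, etc.

@dataclass
class VideoArticle:
    """A research report whose narrative stem is the video, not the text."""
    title: str
    video_uri: str
    annotations: list = field(default_factory=list)

    def attach(self, note: Annotation) -> None:
        self.annotations.append(note)
        self.annotations.sort(key=lambda a: a.start_seconds)

    def apparatus_near(self, t: float, window: float = 30.0) -> list:
        """Everything attached within `window` seconds of playback time t."""
        return [a for a in self.annotations if abs(a.start_seconds - t) <= window]

# A toy example: evidential data and a citation hung off minute three.
report = VideoArticle("Phenotype screen, run 4", "https://example.org/screen.mp4")
report.attach(Annotation(180.0, "dataset", "Raw screen counts", "doi:10.1234/placeholder"))
report.attach(Annotation(185.0, "citation", "Prior method paper", "doi:10.5678/placeholder"))
print([a.label for a in report.apparatus_near(182.0)])
```

The point of the sketch is simply that once the video is the stem, citation, indexation and evidential data become timecoded attachments rather than the other way round.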

And then there is data. I have thundered on about evidential data and the importance of using the article as the routefinder that leads researchers to the statistics, or the findings, or the software used in the analysis. And we have all talked endlessly about metadata searching, about applications of Big Data in science, and about data analytics. But I am moving to the view that we are crucially underplaying the importance of another sort of data, which we used to characterize as “usage data” while wondering whether it would ever become significantly exploitable. The CIBER team have long warned about the underuse of usage logs, but the force of the point has been increasingly brought home to me by an appreciation of what excellent data output can be derived from interfaces like Mendeley or ReadCube. We now begin to appreciate, almost for the first time, what usage patterns can be mapped – and what they mean. This is important for researchers, and vital for publishers. Users will rightly demand this form of data analysis, and will become increasingly interested in what, of the mass of data they buy access to, is effective and cost-effective. This will start at the sharp end, in areas like drug discovery, but will grow into a habit of mind as data overload becomes ever more daunting. Concentrating purchasing policies on data that can be demonstrated to support improved decision-making, better compliance or increased productivity will drive us to collect and analyse our user data to demonstrate that what we have makes a difference. And as we are driven this way we will get deeper and deeper into examining what users do with our data, and we will be surprised by how much we can track and know. And that knowledge will form another layer in our content stack, alongside the metadata itself.
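As a minimal sketch of the kind of analysis I mean – turning raw usage logs into an ROI measure a library or lab could act on – consider the following. The log fields, the “decision” actions and the costs are all invented for the purpose of illustration; no vendor’s actual log format is assumed.

```python
from collections import defaultdict

# Hypothetical usage-log events: (user, resource, action). An action such as
# "exported" or "cited" is taken here as a crude proxy for content that
# actually supported a decision; all of these values are invented.
events = [
    ("u1", "journal_a", "viewed"),
    ("u1", "journal_a", "exported"),
    ("u2", "journal_a", "viewed"),
    ("u2", "journal_b", "viewed"),
    ("u3", "journal_b", "cited"),
]
annual_cost = {"journal_a": 12000.0, "journal_b": 8000.0}  # licence spend, GBP
DECISION_ACTIONS = {"exported", "cited"}                   # proxy for "made a difference"

uses, decisions = defaultdict(int), defaultdict(int)
for _user, resource, action in events:
    uses[resource] += 1
    if action in DECISION_ACTIONS:
        decisions[resource] += 1

for resource, cost in annual_cost.items():
    per_use = cost / uses[resource] if uses[resource] else float("inf")
    per_decision = cost / decisions[resource] if decisions[resource] else float("inf")
    print(f"{resource}: £{per_use:,.0f} per use, £{per_decision:,.0f} per supported decision")
```

Crude as it is, a cost-per-supported-decision figure is exactly the sort of number a purchasing librarian or a pharma knowledge manager could take to a budget meeting.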

This game is already afoot in the biopharma sector. Eight weeks ago Relay Technology Management (http://relaytm.com) launched a “real-time Business Intelligence and Data Visualization Solution” for life sciences. Building on their RVI (Relative Value Index) formula, this new BD Live! (Business Development Live!) construction demonstrates some of the ways in which scientists and researchers in the future will want to have their data assets assessed – and the ROI of their purchases demonstrated. It is probably no accident then that Nature Publishing Group made an investment in Relay at the end of last year.
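Relay’s actual RVI formula is proprietary and is not reproduced here; but purely to illustrate the shape of such an index – a weighted score over an asset’s signals, used to rank a portfolio – here is a toy version in which every input and weight is invented.

```python
# A toy weighted index over a drug-development asset's public signals.
# NOT Relay's RVI: the signals and weights below are invented for this sketch.
WEIGHTS = {"publications": 0.3, "trial_phase": 0.4, "patent_strength": 0.3}

def toy_relative_value(signals: dict) -> float:
    """Weighted sum of signals pre-normalised to the 0-1 range."""
    return sum(WEIGHTS[k] * signals.get(k, 0.0) for k in WEIGHTS)

portfolio = {
    "compound_x": {"publications": 0.8, "trial_phase": 0.5, "patent_strength": 0.6},
    "compound_y": {"publications": 0.4, "trial_phase": 0.9, "patent_strength": 0.7},
}
for name in sorted(portfolio, key=lambda a: toy_relative_value(portfolio[a]), reverse=True):
    print(name, round(toy_relative_value(portfolio[name]), 2))
```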
