It was inevitable that some readers of my piece on Gold OA earlier this week would come back and say that I have grown too fond of defining what won’t work, and should become more proactive about stating new directions. Fair cop. Here, then, are two “assertions” about that future for science journal publishers. Both concern areas in which “traditional” text-based “publishing” has only the slightest current experience and skills base, yet which will be vitally significant for the industry five years out. Both fit into a vision of scholarly communication in which the publisher’s role evolves away from primary publishing (which will become the prerogative of librarians and repositories) and into workflow management solutions and the quality control of content within the research process.

My two candidates for step-change status are:

1.  The evolution of video from an accompanying feature into the core medium for reporting scientific research.

2.  The development of robust and auditable techniques for evaluating the impacts of content on research, creating measures for ROI in content purchasing, and fresh, searchable data derived from the evaluation of usage. This data, along with content metadata, will be more valuable to players in this market than the underlying content on which it rests.

Let’s start with the first issue. I am fed up with being told that science and scientists are too dull or too complex for video. Too dull? Just go and play these two minutes of an interview with John Maynard Smith, the great biologist, on Vitek Tracz’s pioneering site Web of Stories (http://www.webofstories.com/play/7277?o=MS) and try to maintain that view. And that site has excellent metadata, as does the video-based Journal of Visualized Experiments (JoVE), which announces this month its extension into experimental reporting in physics and engineering as well as the life sciences (www.jove.com/about/press-releases). Note that both of these sites set a premium upon narrative, and recall the narrative argument in my recent piece on next-generation learning (After the Textbook is Over…, 3 June 2012), which was demonstrated in some wonderful transmedia software (http://www.inthetelling.com/tellit.html). Once again this demonstrates that video is quite capable of forming the narrative stem onto which links, citations, indexation, abstracts and other aids to discovery and navigation can be attached. Indeed, the text itself can be attached, along with demos and lectures and evidential data. Video file sharing is now trivially easy in the broadband world. Some academics will complain that they lack video storytelling skills, and this in turn may be a service that publishers can add to the APC – as long as they acquire those skills themselves in time!
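To make that “narrative stem” idea concrete, here is a minimal sketch in Python of how a video-centred article record might be structured, with citations, datasets and other discovery aids hung off timecoded anchor points in the video rather than off the text. Every class, field and URL below is my own illustration of the idea, not JoVE’s or anyone else’s actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    """An asset attached to a point in the video narrative."""
    timecode_s: float   # where in the video this attaches
    kind: str           # e.g. "citation", "dataset", "abstract", "demo"
    label: str
    uri: str

@dataclass
class VideoArticle:
    """The video as narrative stem; everything else hangs off it."""
    title: str
    video_uri: str
    metadata: dict = field(default_factory=dict)          # indexation terms, DOIs, etc.
    annotations: list[Annotation] = field(default_factory=list)

    def assets_at(self, t: float, window: float = 30.0) -> list[Annotation]:
        """Return attached assets within `window` seconds of time t,
        e.g. to surface the evidential data behind the segment now playing."""
        return [a for a in self.annotations if abs(a.timecode_s - t) <= window]

# Usage: attach the text and data to the video, not the other way round.
article = VideoArticle(
    title="Interview fragment",
    video_uri="https://example.org/video/123",          # placeholder URL
    metadata={"subject": "evolutionary biology"},
)
article.annotations.append(
    Annotation(timecode_s=95.0, kind="dataset",
               label="Supporting data", uri="https://example.org/data/123")
)
print(article.assets_at(100.0))
```

The design point is simply the inversion: the article text, the evidential data and the discovery apparatus become annotations on the video, rather than the video being an attachment to the text.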

And then there is data. I have thundered on about evidential data and the importance of using the article as the routefinder that leads researchers to the statistics, or the findings, or the software used in the analysis. And we have all talked endlessly about metadata searching, about applications of Big Data in science and about data analytics. But I am moving to the view that we are crucially underplaying the importance of another sort of data, the kind we used to characterize as “usage data” while wondering whether it would ever become significantly exploitable. The CIBER team have long warned about the underuse of usage logs, but the force of the point has been brought home to me by an appreciation of the excellent data output that can be derived from interfaces like Mendeley or ReadCube. We are beginning to appreciate, almost for the first time, which usage patterns can be mapped – and what they mean. This is important for researchers, and vital for publishers. Users will rightly demand this form of data analysis, and will become increasingly interested in which parts of the mass of data they buy access to are effective and cost-effective. This will start at the sharp end, in areas like drug discovery, but will grow into a habit of mind as data overload becomes ever more daunting. Purchasing policies that concentrate on data demonstrably supporting improved decision-making, better compliance or increased productivity will drive us to collect and analyse our user data in order to show that what we offer makes a difference. And as we are driven this way we will get deeper and deeper into examining what users do with our data, and we will be surprised by how much we can track and know. That knowledge will form another layer in our content stack, alongside metadata itself.
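To illustrate the kind of analysis I mean, here is a minimal sketch, assuming a simple event log of the sort a Mendeley- or ReadCube-style interface might expose. The log format, the engagement weights and the subscription prices are all my own assumptions for illustration, not any vendor’s actual data or API; the point is only that a cost-per-weighted-use figure is exactly the sort of ROI measure a purchasing librarian could rank titles by at renewal time.

```python
from collections import Counter

# Hypothetical usage events: (user_id, journal, action).
events = [
    ("u1", "J. Mol. Biol.", "download"),
    ("u2", "J. Mol. Biol.", "annotate"),
    ("u1", "Phys. Lett. Q", "download"),   # fictitious title
    ("u3", "J. Mol. Biol.", "download"),
]

# Annual subscription cost per journal (illustrative figures only).
subscription_cost = {"J. Mol. Biol.": 12000.0, "Phys. Lett. Q": 9000.0}

# Weight "deep" engagement (annotating, sharing) above a bare download,
# on the assumption that it better signals influence on research.
weights = {"download": 1.0, "annotate": 3.0, "share": 2.0}

engagement = Counter()
for _, journal, action in events:
    engagement[journal] += weights.get(action, 1.0)

# Cost per weighted use: a crude but auditable ROI measure.
for journal, cost in subscription_cost.items():
    uses = engagement[journal]
    if uses:
        print(f"{journal}: {uses:.0f} weighted uses, "
              f"£{cost / uses:,.2f} per weighted use")
    else:
        print(f"{journal}: no recorded use")
```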

This game is already afoot in the biopharma sector. Eight weeks ago Relay Technology Management (http://relaytm.com) launched a “real-time Business Intelligence and Data Visualization Solution” for the life sciences. Building on its RVI (Relative Value Index) formula, the new BD Live! (Business Development Live!) offering demonstrates some of the ways in which scientists and researchers will want to have their data assets assessed in future – and the ROI of their purchases demonstrated. It is probably no accident, then, that Nature Publishing Group made an investment in Relay at the end of last year.

“OA was always going to be a marathon, but we seem to be waiting for a very long time indeed for any of the national runners to enter the stadium… but now there is a rustle of academic papers in the crowd, as “Two Brains” Willetts, the UK Minister for Precocious Intelligence in the Department of Obfuscation and Reduced Expenditure, is first to come through the gates and begin the final lap of the track. And, my goodness, how he is going… his face is a mask of determination as he seeks the Gold OA for the UK… Dame Janet Finch, his trainer, is fanning his face with a White Paper as he turns for home… looks scarcely legal to me… PLoS, the early front-runner, now has nothing left and has been overtaken by the Hughes/Wellcome/Planck entry, who plainly thought it was a three-legged race… and Two Brains now reaches out for the tape, a press release already in his hand, and that’s it, viewers, the games have not yet even begun and already the Brits have their hands on Gold!”

And it is a great day for British publishing as well. Clearly the three publisher members of the Finch Working Group worked their advocacy socks off, and as a result we have a conclusion embedded in today’s announcement (http://www.guardian.co.uk/science/2012/jul/15/free-access-british-scientific-research?newsfeed=true) that is as favourable to journal publishing interests as any that could be contrived. The Minister has his obligatory cost saving (£50m), the publishers get their APCs – fees for publishing OA articles – pegged at around 1% of the £4.6bn science budget (roughly £46m), the academics get Open Access and free distribution of the results, and Britain beats the US and the EU to the punch and sets a precedent they may have to follow: surely this is a golden dawn, and the greatest good of the greatest number has been miraculously accomplished?

Before we join in the celebrations, let’s just run back over the interests checklist and see how this announcement affects the longer-term perspective:

Academics: Will this announcement mollify the 12,000 who signed the petition against Elsevier? Some but not all, would seem to be the answer. Judging from the blogs so far, some scientists have started to complain that the government will give their work away for free (send for Dr Harnad to attend this sick man’s bedside!). Others will be pleased to see a principle acknowledged, even if it is 2014 before the results appear. For many, I suspect, the feeling will be like seeing a banker resign or give up a bonus – the protest was not against the act of banking or publishing, but against some bankers or publishers perceived to have gone too far. And it takes about a decade for these things to brew up – I suspect that many scientists were protesting against Elsevier’s pricing policies of the mid-nineties, not against the Elsevier of elegant technology solutions today, some of which they hardly associate with the journal publisher.

Librarians: This is a further step in the long-term marginalization of librarians in their traditional roles. But since it is now really clear that librarians and their skills base are urgently needed in repository development, in making research and evidential databases available, and in institutional self-publishing, this announcement will only hasten a transition already well underway.

Publishers: Many publishers will be relieved and happy at this outcome. Peer review, as administered by them and paid for by government, remains in their control. However, they need to add a grain of caution to their celebrations. True, if UK plc goes Gold OA on this basis, the revenue base of STM publishing will not suffer grievous harm. But margins will suffer more, and within a publishing economy that has APC revenue as part of the mix, journal publishing EBITDA must begin to fall. This in turn will affect the ability to finance new development at a time of critical change for the whole industry.

The real sufferers here will be the scholarly society publishers. Caught in the middle ground and dependent on the margins from subscription publishing to run a service-based professional body, some will move from leasing out the rights to publish their journals to selling their journals, in whole or in part, in order to create a financial cushion and an investment base, probably while retaining a quality-control interest in the journal brand. Likely result: big publishers get bigger. And big publishers get more diversified as well. Already Elsevier and Macmillan’s Nature set the pace in building workflow solutions for scientists and other researchers. Migrating the business away from sole reliance on the journal never seemed a more sensible strategy. The research article may be the “unit of currency” in scientific research, as publishers perpetually assure me, but it is undergoing a process of devaluation. Where a research programme is of vital significance to a whole sector, scholarly communication via blogs, conference proceedings, posters and the like will have lit up the track already, and scientists do not have to wait two years after programmes are completed to read the findings for the first time. And much current use of articles is about researching experimental technique, not outcomes: some researchers have claimed that over 70% of enquiries concern good or best practice in experiment set-up. Others point to the need for validating reports – those which repeat and confirm previously known findings – which, not being “new” science, seldom get reported. And then there is the good Dr Harnad and Green OA to contend with as well… though publishers will be heartened to hear him quoted as saying that this decision sets Open Access “back by at least a decade”.

And in a decade? The highest figure I have heard for current open access publishing across all the major publishers is that it accounts for some 7% of articles published, and it has taken five years to get there. Judging from the tepid enthusiasm of academics, my guess is that we shall top out at around 15%. By then the major players will have done a great deal of consolidation in a slowly contracting journals market; commoditization of the article through casual re-use will be a greater perceived threat; and diversification into workflow – using all of the publishing skills base to maintain knowledge systems (ontologies) across communities, so that everything relevant can be found and injected into the research process – will be deeply entrenched. Everything about STM will change, and in ten years we shall wonder what all the Open Access fuss was about, apart from gaining a political point for the present UK government and playing the publishers back onside again.
