Multi-tasking may be beyond me. I am finding it very hard to travel and blog at the same time. And having travelled 10 hours by train and over 50 hours by plane in the past 14 days, I must beg forgiveness for the gap in production on this blog. The annoying thing is that much of what I encountered on these travels has been red meat for bloggers.

Take as an example last week’s Information Industry Summit, organized as ever by the SIIA and held this year at New York’s Pier 60. This is a really important industry event, and I was pleased to be amongst a crowd of some 250 who enjoyed a very varied and interesting agenda. It was especially interesting for me these days, since these conferences provide a verbal map of industry sentiment. One would have thought, for example, that for many SIIA members 2012 was a disaster worth forgetting. Consolidation goes on apace, but many of the “great” players of yesteryear took considerable reverses, and investor sentiment about the “information industry”, if that means failing print to some, must be flagging. With McGraw-Hill dividing into two and selling the weaker part, and News Corp dividing into two and racking up real problems on the bad bank side, there are plenty of examples of near-terminal troubles. And yet the room was as full as ever of investors and analysts. I cannot be certain, of course, whether they were there to pick over the wreckage or to re-invest in businesses reshaped by digital development, but they are still there. And while they are there I really think we should give them a vision of the future which is not framed by our rear view mirror.

So the conference began with George Colony, the articulate and persuasive father of Forrester, giving a talk about Thunderstorms. Indeed, this was a recurrent theme. It reminded me of where we have been these last 20 years. Do you remember the long years when every conference started with a Christensen-esque speech on Disruption? Well, we learnt to live with Disruption, and we ignored the wonderful advice of the author of the Innovator’s Dilemma. For the next five years we were Crossing Chasms, getting one foot into the digital world, rebalancing ourselves. Well, does anyone know if we crossed? Or are we still poised? Then every media sector got involved, and all of a sudden we were Transitioning and Migrating. These may have been words which pleased investors at the time, and helped to explain why nothing much was happening at discernible speeds, but using them now seems laughable. How many major print-based powerhouses of 1993 can you name that have a stake in digital markets that matches a Google, or an Amazon, or a Facebook? We have businesses like Thomson Reuters and Reed Elsevier who have carved out niches in digital workflow, and players like Pearson who dominate education markets which have been slower to move to the network. But giants? Those have been built anew and elsewhere.

So when the conversation turned to Thunderstorms last week I wondered whether Americans had adopted the English art of under-statement. Cataclysm was the word that came to mind that week as I read Gannett’s results statement. If George Colony meant that our industry was under water then I might agree, but I suspect that he was looking for a metaphor for mindless violence, and came up short. Yet the metaphor led me to the totally sane, healthy and interesting part of the week. One objective of my trip was to help my friends at TEMIS, the French semantic analysis software company, launch their LUXID Community, a collaborative network of software players, content companies and platform providers. I was delighted to find some of the themes of the launch event taken up in the main conference. Look for yourself at http://www.temis.com/join-the-luxid-community or come to one of their meetings and express a view.

When extra-ordinary events take place, and change the entire landscape in which we work within a timeframe as short as 20 years, our reaction as businesses might reflect how we react as individuals in an earthquake or a tsunami. We pool our resources and pull together. I believe in what TEMIS are proposing because I do not think we will develop solutions for customers, or exploit to the full the digital opportunities given us by our content, our data, our market knowledge or our ability to develop high quality software unless we work together. Collaboration and co-operation are essential even if tomorrow we also need to buy or merge with some of those with whom we work today. Above all, this collaboration must extend to our customers: the lonely years of competition for competition’s sake must end, and we have to embrace our customers as partners – or they will become our competitors in ways which would be very toxic indeed.

Here then is a theme we could use to re-invigorate investors. Ask them to score us in terms of our proclivity for partnership. Look at us to see if we have a culture of experimentation that involves combining resources and attributes from several different sources to create a value which would have been otherwise impossible. As we move into a networked world where most service and solutions providers will sub-contract, outsource, partner and collaborate as easily as breathing then we in vital information markets could be leaders in proving that Vital is just as important as Big – and maybe more profitable. The LUXID Community may be a small step, but we could look back at this week as a very important one.

Not another note on Open Access, surely? Well, I am sitting here on 31 October reading an article published on 1 November (how up to date can a blogger be?) in the Educause Review Online (www.educause.edu/ero/article/peerj-open-access-experiment) and I really want to convey my respect for people like Peter Binfield, who wrote it, for their huge energy and ingenuity in trying to make Open Access work. Peter’s note, “PeerJ: An Open-Access Experiment”, describes the efforts that he and his PeerJ colleagues have put into creating fresh business models around Open Access, which was born without one and has always seemed to its adherents to need to be cloaked in one. Open Access has proved a far from lusty infant in many ways, but those who continue to adhere to the cause seem to feel, in their admirable and unfailing optimism, that some small tweak will suddenly create economic salvation and thus a take-off into sustainable business growth.

In the case of PeerJ, the take-off vehicle is going to be a membership model. Peter Binfield co-founded the outfit in June 2012 with Jason Hoyt, former Chief Scientist at Mendeley, but the model that they feel will work owes nothing to smart algorithms. Instead, they simply launch themselves at the Article Processing Charge (APC), the way in which Gold OA has been sustained so far, and replace it by – a subscription. Now this is admittedly a personal subscription, levied on all article contributors (that is where the volume lies – in multi-authoring), and subscribers – or members as they would wish to describe them – can then continue to publish without further charges as long as they keep up their membership fees. Of course, if they join with new teams who have not previously been members then I presume we go back to zero, until those contributors are also members with a publishing history. Each contributor who pays a membership fee of $299 can publish as often as he likes: a nominal $99 contribution allows you one shot a year.
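The arithmetic of the model described above can be sketched in a few lines. This is purely an illustration of the fee logic as I understand it from Peter’s article – the tier names, the function, and the assumption that every listed author needs a current membership are my own simplification, not anything published by PeerJ:

```python
# Illustrative sketch of the membership maths: the two fee levels come from
# the post ($299 unlimited publishing, $99 for one article a year); the
# structure below is a hypothetical simplification for discussion only.

TIERS = {
    "basic": 99,       # one article per year
    "unlimited": 299,  # publish as often as you like
}

def team_cost(authors):
    """Total membership cost for one paper.

    Every listed author is assumed to need a current membership, so the
    cost scales with the length of the author list, not with the number
    of articles. `authors` maps an author name to a chosen tier.
    """
    return sum(TIERS[tier] for tier in authors.values())

# A three-author paper where one author takes the unlimited tier:
paper = {"alice": "unlimited", "bob": "basic", "carol": "basic"}
print(team_cost(paper))  # 299 + 99 + 99 = 497
```

The interesting consequence, which the sketch makes visible, is the one raised in the paragraph above: add a new, non-member co-author to your next paper and the marginal cost returns, however many articles the existing members have already paid for.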

PeerJ have assembled a panel of 700 “world class academics” for peer review purposes and intend to open for submissions by the end of the year. In a really interesting variation on the norm, they have put a PrePrint server alongside the service, so submissions will be visible as soon as they enter consideration. It is not clear how much editorial treatment is involved in these processes, or indeed what “publishing” now means in this context, or when a submission appears on the pre-print server. But one thing is very clear: this is not going to be peer review as it once was, but simply technical testing of the type pioneered by PLoS ONE. Once it is established that the article conforms to current experimental good practice, it gets “published”.

It is around this point in ventures of this type that I want to shout “Hold on a moment – do we really know what we are doing here?” I am sure that I will be corrected, but what I can currently see is a huge dilution of the concepts of “journals” and “publishing”. PeerJ starts with no brand impact. It does not confer status by its selectivity, like Nature or Cell, nor does it carry even the brand resonance of PLoS. And its 700 experts, including Nobel Laureates, are being asked whether the enquiry methodology was sound, not whether the result was good science or advanced the knowledge base of the discipline. PeerJ should be commended for allowing reviews by named reviewers to be presented alongside the article, but, fundamentally, this seems to me like another ratcheting downwards of the value of the review process.

Soon we shall hit bottom. At that point a toolset will be available which searches all relevant articles against the submitted article, and awards points for fidelity to good practice or for permissible advances on established procedures. Authors who feel their articles have been misjudged can re-submit with amended input. The device will be adopted by those funding research, and once the device has issued a certificate of compliance, the article, wherever it is stored, will be deemed to have been “published”. There will be no fees and no memberships. Everything will be available to everyone. And this will usher in the Second Great Age of Publishing Journals, as the major branded journals exercise real peer review and apply real editorial services.

But something has changed now. The Editors of the Lancet or Nature or Cell have decided, in my projection, not to entertain submissions any longer. Instead they will select the articles that seem to them and their reviewers most likely to have real impact. These they will mark up to a high level of discoverability, using entity extraction and advanced metadata to make them effectively searchable at every level, section and expression within the article. Authors will have a choice when they are selected: they can either pay for the services up front or surrender their ownership of the enhanced version of the article. Since the article will already be available and technically assessed, spending more on it themselves will seem fruitless. So we shall return to a (much smaller but equally profitable) commercial journals marketplace, based once again on selectivity and real, expensive peer review.

Experienced readers will have already spotted the flaw. With wonderful technologies around like Utopia Documents and other new article development activities (Elsevier’s Article of the Future) surely the new age of the article can only exist until these technologies are generalized to every institutional and research programme repository. That is true – but it will take years, and by that time the publishers will be adding even higher value features to allow the researcher’s ELN (Electronic Lab Notebook) full visibility of the current state of knowledge on a topic. Beyond that, we shall consider articles themselves too slow, and inadequate for purpose, but that is a discussion for another day.
