So off I went to the PPA (Periodical Publishers Association) conference, arriving unexpectedly early and thus catching the Minister for Culture making a manful – and brilliantly successful – effort to say nothing of consequence about the future of magazines for 20 minutes. I have encountered Ed Vaizey before – as pleasant and affable a politician as one could wish to meet – but he made it clear that everything significant was decided, as he put it, “above his payscale”, so there was no real point in asking him a question at all. I reflected on the wit who suggested that if you need a Minister for Culture you have no culture, and on a political society in which the government, criticized for doctoring yesterday’s Queen’s Speech – the one laying out its legislative programme – in the light of the previous week’s election results, responded by pointing out that the speech in question is written on goatskin vellum, which takes a week to prepare and inscribe, and that the ink takes three days to dry. And we expect politicians to help us into a networked society! Really!

But from this low point everything got better. Under the ebullient chairmanship of Barry McIlhenney we looked through the PPA Publishing Futures report, where some of the characteristics of the industry became clear. In old-world terms, the PPA’s consumer and B2B sectors are pulling further apart, and after a year of slippage in 2012, forecasts for the coming year are more buoyant in B2B than elsewhere. My surprise was that 34% of sales revenue came from outside the UK (46% in B2B). It was not surprising that consumer is only 8% digital, or that B2B is down to 41% of revenues coming from print (though the remainder is a mix of digital with events and consultancy). Average profit margin was 15-16%: very much higher for many B2B companies, rather lower for some consumer players who see little advertising recovery in print. But the world of the future that they all see is a wider range of revenue sources, derived from additional services offered by remodelled businesses which are more “customer-centric” (one of the expressions du jour). The risks are the UK’s dodgy economy, the shortage of investment, the speed of change and the skills gap. B2B now recognizes that scale matters, and confidence is linked to size. On a scale of 1-10, member confidence stood at 8.4, with B2B averaging 9.1.

If indeed confidence is half the battle then this is good. And what followed bore out a good deal of that. Future’s Nial Ferguson showed the T3 technology service platform, a real mix of events, awards and digital services that has 40k subscriptions and 4m uniques a year, doubling year on year. This has the same usage in the US as in the UK. Less than 20% of margins now comes from print, while 50% is digital. William Reed Publishing’s 50 Best Restaurants service has similar characteristics, with significant sponsorship (another theme of the day was the importance of sponsorship) and use of social media marketing techniques. Some players still feared the cannibalistic tendency of some digital developments (dmgmedia), but others saw, and grasped at, completely new business model concepts. In the latter category Immediate Media (BBC Magazines and Magicalia) was a stand-out, with CEO Tom Bureau placing ecommerce centre stage and using brand astutely with some key demographics. But was this really customer-centric? Going retail, in a UK High Street retail market that seems to have lost touch with customers, must surely imply that you know customer needs better than bricks-and-mortar retail does. What we heard about was not mass customization, but a development of reader reply cards, making it hard to see just what the partnership (another good word of the day) with market data player CACI really meant. The big pull at Immediate is Radio Times (bought by 900k AB1s a month and 2.2m at Christmas; the problem is that they are mostly over 55). Making programming links to travel services (inviting people to book beach holidays at the murder scene in the successful UK crime thriller Broadchurch was a stretch too far for me!) is one thing; supplying customer needs in a user-centric manner is quite another. But I really liked the idea of using brand clout to get the travel companies to share booking data with you.

Dennis, in the hands of James Tye, their CEO, had a more relaxed view. He feels the key problem is format transfer. So they have invested in their supplier, Contentment, and its Padify environment, and have based themselves on HTML5 so as to “future-proof” the business. With 50 apps in the market and 50% of The Week’s subscribers taking a digital product, and given the strength of their print, there is an implication here, as elsewhere, that management have time to plan and strategize a response to a networked world. Listening to this I wondered if that was justified. I would have said that the only way to secure any degree of future-proofing is to get all the data – not content – semantically enriched and onto a single platform capable of interrogating structured and unstructured information, and to make the key asset the searchable metadata, thus enabling content production to HTML5 or anything else, regardless of format. This prepares the way for a truly user-driven network world – one where, amongst other things, the user drives the service through personalization. Templating is very restrictive, and Create Once, Publish Everywhere sounds grand, but it only works when the user sees the format and editorial input that you have created for him as more important than removing those constraints and giving him just the content he needs at a particular point in time or in a particular context.
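To make the distinction concrete, here is a minimal sketch in Python of the kind of single-platform approach I have in mind; the names and the toy data are entirely my own invention, not anyone’s actual system. Content is held once, format-free, with semantic metadata attached, and the delivery format – HTML5 or anything else – is generated on demand from a metadata query rather than from a fixed template.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    """A unit of content held once, format-free, with semantic metadata."""
    item_id: str
    body: str                                     # raw text, no presentation markup
    metadata: dict = field(default_factory=dict)  # e.g. topic, audience, type

# A toy "single platform": one store, queried via metadata, rendered per channel.
STORE = [
    ContentItem("t3-001", "Hands-on review of the latest tablet.",
                {"topic": "tablets", "audience": "consumer", "type": "review"}),
    ContentItem("t3-002", "Teardown data and benchmark tables.",
                {"topic": "tablets", "audience": "enthusiast", "type": "data"}),
]

def query(**criteria):
    """Select content by its metadata, not by the publication it was created for."""
    return [c for c in STORE
            if all(c.metadata.get(k) == v for k, v in criteria.items())]

def render_html5(items):
    """One possible rendering; the format is decided only at delivery time."""
    return "\n".join(f"<article id='{c.item_id}'><p>{c.body}</p></article>"
                     for c in items)

def render_plain(items):
    """Another rendering of exactly the same stored content."""
    return "\n".join(f"{c.item_id}: {c.body}" for c in items)

if __name__ == "__main__":
    # A personalised, user-driven selection: the reader's context drives the query.
    picks = query(topic="tablets", audience="consumer")
    print(render_html5(picks))   # HTML5 today...
    print(render_plain(picks))   # ...or any other format tomorrow.
```

The point of the sketch is simply that the durable asset is the enriched, queryable metadata; the templates and formats are disposable outputs.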

And then on to Events. I did not go on the stream headed Content: Still King? for fear of blood pressure problems, but I really enjoyed the B2B sessions. People kept using words like Collaboration, Community – and even client ROI. Many of my anticipated criticisms from the previous post were confounded. I really liked the IHS Janes experience of getting users to ask for and subscribe to online seminar sessions, using the expertise of the Janes advisors in a new way, and then feeding the data gathered back into the publication system for blogs, articles and the rest. I rejoiced at the EMAP presentation: how refreshing it was to hear a manager in a unit that creates about 30% of EMAP’s revenue say that sales staff had to be retrained to ask the right questions and listen to the answers, in the cause of getting customers to tell you what they want. EMAP’s 780 sponsors now provide some 50% of gross revenue, and the objective, as yet not attained, is to retain 75% of them each year. The naming rights enjoyed by BT and Oracle at Retail Week events made a good case study, and supported the idea of a 12% growth rate in the coming year (given performances of 7% and 17% in the two previous years, during which the changeover to a sponsor-centric view has taken place).

And my grand vision of event software that allowed attendees, sponsors and exhibitors to create their own meetings and agendas within the event? It all takes place on Twitter and Facebook, apparently – which implies that event owners do not have the data flowing from this either. But the good news is that event organizers do need to give sponsors and exhibitors some idea of the ROI on the event: it might help here to have some convincing data to put into that model!

By the time I reached the street it had stopped raining. I hope that is true for this industry as a whole, and that they sound convincing when they meet their historic users once again – in the network.

A sudden thought. Doing an interview with some consultants yesterday (we are fast approaching the season when some major STM assets will come back into the marketplace) I was asked where I had estimated Open Access would be now, when I advised the House of Commons Science and Technology Committee back in 2007 on the likely penetration of this form of article publishing. Around 25%, I answered. Well, responded the gleeful young PhD student on the end of the telephone, our research shows it to be between 5% and 7%. Now, I am not afraid of being wrong (like most forecasters, I have plenty of experience of it!). But it is good to know why, and I suspect that I have been writing about those reasons for the last two years. Open Access, defined around the historic debate twixt Green and Gold, when Quixote Harnad tilted at publishers waving their arms like windmills, is most definitely over. Open is not, if by that we begin to define what we mean by Open Data, or indeed Open Science. But Open Access is now open access.

In part this reflects the changing role of the Article. Once the place of publisher solace as the importance of low-impact journals declined, it is now the vital source of the things that make science tick – metadata, data, abstracting, cross-referencing, citation, and the rest. It is in danger of becoming the rapid act at the beginning of the process which initiates the absorption of new findings into the body of science. Indeed some scientists (Signalling Gateway provided examples years ago) prefer simply to have their findings cited – or release their data for scrutiny by their colleagues. Dr Donald Cooper of the University of Colorado, Boulder, used F1000Research to publish a summary of data collected in a study that investigated the effect of ion channels on reward behavior in mice. In response to public referee comments he emphasized that he published his data set in F1000Research “to quickly share some of our ongoing behavioral data sets in order to encourage collaboration with others in the field”. (http://f1000.com/resources/Open-Science-Announcement.pdf)

I have already indicated how important I think post-publication peer review will be in all of this. So let me now propose a four-stage Open Science “publication process” for your consideration:

1. Research team assembles the paper, using Endnote or another process tool of choice, but working in XML. They then make this available on the research programme or university repository, alongside the evidential data derived from the work.

2. They then submit it to F1000 or one of its nascent competitors for peer review at a fee of $1000. This review, over a period defined by them, will throw up queries, even corrections and edits, as well as opinion rating the worth of the work as a contribution to science.

3. Depending upon the worth of the work, it will be submitted/selected for inclusion in Nature, Cell, Science or one of the top-flight branded journals. These will form an Athenaeum of top science, and continue to confer all of the career-enhancing prestige that they do today. There will be no other journals.

4. However, the people we used to call publishers, and the academics we used to call their reviewers, will continue to collect articles from open sources for inclusion in their database collections. Here they will do entity extraction and other semantic analysis to build what they will claim are the classic environments that each specialist researcher needs to have online, while providing search tools that let users search there, or there plus all of the linked data available on the repositories where the original article was published, or there plus the data and all other articles and data that have been post-publication reviewed anywhere (a rough sketch of what this could look like follows this list). They will become the Masters of Metadata, or they will become extinct. This is where, I feel, the entity or knowledge stores that I described recently at Wiley are headed. This is where old-style publishing gets embedded into the workflow of science.
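As a rough illustration of what becoming the Masters of Metadata might mean in practice, here is a small Python sketch. The article records, the repository URLs and the crude dictionary-based entity extraction are all my own invention for the purpose of illustration; a real service would use trained extraction models and curated ontologies. The shape of the idea is the point: harvest open articles, extract entities, and build a searchable metadata index that links back to the repositories holding the evidential data.

```python
from collections import defaultdict

# Toy entity dictionary; a production system would use proper NLP and ontologies.
KNOWN_ENTITIES = {"ion channel", "reward behavior", "mice", "alzheimer's"}

# Open articles as they might be harvested from repositories, each carrying a
# link back to the evidential data published alongside the paper.
ARTICLES = [
    {"id": "rep-0001", "title": "Ion channels and reward behavior in mice",
     "text": "We examine how ion channel expression shapes reward behavior in mice.",
     "data_url": "https://repository.example.edu/rep-0001/data"},
    {"id": "rep-0002", "title": "A note on Alzheimer's markers",
     "text": "Candidate markers for Alzheimer's progression are compared.",
     "data_url": "https://repository.example.edu/rep-0002/data"},
]

def extract_entities(text):
    """Very crude entity extraction: dictionary lookup over lower-cased text."""
    lowered = text.lower()
    return {e for e in KNOWN_ENTITIES if e in lowered}

def build_index(articles):
    """Invert the metadata: entity -> articles (and their linked data)."""
    index = defaultdict(list)
    for art in articles:
        for entity in extract_entities(art["title"] + " " + art["text"]):
            index[entity].append({"id": art["id"], "data_url": art["data_url"]})
    return index

if __name__ == "__main__":
    index = build_index(ARTICLES)
    # A researcher searches the enriched collection, not a single journal, and
    # is pointed to both the articles and the underlying data sets.
    for hit in index.get("ion channel", []):
        print(hit["id"], "->", hit["data_url"])
```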

So here is a model for Open Science that removes copyright in favour of CC licenses, gives scope for “publishers” to move upstream in the value chain, and lets them compete increasingly in the data and enhanced workflow environments where their end-users now live. The collaboration and investment announced two months ago between Nature and Frontiers (www.frontiersin.org), the very fast-growing Swiss open access publisher, seems to me to offer clues about the collaborative nature of this future. And Macmillan Digital Science’s deal on data with SciBite is another collaborative environment heading in this direction. And in all truth, we are all now surrounded by experimentation and the tools to create more. TEMIS, the French data analytics practice, has an established base in STM (interestingly, their US competitor, AlchemyAPI, seems to work mostly in press and PR analysis). But if you need evidence of what is happening here, then go to www.programmableweb.com and look at the listings of science research APIs. A new one this month is the BioMortar API, offering “standardized packages of genetic patterns encoded to generate disparate biological functions”. We are at the edge of my knowledge here, but I bet this is a metadata game. Or ScholarlyIQ, a package to help publishers and librarians sort out what their COUNTER stats mean (endorsed by AIP); or the ReegleTagging API, designed for the auto-tagging of clean energy research; or, indeed, the OpenScience API, Nature Publishing’s own open access point for searching its own data.
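For readers who have never poked at one of these services, the pattern is usually the same: an HTTP query returning structured, metadata-rich JSON. The Python sketch below shows that pattern only; the endpoint, parameters and response fields are invented for illustration, since each of the services listed above defines its own.

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical endpoint, for illustration only; real services listed on
# programmableweb.com each publish their own base URLs and parameters.
BASE_URL = "https://api.example.org/research/v1/articles"

def search_articles(query, limit=5):
    """Query a (hypothetical) research API and return article metadata records."""
    url = f"{BASE_URL}?{urlencode({'q': query, 'limit': limit})}"
    with urlopen(url) as response:
        payload = json.load(response)
    # Assumed response shape: {"results": [{"doi": ..., "title": ..., "entities": [...]}]}
    return payload.get("results", [])

if __name__ == "__main__":
    for record in search_articles("ion channels reward behavior"):
        print(record.get("doi"), "-", record.get("title"))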

And one thing I forgot. Some decades ago, I was privileged to watch one of the great STM publishers of this or any age, Dr Ivan Klimes, as he constructed Rapid Communications of Oxford. Then our theme was speed. In a world where conventional article publishing could take two years, by using a revolutionary technology called fax to work with remote reviewers, he could do it in four months. Dr Sam Gandy, an Alzheimer’s researcher, is quoted by F1000 as saying that his paper was published in 32 hours, and they point out that 35% of their articles take less than 4 days from submission to publication. As I prepare to stop writing this and press “publish” to instantly release it, I cannot fail to note that immediacy may be just as important as anything else for some researchers – and their readers.
