I blinked at today’s announcement with incredulity. Nielsen Expositions sold to private equity for $950m? (http://www.followmag.com/2013). Where does this madness end? Since 2008 we have been living, in traditional B2B markets, with the reality of the network. We have all talked increasingly confidently about the irreversible decline of advertising in print, and our inability to replace it in a satisfactory way online. We have talked of companies getting smaller – but more profitable – and we have talked about the future in terms of creating workflow solutions for our customers, using our data to create these service solutions for them, and using our metadata as the sandbox of new product development to build applications that really bind customers to us. The opportunity is now open to us to lead our markets into the future, basing our claim to our clients squarely on the proposition that we can improve their productivity (and thus cut costs) and enhance their decision-making by getting all the salient knowledge into the right framework at the right time, while protecting their backs against the thorn hedge of re-regulation that encroaches upon the post-recession world. This is a wonderful opportunity, and how good it is to see Thomson Reuters, Reed Business, Lexis Risk and others getting fully to grips with it.

Meanwhile, how sad it is to see the old B2B players in Europe dodging the inevitable. While Schibsted and Axel Springer in the declining newspaper market now make a fetish of collecting B2B classifieds services (Reed sold Totaljobs to the latter very shrewdly), mainstream B2B in the UK, outside of the market leaders mentioned, seems to have something of a collective death wish at the moment. Like Gaul, EMAP is in three parts, each of them unsaleable as they stand: the data section is too diverse, the exhibitions business too small, and the magazines too unprofitable. Over at UBM, they now talk the language of exhibitions and conferences as if it were the golden hope. B2B at Informa remains a collection of fragmented and unrelated businesses, which was how management historically wanted things, but which now ignores the need to centre on data and to play the combined strengths of all the data into the key markets you want to grow. And if Datamonitor does not provide a rich way of enhancing service values across the group, then what does? Meanwhile Incisive and Haymarket seem to groan for solutions, while only Centaur amongst the smaller players seems to have woken up and smelt the coffee.

I am reciting this doleful catalogue as a way of steeling myself for this week’s PPA Conference in London. What would make me most happy is hearing someone say – “Yes, we are re-investing in our events portfolio through a transformative agreement with a software partner. The object is to build readership into virtual events, extending our conferences and exhibitions into year-long happenings, open 24/7. Yes, we know we have to give attendees at real events more – find out what they want, research and book meetings for them, and so on – while giving exhibitors a better deal: client introductions and profiles, and year-long follow-up with new product releases and regular contact. Yes, we know that, even if it is almost too late, we need to build community urgently before we finally lose the chance, and we know that conference delegates, exhibition attendees and exhibitors all want a better deal. Not several better deals – just the one will be good enough.”

I was once, briefly, non-executive chairman of an events software company. I know that rapid development has taken place to assemble data, match buyers and sellers, set up itineraries and update core data holdings with key changes year by year. And I go to about 15 conferences and exhibitions each year, but have yet to be asked whom I want to meet, or what I want to realise from the experience. Afterwards, however, I am deluged with surveys about what I accomplished and how good the show was. This seems to me to be quite upside down. Like most of my fellow citizens, I am well known in the network: find me on LinkedIn or Twitter and you could even guess, from my friends and contacts, whom else I might like to meet. UBM bought the rights to the reality-failed COMDEX, and launched a virtual exhibition in November 2012. It attracted an audience that seemed to please UBM, but on the website I see no mention of a 2013 edition, or even of a web presence continuing from the last effort. And last year’s registration asked none of the questions that might be thought relevant to using the meeting effectively. Yet, as I have mentioned here before, if virtual reality is cheap enough to teach language learners spoken English proficiency (www.rendezvu.com), then it will surely sustain the 5000 visitors and 50 exhibitors that came last year. Or will it just slip away, just as London’s Online show has slipped back into a library conference in the hands of Incisive?
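The buyer–seller matchmaking described above is conceptually straightforward. A minimal sketch of the idea – all names and interest tags here are invented for illustration, not drawn from any real event platform:

```python
# Hypothetical sketch: suggest meetings by overlapping interests, the kind
# of pre-show matchmaking events software could do at registration time.
# All attendee/exhibitor names and interest tags are invented.

attendees = {
    "A. Jones": {"workflow tools", "semantic search"},
    "B. Patel": {"virtual events", "analytics"},
}
exhibitors = {
    "DataCo": {"analytics", "semantic search"},
    "EventSoft": {"virtual events"},
}

def suggest_meetings(attendees, exhibitors):
    """For each attendee, list exhibitors sharing at least one interest."""
    suggestions = {}
    for name, interests in attendees.items():
        # set intersection: any shared tag makes the pair worth a meeting
        matches = sorted(ex for ex, offer in exhibitors.items()
                         if interests & offer)
        suggestions[name] = matches
    return suggestions

print(suggest_meetings(attendees, exhibitors))
```

Asking two registration questions up front – whom do you want to meet, and what do you want from the show – would give the organiser exactly the interest sets this kind of matching needs.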

So I am worried by what I will find at the PPA. Meanwhile, virtual reality is being used intensively in other places – particularly in the cash-starved museums and art galleries of Europe. Maybe our publishing directors should organize an outing to the local resource to see how it’s done!

…by which to test the seriousness of the industry. (Yes, I went to the new Hamlet production at Stratford last week). And this week’s play, acted out to a packed house of industry watchers and market analysts, has been the seduction and vanquishment of the fair Mendeley by all-powerful Elsevier, so rudely forced. Or, if you prefer, the seduction of barbarous Elsevier by maidenly Mendeley. Whatever, here was a deal done for a company with negligible revenues at a price, with earn-outs, of something up to $100m, according to those ever-present “people familiar with the deal”. And since I have seldom had more requests to explain, here is my take: Mendeley represents the greatest leap forward since Eugene Garfield in representing the worth of a science research article. If it went to Thomson Reuters, it would put them back into a game where Elsevier have spent a huge amount, culminating in SciVal, in competitive efforts to diminish them. As in days of yore (who remembers BioMedNet?), the competitive threat potentially posed by Mendeley proved greater than the price misgivings. If it went to Macmillan, who already have an investment in ReadCube, Mendeley’s competitor, it would create another axis of competition which would be unwelcome, given the strides that Macmillan Digital Science made by investing in Altmetric, as well as figshare. Since every article is unique and not a competitor with other articles, the true point of competition in science research publishing now lies in workflow tools which make researchers more productive – and help them to decide what to actually read, and what to reference and visualize. So, continuing my Danish theme, this is a pre-emptive strike, like Nelson destroying the fleet at Copenhagen. Do we know whether Mendeley is the ultimate social tool for tracking who buys and reads what? No, no more than we know whether Facebook is the player in place for life in social networking. But we do know that more than 2 million active researchers value it immensely, and so it posed a question – and one that for 20 years Elsevier have been adroit in answering.

This raises a few questions. Will Elsevier be able to run it independently enough to reassure those critics who regard it as more like Caliban than Caesar? And are we being distracted by watching the wrong part of the game with too much intensity? I am a strong supporter of what the Mendeley team have done, but they were let into the marketplace by a chronic publishing failure: the inability of producers to sell to researchers adequately identified PDFs that obeyed agreed industry standards and which would allow a researcher to auto-index his hard disk and find what he had bought. As ever, publishers were complacent about the downstream problems they caused their users. But the real question here is about metadata, and it is a timely reminder of other problems we have never fully solved. When we adopted DOI/Handle technology, the publishing community worked, as always, at the lowest common denominator of agreement. The result is a world in which articles are effectively numbered, and CrossRef expresses that industry cohesion, but we still cannot offer researchers the ability to search consistently over the full range of articles for which they have permissions cleared, using their own or even semi-standardized taxonomies. Nature (http://www.nature.com/news/the-future-of-publishing-a-new-page-1.12665) has done sterling work in the last month on the future of publishing, but simply illustrates to me how inadequately we tackle the last steps – the ones that lead to collaboration and to each player moving forward to create knowledge stores which reflect the real research needs of their users.
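What standardized, embedded article metadata would have enabled is simple to sketch. Assuming each purchased article carried a machine-readable record (DOI plus taxonomy terms – the records below are invented for the example), a researcher’s machine could auto-index a local library and search it by topic:

```python
# Hypothetical sketch: the auto-indexing Mendeley had to bolt on after
# the fact. If publishers shipped every PDF with a standard metadata
# record, a local index would be trivial to build.
# DOIs, titles and taxonomy terms below are all invented.

library = [
    {"doi": "10.1000/j.example.2013.001",
     "title": "Gene expression profiling in model organisms",
     "terms": ["genomics", "model organisms"]},
    {"doi": "10.1000/j.example.2013.002",
     "title": "Graphene conductivity at low temperature",
     "terms": ["materials science", "graphene"]},
]

def build_index(records):
    """Invert the metadata: map each taxonomy term to the DOIs carrying it."""
    index = {}
    for rec in records:
        for term in rec["terms"]:
            index.setdefault(term, []).append(rec["doi"])
    return index

index = build_index(library)
print(index["genomics"])  # DOIs of locally held articles tagged 'genomics'
```

The hard part is not the indexing but the agreement: without a shared (or at least mappable) taxonomy across publishers, each researcher’s index fragments along house-vocabulary lines, which is exactly the lowest-common-denominator problem described above.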

I do not mean to say that publishers do not collaborate. They increasingly do, and recent press coverage of Springer and CAS, or the case study of Wiley’s work with the AGU, demonstrates this. I have been involved with the TEMIS work on collaboration and have learnt a lot from it. And publicly industry leaders do point to data-led strategies, which I was interested to hear acknowledged in a talk by Steve Smith (CEO, John Wiley) to the AAP/PSP in February, which I moderated. So I was very interested indeed to spend some time with Jason Markos, Director of Knowledge Management and Planning at Wiley, and get a current view on the enrichment picture. The contrast over the past five years is, to someone used to the sometimes somnolent complacency of publishing, quite startling. Now you can have conversations about content enrichment that do begin to embrace both the narrow/deep and the broad/shallow needs of users. The capacity now available in publishing comes, it must be said, from people who entered from outside, with a real technical grasp of knowledge engineering that was not prefaced by life in linear publishing workflow processes. If that capacity is applied to turning away from content architectures predicated on the structure of the article, and towards creating entity stores or “knowledge” stores which allow data items from article databases to be searched in conjunction with data drawn from other sources, like evidential data, then we may indeed be on the way towards a user-driven, networked vision of the future of publishing. Learning how to work with knowledge models as a way of expressing the taxonomic values of all of this shows me that we are on a route march along a track that has been obvious for a little while now: adopting RDF as a basis, and creating triples to anchor our texts in semantically searchable environments.
So our new knowledge engineers will be able to spin out new service environments for increasingly demanding users, and the publishing game will not peter out with the commoditization of the article…
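The triple idea is worth making concrete. A toy sketch of the RDF notion described above – facts expressed as subject–predicate–object triples, queried by pattern matching; all the entity names and predicates here are invented for illustration, and a real system would use rdflib, a triple store and SPARQL rather than Python tuples:

```python
# Hypothetical sketch of triples anchoring article content to entities,
# so that article data can be searched alongside other data sources.
# All identifiers are invented; "ex:" stands in for a real namespace IRI.

triples = {
    ("ex:article42", "ex:mentions", "ex:GeneBRCA1"),
    ("ex:article99", "ex:cites", "ex:article42"),
    ("ex:GeneBRCA1", "ex:associatedWith", "ex:BreastCancer"),
}

def match(s=None, p=None, o=None):
    """Return triples matching the pattern; None is a wildcard,
    like a variable in a SPARQL basic graph pattern."""
    return sorted(t for t in triples
                  if (s is None or t[0] == s)
                  and (p is None or t[1] == p)
                  and (o is None or t[2] == o))

# Which entities does article42 mention?
print(match(s="ex:article42", p="ex:mentions"))
```

The point of the representation is exactly the one made above: once an article is decomposed into entity assertions rather than locked inside its narrative structure, those assertions can be joined with evidential data from entirely different sources in one semantically searchable environment.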

…I left Wiley the other day full of hope, and I still am. But this context is necessary to see that the Mendeley deal, lovely though it is, remains symptomatic of the need to scratch yesterday’s itch. I suspect that the real struggle, already underway, is to persuade researchers that publishers really can add value to data, and that they really do know how to analyse it, structure it, create smart research tools around it and extract real value from users as a reward for this investment and effort. This will need smart industry suppliers as well, and I have learnt a lot from working with MarkLogic and TEMIS in the past year. And most of all it needs the support of CEOs who see beyond maximizing PDF downloads to the strategic crossroads this part of the industry now faces – and beyond.
