Mar 28
Abundance and Scarcity
I sat down to write a glowing note on the Digital Science conference at London’s glorious Royal Institution last night. “Inventing the Future” was a huge success and underlined the creative quality of the debate on the digital future in this city. As I stared ruminatively at my blank screen, an alert crossed it: EMAP have decided to split themselves into three parts, to be called (no, I am not kidding) Top Right Group (something to do with graphs?) for the whole outfit, i2i Events for the (you guessed it!) events division, 4C Group for the information division (“Fore-see”, geddit?), and, triumphantly, EMAP Publishing for the magazines. Given that they did not waste any of that expensive rebranding budget on the magazines, we can guess that this lot are for sale first (though a rumour today also gives that honour to the CAP automotive data unit). The best guess is that everything is for sale, and some reports are already citing advisory appointments in a variety of places.
Meanwhile, the philosophers of the night before had been talking of the very nature of the digital, networked society. Their threnody was “Open”. JP Rangaswami, Chief Scientist at Salesforce.com (I have heard this man twice in a week and would be happy to go again for more tomorrow), set the tone. We have to realize that the network has turned our media picture on its head. Now we have to understand the ways in which consumers are re-using and reshaping content. The social networks are ways of amplifying and diminishing those responses, filtering and distilling them. The publisher’s role is to get out of the way – this is not a push world anymore – and to act as a distributor and reproducer of excellence without doing harm or trying to outbid the creativity of end users. Stian Westlake of NESTA, looking at this from a policy viewpoint, saw the need to rebalance investment, to innovate in areas of strength like the UK financial services markets, and to make education fit the requirements of a networked economy. As JP said, re-quoting Stewart Brand, “information wants to be free”. We have it in abundance, while we have scarce resources for shaping and forming it as users want it, and enabling them to do that in their own contexts.
It turns out, of course, that some of the data we want is held by government. The third speaker was Professor Nigel Shadbolt, Professor of AI at Southampton, Director of the new Open Data Institute, and Sir Tim Berners-Lee’s vice-gerent and apostolic delegate to the UK government’s Open Data programme here on earth. He mercifully skated across the difficulties of getting governments to do what they have said they will do, while pointing out that, despite the fad for Big Data, linked data was now a vital component at all levels, big and small, in delivering the liberating effect of making compatible data available for remixing. With these three speakers we were in the magic territory of platform publishing. Here it was unthinkable not to promulgate your APIs. Here was a collaborative world of licensing and data sharing. Here was a vision of many of the things we shall be doing to create a data-driven world in the networks for the net benefit of all of us.
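To make the “linked data” point a little more concrete: once datasets share identifiers and vocabularies, anyone can join them without negotiating formats first. Below is a minimal sketch using Python’s rdflib – the URIs, vocabulary and figures are invented for illustration, not drawn from any real government dataset:

```python
# Toy illustration of linked-data remixing with rdflib (pip install rdflib).
# The dataset, URIs and vocabulary below are invented for the example.
from rdflib import Graph

g = Graph()

# Two "published" fragments of open data, expressed against the same vocabulary.
g.parse(data="""
@prefix ex: <http://example.org/schema/> .
<http://example.org/school/42> a ex:School ;
    ex:locatedIn <http://example.org/area/N1> .
""", format="turtle")

g.parse(data="""
@prefix ex: <http://example.org/schema/> .
<http://example.org/area/N1> ex:medianIncome 31000 .
""", format="turtle")

# Because both fragments use the same identifiers, they can be queried as one graph.
results = g.query("""
    PREFIX ex: <http://example.org/schema/>
    SELECT ?school ?income WHERE {
        ?school a ex:School ;
                ex:locatedIn ?area .
        ?area ex:medianIncome ?income .
    }
""")
for school, income in results:
    print(school, income)
```

The remixing happens for free: neither fragment knows about the other, yet the shared identifiers let a third party join them.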
And then I read the EMAP announcement, and it brought home the way in which the present and the future are pulling apart radically at the moment. No one looked at the EMAP holdings through the eyes of customers, buyers, or users. Channel and format, the classifications of the past, are the only way that current managers can see their businesses. So we divide into three channels what needed to be seen as a platform environment, created by ripping out all the formats and making all of the data neutral and remixable in any context. So the building and construction marketplace at EMAP, which has magazines, data and events (events – the greatest source of data yet discovered on earth), becomes a way of shaping and customizing content for users large and small, directed by them and driven by their requirements. But the advisors cannot understand anything but ongoing businesses, the strategy has no place in the IM, and the McGraw-Hill failure to do this at Dodge and Sweets is not encouraging, so we divide the stuff into parcels that can be sold, and sell it off at a small portion of its worth, while blaming the technology that could save it for “disrupting” it to death.
Maybe this is right. Maybe the old world has to be purged before the new one takes over. Maybe we have to go through the waste of redundancies, the dissipation of content, the loss of continuity with users/readers/customers before they are able to show us once again what we really should be doing. But now, when we know so much about “inventing the future”, this seems a very rum way of proceeding. Incidentally, last night’s conference host, Digital Science, is a very exciting Macmillan start-up whose business it is to invest in software developed by users in science research to support their work. Truly then a new player with more than a whiff of the zeitgeist of this conference in its nostrils. Those of us with long memories remember an older Macmillan, however. One that owned the healthcare and nursing magazine market, and lapped up the jobs-advertising cream in the days when users (or the NHS) could not use the web as an advertising environment. So Macmillan sold its magazine division before the advertising crash – to EMAP. It is people, decisions and the choices made by users that change things. It is hardly new to note that the lack of a tide table creates a serious risk of drowning, but it is still true.
Feb 12
The Point of Utility
We have had points of inflection and points of sustainability. Time then to propose a new “point”, one which applies universally throughout the world of information services and solutions, but which I found last week dramatically illustrated in the world of STM. Just as in the early years of the networked world we observed a point of disintermediation, at which the network effect removed real-world service entities and made process cheaper and quicker, so we can now see places where re-intermediation is required, introducing a new service layer to streamline the network’s own inefficiencies, moving cost to a different place, but still often reducing it while increasing network efficiencies. And once this re-layering has taken place and the point of utility is satisfied, an opportunity is created for major increases in the collection of neutral data about how the network processes in question work, and from this data still greater efficiencies derive.
The ideal illustration of this is Mendeley (www.mendeley.com), and I listened with great appreciation last week when its co-founder, Jan Reichelt, described the problem which he and his colleagues had set out to solve. The inability to describe PDFs containing scholarly articles arose from the nature of network access and storage. Academic buyers knew what they had bought when they acquired it, but since the PDF envelope had no external description of what it contained, it was hard to cross-search one’s collected downloads or manage a research project’s collection. And publishers, being publishers, adopted different practices regarding use of metadata, or even numbering systems. And as sources of grey literature, files of evidence, or collections of abstracts became involved, local content access and control became a major overhead. Jan and his colleagues knew – only a few years ago they were researchers themselves.
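To see why this was such a chore, consider what an individual researcher had to do with a folder of downloaded PDFs just to know what was in it. Here is a minimal sketch of that housekeeping, assuming the pypdf library and the public CrossRef REST API – the folder name and DOI regex are illustrative, and this is emphatically not how Mendeley itself works:

```python
# Illustrative only: pull a DOI out of each downloaded PDF and look up its
# bibliographic record, since the PDF "envelope" carries no usable description.
# Assumes: pip install pypdf requests. Not Mendeley's actual pipeline.
import re
from pathlib import Path

import requests
from pypdf import PdfReader

DOI_PATTERN = re.compile(r"10\.\d{4,9}/[^\s\"<>]+")

def describe(pdf_path: Path) -> dict | None:
    """Extract a DOI from the first page and resolve it via CrossRef."""
    text = PdfReader(pdf_path).pages[0].extract_text() or ""
    match = DOI_PATTERN.search(text)
    if not match:
        return None  # no DOI found; the file stays effectively unsearchable
    doi = match.group(0).rstrip(".,;")
    resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    if resp.status_code != 200:
        return None
    record = resp.json()["message"]
    return {
        "doi": doi,
        "title": (record.get("title") or ["?"])[0],
        "journal": (record.get("container-title") or ["?"])[0],
    }

for pdf in Path("downloads").glob("*.pdf"):  # "downloads" is a placeholder folder
    print(pdf.name, describe(pdf))
```

Multiply that by grey literature with no DOI at all, publisher-specific metadata habits, and a whole research group’s shared collection, and the overhead Jan described becomes obvious.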
So Mendeley was launched as an environment to regularize this, and to ensure that academics are better able to exploit the acquisitions that they have made. As a result, those primary drivers of network efficiency can be accomplished – saving money, making better decisions, and ensuring cost-effective compliance. This Point of Utility exploitation then has certain network knock-on effects. The service, like Mendeley, becomes an important part of the navigation of end users, and indeed may become part of, or the base for, the user access dashboard. Once the Point of Utility has become an interface, it is able to pick up all sorts of feedback data from the way end users act through the interface. This data about workflow will indicate usage and popularity, the common processes that users employ in discovery, the way in which resources in the system relate to each other, and the subjects that researchers really search (as distinct from the disciplines that journal editors think they subscribe to). Once this activity gets underway, the new interface owner can begin to suggest workflow improvements, and resell across the market the high-value data which derives from the actual patterns of usage. There is a Point of Utility in every network environment, and Mendeley, through their knowledge of researcher proclivities, have camped on one of these exciting fault-lines in STM.
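The value of that interface position is easy to show in miniature. Given nothing more than anonymised (user, article) events from the dashboard, simple aggregation already yields popularity rankings and “readers of X also keep Y” relationships. The toy data below is invented and the logic deliberately naive – real systems obviously do far more:

```python
# Toy example of the feedback data an interface owner can aggregate.
# The events are invented; the point is that usage alone reveals structure.
from collections import Counter
from itertools import combinations

# (user, article) pairs: which articles each user keeps in their library.
events = [
    ("u1", "doi:A"), ("u1", "doi:B"),
    ("u2", "doi:A"), ("u2", "doi:B"), ("u2", "doi:C"),
    ("u3", "doi:B"), ("u3", "doi:C"),
]

# Popularity: how many users hold each article.
popularity = Counter(article for _, article in events)
print("most held:", popularity.most_common())

# Co-occurrence: which articles tend to sit in the same libraries,
# the raw material for "related reading" suggestions.
libraries: dict[str, set[str]] = {}
for user, article in events:
    libraries.setdefault(user, set()).add(article)

co_occurrence = Counter()
for held in libraries.values():
    for pair in combinations(sorted(held), 2):
        co_occurrence[pair] += 1
print("often held together:", co_occurrence.most_common())
```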
Some of these opportunities arise from publisher activity – lack of collaboration, lack of standardization, lack of knowledge about what happens to articles post-sale – and some from the same features in the user community. This is not a blame game. Mendeley has taken the initiative and must be welcomed as one of the foremost workflow players in the sector, especially since the launch of the Mendeley Institutional Edition last month, which takes the intermediary role into the academic library, in conjunction with and powered by Swets, who have quickly grasped the point. This, as well as opening up the Mendeley API (http://www.mendeley.com/blog/design-research-tools/winners-of-the-first-binary-battle-apps-for-science-contest/), will turn fast growth into a torrent: Mendeley already have backlogs despite having captured 160 million references for 1.5 million users. Some publishers (Springer’s short article previews) clearly get it – others, as ever in this sector, plainly adopt a policy of “this too will pass”.
But of course it will not. Far from returning to normal, the science knowledge market is in a ferment. Visit Dryad (http://datadryad.org/) and go to GigaScience (http://www.gigasciencejournal.com/) and observe the impact of Big Data on this sector. My friend Mark Ware, in an excellent note for Outsell (https://clients.outsellinc.com/insights/?p=11693), has given chapter, verse and analysis. Content control in the scientist’s workflow is becoming a multiple-media nightmare. Does the metadata accurately describe the contents of video diaries and observations, or their audio equivalents? Can we source the data behind that report, and do our own analysis? How many unpublished, unreported studies have validated these results? What has been said, and at which conferences, about how far down the track this research team has gone? Where do we find the right mix of experience to staff this next enquiry? Regardless of its peer-reviewed status, who actually used this work – and, if they did not use it, what did they rely upon instead? Mendeley is a promising beginning, but there is a long road ahead. Stefan Glanzer (Last.fm) and Alejandro Zubillaga (lately head of Warner Digital Music – where he must have seen parallel problems) put in the seedcorn and should be congratulated. They have a real start-up (my spirits rise when I visit an office that has its bike racks and its table football in the foyer!) with the wind behind it.
One last check on where that wind is blowing. Visit ResearchGate (www.researchgate.net) and look at the ways in which scientists are beginning to indulge in meaningful social networking. I have been told for 25 years that scientists and academics are too competitive to use social networking. Like much of the received wisdom of the pre-networked society, this is at best a half truth. The whole truth is that there are no longer simple generalizations that hold true about researcher behaviour. That is why they flock so quickly to the Point of Utility.