Mar 19
Collaborators Will Not Be Shot
Filed Under Blog, data analytics, Industry Analysis, internet, online advertising, Publishing, Reed Elsevier, STM, Thomson, Workflow
The nature of networks is collaboration. If we spoke of the Networked Society instead of the Digital Age we might begin to understand the implications of this. Although the tools we use are still crude, every day in the life of every office, or every research laboratory, groups of people are getting together to discuss work and learn from each other. Moving the sharing from a room with a desk to a conference call and then to a screen simply offers process improvement, to which universal internet access adds the ability to exchange content on the fly. In the view of a collaborator in such a process, any request for information which furthers the group work objective should be answered – we do it ourselves and expect it of others. Collaboration has no rulebook except the need to gratify the requirements of the group focus. Who does not know this and practise it in their daily lives?
Well, clearly the STM Association has only just found out. I apologize for not commenting earlier on the announcement in February of a consultation (a helpful change, since publisher groupings are more usually given to pronouncements) by that association on SSNs and SCNs. Are they a threat to scholarly publishing as we know it, to the regulation of ownership, or to life on earth? Amazing how you need only apply a three letter acronym to something that everyone was doing anyway and you can effectively demonize it. But what we are really talking about here are networks like ResearchGate, Mendeley, Academia.edu, ReadCube or FigShare. As a group they could be described as Scholarly Collaboration (or Sharing) Networks. Those who join them share problems and issues – and content – sometimes as public network members and sometimes as private groups, with the network hosting the activity of scholars drawn from different places, just as scholars have always met at conferences and just as very many groups of scholars still meet online, undignified by a three letter acronym to describe what they are doing. Sometimes the SCN has attributes which add further value to collaboration – data collections, or indications of who holds which articles – but the collaboration is the essence. They have millions of members (including students and members of the public, who should of course be caught and charged immediately!).
Now there are elements to this consultation which strike me as highly risible. While I know that, under Fred Dylla as chair, the consultation will be conducted impeccably and with total integrity and fairness, the difficulties that this enterprise faces are daunting. It's not just that SCNs are ways of helping researchers do something they will do anyway, regulated or not. It's not just that publisher attempts to regulate what they do have never made the slightest difference in the past and are unlikely to do so in the future. It's not even the sight of STM making the ungainly ascent into the seaside throne of Canute that amuses so much. It is the fact that most of the SCNs which are possibly involved in the widespread transference of copyrighted scholarly materials between researchers who do not always have formal permission for those transfers are owned and funded by … publisher members of STM.
My sympathies will always lie with the market. “Follow the end user” can be the only guideline in a networked society. What Elsevier or Digital Science did here with the creation of these SCNs was simple good sense, and they deserve the commercial returns that their efforts on behalf of collaborative scholarship should earn them. Those of us who said loudly a decade ago that Open Access was not the real danger to the established hegemony – it was networked publishing that provided the threat – were told that the major STM players were quite capable of taking the paper journal world into the network and preserving its rules and conventions just as long as research funders did not insist on an author-pays model. Well, they were wrong, as people always are when they insist that they can move business models to digital networks without change. Just this week in the UK, on the blasted heath that once was newspaper publishing, a group of news publishers banded together to form the Pangaea collaboration for selling advertising (CNN, Reuters, the Economist and the Financial Times, spearheaded by the Guardian), which will allow “brands to collectively access a highly influential global audience via the latest programmatic technology”. In other words, erstwhile deadly competitors getting together to offer tech-enabled solutions to customers who have the power to make choices.
Which is probably the answer to the STM question. Trying to alter market behaviour by regulation is fruitless. Forming a ring around current SCNs and licensing them collectively to do what they will do anyway must be more sensible. Creating niched SCNs for research specialities and going to where you can add yet more value must be the way forward. Surprisingly, many commercial players see this – but scholarly societies, dependent for survival on journal sales and advertising, are very much more conservative. But then again, the scholars using the SCNs are usually members of those societies, whose role is perhaps what is most at stake here – and of course many commercial publishers make a living from the services that they sell to those scholarly societies. When we look back at the train of events (Macmillan setting up Digital Science, Elsevier buying Mendeley, Nature making all articles free to subscribers, Wiley adopting ReadCube and so on), we can be confident that this train is not suddenly going into reverse. The answer for STM members may well be collaboration, but it will be fascinating to see how they attain it.
Mar 16
Data is a Commodity, Analytics is not a Solution
Filed Under B2B, Big Data, Blog, data analytics, Financial services, Industry Analysis, internet, Publishing, Search, Thomson, Uncategorized, Workflow
This will only get worse. The latest announcement from the Thomson Reuters GFMS service, the premier data analytics environment for gold and silver, indicates that their copper commodity service on Eikon now moves from mining-company level to mine-by-mine performance. “It all adds another data-rich layer of fundamental research to our customers’ copper market analyses” says their head of research. And there, in that line, we have a “fundamental” issue that lies behind the torrent of announcements we see in the B2B sector at the moment. Think only of Verisk buying Wood Mackenzie last week at a price (17x EBITDA) which went well beyond the expectations of counter-bidders like McGraw Hill, and which shocked private equity players who relish the data sector but find it hard to imagine 12x as an exceedable multiple. The question is this: risk management and due diligence are vital market drivers, but they are data-insatiable; any and all data that casts light on risk must be included in the process; it is the analysis, especially predictive analytics, which adds the value; so who will own the analytics – the data companies, the market intermediaries (Thomson Reuters, Bloomberg etc.), or the end-user customers?
Those of us who come from the content-driven world – and they were out in force at Briefing Media’s splendid Digital Media Strategies event last week in London – find this understandably hard to accept, but our biggest single threat is commoditization. Even more than technology disruption, to which it is closely related, data commoditization expresses the antithesis of those things upon which the content world’s values were built. When I first began developing information services, in pre-internet dial-up Britain, we spoke lovingly of “proprietary data”, and value was expressed in intellectual property that we owned and which no one else had. For five years I fought alongside colleagues to obtain an EU directive on the “Legal Protection of Databases”, so it is in a sense discouraging to see the way things have gone. But it is now becoming very clear, to me at least, that the value does not lie in the accumulation of the data; it lies in the analytics derived from it, and even more in the application of those analytics within the workflow of a user company as a solution. Thus if I have the largest database of cowhide availability and quality on the planet, I now face clear and present danger. However near-comprehensive my data may be, and whatever price I can get now in the leather industry, I am going to be under attack in value terms from two directions: from very small suppliers of marginal data on things like the effect of insect pests on animal hides, whose data is capable of rocking prices in markets that rely on my data as their base commodity; and from the analytics players who buy my data under licence but who resell the meaning of my data to third parties, my former end users, at a price level that I can only dream about. And those data analytics players, be they Bloomberg (who in some ways kicked off this acquisition frenzy five years ago when they bought Michael Liebreich’s New Energy Finance company) or others, must look over their shoulders in fear of the day when the analytics solution becomes an end-user app.
So can the data-holding company fight back? Yes, of course; the market is littered with examples. In some ways the entire game of indexation, whereby the data company creates an indicative index as a benchmark for pricing or other data movement (and as a brand statement), was an attempt to do just that. Some data companies have invested heavily in their own sophisticated analytics, though there are real difficulties here: moving from that type of indicative analytics to predictive analysis shaped as a solution to a specific trader’s needs has been very hard. Much easier was the game of supplying analysed data back to the markets from which it originated. Thus the data created by Platts or Argus Media, and the indexation applied to it, have wonderful value to Aramco when pricing or assessing competitive risk. But in the oil trading markets themselves, where the risk is missing something that someone else noted, analysts have to look at everything and tune it to their own dealing positions. Solutions are changing all the time and rapid customization is the order of the day.
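To make the indexation point concrete, here is a minimal, purely illustrative sketch in Python – the prices, volumes and names are invented, and this is not any vendor’s actual methodology – of an indicative benchmark computed as a volume-weighted average of assessed prices and rebased so that a reference period equals 100:

# Illustrative sketch only: a simple indicative benchmark index of the kind
# described above. All names and figures below are hypothetical.

def weighted_average_price(assessments):
    """assessments: list of (price, volume) pairs for one period."""
    total_volume = sum(volume for _, volume in assessments)
    return sum(price * volume for price, volume in assessments) / total_volume

def benchmark_index(current_assessments, base_assessments):
    """Index value for the current period, with the base period set to 100."""
    return 100.0 * weighted_average_price(current_assessments) / weighted_average_price(base_assessments)

# Hypothetical assessed copper prices (USD per tonne) and traded volumes.
base_period = [(8900.0, 120), (8950.0, 80), (8875.0, 60)]
this_period = [(9100.0, 110), (9180.0, 90), (9050.0, 70)]

print(f"Copper benchmark index: {benchmark_index(this_period, base_period):.1f}")
# A value above 100 signals that assessed prices have risen since the base period.

The point of such an index is less the arithmetic than the brand statement: the number becomes the market’s shorthand, and the data company that publishes it stays in the conversation even as the raw data commoditizes.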
Back out on the blasted heath which once was B2B magazine publishing, I kept meeting publishers at DMS who said “Well, we are data publishers now”. I wonder if they really understand quite what has happened. Most of their “data” can be collected in half an hour on the Open Web. There is more data in their domains free on DBpedia or Open Data sources than they have collected in a lifetime of magazine production. And even if they come up with a “must have” file that everyone needs, that market is now closing into a licensing opportunity, with prices effectively controlled, for the moment, by those people who control the analytics engines and the solution vending. Which brings me back to Verisk and the huge mystery of that extravagant pricing. Verisk obviously felt that its analytics would be improved in market appearance by the highly respectable Wood Mackenzie brand. Yet if a data corner shop, let alone Platts or Argus Media, were to produce reporting and data that contradicted Wood Mackenzie, anyone doing due diligence on their due diligence would surely demand that Verisk acquire the dissenting data and add that to the mix? If data really is a commodity business, far better to be a user than an owner.