May 16
Of Stable Doors and Acquisitions
Filed Under B2B, Big Data, Blog, Financial services, healthcare, Industry Analysis, internet, Search, semantic web, Uncategorized, Workflow
Are we seeing the emergence here of a new truth? Or just an old lie tarted up? The sale of BvD (Bureau van Dijk) by EQT to Moody's for $3 billion is either a great celebration of the need for data in a data analytics business (think IBM Watson and Truven Health Analytics in the healthcare sector, or Verisk and Wood Mackenzie in energy), or of the need to persuade wary initial users that the analytics they are getting are backed by the familiar brands of the research databases on which they were formerly reliant. And if it is the latter, then large database companies would be well advised to sell their data to emerging analytics companies now, because sometime soon users are going to discover that data searched on the open web is now often equal in value to the hoarded and highly curated stuff we have been relying on for years, and sometimes throws up startling new insights by comparison. But it is the stuff from the research databases that has the total credibility.
Think of it this way. BvD have been building databases since the early 1980s. As the Belgian member of my board in the European Information Providers Association, Marcel van Dijk was openly sceptical about the future of research databases: a sideshow in his computer bureau business, and a hobby of his colleague, the luminary Professor Bernard van Ommeslaghe. The latter built a business that was bigger than the Bureau by Marcel's retirement, and started the chain of wealth enhancement that led through Candover and Charterhouse and ended with EQT, in successive deals that grew from $600m to $3 billion over a ten-year period. And has BvD grown commensurately with that value? Well, it is a highly profitable $280m company well plugged into corporate research, especially around credit, risk, compliance, M&A, and that great money earner of our generation, transfer pricing. By the time we entered the age of compliance the company was already in PE hands and getting expensive, but much of its data was available from public sources, and its much-vaunted private company data was as good as private company data can be: patchy, increasingly easy to research, and, in the markets where you really wanted to know (China), fiercely defended by someone more powerful (the People's Bank of China).
So they did the right and obvious thing. While van Ommeslaghe had tuned the search engine a decade or so earlier as his response, they now went for the "new wave" and started an analytics-based solutions business, launched in each of their sectors and branded "Catalyst". I have never seen the figures, though I have constantly asked market analysts who know everything about one of the most intensively researched companies in its sector, and they change the subject. No mention of this was made in the sale release either, where analytics was concentrated around the ability of the Moody's Analytics division to transform BvD. I draw my own, possibly erroneous, conclusion: not for the first time, internal re-invention failed to convert a successful team to a new sales pitch and a new business model.
Which would be a good point at which to sell, especially if Moody's Analytics division is as hot to trot as its press releases suggest. And do not forget that these are critical days for the rating agencies. While performance at Moody's, S&P and Fitch has returned to pre-recession levels (almost), there are still critical regulatory issues and continuing disquiet about the role these agencies played, or didn't play, in that crisis. And for the first time in years there are competitive threats: governments and regulators wonder if there is a better way, while start-ups like Credit Benchmark in the banking sector suggest that aggregating the research and decisions made by all banks can produce valid choices and rating decisions for individual players. In short, we are now removed from the glory days when this market was a "license to print money", and we are back to the struggle for survival. Will Analytics be the Get Out of Jail card?
Let's then go back to Marcel and Bernard and the risk decision they made in or around 1985. Suppose Marcel says: "Look, are you sure that there will ever be enough librarians and information managers to justify renting them space on our computers? The IBM PC means that our outsourced bureau business running payrolls and utilities is in decline, but we are competitive, we know how to sell, and we can still hold our own." And Bernard responds: "No, the new business is just like the old one, except we are storing content for the individual use of the end users of our clients; the business model (then) is the same, time-based access, and all we need to do is learn how to sell access." Life is not perfect: as it turned out there was not enough bandwidth in 1980s phone lines, so they ended up succeeding on CD-ROM. But there were enough intermediary users. Today cost-conscious employers want to cut out the intermediary librarians and deliver solutions directly into the workflow of the ultimate user. Do Moody's Analytics, or any of us for that matter, yet know enough about pricing, selling or distributing these solutions? The gap between the decisions made in the 1980s and the decisions that need to be made now is much greater. It is no good retaining the Marcel belief that somehow the good old business will just go on, but that investors and customers will only appreciate the "Change" badge on your lapel if you spend a very great deal of money on it (and how did they come by that valuation…!). The purchase of BvD takes Moody's revenues over the billion mark and adds to its margins, so it ticks some analysts' boxes, but the horse that lived in the BvD stable has long since bolted, and it is hard to believe that a new incumbent is ready to graze that data, or find some better stuff in the open pastures of the internet.
Apr 13
UKSG 40: The Temple of Change
Filed Under Big Data, Blog, data analytics, healthcare, Industry Analysis, internet, Publishing, Reed Elsevier, Search, semantic web, social media, STM, Uncategorized, Workflow
The sunny but sometimes chill air of Harrogate this week was a good metaphor for the scholarly communications marketplace. Once the worshippers at the shrine of the Big Deal, the librarians and information managers who form the majority of the 950 or so attendees now march to a different tune. From the form of the article to the nature of collaboration this was a confident organization talking about the future of the sector. And at no point was this a discussion about more of the same. Three sunny days, but for publishers present there was an occasional chill in the wind.
I started the week with a particular purpose in mind, which was all about the current state of collaboration. I was impressed by the Hypothes.is announcement with Highwire (www.highwire.org). There are now some 3000 journals using open source annotation platforms like the not-for-profit Hypothes.is to encourage discoverable (and private) annotation. Not since Copernicus, when scholars toured monasteries to read and record annotations of observations of the galaxies in copies of his texts, have we had the ability to track scholarly commentary on recent work and work in progress so completely. And no sooner had I begun talking about collaboration as annotation than I met people willing to take the ideas further, into the basis of real community-building activity.
It seems to me that as soon as the journal publisher has imported an annotation interface, he is inviting scholars and researchers into a new relationship with his publishing activity. And for anyone who seeks a defence against the perceived threat of ResearchGate or Academia.edu, the answer must lie in building patterns of collaborative annotation into the articles themselves, and becoming the intermediary in the creation of the community dialogue at the level of issues in the scholarly workflow. So it seemed natural that my next conversation was with the ever-inventive Kent Anderson of RedLink, who was able to show me Remarq, in its beta version and due to be formally launched on 1 May. Here discoverable annotations lie at the base of layers of service environments which enable any publisher to create community around annotated discussion and turn it into scholarly exchange and collaboration. We have talked for many years about the publishing role moving beyond selecting, editing, issuing and archiving (increasingly, I suspect, the roles of librarians) and towards the active support of scholarly communication. And this, as Remarq makes clear, includes tweets, blogs, posters, theses, books and slide sets as well as articles. Services like Hypothes.is and Remarq are real harbingers of the future of publishing, when articles appear on preprint servers and in repositories or from funder Open Access outlets, and where the subject classification of the research is less important than who put up the research investment.
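For the technically minded, it is worth seeing how little machinery "discoverable annotation" actually requires. Hypothes.is exposes a public search API from which the open annotations on any URL can be fetched. The short Python sketch below is my own illustration of that idea, not the Highwire integration itself, and the article address in it is a placeholder:

```python
# A minimal sketch: fetch the public annotations anchored to one article
# URL via the Hypothes.is search API. The article URI is a placeholder.
import requests

HYPOTHESIS_SEARCH = "https://api.hypothes.is/api/search"

def public_annotations(article_uri: str, limit: int = 20) -> list:
    """Return public Hypothes.is annotations targeting article_uri."""
    resp = requests.get(HYPOTHESIS_SEARCH,
                        params={"uri": article_uri, "limit": limit})
    resp.raise_for_status()
    return resp.json().get("rows", [])

if __name__ == "__main__":
    for note in public_annotations("https://example.org/articles/some-paper"):
        print(note["user"], "annotated:", note.get("text", "")[:80])
```

A publisher who can run that query against his own articles can also aggregate it, and aggregation is where the community-building begins.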
And, of course, the other change factor here is the evolution of the article (often ignored: for some reason we seem to like talking about change but are reluctant to grip the simple truth that when one thing changes, in this case the networked connectivity of researchers, then all the forms around it change as well, and that includes the print-heritage research article). Already challenged by digital inclusivity (does it have room for the lab video, the data, the analytics software, the adjustable graphs and replayable modelling?), it now becomes the public and private annotation scratchpad. Can it be read efficiently by a computer and discussed between computers? We heard reports of good progress on machine readability using open science Jupyter notebooks, but can we do all we want to fork or copy papers and manipulate them while still preserving the trust and integrity in the system, derived from being able to identify what the original was and always being able to revert to it? We have to be able to use machine analysis to protect ourselves from the global flood of fresh research; if the huge agenda was light anywhere, it was on how we absorb what is happening in India, China, Brazil and Russia into the scholarly corpus effectively. But how good it was to hear from John Hammersley of Overleaf, now leading the charge in connecting up the disconnected and providing the vital enabling factor to some 600,000 users via F1000, and thus in future the funder-publisher mills of Wellcome and Gates; and to see Martin Roelandse of Springer Nature demonstrating that publishers can potentially join up dots too, with their SciGraph application for relating snippets, video, animations, sources and data.
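That trust question, identifying the original so a fork can always be checked against it, has a simple and well-known mechanism behind it: content addressing. The sketch below is purely illustrative (it describes no named platform, and the text in it is invented): hash the canonical version of a paper, and any manipulated copy becomes distinguishable from, and verifiable against, its source.

```python
# An illustrative sketch: fork a paper for manipulation while keeping a
# content hash of the original, so any circulating copy can be checked
# against (and reverted to) the identified source version.
import hashlib

def fingerprint(text: str) -> str:
    """Content-address one version of a document with SHA-256."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

original = "Results: the effect holds at p < 0.01 across both cohorts."
original_id = fingerprint(original)

fork = original + "\n[annotation] re-plotted figure 2 on log axes"

assert fingerprint(original) == original_id   # the original stays identifiable
assert fingerprint(fork) != original_id       # the fork is distinguishable
print("original:", original_id[:12], "| fork:", fingerprint(fork)[:12])
```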
Of course, connectivity has to be based on common referencing, so at every moment we were reminded of the huge importance of CrossRef and ORCID. Incontrovertible identity is everything, and I was left hoping that ORCID can fully integrate with the new CrossRef Event Data service, using triples in classical mode to relate references to relationships to mentions. Here again, in tracking 2.7 million events since the service's inception last month, they are already demonstrating the efficacy of the New Publishing: the business of joining up the dots.
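What does a triple look like in this context? The sketch below is my own illustration of the idea, not the actual Event Data schema or real data: each event is a subject-relation-object record linking something on the open web (a tweet, a blog post) to an identified work (a DOI), and aggregating over the objects is precisely the "joining up the dots".

```python
# An illustrative sketch of the triple model behind an event stream such
# as CrossRef Event Data. Field names and events are invented examples.
from collections import Counter

events = [
    {"subj": "twitter://status/123",          "relation": "discusses",  "obj": "doi:10.1000/xyz123"},
    {"subj": "https://blog.example.org/post", "relation": "references", "obj": "doi:10.1000/xyz123"},
    {"subj": "twitter://status/456",          "relation": "discusses",  "obj": "doi:10.1000/abc987"},
]

# Joining up the dots: count recorded events per identified work.
mentions = Counter(event["obj"] for event in events)
for doi, count in mentions.most_common():
    print(doi, "has", count, "recorded event(s)")
```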
So I wish UKSG a happy 40th birthday; they are obviously in rude health. And I thank Charlotte Rouchie, the closing speaker, for reminding me of Robert Estienne, whom I have long revered as the first master of metadata. In 1551 he divided the Bible into verses, and, to better compare Greek with Latin, he numbered them. Always good to recall the revolutionaries of the past!
PS. In my last three blogs I have avoided, I hope, use of the word Platform. Since I no longer know what it means, I have decided to ignore it until usage clarifies it again!