As a Thomson man of the generation of ’67, I was well schooled in the dictum “it’s not what you buy, but what and when you sell that makes the real difference.”* And having spent almost three decades buttonholing anyone who would listen, like some crazed digital ancient mariner, on the importance of building digital presence in B2B publishing and information markets, I should probably be pleased to see headlines in the Financial Times (3 March 2012) heralding the sale of EMAP’s print assets (“Analysts say EMAP faces challenge to move away from print”). But I am not. I know exactly when these print assets should have been sold: in 2002, at the end of the Dotcom Bust. And I cannot persuade myself that a wrong move then will be rectified by a pointless move now, or that value will be added to anything by selling the subscription/advertising print stable at EMAP – or at UBM, or at Haymarket, or Centaur, or Incisive – to someone who is simply going to live on a declining annuity until it expires. There will in any case be few buyers, and those who do appear will not want the stable, but just one or two of the old nags. The analysts who shriek the headline of this piece are simply transaction mongers with a firmer grip on deal commissions than on the current strategic realities of B2B. So let’s go back to 2002 and see what has happened since the management of B2B information, publishing and events decided that it was far too early to exit print subscriptions and that, like the regional press, the market would come back to them.

By 2005 it was becoming clear that the bits that worked in B2B, outside of events, were information services and solutions. By that year the controlled circulation magazines and newsletters which had proliferated, and at times been generated by online activity, at the end of the previous decade had begun to wilt. Just as in the pre-2005 period we had spoken of VANs and VADs, so we began to talk about “vertical search” (it turned out to be much the same anyway) and started providing tailored information to self-defined users in commerce and industry. We were beginning to experience for the first time what it was going to be like to live in a “networked society/economy”. A small revolution was taking place: managers were beginning to have to find out what their users did for a living and construct solutions around their daily lives. This meant specialization and expertise in particular verticals: managers could no longer be shifted from title to title on the basis that they knew journalists and advertisers and that everything else was the same whether you were publishing in machine tools or in ladies’ fashions.

And then we came to workflow. If we were really entering an information solutions-type world (where Thomson Reuters had already gone in IP and GRC, and Lexis Risk in insurance) then we had to provide our content directly to the desk of the user, sliced so that it modelled his working patterns, and supported by software tools that added value to it and kept us essential to his processes, and thus too important to be lightly discontinued. And how did we plan to earn his trust in this guise? Either by inventing a new brand (think Globalspec in engineering) or by using our old print brands to ensure user confidence (think Bankers Almanac at RBI). Never mind that the print which supported those brands had eroded away: the brands were now there for entirely different reasons.

And now we are laying another layer in digital development on top of all of this. We now talk of Big Data, of using the services we have created for users as a sort of focussing glass so that we can go out from them to the client’s own content and all sorts of other datasets and find linkages through data mining and extraction, squeezing fresh insight all the time into the workflow of users who, wherever they work, have increasingly become, like us, knowledge workers. And our events activities increasingly morph into always-on trading and learning experiences, where we do introduce clients to the range of products and services in the sector, update and inform on new releases to people who have said they want to know, and move increasingly into the training and professional development of the sectors that we have chosen. Do you see where we are going? We are going to be the full service providers to a handful of vertical markets which we feel confident about dominating.
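To make that focussing-glass idea a little more concrete, here is a minimal sketch, in Python and with invented company names and datasets, of the sort of linkage I have in mind: extract the entities mentioned in a client’s own documents and join them against an external dataset, so that connections surface which neither source shows on its own. It is an illustration only, not anyone’s actual product.

```python
import re

# Hypothetical external dataset: credit ratings keyed by company name (invented).
external_ratings = {
    "Acme Construction": "BBB-",
    "Borealis Fashion": "A",
    "Calder Broadcasting": "BB+",
}

# The client's own content: free-text notes mentioning some of those companies.
client_documents = [
    "Tender awarded to Acme Construction for the depot refurbishment.",
    "Calder Broadcasting renewal is due next quarter.",
]

def extract_mentions(text, known_entities):
    """Toy extraction: find which known entity names appear in the text."""
    return [name for name in known_entities if re.search(re.escape(name), text)]

# The linkage: for every mention in the client's content, attach what the
# external dataset knows about that entity.
for doc in client_documents:
    for company in extract_mentions(doc, external_ratings):
        print(f"{company}: rated {external_ratings[company]} -- mentioned in: {doc!r}")
```

Real services do this at scale, with proper entity resolution and far richer datasets, but the shape of the value – joining a user’s own content to curated external data – is the same.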

Why are we confident about that domination? Because we have the brands, many of them over a hundred years old in this country, which our verticals were brought up upon. And behind those brands are archival morgues, full of data with residual value in a Big Data sense. We did not sell those brands in 2002 when they were a going concern, so why sell them now when they are a cause for concern? By all means close the print, by all means reconstruct the service values using far fewer journalists in targeted niche environments online. By all means drive towards areas where you have real data intensity, but on the way remember the community and its existing brand affiliations. You want to take them with you.

Which brings us back round to EMAP. I see no point in hanging on to peripheral services, even data-based services like DeHavilland, bought as recently as 2007, if they have no strategic coherence in terms of the markets that give EMAP positions of strength. I take these to be construction, local government, broadcast media and fashion. If strength in automotive cannot be linked to the Guardian’s position in Trader Media, then sell that too. But hold onto brands where they can be used to give community credibility, and data where it can give archival searchability. By shedding the peripherals you get a smaller but more profitable business. And that is also the result of digital network development of the type described here – smaller and more profitable businesses. Just don’t throw away something which is pretty worthless now on its own, but which may be needed on a journey to a much better place.

* Note that the companies that Thomson SOLD in the mid-1980s in the UK form the majority of EMAP and Trinity Mirror today, as well as large chunks of Springer and Infinitas, and elsewhere and afterwards the bulk of Cengage and a big portion of the US regional press. Were they right or not?

A long time ago the Financial Times formed a joint company with the London Stock Exchange to exploit the FTSE share price index. I seem to recall that this was not a success, but a colleague at that time joined the board and I recall asking him what an index was for. He replied that it was a sort of branding statement, and it also said that you had the underlying data from which to create the index, should anyone want to look at it. And was that a good business? Well, not really, since few people were able to make sense of the underlying data. So it was mostly a brand thing, then? Well, yes. And a brand thing where, since most people refer to the “footsie”, the brand reference is lost in speech.
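For readers who have never had to build one, an index really is just arithmetic over the underlying data. The sketch below is a minimal illustration in Python, with invented constituents and a generic capitalization-weighted method rather than FTSE’s actual rules, of how little there is to the number itself once you hold the data behind it.

```python
# Invented constituents: (name, shares in issue, price at base date, price today).
constituents = [
    ("Alpha plc",       1_000_000,  2.50, 2.80),
    ("Beta Group",        500_000, 10.00, 9.40),
    ("Gamma Holdings",  2_000_000,  1.10, 1.30),
]

BASE_INDEX_LEVEL = 1000.0  # the index is defined to be 1000 at the base date

def total_market_cap(price_position):
    """Sum shares-in-issue times price across all constituents."""
    return sum(shares * prices[price_position]
               for _, shares, *prices in constituents)

base_cap = total_market_cap(0)     # total market value at the base date
current_cap = total_market_cap(1)  # total market value today

# A capitalization-weighted index is the ratio of total market value now
# to total market value at the base date, rescaled to the base level.
index_level = BASE_INDEX_LEVEL * current_cap / base_cap
print(f"Index level: {index_level:.1f}")
```

The point, as my colleague implied, is that the arithmetic is trivial; the value sits in owning the underlying data and the brand attached to the number.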

I do not believe that they have the same view at FTSE now, and in a world currently rampant with indices it is interesting to check on the progress of players like Argus Media (www.argusmedia.com) who have used indexation powerfully to elevate a small player in energy and commodity data markets into a very powerful one. I wrote about this in November 2009 (https://www.davidworlock.com/2009/11/battle-of-the-indices/) and envisaged the war between Argus and McGraw Hill’s Platts in oil markets as a classic David-Goliath story – but one which would need to be followed up by the victor to consolidate the gain with a wider service base. To quote: “index publishing is becoming an interesting value phenomenon. It creates lock-in around which workflow activities and value-add analytics can be built. It gives brand focus and recognition. It provides contract opportunities to supply and maintain service points on client intranets. In truth, it is sexier than it sounds.”

In light of this I was delighted to find that Argus Media had made an important purchase in analytics software this month. Fundalytics “compiles, cleans and publishes fundamental data on European natural gas markets”, and it is the first service acquisition of this type that the company has made. Starting with natural gas, however, it should be possible to create a wider range of analytics activities across energy markets, which are currently so very active, and across other commodity areas, like fertilizers, where the company is building a stronghold. Competition is obviously fierce, with direct pressure from Platts, about double the size, and RBI’s smaller ICIS. And then there are the market information players who have always used the data and its primary analysis to form notation services for both players and investors – the Wood Mackenzie and IHS operations and, at a further remove, Michael Liebreich’s New (now Bloomberg) Energy Finance and Thomson Reuters’ Point Carbon. It is understandable that there would be heavy competitive pressure in such an important field, and rewards will align with the industrial, financial and political clout the whole field invokes. But of the companies mentioned here, some are primary data producers, some secondary, and some create market commentary without owning a data farm at all. Can they all survive, and, if not, what sort of equipment do you need to succeed?

This is why the Argus Media purchase is more important than its size or value. If we have learnt anything from the consolidation of service markets in the network in the past decade then it is, surely, that relatively few players are needed to provide the whole range of internet services, and that users do not lust for more – indeed, they seem to want one sure place to go, and an alternative in case their preferred supplier tries to abuse his pricing control. You could point to the history of Lexis and Westlaw in law markets for part of the history of this. Then they want from those two lead suppliers the ability to secure access to all the core data that they need: to both use that data and its analysis on the supplier service and suck data into their own intranets to use in conjunction with their own content; to access APIs which allow them to create custom service environments and maintain them as fresh value-add features are developed by the supplier; to use the supplier as the architect/engineer for workflow service environments, where news and updates are cycled to the right place at the right time and where compliance with knowledge requirements can be monitored and audited; and, finally, they want the supplier to run the Big Data coverage for them, using his analytical framework as a way of searching wide tracts of publicly available data on the internet to secure connections and provide analysis which could never have existed before.
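As an illustration of the “suck data into their own intranets” point, here is a minimal sketch assuming an entirely hypothetical supplier endpoint that returns JSON price assessments; the URL, field names and token are my inventions, not any real supplier’s API. The idea is simply that a subscriber pulls the assessed prices programmatically and blends them with content of its own.

```python
import json
import urllib.request

# Hypothetical supplier API endpoint and credentials -- illustrative only.
API_URL = "https://api.example-supplier.com/v1/assessments?commodity=natgas-ttf"
API_TOKEN = "replace-with-your-token"

def fetch_assessments(url, token):
    """Pull the supplier's latest price assessments as a list of dicts."""
    request = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# The subscriber's own content: internal positions keyed by delivery month (invented).
internal_positions = {"2012-04": 120, "2012-05": -80}  # contracts held

def blend(assessments, positions):
    """Join the supplier's assessed prices onto the firm's own positions."""
    for row in assessments:
        month = row.get("delivery_month")
        if month in positions:
            print(f"{month}: {positions[month]} contracts at assessed price {row.get('price')}")

if __name__ == "__main__":
    blend(fetch_assessments(API_URL, API_TOKEN), internal_positions)
```

The APIs, workflow tools and Big Data layers described above are all elaborations of this one basic transaction: the supplier’s data, delivered where the subscriber’s own content already lives.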

This is a formidable agenda, and I am not suggesting that anyone is close to realising it. Those who want to enter the race are probably now securing their content on an XML-based platform and beginning to buy into analytics software systems. And it was the latter point which so interested me in regard to Argus. If the human race descended from a tree shrew, then there is no reason at all why a smart data company close to London’s fashionable Shoreditch tech-zone should not be a lead player in the future structure of service solutions for the energy and commodities markets!
