Mar 23
The Odd Future of Aggregation
Mr Bezos and his cohorts at WaPo (the Washington Post to you and me, earthlings) have decided, we are told, on an aggregation model. As far as skilled translators of Bezos-speak can tell, this will mean bringing together licensed news content from all over the globe – spicy bits of the Guardian, a thin but meaty slice of the London Times, a translated and processed sausage segment of FAZ, a little sauerkraut from Bild, a fricassee of Le Monde… well, you get the picture, and the fact that I am writing this before dinner. These ingredients will get poured into a WaPo membership pot, heated, and served to members who want to feel that they are on top of global goings-on, from the horse’s mouth, and without having to read the endless recyclings and repetitions which characterize the world media at source.
Well, I see the point, and the idea of membership and participation seems to me one which has endless energy these days. But I have been thinking for several years now that the Aggregation business model as experienced from 1993 onwards on the Web is on its last legs. Believing that “curation” is too often a word we use when we are trying to maintain or defend a job, I have tried to steer clear of imagining that storage, the ultimate network commodity, was a good place to start building a business. In the early days of the Web it was certainly different. Then we could create the whole idea of the “one stop shop” as a way of simplifying access and reducing friction for users. All of the things we were collecting and storing, for the purposes of aggregation, were in fact “documents”, and their owners wanted them to be stored and owned as documents, bought as documents and downloaded as documents. The early reluctance of STM publishers to apply DOI identity beyond the article level, and to make citations, references, or other document sub-divisions separately searchable, seems in retrospect to demonstrate the willingness of IP owners to manipulate access in order to protect the business model.
Three clear developments have comprehensively undermined the utility of content aggregation:
* the desire of users to move seamlessly, via a link, from one part of one document to another part of a different document seems to them a natural expression of their existence as Web users – and in the content industries we encouraged this belief.
* the availability of search tools in the Web which permit this self-expression simply raises the frustration level when content is locked away behind subscription walls, and increases the likelihood that such content will be outed to the Open web.
* the increasing use of semantic analysis, and the huge extension of connectivity and discoverability which it suggests, makes the idea that we need to collect all (or sufficient) content into a storehouse and define it as a utility for users merely by the act of inclusion a very outdated notion indeed.
It seems to me that for the past decade the owners of major service centres in the aggregation game – think Nexis, or Factiva, or Gale or ProQuest – have all at various times felt a shiver of apprehension about where all of this is going, but with sufficient institutional customers thinking that it is easier to renew than rethink, the whole aggregation game has gone gently onwards, not growing very much, but not declining either. And while this marriage of convenience between vendors and payers gives stability, end users are getting frustrated by a bounded Web world which increasingly does not do what it says on the tin. And since the Web is not the only network service game in town, innovators look at what they might do elsewhere on internet infrastructure.
So, if content aggregation seems old-fashioned, will it be superseded by service aggregation, creating cloud-based communities of shared interests and shared/rented software toolsets? In one sense we see these in the Cloud already, as groups within Salesforce, for example, begin to move from a tool-using environment to user-generated content and, more recently, the licensing of third-party content. This is not simply a new aggregation point, though, since the content externally acquired is now framed and referenced by the context in which users have used and commented upon it. Indeed, with varying degrees of enthusiasm, all of the great Aggregators mentioned above have sought to add tools to their armoury of services, but usually find that this is the wrong way round – the software must first enhance end-user performance, then lend itself to community exploitation – and then you add the rich beef stock of content. For me, Yahoo were the guys who got it right this week when they bought Vizify (www.vizify.com), a new way of visualizing data derived from social media. This expresses where we are far more accurately than the lauded success of Piano Media (www.pianomedia.com). I am all for software companies emerging as sector specialists from Slovakia onto a world stage, but the fact that there is a whole industry, exemplified by Newsweek’s adoption of Piano this week, concerned with building higher and harder paywalls instead of climbing the service ladder to higher value seems to me faintly depressing.
And, of course, Mr Bezos may be right. He has a good track record in this regard. And I am told that there is great VC interest in “new” news: BuzzFeed $46m; Vox $80m; Business Insider $30m, including a further $12m last week; Upworthy $12m. Yet I still think that the future is distributed, that collection-based aggregation has a sell-by date, and that the WaPo membership could be the membership that enables me to discover the opinions of the world rather than the news, through a smartly specialized search tool that exposes editorial opinion and thinking – and saves us from the drug of our times: yet more syndicated news!
Mar 13
Realism and Re-invention
But first of all, a practical question. How sensitive do you think you really are? I only ask because tears and laughter while reading a novel seemed to me a most natural consequence of emotional excitation, so when I read about an experiment at MIT in Sensory Fiction (http://www.fastcodesign.com/3026104/a-wearable-book-that-programs-you-with-feelings) I really wondered if Wearables were going to provide our emotions as well as our logic boards. It turns out that they are not quite there yet, and, anyway, the heat generated burnt out the system, but it left me wondering: networks and communities are all about emotion, and we are sufficiently inept at communicating those emotions remotely (phone, email) already without living in a densely rather than a lightly virtual world. Add the developing communities in the network and our capacity for increasing the sum of human misunderstanding will be infinite.
It was Victoria Mellor of Melcrum (www.melcrum.com) who started me down these tracks. I was lucky enough to be moderating her session at Digital Media Strategies (4-5 March), one aspect of which I wrote about here last week. As CEO of Melcrum, she and her co-founder have more reason than most to think about this, since they chose to work in the entirely thankless field of trying to help executives communicate more effectively. In Old Time Classifications this would have been stereotyped as a training business in B2B Events and Publishing. New Style, this is a Re-invention, and the current versioning of Melcrum is as a community-based peer group.
The fulcrum is the Forum, enabling members of Melcrum to get research, in-house support from Melcrum’s professional advisors (aka training), and diagnostic and assessment tools to measure practice against best practice, all of it driven by and from peer-to-peer meetings and leadership sessions. So welcome to the age of Networks; but in order for everything to work remotely, as it should, there have to be moments in the mix of intensive face-to-face contact, of peer recognition and satisfaction, and of privileged moments of listening to market-leading thinkers one-on-one or in small groups. The organizer of this physical-to-virtual spectrum can achieve powerful positioning and margins, but it makes me wonder what we were doing when we sold all of this research and support material remotely – in a catalogue or online. The world we have left is not simply to be typified as moving from a real-world relationship to a virtual, networked-world relationship. It is moving from a world of the remote, where we knew little about how our users were thinking, feeling, changing, reacting, to a world where we meet our users regularly, embrace them as fellow-members of the same community, and listen and speak with them digitally every working day. It is a world where a re-invented Melcrum competes with Corporate Executive Board, not a myriad host of small training outfits. And it formed a very exciting vision.
But, curiously, the themes it explored had already resonated through the meeting. There was, for example, Adrian Barrick, Chief Content Officer at UBM (www.ubm.com). Now that session, you might have thought, would take us firmly back to the ancien régime of B2B. Not a bit of it. In a company now seemingly dedicated to events, the role of content becomes more critical, not less. Think of the network presence needed to maintain the buyer-seller dialogue online between annual trade shows. If content is the connection between network players, what do you need to provide to maximize network connections by customers? Adrian’s vision was very much of the time: treat attendees and exhibitors and conference delegates as communities and create the content that binds them together. If the new-look UBM is an events player, it will also need to be a content player to sustain its market positioning.
Yet the next speaker, I thought, will surely have to be wholly outside of this theme. Damian Kimmelman, CEO of DueDil, is the entrepreneur re-inventing the credit and company information market. That morning, 5 March, he had issued a press release confirming a further £17m in mezzanine financing for his company (and also issued a report from a research group he supports indicating that the entrepreneurial activities of immigrants create a net gain for GB PLC over and above the costs of immigration). DueDil (www.duedil.com), at launch, drove straight at the heart of the UK’s duopolistic credit and business information companies by offering core government-derived (Companies House) information for… free. Even now, less than 10% of his million or so registered users pay anything. So, the Financial Times and others were asking that morning, how does he monetize the community he has created? What happens next?
In what followed, Mr Kimmelman reminded me strangely of what I had heard earlier in the day, just as vaguely hinted at by Andrew Miller of the Guardian Media Group. We began to think about the meaning of a network of users. About the potential for user-generated content and what people might share with each other. About the fact that these markets had always existed through participants sharing trading information with each other, and that the free offering was the glue in an extended dialogue. So perhaps the future here lay, as did the Guardian’s, in some form of membership organization. All of a sudden we were leaving the world of Dun & Bradstreet and Experian far behind and heading for a world far more familiar to Victoria Mellor’s re-invention.
Yet this was all B2B – but not as we know it. It was all about accommodating to life in a digitally networked world, yet using the real world, as in Adrian’s exhibitions, to give purpose and vitality to the networked equivalent. I thought I was moderating three wholly different speakers with widely divergent subject matter. I left the stage knowing there was only one.