Apr 20
Dreams can be Iterative Too
Filed Under B2B, Blog, Financial services, healthcare, Industry Analysis, internet, Publishing, Reed Elsevier, semantic web, STM, Thomson, Uncategorized, Workflow | 1 Comment
After a break for refreshment (archaeology in the Levant) I am back to face further questioning in the Court of Industry Opinion, and particularly from the colleague who recalled a paper written in 2009 as an Outsell CEO Topic: “Workflow: Information’s New Field of Dreams”, and argued that the industry had moved so quickly in the past two years that this did not represent any sort of summation of where we are today. She was right, and a little research shows how I misjudged the real position two years ago, and how the iterated aspiration that lies at the root of workflow as an information services model is now maturing rapidly. Worse, I had underestimated how much the new world was beholden to the old. In the new edition of this report, labelled Version 2.0 and published yesterday (http://www.outsellinc.com/store/products/993), I have retraced my steps and looked again at the importance of metadata and its long history, and at taxonomic control and semantic search as contributors to our dream of creating living models of streams of working activity involving deeply different parts of the workforce. And I am sure that I shall revisit and develop this area in Version 3.0, should I ever get that far, and that we shall find that much of the XML-based technology which has been so useful in creating the agile publishing environments of today (MarkLogic would be the market leader with particular resonance here) will be even more useful as we restructure content to fit the shapes required in different workflow roles.
And then something else happened today. Thomson Reuters, whose work in creating a Governance, Risk and Compliance (GRC) Division I have covered here in detail, launched their Accelus Suite (http://thomsonreuters.com/content/news_ideas/articles/legal/4292965), a rebranding of the 40 or so products and services they bought (Complinet) or borrowed from other parts of the group into 12 solution areas. I have covered this in detail today in an Outsell Insight (https://clients.outsellinc.com/insights/index.php?p=11468) and do not wish to repeat that here, but it is important to remind ourselves of some key issues. This work has taught us, for example, that the outstanding work done by Lexis Nexis in putting together Seisint and Choicepoint to create a risk assessment workflow engine for the insurance industry is a “vertical” model for the industry. Thomson Reuters Accelus Suite is a “horizontal” model, and while its first targets are financial services players, the elements of the Suite (a Governance, Transactions and Legal Risk set, a Compliance and Regulatory Risk set, and an Audit and Internal Control set) are common to all businesses of any scale. In addition, all of these elements require elements of training and education, risk mapping and assessment, audit and accountability, and communication of audited results – upwards, for example, via this Division’s Boardlink environment, a communication tool for risk-responsible directors.
Hang on a minute. There is one problem in all of this. As the Accelus Survey, published with this launch as the first in a regular series, reminds us, the one thing we know about corporate life is that the legal department, financial control, the auditors, the compliance officer, the tax advisor and the people who do risk assessment and management all, literally, speak different languages. The Survey points out that 94% of the 2,000 respondents saw this as a major issue, and it is surely here that the metadata and taxonomic control elements take centre stage. We will not improve risk management generically unless all of these different people can talk fluently and with precision to each other and to outside agencies, and the GRC Accelus Suite, if it is to succeed, must address that core issue. It is the contention of its leaders that this has been done, and while we all know that “done” is a way of saying iterative development is in train, one assurance lies in the size of the industry sample so far engaged. The Accelus Suite platform now claims more than 100,000 users, drawn from each of the job segments in the workflow, providing a community whose feedback should give drive and direction to fitness for purpose. In this environment, the applications must grow to meet the needs (unlike my new shoes, where the foot must change, painfully, to fit the format!).
So what will these workflow environments grow to become in the industry as a whole? Thomson Reuters position Accelus Suite as a brand and line of business as large in stature and importance as Westlaw or Eikon. This is big. When I spoke earlier in the cycle of building a new business in the interstices between Thomson Reuters’ two well-established branded businesses in law and financial services, this was no exaggeration. And there is another very striking feature of this launch. Have a look at Regulatory Risk Mapper within the Accelus Suite and you will see an old industry trait – discovery – and a new one – visualization. The point of the Mapper is to detect change (Thomson Reuters recorded 12,500 important regulatory rewrites last year) and map it onto policy. Then it can be flagged and dealt with at a variety of different levels and in many different ways. And it is what distinguishes an information solutions business from an information research business. And makes Dreams re-iterate.
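The change-to-policy mapping idea can be reduced to a very small sketch. To be clear, the regulatory identifiers, policy names, topic tags and the matching rule below are all invented for illustration; this is a toy of the general technique, not Thomson Reuters’ actual method.

```python
# Toy sketch: flag internal policies affected by regulatory changes by
# matching on shared topic tags. All data and names are invented.

REG_CHANGES = [
    {"id": "FSA-2011-17", "topics": {"client_money", "reporting"}},
    {"id": "SEC-2011-04", "topics": {"disclosure"}},
]

POLICIES = {
    "cash-handling-policy": {"client_money"},
    "quarterly-reporting-policy": {"reporting", "disclosure"},
    "travel-policy": {"expenses"},
}

def flag_affected(changes, policies):
    """Map each regulatory change onto the policies sharing a topic tag."""
    flags = {}
    for change in changes:
        hit = sorted(p for p, tags in policies.items() if tags & change["topics"])
        if hit:
            flags[change["id"]] = hit
    return flags

print(flag_affected(REG_CHANGES, POLICIES))
```

Trivial as it looks, the hard part in practice is exactly what the survey above highlights: getting the topic vocabulary consistent across departments, which is where the taxonomic control comes in.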
Jan 9
Decline and Fall of the Google Empire
Filed Under B2B, Blog, Industry Analysis, internet, mobile content, online advertising, Search, semantic web, Uncategorized | 2 Comments
In the course of this year I need to find a local source of shredding services in my desperate fight to stop this hut from drowning in paper. By the end of the year I shall need to have bought a new car. In the idle twilight between Christmas and New Year I found myself Googling on both of these topics – and the process took longer and took me to more places than I had ever imagined. And I read more advertising, dodgy reviews and spam than I had ever imagined, so when I read that Paul Kedrosky had had an identical experience (http://broadstuff.com/archives/2370-On-the-increasing-uselessness-of-Google……html) I perked up a bit. It is always good to find really clever people reacting just as you did. I then discovered a whole band of bloggers through December and January basically arguing that the Web of spam and misleading search of a decade ago, which Google had cleaned up effectively in its early days, had now returned to haunt us – on Google.
Whether this is the fault of Google is debatable. Some argue that it is SEO which causes the damage, others that it is the insatiable hunger for Google advertising. Some appear to think that a search environment without advertising will do the trick, and Vivek Wadhwa at UC Berkeley argues convincingly for Blekko (http://techcrunch.com/2011/01/01/why-we-desperately-need-a-new-and-better-google-2/). Both of these blogs demonstrate key facets of the debate, but, to my mind, the debate they are having is couched in the wrong terms entirely. What we must think about is not who replaces Google, but whether keyword searching has a future.
Now I must declare a prejudice. I have never been a huge fan of keyword searching. My experience of search began in the early 1980s, when as a (younger) Thomson manager I was deputed to build an online service for lawyers. We used a search package called STATUS which had been created for the UK’s Atomic Energy Research Establishment to search UK statutes and statutory instruments for references to the enactments which had set up the AERE. Both inventors worked for me, one as an advisor, the other as my CTO. Both warned me daily of the insufficiency of the system we were operating to do more than find words in documents, and not to fall victim to the idea that we were thereby creating “answers” or “solutions”. The result was that I was never a victim of the “myth of infallibility” that pervaded early web search engines and became an essential Google quality in the past 5 years. Infallible? A system that cannot distinguish the grossly out of date from today, that can be spoofed into presenting advertising copy as answers, or that can represent anything except a thought or a concept?
As a result of this early inoculation, my sights have long been set on finding search solutions, so I checked back with some of my legal market successors this week to see how they were faring. Was Google law going to sweep them away? Would the service principles of Google Scholar, once applied to law, as Google have claimed, create universal free service values that would separate the lawyer from his dependence on subscription-based legal retrieval engines? Not so, I learnt from Lexis Nexis. In fact, the opposite is the case. The body of law is finite, its authorship necessarily limited. In any legal domain, the retrieval engine investment is now dedicated towards tagging content with semantic metadata, developing the inference rules within the ontological structure created when taxonomies are being refined and redeveloped, and emerging as semantic search players. As law is increasingly defined in conceptual blocks which can be developed as a classification system for the ideas and arguments that lie behind legal concepts, systems are emerging which owe little to the world that Google still inhabit. And what Lexis (and undoubtedly Westlaw) are doing today will be the way in which law offices search contextually in their own and third party content tomorrow.
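The difference between keyword matching and concept-based retrieval can be sketched in a few lines of code. The taxonomy, documents and concept names below are entirely invented for illustration (they come from no real legal classification scheme); the point is only the general mechanism: a document tagged with a narrower concept is found by a query on the broader one, even when the literal query phrase never appears in the text.

```python
# Minimal sketch of concept-based retrieval over a toy taxonomy.
# Concepts, relations and documents are invented for illustration.

# Each concept maps to its narrower (more specific) concepts.
NARROWER = {
    "duty_of_care": ["negligence", "professional_liability"],
    "negligence": ["clinical_negligence"],
}

# Documents tagged with concepts by an (imagined) semantic-metadata pipeline.
DOCS = {
    "doc1": {"negligence"},
    "doc2": {"clinical_negligence"},
    "doc3": {"contract_formation"},
}

def expand(concept):
    """Return the concept plus everything narrower than it, transitively."""
    found = {concept}
    stack = [concept]
    while stack:
        for child in NARROWER.get(stack.pop(), []):
            if child not in found:
                found.add(child)
                stack.append(child)
    return found

def concept_search(concept):
    """Retrieve documents tagged with the concept or any narrower concept."""
    wanted = expand(concept)
    return sorted(doc for doc, tags in DOCS.items() if tags & wanted)

print(concept_search("duty_of_care"))  # finds doc1 and doc2, though neither
                                       # carries the broader tag itself
```

A keyword engine matching the string "duty of care" would find neither document; the inference over broader/narrower relations is what the ontological structure buys you, and SPARQL-style query languages express the same idea over standard RDF data.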
Is this just a law market phenomenon? Well, the techniques and ideas mentioned here have been very heavily involved in scientific research, especially in the life sciences, for the past five years. The whole standards environment created by Tim Berners-Lee and the World Wide Web Consortium (W3C) predicted this development, and the SPARQL query language is an exemplar of a direction taken by a number of semantic search start-ups. The drawback has been the tendency for searching on concepts to become very domain-focussed, where taxonomy can be more precise and concepts easier to describe. But as we move forward, this may be the next big push behind vertical search. Although (or perhaps because) we have stopped talking about them, community-based vertical sector players like Globalspec have been able to take a strong grip on the way in which professionals work in a sector like engineering. Once community activity – making engineering design specs available for cross-searching – becomes susceptible to semantic enquiry, the ability of vertical business players to lock in users and establish themselves as the performance benchmark (and compliance engine) of the sector becomes realistic. The scenario that results from this is sometimes monopolistic, often duopolistic, seldom capable of sustaining rafts of competing content players.
So Google remains in place just as a consumer environment? No, I think that Facebook and its successors become the consumer research environment. Search by asking someone you know, or at least have a connection with, and get recommendations and references which take you right to the place where you buy. Search in mobile environments is already taking too long and throwing up too many false leads. Anyone here used a shredding company in South Bucks? How did you rate them? How do I contact them? I have this fantasy that I mention “Google” to my grandchildren and they say “did you mean the phone company?” The best strategy job in the industry? The one that defines the line of migration for Google out of search and towards the next big marketplace (pity they missed Groupon!).