Nov 4
Beware: Lawyers at Work
Writing a piece here in September (The Way Lawyers Work Now) drove me back to the sustaining works of Richard Susskind: “The Future of Law” (1996), “Transforming the Law” (2000), and “The End of Lawyers?” (2008). They remain a most impressive achievement, as well as a rare effort to forecast the future of work in a particular vertical market sector. The trends apparent now align closely with the Susskind theses, especially the moves into practice solutions, where Lexis in the UK now pursues PLC much more closely, with the added benefit of being able to support its solutions by invoking the whole research environment. Whether these moves support ideas of the democratization of access to the law – Richard quotes Shaw’s dictum that “all professions are a conspiracy against the laity” – is not the question for this blog. But they certainly deliver a vision of deskilling and cost erosion, and the thought that many corporate and individual clients may in future have a very different procedural access to the law and its requirements.
I was encouraged in this thinking by discovering that Lexis UK last month published some of their own research survey findings, under the title “Practice Points”. This was a very worthwhile exercise, though not so that we could learn that 66% of respondents forecast 10% growth per annum over the next two years. With so many UK law practices currently debating their status after the last government’s liberalization measures, no one contemplating incorporation or flotation would say anything else. What impressed me more was the high score that lawyers gave to increased competition associated with the ABS (Alternative Business Structures) legislation, and the increase in M&A activity that this foretold. In order to hold costs and even reduce them (those surveyed saw fixed fees, not hourly rates, as the future business model) the gearing had to change – they needed to recruit more support staff who were not going to share profits or become partners. The way many would do this was by outsourcing to a fixed-fee legal outsourcing company, often in the UK but sometimes offshore as well. And IT was the critical element – 60% looked to process automation to reduce costs and create the communications with clients and third-party suppliers which will make this work.
This plays well with the line on practice solutions now being taken by Lexis and long held by PLC in the UK. PLC’s US expansion still appears on course, though moving more slowly in the recession. But I wondered about continental Europe, especially given the traditional positioning of German lawyers between clients, provincial regulation, Federal law and EU requirements. Do not forget that both Thomson Reuters and Lexis, in various ways, quit this difficult marketplace in the last decade. So I was delighted at Frankfurt to find Christian Dirschl of Wolters Kluwer Germany on my panel, and to be able to ask him whether German law publishers were having to adjust their positioning and move towards new access models alongside their existing commitment to research tools. And, since I have always found WK Deutschland very difficult to understand as an outsider – it has eight constituent law companies and another four tax imprints – I was hugely impressed by the answer: WK Germany has fully embraced semantic technologies by launching the Jurion interface (www.jurion.de) to make much of its own content, and growing amounts of third-party content, accessible in a contextualizable environment.
There are a number of very striking points about Jurion. In the first instance, WK have gone back and re-engineered their content acquisition, enrichment and bundling cycle. With their metadata ducks all in a row, and fundamental problems of delivery format and functionality solved, they have been able to invite third parties onto the platform to work through the same interface. So here you can get your Haufe content as well as your Luchterhand WK content, and if you are not a subscriber to the particular Haufe service, you can join up in 20 seconds. Then again, they are members of the EU-supported LOD2 project (http://lod2.eu), with 15 other companies in 12 countries. This lets Jurion swim in the world of EU Open Government (via the publicdata.eu platform), and provides not just another layer of content accessibility, but a context in which open-source semantic technologies (DBpedia, Virtuoso, Sindice, Silk) can work jointly. Add to this rich stew a few more ingredients: their facility with semantic analysis and the LOD (linked open data) environment has propelled them into the development of major taxonomic instruments, with the legal thesauri now covering a large range of public/private content and WK becoming the effective gateway and standards setter for legal access. And then consider that at the same time they have integrated document construction and document location, using the same metadata. And then search all of this on legal terms and legal concepts. And then add, from the end of this year, web data as well as web content (look at the Wikipedia-style work accomplished here). Very impressive.
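To make the thesaurus idea concrete, here is a minimal sketch – entirely invented, not WK’s implementation – of how SKOS-style broader/narrower relations let a query on a broad legal concept also retrieve documents indexed under its narrower terms:

```python
# Minimal sketch of a SKOS-style legal thesaurus: a search term is
# expanded to its transitively narrower concepts, so a query for a
# broad legal concept also retrieves documents indexed under more
# specific ones. Concepts, relations and document IDs are invented
# purely for illustration.

NARROWER = {
    "contract law": ["sales contract", "lease agreement"],
    "sales contract": ["consumer sale"],
}

def expand(concept):
    """Return the concept plus all transitively narrower concepts."""
    result = [concept]
    for child in NARROWER.get(concept, []):
        result.extend(expand(child))
    return result

def search(index, concept):
    """Look up documents indexed under a concept and everything beneath it."""
    terms = expand(concept)
    return sorted({doc for t in terms for doc in index.get(t, [])})

index = {
    "sales contract": ["doc-12"],
    "consumer sale": ["doc-7"],
    "lease agreement": ["doc-3"],
}
```

A query on the broad concept then finds documents that never mention it directly – for example, `search(index, "contract law")` returns documents indexed under all three narrower terms.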
But what does it look like from the user screen? When I open my Jurion desktop I have options. jSearch is a normal law database environment with semantic search. jStore has the WK products, its partners’ products, fast purchase and – almost inevitably – a recommendation system which is likely to be very important. jLink will allow annotation sharing and thus becomes a gateway to social media. jBook allows personalization and rebundling of content – and you can have it as an eBook or print copy too. jCreate allows content creation, metadata allocation and sharing – via jStore for a fee if necessary. And jDesk subsumes the lawyer’s desktop, giving indexation and coverage of the whole or parts of the firm’s network. Here clients have OCR, citation recognition, topic classification, and document creation. This is not yet fully completed, but it remains a startling step forward. It potentially transforms the competitive structure of the German law and tax market, and it is based on vital ideas of collaboration which have to underlie all of these developments in future.
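Citation recognition of the kind jDesk offers can be pictured with a toy example – a hedged sketch that assumes nothing about WK’s actual pipeline – that spots simple German statute citations such as “§ 433 BGB” in free text:

```python
import re

# Toy citation recogniser for German statute references of the form
# "§ 433 BGB" or "§ 573 Abs. 2 BGB". The pattern and the short list
# of code abbreviations are illustrative only, not WK's logic.
CITATION = re.compile(r"§\s*(\d+[a-z]?)(?:\s+Abs\.\s*\d+)?\s+(BGB|HGB|ZPO)")

def find_citations(text):
    """Return (section, code) pairs for each recognised citation."""
    return [(m.group(1), m.group(2)) for m in CITATION.finditer(text)]
```

A real recogniser would cover far more codes and citation forms, but even this sketch shows how recognised citations become link anchors back into the research environment.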
WK Germany have gone horizontal in their effort to supply the lawyer in Germany with a complete access point. Lexis in the UK have gone vertical in their bid for practice solutions. Both of these legitimate approaches will one day end in the same place, with comprehensive and collaborative service environments that eventually begin to democratize access to the law.
Nov 1
Members of the House of Peers
Not another note on Open Access, surely? Well, I am sitting here on 31 October reading an article published on 1 November (how up to date can a blogger be?) in the Educause Review Online (www.educause.edu/ero/article/peerj-open-access-experiment) and I really want to convey my respect for people like Peter Binfield, who wrote it, for their huge energy and ingenuity in trying to make Open Access work. Peter’s note, “PeerJ: An Open-Access Experiment”, describes the efforts that he and his PeerJ colleagues have put into the business of creating fresh business models around Open Access, which was born without one and has always seemed to its adherents to need to be cloaked in one. Open Access has proved a far-from-lusty infant in many ways, but those who continue to adhere to the cause seem to feel, in their admirable and unfailing optimism, that some small tweak will suddenly create economic salvation and thus a take-off into sustainable business growth.
In the case of PeerJ, the take-off vehicle is going to be a membership model. Peter Binfield co-founded the outfit in June 2012 with Jason Hoyt, former Chief Scientist at Mendeley, but the model that they feel will work owes nothing to smart algorithms. Instead, they simply launch themselves at the Article Processing Charge (APC), the way in which Gold OA has been sustained so far, and replace it with – a subscription. Now this is admittedly a personal subscription, levied on all article contributors (that is where the volume lies – in multi-authoring), and subscribers – or members, as they would wish to describe them – can then continue to publish without further charges as long as they keep up their membership fees. Of course, if they join with new teams who have not previously been members, then I presume we go back to zero until those contributors are also members with a publishing history. Each contributor who pays a membership fee of $299 can publish as often as they like: a nominal $99 contribution allows you one shot a year.
PeerJ have assembled a panel of 700 “world class academics” for peer review purposes and intend to open for submissions by the end of the year. In a really interesting variation on the norm, they have put a preprint server alongside the service, so submissions will be visible as soon as they come under consideration. It is not clear how much editorial treatment is involved in these processes, or indeed what “publishing” now means in this context, or indeed when a submission appears on the preprint server. But one thing is very clear: this is not going to be peer review as it once was, but simply technical testing of the type pioneered by PLoS ONE. Once it is established that the article conforms to current experimental good practice, then it gets “published”.
It is around this point in ventures of this type that I want to shout “Hold on a moment – do we really know what we are doing here?” I am sure that I will be corrected, but what I can currently see is a huge dilution of the concepts of “journals” and “publishing”. PeerJ starts with no brand impact. It is not conferring status by its selectivity, like Nature or Cell, or even some brand resonance like PLoS. And its 700 experts, including Nobel Laureates, are being asked whether the enquiry methodology was sound, not whether the result was good science or impacted the knowledge base of the discipline. PeerJ should be commended for allowing reviews by named reviewers to be presented alongside the article, but, fundamentally, this seems to me like another ratcheting downwards of the value of the review process.
Soon we shall hit bottom. At that point there will be available a toolset which searches all relevant articles against the submitted article, and awards points for fidelity to good practice or for permissible advances on established procedures. Articles whose authors feel they have been misjudged can be re-submitted with amended input. The device will be adopted by those funding research, and once the device has issued a certificate of compliance, the article, wherever it is stored, will be deemed to have been “published”. There will be no fees and no memberships. Everything will be available to everyone. And this will introduce the Second Great Age of Journal Publishing, as the major branded journals exercise real peer review and apply real editorial services.
But something has changed now. The Editors of the Lancet or Nature or Cell have decided, in my projection, not to entertain submissions any longer. Instead they will select the articles that seem to them and their reviewers most likely to have real impact. These they will mark up to a high level of discoverability, using entity extraction and advanced metadata to make them effectively searchable at every level and section and expression within the article. Authors will have a choice when they are selected – they can either pay for the services up front or surrender their ownership of the enhanced version of the article. Since the article will be available and technically assessed already, spending more on it will seem fruitless. So we shall return to a (much smaller but equally profitable) commercial journals marketplace. Based once again on selectivity and real, expensive peer review.
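The entity extraction and section-level metadata imagined here can be sketched, purely illustratively, as tagging each section of an article with the domain entities it mentions, so that search can target individual sections rather than whole documents:

```python
# Illustrative sketch only: tag each section of an article with the
# known domain entities it mentions, so that search can address
# sections rather than whole documents. The entity list stands in for
# the dictionaries or extraction models a publisher might actually use.

ENTITIES = {"p53", "apoptosis", "BRCA1"}

def annotate(sections):
    """Map each section title to the known entities found in its text."""
    metadata = {}
    for title, text in sections.items():
        words = set(text.replace(",", " ").split())
        metadata[title] = sorted(ENTITIES & words)
    return metadata
```

Real entity extraction works on phrases and synonyms rather than exact tokens, but the output is the same in kind: per-section metadata that makes “every level and section” of an article independently discoverable.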
Experienced readers will have already spotted the flaw. With wonderful technologies around like Utopia Documents and other new article development activities (Elsevier’s Article of the Future) surely the new age of the article can only exist until these technologies are generalized to every institutional and research programme repository. That is true – but it will take years, and by that time the publishers will be adding even higher value features to allow the researcher’s ELN (Electronic Lab Notebook) full visibility of the current state of knowledge on a topic. Beyond that, we shall consider articles themselves too slow, and inadequate for purpose, but that is a discussion for another day.