Which is to say, we haven’t exactly been storming it in St Petersburg this week. There is little that is revolutionary about any conference of librarians, publishers and academics, and the 13th meeting of the Fiesole Collection Development Retreat does not sound like a caucus of anarchists. But then, pause for a moment. We have been 60 or so mixed-discipline people, an offshoot of the famous Charleston librarians’ meetings, using a few days in a brilliant city to cogitate on the future needs of research, and the roles that librarians and publishers may play in meeting them. Anyone afraid of the argument would not come here, but would go instead to specialist meetings of their own ilk.

Youngsuk Chi, surely the most persuasive and diplomatic representative of the publishing sector, keynoted in his role as CEO at Elsevier Science and Technology. So we were able to begin with the schizophrenia of our times: on one side the majestic power of the largest player in the sector, dedicated both to the highest standards of journal quality and to the maintenance of the peer-reviewed standards of science, while on the other investing hugely, and rightly, in solutions for scientists (ScienceDirect, Scopus, Scirus, SciVerse, SciVal…) without the ability to make those solutions universal (by including all of the science needed to produce “answers”).

I was taught on entry into publishing that STM was only sustainable through the support of the twin pillars of copyright and peer review. This week those two rocked a little in response to the earth tremors shaking scholarship and science. We reviewed Open Access all right, but this now seems a tabby cat rather than the lion whose roar was going to reassert the power of researchers (and librarians) over scholarly collections. The real force changing copyright is the emergence of licensing and contract systems in the network which embed ownership but defuse the questions surrounding situational usage. And the real force changing peer review is the anxiety in all quarters to discover more and better metrics which demonstrate not just the judgement of peers but the actual usage of scholars, the durability of scholarship, and the impact of an article rather than of the journal in which it appeared.

And it’s clearly game over for Open Access. The repetitive arguments of a decade have lost their freshness; the wise heads see that a proportion of most publishing in most sectors will be Open Access, much of it controlled by existing publishers like Springer, who showed the intensity of their thinking here. But does it matter if this is 15% of output in History rising to 30% in Physics? It is a mixed economy, and my guess is that the norm will be around 15% across the board, which makes me personally feel very comfortable when I review the EPS prognosis of 2002! A few other ideas are going out with the junk as well – why, for example, did we ever get so excited about the institutional repository?

So where are the Big Ideas now? Two recurrent themes from speakers resonated with me throughout the event: we now move forward to the days of Big Data and Complete Solutions. As I listened to speakers referring to the need to put experimental data findings in places where they were available and searchable, I recalled Timo Hannay, now running Digital Science, and his early work on Signalling Gateway. What if the article is, in some disciplines, not the ultimate record? What if the findings, the analytical tools and the underlying data, with citations added for “referenceability”, form the corpus of knowledge in a particular sector? What if the requirement is to cross-search all of this content, regardless of format or mark-up, in conjunction with other unstructured data? And to use other software tools to test earlier findings? And what if, in these sectors, no one can pause long enough to write a 10,000-word article with seven pages of text, three photos and a graph?

And where does this data come from? Well, it is already there. It’s experimental, of course, but it is also observational. It is derived from surveillance and monitoring. It arises in sequencing, in scanning and in imaging. It can be qualitative as well as quantitative, it derives from texts as well as multimedia, and it is held as ontologies and taxonomies as well as in the complex metadata which will describe and relate data items. Go and take a look at the earth sciences platform, www.pangaea.de, or at the consortium work at www.datacite.org, to see the semantic web come into its own. And this raises other questions, like who will organize all of this Cloud-related content – librarians, or publishers, or both, or new classes of researchers dedicated to data curation and integration? We learnt that 45% of libraries say that they provide primary data curation, and 90% of publishers say that they provide it, but the anecdotal evidence is that few do it well and most do no more than pay lip service to the requirement.

Of course, some people are doing genuinely new things (John Dove of Credo, with his interlinking reference tools – www.credoreference.com – for undergraduate learning, would be a good example: he also taught us how to do a Pecha Kucha in 6 minutes and 20 slides!). But it is at least observable that the content handlers and the curators are still obsessed by content, while workflow solutions integrate content but are not of themselves content vehicles. My example would be regulatory and ethical compliance in research programmes. The content reference will be considerable, but the “solution” which creates the productivity, improves the lab decision making and reduces the costs of the regulatory burden will not be expressed in terms of articles discovered. Long years ago I was told that most article searching (as much as 70%, it was alleged) was undertaken to “prove” experimental methodology, to validate research procedures and to ensure that methods now being implemented aligned with solutions already demonstrated to have successfully passed health and safety strictures. Yet in our funny misshapen world no specialist research environment seems to exist to search and compare this facet, though services like www.BioRAFT.com are addressing the specific health and safety needs.

Summing up the meeting, we were pointed back to the role of the Web as change agent. “It’s the Web, Stupid!” Quite so. Or rather, it’s not really the Web, is it? It’s the internet. We are now beyond the role of the Web as the reference and searching environment, and back down in the basement of the internet, as the communications world between researchers, supported by the ancillary industries derived from library and publishing skills, moves into a new phase of its networked existence. It takes meetings that have equal numbers of academics and librarians and publishers to provide space to think these thoughts. Becky Lenzini and her tireless Charleston colleagues have now delivered a further, 13th, episode in this exercise in recalibration of expectations, and deserve everyone’s gratitude for doing so. And the sun shone in St Petersburg in the month of the White Nights, which would have made any storming of the Winter Palace a bit obvious anyway.

After a break for refreshment (archaeology in the Levant) I am back to face further questioning in the Court of Industry Opinion, and particularly from the colleague who recalled a paper written in 2009 as an Outsell CEO Topic – “Workflow: Information’s New Field of Dreams” – and argued that the industry had moved so quickly in the past two years that it no longer represented any sort of summation of where we are today. She was right, and a little research shows how I misjudged the real position two years ago, and how the iterated aspiration that lies at the root of workflow as an information services model is now maturing rapidly. Worse, I had underestimated how much the new world was beholden to the old. In the new edition of this report, labelled Version 2.0 and published yesterday (http://www.outsellinc.com/store/products/993), I have retraced my steps and looked again at the importance of metadata and its long history, and of taxonomic control and semantic search, as contributors to our dream of creating living models of streams of working activity, involving deeply different parts of the workforce. And I am sure that I shall revisit and develop this area in Version 3.0, should I ever get that far, and that we shall find that much of the XML-based technology which has been so useful in creating the agile publishing environments of today (MarkLogic would be the market leader with particular resonance here) will be even more useful as we restructure content to fit the shapes required in different workflow roles.

And then something else happened today. Thomson Reuters, whose work in creating a Governance, Risk and Compliance (GRC) Division I have covered here in detail, launched their Accelus Suite (http://thomsonreuters.com/content/news_ideas/articles/legal/4292965), a rebranding of the 40 or so products and services they bought (Complinet) or borrowed from other parts of the group into 12 solution areas. I have covered this in detail today in an Outsell Insight (https://clients.outsellinc.com/insights/index.php?p=11468) and do not wish to repeat that here, but it is important to remind ourselves of some key issues. This work has taught us, for example, that the outstanding work done by LexisNexis in putting together Seisint and ChoicePoint to create a risk assessment workflow engine for the insurance industry is a “vertical” model for the industry. The Thomson Reuters Accelus Suite is a “horizontal” model, and while its first targets are financial services players, the elements of the Suite (a Governance, Transactions and Legal Risk set, a Compliance and Regulatory Risk set, and an Audit and Internal Control set) are common to all businesses of any scale. In addition, all of these elements require elements of training and education, risk mapping and assessment, audit and accountability, and communication of audited results – upwards, for example, via this Division’s Boardlink environment, a communication tool for risk-responsible directors.

Hang on a minute. There is one problem in all of this. As the Accelus Survey, published with this launch as the first in a regular series, reminds us, the one thing we know about corporate life is that the legal department, financial control, the auditors, the compliance officer, the tax advisor and the people who do risk assessment and management all, literally, speak different languages. The Survey points out that 94% of the 2,000 respondents saw this as a major issue, and it is surely here that the metadata and taxonomic control elements take centre stage. We will not improve risk management generically unless all of these different people can talk fluently and with precision to each other and to outside agencies, and the GRC Accelus Suite, if it is to succeed, must address that core issue. It is the contention of its leaders that this has been done, and while we all know that “done” is a way of saying iterative development is in train, one assurance lies in the size of the industry sample so far engaged. The Accelus Suite platform now claims more than 100,000 users, drawn from each of the job segments in the workflow, providing a community whose feedback should give drive and direction to fitness for purpose. In this environment, the applications must grow to meet the needs (unlike my new shoes, where the foot must change, painfully, to fit the format!).

So what will these workflow environments grow to become in the industry as a whole? Thomson Reuters position the Accelus Suite, as a brand and line of business, as being as large in stature and importance as Westlaw or Eikon. This is big. When I spoke earlier in the cycle of building a new business in the interstices between Thomson Reuters’ two well-established branded businesses in law and financial services, this was no exaggeration. And there is another very striking feature of this launch. Have a look at the Regulatory Risk Mapper within the Accelus Suite and you will see an old industry trait – discovery – and a new one – visualization. The point of the Mapper is to detect change (Thomson Reuters recorded 12,500 important regulatory rewrites last year) and map it onto policy. Then it can be flagged and dealt with at a variety of different levels and in many different ways. And that is what distinguishes an information solutions business from an information research business. And makes Dreams re-iterate.
