May 13
Stroking the Winter Palace
Filed Under Blog, eBook, Industry Analysis, internet, Publishing, Reed Elsevier, Search, semantic web, STM, Workflow | 1 Comment
Which is to say, we haven’t exactly been storming it in St Petersburg this week. There is little that is revolutionary about any conference of librarians, publishers and academics, and the 13th meeting of the Fiesole Collection Development Retreat does not sound like a caucus of anarchists. But then, pause for a moment. We have been 60 or so mixed-discipline people, an offshoot of the famous Charleston librarians’ meetings, using a few days in a brilliant city to cogitate on the future needs of research, and the roles that librarians and publishers may play in attaining them. Anyone afraid of the argument would not come here, but would go instead to specialist meetings of their own ilk.
Youngsuk Chi, surely the most persuasive and diplomatic representative of the publishing sector, keynoted in his role as CEO at Elsevier Science and Technology. So we were able to begin with the schizophrenia of our times: on one side, the majestic power of the largest player in the sector, dedicated both to the highest standards of journal quality and to the maintenance of the peer-reviewed standards of science; on the other, that same player investing hugely and rightly in solutions for scientists (ScienceDirect, Scopus, Scirus, SciVerse, SciVal…) without the ability to make those solutions universal (by including all of the science needed to produce “answers”).
I was taught on entry into publishing that STM was only sustainable through the support of the twin pillars of copyright and peer review. This week those two rocked a little in response to the earth tremors shaking scholarship and science. We reviewed Open Access all right, but this now seems a tabby cat rather than the lion whose roar was going to reassert the power of researchers (and librarians) over scholarly collections. The real force which is changing copyright is the emergence of licensing and contract systems in the network which embed ownership but defuse the questions surrounding situational usage. And the real force which is changing peer review is the anxiety in all quarters to discover more and better metrics which demonstrate not just the judgement of peers but the actual usage of scholars, the endurability of scholarship, and the impact of an article and not the journal in which it appeared.
And it’s clearly game over for Open Access. The repetitive arguments of a decade have lost their freshness, and the wise heads see that a proportion of most publishing in most sectors will be Open Access, much of it controlled by existing publishers like Springer, who showed the intensity of their thinking here. But does it matter if this is 15% of output in History rising to 30% in Physics? It is a mixed economy, and my guess is that the norm will be around 15% across the board, which makes me personally feel very comfortable when I review the EPS prognosis of 2002! A few other ideas are going out with the junk as well – why did we ever get so excited about the institutional repository, for example?
So where are the Big Ideas now? Two recurrent themes from speakers resonated with me throughout the event. We now move forward to the days of Big Data and Complete Solutions. As I listened to speakers referring to the need to put experimental data findings in places where they were available and searchable, I recalled Timo Hannay, now running Digital Science, and his early work on Signalling Gateway. What if the article is, in some disciplines, not the ultimate record? What if the findings, the analytical tools and the underlying data, with citations added for “referenceability”, form the corpus of knowledge in a particular sector? And what if the requirement is to cross-search all of this content, regardless of format or mark-up, in conjunction with other unstructured data? And use other software tools to test earlier findings? And in these sectors no one can pause long enough to write a 10,000 word article with seven pages of text, three photos and a graph?
And where does this data come from? Well, it is already there. It’s experimental, of course, but it is also observational. It is derived from surveillance and monitoring. It arises in sequencing, in scanning and in imaging. It can be qualitative as well as quantitative, it derives from texts as well as multimedia, and it is held as ontologies and taxonomies as well as in the complex metadata which will describe and relate data items. Go and take a look at the earth sciences platform, www.pangea.de, or at the consortium work at www.datacite.org in order to see the semantic web come into its own. And this raises other questions, like who will organize all of this Cloud-related content – librarians, or publishers, or both, or new classes of researchers dedicated to data curation and integration? We learnt that 45% of libraries say that they provide primary data curation, and 90% of publishers say that they provide it, but the anecdotal evidence is that few do it well and most do no more than pay lip service to the requirement.
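For the technically curious, here is a minimal Python sketch of what that metadata layer looks like in practice: DataCite exposes a public REST API from which the structured record behind a dataset DOI can be retrieved. The endpoint, the field names and the example PANGAEA DOI are my own illustrative assumptions, not anything presented at the meeting.

```python
# A minimal sketch: pull the structured metadata DataCite holds for a
# dataset DOI via its public REST API. Endpoint, field names and the
# example DOI are assumptions for illustration only.
import json
import urllib.request


def fetch_datacite_record(doi: str) -> dict:
    """Return the JSON:API attributes DataCite stores for a DOI."""
    url = f"https://api.datacite.org/dois/{doi}"
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    return payload["data"]["attributes"]


if __name__ == "__main__":
    # Hypothetical example: a PANGAEA earth-sciences dataset DOI.
    record = fetch_datacite_record("10.1594/PANGAEA.867908")
    print(record.get("titles"))
    print(record.get("creators"))
    print(record.get("publicationYear"))
```

It is that kind of machine-readable record – titles, creators, dates, relations to other objects – which makes cross-searching data alongside articles even thinkable.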
Of course, some people are doing genuinely new things (John Dove of Credo with his interlinking reference tools – www.credoreference.com – for undergraduate learning would be a good example: he also taught us how to do a Pecha Kucha in 6 minutes and 20 slides!). But it is at least observable that the content handlers and the curators are still obsessed by content, while workflow solutions integrate content but are not of themselves content vehicles. My example would be regulatory and ethical compliance in research programmes. The content reference will be considerable, but the “solution” which creates the productivity, improves the lab decision-making and reduces the costs of the regulatory burden will not be expressed in terms of articles discovered. Long years ago I was told that most article searching (as much as 70%, it was alleged) was undertaken to “prove” experimental methodology, to validate research procedures and to ensure that methods now being implemented aligned with solutions already demonstrated to have successfully passed health and safety strictures. Yet in our funny misshapen world no specialist research environment seems to exist to search and compare this facet, though services like www.BioRAFT.com are addressing the specific health and safety needs.
Summing up the meeting, we were pointed back to the role of the Web as change agent. “It’s the Web, Stupid!” Quite so. Or rather, it’s not really the Web, is it? It’s the internet. We are now beyond the role of the Web as the reference and searching environment, and back down into the basement of the Internet as the communications world between researchers, supported by the ancillary industries derived from library and publishing skills, moves into a new phase of its networked existence. It takes meetings that have equal numbers of academics and librarians and publishers to provide space to think these thoughts. Becky Lenzini and her tireless Charleston colleagues have now delivered a further, 13th, episode in this exercise in recalibration of expectations, and deserve everyone’s gratitude for doing so. And the sun shone in St Petersburg in the month of the White Nights, which would have made any storming of the Winter Palace a bit obvious anyway.
May 8
100 Years of Marshall McLuhan
Filed Under Blog, eBook, eLearning, internet, mobile content, Publishing, social media, Uncategorized, Workflow | 1 Comment
Since I learnt this week that this is the centenary of the great Canadian scholar and mystifier’s birth, I cannot resist using the fact, and using it to get back into a groove that seemed to escape me last week. I justify this by saying that I only write when I have something to say. On the other hand I have something to say far more often than I write, but I need an impulse to get me over the hump and force me to find time and concentration. That impulse is almost always the ability to use this to postpone starting something else equally important. In this way my life turns into a series of deadlines, each one creating pressure and driving activity. I dread to think what might happen if these pressures to perform were removed: would I sit down and read the whole of “Colonel Roosevelt”, the third volume of Edmund Morris’s grand life of the progressive President, and my current obsession, in one go, like some greedy schoolboy, the Fat Owl of the Remove, consuming chocolate cake? Probably. I am not a very refined person. And I do like to gloat.
Which brings me back to writing. This week’s impulse came during the Publishers’ Forum in Berlin. I was listening to one of the beguiling masters of change, Robert Stein, describing his experimental work in his SocialBook Inc operation. I have no doubt that he is right: in a networked society reading becomes a social activity, and that I should not be secretively curling up with the Colonel, but actively debating with you and other readers (and I do know another current reader, as it happens) whether Roosevelt was right to run against Taft in 1912. And was it Woodrow Wilson who was the true progressive? I know perhaps 5 people with whom I could have this discussion, and no doubt I could find 50 more online if this book were a social document. And it would be cream on my chocolate cake to have those talks, but they would slow my progress through the book, which now occupies late evenings and weekends. I read 37 books of this size last year: how many would I read in a social hall of mirrors? And would the conversation and friendship derived from Bob’s social vision, from his four styles of social reading embedded in a browser-type interface that allowed me to annotate pages, read other readers’ comments and interact with them – would all that compensate me for only reading 5 books a year?
Earlier in the session, part of an increasingly highly regarded meeting put together by Klopotek, the publishing workflow specialist (http://www.klopotek.de/enindex.htm), I heard Liza Daly talking about ePub 3. I welcome this with open arms, delighted by the speed with which this revised standard is being produced, warmed by Liza’s clear and emphatic summation of its aims, and only depressed by the likelihood that hardware vendors will take their time in introducing compliant devices. As Liza summarized it, the major advances, as well as the use of HTML5, are in language support (however did we persuade ourselves that vertical text, and right-to-left as well as left-to-right horizontal text, were not necessary – and thus exclude at a stroke Chinese, Korean and Japanese!); interactivity; audio and video; and design/layout conventions that allow pages to refocus themselves appropriately in terms of the screen size being used to view them. The gains here will be for graphic novels, the beginnings of multimedia in eBooks, and, I would guess, for the further evolution of the eTextbook (whatever that may be). As Liza came to an end I found myself at once delighted by a real progress report by a real expert on real progress made, and straining to see the expression on Stein’s face to see if he was thinking what I was thinking: “Fifteen years on the Internet and we are only now installing the features that were so important, in the early 1990s, on multimedia CD-ROM”!
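To make those features a little more concrete, here is a small sketch (my own illustration, not anything shown in Liza’s talk) of the kind of XHTML content document an EPUB 3 reading system is meant to render: HTML5 markup, CSS vertical writing for Japanese, embedded audio, and a media query so the page adapts to the screen it is viewed on. A real EPUB 3 file would also need its package document, container.xml and mimetype entry, all omitted here.

```python
# A minimal, illustrative sketch in Python: write out a fragment of the
# sort of XHTML content document EPUB 3 allows. The markup shows HTML5
# structure, CSS vertical writing, embedded audio and a media query.
# An assumption-laden illustration, not a complete EPUB package.

SAMPLE_CONTENT_DOCUMENT = """<?xml version="1.0" encoding="utf-8"?>
<html xmlns="http://www.w3.org/1999/xhtml" lang="ja">
  <head>
    <title>EPUB 3 feature sketch</title>
    <style>
      /* vertical, right-to-left column flow for Japanese text */
      body { writing-mode: vertical-rl; }
      /* reflow for small screens */
      @media (max-width: 600px) { body { font-size: 90%; } }
    </style>
  </head>
  <body>
    <p>縦書きのテキスト (vertical text)</p>
    <!-- HTML5 audio, one of the multimedia additions in EPUB 3 -->
    <audio src="narration.mp3" controls="controls">
      This reading system does not support audio.
    </audio>
  </body>
</html>
"""

if __name__ == "__main__":
    # Write the fragment out so it could be dropped into an EPUB 3 container.
    with open("chapter1.xhtml", "w", encoding="utf-8") as handle:
        handle.write(SAMPLE_CONTENT_DOCUMENT)
```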
Unfortunately Stein was sitting in front of me, so I could not see his face. But in those days in the early ’90s, when “bandwidth” meant “fixed disc”, not Broadband, Robert’s Voyager operation was the shining example of creativity in the US market in this sector. His example was very encouraging to people like me, an advisor to and non-executive director of Dorling Kindersley Ltd, who, through the drive and determination of Peter Kindersley (an impatient innovator whose background as a designer helped no end in the creation of this medium), were following rapidly along the same track to create a generation of interactive disc-based reference products whose ingenuity and use of content and software have not yet been emulated in the eBook world. DK also produced a publisher of real note in Peter’s colleague, Alan Buckingham, who proved a master at stitching together resources and effects to produce deeply engaging learning and reference materials. Alan was the first maestro to paint with the whole multimedia palette: when eBooks grow up and they start giving awards for them, they should call them the “Buckies”!
So here is one example of the way in which markets sometimes have to loop back and rediscover themselves: Marshall McLuhan knew all about that when he spoke of the effect of television on film. Another Publishers Forum speaker said something similar: “Longtail is not a lucrative market unless you are an aggregator,” said David Hetherington of Baker and Taylor. Which is why print-on-demand providers are aggregators and why publishers surrender their digital files to them. Which heralds the day, which David did not say, when publishing margins are more rentals and royalties than retail, pushing publishing even further away from organizing the marketplace and imperilling its position.
A hundred years ago a man was born who well described these and so many other changes in media marketplaces, and did it from the user viewpoint, creating a sort of sociological view of media access. Nothing here would surprise him in the least: he would have claimed it all as his.
PS. One very good reason for going to this conference, quite apart from the excellent content, is being inside Berlin’s wonderful conference centre, the Axica. Built by Frank Gehry for a bank that now cannot afford it, to our great advantage, the building holds its conference auditorium in its womb beneath a glass canopy, while seminars are held in a wooden egg suspended above it. I got to use this, and can testify to its wonderful sound qualities: spectators sit in rising seating above the (?) bank’s boardroom table, for all the world like speaking in one of those anatomy theatres of the sixteenth century (Uppsala University has an outstanding example).