Dec 17
Access, Evaluation, Science – all Open?
Filed Under B2B, Blog, Industry Analysis, internet, Publishing, Reed Elsevier, Search, social media, STM, Thomson, Uncategorized, Workflow
“When the Spin slips, change the name!” as British Spin Meister Alistair Campbell almost said, but didn’t until I put the words into his ever-open mouth. When I look back over the past 15 years of science publishing, I see more spin and less change than I would ever have believed possible. Yet when I try to look forward 10 years I see a wave of fundamental change more threatening than the games we have been playing in these Open spaces. For me, a good proof of the failure of the almost political campaigning around Open Access to carry the day beyond some 12-15% of users (check the latest Outsell market report, Professor Harnad) is the name-switch game – with PLoS now talking “Open Evaluation” and Academia.edu being used by 5 million scientists who believe in Open Science. The fundamental change is about self-publishing and post-publication peer review: this will upset the applecart both of commercial publishing, if it does not adjust in time, and of the ersatz Fundamentalists of the Open Access movement of a decade ago, who wanted to preserve peer review as much as they wanted to destroy commercial ownership and restriction.
Since we are talking Science, let’s try an experiment. Take any other broken, misused, hackneyed and now meaningless term and place it where “Open” now sits in relation to Science and Access. For example, take “Socialist”. Or “Community”. Or even “Public”. See what I mean? All meaningless – or, like those eye tests, you see the same thing through each lens the optometrist drops into the frame before your eye, and end up lying about the difference between this one and that, because there is no discernible difference but you do not want to disappoint. Real change is not to be described by this means. It concerns the wish of young scientists to be noticed in the network as soon as possible after completing their work – and before that, where conferences, posters, blogs and other mentions begin to build anticipation. Real scholarly communication is now available in several different flavours, from Mendeley to Academia.edu. Since I have been solemnly assured for 30 years, by senior scientists and publishers alike, that scientists will not share, I have to be amazed by the size of these activities. These newcomers are no less worried about securing research grants or tenure than their predecessors, but they live in a networked scientific world where, if you are not quickly present in the network, you are not referenced in debate – and being part of the argument is becoming as critical to getting grants and tenure as a solid succession of unread papers, published two years after the research ended, used to be.
These convictions are much strengthened by this week’s announcements. The announcement from F1000Research (December 12) that its articles are now visible in PubMed and PubMed Central gives a clear clue to what this is all about. Users want to publish in five days, but they want to be visible everywhere a researcher/peer would expect to look. And increasingly they will expect that the article will gather into post-publication peer review all those earlier references in conference proceedings, blogs and elsewhere. So while players like F1000Research will handle “formal” post-publication peer review, informal debate and commentary will not be lost. And the metrics of usage and impact will not be lost either, as we look far more widely than traditional article impact to discern what influence these authors, teams, findings and ideas have had. “Open Evaluation” from PLoS aims at exactly that point, as it recently launched the second phase of its evaluation work from PLoS Labs (http://www.ploslabs.org/openevaluation/). This post-publication article rating system reminds me very precisely that PLoS One was never in any sense a traditional peer review process. It was a simple methodological check for scientific adequacy (“well-performed” science), and while the volume of processing solved a multitude of financial issues, the fuller rating of these articles still rests with the user. We shall see PLoS One as the turning point to self-publishing when the history is written.
And so we move towards a world where original publication of science articles is no longer the prerogative of the journal publisher. While review systems will flourish and abstracting and indexing will remain vital, that tangled mass of second- and third-tier journals, the most profitable end of traditional STM, will slowly begin to disperse. Some databases will adopt journal brands, of course, and the great brands will survive as rating systems in themselves. “Selected by Cell as one of the 50 most influential research articles of the year”, or “Endorsed by Nature as a key contribution to science”, will become enviable forms of re-publication, increasingly with data links, improved access to image and video, and other advantages. This is where semantic enrichment and data analysis will first become important – before they become the norm. But these selections will be made from what is published, not from what is submitted for publication. And a clue to what the future offers was given by a Knovel (Elsevier) announcement this week. Six publishers, with either small, high-quality holdings in engineering research or activities in engineering that can use the Knovel platform, entered into collaboration agreements to make their content available via the Knovel portal. Amongst these were Taylor and Francis (CRC Press), as well as specialists like ICE (the Institution of Civil Engineers) and the American Geosciences Institute. As Knovel is in a directly competitive position with IHS GlobalSpec, it is relevant to ask how many engineering research portals that marketspace will need. It now has two – and I seriously doubt that there will ever be more than two aimed at both research and process workflow, though their identities may change (see Thomson Reuters/Bloomberg/Lexis in law). Increasingly, then, small science publishers will be re-intermediated – and we do not need a business degree to imagine what that will do to their margins, as well as to their direct contact with their users. “Open”, whatever else it means, connotes “contraction” for some people.
Nov 13
Hyperlocal: Lost in the Data Mix?
Filed Under B2B, Big Data, Blog, data analytics, Financial services, healthcare, Industry Analysis, internet, mobile content, news media, online advertising, Publishing, semantic web, social media, STM, Uncategorized, Workflow
On my good days I scan the screen for the re-invention of local news in a personalized framework, which is how I have defined “hyperlocal” for some years now. On the bad ones, I search for news of Ashley Highfield, erstwhile creator of the BBC’s iPlayer catch-up service and now running Johnston Press. If he cannot re-invent the press, then who can? Or maybe there is another Johann Carolus in somewhere like Strasbourg, just about to do digitally what his namesake did in 1605 and develop the first news sheet. Yet Mr Carolus put the news to work for local businessmen (seventeenth-century Germany was as yet oblivious of the bogus distinction between B2B and B2C), and it thus occurs to me that I may be looking in the wrong place for the renaissance of local news.
These thoughts were triggered by a piece in the New York Times (November 10) which did look as if it was going to tackle my “hyperlocal” anxieties.
(http://www.nytimes.com/2013/11/11/technology/gathering-more-data-faster-to-produce-more-up-to-date-information.html?_r=0&adxnnl=1&pagewanted=2&adxnnlx=1384288878-xYaeMfcYZuVjg950SS64xg&pagewanted=print) This piece, entitled “Big Data’s Little Brother”, is in fact a story about data analytics, featuring Premise, a service which collects and analyses photos of market stalls around the globe in order to compile inflation and availability data for food supply and cost analysis, and ClearStory Data, which does custom predictive analysis from data available on the web and/or supplied by clients. Indeed, this fits with my own observations: SaaS in data analytics is becoming a boom industry, with players like RecordedFuture.com now creating multi-faceted analysis from cyber intelligence to competitive positioning. And these tools can only get smarter, which leads me to believe that we may have to re-create “news” for people who will have commercial reasons to pay before we can personalize news for the general reader/citizen at large.
So what are these data-driven, analytical insight organs going to look like? Well, for a start, we shall have to redefine the word “news”. The services that grab the attention now do not use the news to report something so much as to predict something. When Takadu.com is deployed by a water utility company, it puts together analysis around sensor, image, staff and public reporting on water leaks. Since 25-30% of global water supplies are NRW – non-revenue water, a glorious term for leaks – this is as vital to the utility as it is to the globe, but the important matter may not be the leak itself so much as the trend, the order of repair, and the potential future impact. While it is hard to appreciate FoodGenius.com, which helps food processors develop ever more nutritional disasters for our consumption, it reads 300,000 menus daily to find the trend and create the prediction – lambs’ kidneys in guacamole will be big in 2014, and available everywhere. And moving swiftly to a subject that makes me feel less emotional, companies like Molecular Connections can use the analytics on one side of their business for advanced drug discovery processes, finding and analysing news from the future, while using their technology to give meaning to archival news, as they have done with Nature, the pre-eminent science journal.
None of this is News as we know it, and part of me now accepts the idea that the networked society will never quite want News as newspapers once knew it. Things like the Huffington Post are hybrids, the results of miscegenation, not a new evolutionary track. Things like Buzzfeed are entertainments, brilliant if you want to contemplate the life and works of Rob Ford, Torontonian mayor/buffoon/jester, reduced to 22 captioned images, but only customizable in the “more like that” sense. Nothing here speaks to me about the use of the one thing we have in plenty – data – to inform us of the patterns of our lives and the way that they may change in future.
And we know so much. Isotrak.com reckon they are saving their haulier clients £150m per annum in areas like building more efficient driving patterns. This links to my interest, already expressed here, in lower motor insurance costs as your car speaks to your insurer via your smartphone and reports your performance – while recording your journey on Waze, and noting car accidents and traffic congestion as a result. So maybe these services of the future have active advertising – not just “buy our service to save money”, but “let’s do it and save it now!” And maybe the “news” is about you – in society, against the backdrop of the performance of others, all of us living anxiously in a rated, graded world. After long years in which news tycoons and advertising gurus fought to create “My” service environments and told us all how to behave, it would be poetic justice if we ended up making them for ourselves, and letting the data modelling tell us how to behave.
Which is what I think we will do. Soon tools will become available to bring together all the niche networks we join in the post-Facebook world in a single viewer, giving us one view of our separate networks for family, for college friends, for business and professional associates, for sports aficionados and so on. Here we will pull in more data – is anyone getting better wholesale prices for his home-produced electricity than I am? And analysis. And prediction. And we will move the dial from the congested relief road to the candidate for office who wants to do something about it. And before we know it we are back to wondering how any group of well-adjusted people elected Rob Ford, or Boris Johnson, or any other mayor, and then we want commentary and analysis to explain these things. Here “journalism” is by definition self-employed.
But in the meantime, the deconstruction of news has to be total before we can begin to reconstruct the flows of data and information which will make a digital economy in a networked society perform and function. So it is probably just as well we have the guys we have in charge of our press. From power-broking to phone hacking, they are doing a grand job of destroying public trust in the world of paper and preparing us all for the digital yet to come. So good, in fact, that rather than put them on trial we should give them an award!