Feb 12
The Point of Utility
We have had points of inflection and points of sustainability. Time, then, to propose a new "point", one which applies universally throughout the world of information services and solutions, but which I found dramatically illustrated last week in the world of STM. Just as in the early years of the networked world we observed a point of disintermediation, at which the network effect removed real-world service entities and made processes cheaper and quicker, so we can now see places where re-intermediation is required, introducing a new service layer to streamline the network's own inefficiencies, moving cost to a different place, yet often still reducing it while increasing network efficiency. And once this re-layering has taken place and the point of utility is satisfied, an opportunity is created for major increases in the collection of neutral data about how the network processes in question work, and from this derive still greater efficiencies.
The ideal illustration of this is Mendeley (www.mendeley.com), and I listened with great appreciation last week when its co-founder, Jan Reichelt, described the problem which he and his colleagues had set out to solve. The inability to describe PDFs containing scholarly articles arose from the nature of network access and storage. Academic buyers knew what they had bought when they acquired it, but since the PDF envelope carried no external description of what it contained, it was hard to cross-search one's collected downloads or manage a research project's collection. And publishers, being publishers, adopted different practices regarding the use of metadata, or even numbering systems. And as sources of grey literature, files of evidence, or collections of abstracts became involved, local content access and control became a major overhead. Jan and his colleagues knew – only a few years ago they were researchers themselves.
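To make the scale of that overhead concrete, here is a minimal sketch, in Python, of what a researcher (or a service like Mendeley) has to do to recover the description a PDF never carried: scrape a DOI out of the text and ask an external registry for the metadata. It assumes the pdfminer.six and requests libraries and the public CrossRef works API; the folder name is illustrative, and grey literature with no DOI simply falls through.

    # A sketch of the problem: PDFs carry no external description of their
    # contents, so we dig a DOI out of the text and look the metadata up
    # ourselves. The 'downloads' folder is illustrative.
    import re
    from pathlib import Path

    import requests
    from pdfminer.high_level import extract_text

    DOI_PATTERN = re.compile(r'10\.\d{4,9}/[-._;()/:A-Za-z0-9]+')

    def describe(pdf_path):
        """Guess a PDF's identity by scraping a DOI and asking CrossRef."""
        text = extract_text(pdf_path, maxpages=1)  # the DOI is usually on page one
        match = DOI_PATTERN.search(text)
        if not match:
            return None                            # grey literature: no DOI to find
        doi = match.group(0).rstrip('.')
        reply = requests.get(f'https://api.crossref.org/works/{doi}')
        if reply.status_code != 200:
            return None
        meta = reply.json()['message']
        return {'doi': doi,
                'title': (meta.get('title') or ['?'])[0],
                'journal': (meta.get('container-title') or ['?'])[0]}

    index = {p.name: describe(p) for p in Path('downloads').glob('*.pdf')}

Every research group ended up half-building something like this for itself; regularizing it once, for everyone, is the re-intermediation described above.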
So Mendeley was launched as an environment to regularize this, and to ensure that academics are better able to exploit the acquisitions they have made. As a result, those primary drivers of network efficiency can be accomplished – saving money, making better decisions, and ensuring cost-effective compliance. This Point of Utility exploitation then has certain network knock-on effects. The service, like Mendeley, becomes an important part of end users' navigation, and indeed may become part of, or the base for, the user access dashboard. Once the Point of Utility has become an interface, it is able to pick up all sorts of feedback data from the way end users act through that interface. This data about workflow will indicate usage and popularity, the common processes that users employ in discovery, the way in which resources in the system relate to each other, and the subjects that researchers really search (as distinct from the disciplines that journal editors think they subscribe to). Once this activity gets underway, the new interface owner can begin to suggest workflow improvements, and resell across the market the high-value data which derives from actual patterns of usage. There is a Point of Utility in every network environment, and Mendeley, through their knowledge of researcher proclivities, have camped on one of these exciting fault-lines in STM.
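A hedged sketch of that feedback layer: if each user's library is reduced to a set of identifiers, simple co-occurrence counting already surfaces both popularity and the way resources relate to each other in actual use, whatever the journal taxonomy says. The users and DOIs below are invented for illustration; real signals would come from interface logs.

    # Sketch: deriving relatedness signals from user libraries.
    # The data is hypothetical.
    from collections import Counter
    from itertools import combinations

    libraries = {
        'user_a': {'doi:10.1000/1', 'doi:10.1000/2', 'doi:10.1000/3'},
        'user_b': {'doi:10.1000/2', 'doi:10.1000/3'},
        'user_c': {'doi:10.1000/1', 'doi:10.1000/3'},
    }

    popularity = Counter(doi for lib in libraries.values() for doi in lib)

    co_read = Counter()
    for lib in libraries.values():
        for pair in combinations(sorted(lib), 2):  # articles saved together
            co_read[pair] += 1

    print(popularity.most_common(1))  # the most-read article
    print(co_read.most_common(1))     # the pair most often saved together

Aggregates of exactly this kind are the "high-value data" an interface owner can resell across the market.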
Some of these opportunities arise from publisher activity – lack of collaboration, lack of standardization, lack of knowledge about what happens to articles post-sale – and some from the same features in the user community. This is not a blame game. Mendeley has taken the initiative and must be welcomed as one of the foremost workflow players in the sector, especially since the launch of the Mendeley Institutional Edition last month, which takes the intermediary role into the academic library, in conjunction with and powered by Swets, who have quickly grasped the point. This, as well as exporting the Mendeley API (http://www.mendeley.com/blog/design-research-tools/winners-of-the-first-binary-battle-apps-for-science-contest/), will turn fast growth into a torrent: Mendeley already have backlogs despite having captured 160 million references for 1.5 million users. Some publishers (Springer's short article previews) clearly get it – others, as ever in this sector, plainly adopt a policy of "this too will pass".
But of course it will not. Far from returning to normal, the science knowledge market is in a ferment. Visit Dryad (http://datadryad.org/) and Giga Science (http://www.gigasciencejournal.com/) and observe the impact of Big Data on this sector. My friend Mark Ware, in an excellent note for Outsell (https://clients.outsellinc.com/insights/?p=11693), has given chapter, verse and analysis. Content control in the scientist's workflow is becoming a multiple-media nightmare. Does the metadata accurately describe the contents of video diaries and observations, or their audio equivalents? Can we source the data behind that report, and do our own analysis? How many unpublished, unreported studies have validated these results? What has been said, and at which conferences, about how far down the track this research team has gone? Where do we find the right mix of experience to staff the next enquiry? Regardless of its peer-reviewed status, who actually used this work – and, if they did not use it, what did they rely upon instead? Mendeley is a promising beginning, but there is a long road ahead. Stefan Glanzer (Last.fm) and Alejandro Zubillaga (lately head of Warner Digital Music – where he must have seen parallel problems) put in the seedcorn and should be congratulated. They have a real start-up (my spirits rise when I visit an office that has its bike racks and its table football in the foyer!) with the wind behind it.
One last check on where that wind is blowing. Visit ResearchGate (www.researchgate.net) and look at the ways in which scientists are beginning to indulge in meaningful social networking. I have been told for 25 years that scientists and academics are too competitive to use social networking. Like much of the received wisdom of the pre-networked society, this is at best a half-truth. The whole truth is that there are no longer simple generalizations that hold true about researcher behaviour. That is why they flock so quickly to the Point of Utility.
Feb 5
The Games We Should Play
As soon as you give something a name on the web, anti-matter appears and the original idea gets lost in the welter of abuse which is web discourse. The word "gamification" is a classic example. Some clever fellow clearly felt that this coinage gave dignity and grandeur to the process of using game theory as a means of helping learners in all walks of life to find greater pleasure, and more effective learning, in acquiring the skills or attributes needed for their advancement. As a result there fell upon his head a posse of academics concerned to create research around the idea that playing games turns people's brains soft, fails to prepare them for the real world (no games played there?), and indeed that game theory was an elaborate entrapment created by the enemies of democracy and free speech to undermine Western Civilization as we know it today … What rubbish!
The first time I encountered teachers and designers building serious gaming scenarios to help learners learn was in the late 1990s. "Gamification", according to its wiki, http://gamification.org, has been in the bloodstream since 2004. If it has taken Farmville and Angry Birds and the Xbox to awaken some people to the pervasive presence of game theory within all of our thinking about the way we learn, then they stand convicted of not living in the twenty-first century. Gaming is now tightly wrapped around the way we learn: the problem is that we still do not do it consistently, in large enough contexts, to create ultimate learning value. People who call themselves publishers, information service solution providers, content developers and so on still have the notion that the game is something you add to the mix to lighten the load, provide some variety, change the pace or overcome a tricky and boring learning essential. But what if gaming were the core of our learning, the methodological base for instruction and measurement? What if it were the package that replaced the training manual, accomplished its assessment and handled its updating as well? What if, as much biological evidence suggests, games are the way we learn, and we are just now returning to a full recognition of what that means? A toy sketch of the idea follows below.
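Here is that sketch – game as assessment rather than game as garnish. It is entirely my own illustration, not any product's design: a loop in which the running score is the measurement, difficulty adapts to keep the play challenging, and the learner's skill estimate emerges from play with no separate test. All numbers and rules are invented.

    # Toy sketch of assessment-through-play: the learner's score *is* the
    # measurement, and difficulty adapts to keep each round challenging.
    import random

    def play_round(skill, difficulty):
        """One challenge: success is likelier when skill exceeds difficulty."""
        return random.random() < 1 / (1 + 10 ** (difficulty - skill))

    def run_session(true_skill=1.2, rounds=50):
        estimate = difficulty = 0.0
        for _ in range(rounds):
            won = play_round(true_skill, difficulty)
            estimate += 0.2 * (1 if won else -1)  # the score doubles as assessment
            difficulty = estimate                 # next round stays challenging
        return estimate

    print(f"estimated skill: {run_session():.2f}")

The estimate drifts towards the player's true skill because wins push difficulty up and losses push it down – the adaptive staircase that underlies much game design and, arguably, much testing.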
Sitting in an armchair in the City Lights bookstore in San Francisco one foggy day in June 2007, I opened a copy of McKenzie Wark's Gamer Theory, published that year as Version 2.0 of his blog GAM3R 7H30RY, a networked book hosted online by Bob Stein's Institute for the Future of the Book. Here is a sample: "Here is the guiding principle of a future utopia, now long past: 'To each according to his needs, from each according to his abilities'. In gamespace, what do we have? An atopia, a senseless, placeless realm where quite a different maxim rules: 'From each according to his abilities – to each a rank and score'. Needs no longer enter into it. Not even desire matters. Uncritical gamers do not win what they desire: they desire what they win. The score is the thing. The rest is agony." (para 021). Is this different to what you thought? Is it closer to passing that test, completing that continuous development assignment, getting those SATs, or satisfying all of those humiliating hurdles placed in the way of forward progress by those who have already progressed far enough forward not to be troubled by them any more? If you say "yes" to any of these questions then you are in danger of joining me on a dangerous road – towards a future for learning dominated by gaming.
But we are in good company. That hugely serious player, SAP, employs Mario Herger as its Global Head of Gamification (www.enterprize-gamification.com). MIT's Media Lab spawned Scratch (http://scratch.mit.edu/) to create and test learning games for younger people, and Microsoft created Kodu (http://www.kodugamelab.com/), a programming environment designed to allow users to build their own games on the Xbox. And in most countries there is now a serious gaming industry, often with 10 to 15 years of experience behind it, mostly making serious games for user organizations, and unvisited and unblest by the publishers who should be its natural collaborators. Centres of excellence here in the UK include inventive survivors like Desq (www.desq.co.uk), the Sheffield-based developer with almost 15 years of intensive work around immersive experiences like DoomEd or the SimScience environment built for the Institute of Physics. Or look at Pixelearning (www.pixelearning.com) in Birmingham and its training environments, or the company created by its founder, Kevin Corti (SoshiGames – http://www.soshigames.com/ – exploiting customer retention through social gaming). Then, around London's Old Street Silicon Roundabout, see how many of the 800 start-ups are games-related, like Michael Acton Smith's hugely successful MoshiMonsters (http://www.moshimonsters.com/). As a director of CreatureLabs many years ago, I recognize the DNA! The games thing is on the march, but the old-style content businesses are not yet aligned with it.
So let's drop "gamification" if it is going to provoke a social backlash. Really, games for learning are not like that lesson on Friday afternoon when the teacher showed a filmstrip (younger readers can insert film-loop, film, TV programme, slides, video etc according to age or taste) and we all slept or gazed out of the window. They are the very stuff of learning, and the keywords which we shall associate with them are engagement, immersion, collaboration. They will have their problems, but as well as being the future of learning they are also the future of assessment.
FOOTNOTE While continuing to use this blog to record a view of information marketplaces and the players within them, I would also like to devote a regular item to looking at what I am increasingly calling the Post-Digital Information World. This does not mean that I think we shall renege at all on the digitalization of all forms of communication – just that once infrastructures are in place, and the majority of human society is connected to the network, it is conceivable that the next stages of development, while faster and even less supportive of current business models, will be different in type and style. The current debate about the future of email highlights this. More from me here later.