As soon as you give something a name on the web, anti-matter appears and the original ideas get lost in the welter of abuse which is web discourse. The word “gamification” is a classic example. Some clever fellow clearly felt that this coinage gave dignity and grandeur to the process of using game theory as a means of helping learners in all walks of life to find greater pleasure and more effective learning in acquiring skills or attributes needed for their advancement. As a result there fell upon his head a posse of academics concerned to create research around the idea that playing games turns people’s brains soft, fails to prepare them for the real world (no games played there?), and indeed that game theory was an elaborate entrapment created by the enemies of democracy and free speech to undermine Western Civilization as we know it today … What rubbish!

The first time I encountered teachers and designers building serious gaming scenarios to help learners learn was in the late 1990s. “Gamification”, according to its wiki, http://gamification.org, has been in the bloodstream since 2004. If it has taken Farmville and Angry Birds and the Xbox to awaken some people to the pervasive presence of game theory within all of our thinking about the way we learn, then they stand convicted of not living in the twenty-first century. Gaming is now tightly wrapped around the way we learn: the problem is that we still do not do it consistently, in large enough contexts, to create ultimate learning value. People who call themselves publishers, information service solution providers, content developers and the like still have the notion that the game is something you add to the mix to lighten the load, provide some variety, change the pace or overcome a tricky and boring learning essential. But what if gaming were the core of our learning, the methodological base for instruction and measurement? What if it were the package that replaced the training manual and accomplished its assessment as well as handled its updating? What if, as much biological evidence demonstrates, games are the way we learn and we are just now returning to a full recognition of what that means?

Sitting in an armchair in the City Lights bookstore in San Francisco one foggy day in June 2007, I opened a copy of McKenzie Wark’s Gamer Theory, published that year as Version 2.0 of GAM3R 7H30RY, a networked book hosted online by Bob Stein’s Institute for the Future of the Book. Here is a sample: “Here is the guiding principle of a future utopia, now long past: ‘To each according to his needs, from each according to his abilities’. In gamespace, what do we have? An atopia, a senseless, placeless realm where quite a different maxim rules: ‘From each according to his abilities – to each a rank and score’. Needs no longer enter into it. Not even desire matters. Uncritical gamers do not win what they desire: they desire what they win. The score is the thing. The rest is agony.” (para 021). Is this different to what you thought? Is it closer to passing that test, completing that continuous development assignment, getting those SATs, or satisfying all of those humiliating hurdles placed in the way of forward progress by those who have already progressed far enough forward not to be troubled by them any more? If you say “yes” to any of these questions then you are in danger of joining me on a dangerous road – towards a future for learning dominated by gaming.

But we are in good company. That hugely serious player, SAP, employs Mario Herger as its Global Head of Gamification (www.enterprize-gamification.com). MIT’s Media Lab spawned Scratch (http://scratch.mit.edu/) to create and test learning games for younger people, and Microsoft created Kodu (http://www.kodugamelab.com/), a programming environment designed to allow users to build their own games on the Xbox. And in most countries there is now a serious gaming industry, often with 10 to 15 years of experience behind it, mostly making serious games for user organizations, and unvisited and unblest by the publishers who should be its natural collaborators. Centres of excellence here in the UK include inventive survivors like Desq (www.desq.co.uk), the Sheffield-based developer with almost 15 years of intensive work around immersive experiences like DoomEd or the SimScience environment built for the Institute of Physics. Or look at Pixelearning (www.pixelearning.com) in Birmingham and its training environments, or the company created by its founder, Kevin Corti (SoshiGames – http://www.soshigames.com/, exploiting customer retention through social gaming). Then, around London’s Old Street Silicon Roundabout, see how many of the 800 start-ups are games related, like Michael Acton Smith’s hugely successful MoshiMonsters (http://www.moshimonsters.com/). As a director of CreatureLabs many years ago I recognize the DNA! The games thing is on the march, but the content businesses old-style are not yet aligned with it.

So let’s drop “gamification” if it is going to attract this sort of social backlash. Really, games for learning are not like that lesson on Friday afternoon when the teacher showed a filmstrip (younger readers can insert film-loop, film, TV programme, slides, video etc. according to age or taste) and we all slept or gazed out of the window. They are the very stuff of learning, and the keywords we shall associate with them are engagement, immersion and collaboration. They will have their problems, but as well as being the future of learning they are also the future of assessment.

FOOTNOTE  While continuing to use this blog to record a view of information marketplaces and the players within them, I would also like to devote a regular item to looking at what I am increasingly calling the Post Digital Information World. This does not mean that I think that we shall renege at all on the digitalization of all forms of communication – just that once infrastructures are in place, and the majority of human society is connected to a networked society, it is conceivable that the next stages of development, while they are faster and even less supportive of current business models, will be different in type and style. The current debate about the future of email highlights this. More from me here later.

“Keep it Simple, Stupid” – KISS – was a maxim I brought home from the first management course I ever attended, yet it has taken me years to find out what it really means. There are, clearly, few things more complex than simplicity, and one man’s “Simple” is another man’s Higgs Boson. So I was very energised to have a call last week from an information industry original who has been offering taxonomy and classification services to the information marketplace since 1983. When I first met Ross Leher in the late 1980s we were both wondering how far we would have to go into the 1990s until information providers recognized that they needed high-quality metadata to make their content discoverable in a networked world. Ross had sold his camera shop to take the long bet on this, but he worked at his new cause with a near-religious persuasion, as I realised when I went to see him in the 1990s at his base in Denver, Colorado. Denver at that time was home to IHS, whose key product involved researching regulatory material from a morass of US government grey literature. Denver people did metadata. It was a revolution waiting to happen.

So when I heard his voice on the phone last week my first emotion was relief – that he had not simply given up and retired to Florida – and then agreement. Yes, we were 15 years too early. And many of the people we thought were primary customers – the Yellow Page companies, the phone books, the industrial directories – are now either dead or dying, or in the trauma of complete technological makeover. Ross’s company, WAND Inc (www.wandinc.com), is now very widely acknowledged as a market-leading player in horizontal and multi-lingual taxonomy and classification development. They are the player you go to if you have to classify content, if you are in a cross-over area between disciplines (he has a great case study around taxonomies for medical image libraries), or if you have real language problems (“make this search work just as effectively in Japanese and Spanish”). What they do is really simple.

Your taxonomy requirement is going to start with broad terms that define your content and its area of activity. These can then be narrowed and specified to give additional granularity in any specific field. These classifications can be incorporated into the WAND Preferred Term Code, given a number, and used in a programmatic, automated way to classify and mark up your content (www.datafacet.com). Preferred terms can be matched to synonyms, and the codes can be used to extend the process to very many different languages. So a company whose listing was created in Spanish can be found alongside a Japanese outfit, as the result of a search made by a Chinese user working in Chinese.
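The mechanism is easy to see in miniature. Here is a sketch of the idea – preferred terms pinned to stable numeric codes, with synonyms and translations resolving to the same code. To be clear, the codes, labels and structure below are invented for illustration; they are not WAND’s actual data, codes or API.

```python
# Illustrative sketch only: the term codes, labels and structure are
# invented for demonstration, not taken from WAND's real system.

# A preferred term gets a stable numeric code; synonyms and translations
# all map onto that code, so classification survives language boundaries.
PREFERRED_TERMS = {
    4107: {"preferred": "Camera Shop",
           "synonyms": {"camera store", "photography shop"},
           "translations": {"es": "tienda de cámaras", "ja": "カメラ店"}},
    4220: {"preferred": "Medical Imaging",
           "synonyms": {"radiology imaging"},
           "translations": {"es": "imagenología médica", "ja": "医用画像"}},
}

# Invert the table once: any label, in any language, resolves to one code.
LABEL_TO_CODE = {}
for code, entry in PREFERRED_TERMS.items():
    labels = ({entry["preferred"].lower()}
              | entry["synonyms"]
              | set(entry["translations"].values()))
    for label in labels:
        LABEL_TO_CODE[label.lower()] = code

def classify(label: str):
    """Map a free-text label in any supported language to its term code."""
    return LABEL_TO_CODE.get(label.lower())

# A Spanish listing and a Japanese one land on the same code, so a search
# made in a third language can retrieve both.
assert classify("tienda de cámaras") == classify("カメラ店") == 4107
```

The point is that the code, not the label, is what gets attached to content: once that is done, cross-lingual search is just a lookup.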

And from synonyms we can extend the process to the extended terms themselves, and then map the WAND system to third-party schemes – think of UNSPSC, Harmonized Codes or NAICS, as well as those superficial and now dwindling Yellow Page classifications. WAND can isolate and list attributes for a term, and can then add brand information. All of these activities add value to commoditized data, and one would think that the newspaper industry at least would have been deep into this for 15 years. Yet few examples – Factiva is an honourable exception – exist which demonstrate it.
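That mapping step can be sketched in the same spirit – again a sketch only: the internal codes and the NAICS/UNSPSC pairings below are invented for illustration and are not real assignments.

```python
# Hypothetical crosswalk: the internal term codes and the NAICS / UNSPSC
# pairings are invented for illustration, not real scheme assignments.
CROSSWALK = {
    4107: {"NAICS": "443142", "UNSPSC": "45121504"},  # camera retail (illustrative)
    4220: {"NAICS": "621512", "UNSPSC": "42201700"},  # medical imaging (illustrative)
}

def to_external(term_code: int, scheme: str):
    """Translate an internal preferred-term code into a third-party scheme's code."""
    return CROSSWALK.get(term_code, {}).get(scheme)

# One internal classification can be published simultaneously under
# several external schemes without re-tagging the underlying content.
assert to_external(4107, "NAICS") == "443142"
assert to_external(4107, "UNSPSC") == "45121504"
```

Classify once against the internal scheme, and every external code system becomes a table lookup rather than a fresh editorial exercise.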

Not the least interesting part of Ross’s account of the past few years was the interest now shown by major enterprise software and systems players in this field of activity. Reports from a variety of sources (IDC, Gartner) have highlighted the time being wasted in internal corporate search. Both Oracle and Microsoft have metadata initiatives relevant to this, and it still seems to me more likely that Big Software will see the point before the content industry itself. With major players like Thomson Reuters (Open Calais) deeply concerned about mark-up, there are signs that an awareness of the role of taxonomy is almost in place, but as the major enterprise systems players bump and grunt competitively with the major, but much smaller, information services and solutions players, I think this is going to be one of the competitive areas.

And there is a danger here. As we talk more and more about Big Data and analytics, we tend to forget that we cannot ignore the component added value of our own information. We know that our content is becoming commoditized, but that is not improved by ignoring now-conventional ways of adding value to it. We also know that the lower and more generalized species of metadata are becoming commoditized; look for instance at the recent Thomson Reuters agreement with the European Commission to widen the ability of its competitors to utilize its RICs equity listings codes. This type of thing means that, as with content, we shall be forced to increase the value we add through metadata in order to maintain our hold on the metadata – and content – which we own.

And, one day, the only thing worth owning – because it is the only thing people search and it produces most of the answers that people want – will be the metadata itself. When that sort of sophisticated metadata becomes plugged into commercial workflow and most discovery is machine to machine and not person to machine we shall have entered a new information age. Just let us not forget what people like Ross Leher did to get us there.

 
