Mar 12
Evidence-Based Education
Filed Under Blog, Cengage, eBook, Education, eLearning, Industry Analysis, internet, Pearson, Publishing, Uncategorized, Workflow
Paradox: nothing is more measured, assessed or examined than education, but we still seem to know remarkably little about how people “learn” in the full sense of the word. And while the world is full of learned academics with impressive qualifications in “cognitive processing” and the like, try to build a “learning system” for humans and you encounter immediate design problems. Indeed, it is easier to teach machines to learn. So each generation seems to approach the problem – that each of us learns differently, under different stimuli and at different ages – in a different way. Once it was a matter of coursework and textbook. In this age, the Age of Assessment, satisfactory proof of learning is accomplished by testing. Never mind that the learner may have no resulting ability to deploy his or her learning in any context other than a test; we are developing people who can jump immediate hurdles but may not be able to navigate the great oceans of life in front of them. This applies to schools and universities, but also to the rapidly growing vocational and training sectors.
Over in the medical environment, we have had evidence-based practice for over a decade. This is now becoming a discipline in its own right, combining systematic review of the literature (for example, the Cochrane Collaboration) with statistical analysis, meta-analysis and risk-benefit analysis to produce, in combination with the patient record, some really effective results in diagnostic terms. These are now widely deployed in different configurations by information service solution providers like Elsevier, WK Health and Hearst Medicine. As genetic analysis and custom drug treatment become more common, this will no doubt develop further, but even as we have it today, the information service players are fully plugged into the system. How different to this is education!
Despite the huge collection of indicative statistics, there is still no feedback loop in education which tells teachers what works with particular learning profiles. As they develop and test digital learning environments, private sector learning systems developers (not just systems houses but content developers too) are getting significant feedback on the efficacy of their work. Schools store an ever-growing amount of performance data, and much of it can be related to learning systems. Examination boards have yet more. (Digression: my most depressing moment in education this year – going to a parents’ evening with a sixth former studying classical civilisation. Question to teacher: what do you recommend as additional reading? (I have shelves full of possibilities.) Answer: we do not recommend reading around the subject – having several interpretations only confuses people and inhibits their ability to secure high pass grades!) And yet all of this content or evidence is disaggregated, never plumbed for its significance for the learning experience, and there is no tradition of building ideas about what input might secure learning gains – just give the learner another diagnostic test!
These notes were sparked in the first place by the announcement last month of the creation of a Coalition for Evidence-Based Education by the Institute for Effective Education at the University of York. I also know of the TIER project in the Netherlands (involving the Universities of Amsterdam, Groningen and Maastricht) and have great respect for the ongoing work of Better magazine, created by the Johns Hopkins Center for Research and Reform in Education. But all of these seem to me as much concerned with applying evidence to changing policy at government or school administration level as they are with developing practitioner tools. And they exemplify something else – there is not a publisher or education solutions supplier anywhere near any of them. True, no one ever field-trialled a textbook (though I once did this with a UK Schools Council course in the 1970s called “Geography for the Young School Leaver” – and it had dramatic effects on the presentation and construction of the learning journeys involved). Yet here we are in the age of Pearson’s MyLab or Nature Education’s Principles of Biology online learning experience: the age of iterative learning devices, wired for feedback and capable both of recording anonymized statistical performance data and of giving diagnostic input to an individual user or teacher on what needs support and reinforcement in the learning process. Yet I know of no developer who trades use for feedback by co-operating with governments and schools in trialling, testing and developing new learning environments. And given that these environments are iterative – they change over time as refinements are made and non-statistical feedback is gathered – I know of no schemes able to demonstrate the increasing effectiveness of their learning tools.
ELIG (the European Learning Industry Group) has issued members of its marketing board, like myself, with an urgent requirement to uncover good case studies which demonstrate the efficacy of learning tools in practice. I can find plenty, but they are all based on the findings of the supplier. I can even find some where a headmaster says “exam results increased X% while we were using this system” – but these never indicate whether the system was the only change that could account for the improvement. If I were a teacher with a poor reader with real learning difficulties, where would I go for the equivalent of an MD Consult or UpToDate medical review – a way of defining my pupil’s problem, relating it to success with others who had similar problems, and getting feedback on which learning systems worked for them? The answer is that you do not go anywhere, since education, one of the loneliest and most secretive jobs in the professional world, is still not quite prepared to enter the digital age with the rest of us. And its suppliers, sharing something of that culture, still operate in an isolated way that predates the new world of consolidation and massive systems development now beginning in this marketplace. And the Learner? Processed or Educated? It all depends on the feedback loop.