This is an updated version of a blog post which I originally submitted as part of my MSc in E-learning at the University of Edinburgh in Spring 2007.
Anyone who attends e-learning conferences (or keeps up with the chatter on the blogs) will have found it difficult to avoid references to 'digital natives'. Other labels are also popular ('Gamers', 'Gen Y'), but 'digital native' is perhaps the most common term for the people currently entering the workforce.
But what are they and where does the term come from?
The term was coined by the author Marc Prensky, who used it to distinguish between people who have grown up using interactive multimedia and those who are having to 'learn' how to interact with and use services like YouTube, Google Reader and the dizzying array of social web services like Facebook, Orkut and MySpace.
'Today's students – K through college – represent the first generations to grow up with this new technology. They have spent their entire lives surrounded by and using computers, videogames, digital music players, video cams, cell phones, and all the other toys and tools of the digital age. Today's average college grads have spent less than 5,000 hours of their lives reading, but over 10,000 hours playing video games (not to mention 20,000 hours watching TV). Computer games, email, the Internet, cell phones and instant messaging are integral parts of their lives.
It is now clear that as a result of this ubiquitous environment and the sheer volume of their interaction with it, today's students think and process information fundamentally differently from their predecessors. These differences go far further and deeper than most educators suspect or realize. "Different kinds of experiences lead to different brain structures," says Dr. Bruce D. Perry of Baylor College of Medicine. As we shall see in the next installment, it is very likely that our students' brains have physically changed - and are different from ours - as a result of how they grew up. But whether or not this is literally true, we can say with certainty that their thinking patterns have changed. I will get to how they have changed in a minute.'
My reading of Prensky's article on digital 'immigrants' and 'natives' was thought-provoking in two ways: firstly in that it seemed to confirm some long-held suspicions about current educational models and secondly because it proposed a new division of learners which I'm not entirely sure I'm comfortable with.
Prensky's argument that 'digital natives' are not being well served by traditional pedagogies is one part of a larger argument which has been bubbling up within the zeitgeist for some time now. Any of us working in educational fields will be long familiar with the constant claims that school-children are more disengaged than ever before, with teachers complaining that learners are switched off, suffering from attention-deficit disorder, and lacking the basic ability to concentrate in class. This isn't new - Plato voiced similar complaints in the 4th century BCE.
What's notable now is that such complaints have become so vociferous in recent years that more and more people are suggesting that, rather than there being issues with the learners, there is in fact a fundamental problem with the system which is trying to educate them.
Some time back I stumbled across the work of John Taylor Gatto, who in his groundbreaking book 'The Underground History of American Education' suggested that the American educational system is designed as nothing more than a complex factory line for producing hordes of compliant individuals, whose sole purpose in life is to willingly submit themselves to processing by multi-national corporations in menial physical labour.
The exact details of how this project came to be are complex, but if we accept Gatto's arguments for a moment, a large problem becomes very clear. The Anglo-Prussian model upon which the US system was based was designed in an age of industrial manufacturing. That age is now gone - with today's school-leavers thrust into a global 'knowledge economy' which they are ill-prepared to compete in.
This notion has been surfacing in several places, from blogs to academic journals to mainstream media: in an episode from the seventh season of the TV show 'The West Wing', Democratic candidate Matt Santos (played by Jimmy Smits) rails against the US educational system because it is still based on the agrarian calendar.
As dead as Gagne
I have no difficulty in accepting that some older pedagogical models may have had their day in the sun, but I'm concerned that such a rush to embrace the future may marginalise those of us who do not fit comfortably into either of Prensky's two categories. For example, Donald Clark (former CEO of the Epic e-learning company - where I worked for a year) posted a blog post in February 2007 in which he stated that the increase in the use of the web in education is 'killing Gagne dead'. The Gagne he refers to is Robert Gagne and his 'Nine Events of Instruction', probably the most often used instructional design (ID) model for producing 'chapters' of digital e-learning content.
Clark argues that the increased accessibility of content via the web (notably video content) is knocking such 'guided' models out of the water entirely. To an extent I agree with him - I believe that many digital learning solutions are designed not with the learner in mind but rather the tutor, accommodating perhaps the tutor's preferred learning styles rather than the learners'.
However, a white paper from Epic (which Donald, as you can see from the comments below the blog post in question, is at pains to point out he did not write) makes a good case for using different ID models with different audiences. Specifically, it suggests that 'entry-level' audiences benefit from more guided models (of which Gagne's is an excellent example) and that more free-form models of self-guided learning (such as the web facilitates) work better for more senior and experienced learners - who, crucially, have the time and the technical skills to enable such learning to occur.
Pedagogy or not?
My own experiences would seem to back this up: I create digital learning solutions for a range of audiences, all of them in the Police services of the UK. Unbelievably, entry-level learners in the police force have no protected learning time - rendering any offer for learners to learn in their 'own way' redundant. This, coupled with the fact that the learners we design for are acquiring sensitive, challenging and highly legislation-bound material, behoves us to provide a clearer line of guidance in any learning materials we produce. I would prefer that we lived in a world which allowed trainee Police officers exploratory, free-form learning experiences, but the time constraints, compliance requirements and complexity of the information rule this out - at least for entry-level learners. Digital natives or not, pedagogy (of some form) is required for audiences such as this.
I can't help but feel that Prensky's narrow division of learners into 'natives' and 'immigrants' is reductive, binary and perhaps unhelpful. Certainly a 12-year-old will have different wiring (as a result of the neuro-plasticity which Steven Johnson alludes to in his book 'Everything Bad Is Good For You') than a 32-year-old such as myself, but I'm not entirely ignorant of new technologies, and in some rare instances I can still wow my teenage nephews with what seems to them like arcane knowledge of the web and its myriad services. On the flip side, put them up against me in a PS2 shoot-em-up and they'll annihilate me. I'd like to think that my nephews and I fit somewhere in between 'native' and 'immigrant', and that none of us should be pigeon-holed into one educational model based on our age and the hours we've put in on an Xbox.
Prensky's article on 'Digital Natives, Digital Immigrants' (pdf)
'Did you know?' - explaining the 'digital native'.