Here, Annie Murphy Paul explains the challenges that teachers face as learners and what research shows can be done to overcome them. The talk was addressed to a group of teachers and principals from the Common Core Lab of the New York City Department of Education, a group of New York City public schools that are thoughtfully working through the challenges of implementing Common Core standards. This process involves a lot of learning on the part of adults as well as students, so the theme of her speech was “teachers as learners.” This is one of the articles in the National Teacher Enquiry Network May Half Term Newsletter (sign up here).
Annie Murphy Paul is a book author, magazine journalist, consultant and speaker. This talk initially appeared on her own blog.
Thank you for that lovely introduction, and thank you all for the thoughtfulness and care with which you’re educating our children. I always say that teachers are the real experts on education, but nevertheless I’ll try tonight to contribute to our collective understanding of how teaching and learning work by drawing on some recent research in cognitive science and psychology.
My theme in this talk will be about teachers as learners—about the special challenges that arise when teachers themselves are asked to grapple with learning new information and new skills. I’ll quickly outline for you the four topics I’ll be discussing tonight, and then will jump right into the research and what it can tell us.
First, knowing what we know: I’m going to present research on how we can accurately assess our own knowledge and identify gaps and misconceptions.
Second, unlearning mistaken ideas: I’ll describe several empirically-supported ways to shed misconceptions that can get in the way of learning.
Third, lifting the “curse of expertise.” I’ll explain why experts in a subject (that’s you!) sometimes have difficulty conveying their knowledge to novices, and will present several methods for eliminating the expert’s “blind spot.”
And lastly, I’ll talk about the benefits of engaging in a “cognitive apprenticeship.” This is a practice in which the age-old method of apprenticeship to a trade is adapted to more abstract, less hands-on knowledge work.
So, first, knowing what we know—which is harder than it sounds.
Research has shown over and over again that we are not very good judges of how effectively we’re learning new information, or how accurately we’ll remember it. This means we may stop the learning or training process prematurely, before new material is truly absorbed, and it means we may be in for an unpleasant surprise when we realize—when it’s time to put that knowledge and those skills into action—that we didn’t know as much as we assumed.
How can we achieve an accurate sense of how well we’re learning? Science has identified four strategies.
First: Wait a while. We often base our judgments of how well we’re learning on how easily we can call the learned information to memory. This cue may be misleading, however, if we retrieve the material while it’s still fresh in our minds. Hold off a day or two, or even just a few hours, and then check how well you know it. Psychology professor Bennett L. Schwartz reports that “testing oneself—not immediately after studying, but after a meaningful delay—greatly increases the accuracy of JOLs.” (JOLs is psychologist-speak for “judgments of learning.”)
Second: Put notes and books away. Another cue we use to assess how well we’ve learned new information is its “ease of processing”—how easy it seems to understand and remember in the first place. One habit many of us fall into is checking our preparedness by “looking over” notes or other written materials. But a study led by Purdue University professor Jeffrey Karpicke finds that such re-reading breeds overconfidence: “When students have material right in front of them, as they do when they repeatedly read, the material is immediately accessible and processing is fluent and easy.” Satisfied that the material seems familiar, we figure we know it—only to realize later that we don’t. We can keep ourselves from falling into this trap by putting away notes and books and trying to recall the material from memory.
Third: Mix it up. During training or studying sessions we often practice the same type of problem until we feel we’ve mastered it, then move on to the next kind of task. This can give us an overly confident sense that we know the material—until we’re flummoxed by real-world conditions, in which problems don’t come at us neatly arranged but rather randomly and unpredictably. We can replicate these realistic conditions during training and studying by mixing up tasks so that we don’t know which one is coming next. Psychologists Dominic Simon and Robert Bjork note that “learners who train under such conditions are less likely to terminate practice before achieving the level of learning that is the goal of such practice.”
And fourth: Gain expertise. Psychologist David Dunning notes that people who are beginners, or simply not very skilled in a particular domain, are “doubly cursed”: “Their lack of skill deprives them not only of the ability to produce correct responses, but also of the expertise necessary to surmise that they are not producing them.” In other words, when we’re bad at something we don’t even have the knowledge to know how bad we are. He has found that for students beginning study in a discipline, there is a weak relationship between what they believe their level of understanding is and their actual exam performance—but that this connection grows stronger as students become more advanced. To avoid overconfidence, heed the words of Confucius, the ancient sage: “Real knowledge is to know the extent of one’s ignorance.”
So, second: unlearning mistaken ideas.
The psychological study of misconceptions shows that all of us possess many beliefs that are flawed or flat-out wrong—and also that we cling to these fallacies with remarkable tenacity. Studies from cognitive science and psychology have looked at ways to actively disabuse ourselves or others of erroneous conceptions. Although much of this research concerns misguided notions of how the physical world works, the techniques it has produced can be used to correct any sort of deficient understanding.
The most important thing to realize is that just telling isn’t enough. Most methods of instruction and training assume that if you provide people with the right information, it will replace any mistaken information listeners may already possess. But this turns out not to be so. Especially when our previous beliefs (even though faulty) have proved useful to us, and when they appear to be confirmed by everyday experience, we are reluctant to let them go.
Donna Alvermann, a language and literacy researcher at the University of Georgia, notes that in study after study, learners “ignored correct textual information when it conflicted with their previously held concepts. On measures of free recall and recognition, the learners consistently let their incorrect prior knowledge override incoming correct information.” It’s what our mothers called “in one ear and out the other.” Science has identified three ways to make that new information push out the old.
First, highlight the mistaken notion. The simplest way to correct mistaken notions is to point them out as the accurate information is being presented. In a 2010 article in the International Journal of Science and Mathematics Education, researcher Christine Tippett offers an example from a science book for children:
“Some people believe that a camel stores water in its hump. They think that the hump gets smaller as the camel uses up water. But this idea is not true. The hump stores fat and grows smaller only if the camel has not eaten for a long time. A camel can also live for days without water because water is produced as the fat in its hump is used up.”
Note the three-part structure: the misapprehension is described, declared false, and replaced by an accurate version. Although such “refutation text” is very effective in debunking misconceptions, Tippett notes, it’s rarely used in informational books for children or in textbooks for older learners.
Second, issue an advance alert. For more deeply embedded beliefs that resist simple clarification, teachers, managers and other leaders can ask people to “activate” these prior beliefs, then instruct them to attend carefully to ways in which the correct explanation differs from their current conviction.
For example, Donna Alvermann and a co-author conducted an experiment in which students in an introductory physics class were asked to draw, and then explain, the path a marble would take if shot from a tabletop. The investigators’ instructions contained this advice: “If you thought that the path the marble would take would be straight down, straight out and then straight down, or straight out and then curved down, your ideas may be different from what the laws of physics would suggest. As you read the following text, be sure to pay attention to those ideas presented that may be different from your own.” The students who were “forewarned” with these instructions, the authors note, “showed marked improvement in learning information that conflicted with their existing knowledge.”
And third, create a confrontation. For the most tenaciously-held beliefs, it may be necessary to stage an intervention. In a 2002 article in the American Journal of Physics, researchers from the University of Washington note that “students often finish a standard introductory course or an advanced undergraduate course on relativity with some fundamentally incorrect beliefs.”
It’s frequently not enough for instructors to point out the discrepancy between learners’ convictions and the way things actually work, they note; learners have to perceive this discrepancy themselves, at which point they’ll be motivated to resolve it. The Washington researchers designed tutorials in which students were led to confront the fact that they held two mutually-exclusive ideas, one mistaken and one correct (in this case, about the concept of time in special relativity). The students then were helped to discard naive beliefs and fully embrace scientific ones.
The key is creating an uncomfortable sense of cognitive dissonance; only then are we willing to trade our private versions of reality for something that looks more like the real world.
Now, on to our third topic: lifting the curse of expertise.
There’s a truly stunning statistic that comes courtesy of Ken Koedinger, a professor of human-computer interaction and psychology at Carnegie Mellon University. He notes that experts can articulate only about 30 percent of what they know. This is a problem when designing courses, he observes, because the experts creating them often can’t adequately explain what they know to the novice learner.
This phenomenon is called the “curse of expertise.” Research has identified four practical strategies that can help us lift this curse and share our knowledge effectively with others:
First, use data. Ken Koedinger, the CMU professor, also notes that designers of online courses now have a wealth of objective data on what learners find difficult to understand and master. This information, gathered with every keystroke the students make while proceeding through the courses, removes the bias that experts have (“But that’s so easy!”), revealing precisely where novices’ difficulties lie. You can use data, too, by setting aside your assumptions about what’s easy or hard and looking at the evidence instead.
Second, remind yourself of your own experiences as a learner. Experts’ judgments about their field are colored by the “availability heuristic”: that is, the memories that are most recent and thus most available to them are not memories of struggle and confusion but memories of ease and understanding. Prompting ourselves to remember in detail what it was like when we first started out can make the beginner’s mindset more accessible to us. A study led by psychologist Roger Buehler, for example, found that asking computer programmers to recall their own experiences as learners led them to make more accurate estimates of how long it would take a novice programmer to write a new program.
Third, draw up a list of the problems learners might face. As psychologist Tom Stafford notes, “learning makes itself invisible”—it subtly but thoroughly changes our perceptions and our judgments so that it’s hard to see how much we know, and how much others don’t. In particular, we “anchor,” or base our assumptions, on our own experiences as an expert, and then fail to make sufficient adjustments for the very large gap between us and the inexperienced. Generating an explicit list of the hurdles that novices must surmount, psychologists Patrice Engle and J. Bradley Lumpkin found, helps experts develop a more realistic sense of the challenges beginners face.
And fourth: break it down, then break it down again. As we gain expertise, tasks that were once a jumble of apparently unconnected steps become organized into simple and efficient mental patterns (“Just do this, and then this, and then you’re done”). The beginner, however, must still labor over each detail. We can help novices attain the mastery we enjoy by analyzing our own knowledge and breaking it down into steps—even “microsteps,” or tiny increments of knowledge—and making sure they understand each one. Once that’s been achieved, we can then help learners assemble the discrete steps into the streamlined mental models we ourselves use.
Now, what if you’re the novice? Ask your trainer or mentor to adopt one or more of the strategies above. Or find someone who is just a little more advanced than you are to explain it. Research shows that people with “intermediate” knowledge can often be more helpful to the beginner than experts.
Lastly tonight, I want to tell you about the benefits of engaging in a cognitive apprenticeship.
For centuries before the rise of educational institutions, everyone learned on the job, through formal or informal apprenticeships. An aspiring blacksmith learned his trade by working alongside a master craftsman; a dressmaker-in-training performed increasingly complex tasks under the tutelage of an experienced seamstress. But much of today’s work, of course, is less concrete than hammering an anvil or cutting a bolt of fabric; it’s social, emotional and intellectual labor, often carried on inside a person’s own mind.
In a landmark article published more than a decade ago, cognitive scientist Allan Collins and his coauthors John Seely Brown and Susan Newman gave us a new way to think about this kind of contemporary learning: novices, they wrote, can engage in a cognitive apprenticeship. Like a traditional apprenticeship, this form of training pairs a rookie with a worker who’s far more advanced, but Collins and his colleagues adapted the older custom to the new needs of executives, managers, salespeople and other professionals who work with their heads rather than their hands.
As they describe it, the cognitive apprenticeship proceeds in three steps. First, the master models the skill for the apprentice. Second, the master coaches the apprentice as he or she attempts to execute the skill. And third, the master “fades” or pulls back as the apprentice is increasingly able to work independently. Over the course of this cycle, the apprentice learns to identify and correct mistakes, and to integrate his or her burgeoning knowledge and skill into a smooth, coordinated performance.
So far, this sounds a lot like how things were done in the olden days—but as Collins writes, “Applying apprenticeship methods to largely cognitive skills requires the externalization of processes that are usually carried out internally.” That means that the modern-day master and apprentice must be continuously communicating as they work side by side.
Collins prescribes two specific types of talk: in the first, the master and the neophyte take turns explaining what they’re doing as they do it. This alternation allows apprentices “to use the details of expert performance as the basis for incremental adjustments to their own performance,” Collins writes.
The second approach Collins calls “abstracted replay”: that is, after a task has been performed, the master offers a detailed commentary on what just happened (sometimes augmented by the actual replay of video taken during the task). During the recap, the more experienced member of the pair recounts what would have been his or her internal dialogue so that the less-experienced participant can hear it—and, in time, draw that dialogue inward as well.
I think the cognitive apprenticeship model is an apt one for what you all are doing: you are at the same time masters of your craft, and also apprentices who are always learning, always improving. I wish you all the best in that essential undertaking. Please email me your questions and your comments at firstname.lastname@example.org; I’d love to hear from you. Thank you again for all you do.
You can sign up for free Associate membership of the National Teacher Enquiry Network for more articles like this using the form below: