by Bryan Goodwin, Tonia Gibson, and Kristin Rouleau
What you'll learn in this book isn't new or faddish. It's based on carefully designed studies of learning reported in peer-reviewed publications that have been around for many decades. Some of it, in fact, dates back to the 1870s when an amateur scientist in Germany, Hermann Ebbinghaus, began a series of unusual experiments on a singular subject—himself.
Each evening, at the same hour, Ebbinghaus would sit alone in a quiet room and pull from a box small sheets of paper, each bearing a different nonsense syllable drawn from a list of 2,300 he had carefully created (e.g., mox, fim, tib). After writing down each syllable in a notebook, he'd start a metronome and, following its rhythm, recite each syllable on the list in a monotone voice at equally spaced intervals. Afterward, he'd close his notebook and attempt to recite the list from memory, over and over, until he could recall every syllable.
From this lonely and tedious work, Ebbinghaus arrived at many important insights into the inner workings of our minds, including our "forgetting curve" (how quickly we forget new learning) and ways to strengthen memory (Boring, 1957). Perhaps most important, through his exacting and methodical experimentation, he began to turn what had previously been mostly just philosophical musings about the mind into a scientific pursuit, paving the way for study and exploration of how we learn.
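For readers who want a more concrete picture of the forgetting curve, a commonly cited simplification (a later formalization often attached to Ebbinghaus's data, not a formula presented in this book) models retention as exponential decay:

$$ R(t) = e^{-t/S} $$

Here $R(t)$ is the proportion of material still retained after time $t$, and $S$ is a stability parameter reflecting how strongly the memory was encoded. Retention drops steeply at first and then levels off; rehearsal and retrieval effectively increase $S$ and flatten the curve, which is the intuition behind the practice strategies discussed later in this chapter.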
Starting in the 1950s, cognitive scientists developed what's commonly referred to as the information processing model, which uses the computer as an admittedly imperfect metaphor for what happens to information once it enters the brain. Basically, the information processing model attempts to map the long, perilous journey—full of twists, turns, and dead ends—that all new information must take before finding a home in our long-term memories. As you'll discover, the human brain is both shockingly powerful and maddeningly inconsistent. Sometimes it forgets things that its owner wishes desperately to remember (What's the name of my boss's husband? Where did I park the getaway car?). Sometimes it remembers things that its owner wishes desperately to forget (an unkind word or an annoying jingle).
In many ways, the challenge of learning is rooted in a fundamental paradox of the human brain. Although it can learn and retain staggering amounts of information, it's also incredibly adept at ignoring and forgetting information, which, for the most part, is a good thing. If we paid attention to every stimulus in our environment, we'd be nervous wrecks with our heads on a swivel, trying to track everything happening around us. And if we couldn't forget anything, we'd grow progressively unable to cope with the world as our brains clogged with useless information.
In fact, too much memory can be a burden—even debilitating. Consider the curious case of Jill Price, who at first blush seems to possess a superpower: the ability to never forget. Now in her early 50s, she can recall events from her teens as if they occurred yesterday. Ask her what she was doing on August 29, 1980, and she'll tell you, "It was a Friday. I went to Palm Springs with my friends, twins Nina and Michelle, and their family for Labor Day weekend."
The first time she heard Rick Springfield's "Jessie's Girl"? March 7, 1981. She was driving in a car with her mother, who was yelling at her. The third time she drove a car? January 10, 1981. It was a Saturday. She was at "Teen Auto. That's where we used to get our driving lessons from" (McRobbie, 2017).
Price is among a group of rare people who have been clinically tested and found to have hyperthymesia, or HSAM (highly superior autobiographical memory): the ability to recall their own lives in abnormally vast detail. They can remember minutiae from years earlier, such as every meal they've eaten, every phone number they've written down, and every song they've heard on the radio. Sounds awesome, yes? But in reality, not so much. Price will tell you that having a "total recall" memory creates a swirling mess in her head and leaves her teetering on the edge of sanity.
My memory has ruled my life. Whenever I see a date flash on the television (or anywhere else for that matter), I automatically go back to that day and remember where I was, what I was doing, what day it fell on, and on and on and on and on. It is nonstop, uncontrollable and totally exhausting. … Most have called it a gift, but I call it a burden. I run my entire life through my head every day and it drives me crazy! (Parker, Cahill, & McGaugh, 2006, p. 35)
Recent studies in neuroscience are finding that our brains appear actively and purposefully to forget most of what we learn—continually pruning and clearing out old and unneeded memories (often as we sleep) to allow us to focus on more important information. As it turns out, forgetting is as important to our memory systems as remembering (Richards & Frankland, 2017). Forgetting extraneous information simplifies our memories, decreasing the static hiss of the noisy, information-rich worlds in which we live and allowing us to focus on the pertinent details needed to make better decisions.
So, for the sake of our mental health and happiness, it's good that most of us ignore and forget the vast majority of what we experience. For learning, though? Not so great. As educators, we are locked in a constant battle with our students' brains, which by design are programmed to ignore or forget most of what's in their environment, including what we attempt to share with them in our classrooms. Therefore, let's take a look at the stages of memory, followed by the phases of learning with which they intersect, to build a mental model of the learning system.
Before memories can be created, we must notice some initial information with one or more of our five senses—sight, hearing, touch, taste, smell—or our related senses of movement and balance. Our nerves convert these stimuli into electrical signals that travel along our body's nerve fibers in milliseconds, racing with incredible urgency to arrive in our brains where—surprise!—the vast majority of stimuli are promptly discarded in less than a second.
Why does this happen? Well, there's simply too much going on around us every second of the day for our minds to remember it all in full detail. Our bodies are designed for survival in a hostile environment, and to survive, our early ancestors primarily needed to pay attention to and remember the really important stuff—things that kept us safe from predators, nourished, and sheltered. For example, it was important to be able to ignore our hunting companion prattling on about his digestive issues and narrow our focus down to a tiny pinhole of stimuli: a lion making its way toward us through the savannah grass while licking its chops. The ability to filter distractions down to a pinhole was a good thing—it was the difference between living to tell the tale and being a lion's lunch.
Even now, hundreds of thousands of years later, most of what we sense throughout our day can be simply ignored. In fact, our brains' ability to filter out distractions (which I'm doing right now as I write this paragraph on my laptop while sitting outdoors at my daughters' swim meet, surrounded by screaming kids, loud music, towels flapping in the breeze, and people walking by, to name but a few stimuli) is often essential in helping us focus on the stimuli that are most important to us at the moment. Yet for teachers, this means we must ensure students focus their "pinholes" on what we want them to learn.
The next time you walk into your school or office, try to observe and remember everything you're seeing, hearing, and feeling for as long as you can: the color and shape of every car in the parking lot, the conversations of people you pass by, the feeling of a light breeze or sun on your face, the clothes and facial expressions worn by every person you see. This is the sensory register, and it's impossible to hold on to every single input all at once and for any length of time. Only a tiny fraction of what registers gets retained. And as we'll see, "rules" in our brains form something of a pecking order for which information we pay attention to and which we ignore.
Stimuli that make it through the filters of our sensory registers and are deemed important enough can begin moving along a journey through three phases of memory: immediate, working, and long-term. This is true regardless of the type of memory in play, declarative or procedural, although the area of the brain that leaps into action varies. Declarative memory, which is the recall of facts, information, and personal experiences, is stored across the neocortex—the large, gray, wrinkly outer part of the brain—and deeper down, inside the hippocampus and the amygdala near the center of your brain. It is further divided into episodic memory (recollections of events we personally experience) and semantic memory (facts and information we have learned).
Procedural memory refers to the memories that allow us to repeat physical actions and skills, such as how to ride a bicycle or draw a portrait. These performance-based memories are stored in the basal ganglia and cerebellum, which coordinate our movement, balance, and equilibrium (Queensland Brain Institute, n.d.). Experiments by neuroscientists have found that our procedural memories, once established, are very strong and far less likely to fade over time than our declarative memories, which is why we can remember how to ride a bicycle years after our last pedal around the block (Suchan, 2018).
Those lucky few sensory inputs that make it through our initial filters are carried along by electrical signals to neurons, which produce a biochemical charge that records, or encodes, the impression of the stimulus. Each neuron then passes this code along to a thousand other neurons to which it is connected, and each of those neurons can in turn help store and recall multiple memories (Reber, 2010). Later, when you try to recall a particular memory, that group of neurons fires the same biochemical code associated with it, re-creating the memory in your mind (Mastin, n.d.).
Our initial, immediate memory is short term, lasting only about 30 seconds. It also has limited capacity, as Harvard psychologist George Miller discovered in the 1950s. Through a series of experiments, he found that our brains can actively focus on and work with approximately seven bits of information at a time (Miller, 1956). The bits of information behind what Miller called the magical number seven, plus or minus two, range from small, singular items, such as a letter of the alphabet or a single number, to chunks of information the brain is able to group together because of some connection, such as words or mathematical functions.
Try juggling more than seven of these bits at a time, and most of us will begin to mentally stumble and forget some data points, letting some of the information fall to the floor, so to speak (Harvard University Department of Psychology, n.d.). We can thank Miller for our relatively short phone numbers, as it was his research that persuaded phone companies around the world to limit local phone numbers to seven digits. A reexamination of Miller's research (University of New South Wales, 2012), however, suggests the magic number may be closer to four, because what we really seem to be doing when we encode a seven-digit number, such as 6458937, is to break it into four shorter chunks, such as 64, 58, 93, and 7.
Between four and seven items at a time in our immediate memory—doesn't sound like much, does it? But think about the student activities taking place in your classroom on a daily basis, and you'll see that students must constantly employ immediate memory.
These initial memories can't get too comfy—this stage lasts only about 30 seconds because, again, the brain must weed out some information. It can't store everything. But if the stimulus is deemed important enough, it can be retained long enough to advance into the next stage, working memory.
Here's where volition comes in. If we consciously focus on what's in our immediate memory (by listening to someone as they're talking or making marginal notes in a book, for example), we cause our neurons to repeat their chemical and electrical exchanges, which in turn increases the efficiency and strength of their communication (Queensland Brain Institute, n.d.). It's akin to creating a new path through a forest; as your feet press down on the soil and vegetation, the path becomes more visible and easier to follow.
For simplicity's sake, this book merges two overlapping yet arguably distinct concepts: short-term memory and working memory. Although cognitive scientists still debate the exact relationship between these ideas (Aben, Stapert, & Blokland, 2012), we might think of the difference like this.
Short-term memory is our stream of consciousness—the sensory events (e.g., listening to a lecture, reading a book) and information (e.g., names, words, numbers) we hold in our attention at any given moment. When we apply mental effort to what's in our short-term memory—for example, manipulating, clustering, or connecting it with stored memories—we employ working memory.
Through brain imaging, scientists have found we appear to activate different parts of our brains when we shift from simply rehashing what's in our short-term memory (e.g., repeating a string of letters) to manipulating what's in short-term memory (e.g., alphabetizing that same string of letters). So we might think of working memory as short-term memory plus mental effort. As we'll see throughout this book, the key to learning almost anything is focusing mental energy on what's in short-term memory—that is, employing working memory. Rather than bouncing back and forth between these two similar ideas, we'll simply use the single, blended term short-term working memory.
But it takes more than just one or two walks along the same route in the forest to create an easily seen, easily followed path. Similarly, the pathways in our working memory don't last long, holding on to a recollection for roughly 5–20 minutes before the memory either decays or continues its journey to long-term memory.
Although this book mainly focuses on the brain's processing of new information, neuroscientists say that our working memory is also used when we activate old memories and bring them front of mind, so to speak. As with the creation of new memories, the more often we recall and think about these existing memories, including combining them with new sensory inputs and information being learned, the more efficient our neural pathways become, strengthening the memories in our minds a bit more each time.
If we decide to revisit the information often enough through repetition, rehearsal, contextualization, or application, we can usher it into its ultimate destination: long-term memory. The brain creates more, and larger, dendrites (extensions of the nerve cell) to store these memories (Young, 2015).
Not only that, but activating neurons with different sensory inputs related to the same concept can strengthen memories and build more connections and pathways between related memories, creating broader understanding. In other words, memory and knowledge about a subject—say, civil rights—can be strengthened by reading about civil rights, then also listening to or watching interviews with activists and historians who describe what took place, and visiting museums or places connected with the movement.
Effort is everything at this stage; the more we think about and experience a topic, the stronger the memories we create will be. As noted earlier, our brains actively weed out most memories. Researchers think sleep is critical to this process. While we slumber, the subconscious mind sorts and organizes the day's events, embedding the important bits and pieces as best it can and building connections among related memories. It also prunes what it regards as useless memories—in particular, memories we haven't stored strongly or connected with other learning. In the morning, we wake up refreshed, ready to load up with a new day's sensory inputs.
Think back to my initial question in the Preface. Can you now sketch out the basics of how memory works—how knowledge enters our brains and is saved for future use?
This is, of course, a summary of memory formation, and a brief one at that—the equivalent of waterskiing over something we could easily scuba dive into.
As we noted earlier, sometimes "brain science" can become too granular and impractical to help educators. We don't really need to know about neuropeptides to be good teachers, but we do need solid mental models of how learning occurs so we can use tactics that help our students take hold of new information at just the right time and in the right manner, giving that knowledge the best possible chance of moving through the phases of memory.
So how can we as educators use our knowledge of these phases of information processing to ensure that our lesson planning and instructional delivery help our students' learning stick? By following a learning model, which arranges strategies for teaching and learning into a larger process for helping new knowledge travel through the three phases of memory in our students.
We must trigger these two key phases of learning in students' minds for new information to pass through the filters of their sensory registers and enter their immediate memories.
Become interested. The external stimuli that make it past the brain's mental filters tend to be of two varieties: those that stir emotions and those that arouse curiosity (typically in that order). Our brains default to ignoring almost everything else. What this means is that to start the learning process—to get information past our students' mental filters—we need to help them feel comfortable in their learning environment and then attach some form of emotion (e.g., excitement, indignation, passion) and/or intellectual stimulation to what they're learning that leaves them scratching their heads in wonder. We might, for example, pose a mystery to them: "Thousands of years ago, the wooly mammoth was the dominant creature in North America. So what happened? How could such a massive creature just up and disappear?"
Commit to learning. Being interested is vital but only gets us so far; to go beyond learning mere tidbits of information or discrete skills, we must take the next step and commit to learning more. As teachers, we can help students do this by presenting new knowledge and skills as part of a big picture that affects their lives and helps them set clear, challenging, yet attainable goals for their learning. In short, when it comes to learning, we need to help students answer the simple question "What's in it for me?" For example, we might help students see how learning why the mammoth went extinct connects to a modern crisis (e.g., the mass extinction of species around the world).
Once information begins tumbling around students' working memories, they must engage in these two phases of learning to begin encoding information, preparing it for long-term memory storage.
Focus on new learning. Once students are "thirsty" for new knowledge, they must acquire it by actively thinking about what they're learning. For example, they might participate in a question-and-answer session, engage in close reading of text, follow a process as it's modeled, visualize what they're learning by creating nonlinguistic representations of concepts, or take notes during a lecture. All these active learning processes, especially when used in combination, help knowledge soak deeper into the brain.
Make sense of learning. Due to the limitations of working memory, we must "chunk" learning into bite-size segments interspersed with opportunities to connect new learning with prior knowledge and cluster ideas together, which is how our brains store knowledge—as webs of ideas and memories. While knowledge remains in our working memory, we must "make sense" of it before the details fade. For example, we might help students group the various scientific facts, details, and insights about how the wooly mammoth went extinct into three big scientific theories: over-kill, over-ill, and over-chill.
At this point in the process, new learning is still at a crossroads; students' brains are primed to prune the information, discarding it onto a mental trash heap unless they engage in these final two phases of learning.
Practice new learning. To store learning in long-term memory, we must go on more than one date with it, so to speak. As it turns out, cramming seldom works. Rather, we're more apt to remember what we learn when we engage in distributed practice (spacing practice sessions a few days apart) and retrieval practice (being quizzed on, or quizzing ourselves on, new knowledge). Learning science shows that searching our memories for knowledge that's begun to fade rekindles those waning neural networks and strengthens memory. Therefore, giving students multiple opportunities to repeat, rehearse, and retrieve new knowledge and skills makes them more apt to commit new learning to memory.
Extend, apply, and find meaning. We've all likely experienced the frustration of struggling to jog our memory for an important bit of information. Often, what's going on in our brains when this happens is that we've stored the information but have too few neural pathways to retrieve it. This "use it or lose it" principle of learning suggests that students more readily retrieve knowledge when they develop multiple connections to it by, for example, associating it with multiple other pieces of information, digging more deeply into it, or using it to solve real-world problems. For example, we might encourage students to delve into the science and ethics of using DNA to bring the wooly mammoth back to life or investigate whether the forces that led to its extinction might now be causing the collapse of global honeybee populations.
In sum, we might visualize this entire process of learning as looking something like Figure 1.1. Together, the steps provide a simple six-phase model for student learning. Figure 1.2 provides more detail about these six phases, along with a practical toolkit for bringing them to life in your classroom that includes many evidence-based teaching practices from Classroom Instruction That Works (Dean et al., 2011) and The 12 Touchstones of Good Teaching (Goodwin & Hubbell, 2013). Bear in mind, you needn't use every tool with every lesson but, rather, should use your professional judgment to draw on them to design learning opportunities for your students.
Figure 1.2. Learning phases, information processing, guiding questions, design principles (learning science), how teachers guide learning, and the classroom toolkit

Phase 1: Become interested
Information processing: Stimuli in our sensory register catch our attention.
Guiding question: Why should students care?
Design principle: Emotional valence. Our brains have a "pecking order" for stimuli; we first pay attention to stimuli with emotional valence.
How teachers guide learning: Prime emotion.
Guiding question: What will spark student interest?
Design principle: Curiosity. After emotionally laden stimuli, our brains attend next to novel stimuli—the unexpected, the incomplete, the controversial, the mysterious, or gaps in our knowledge.
How teachers guide learning: Spark curiosity.

Phase 2: Commit to learning
Information processing: We determine whether stimuli are worthy of further attention.
Guiding question: What meaning will students find?
Design principle: Meaning and purpose. Our limbic (emotional) system is more powerful than our prefrontal (logical) cortex; thus, we must "feel" like learning.
How teachers guide learning: Give a why.
Guiding question: What will motivate students to learn? What will connect them to the learning?
Design principle: Connecting to personal interests and goals. Connecting learning to our own lives motivates and deepens learning. Students are more motivated to learn—and recall later what they've learned—when they set personal goals for learning.
How teachers guide learning: Set learning goals.

Phase 3: Focus on new learning
Information processing: We focus on new knowledge and skills while they're in our working memory.
Guiding question: What do I need to show and tell students? How will I help students visualize key concepts?
Design principle: Visual learning. Our brains process information more effectively when it's presented verbally and visually.
How teachers guide learning: Support visual learning.
Guiding question: What do I want students to think about?
Design principle: Active engagement. The only way to keep knowledge in our working memory is to think about it—to actively engage with new knowledge or skills.
How teachers guide learning: Engage in thoughtful learning.

Phase 4: Make sense of learning
Information processing: While new knowledge is in our working memory, we begin to cluster it and link it to prior learning.
Guiding question: How will I chunk learning and support information processing?
Design principle: Pausing and processing. Our working memories are limited in how much information they can hold at once (7 ± 2 items) and how long they go before "timing out" (5–20 minutes) and needing to process learning.
How teachers guide learning: Provide time to process.
Guiding question: What themes, categories, sequences, or links to prior learning do students need to make with this learning?
Design principle: Categorizing and clustering. Memories form in our brains as neural networks—complex webs connecting ideas; in short, we learn by connecting new learning to prior learning.
How teachers guide learning: Help students categorize knowledge.

Phase 5: Practice and reflect
Information processing: Repetition and retrieval help us store new learning in long-term memory.
Guiding question: What knowledge and skills must students commit to memory or automate?
Design principle: Spaced and interleaved practice. New learning is more likely to be retained when practice is spaced and reflects "desirable difficulties."
How teachers guide learning: Design and guide deep practice.
Guiding question: What feedback will I provide to guide deep learning?
Design principle: Reflecting on gaps in learning or skills. Repetition that adds new connections to learning, along with "discrepancy reduction," enhances storage and retrieval.
How teachers guide learning: Help students reflect on their learning.

Phase 6: Extend and apply
Information processing: Applying new learning in novel, meaningful ways supports retrieval.
Guiding question: What will I ask students to do with their knowledge?
Design principle: Transferring and applying. Memory storage and retrieval are two different functions; we retrieve learning better when we transfer it to new contexts and are more apt to transfer knowledge when we make our thinking visible.
How teachers guide learning: Help students apply learning to new challenges.
Guiding question: How will I (and my students) know they've mastered learning?
Design principle: Building mental models for critical thinking. Mental models that integrate declarative and procedural knowledge are essential for deep learning and critical thinking skills. Assessments should engage students in applying mental models and demonstrating critical thinking.
How teachers guide learning: Help students develop mental models and demonstrate deep learning.
As noted earlier, the labels for each phase of the learning model reflect not what you're doing as a teacher but, rather, what's happening inside students' minds as they learn. Though this may seem like a slight mental and semantic shift, it carries profound significance for how we plan lessons and reflect on and respond to student successes and struggles, which is the essence of professionalism—being able to apply expert knowledge to diagnose and solve problems.
We imagine some readers, especially those familiar with Classroom Instruction That Works, may wonder how the research-based instructional strategies in that book map onto these phases of learning. Figure 1.3 provides such a link, aligning the 26 strategies from Classroom Instruction That Works with the six phases of the learning model offered here. You may note that some strategies, such as questions, align with multiple phases of the model—that's because when skillfully applied, they serve different roles in advancing learning.
Figure 1.3. Teacher support (help students set goals, provide information, support deeper processing, support reflective practice, and support deeper learning and application) aligned with CITW strategies.
As you'll soon discover, these six phases are also based on the assumption that intrinsic motivation—not external punishments and rewards—is the true key to deeper learning. After all, when you think about it, all learning (with the possible exceptions of brainwashing and subliminal advertising) requires the learner to be a willing participant in the process. Although we can cajole, bribe, or bird-dog, we really cannot force anyone to learn anything. In the end, learning occurs only when the learner decides (or relents) to learn something.
Most of the meaningful things we learn in life, in fact—whether it's our native tongue, a hobby, or the lyrics of our favorite song—we learned because we saw the value of learning and often experienced joy in doing so. With that in mind, a key idea that runs through all six phases of learning is that intellectual curiosity—the need to explore, answer questions, and encounter new experiences—is the best companion to learning.
We'll start the next chapter with this idea—how curiosity can spark learning—and return to it repeatedly, showing you how to design learning experiences that tap into what lies deep inside your students: an innate desire to learn. In so doing, you'll unleash their curiosity and make the entire process of learning easier and more joyful for both you and your students.