by Eric Jensen
You've heard for much of your life that the human brain is amazing. It's true. That soft, squishy blob between your ears—the blob that runs your life—is pretty amazing. Every day in classrooms around the world, teachers are amazed by what the human brain can do. Because exploring all the facets of the brain is beyond the scope of this chapter, we'll focus on three relevant and essential features.
These themes help to establish the nature of the brain: it is constantly working; it operates with a high level of structural cooperation; and seemingly simple processes, like learning to read, are actually highly complex. This dynamic and versatile structure is unlike anything else on earth. That may be why we are so attracted to the study of the brain—it evokes both wonder and curiosity. At the simplest level, the brain is an organ that we are all born with, and we'll explore that concept first. But the brain is much more than an anatomical structure; it is also an active processing center, always at work.
To begin learning about the brain, consider a grocery store's produce and dairy departments. In shape, the brain closely resembles a head of cauliflower. In size, it's similar to a large grapefruit or cantaloupe (see Figure 1.1). The brain is mostly water (78 percent), fat (10 percent), and protein (8 percent). From the outside, the brain's most distinguishing features are its convolutions, or folds. The wrinkles are part of the cerebral cortex (Latin for “bark” or “rind”), the brain's outer covering. The cerebral cortex is about as thick as an orange peel. The folds allow the covering to maximize its surface area (have more cells per square inch). In fact, if the cortex were laid out flat, it would be about the size of an unfolded, single page from a daily newspaper. Remember, the brain is only a grapefruit-sized organ. Its general texture is about the same as soft butter, but some parts are as gooey as raw eggs or yogurt.
Brains have both neurons and glial cells (see Figure 1.2). The most well-studied brain cells are neurons, which consist of a cell body with fingerlike input extensions, called dendrites, and a single output, called an axon. Neurons have different shapes depending on the part of the brain they're in and their function. There are many types of glial cells, each with different functions. Recently, scientists have discovered that glia are not, as once thought, just “support” or “housekeeping” cells, but are quite important in brain development, function, and growth.
Estimates vary on the actual number of neurons and glia in the human brain. One researcher who has done detailed studies in this area, William Shankle of the University of California-Irvine, asserts the human brain has about 30 to 50 billion neurons. His studies (Landing, Shankle, Hara, Brannock, & Fallon, 2002) also show a 20 to 40 percent variance among humans, meaning the real numbers vary by billions from one person to another. No wonder differentiation in teaching makes sense!
A more mainstream view is that we're born with about 150 to 200 billion neurons and keep about 100 billion of them. (The rest disappear for various reasons, as explained later.) By the time we're adults, we also have about 500 billion to 1,000 billion glial cells. For the sake of comparison, a fruit fly has 100,000 neurons, a mouse has 5 million, and a monkey has 10 billion. A single cubic millimeter (about 1/16,000th of a cubic inch) of human brain tissue has more than 1 million neurons.
Humans have large brains relative to body weight. The adult human brain weighs about three pounds (1,300–1,400 grams). But would a bigger brain make you smarter? That's unlikely. A sperm whale's brain weighs about 17 pounds, or 7,800 grams.
The brain's various parts and its nerve cells are connected by nearly 1 million miles of nerve fibers. The human brain has the largest area of uncommitted cortex (with no specific function identified so far) of any species on earth. This gives humans extraordinary flexibility for learning.
Scientists divide brain areas into lobes (see Figure 1.3). The occipital lobe is in the middle-back area of the brain, and it's primarily responsible for vision. The temporal lobes are located above and around the ears on the left and right sides of the brain. These areas are primarily responsible for hearing, memory, and language. Connect visual areas to language areas, and you can “see” what you hear and say. That's part of the essence of reading: high visual-auditory connectivity. The frontal lobe is the area around your forehead. It's involved with purposeful activities like judgment, creativity, problem solving, and planning. It also holds short-term memory so you can juggle two or more thoughts at once. The parietal lobe is at the top and back areas of your head. Its duties include processing higher sensory and language functions. It also has a cool tie-in with the Sci Fi Channel in that it's highly active in subjects who claim to have seen hallucinations or UFOs, or have had “near death” experiences.
The territory in the middle of the brain includes the hippocampus, thalamus, hypothalamus, cingulate, basal ganglia, fornix, striatum, and amygdala (see Figure 1.4). You could call this area both the chemistry lab and the drama department of the brain. Sometimes known as the limbic system, it represents 20 percent of the brain by volume and is partly responsible for emotions, sleep, attention, body regulation, hormones, sexuality, sense of smell, and production of many brain chemicals. However, noted neuroscientist Joseph LeDoux (1996) contends that there is no real “limbic system,” only specific structures that process emotion, such as the amygdala. In either case, this middle area of the brain, along with the parts of the cortex, helps you feel what you feel about the world.
The location of the brain area that allows you to know that you are “you” (consciousness) is disputed. It may be dispersed throughout the cortex, or it may be in the thalamus, or it may be located near the reticular formation, a structure atop the brain stem (Crick, 1994). You'd think that this part of the brain would be easy to find—just cut away brain areas until a person loses awareness, right? But it's not just a simple case of Jack the Ripper meets the Nutty Professor. Remember, the second essential feature of the brain is integration, or strong connectivity. That means many areas connect to and influence other portions, so that specific sections of the brain may contribute separately and collectively to your sense of self. In short, one critical quality that makes the brain work so well is its degree of connectivity, not its individual structures.
Not long ago, the prevailing view of the brain was that it remained fairly constant throughout a person's life. We knew that the brain was smaller in childhood; once it reached maturity, we thought it remained more or less stable for many years before beginning to deteriorate somewhat with age. This view of a “static” brain is decidedly out of date. Indeed, the most amazing new discovery about the brain may be that human beings have both the capacity and the choice to change our own brains.
It's now understood that environmental events at one level of an organism (molecules, cells, organs, systems, individual behavior, society) can profoundly influence events at other levels (Cacioppo, Berntson, Sheridan, & McClintock, 2001). This finding suggests that your experiences and the actions you take can lead to changes in your brain. These changes, in turn, change you. We also know that your life influences your genes at the same time that your genes regulate your life. Researchers have found evidence of social influence on both genetic constitution (Reik, Dean, & Walter, 2001; Wilson & Grim, 1991) and genetic expression (Suomi, 1999)—meaning the substance of the genes and how the genes function. New evidence suggests that environmental triggers, even things like stress (Foster & Cairns, 1994), can “reprogram” our genes. In short, we can and do influence our own genetic material; this is a profound revelation!
The result of this interplay, with humans shaping environments and environments shaping humans, is that there is no fixed human brain; it is always a work in progress. Another way to put it is that your brain is dynamic and constantly changing as a result of the world you live in and the life you lead. Whether you are 2 or 92, your brain is a cauldron of changing chemicals, electrical activity, cell growth, cell death, connectivity, and change.
This dynamism makes it very challenging to get clear data on what's happening in the brain. From birth to the teenage years, the brain undergoes a fourfold increase in volume (Johnson, 2001). Infants are born with roughly a trillion connections (synapses) already in place. The infant's interaction with his or her environment helps create many additional connections within the cortex. At the same time, the genetic process called “pruning” eliminates countless unnecessary connections. Throughout life, your brain is losing connections at the same time it is creating new connections. It's a bit like going out shopping for new clothes at the same time that someone is raiding your closet back at home. This ongoing refinement results in a highly adapted, highly specialized brain (see Figure 1.5).
Source: Based on data from Huttenlocher & Dabholkar (1997) and Bourgeois (2001).
Longtime neuroscience dogma held that the mammalian brain couldn't grow new brain cells, and mainstream science was absolutely certain that new brain cell growth (neurogenesis) was impossible in the human brain. However, the ground-breaking research of Kempermann, Kuhn, and Gage (1998) showed not only that humans do grow new neurons, but also that these new cells survive and become functional and integrated. Just as important, a follow-up study (Van Praag et al., 1999) found that humans can influence the rate of new brain cell growth. In fact, researchers have identified more than 15 factors that either enhance or impair neurogenesis. Again, the complexity of the brain comes into play. Although factors such as excess stress can inhibit growth, exercise can encourage it, as we'll see in later chapters. All of this paints a complex picture of what exactly you have in your brain at any particular moment.
Inside your brain, cells are being eliminated at the same time new cells are being born. You lose some brain cells every day through attrition, decay, and disuse, and we know that certain behaviors affect the loss of brain cells. For example, although there's no evidence that an occasional glass of wine or beer destroys brain cells, it's clear that alcoholism does substantial damage (Eckardt, Rohrbaugh, Rio, Rawlings, & Capola, 1998). Scientists differ on what your daily net gain or loss in brain cells might be. But even if you lose a half-million neurons per day, it would take centuries to literally lose your mind.
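The arithmetic behind that claim is easy to check. Here is a back-of-the-envelope sketch, using the chapter's rough figure of 100 billion neurons and a deliberately pessimistic loss rate of a half-million cells per day (both are illustrative estimates, not measurements):

```python
# Back-of-the-envelope check of the "centuries to lose your mind" claim.
# The figures below are the chapter's rough estimates, not measurements.
neurons_at_adulthood = 100_000_000_000  # ~100 billion neurons kept
loss_per_day = 500_000                  # pessimistic daily attrition

days_to_zero = neurons_at_adulthood / loss_per_day
years_to_zero = days_to_zero / 365.25

print(f"{days_to_zero:,.0f} days, or roughly {years_to_zero:,.0f} years")
# → 200,000 days, or roughly 548 years
```

Even at that exaggerated rate, the supply lasts more than five centuries, which is the point of the comparison.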
Some of the most interesting recent research on the brain's adaptability shows how activities can influence the actual mass and organization of the brain. For example, playing a musical instrument consistently over time can literally remap the brain's “real estate.” It's as if there's a big “Texas land grab” going on. Neuroscientist Arnold Scheibel of UCLA did an autopsy on a renowned violinist and found that the area of the brain responsible for hearing reception (layer four, auditory cortex) was twice as thick as normal (Diamond & Hopson, 1998). Michael Kilgard found that areas of the auditory cortex increased in size with specific auditory trainings over time (Kilgard & Merzenich, 1998). It's as if the brain said, “We need more space for what you're doing. We'll just use this nearby spot.” Another study found that the cerebellum, the brain structure that contains almost half of the brain's neurons and that is also involved in keeping beat and rhythm, was 5 percent larger in musicians than in the general population (Gaser & Schlaug, 2003; Hutchinson, Lee, Gaab, & Schlaug, 2003). These studies and others provide evidence that many years of specific fine-motor exercise prompts brain reorganization and nerve growth.
What's truly amazing is that this constant reorganization of the brain is always purposeful—driven not by a mysterious signal but by real-life use and disuse. The brain has no single command center; it's a system of systems governed by life experience and by complex processes, which appear to be both variable and fixed, random and precise. Your constantly changing brain is shifting your moods, your thinking, and your actions through countless electrical and chemical changes. Each of these changes results in a shifting state of mind.
In summary, the brain is a dynamic, opportunistic, pattern-forming, self-organized system of systems. That's a mouthful. It's also mind-boggling. So why is this new view of the brain so important to you, as a teacher? Because it reinforces that every student in your classroom has the capacity for change. Yes, genetics plays a part in who students are and how they behave and reason, but each of them can change. Even your most frustrating student can improve. Now that should be the best news you've gotten all day.
How does your brain cooperate with itself? Brain cells are “connected” to other brain cells by physical structures such as axons, which are extensions sent out by neurons. Brain areas and structures can communicate via glial cells too. And certainly the bloodstream creates a common network, circulating brain chemicals known as neurotransmitters (e.g., serotonin, dopamine, and acetylcholine) and hormones known as neuromodulators (e.g., cortisol and adrenaline). Information is also communicated through the immune system and “messenger molecules” known as peptides. It's fair to say that very little happens in one part of the brain without some kind of potential effect in other areas. It's just a matter of degree.
The two sides of the brain, the left and right cerebral hemispheres, are connected by bundles of nerve fibers. The corpus callosum (see Figure 1.6) is the largest of these connective pathways, with about 250 million nerve fibers. In healthy brains, this interhemispheric highway allows each side of the brain to exchange information freely. Patients whose corpus callosum has been severed can still function in society, but suffer an inability to integrate certain brain functions. For example, a subject who is shown an apple in his left field of vision might know what it is, but not be able to come up with the correct name for it. Switch the apple to the right field of vision, and the subject might be able to name it correctly, but not be able to explain what an “apple” is.
Although each side of the brain processes things differently, some earlier assumptions about the “left” and “right” brain—that the left brain is “logical” and the right brain is “creative”—are outdated. In general, the left hemisphere tends to process information in parts, in a sequence, and using language and text representations. But none of these tendencies guarantees that the left brain will be logical. If a learner sequences words and then assembles the parts of sentences, there's no guarantee that the written material is logical. Any high school English teacher will confirm this. The use of logic is not a given; it's a learned, highly complex, contextually based, and rule-generated subskill that probably uses many brain areas. Again speaking generally, the right hemisphere tends to process information as a whole, in random order, and within a spatial context. But, like the left-brain tendencies, none of these tendencies guarantees that the right brain will be creative. Creativity can be either more right- or more left-hemisphere dominant. Logic can be either more left- or more right-hemisphere dominant.
For all these reasons, it's best to avoid the labels of “left-brain” and “right-brain” thinking. Clearly, some people do prefer linear processing and others do prefer randomness. But that's all it is—a preference. And there's no scientific support for music and arts being “right-brained frills” (Jensen, 2000). Many of the greatest scientific and mathematical discoveries of the last 500 years fit the qualities of both right-hemisphere processing (random, focused on the whole, having a spatial context) and left-hemisphere processing (sequential, focused on the parts, relying on language).
Recent discoveries in cognitive neuroscience have shown many nuances in the left- and right-brain preferences. Trained musicians process music more in their left hemisphere, while novice musicians process it more in the right hemisphere. Why? The brain of a more-experienced musician is trained to recognize the elemental parts of music more than a beginner's brain. Among left-handed people, almost half use their right (not left) hemisphere for language. And here's something odd: those chess players who battle IBM's “Deep Blue” computer for big bucks have more activity in their right (not left) hemisphere during their games. But beginning chess players usually have more activity in the left hemisphere.
Richard Davidson (1992) at the Laboratory for Affective Neuroscience at the University of Wisconsin has shown that the right hemisphere is activated with negative emotions and the left hemisphere is activated with positive emotions. People with more left-hemisphere activations tend to be happier and more positive than those with a right-hemisphere dominance. We also know that the left hemisphere controls movements on the right side of the body, and vice versa.
As you may have guessed, it would be difficult to have a left- or right-brained school. Although a teacher could structure an activity so that it was hemisphere-dependent, on most typical schooldays, students use both sides of the brain. Let's put aside the notion of right brain versus left brain and move on.
“Competition within the brain” sounds a little like a malfunction to be corrected. Actually, the brain has a problem to solve. Because humans have so much uncommitted brain tissue at birth (proportionally more than any other species), our brains have an extraordinary opportunity to become customized by life experiences. Put another way, the human brain has a great deal of uncommitted postnatal “real estate.” These undeveloped brain areas are waiting for signals from the environment to tell them whether they should “set up camp” or wait for further signals. The competition concept is simple: whatever is first, whatever activities are more frequent, and whatever actions are more coherent will “win” the competition for network wiring and signal the brain to allocate space and resources to that set of behaviors.
Although there are many examples we could look to for an illustration of the brain's complexity, it's the learning process that we want to focus on. At the most general level, the brain processes for learning are deceptively simple (see Figure 1.7). Input to the brain arrives from the five senses or is generated internally through imagination or reflection. This input is initially processed in the thalamus, but it's also routed simultaneously to other specific areas for further processing. Visual information is routed to the occipital lobe, language to the temporal lobe, and so on. Quickly, the brain forms a rough sensory impression of the incoming data. If any of the data are threatening or suspicious, the amygdala (the “uncertainty activator”) is activated. It will jump-start the rest of the sympathetic nervous system—the part of the nervous system that helps us deal with emergencies—and enable a quick response.
Typically, however, the frontal lobes hold much of the new data in short-term memory for 5 to 20 seconds. Most of the new information is filtered, dismissed, and never gets stored. It may be irrelevant, trivial, or not compelling enough. If it's worth a second consideration, new explicit learning is routed to and held in the hippocampus. There the information is processed further to determine its value. If the new learning is deemed important, it will be organized and indexed by the hippocampus and later stored in the cortex. In fact, it will be stored in the same lobe that originally processed it—visual information in the occipital lobe, language in the temporal lobe, and so on. The original processing takes place at lightning speed, but the subsequent stages and storage process can take hours, days, or even weeks. To better appreciate the brain's complexity, let's take a closer look at learning.
Have you ever fallen in love? The mechanics of learning are a bit like human relationships. Initially, there's some attraction. Early on, dating is more effortful, with one person often trying harder to “make it happen.” Either there are some “sparks” or there aren't. If the sparks don't reach the threshold needed to continue, the dates are no fun and the two people go their separate ways. If the dating goes well and becomes more intense, it may become exclusive. The couple may decide to become engaged and get married. The relationship deepens. Whereas early on in a relationship little things were often misinterpreted, at some point, the relationship is close enough so that a kind word, a smile, or a touch goes a long way toward saying “I love you.” We could say that the relationship has matured. Less contact goes further, whereas early in the relationship it took more contact to get the same partner reaction. So, what do attraction, lust, love, consummation, and maturity have to do with the brain and how we learn?
First, it's important to know that humans learn in many ways, including through sensitization, habituation, conditioned responses, semantic learning, imitation, and by doing. Many of these processes are not well understood. For the most part, long-term potentiation (LTP) has been accepted as the physical process of learning. The foundation for LTP was built on the work originally done by Donald Hebb in 1949. Since LTP was first described in 1973 (Bliss & Lomo, 1973), countless experiments have explored this process of memory formation. LTP means a neuron's response to another neuron has been increased. It has “learned” to respond. Each future event requires less work to activate the same memory network.
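To make that idea concrete, here is a toy model of potentiation. The threshold, synaptic weight, and learning rate below are invented for illustration (real LTP involves receptor-level chemistry far beyond this sketch), but the model captures the core claim: repeated activation strengthens a synapse until a weaker signal is enough to fire the neuron.

```python
# Toy illustration of long-term potentiation (LTP): repeated co-activation
# strengthens a synapse, so less input is later needed to fire the neuron.
# All values here are invented teaching numbers, not physiology.

THRESHOLD = 1.0

def fires(weight, signal):
    """Postsynaptic neuron fires if the weighted input crosses threshold."""
    return weight * signal >= THRESHOLD

weight = 0.5          # initial synaptic strength
strong_signal = 2.0   # early on, only a strong signal drives firing
weak_signal = 1.0

assert fires(weight, strong_signal)
assert not fires(weight, weak_signal)

# Repeated successful activation potentiates the synapse (the Hebbian
# shorthand: "cells that fire together wire together").
for _ in range(10):
    if fires(weight, strong_signal):
        weight *= 1.1   # strengthen a little on each co-activation

# After potentiation, the weaker signal is now sufficient.
assert fires(weight, weak_signal)
print(f"potentiated weight: {weight:.2f}")
```

The same memory network now activates with less work, which is exactly what "it has learned to respond" means in the LTP account.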
Briefly, the process goes like this. The units in the brain that are largely responsible for information processing and storage are the neurons and the glia. The brain has at least two dozen types of neurons. As mentioned on page 8 (and illustrated in Figure 1.8), neurons have a cell body, a tail-like extension called an axon, and branchlike structures called dendrites. The junction between two connected neurons is called a synapse. Neurons use both chemical and electrical signals for processing. Each brain cell acts as a tiny electrical battery. A normally functioning neuron is continuously firing, integrating, and generating information; it's a virtual hotbed of activity. The connectivity is powered by the electrical-to-chemical-to-electrical activity within each nerve cell. Information flow in the cortex always goes in two directions. Receiving neurons “talk back” to the neurons that are providing the information. This “dialogue” produces a large amount of internal feedback for error correction.
The electrical charge is generated by the difference in concentration of sodium and potassium ions across the cell membrane of each nerve cell. Neurotransmitters are chemicals stored in the ends of the neuron's axon, which nearly touch the dendrites of another cell (see Figure 1.9). Typically, the neurotransmitters are either excitatory (glutamate is the most common) or inhibitory (an example is GABA, or gamma-aminobutyric acid). Glutamate is highly excitatory—something like zoo monkeys teased by a hyperactive class of 4th grade boys. At first, the monkeys may simply ignore the visitors, but with just enough activation, all heck breaks loose. The sum total of all the neurotransmitters arriving from all the dendrites to the cell body at any moment determines whether or not that cell will, in fact, fire. The electrical discharge that comes down the axon stimulates the release of that final “oomph” of stored glutamate into the synaptic gap—the “playing field” or “common activity area” defined by the area just outside the end of the outputting axon and just outside the surface of a receiving dendrite—and a glutamate threshold is reached. This “climax” in the synapse releases neurotransmitters such as serotonin and dopamine into the synaptic gap.
Once chemicals have been released into the synaptic gap, a chemical reaction triggers (or inhibits, depending on which chemical is involved) a new electrical reaction in the receptors of the contacted dendrite. Thus, the process is electrical to chemical and back to electrical. The process is repeated as it moves on to the next cell. But it's also important to suppress unwanted neural firings. Long-term depression (LTD) occurs when a synapse is altered so that it becomes less likely to fire; it supports learning by weakening wrong connections and eliminating possible “false positives.” This occurs when you make mistakes and then learn from them. A good example is trial-and-error learning (Siegfried, 1997). So learning is not just about being able to “throw the switch” on the right neurons. You also have to be able to shut down other neurons. Learning involves both excitatory and inhibitory processes.
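The push and pull of excitation and inhibition can be sketched as simple summation. The numbers below are arbitrary teaching values, not physiological measurements; the point is only that the total across all inputs, not any single one, determines whether the cell fires.

```python
# Minimal sketch of how a neuron "decides" to fire: excitatory inputs
# (like glutamate) push the cell toward threshold, inhibitory inputs
# (like GABA) push it away, and the summed total determines the outcome.
# Values are arbitrary teaching numbers, not physiology.

FIRING_THRESHOLD = 3.0

def neuron_fires(inputs):
    """Sum excitatory (+) and inhibitory (-) contributions arriving at
    the cell body; the cell fires only if the total crosses threshold."""
    return sum(inputs) >= FIRING_THRESHOLD

# Mostly excitatory input: the cell fires (sum = 3.1).
print(neuron_fires([+1.5, +1.0, +1.0, -0.4]))   # True

# Add more inhibition and the same excitation is silenced (sum = 2.3).
print(neuron_fires([+1.5, +1.0, +1.0, -1.2]))   # False
```

Strengthening a synapse (LTP) raises one input's contribution to that sum; weakening it (LTD) lowers it, which is how both processes shape what the network learns.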
When learning occurs, specific neurons connect and form a “junction box” at the synapse. When we say cells “connect” with other cells, we really mean that they are in such close proximity that the synapse is easily, almost effortlessly, “used” over and over again; the cells have changed their receptivity to messages based on previous stimulation and have “learned.” In short, learning happens at a micro level through the alteration of synaptic efficacy. Excited cells will excite other nearby cells. Technically, a specific type of contact occurs between an axon and a dendrite. A process known as synaptic adhesion helps “bind” the two together (Goda & Davis, 2003) in close proximity with protein strands. Without these strands, the axon and dendrite would drift apart.
To understand what happens beyond the micro, cell-to-cell level, consider this analogy. Individual students may have a small influence on a school, but assemblies of students (clubs, sports teams, special interest groups) can change the school's entire nature. Likewise, the brain multiplies the individual cell-to-cell learning process by thousands, even millions. These network codes are robust; damage to one neuron will not damage the entire “coded” network (Pouget, Dayan, & Zemel, 2000). The brain has what we call population codes or neural networks—entire “forests” of neurons signaling other neurons, many with massive proliferations of dendrites. An individual cell may be connected, through its synapses, to tens of thousands of other cells. At the simplest level, learning may seem microscopic, but each neuron plays its part in larger assemblies of cell networks. Inside the brain, several conditions indicate that learning has taken place.
The repeated mention of “synapses” may lead you to think they are the holy grail in learning. Although synapses are certainly key players, learning is far more complex. No causal relationship exists between the number of neurons and either learning or intelligence. Researchers also now know that learning is not simply “stored” at the synapse. If that were the case, activation of a particular synapse would always activate a particular memory. Other factors come into play, and the brain's enormous sophistication begins to reveal itself. Even with the learning stored properly, only the right “state activations” (meaning the right neuronal assemblies) and the appropriate chemical mix will retrieve the learning.
Whole-body “states” activate these networks. When you're in a clear-thinking, level-headed frame of mind (a good state for learning), you learn and recall more than when you're depressed, tired, or angry. This conclusion seems straightforward, doesn't it? We'll learn more about states and learning in later chapters.
So what should we do with our knowledge about the brain? Is it useless theory? Just trivia? Not for the professional educator. As long as we are in the business of learning, the brain is relevant. Many studies present enough clear and solid information to be transformed into classroom practice. In Minds, Brains, and Learning, Byrnes (2001) suggests that any ideas from neuroscience that we want to implement should be integrated and consistent with other models from psychology and behavioral sciences. This is a good approach. Many of the studies cited in this book are multidisciplinary.
It's also a good idea to share information with your students about how their brains learn and work. Talk about how their lives influence their brains' adaptability. Help them make connections. And acknowledge the complexity of the brain by allowing a wider range of what we call learning. To paraphrase Einstein, today's problems cannot be solved with yesterday's thinking. Allowing learners to think outside the box is a good occasional strategy. Talk to interested parents about the brain too.
The following chapters present many solutions to everyday problems in teaching and learning. But be prepared: there also will be many questions.
Copyright © 2005 by Association for Supervision and Curriculum Development. All rights reserved.
No part of this publication—including the drawings, graphs, illustrations, or chapters, except for brief quotations in
critical reviews or articles—may be reproduced or transmitted in any form or by any means, electronic or mechanical,
including photocopy, recording, or any information storage and retrieval system, without permission from ASCD.