December 1, 1993 • Vol. 51, No. 4

Why You Shouldn't Trust Creativity Tests

If creativity is task-specific, as studies suggest, let's stop using ineffective divergent-thinking tests to select students for gifted programs.

Every year, tens of thousands of students take divergent-thinking tests to help educators decide which students are most creative. The most frequently used tests are the Torrance Tests of Creative Thinking (Torrance 1966, 1990). Creativity/divergent-thinking assessment has become a major category of educational testing, and test scores are used widely to select students for gifted/talented programs. This is both a waste of educational resources and an unfair basis for making placement decisions.
Divergent thinking is the ability to generate many different responses to an open-ended question. “Imagine all the things that might change if students were allowed to run their own school” is a typical divergent thinking task. Responses are scored for fluency (sheer quantity of responses), flexibility (the number of different kinds of responses), originality, and degree of elaboration.
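To make the scoring dimensions concrete, here is a minimal sketch of how three of them might be computed for a set of responses. The category labels and the originality norms below are invented for illustration; actual instruments such as the Torrance Tests use published norming tables and trained scorers.

```python
# Hypothetical sketch of divergent-thinking scoring (not the Torrance
# procedure itself). Categories and norm frequencies are invented.

def score_responses(responses, category, norm_frequency):
    """Score a list of responses on three of the standard dimensions.

    responses: list of response strings
    category: dict mapping each response to a kind/category label
    norm_frequency: dict mapping each response to how common it is in a
        normative sample (0.0 = never seen before, 1.0 = everyone says it)
    """
    fluency = len(responses)                             # sheer quantity
    flexibility = len({category[r] for r in responses})  # distinct kinds
    # Originality: rarer responses earn more credit (illustrative formula).
    originality = sum(1.0 - norm_frequency.get(r, 0.0) for r in responses)
    return {"fluency": fluency, "flexibility": flexibility,
            "originality": round(originality, 2)}

# "Name as many uses as you can think of for a brick"
responses = ["build a wall", "doorstop", "paperweight", "grind into pigment"]
category = {"build a wall": "construction", "doorstop": "weight",
            "paperweight": "weight", "grind into pigment": "material"}
norms = {"build a wall": 0.9, "doorstop": 0.5, "paperweight": 0.4,
         "grind into pigment": 0.05}
print(score_responses(responses, category, norms))
```

Elaboration, the fourth dimension, is scored from the detail in each individual response and so is not captured in this sketch.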
Divergent-thinking tests were once the most common measure of creativity in psychological and educational research, but their popularity among researchers is waning because of serious questions about validity (see, for example, Amabile 1983, Baer 1993, Gardner 1989, Torrance and Presbury 1984). Dozens of research reports over the years have shown that divergent-thinking test scores fail to predict real-world creativity. As several comprehensive reviews of the literature have pointed out, there is good reason to doubt the validity of all of the available divergent-thinking tests (Baer 1993, Crockenberg 1972, Kogan 1983).

The Search for a General Theory of Creativity

Research evidence, most of it fairly recent, also suggests that the problems facing creativity testing are far more serious than the quality of the testing device. Studies have shown that the cognitive abilities underlying creative performance differ from task to task (Baer 1991, 1992, 1993; Runco 1986, 1987, 1989). It is therefore impossible to test general creative-thinking skills simply because—so far as we can tell from available research—no such skills exist.
Why, in the face of such doubtful and disconfirming evidence, do we continue to test for creativity by giving students such tasks as: “Name as many uses as you can think of for a brick”? Why do we so desperately want to believe that some special kind of thinking leads to creative performance, no matter what the nature of the task? For one thing, a general theory of creativity would be both more efficient and more elegant than a host of task-specific theories, one for each kind of creativity-relevant task. And, second, from a practical perspective, a one-size-fits-all theory would make the testing and training of creativity much easier and more cost effective.
Psychologists working in fields other than creativity have also, and for similar reasons, been attracted to general theories. Consider, for example, the hard-to-eradicate appeal of IQ scores (for example, Jensen 1980, Snow and Lohman 1984), which lingers despite both (1) arguments dating back to Alfred Binet, the originator of IQ testing, who pointed out that “intelligence is not a simple indivisible function” (Binet 1911/1962); and (2) nearly a century of accumulated psychometric evidence that single-factor theories cannot adequately represent human cognition (Gardner 1983; Sternberg 1988, 1990).
The search for all-encompassing theories has by no means been limited to educational psychology. A primary goal of physics is to find “a complete, consistent, unified theory that would include . . . all partial theories as approximations” (Hawking 1988). Einstein spent most of his later years searching for such a theory, and although he was unsuccessful, the search for a grand unified theory, or “Theory of Everything,” has captured the imagination of many of the world's foremost scientists (Davies 1984, Riordan and Schramm 1991).
Despite their popularity, however, general, context-free theories have been on the defensive in recent years in many fields. Feminist scholarship has challenged the hegemony of such models in the physical, natural, and social sciences (see Belenky et al. 1986, Fausto-Sterling 1991, Gilligan 1982, Gilligan and Attanucci 1988, Harding 1986, Nelson 1990). In the field of IQ testing, the idea of multiple intelligences (Gardner 1983, 1988; Sternberg 1990) is now widely accepted (although that battle is certainly not over).
It would be very nice, for both creativity theorists and for educators, if a single class of thinking skills—such as divergent thinking—could explain creativity across many task domains. Unfortunately, research evidence suggests that this is not the case.

A Task-Specific View of Creativity

In a recent series of studies, I asked subjects ranging in age from 7 to 40 to create poems, stories, collages, equations, and mathematical word problems (Baer 1991, 1992, 1993). Experts in their respective fields then judged these products for creativity, using a consensual assessment technique pioneered by Amabile (1983). The basic hypothesis was: If general-purpose, domain-transcending creative-thinking processes substantially contribute to creative performance on different tasks, then subjects who perform more creatively than their peers on one task should perform more creatively on other tasks in different domains. Conversely, low creativity on one task should predict low creativity on other tasks.
The results of these studies have consistently favored a task-specific view of the skills underlying creative performance. Analyses of the expert ratings of the products made by subjects of all ages indicate that creative performance on one task is not predictive of creative performance on other tasks, including those that might be considered to fall into the same domain, such as writing poetry and writing short stories. Scores on divergent-thinking tests also fail to predict creative performance on any of the tasks. These results challenge the existence of any general creative-thinking skill. They also argue against a domain-specific theory of creativity such as that proposed by Gardner (1988). Based on these studies, creativity-relevant skills appear to be quite narrowly applicable, perhaps of use only on specific tasks.
The claim that general creative-thinking skills do not exist is a strong one, with wide implications. It calls into question the divergent-thinking theory of creativity, of course, along with any theory (such as Mednick's [1962] associative theory) that posits some other general creative capacity. Such an assertion argues against the possible validity of any general test of creativity. And it casts doubt on the value of a variety of creativity-training programs, most of which include divergent thinking as a major component. If no general skills (like divergent thinking) influence creative performance across domains, how can we explain the apparent success of many programs that teach divergent thinking?
Typical exercises in such training programs look like these:
  1. Think of as many words as you can that begin with an 's' sound.
  2. Think of as many shapes as you can that could be made with eight toothpicks.
Clearly, divergent-thinking training does work: Students who receive such training produce more creative stories, poems, collages, and so on. But most of these programs use multiple contexts and a variety of task materials, and thus provide practice in a wide range of creativity-relevant skills. If we are interested only in a specific kind of creativity, we might target our teaching more narrowly. But if our goals are general, existing divergent-thinking programs do a fairly good job.

Teach—Don't Test—for Creativity

The prognosis for divergent-thinking testing, however, is not so rosy. Even if we want to assess creativity on a specific task, such as writing poetry, we do not know which divergent-thinking tasks are most important. More important, if we know the kind of creativity we care about, why not measure it more directly—by having students produce whatever kind of product we are interested in and assessing the creativity of these products?
Consensual assessment of creative performance is neither difficult nor mysterious. If your interest is in poetry writing, simply have your students write poems and give their poems to judges who know something about poetry. You don't need to use poets or other experts: Amabile has shown that with student work, the ratings of teachers and of poets (or, with collages, of artists) differ very little (1982, 1983). It is important that raters do not know whose products they are rating and that the same judges evaluate all products. It is also wise to have multiple raters when possible, and to have more than one sample of each student's work (that is, have students write two or three poems, or create several collages).
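The mechanics of consensual assessment can be sketched in a few lines: the same judges rate every anonymized product, and each product's creativity score is simply the mean across judges. The ratings below are invented for illustration.

```python
# A minimal sketch of Amabile-style consensual assessment. Products are
# identified only by anonymous IDs; every judge rates every product.

from statistics import mean

def consensual_scores(ratings):
    """ratings: dict mapping judge -> {product_id: rating}.

    Returns each product's mean rating across judges. Only products rated
    by all judges are scored, since the same judges must evaluate all work.
    """
    products = set.intersection(*(set(r) for r in ratings.values()))
    return {p: mean(ratings[j][p] for j in ratings) for p in products}

# Three judges rate the same four anonymized student poems on a 1-5 scale.
ratings = {
    "judge_a": {"poem1": 4, "poem2": 2, "poem3": 5, "poem4": 3},
    "judge_b": {"poem1": 5, "poem2": 2, "poem3": 4, "poem4": 3},
    "judge_c": {"poem1": 4, "poem2": 3, "poem3": 5, "poem4": 2},
}
print(consensual_scores(ratings))
```

In practice one would also check agreement among judges (high inter-rater reliability is what makes the "consensual" label earned) and average over two or three products per student, as suggested above.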
What if you can't narrow down your field of interest to a single task, like writing poetry? You could give several tasks, and get different ratings for different kinds of creativity. Or, better still, don't test for creativity at all. Creativity training is no less valuable for those students who show only modest talent than it is for those with an abundance.
It is certainly good that our schools care about creativity and are making special efforts to nurture it by teaching divergent-thinking skills. Why not skip the testing and provide this instruction to all students?
References

Amabile, T. M. (1982). “Social Psychology of Creativity: A Consensual Assessment Technique.” Journal of Personality and Social Psychology 43: 997–1013.

Amabile, T. M. (1983). The Social Psychology of Creativity. New York: Springer-Verlag.

Baer, J. (1991). “Generality of Creativity Across Performance Domains.” Creativity Research Journal 4: 23–39.

Baer, J. (August 1992). “Divergent Thinking Is Not a General Trait: A Multi-Domain Training Experiment.” Paper presented at the annual meeting of the American Psychological Association, Washington, D.C.

Baer, J. (1993). Creativity and Divergent Thinking: A Task-Specific Approach. Hillsdale, N.J.: Lawrence Erlbaum Associates.

Belenky, M. F., B. M. Clinchy, N. R. Goldberger, and J. M. Tarule. (1986). Women's Ways of Knowing: The Development of Self, Voice, and Mind. New York: Basic Books.

Binet, A. (1962). “The Nature and Measurement of Intelligence.” In Psychology in the Making: Histories of Selected Research Programs, edited by L. Postman. New York: Knopf. (Originally published, Paris: Flammarion, 1911.)

Crockenberg, S. B. (1972). “Creativity Tests: A Boon or Boondoggle for Education?” Review of Educational Research 42: 27–45.

Davies, P. (1984). Superforce: The Search for a Grand Unified Theory of Nature. New York: Simon and Schuster.

Fausto-Sterling, A. (1991). “Race, Gender, and Science.” Transformations 2, 2: 4–12.

Gardner, H. (1983). Frames of Mind: The Theory of Multiple Intelligences. New York: Basic Books.

Gardner, H. (1988). “Creative Lives and Creative Works: A Synthetic Scientific Approach.” In The Nature of Creativity, edited by R. J. Sternberg, pp. 298–321. New York: Cambridge University Press.

Gardner, H. (1989). To Open Minds. New York: Basic Books.

Gilligan, C. (1982). In a Different Voice: Psychological Theory and Women's Development. Cambridge, Mass.: Harvard University Press.

Gilligan, C., and J. Attanucci. (1988). “Two Moral Orientations.” In Mapping the Moral Domain, edited by C. Gilligan, J. V. Ward, and J. M. Taylor, pp. 73–86. Cambridge, Mass.: Harvard University Press.

Harding, S. (1986). The Science Question in Feminism. Ithaca, N.Y.: Cornell University Press.

Hawking, S. W. (1988). A Brief History of Time: From the Big Bang to Black Holes. New York: Bantam.

Jensen, A. R. (1980). Bias in Mental Testing. New York: Free Press.

Kogan, N. (1983). “Stylistic Variation in Childhood and Adolescence: Creativity, Metaphor, and Cognitive Styles.” In Handbook of Child Psychology: Vol 3. Cognitive Development, 4th ed., edited by P. H. Mussen, pp. 628–706. New York: John Wiley and Sons.

Mednick, S. A. (1962). “The Associative Basis of the Creative Process.” Psychological Review 69: 220–232.

Nelson, L. H. (1990). Who Knows: From Quine to a Feminist Empiricism. Philadelphia: Temple University Press.

Riordan, M., and D. N. Schramm. (1991). The Shadows of Creation: Dark Matter and the Structure of the Universe. New York: W. H. Freeman.

Runco, M. A. (1986). “Divergent Thinking and Creative Performance in Gifted and Nongifted Children.” Educational and Psychological Measurement 46: 375–384.

Runco, M. A. (1987). “The Generality of Creative Performance in Gifted and Nongifted Children.” Gifted Child Quarterly 31: 121–125.

Runco, M. A. (1989). “The Creativity of Children's Art.” Child Study Journal 19: 177–190.

Snow, R. E., and D. F. Lohman. (1984). “Toward a Theory of Cognitive Aptitude for Learning from Instruction.” Journal of Educational Psychology 76: 347–376.

Sternberg, R. J. (1988). “Intelligence.” In The Psychology of Human Thought, edited by R. J. Sternberg and E. E. Smith, pp. 267–308. New York: Cambridge University Press.

Sternberg, R. J. (1990). Metaphors of Mind. New York: Cambridge University Press.

Torrance, E. P. (1966). The Torrance Tests of Creative Thinking: Norms-Technical Manual. Lexington, Mass.: Personnel Press.

Torrance, E. P. (1990). The Torrance Tests of Creative Thinking: Norms-Technical Manual. Bensenville, Ill.: Scholastic Testing Service.

Torrance, E. P., and J. Presbury. (1984). “The Criteria of Success Used in 242 Recent Experimental Studies of Creativity.” Creative Child and Adult Quarterly 9: 238–243.

John Baer has been a contributor to Educational Leadership.

From our issue: Can Public Schools Accommodate Christian Fundamentalists?