November 1, 2007
Vol. 65
No. 3

Ask About Accountability / A Less-Than-Savvy Answer


      As a newcomer to the field of Q & A columns, I've found myself glancing at similar columns in newspapers and magazines to see how other answer-people deal with their question-people. Well, I recently learned a lesson.
      My wife sometimes reads Parade, the skinny magazine that comes each week with our Sunday newspaper but that usually gets lost among an assortment of multicolored advertisements printed on similar paper stock. If you're actually able to retrieve a copy from its surrounding camouflage, you're almost obligated to ponder a page or two. Which brings me to the Sunday, September 30, 2007, issue. My wife dropped it on my desk with a corner turned down and a note that read, “Of possible interest.” She had circled a column called Ask Marilyn by Marilyn vos Savant.
      In that column, Marilyn had answered readers' queries about such issues as the difference in weight between gallons of water versus gallons of vegetable oil and the reason mosquito bites itch. But it was the initial question that caught my interest because it dealt with standardized tests.
      The question was from a student—E. Rose Agger from Silver Spring, Maryland. It started off with the statement, “I hate standardized tests.” The student went on to say that her mother had told her that college admissions people needed something objective to distinguish among prospective students. E. Rose then asked Marilyn what she thought schools should do regarding such tests.
      Marilyn replied by saying that standardized test scores should influence college admission decisions in only two cases, namely, when scores are high or low. She added, “These scores are useful as proof that students will or won't be effective in their studies.” Marilyn wrapped up her response by asserting that for scores in the middle, where most scores reside, “the tests prove very little.”
      Although Marilyn's responses to the questions about gallons of water versus gallons of vegetable oil and why mosquito bites itch seemed valid, this one was not. Let's assume that this student, because she mentions “admissions people,” is interested in college entrance exams such as the SAT and ACT. The first thing that Marilyn should have done was distinguish between standardized college-admission aptitude tests and the sorts of standardized achievement tests that today's students are required to take as part of state or federal accountability programs. Achievement tests, such as the Iowa Tests of Basic Skills or high school graduation tests, are intended to show us what students know; aptitude tests, such as the SAT and ACT, are intended to predict such things as a test taker's future grades in college. Not only did Marilyn fail to clarify this important distinction, but also her assertion that admission test scores are “proof” students will or won't succeed in college is flat-out false.
      Admission tests are indeed helpful in predicting how well a student is apt to do in college, but this information is much less useful than most people think. A student's scores on admission tests account for only about 25 percent of the variation in the grades he or she will earn in college. Fully 75 percent of that variation is attributable to nontest factors, such as a student's motivation, effort, and study habits. So when Marilyn suggests to E. Rose that test scores tell the tale for high scorers or low scorers, this is simply not so.
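      To spell out the arithmetic behind those figures (a sketch, not from the column itself, assuming the test-to-grades correlation of roughly 0.5 that a 25 percent figure implies), the share of variation in grades that test scores explain is the square of that correlation:
      \[
        r \approx 0.5 \;\Longrightarrow\; r^{2} \approx 0.25, \qquad 1 - r^{2} \approx 0.75 .
      \]
      In other words, a correlation of about 0.5 between admission test scores and college grades means the scores account for roughly 25 percent of the variation in grades, leaving about 75 percent to nontest factors.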
      If E. Rose scores very low on a college admission test, she may believe, on the basis of Marilyn's advice, that she can't succeed in college. That's not true. Conversely, if E. Rose happens to score very high on her college admission test, she may think her success in college is guaranteed. That, too, is not true.
      We need to help students—and their parents as well, judging from E. Rose's comment—understand that although high or low scores on college entrance tests may be somewhat predictive of how well students are likely to do in college, it's what a student actually does after arriving on campus that governs college success. A student's effort, not test scores, makes most of the difference.
      I rated Marilyn vos Savant's advice in this matter to be poor. Nevertheless, wanting to make sure that I wasn't experiencing some sort of rarely studied Dear Abby Envy Syndrome (DAES), I looked Marilyn up on the Internet and discovered that she is an “advice columnist” (I guess I'm now one, too). I was surprised to learn that about 40 advice columnists are listed and that an online rating service evaluates these folks. I'm sorry to say that Marilyn garnered only a 3.4 out of a possible top rating of 5.0. In all fairness, her rating was almost a full point higher than Dear Abby's. Topping the crop was an advice columnist named Dotti Primrose, who earned a perfect rating of 5.0. Surely her name was a factor.
      But back to E. Rose. She's not the only one confused about standardized tests. Although today's schools are being judged almost exclusively on the basis of students' test scores, too few citizens understand even the basics of educational testing. We need to educate not only educators but also the readers of Parade about the fundamentals of educational assessment.
      About that lesson I learned. It's this: Advice columnists need to stick to what they know. In my dictionary, a savant is “a person of extensive learning.” When it comes to educational testing, her last name notwithstanding, Marilyn is definitely no savant.

      James Popham is Emeritus Professor in the UCLA Graduate School of Education and Information Studies. At UCLA he won several distinguished teaching awards, and in January 2000, he was recognized by UCLA Today as one of UCLA's top 20 professors of the 20th century.

      Popham is a former president of the American Educational Research Association (AERA) and the founding editor of Educational Evaluation and Policy Analysis, an AERA quarterly journal.

      He has spent most of his career as a teacher and is the author of more than 30 books, 200 journal articles, 50 research reports, and nearly 200 papers presented before research societies. His areas of focus include student assessment and educational evaluation. One of his recent books is Assessment Literacy for Educators in a Hurry.

      From our issue: Making Math Count