December 1, 2008 | Vol. 66 | No. 4

Perspectives / Driven Dumb by Data?


      Are the shrink-wrapped packs of data still sitting on your desk where they've been languishing since last April? Given anonymity, would you admit to infrequent use of your district's multi-million-dollar data warehouse system? If so, you are not alone, as Doug Reeves (p. 89) concludes from his survey of educators about their data use.
      On the other hand, perhaps your district is among those that are enthusiastically collecting and delving into data, using your students' scores on standardized tests to make judgments about student, teacher, and school performance.
      What's wrong with the two pictures? Education researcher Frederick M. Hess (p. 12) might consider it this way: The first scenario shows educators exhibiting "the old stupid," and the second scenario shows educators quite possibly acting out "the new stupid." Hess writes:

        Today's enthusiastic embrace of data has waltzed us directly from a petulant resistance to performance measures to a reflexive and unsophisticated reliance on a few simple metrics—namely graduation rates, expenditures, and the reading and math test scores of students in grades 3 through 8. The result has been a nifty pirouette from one troubling mind-set to another; with nary a misstep, we have pivoted from the "old stupid" to the "new stupid." (p. 12)
      This issue of Educational Leadership aims to provide guidance about how schools can use data to inform decision making without succumbing to either extreme.
      Focus on questions, not data. "Having a clearly focused question will avoid the tedious and time-wasting exercise of trolling through spreadsheets and databases without direction," Doug Reeves writes. David Ronka and colleagues (p. 18) suggest examples of essential questions: How do absences from school affect assessment results? What are the characteristics of students who achieve proficiency and of those who do not? Where are we making the most progress in closing achievement gaps? A few crucial questions tailored to local needs should help schools cull the data pertinent to the heart of what they need to know.
      Be skeptical of easy answers. Hess reminds us that schools must actively seek out the kinds of data they need. Achievement data are useful, but they are not helpful for all purposes. The randomized field trial model in medicine appropriately guides recommendations about interventions for medical conditions, but it is less useful for determining how much to pay nurses or how to hold hospitals accountable. Likewise, we should not expect achievement data to resolve operational concerns, nor should we expect research to settle thorny ideological policy disputes in education.
      Become assessment literate. Paul E. Barton and Richard J. Coley (p. 30) note that closing the achievement gap is no small endeavor. Relying on a single data source—for example, an arbitrary proficiency level or cut point—can lead schools to concentrate on the group of students who hover around that point and ignore those far above or below it. Closing achievement gaps will require a more nuanced understanding of different measures and a panoramic view of those measures over time.
      Think beyond test scores. Output data are not the only kind of data. A recent Rand report¹ notes that schools too rarely gather and study nontest data, including observation data on instruction and reform implementation, results from stakeholder satisfaction surveys, and reviews of student work. A number of articles in this issue (pp. 26, 35, and 65) describe how schools that surveyed students, teachers, parents, and community members used their findings about values and attitudes to improve everything from grading practices to school climate. Their experiences remind us that schooling has multiple goals, not just raising achievement scores above cut points.
      Use informed judgment. Jeffrey R. Henig (p. 6) notes that social science research percolates slowly. Examined carefully and collectively, studies eventually lead us to insights about good practice. The misuse of research to advance ideological agendas has not helped us sort sound evidence from hype, however, and neither has policymakers' desire to find definitive and universal answers. Yet Henig sees many hopeful signs for education research, among them that education researchers are learning from other disciplines how to improve research design and that states are providing higher-quality data. If we keep in mind the fundamental limitations of data—for example, that test scores are at best an approximation of learning and that context and implementation matter—we are much more likely to discern reliable findings.
      We hope you find this issue full of sound ideas that steer you clear of the stupids, both old and new, and guide you toward smart data-driven decision making.
      End Notes

      1 Marsh, J. A., Pane, J. F., & Hamilton, L. S. (2006). Making sense of data-driven decision making in education. Santa Monica, CA: Rand Corporation. Available: www.rand.org/pubs/occasional_papers/OP170

      Marge Scherer has contributed to Educational Leadership.

      From our issue: Data: Now What?