February 1, 2003 | Vol. 60, No. 5

Backward Design for Forward Action

Looking back to the key concepts and essential questions that underlie content standards can help identify learning goals and provide the starting point for planning both curriculum and school improvement.

Schools and districts today are working on two distinct kinds of improvement initiatives. One centers on the classroom—emphasizing effective instructional practices in teaching to state standards. The other is systemic—creating results-oriented schools that use analysis of achievement data to develop improvement plans.
Schools can integrate these two approaches at both the school and district levels by first thinking carefully about the desired results and then working backward to develop meaningful assessments and learning plans (Wiggins & McTighe, 1998). The three-stage backward design process for curriculum planning can also enhance school improvement planning and ensure that decisions are driven by data. For backward design to work, educators need to identify desired results, analyze multiple sources of data, and determine appropriate action plans.

Identify Desired Results

School improvement planning begins with a consideration of desired learning results, usually identified in the content standards of the district or state.
All too often, however, improvement teams concentrate on a much narrower target—raising scores on external, high-stakes assessments. Given the present accountability pressures, this emphasis is understandable. Standardized tests are an important indicator of student progress. Reducing desired learning to only those items that are measurable on large-scale assessments, however, can result in an unhealthy narrowing of the curriculum and decontextualized, “multiple-choice” teaching methods.
Instead, understanding key concepts and searching for answers to provocative questions—essential questions that human beings perennially ask about the world and themselves—should be the primary goals of teaching and learning. Students' understanding of the key ideas embedded in the content standards, then, should be the focus of any school improvement initiative. While seeking answers to important questions, students learn specific facts, concepts, and skills—those that typically appear on standardized tests—in the context of exploring and applying the larger ideas. Consider, for example, these big ideas and their companion essential questions:
  • The geography, climate, and natural resources of a region influence how its inhabitants live and work. How does where people live influence how they live?
  • Cultures share common features while retaining distinctive qualities. What makes a culture? Are modern civilizations more civilized than ancient ones?
  • The past offers insights into historical patterns, universal themes, and recurring aspects of the human condition. What can we learn from studying other places and times?
These big ideas and essential questions provide a conceptual lens through which to address the specific content in the standards. This approach engages students in meaningful learning, provides a way to manage large quantities of content knowledge (see, for example, Bransford, Brown, & Cocking, 1999), and signals to students, parents, and faculty that the underlying goal of every school effort is to improve student learning of important content, not just to raise standardized test scores.

Analyze Multiple Sources of Data

During the second stage of backward design, we consider the evidence needed to determine whether students have achieved the desired learning. To address the full range of identified learning goals, school teams need to analyze multiple sources of data, examining a “photo album” of assessment evidence instead of looking only at the snapshot provided by a single test.
To assess student understanding of important ideas, we need to ask students to apply their learning to a new situation and explain their responses rather than just make selections from a list of given alternatives. These performance-based and constructed-response assessments can work in combination with multiple-choice items to provide robust evidence of student understanding. We obtain this evidence through an analysis of student work collected over time—perhaps in a portfolio—from classroom assignments and assessments.

Describe, Interpret, Reflect

Improvement teams can examine this evidence in three ways: describing what the data show, interpreting what the results mean, and reflecting on the actions the results suggest. Useful questions for describing and interpreting the data include:
  • What learning goals do the various assessments measure?
  • What kinds of thinking do the assessments require—recall, interpretation, evaluation, or problem solving?
  • What strengths and weaknesses in student performance do the different data sources reveal?
  • Are these the results we expected? Why or why not?
  • In what areas did the students perform best? What weaknesses are evident?
  • How are different population groups performing on the various assessments?
  • What does this work reveal about student learning and performance?
  • What patterns or changes do we see over time?
  • Are there any surprises? What results are unexpected? What anomalies exist?
  • Is there evidence of improvement or decline? If so, what might have caused the changes?
  • What questions do these data raise?
  • Are these results consistent with other achievement data?
  • Are there alternative explanations for these results?
  • By what criteria are we evaluating student work?
  • What is the performance standard? How good is “good enough”?
  • How do our results compare to those of similar schools?
Finally, the team should reflect on the actions that teachers, students, parents, the school, and the district need to take to improve student learning. By using these descriptive, interpretive, and action-oriented questions to examine data from multiple sources, improvement teams focus on the broader goals of student understanding and avoid fixating on standardized tests alone.

Write Data Summaries

Summarizing data analyses in a few sentences is a helpful way to transform achievement data into useful information. Written summaries provide other benefits, too:
  • Faculty, parents, and board members appreciate the more understandable format.
  • The summaries are descriptive, not evaluative, so they provide evidence without casting blame.
  • Priorities emerge from the process of consolidating numerical data into sentences.
  • Those who write the summaries understand the performance data and are more likely to participate actively in trying to improve them.
The most useful written data summaries indicate long-term trends, use multiple data sources, address the most specific level of data available, and describe disaggregated results so that educators can more effectively target needs and actions.
XYZ Elementary's data analysis. Consider the following hypothetical example of a data summary.
At XYZ Elementary School, the grade 3 reading median percentile on the Iowa Tests of Basic Skills has ranged between 65 and 70 for the past three years. In this period, students did well in word identification but were weak in inferring meaning and identifying sequential relationships. Teacher evaluation of student work on classroom performance tasks, writing prompts, and report card grades indicated overall solid levels of performance in the assigned work. On average, however, only 35 percent of students scored at the proficient level on the district reading assessment in the same three-year period. Analyses of district test results revealed student weaknesses in higher-level inferential skills, such as distinguishing cause from effect and relating the conclusions from the text to those of other sources. There was a small gap in performance, with girls outperforming boys each year on every measure.
The XYZ staff concluded that the Iowa Tests of Basic Skills measured skills that are necessary, but not sufficient, for the students to understand what they read, especially when they take the district reading test. The data summary sparked fruitful staff discussion of the characteristics of proficient reading according to the state standards and test scoring protocols, what reading proficiency looks like in classroom assignments and assessments at each grade, and how teachers can ascertain whether students are actually making meaning from text.

Develop the Action Plan

Having identified learning goals and analyzed assessment data, teachers can now plan learning experiences to help students understand key concepts, and school improvement teams can generate action plans focused on obtaining the desired student achievement results.
Educators like action. With multiple demands exceeding the available time, they have an understandable tendency to want to hurry up and do something. But just as teachers should avoid jumping to planning activities before identifying desired results, school teams should not rush to action before carefully examining potential root causes of present achievement levels (Thomas, 2002).

A Systems View

Examining root causes means looking beyond individual classrooms to the elements of the larger system, including the following:
  • Standards, assessment, and instruction. Are they aligned? To what extent do teachers understand, teach to, and assess the standards at a consistent and appropriate level of rigor?
  • The knowledge and skill levels of staff. In what ways has ongoing learning been embedded into the school and district culture?
  • The use of resources. How does the district use discretionary funds, for example, to promote improvement in targeted areas?
  • Academic supports and interventions for students. To what extent do all students receive the ongoing assistance that they need to achieve at high levels?
  • School culture. How do all staff members demonstrate high expectations for all students?
  • Leadership and policy at the school and district levels. Is the school shifting its focus from teaching to student and teacher learning, for example?
  • Community partnerships. How do various stakeholders contribute to the success of the school's academic goals?

Acting on the Analysis

When school teams identify the processes or structures that need adjustment, they are not just pointing fingers at what teachers need to do differently. Instead, they are using the action planning process to restructure the organization at the classroom, school, and district levels to address underlying reasons for performance problems. At XYZ Elementary, for example, the team's examination of root causes revealed several systemic problems:
  • The school lacked a coherent curriculum map in language arts because the staff had not aligned the school's reading curriculum with the state standards. Instead, teachers relied on a textbook series to guide reading instruction. (standards, assessment, and instruction)
  • Teachers were unsure about which specific reading proficiencies to teach at particular grade levels and how to assess them. (knowledge and skill levels of staff)
  • Classroom instruction focused on basic skills instead of content standards. (school culture of low expectations)
  • Grading practices on daily assignments and assessments varied so widely that report cards were not reliable measures of achievement or progress. (standards, assessment, and instruction)
These findings led to an action plan that included the following steps:
  • Development of a curriculum map, aligned with the state and district standards and assessments, that clearly specifies the grades at which to teach and expect mastery of each reading standard.
  • Scheduling of opportunities for teachers to work in teams to unpack the reading standards and develop consistent expectations for assessing students grade to grade.
  • Development and implementation of one schoolwide reading assessment per quarter that mirrors the content and rigor of the state and district assessments.
  • Devoting team meeting time to reviewing the results of the quarterly assessments and identifying interventions for specific students.
  • Establishment of a consistent, standards-based grading and reporting process.

Understanding by Design

By applying the backward design process to school improvement, educators can:
  • Determine learning goals;
  • Collect, analyze, and summarize evidence from multiple sources of data to determine how well students are doing on external accountability tests and the extent to which they really understand what they are learning;
  • Consider the root causes of present achievement, and then—and only then—implement systemic actions to address root causes, promote enduring learning, and increase test scores.
In this way, schools can identify priorities, monitor results, and target actions that improve student learning.
References

Bransford, J., Brown, A., & Cocking, R. (Eds.). (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Research Council.

Thomas, R. (2002). Getting to the root of the gap. School Administrator, 59(7), 23–25.

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: ASCD.

Jay McTighe has had a varied career in education. He served as director of the Maryland Assessment Consortium, a collaboration of school districts working to develop and share formative performance assessments. Prior to that, he helped lead standards-based reforms at the Maryland State Department of Education, including the development of performance-based statewide assessments.

Well known for his work with thinking skills, McTighe has coordinated statewide efforts to develop instructional strategies, curriculum models, and assessment procedures for improving the quality of student thinking. He has extensive experience as a classroom teacher, resource specialist, and program coordinator, and he provides professional development as a regular speaker at national, state, and district conferences and workshops.

McTighe is an accomplished author, having coauthored more than a dozen books, including the award-winning and best-selling Understanding by Design® series with Grant Wiggins. He has written more than 50 articles and book chapters and has been published in leading journals, including Educational Leadership (ASCD) and Education Week.

UNDERSTANDING BY DESIGN® and UbD® are registered trademarks of Backward Design, LLC used under license.

 
