February 1, 2003 | Vol. 60, No. 5

A Tale of Two Schools' Data

By acknowledging that implementation occurs over time, schools can safeguard against discarding new instructional practices prematurely.

It's June: report card time for students, teachers, and schools. Two teachers are about to get their first look at their schools' student achievement reports. Although they teach in different districts, Joseph and Zindzi have been leading parallel teaching lives for a number of years. They both teach mathematics in schools of similar size, ethnic composition, and socioeconomic demographics.

Joseph's Student Achievement Report

Joseph's report includes three pieces of information: math scores for students in his class; district data about average student scores disaggregated by ethnicity, gender, and the community's socioeconomic status; and next year's target scores for student achievement in math for his school.
Joseph frowns when he sees his students' scores. They are as low as last year's, even though the school had a goal of 10 percent improvement. District data confirm that his students scored well below the goal. His frown deepens when he sees the target scores for next year. Although the target scores take into account the low socioeconomic level of the student population and the school's history of poor performance, they still set a goal that he considers impossible to attain.
Why hadn't the changes he'd made this year raised his students' scores? What should he do differently next year? He stares at the report, but the figures offer no assistance. The warning, “If you do what you've always done, you'll get what you always got,” rings in his ears. But he's not sure which or how many new ideas about instruction will work for his students. He leaves school feeling depressed.

Zindzi's Student Achievement Report

Zindzi's report contains the same three items as Joseph's, but her district organizes the classroom scores by how fully teachers say they have been able to implement certain new instructional practices. The three levels are “getting acquainted,” “partial implementation,” and “full implementation.” The district knows that teachers adopt new practices by degrees, so this system gives them time to learn the techniques.
When Zindzi sees her students' scores, she is pensive. They are not much better than last year's. But she knows that the scores are appropriate for her “getting acquainted” implementation level. She has agreed to be at the “partial implementation” level next year, but the higher target scores do not dismay her. Knowing her students' scores from this year and those of other students who have had more time to fully benefit from the new practices, she believes that the target scores are realistic. Zindzi leaves school feeling encouraged and confident in her ability to get good results.

What Are the Differences?

Why did Joseph and Zindzi receive such different student achievement reports? After looking at the past year's scores, Joseph's district made an initial diagnosis, set a goal, and left it to individual teachers, with little guidance, to figure out how to reach it.
Zindzi's district also made a diagnosis and set a goal, but it decided to change its planning process to better connect instruction, professional development, and student learning, and to put greater control into the hands of teachers collectively. Its usual system consisted of defining the problem, setting goals, planning action, taking action, evaluating results, and adjusting plans. But the district found that the plan lost momentum after the action step.
So the district created a planning system based on two distinct yet supporting processes: program planning and action, and evaluative inquiry. For each subject area, the district established an Action Team and an Evaluative Inquiry Team largely composed of teachers. Both teams reported to the district curriculum leadership team. The district reasoned that separating program planning and action from evaluation would create a more realistic workload for the teachers involved and would draw on their varied knowledge, skills, and interests.

The Action Team Takes the Lead

The Action Team took the lead role and reviewed existing research about instructional practices to determine which instructional strategies would be the most appropriate for the district's schools. After reviewing research about math instruction, for instance, the team selected three compatible instructional practices: use of manipulatives, student dialogue about math, and student projects demonstrating math concepts. They thought that this combination, along with what they were already doing, would be powerful enough to influence student learning and could be adapted by individual teachers to fit the range of school situations in their district.
Although the type of professional development and the implementation structure varied among the different schools, all teachers received training in the instructional practices and committed to implementing them fully in their classrooms within five years. The teachers also agreed to track the impact that the instructional practices had on student learning over the five-year span.

The Evaluative Inquiry Team Steps In

Once the participating teachers had decided on the five-year implementation goal, the Evaluative Inquiry Team collected and organized the teachers' data on how the instructional approaches were working, paying special attention to the degree to which the approaches had been implemented. The team used the data to show links between the implementation level of the instructional practices and student achievement.
To get the best implementation data possible, the Evaluative Inquiry Team used an implementation rubric (see fig. 1). Although teachers selected the level of implementation that they would undertake in a given year, at the time of data collection they reported the level that they had actually used. The teachers' professional growth plan also stated their expected implementation level.
Figure 1. Rubric for Implementation Levels

Level 1—Getting Acquainted

  • Participate in a study group focused on new instructional methods.

  • Observe at least three teachers using the new instructional methods.

  • Determine your additional professional development needs for achieving partial implementation.

  • Incorporate your professional development needs into your professional growth plan.

  • At this level, new instructional methods are seldom, if ever, used in the classroom.

Level 2—Partial Implementation

  • Incorporate each of the instructional practices into at least one lesson.

  • Practice using two of the three instructional practices until you feel comfortable varying their use to meet the needs of different students.

Level 3—Full Implementation

  • Use the three instructional practices on a regular basis, varying their use to meet the needs of different students.

  • Ask students whether the instructional practices are helping them learn.

  • Have another teacher who is proficient in the use of the practices observe your instruction and provide feedback.

  • Adjust your practice on the basis of feedback from students and another teacher.

 

Because Zindzi opted for the “getting acquainted” level in math for the past year, she participated in a study group about the three instructional practices for math. She observed two teachers who were at full implementation and one at partial implementation. During her performance evaluation, she discussed with her principal the differences she saw between her current approach and the approaches of the three teachers she had observed.
When the district administered the student achievement tests, teachers reported which instructional changes they had implemented and which they had not. The district assessment office tracked that information and analyzed trends separately for implementers and nonimplementers. Analyzing trends gave the Evaluative Inquiry Team ideas for adjustments that it could then recommend to the Action Team. Throughout the process, the focus remained on the efficacy of the specific instructional methods rather than on individual teacher performance.

How Communities React to the Data

Joseph's school maintains data to determine the general level of current achievement and to understand the data's connections to its community context. Progress on overall math achievement (as well as on other subject areas) is reported yearly, and the data are disaggregated to show differences by gender and ethnic group. Unfortunately, Joseph and his colleagues are frustrated because they do not know how to link this information to the variables that they actually control: instructional practices. The community shares their frustration.
Zindzi's school reports this same information, but it is secondary to the instruction-focused data. When Zindzi's school reports its data to the community, it focuses on the research-based changes that schools are making to improve student learning, how levels of implementation of the new methods are linked to progress in student learning, and what actions teachers are taking each year to sharpen their instruction. The school provides parents with opportunities to learn how they can reinforce teachers' instructional methods. Through student demonstrations, parents see what high levels of learning look like. With this information in the foreground, the district shows slow but steady changes in achievement and allows the community to make sure that no children are left behind.

From the issue: Using Data to Improve Student Achievement