December 1, 2007 | Vol. 65, No. 4

What Research Says About… / Classroom Walk-Throughs

Jane L. David, director of the Bay Area Research Group, begins a new research column for Educational Leadership this month. Coauthor with Larry Cuban of Cutting Through the Hype: A Taxpayer's Guide to School Reform, David will share with readers what research says about the effectiveness of current education reforms. In the coming months, David will examine the research behind such approaches as project-based learning, incentives to attract teachers to high-poverty schools, and small learning communities. In framing the issues and drawing conclusions, she will draw on articles from peer-reviewed journals and reports from research institutions as well as her own 35 years of experience studying schools and districts. We welcome readers' comments at edleadership@ascd.org.

Touted as a systematic and efficient way to gather helpful data on instructional practices, classroom walk-throughs (also called learning walks, quick visits, and data walks) are showing up everywhere.

What's the Idea?

The idea behind walk-throughs is that firsthand classroom observations can paint a picture to inform improvement efforts. These observations typically involve looking at how well teachers are implementing a particular program or set of practices that the district or school has adopted. For example, a school principal might want to know whether teachers are able to put into practice their recent training on quick-writes and pair-shares.
In theory, before visiting classrooms, observers decide what they will focus on, what evidence they will collect, and how they will make sense of it. Afterward, they report their findings formally or informally to one or more audiences.
Walk-throughs are not intended to evaluate individual teachers or principals or even to identify them by name in postobservation reports. Rather, the goals of walk-throughs are to help administrators and teachers learn more about instruction and to identify what training and support teachers need.

What's the Reality?

The sheer variety of walk-throughs is breathtaking. They can last from 2 to 45 minutes. The group observing may range from 2 to 12 people and may include teachers, administrators, community members, and students. Walk-throughs can focus on one teacher, all teachers, or a subset of teachers and schools.
Observers sometimes question students to find out whether they understand what they are doing in the lesson and why. In other cases, observers focus on a particular instructional challenge raised by the teachers under observation: for example, use of questioning techniques and wait time. Or, in a version of walk-throughs verging on compliance monitoring, observers are armed with a checklist on which to record how the classroom furniture is arranged and whether the teacher has posted state standards targeted by the lesson.
Sometimes observers huddle in the hall to discuss what they saw and later send a written report to the school. In other cases, they meet with the faculty to share their findings and then shred their notes at the end of the day to reinforce the point that their purpose is not to evaluate teachers.

What's the Research?

Although research on walk-throughs is limited, available studies reveal wide variation in their usefulness and effects. According to an in-depth study of three urban districts conducted by the RAND Corporation, administrators find walk-throughs more useful than do teachers (who rarely receive individual feedback), and those doing the walk-throughs report learning more than do those who are observed (Marsh et al., 2005). District leaders and principals in a sample of schools in one large urban district reported that the data from walk-throughs gave them a better understanding of how well teachers were able to identify and move students in and out of support programs. This finding led them to make adjustments in the professional development they provided (Supovitz & Weathers, 2004).
Other studies point to the value of district-designed walk-throughs in developing shared understandings of high-quality practice. Training in the use of valid and reliable data-collection instruments, along with clear rubrics, plays an important role in creating a common language and communicating district priorities (Coburn, Honig, & Stein, in press).
Walk-throughs also carry significant risks. When the purpose is murky or when trust among teachers, principals, and central-office staff is low, walk-throughs are likely to be perceived as compliance checks, increasing distrust and tension. Valli and Buese (2007) describe increased teacher anxiety in their four-year study of 150 teachers in a district that instituted walk-throughs. Convincing participants that the results will not be used to evaluate individual teachers or principals is a tall order for most districts. In one urban district, in spite of efforts to alleviate fears, more than one-half of the principals believed that district staff members conducting the walk-throughs were passing judgment on them (Supovitz & Weathers, 2004).
Kerr and colleagues (2006) found that district leaders communicate sincerity about the constructive intent of walk-throughs in several ways. One way is to focus walk-throughs on areas where teachers and site leaders have ample professional development opportunities and support to implement changes. When walk-throughs are disconnected from larger improvement efforts, teachers tend to dismiss them as "drive-bys" or "gotchas." Leaders can also communicate good intentions by using high-quality data-collection instruments and training walk-through observers in their use. If teachers and principals perceive the data collection as superficial or invalid, they lose confidence in its purpose and value.
Organizations listed on the Web offer a wide range of protocols and electronic data-collection devices to support walk-throughs. Although the efficiency of electronic checklists is appealing, the kinds of data that provide the most valuable feedback are not necessarily those that are easiest to count and record. In fact, Stein and Nelson (2003) argue that the more a walk-through aims to assess good instruction, the more it requires those making the judgments to be knowledgeable about instruction and to spend more than a few minutes observing.

What's One to Do?

The research suggests that walk-throughs can play a constructive role only when districts make their purpose clear and carry them out in a climate of trust. Many districts and schools can tell tales about walk-throughs that backfired.
Before launching any type of walk-through process, it is important to ensure that everyone understands how it connects to improvement efforts. This connection should be reflected in the specific data that observers collect, the thoughtfulness and quality of the protocols, and the way the results are used. Checklists focused on surface features are not likely to provide useful information to teachers as they implement new approaches or refine their teaching practices. Districts will not accomplish much by amassing new data unless they train observers well and prepare educators to use the data.
If school districts keep these cautions in mind and use walk-throughs as one of a number of strategies to support strong instructional leaders and teachers, they will find that walk-throughs can promote school improvement.
References

Coburn, C. E., Honig, M. I., & Stein, M. K. (in press). What is the evidence on districts' use of evidence? In J. Bransford, L. Gomez, D. Lam, & N. Vye (Eds.), Research and practice: Towards a reconciliation. Cambridge, MA: Harvard Education Press.

Kerr, K. A., Marsh, J. A., Ikemoto, G. S., Darilek, H., & Barney, H. (2006). Strategies to promote data use for instructional improvement: Actions, outcomes, and lessons from three urban districts. American Journal of Education, 112(4), 496–520.

Marsh, J. A., Kerr, K. A., Ikemoto, G. S., Darilek, H., Suttorp, M., Zimmer, R. W., & Barney, H. (2005). The role of districts in fostering instructional improvement. Santa Monica, CA: RAND Corporation.

Stein, M. K., & Nelson, B. S. (2003). Leadership content knowledge. Educational Evaluation and Policy Analysis, 25(4), 423–448.

Supovitz, J. A., & Weathers, J. (2004). Dashboard lights: Monitoring implementation of district instructional reform strategies. Philadelphia: Consortium for Policy Research in Education, University of Pennsylvania.

Valli, L., & Buese, D. (2007). The changing roles of teachers in an era of high-stakes accountability. American Educational Research Journal, 44(3), 519–558.
