The language arts and social studies teachers on my professional learning team are intelligent, passionate people who work hard to amplify effective instructional practices. Our team spends significant amounts of time deconstructing our learning standards, identifying essential objectives, writing common assessments, and looking at trends in students' learning; and we're committed to designing interventions that are responsive to individual students. We've embraced Geri Parscale's (2008) argument that student-level interventions are the cardiopulmonary resuscitation (CPR) of a professional learning community:
CPR … is directive, timely, targeted, systematic, and administered by trained professionals. When someone collapses in the presence of one of these trained professionals, immediate action is taken to avoid permanent damage. Similarly, when children are dying academically, we must approach them with the same sense of urgency. (p. 188)
But responding urgently to students in academic need has proven easier said than done for us. With formative assessment, urgency depends on efficiently analyzing data. Like many learning teams, for years we've been trying to analyze data and uncover students' needs using antiquated paper-and-pencil tools. We've got notebooks full of checklists tracking student progress and walls covered with "exit slips" through which students demonstrate knowledge. But it can take hours just to rerecord this data in one centralized spot for all 50-plus students connected to our team, especially when we want to track performance on particular skills. Because the time necessary to sift through data is so daunting, my team had come to see "data-driven decision making" as a cumbersome and downright frustrating process.
Does this sound like the learning teams in your building? If it does, let me share some good news. Student responders—which my learning team has begun experimenting with this year—provide a digital solution to inefficient data collection. Student responders are handheld devices that enable individual students to respond immediately to teacher questions. Software connected to the devices can instantly calculate the percentage of the class answering questions correctly and can display that information graphically (such as with a pie chart) on a computer screen, on each student's device, or even on a Smart Board in front of the class.
Benefits to All
Student responders have proven user-friendly. They are popular in universities, and several K–12 districts have invested in the technology. But many school leaders wonder whether they can afford student response systems when resources are limited; sets of 20–30 student responders can cost up to $1,200.
Take a look at the research around formative assessment, though, and you'll start to wonder whether you can afford not to invest in student responders. When paired with developmentally appropriate learning goals, effective feedback ranks as the second most important school-level factor influencing student achievement, after a guaranteed and viable curriculum (Marzano, 2003). To be effective, however, feedback must be timely and connected to the content being learned in class—two criteria that student response systems meet.
Responders enable teachers to collect information about student mastery several times each class period and see results instantly. Teachers can quickly scan this information for patterns showing which students are—or aren't—"getting it" and make in-the-moment adjustments to teaching on the basis of something more than professional hunches.
School districts that have experimented with student response systems, such as Henrico County in Richmond, Virginia (Henrico County Schools, 2007), have found that their teachers develop more confidence in their ability to monitor student learning. This confidence in turn can lead teachers to ask more frequent and more challenging questions.
Students can answer privately—and sometimes anonymously—with student responders. This ensures that all students consistently give teachers feedback on what they understand. Students using responders can immediately monitor their own learning. Teachers in Henrico County found that this created a sense of transparency, excitement, and urgency often absent in the classroom.
Because results are recorded automatically on spreadsheets, learning teams no longer need to tabulate data manually. Teams can more quickly use data to spot trends and plan enrichment or remediation sessions.
Worth the Risks
Perhaps the greatest risk in adopting student responders is that teachers might push more meaningful forms of assessment to the sidelines in favor of collecting quick and easy data. Without careful monitoring and professional development that introduces teachers to strategies for asking open-ended questions with responders, schools may find their digital solution takes student questioning in a simplistic direction.
But as a member of a team that once struggled through reams of data to find clues that could inform instruction, I believe the potential rewards of these digital tools outweigh any risks. As student response systems become standard features in our classrooms, my colleagues and I should see our former burden of collecting, rerecording, and analyzing data become a smooth process, freeing us to spend time finding ways to help every student learn.
Isn't that what formative assessment is all about?
A Sampling of Student Response Systems
- Turning Technologies (www.turningtechnologies.com/studentresponsesystem). These handheld devices can poll students and tabulate results even if there is no computer or interactive projector in the classroom.
- ActiVote Student Response Systems (www.prometheanworld.com/server.php?show=nav.15999). As students click their ActiVote responders, all students' answers are instantly displayed and analyzed on an interactive digital whiteboard.
- Smartroom Learning Solutions (www.smartroom.com/k12.htm). Smartroom Learning Solutions student responders enable teachers and students to ask and answer multiple-choice, true/false, and short-answer questions.
References

Henrico County Schools. (2007). Study of Promethean interactive whiteboards in classrooms: Report of findings. Richmond, VA: Author.
Marzano, R. J. (2003). What works in schools: Translating research into action. Alexandria, VA: ASCD.
Parscale, G. (2008). Building a pyramid of interventions. In The collaborative administrator: Working together as a professional learning community (pp. 181–196). Bloomington, IN: Solution Tree Press.
Editor's note: See Robert J. Marzano's column in this issue ("Teaching with Interactive Whiteboards," p. 80) for a discussion of the effectiveness of displaying data from student responders on digital whiteboards. For more information and discussion on using student responders in class, see the November 14 post on William Ferriter’s blog "The Tempered Radical."
William M. Ferriter teaches 6th grade language arts and social studies in Raleigh, North Carolina, and blogs about the teaching life at The Tempered Radical (http://teacherleaders.typepad.com/the_tempered_radical). He is the coauthor of Building a Professional Learning Community at Work: A Guide to the First Year (Solution Tree, 2009); 919-363-1870.