March 2014 | Volume 71 | Number 6 | Using Assessments Thoughtfully | Pages 82-83
Sonny Magaña and Robert J. Marzano
Despite the research on the positive effects of efficient and timely feedback,1 a significant time gap often exists between student responses to questions on an assessment and teacher feedback on those responses. Because teachers often take the assessments home to score by hand, the "feedback gap" can be at least a day and, in some cases, several days, a week, or more.
This lag time greatly diminishes the opportunity for students to think about their thinking, reflect on their errors, and revise their knowledge. The more time that elapses between a student response and teacher feedback, the less metacognitive reflection that takes place. In some situations, students never even get the opportunity to review their incorrectly answered questions so they can revise their knowledge.
The recent emergence of polling technologies—such as clickers, student response systems, and free online resources like Poll Everywhere or Socrative—can potentially diminish or even eradicate the feedback gap. Armed with almost instantaneous feedback on their responses, students are more able to reflect on their thinking and, often with teacher guidance, revise their knowledge on the spot.
Polling technologies are helpful to teachers, too. Teachers can administer more frequent assessments because these technologies are easy to use, and when teachers predetermine the correct answers, the assessments literally grade themselves. Instead of grading tests by hand, teachers can spend more time studying student progress, making informed inferences about student learning needs, and providing timely interventions.
Polling technologies also enable all students in a class to respond to teacher-posed questions simultaneously, increasing student response rates to 100 percent. In addition, teachers can pose questions and get instant student responses throughout a class period, making assessment a seamless part of the flow of instruction.
Of course, polling technology is not required for real-time formative assessment. Teachers often ask students to use hand signals to respond to a question. However, if a teacher asks students to raise their hands if they don't understand, typically few students will raise their hands. Alternatively, if the teacher asks students to publicly vote on a question using thumbs up or thumbs down, some students might first look around the room to determine how others have voted.
In contrast, when students respond in the relatively anonymous mode afforded by polling technologies—only the student and the teacher know how that student responded—their answers more accurately reflect their actual thinking, giving the teacher more trustworthy assessment data to act on.
Let's say a teacher has used polling devices to collect student responses to six multiple-choice questions he or she asked in the course of a class period. Each device has its own number and is assigned to a particular student; student responses are listed for each voting device and student name. As soon as a student enters a response, it's automatically scored, with a checkmark indicating a correct answer. Of course, this immediate scoring requires that the teacher use selected-response items, such as multiple choice, matching, true or false, and the like.
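For readers who want to picture the mechanics, here is a minimal sketch of the kind of automatic scoring such systems perform. It assumes a predetermined answer key and responses keyed by device number; all names and values here are illustrative, not drawn from any particular polling product.

```python
# Illustrative sketch: score selected-response answers against a
# predetermined key the moment each response arrives.
ANSWER_KEY = {1: "B", 2: "D", 3: "A", 4: "C", 5: "B", 6: "D"}  # hypothetical key

def score_response(question: int, answer: str) -> bool:
    """Return True (a checkmark) if the selected answer matches the key."""
    return ANSWER_KEY[question] == answer

# Responses arrive as (device number, question, answer) as students press buttons.
responses = [(12, 1, "B"), (12, 2, "C"), (7, 1, "B")]
for device, question, answer in responses:
    mark = "✓" if score_response(question, answer) else "✗"
    print(f"Device {device}, Q{question}: {mark}")
```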
The display of student responses shows how all students in the class answered these six problems. It clarifies for teachers which students struggle with which information as well as which students seem to have mastered the material. It also shows how the class did globally: Only one-half of the class might have answered Question 4 correctly; Question 6 might have stumped almost everyone.
Teachers can display these data to the class with the column listing student names hidden. As long as students know the number of their polling device, they can see how their answers compared with those of other students, while still maintaining their anonymity. Teachers can easily initiate discussions about why several students might have answered a question or questions incorrectly.
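A hedged sketch of how such a display might be assembled: scored responses are grouped by question to show the class-wide percentage correct, and each row is keyed only by device number so student names stay hidden. The data structure and sample values are purely illustrative.

```python
# Illustrative sketch: aggregate scored responses by question and by device,
# showing class-wide percentage correct without exposing student names.
from collections import defaultdict

# scored[device][question] = True/False, e.g. produced by the scoring step above
scored = {
    12: {1: True, 2: False, 3: True, 4: False, 5: True, 6: False},
    7:  {1: True, 2: True,  3: True, 4: True,  5: False, 6: False},
}

per_question = defaultdict(list)
for device, answers in scored.items():
    for question, correct in answers.items():
        per_question[question].append(correct)

for question in sorted(per_question):
    results = per_question[question]
    print(f"Q{question}: {100 * sum(results) / len(results):.0f}% correct")

# One anonymized row per device number -- each student recognizes only their own device.
for device, answers in sorted(scored.items()):
    row = " ".join("✓" if answers[q] else "·" for q in sorted(answers))
    print(f"Device {device}: {row}")
```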
Polling technologies need not require new hardware. In classrooms in which all students own or have access to a smartphone, students can take assessments using free polling software services, such as those noted above, or services that charge a nominal fee.
Polling technologies also offer an opportunity for teachers to use proficiency scoring2 when creating tests. Proficiency scoring begins by writing items (or selecting them from an item bank) that reflect three levels of proficiency. These three levels are communicated to students as learning goals.
For example, a teacher might write learning goals at the basic, proficient, and advanced levels for the topic of democracies.
To return to the six-question, multiple-choice assessment, the first three questions might focus on content at the basic level, the next two questions on content at the proficient level, and the last question on content at the advanced level. A display of student responses would give the teacher an immediate picture of each student's performance at each of the three levels, as well as how the class did as a whole.
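To make that per-level picture concrete, here is a small illustrative sketch (not from the article's own materials) that maps the six questions to the three proficiency levels as described above and summarizes one student's performance at each level.

```python
# Illustrative sketch: map questions to proficiency levels and summarize a
# student's performance at each level. The mapping follows the example in
# the text: Q1-Q3 basic, Q4-Q5 proficient, Q6 advanced.
LEVEL_OF_QUESTION = {1: "basic", 2: "basic", 3: "basic",
                     4: "proficient", 5: "proficient", 6: "advanced"}

def level_summary(answers: dict[int, bool]) -> dict[str, str]:
    """Return 'correct/total' for each proficiency level."""
    totals, corrects = {}, {}
    for question, correct in answers.items():
        level = LEVEL_OF_QUESTION[question]
        totals[level] = totals.get(level, 0) + 1
        corrects[level] = corrects.get(level, 0) + int(correct)
    return {level: f"{corrects[level]}/{totals[level]}" for level in totals}

# One student's scored answers (hypothetical).
print(level_summary({1: True, 2: True, 3: True, 4: True, 5: False, 6: False}))
# e.g. {'basic': '3/3', 'proficient': '1/2', 'advanced': '0/1'}
```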
It's important to note that patterns of student responses to short classroom assessments will not necessarily provide accurate profiles. A student might guess at an answer or miss an item because he or she didn't read it correctly. However, if students answer items at all three levels of difficulty several times during a week, accurate patterns should surface for each student.
Polling technologies offer a new type of assessment that benefits both students and teachers as it closes the feedback gap. It's time to try them.
1 Hattie, J. (2009). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
2 Marzano, R. J. (2012). An easier way to score tests. Educational Leadership, 69(6), 82–83.
Sonny Magaña is associate vice president of Marzano Research Laboratory and director of the Educational Technology Division. Robert J. Marzano is cofounder and CEO of Marzano Research Laboratory in Denver, Colorado, and executive director of the Learning Sciences Marzano Center in Palm Beach Gardens, Florida. He is coauthor, with Michael Toth, of Teacher Evaluation That Makes a Difference: A New Model for Teacher Growth and Student Achievement (ASCD, 2013).