
December 1996/January 1997 | Volume 54 | Number 4
Teaching for Authentic Student Performance | Pages 26-29

Learning from Performance Assessments in Math

Carol S. Parke and Suzanne Lane

Teachers using QUASAR performance assessments can improve both their math instruction and their students' achievement.

How can teachers become better informed about performance assessment? How can the information that performance assessment tasks provide help teachers to strengthen their curriculum and improve their teaching techniques? How can teachers help their students understand why these tasks are important and how tasks are scored?

Information gathered from the use of performance assessments in one reform-oriented mathematics education project can help to answer these questions.1  From the 1990-1991 school year to the present, six schools have participated in the QUASAR Project. The goal of this project is to demonstrate that math programs focused on problem solving and reasoning skills can succeed with urban middle school students of various ethnic backgrounds (Silver and Stein 1996).

For this project, we developed the QUASAR Cognitive Assessment Instrument (QCAI) and used it to monitor student outcomes and growth (Lane 1993). This instrument consists of a set of open-ended tasks that measure students' mathematical problem solving, reasoning, and communication skills. We used a holistic rubric to score student responses to each task on a scale of 0 to 4, and we conducted in-depth analyses of student performance. Over the years, the QUASAR teachers participated in various activities that helped them use this information about their students' mathematical understanding.

Learning About Performance Assessments

For performance assessments to have a positive impact on instructional practices in the classroom, teachers need to become familiar with the nature of the tasks, what content and thinking skills the tasks assess, and what constitutes a high-quality response. Teachers in the QUASAR schools participated in workshops and collaborated informally with their colleagues to gain these types of experiences. They focused on two important aspects of performance assessment:

  • Performance assessments allow students to show how they arrived at their solutions and to provide explanations for their answers, thereby yielding rich information about students' thinking and reasoning.
  • Performance assessments reveal different levels of understanding of the mathematics content. Therefore, evaluations of student responses should focus on the content of the response, not its length.

Providing Answers and Explanations

At one school, teachers worked with their resource partner, a mathematics educator at a nearby university, to discuss the in-depth analyses of student responses to a specific task. The task required students to choose the decimal number with the largest value from a list and to provide an explanation for their answer.

Many students in this school received low scores: Although they could select the correct answer, they were unable to successfully explain how they arrived at that answer. One teacher could not understand why it was necessary for students to provide an explanation. She reasoned that "the answer is the most important part; if they've got it right, there's no need to look at the explanation."

To illustrate the importance of an explanation, the resource partner showed teachers several examples where students had chosen the correct answer but did not necessarily understand why (see Figure 1).2 

Figure 1. The Importance of an Explanation

The first response displays complete and correct knowledge of decimal place value and illustrates why .8 has the greatest value. The second response includes a very incomplete and vague explanation that provides no indication of whether the student has any understanding of decimals. The third response demonstrates a misunderstanding of the value and placement of zeros in a decimal.
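The place-value reasoning in the first response can be made concrete by rewriting each decimal with a common number of places. The list below is a minimal, hypothetical sketch; the article does not reproduce the task's actual numbers:

```python
# Hypothetical decimals standing in for the task's list
# (the article does not reproduce the actual numbers).
values = [0.8, 0.08, 0.080, 0.79]

# Rewriting each value in thousandths makes the comparison explicit:
# 0.8 = 800/1000, 0.08 = 80/1000, 0.080 = 80/1000, 0.79 = 790/1000.
in_thousandths = [round(v * 1000) for v in values]
print(in_thousandths)  # [800, 80, 80, 790]
print(max(values))     # 0.8
```

Note that 0.08 and 0.080 rewrite to the same number of thousandths: appending zeros after the last decimal digit does not change a value, which is exactly the misconception the third response reveals.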

When the teachers compared the explanations, they began to see how much insight those explanations can provide into a student's level of understanding. This discussion was one of the first meaningful interactions these teachers had about their students' conceptual understandings and what they were learning in the classroom.

Looking for Content-Rich Explanations

A team of teachers at another school examined their students' responses to the QCAI tasks and agreed that they needed to place greater emphasis on written communication skills. Several students wrote lengthy explanations for some of the tasks, but those explanations contained few or no references to the mathematics in the task.

For example, one problem asked students to pretend to be radio announcers describing a race. First they had to read a graph that charted the times and distances run by the two racers. The task was designed to assess student performance in interpreting and integrating the two dimensions of the graph, time and distance, in a meaningful way.

One student wrote an elaborate description, but it did not incorporate the mathematical aspects of the graph. This student interpreted the line graphs as if they were simply the "paths" of the two runners. It appears that the student got so caught up in being a radio announcer that he overlooked the time and distance values on the axes of the graph. One teacher noted, "So it's not that they don't write, it's what they write." She then suggested that the teachers in her school should collaborate to identify ways to get students to write higher quality mathematical explanations.

Using Results to Inform Instruction

As the above examples illustrate, QUASAR teachers were becoming better informed about the nature of performance assessment tasks and the ways in which students respond to them. In other situations, groups of teachers used QCAI results to identify curriculum strengths and weaknesses. By examining the percentage of students at each score level for each task, teachers were able to evaluate the impact of their instructional program.

On one occasion, for example, 49 percent of the students received a high score (3 or 4) on a visual pattern task in the fall, but only 16 percent did so in the spring. The teachers reasoned that the decrease occurred because their instruction had focused heavily on visual patterning just before the fall administration, but they had provided no instruction on this topic during the remainder of the year. As a result of this discussion, the teachers modified their curriculum to ensure that they revisited this important area throughout the year.

In another instance, an after-school workshop for teachers focused on the importance of patterns and functions throughout the middle school mathematics curriculum. One QCAI task required students to draw the next figure in a pattern and then describe the pattern. On examination, teachers saw that although students were reasonably successful in drawing the next figure correctly, they had a great deal of difficulty describing the pattern.

The teachers began to see the importance of giving students frequent opportunities to explore and discover patterns and then gradually providing opportunities for them to describe, extend, and generalize a wider variety of patterns and functions using tables, graphs, and symbolic equations. These activities helped teachers modify and enhance their curriculum and instruction.

Familiarizing Students with the Assessments

To ensure that all students have the opportunity to perform their best, it is important that students become familiar with the open-ended nature of assessment tasks, the type of content and thinking skills such tasks assess, and the criteria used to evaluate their responses. Teachers in the QUASAR schools frequently used the QCAI practice tasks to achieve these goals.3 

One way that teachers used the task materials was to illustrate that multiple strategies can be used to solve a problem. For example, student solutions to the task in Figure 2 show three possible strategies for obtaining a correct solution. The first student used common multiples; the next student drew three separate diagrams of blocks; and the third student performed three separate long divisions and showed that they all resulted in a remainder of 1. When students shared their strategies and responses, they gained a deeper conceptual understanding of the content and began to see there is more than one way to solve a problem.

Figure 2. Multiple Responses
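The first and third strategies above can be sketched in a few lines. The Figure 2 task itself is not reproduced in the article, so the divisors below are an assumption: a problem of the form "find a number that leaves a remainder of 1 when divided by each of several numbers," which fits the strategies the students used:

```python
from math import gcd

# Hypothetical divisors; the actual Figure 2 task is not reproduced here.
divisors = [2, 3, 4]

# Strategy 1 (common multiples): any common multiple of the divisors,
# plus 1, leaves a remainder of 1 in every division.
def lcm_all(nums):
    result = 1
    for n in nums:
        result = result * n // gcd(result, n)
    return result

candidate = lcm_all(divisors) + 1   # 12 + 1 = 13

# Strategy 3 (separate divisions): verify by dividing by each divisor
# and checking the remainder, as the third student did with long division.
remainders = [candidate % d for d in divisors]
print(candidate, remainders)        # 13 [1, 1, 1]
```

Each strategy arrives at the same answer by a different route, which is the point the teachers wanted students to see.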

Teachers also engaged students in activities designed to ensure that they understood the scoring criteria used to evaluate their work. A two-day activity in one teacher's classroom resulted in a very successful learning experience for everyone involved.

First, students worked in small groups to solve a set of tasks located at different stations throughout the room. After each group had worked through the tasks, the teacher explained the criteria for each score level. Then, as a class, students examined a set of sample student responses that had been scored using these criteria.

On the following day, the students exchanged papers and scored one another's responses. The teacher discussed the importance of being objective and consistent when evaluating student work. This discussion helped students take the activity more seriously, and also to think hard about what constitutes good mathematical thinking. The students recorded the score they assigned, wrote a rationale for that score, and suggested ways to improve the response.

Figure 3 shows a response to a QCAI task that involves interpreting a graph of the speed of a boy going to his grandmother's house. The group of students who evaluated this response assigned it a score of 2 on a scale of 0 to 4. As a rationale for assigning that score, the students pointed out where the response was incorrect. For example, the story said that Tony stopped his walk from 1:30 to 2:00, neglecting to note that the graph records his speed at this time. They also noted that the response was somewhat incomplete, because the story did not mention Tony's increase in speed before arriving at his grandmother's house. The figure also shows the students' suggestion for improving the response.

Teachers found that these instructional activities not only prepared students to take the QCAI but also promoted their engagement in challenging mathematical tasks and in discussions about what constitutes a good mathematical response.

Figure 3. Interpreting a Graph

Benefits for Teachers and Students

Professional development experiences and collaboration with colleagues can help teachers in all subject areas become better informed about the nature of performance assessment tasks. These experiences also can provide teachers with the insights they need to strengthen curriculum and improve instructional techniques.

In addition, teachers can use various challenging and engaging activities to introduce students to performance assessment tasks and to familiarize them with the criteria used to evaluate their responses. As a result, students develop a better understanding of what constitutes high level thinking and reasoning.


References

Cai, J., M. E. Magone, N. Wang, and S. Lane. (1996). "A Cognitive Analysis of QUASAR's Mathematics Performance Assessment Tasks and Their Sensitivity to Measuring Changes in Middle School Students' Thinking and Reasoning." Research in Middle Level Education.

Lane, S. (1993). "The Conceptual Framework for the Development of a Mathematics Performance Assessment." Educational Measurement: Issues and Practice 12, 2: 16-23.

Lane, S., and C. S. Parke. (April 1996). "Consequences of a Mathematics Performance Assessment and the Relationship Between the Consequences and Student Learning." Paper presented at the annual meeting of the National Council on Measurement in Education, New York.

Magone, M. E., J. Cai, E. A. Silver, and N. Wang. (1994). "Validating the Cognitive Complexity and Content Quality of a Mathematics Performance Assessment." International Journal of Educational Research 21, 3: 317-340.

Parke, C. S., S. Lane, and F. Guo. (April 1995). "The Consequences of a Performance Assessment in the Context of a Mathematics Instruction Reform Project." Paper presented at the annual meeting of the National Council on Measurement in Education, San Francisco.

Silver, E. A., and M. K. Stein. (1996). "The QUASAR Project: 'The Revolution of the Possible' in Mathematics Instructional Reform in Urban Middle Schools." Urban Education 30, 4: 476-522.


1  We draw on studies (Parke, Lane, and Guo 1995; Lane and Parke 1996) that collected data from teachers, resource partners, and administrators regarding the consequences of a mathematics performance assessment (QCAI).

2  See Magone and colleagues (1994) and Cai and colleagues (1996) for a further discussion of the analysis of student responses to the tasks.

3  To monitor student progress and the instructional program, the QCAI was administered in the fall and spring of each year. The task materials included actual tasks used in earlier years.

The QUASAR Project is funded by the Ford Foundation and directed by Edward A. Silver at the University of Pittsburgh.

Carol S. Parke is a Research Specialist in the School of Education, University of Pittsburgh, SCOI Forbes Quad, Pittsburgh, PA 15260. Suzanne Lane is an Associate Professor of Research Methodology in the School of Education, University of Pittsburgh, SCOI Forbes Quad, Pittsburgh, PA 15260.

