February 1, 2000 | Vol. 57, No. 5

Research Link / A Value-Added View of Pupil Performance

Some assessments show whether students know more or less than their counterparts who took the test in previous years. Value-added assessment, however, measures how much an individual student has learned. Barton and Coley (1998) recommend such testing because "it adds an important dimension to be able to say how much students learned, as well as how much they know" (p. 5). But how well does value-added assessment take into account an assortment of variables that influence achievement?

Some Examples

The Tennessee Value-Added Assessment System (TVAAS) uses value-added testing to develop a profile of academic growth for individual students. Scaled scores indicate a student's current level of academic attainment. When collected over time, this information provides a profile of academic growth for each student. By statistically aggregating these data, researchers get a fairly reasonable picture of the impact of a school system, a school, and an individual teacher on student learning (Pipho, 1998).
Data derived from student achievement tests have traditionally been used to place an individual student somewhere within the distribution of the general student population. Pipho (1998) considers this unfair because the tests do not take into account the socioeconomic status of students. The TVAAS adjusts for these influences by using achievement data as input to a complex longitudinal analysis. By studying related factors, researchers can estimate with considerable sensitivity the relative effectiveness of school districts, schools, and teachers in facilitating the academic growth of student populations. This value-added model gives Tennessee researchers enough evidence to suggest that the single largest factor affecting academic growth is the difference in effectiveness among individual classroom teachers.
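The core idea of a gain profile can be illustrated with a simplified sketch: compute each student's growth in scaled score from one year to the next, then aggregate by teacher. The data and names below are hypothetical, and the real TVAAS relies on far more elaborate longitudinal mixed-model statistics, not a simple mean of gains.

```python
# Simplified illustration of value-added aggregation: per-student scaled-score
# gains from one year to the next, averaged by teacher.
# Hypothetical data; the actual TVAAS fits complex longitudinal mixed models.
from collections import defaultdict

# (student, teacher, scaled score year 1, scaled score year 2)
records = [
    ("ann",  "t_smith", 480, 510),
    ("ben",  "t_smith", 455, 490),
    ("cara", "t_jones", 500, 505),
    ("dev",  "t_jones", 470, 478),
]

gains = defaultdict(list)
for student, teacher, before, after in records:
    gains[teacher].append(after - before)   # each student's growth, not level

avg_gain = {t: sum(g) / len(g) for t, g in gains.items()}
print(avg_gain)   # mean growth attributed to each teacher
```

Note that the aggregation uses growth (after minus before) rather than the final score, which is what distinguishes a value-added view from a conventional ranking by attainment.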
Bryk, Thum, Easton, and Luppescu (1998), researchers for the Chicago Public Schools, use an assessment model that parallels the Tennessee initiative. They, too, use caution in calculating exactly how much a student has learned over time. For instance, they recognize that, for security reasons, test forms change from year to year, so neither test content nor scales are necessarily constant over time or from one test administration to the next:

Thus, when we look at 10-year trends in score reports, we are, in essence, judging students, schools, and the system against a moving target. . . . As a result, a teacher may see, for example, that students in her classroom clearly know more mathematics than previous classes of students, but their standardized test scores may still come back lower. (pp. 14-15)

To level the playing field, these researchers engage in the extensive task of making the test forms comparable.
Because of student mobility, the Chicago researchers also caution against using the average achievement level of students as an indicator of school productivity over time. If a group of students enrolls in a school sometime during the academic year, even on the day just before testing, these students' scores will be counted as part of the overall achievement level for the school. Because a school should be held responsible only for the learning that occurs among students who were actually taught in that school, we must consider the gains in achievement made by particular students at each grade in the school for each year.
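The mobility caution above can be made concrete: a naive school average counts every score on test day, whereas a value-added view averages gains only for students the school actually taught across both administrations. The roster and numbers below are invented for illustration and are not the Chicago consortium's actual procedure.

```python
# Hypothetical roster: (student, school at pretest, school at posttest,
#                       pretest score, posttest score)
tests = [
    ("s1", "lincoln", "lincoln", 210, 230),
    ("s2", "lincoln", "lincoln", 200, 215),
    ("s3", "oak",     "lincoln", 190, 192),  # transferred in mid-year
]

# Naive school average counts the late arrival's score against the school.
naive_avg = sum(post for *_, post in tests) / len(tests)

# Value-added view: average gain only for students enrolled at the school
# for both test administrations.
stayers = [(pre, post) for _, pre_sch, post_sch, pre, post in tests
           if pre_sch == post_sch == "lincoln"]
avg_gain = sum(post - pre for pre, post in stayers) / len(stayers)

print(f"naive average level: {naive_avg:.1f}")
print(f"mean gain for continuously enrolled students: {avg_gain:.1f}")
```

In this toy example the transfer student lowers the school's average attainment even though the school had almost no opportunity to teach that student, which is exactly the distortion the gain-based calculation avoids.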
The Sweetwater Union High School District in California is an excellent example of how value-added testing helped identify student successes (Goycochea, 1998). Compared with other schools in that district, National City Middle School consistently ranked at or near the bottom academically. Using the value-added approach, faculty members compared students' final outcomes on a writing test with their incoming performance. They used conventional statistical procedures for analyzing student data to compare the students' writing scores with those of students at the district's flagship school.
In 1992-93, only 16 percent of the National City Middle School student body (104 students) achieved a passing grade on the writing test. By 1996-97, that percentage had risen to 30.8 (233 students). On the basis of these data, faculty members concluded that National City Middle School was in reality as effective at raising its students' writing scores as the school considered the best in the district. The real difference between the two schools was the students' incoming academic level, a factor directly related to socioeconomic conditions.
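The Sweetwater comparison comes down to comparing growth rather than final levels. A minimal sketch with invented scores (the article reports only National City's pass rates, so the figures below are hypothetical):

```python
# Hypothetical mean writing scores (0-100 scale) for two schools.
# Outcome-only ranking favors the "flagship"; gains tell a different story.
national_city = {"incoming": 42.0, "final": 58.0}
flagship      = {"incoming": 66.0, "final": 81.0}

for name, s in [("National City", national_city), ("Flagship", flagship)]:
    gain = s["final"] - s["incoming"]
    print(f"{name}: final {s['final']}, gain {gain}")
```

With these invented numbers, both schools add roughly the same 15-16 points of growth; only the starting point differs, mirroring the article's conclusion that incoming level, not teaching effectiveness, separated the two schools.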
Even though most educators understand the relationship between socioeconomic conditions and academic achievement, they tend to overlook it when school effectiveness is assessed by outcome-only approaches. It is equally harmful to give students a false sense of progress by identifying a school as the "flagship": "Are we content to have students coast and then stop altogether once they achieve the minimum outcome required?" (Goycochea, 1998, p. 33).

The Value of Adding Value

Pupil assessment is crucial in bringing about reform—and value-added testing is a major player in this arena. But, as with so many educational endeavors, we must understand all the variables that influence our final results to ensure that our conclusions are sound.
References

Barton, P. E., & Coley, R. J. (1998). Growth in school: Achievement gains from the fourth to the eighth grade [Policy Information Report]. Princeton, NJ: Educational Testing Service.

Bryk, A. S., Thum, Y. M., Easton, J. Q., & Luppescu, S. (1998). Academic productivity of Chicago public elementary schools. Chicago: Consortium on Chicago School Research.

Goycochea, B. B. (1998, December). Rich school, poor school. Educational Leadership, 55, 30-33.

Pipho, C. (1998). The value-added side of standards. Phi Delta Kappan, 79 (5), 341-342.

From the issue: What Do We Mean by Results?