February 1, 2003 | Vol. 60, No. 5

No Schools Left Behind

Victoria L. Bernhardt
Schools can get a better picture of how to improve learning for all students by gathering, intersecting, and organizing different categories of data more effectively.

Educators often begin improving their schools by asking two questions: “What data should we be analyzing to help our school improve?” and “What data besides the state's standardized test results can we use?”
If you have asked these questions, the good news is that your school probably already has an abundance of powerful data on the effectiveness of all parts of the school—and much of this information does not come from standardized tests. The bad news is that if you are not using these data, your school or district probably has not yet organized the data for easy access and analysis.
What kinds of data are important for continuous school improvement, and how can you best organize the data for easy access and analysis?

Four Kinds of Data

You can answer almost any question about the effectiveness of a school by gathering, intersecting, and analyzing four kinds of data.
Demographic data describe the students, the school's staff, the school, and the surrounding community (see fig. 1). This information delineates the context in which the school operates and is crucial for understanding all other data. By disaggregating information by demographics (for example, by gender or ethnicity), you can understand what impact the education system is having on different groups of students.

Figure 1. Data You Can Use to Analyze Demographics

About Students

  • Number of students in the school

  • Class sizes

  • Absences/tardies

  • Gender

  • Ethnicity/race

  • Home background

  • Lunch status (free, reduced, or full price)

  • Language proficiency

  • Preschool attendance

  • Special needs

  • Mobility

  • Retention rates

  • Dropout rates

  • Graduation rates

  • Post-graduation employment/education

  • Extracurricular activities

  • Honors/advanced placement status

  • Employment during high school

About Staff

  • Numbers of teachers, administrators, paraprofessionals, and support staff

  • Years of experience (total number of years and by grade level)

  • Absences

  • Gender

  • Ethnicity/race

  • Retirement projections

  • Types of certification

  • Student-teacher ratios

  • Professional development opportunities

  • Extracurricular activity involvement/committees

About the School

  • History

  • Safety/crime data

  • Special qualities and strengths

  • Turnover rate of teachers and staff

  • Community support

  • Programs offered

About the Community

  • Location and history

  • Makeup of the population

  • Economic base

  • Population trends

  • Types of employers in the community

  • Projections of growth

  • Parent employers

  • Community/business involvement

  • Support agencies

 


Student learning data include a variety of measurements—norm-referenced tests, criterion-referenced tests, standards assessments, teacher-assigned grades, and authentic assessments—that show the impact of your education system on your students.
Perceptions data—gathered through questionnaires, interviews, and observations—help you understand what students, parents, teachers, and the community think about the learning environment. People act according to what they believe about different topics, so if you want to change a group's perceptions, you have to know about their beliefs. Student perceptions, for example, can tell you what motivates students to learn, and staff perceptions can indicate what kind of change is possible and necessary within the school.
School processes data include the school's programs, instructional strategies, assessment strategies, and classroom practices. Keeping track of these processes through careful documentation helps you build a continuum of learning for all students.

Data Snapshots

  • How many students are enrolled in your school this year? (demographic)
  • How did students at your school score on the state test? (student learning)
  • What are parent, student, and staff perceptions of the learning environment? (perceptions)
  • What special programs are operating in your school this year? (school processes)
  • How has enrollment in the school changed? (demographic)
  • Have student scores on standardized tests changed during the past several years? (student learning)
  • How have parent, student, and teacher perceptions of the learning environment changed? (perceptions)
  • What programs have operated in the school during the past five years? (school processes)

Intersecting Two Data Categories

  • Do students who attend school every day get better grades? (demographic/student learning)
  • Do students with positive attitudes toward school do better academically, as measured by teacher-assigned grades? (perceptions/student learning)
  • Did students enrolled in interactive math programs this year perform better on standardized achievement tests than those who took traditional math courses? (student learning/school processes)
  • What strategies do 3rd grade teachers use to teach students with native languages different from their own? (demographic/school processes)
  • Is there a difference in how students enrolled in different programs perceive the learning environment? (perceptions/school processes)
  • Is there a gender difference in students' perceptions of the learning environment? (perceptions/demographic)
Looking at the intersection of two kinds of data over time lets you see trends developing. For example, standardized achievement test scores disaggregated by ethnicity over the past three years can help a school see whether differences between one ethnic group's scores and those of other groups constitute a trend or just a fluctuation.
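
In practice, intersecting two categories usually means joining two extracts on a student identifier and then disaggregating. The sketch below, in Python with pandas, shows one hypothetical way to cross attendance (demographic) with teacher-assigned grades (student learning); the file names and column names are illustrative assumptions, not fields from any particular student information system.

import pandas as pd

# Hypothetical extracts; file and column names are illustrative only.
demographics = pd.read_csv("student_demographics.csv")   # student_id, gender, ethnicity, days_absent
grades = pd.read_csv("teacher_grades.csv")               # student_id, course, grade_points

# Join the two categories on a common student identifier.
merged = demographics.merge(grades, on="student_id", how="inner")

# Bucket attendance so the intersection is easy to read.
merged["attendance_band"] = pd.cut(
    merged["days_absent"],
    bins=[-1, 5, 15, 200],
    labels=["0-5 absences", "6-15 absences", "16+ absences"],
)

# Do students who attend school more often earn better grades?
print(merged.groupby("attendance_band", observed=True)["grade_points"].mean().round(2))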

Intersecting Three Data Categories

  • Do students of different ethnicities perceive the learning environment differently, and do they score differently on standardized achievement tests consistent with these perceptions? (demographic/perceptions/student learning)
  • Which program this year is making the biggest difference in achievement for at-risk students, and is one group of students responding more successfully to the program than are other students? (school processes/student learning/demographic)
  • Is there a difference in students' reports of what they like most about the school according to whether they participate in extracurricular activities? Do students who participate have higher grade-point averages than students who don't participate? (perceptions/student learning/school processes)
  • What instructional process did the previously non-English-speaking students enjoy most in their all-English classrooms this year? (perceptions/demographic/school processes)
Looking at these data over a period of time will allow you to see trends, to understand the learning environment from the students' perspectives, and to know how to deliver instruction to get the best possible results for all students.

Intersecting Four Data Categories

  • Are there differences in achievement scores for 8th grade girls and boys who report that they like school, by the type of program and grade level in which they are enrolled? (demographic/perceptions/school processes/student learning)
Not until you intersect all data categories at the school level and over time will you be able to answer questions that allow you to predict whether the actions, processes, and programs that you are operating will meet the needs of all students. By crossing all four data categories, you are taking into account who your students are, how they prefer to learn, which subgroups of students are achieving, and with which processes students achieve.
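
A minimal sketch of what a four-way intersection might look like once all four categories sit in one merged table. Everything here is an assumption for illustration: the file name, the column names, and the survey item likes_school do not refer to any real system or instrument.

import pandas as pd

# Assume one row per student with fields drawn from all four categories
# (illustrative column names, not a required schema).
data = pd.read_csv("merged_student_records.csv")
# columns: student_id, grade_level, gender, program,
#          likes_school (perception survey item), reading_score

eighth = data[data["grade_level"] == 8]

# Achievement by gender and reported attitude, broken out by program.
summary = (
    eighth.groupby(["program", "gender", "likes_school"])["reading_score"]
    .agg(["count", "mean"])
    .round(1)
)
print(summary)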

What Does Using Data Look Like?

Sixty percent of 3rd graders at Archer Elementary School scored below the proficient level on the state's criterion-referenced reading test. To understand this achievement problem, the staff disaggregated the below-proficient 3rd grade scores (student learning) by gender and ethnicity (demographic). They discovered that the scores of boys and girls were similar but that one ethnic group had consistently lower scores. An examination of the data for the past three years revealed the same scenario.
Were there differences in the way these students were taught (school processes)? By disaggregating the student achievement scores by gender, ethnicity, and teacher, the staff noticed that each teacher's students received generally consistent scores over time, and certain teachers' students who were of this particular ethnicity had never scored at or above the proficient level in three years. At the same time, other teachers' students of this ethnicity scored above proficient. Looking at the student questionnaire results (perceptions) by ethnicity, the teachers were stunned to see that this ethnic group scored the lowest in response to such statements as “My teacher thinks I will be successful,” “My teacher believes I can learn,” “I am recognized for good work,” “I know what I am supposed to be learning in my classes,” and “Students are treated fairly at this school.” Staff questionnaires showed that not all teachers agreed with the statement that “all students can learn.”
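
The disaggregation the Archer staff performed can be approximated in a few lines of analysis. This is a hypothetical sketch; the file name and columns below are assumed for illustration and are not drawn from the school's actual records.

import pandas as pd

# Hypothetical extract: one row per 3rd grader per year, with assumed columns.
scores = pd.read_csv("grade3_reading_scores.csv")
# columns: student_id, teacher, gender, ethnicity, year, proficient (1 = proficient, 0 = not)

# Proficiency rate by teacher and ethnicity, across the three years on hand.
rates = (
    scores.groupby(["teacher", "ethnicity", "year"])["proficient"]
    .mean()
    .unstack("year")
    .round(2)
)
print(rates)
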
After reviewing these data, the district undertook personnel changes at the school and helped the school set up new school procedures: diagnosing student learning at the beginning of the year in every grade level, clarifying what students should know and be able to do by the end of each year, aligning curriculum and instruction to district standards, and measuring progress toward learning goals throughout the year.
Taking a different approach, staff members at Canyon View High School wanted to use their data to understand why more than half of the school's 9th grade students failed the state reading proficiency examination. Working backward through the students' education experiences, the teachers looked for the earliest characteristic common to the students who had not passed the exam. They were shocked to see that most of these students had missed as many as 30 or 40 days of a 180-day school year as 1st graders.
These 9th graders and the students in grades just below them were already getting remedial reading help, but the new data provided an opportunity to save younger students from the same fate. The district began more extensive screening of elementary and middle school students who were likely to suffer academically because of high absenteeism in early years. Teachers, counselors, and principals followed up by working closely with parents—setting up telephone trees, for example, and in some cases making home visits—to make sure that the children got to school.

Data Access

To be done well, data analysis requires the technical support of knowledgeable people and a database or data warehouse. Districts are just now beginning to buy data warehouses that facilitate the storage and analysis of a large number of data elements quickly, easily, accurately, and meaningfully. School districts or schools that do not have such a tool should consider buying a data warehouse; in fact, not having one should no longer be an option. When looking for a database or data warehouse, districts and schools should look for the following six features:
Accessibility at different levels. Even if the data are stored at the district, regional, or state level, educators should be able to gain access to them at their schools and in their classrooms.
Automatic graph builders. Being able to look over tables to check for accuracy is important, but the data analysis tool should be able to build graphs so that everyone can see the information in the same picture form.
Disaggregation on the fly. Anyone performing an analysis that is starting to show interesting data should be able to gain access to the next deeper levels quickly and easily.
Intuitive point-and-click or drag-and-drop technology. Everyone should be able to use the database without referring to a manual.
The ability to follow individual and group student achievement. Districts should be able to follow achievement from pre-K through grade 12.
Fast and easy creation of standard reports. Some reports—for school accountability or Title I, for example—have to be created every year and require much of the same information. Recreating the document should be possible with the click of a button.

Getting Started

Educators should start by organizing the data already on hand, such as student information and standardized test results. Both are usually in electronic form, so importing the information into a data warehouse or data analysis tool is easy. Next, educators should think about how they can intersect those data to answer questions about program implementation. Gathering data for its own sake is counterproductive and often results in "analysis paralysis." The goal of using data to improve learning for all students should always be paramount.
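
For example, a spreadsheet export from the student information system and the state test file can be combined into a single analysis table as a first step. The sketch below assumes two hypothetical CSV exports with made-up column names and shows one way to line them up by student identifier.

import pandas as pd

# Hypothetical exports from existing systems; file and column names are assumptions.
students = pd.read_csv("student_information_export.csv")   # student_id, school, grade_level, ...
scores = pd.read_csv("state_test_results.csv")             # student_id, subject, scale_score, proficiency

# One table, ready for disaggregation by any demographic field on hand.
warehouse = students.merge(scores, on="student_id", how="left")
warehouse.to_csv("analysis_table.csv", index=False)

# Quick check: proficiency rates by school and grade level.
rates = (
    warehouse.groupby(["school", "grade_level"])["proficiency"]
    .apply(lambda s: (s == "proficient").mean())
    .round(2)
)
print(rates)
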
The district should provide an expert to perform the major analyses for the district and each of its schools. Teachers should be able to spend their time studying the results instead of tracking down data or performing analyses themselves. They should be able to start the school year with historical data on each of their students and a full picture of what students already know and what they need to learn. And they should use ongoing measurements to make sure that all students are progressing and mastering the content.
When student learning measures are the only focus of a school's data analysis efforts, school personnel end up using their time figuring out how to look better on the student learning measures. This narrow approach has limited results. By contrast, looking at student achievement results in conjunction with the context of the school and the processes that create the results gives teachers and administrators important information about what they need to do to improve learning for all students.
End Notes

1 The schools mentioned in this article are identified by pseudonyms.

2 Bernhardt, V. L. (2000). Designing and using databases for school improvement. Larchmont, NY: Eye on Education.

Victoria L. Bernhardt, PhD, is executive director of the Education for the Future Initiative, whose mission is to build the capacity of learning organizations at all levels to gather, analyze, and use data to continuously improve learning for all students. She is also a professor emeritus in the College of Communication and Education at California State University, Chico. Bernhardt received her PhD in Educational Psychology Research and Measurement, with a minor in Mathematics, from the University of Oregon.

She has worked for more than 25 years with learning organizations all over the world to assist them with their continuous improvement and comprehensive data analysis work. She is the author of 20 books, including the widely recognized Data Analysis for Continuous School Improvement, 3rd Edition.

From our issue: Using Data to Improve Student Achievement