December 1, 2008 | Vol. 66, No. 4

Answering the Questions That Count

Examining student data through the lens of pressing questions can mobilize staff, promote data literacy, and help raise student achievement.


Daily life in districts and schools requires educators to navigate a sea of data: diagnostic and norm-referenced standardized assessment data, reading assessment data, and state and local assessment data, along with other data related to instructional programs and to demographic, attendance, and dropout trends. This new level of applied data use requires district and school administrators, teacher leaders, and classroom teachers to be data literate, that is, able to use multiple types of assessment and other data to inform decisions that lead to higher student achievement.
Despite the increasing amount of data available, many educators still feel ill prepared to analyze and use their school data effectively. They are data rich but information poor. Our experiences working with data use in schools and districts have led us to define an effective framework for building data literacy. This framework is fueled by an essential-questions approach that organizes data use around a cycle of inquiry, and it is grounded in three core components of systemwide data use: data quality, data capacity, and data culture.

The Essential-Questions Approach

A study of data use in several urban U.S. high schools showed that when school leaders used questions to focus the collaborative examination of data, school staff became more engaged in the process. When important questions drove the dialogue about school effectiveness, school staff quickly learned how to identify and use different types of data to answer those questions (Lachat & Smith, 2004). Essential questions like the following can focus that dialogue:
  • How do student outcomes differ by demographics, programs, and schools?
  • To what extent have specific programs, interventions, and services improved outcomes?
  • What is the longitudinal progress of a specific cohort of students?
  • What are the characteristics of students who achieve proficiency and of those who do not?
  • Where are we making the most progress in closing achievement gaps?
  • How do absence and mobility affect assessment results?
  • How do student grades correlate with state assessment results and other measures?
Asking questions such as these enables administrators and teachers to focus on what is most important, identify the data they need to address their questions, and use the questions as a lens for data analysis and interpretation. To avoid the common tendency to get lost in a long list of questions, district or school staff should, in general, identify no more than five or six crucial questions that get at the heart of what they need to know.
The essential-questions approach provides the fuel that drives collaborative analysis. But to use data purposefully and in a sustained way over time, schools and districts have to address three interrelated components of systemwide data use.

Data Quality

Teachers and administrators need to believe in the completeness and accuracy of the data they are expected to use. Data must be sufficiently disaggregated to address questions of concern, displayed in easy-to-understand formats, and available in a timely manner for instructional planning. The most effective way to ensure these conditions is to use technology that supports data disaggregation, provides data access, and generates useful data displays.
Research has emphasized that data disaggregation is essential to effective data use (Johnson, 2002; Lachat & Williams, 2003). Districts and schools are used to getting assessment data that are broadly disaggregated by gender or race/ethnicity. However, disaggregating assessment data by combinations of students' demographic characteristics (that is, race/ethnicity by gender or disability) and by the programs in which students are enrolled (that is, race/ethnicity by specific reading or mathematics programs) enables schools to examine the effectiveness of programs for specific groups of students. In addition, disaggregated data linking attendance, mobility, or course grades to assessment results are helpful when looking at the kinds of factors that may influence student performance.
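For readers who work directly with student-level data files, the sketch below shows what this kind of multi-way disaggregation can look like using Python's pandas library. The column names (race_ethnicity, gender, reading_program, scale_score) and the handful of records are invented for illustration; a district's data warehouse would supply the real table.

```python
# A minimal sketch of multi-dimensional disaggregation with pandas.
# All column names and records below are hypothetical illustrations.
import pandas as pd

# Student-level records as they might come out of a data warehouse.
students = pd.DataFrame({
    "race_ethnicity":  ["Hispanic", "Hispanic", "White", "Black", "White", "Black"],
    "gender":          ["F", "M", "F", "M", "M", "F"],
    "reading_program": ["Program A", "Program B", "Program A",
                        "Program B", "Program A", "Program B"],
    "scale_score":     [612, 587, 640, 598, 655, 621],
})

# Broad disaggregation: one characteristic at a time.
print(students.groupby("race_ethnicity")["scale_score"].mean())

# Deeper disaggregation: combinations of characteristics and programs,
# which lets a school ask how a program serves specific student groups.
profile = students.pivot_table(
    values="scale_score",
    index=["race_ethnicity", "gender"],
    columns="reading_program",
    aggfunc=["mean", "count"],
)
print(profile)
```

Reporting the count alongside the mean matters in practice: a cell built on three students should be read very differently from one built on three hundred.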
Data-warehousing applications provide the deep disaggregation necessary for meaningful data use (Mieles & Foley, 2005; Wayman, Cho, & Johnston, 2007; Wayman, Stringfield, & Yakimowski, 2004), and midsize to large districts are increasingly adopting these applications. Researchers have also noted the importance of technology in creating data access and presenting data in useful formats (Mandinach & Honey, 2008; Rudner & Boston, 2003).
Although many schools we have worked with have some form of data-warehousing technology, training typically focuses on how to use the technology itself rather than on how to make meaning of the data. The essential-questions approach has helped educators recognize the power and potential of going beyond aggregated data: identifying which data they need, when they need it, and the multiple ways the data should be disaggregated.

What It Looks Like in Practice

Improving students' reading skills was a major issue for three high schools in an urban district in which the majority of students scored below grade level on reading assessments. Hispanic students, many of whom were English language learners, constituted more than one-half of the population in all three high schools; the percentage of low-income students ranged from 60 to 85 percent. The 10th grade English language arts teachers raised questions such as these:
  • How did students in each of my 10th grade course sections perform on the 9th grade reading vocabulary and reading comprehension assessment?
  • Do some of my course sections have a higher proportion of students below grade level in reading skills?
  • What is the variation in students' reading skills within each of my course sections?
  • How can I meet the instructional needs of students with varying skill levels in reading?
The teachers' questions indicated that they wanted the 9th grade vocabulary and comprehension subskill results from the standardized reading assessment disaggregated by 10th grade English language arts course section, so that each teacher would have a profile of the reading abilities of the students currently in his or her classes. Answering these questions, however, first required the schools to confront several data-quality problems:
  • The schools had received aggregate total reading results rather than subskill results.
  • Data for the 9th grade assessment were organized by the 9th grade homerooms in which the students took the test rather than by the 10th grade English teachers' classrooms in which the students were now enrolled.
  • The high schools had no test results for students who had transferred from another high school in the district.
  • The schools did not have a dissemination plan to get data to teachers on a timely schedule.
The district's high school reform initiative provided the stimulus and funds to address these data-quality issues. The district acquired technology services that used a data-warehousing application to disaggregate vocabulary and reading comprehension results by students' current course sections and to provide information about vocabulary subskills, including basic vocabulary, synonyms, words with multiple meanings, and use of context clues. The district's director of information services set up new data-verification procedures to ensure that high schools had complete records on transfer students, including assessment results. With the assistance of the data coach, school principals developed a dissemination plan that identified what data would be available and when, who would get the data, and how staff members might use it.
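The reorganization that the district's technology services carried out is, at bottom, a join between last year's assessment records and this year's enrollment. Here is a minimal pandas sketch of that step, assuming hypothetical table and column names (grade9_results, current_sections, student_id, and the four subskill columns) rather than the warehouse's actual schema.

```python
import pandas as pd

# Hypothetical 9th grade subskill results, keyed by student, as
# originally organized by the homeroom where students took the test.
grade9_results = pd.DataFrame({
    "student_id":        [101, 102, 103, 104],
    "basic_vocabulary":  [2.1, 3.0, 1.8, 2.7],   # grade-equivalent scores
    "synonyms":          [1.9, 2.8, 1.5, 2.4],
    "multiple_meanings": [1.7, 2.6, 1.6, 2.2],
    "context_clues":     [3.1, 3.4, 2.9, 3.3],
})

# Hypothetical current enrollment: which 10th grade English section
# each student sits in now.
current_sections = pd.DataFrame({
    "student_id":  [101, 102, 103, 104],
    "ela_section": ["ELA10-1", "ELA10-2", "ELA10-1", "ELA10-2"],
})

# Join last year's results to this year's rosters, then profile
# each section's reading subskills.
merged = current_sections.merge(grade9_results, on="student_id", how="left")
section_profile = merged.groupby("ela_section")[
    ["basic_vocabulary", "synonyms", "multiple_meanings", "context_clues"]
].mean()
print(section_profile)
```

A left join keeps every currently enrolled student on the roster, so a transfer student with no test record shows up as a visible gap to fill rather than silently disappearing, which is exactly the problem the district's new data-verification procedures addressed.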
Addressing data quality and disaggregating data for different course sections meant that the 10th grade teachers could answer their essential questions about students currently in their classes. They learned that most students were below grade level in recognizing synonyms and determining the meaning of words with multiple meanings but were at grade level or above in using context clues. This enabled teachers to target instruction on word study and word analysis. The schools' literacy coaches then posed questions of their own:
  • How much instructional support will teachers need for students below grade level in reading vocabulary and reading comprehension?
  • How can I use reading subskill data profiles to analyze teachers' professional development needs?
The literacy coaches helped teachers learn instructional strategies that are particularly relevant to English language learners' needs. These included strategies for building vocabulary, helping students understand text structures, and using anticipation guides, graphic organizers, and think alouds. The focus on data continued beyond the reform initiative, and the disaggregated subskill reports became part of the core set of reports the district provided to these high schools.
The high school reform initiative also involved using data-warehousing technology to follow the progress of three cohorts of students from grade 9 to grade 11 on one particular subskill of the state assessment. All three cohorts showed some increases in the percentage of students achieving proficiency in this subskill and significant decreases in the percentage of students scoring at the lowest level of the assessment. Although these positive results can't be directly attributed to teachers' use of assessment data and targeted reading interventions, this focus most likely contributed to improving student performance.
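Cohort analyses like this one reduce to a simple tabulation once each student's performance level is recorded by cohort and grade. The sketch below, again with invented records and level labels, computes the percentage of each cohort at each performance level from grade 9 through grade 11.

```python
import pandas as pd

# Hypothetical longitudinal records: one row per student per year,
# with the performance level earned on the state subskill.
records = pd.DataFrame({
    "cohort": ["2004"] * 6 + ["2005"] * 6,
    "grade":  [9, 9, 10, 10, 11, 11] * 2,
    "level":  ["Below Basic", "Proficient", "Basic", "Proficient",
               "Proficient", "Proficient",
               "Below Basic", "Basic", "Basic", "Proficient",
               "Proficient", "Below Basic"],
})

# Percentage of each cohort at each performance level, by grade,
# so a team can watch proficiency rise and the lowest level shrink.
pct = (records.groupby(["cohort", "grade"])["level"]
       .value_counts(normalize=True)
       .mul(100)
       .round(1)
       .unstack(fill_value=0))
print(pct)
```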

Data Capacity

It's not enough to have high-quality data. Effective data use will not occur unless schools also address data capacity. Building data capacity means establishing data teams, designating data coaches, creating structured time in the school calendar for collaborative analysis, building staff skills in data analysis and assessment literacy, and displaying data in formats that facilitate inquiry and analysis (Boudett & Steele, 2007; Lachat & Smith, 2004; Love, Stiles, Mundry, & DiRanna, 2008).
It's important to ensure the broad representation that makes collaborative analysis meaningful. At the district level, representation should include leadership in curriculum and instruction, the elementary and secondary levels, special programs, student personnel services, research and assessment, and student information services. School teams should include the principal and other instructional leaders, personnel from the guidance department, and grade-level or subject-area teacher representatives. Schedule time for collaborative analysis at key data points, such as when pertinent assessment or quarterly data on attendance and course grade patterns become available. This enables schools to define an annual schedule of when data teams will do their analysis and improvement planning. Using a data coach as a facilitator is an effective strategy for providing an embedded form of professional development to enhance the data teams' skills and assessment literacy.

What It Looks Like in Practice

Many of the schools and districts we have worked with want to go beyond trend data in their analysis of student progress. One urban district in the northeastern United States specifically focused on building the skills of both district and school data teams to analyze multiple types of assessment data. The district used data-warehousing technology to disaggregate longitudinal data that addressed the teams' questions about the performance of different student subgroups. Their analysis of the data led the teams to define questions about the progress of cohorts of students as they moved from one grade to the next. The teams particularly wanted to know whether early elementary students were improving their proficiency levels in vocabulary, as measured by the reading assessment that all the elementary schools used. Their questions included the following:
  • Are our students making sufficient grade-to-grade progress in vocabulary development?
  • How many of our lowest-performing students on last year's vocabulary assessment improved their proficiency level on this year's assessment?
  • What are the characteristics of students who made progress and of those who did not?
  • What percentage of students at grade level on a previous year's assessment declined in their performance?
  • Do the data indicate that teachers need more training to improve students' vocabulary skills?
Participants discussed how to visually represent the data in displays that would facilitate analysis. We have found that graphic displays created to address one or more essential questions are the most useful. The display shown in Figure 1, for example, enables viewers to examine the grade 3 performance in vocabulary of students who scored at each of three performance levels in that skill in grade 2.

Figure 1. Are Our Students Making Sufficient Grade-to-Grade Progress?

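A display like Figure 1 is essentially a transition table between two years' performance levels. The sketch below builds one with a pandas crosstab; the matched records and the three level labels are invented for illustration.

```python
import pandas as pd

# Hypothetical matched records: each student's vocabulary performance
# level in grade 2 and again in grade 3.
levels = ["Below Basic", "Basic", "Proficient"]
matched = pd.DataFrame({
    "grade2_level": ["Below Basic", "Below Basic", "Basic", "Basic",
                     "Proficient", "Proficient", "Basic", "Below Basic"],
    "grade3_level": ["Basic", "Below Basic", "Proficient", "Basic",
                     "Proficient", "Proficient", "Proficient", "Basic"],
})

# For students at each grade 2 level, where did they land in grade 3?
transition = pd.crosstab(
    matched["grade2_level"], matched["grade3_level"], normalize="index"
).mul(100).round(1).reindex(index=levels, columns=levels, fill_value=0)
print(transition)  # rows: grade 2 level; columns: grade 3 level (%)
```

Row-normalizing the table answers the essential question directly: of the students who scored at a given level in grade 2, what percentage reached each level in grade 3?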
Data capacity was crucial to this district's process of inquiry. The district leadership established and trained district- and school-level data teams, allotted time to engage in collaborative analysis, and made available meaningful data displays driven by essential questions. Ultimately, educators in this district were able to go beyond the superficial (and often inaccurate) conclusions of trend analysis and identify the specific effects of current programming. This enabled the district to target areas in which additional resources were needed to improve instruction. Responses included having reading coaches focus their work with teachers on modeling intensive reading interventions for low-performing students, providing more direct support in teachers' classrooms for struggling students, and establishing more consistent monitoring procedures to determine the success of interventions.

Data Culture

Achieving purposeful and sustained data use necessitates a culture shift. It requires paying deliberate attention to issues of leadership, policy, accountability, shared beliefs, and collaboration (Boudett & Steele, 2007; Firestone & Gonzalez, 2007). This entails establishing and providing direction to data teams, modeling effective data use, scheduling time for collaborative data-driven conversations, and connecting data analysis to clear action steps. Holcomb (1999) wrote compellingly about the importance of mobilizing broad stakeholder involvement and getting people excited about data use. She referred to this as focusing "people, passion, and proof" on strategically aligning all elements of a school to analyze what is and isn't working to improve student learning.

What It Looks Like in Practice

In one district, our work on a literacy improvement initiative began with questions such as these:
  • How do the reading levels of our students compare with those of students across the state?
  • How many of our middle and high school students read below grade level?
  • Will improving students' reading skills positively affect their performance in core courses and on state and local assessments?
  • What are we doing to support accelerated growth in reading for students below grade level?
Our work in this district engaged administrators, teachers, and reading specialists in a data-collection process built on the belief that improving student literacy was everyone's responsibility and that addressing the issue required a commitment to using data for improvement. The data-collection process, which occurred over the course of a few months, brought together student performance data, teacher survey data, and data on school capacity to support literacy. A school-based literacy team facilitated both the data-collection and the data-reporting/data-use processes. When juxtaposed with information about school capacity, the data showed that neither the middle school nor the high school had an effective way to address the needs of the sizeable number of students who read below grade level.
For example, teachers said that many students were unable to analyze what they read, did not like to write, responded to questions with incomplete answers, and had difficulty learning vocabulary. The majority of teachers reported that they did not use several instructional strategies that might address these issues, such as those relating to student choice, student inquiry, the use of technology and varied texts, and student discussion of text materials and what they have learned.
Recommendations included having teachers learn some common instructional strategies targeted to vocabulary development; motivating students to read and write in the content-area classroom through the use of collaborative routines, such as reciprocal teaching and paired reading and summarizing; and teaching students how to think critically when reading and writing. We also recommended that teachers learn a common protocol for looking at student work and that the middle and high school each form a literacy team to support implementation of the literacy improvement initiative.
Other issues emerged during the data-collection process. For example, teachers at the middle school level used three different reading assessments and lacked common protocols for testing. This made it difficult to track student progress. The high school had no system in place to determine the reading proficiency of incoming 9th graders. Despite the clear need, no interventions were available for struggling readers in grades 6–8, and few teachers or administrators in the middle and high schools regularly used existing data about student performance for placement, instructional decision making, or progress monitoring.
In response to our recommendations, the district took several steps that deepened data use at both the district and school levels. The district researched and selected a reading assessment for grades 6–10 that provided subskill reading performance data, including information about vocabulary and nonfiction reading comprehension. This enabled the district to monitor the progress of student cohorts, including those enrolled in intervention classes. At the middle school level, teachers learned protocols for looking at student work and met with students to set reading progress goals. School leaders communicated the expectation that teachers would use the new reading assessment data to determine what types of literacy support needed to occur in content-area classes.
The data-based recommendations led to targeted professional development for the faculty. In the first year, professional development focused on vocabulary-development strategies, instructional strategies to promote engagement and critical thinking in reading and writing, and a common set of instructional strategies to improve reading comprehension. Teachers in some departments began an in-depth look at the literacy demands of their content areas and started to develop common agreements about what they expected students to be able to do.
In the second year, professional development focused on how teachers might promote reading and writing for authentic reasons within and across content areas. A group of teachers began to engage in peer coaching, collecting data on one another's practice and sharing evidence about how specific strategies supported improved teaching and learning. The principal and vice principal conducted literacy walkthroughs to determine the effectiveness of professional development, and teachers received feedback on what was happening in classrooms schoolwide. According to staff, this combination of approaches contributed to gains in student reading achievement at the 6th, 7th, and 8th grade levels for the following two years.
Data use is now an integral part of the culture in this district, and teachers and administrators can access the data online. Data use drives decisions about professional development, interventions, and resource allocation; it focuses discussions about teaching and learning, guides instruction, and supports progress monitoring. Most important, teachers and administrators share a belief in its value.

The Data Difference

Schools and districts of all sizes can use the essential-questions approach to become data-driven decision makers focused on improving student learning and achievement. Properly used, data can make a difference in meeting the needs of every student and can be a powerful ally in stimulating positive change and improvement from the central office to the classroom.
References

Boudett, K. P., & Steele, J. L. (Eds.). (2007). Data wise in action. Cambridge, MA: Harvard Education Press.

Firestone, W. A., & Gonzalez, R. A. (2007). Culture and processes affecting data use in school districts. In P. A. Moss (Ed.), Evidence and decision making: The 106th yearbook of the National Society for the Study of Education, Part I (pp. 132–154). Malden, MA: Blackwell.

Holcomb, E. L. (1999). Getting excited about data: How to combine people, passion, and proof. Thousand Oaks, CA: Corwin Press.

Johnson, R. (2002). Using data to close the achievement gap: How to measure equity in our schools (1st ed.). Thousand Oaks, CA: Corwin Press.

Lachat, M., & Smith, S. (2004). Data use in urban high schools. Providence, RI: Education Alliance at Brown University.

Lachat, M., & Williams, M. (2003). Putting student performance data at the center of school reform. In J. DiMartino, J. Clark, & D. Wolk (Eds.), Personalized learning (pp. 210–228). Lanham, MD: Scarecrow Press.

Love, N., Stiles, K. E., Mundry, S., & DiRanna, K. (2008). A data coach's guide to improving learning for all students: Unleashing the power of collaborative inquiry. Thousand Oaks, CA: Corwin Press.

Mandinach, E. B., & Honey, M. (Eds.). (2008). Data-driven school improvement: Linking data and learning. New York: Teachers College Press.

Mieles, T., & Foley, E. (2005). From data to decisions: Lessons from school districts using data warehousing. Providence, RI: Annenberg Institute for School Reform at Brown University.

Rudner, L. M., & Boston, C. (2003). Data warehousing: Beyond disaggregation. Educational Leadership, 60(5), 62–65.

Wayman, J. C., Cho, V., & Johnston, M. T. (2007). The data-informed district: A district-wide evaluation of data use in the Natrona County School District. Austin: University of Texas.

Wayman, J. C., Stringfield, S., & Yakimowski, M. (2004). Software enabling school improvement through analysis of student data (Report No. 67). Baltimore: Johns Hopkins University, Center for Research on the Education of Students Placed at Risk.

Julie Meltzer is the director of literacy research and development at the Public Consulting Group's Center for Resource Management, Inc. (CRM), in Portsmouth, New Hampshire. Dr. Meltzer is the codeveloper of CRM's School-Wide Program for Improving Reading and Learning (SPIRAL), which supports middle and high school educators in systemically improving adolescent literacy. As director of the Adolescent Literacy Project at the LAB at Brown University, Dr. Meltzer developed the Adolescent Literacy Support Framework and authored Adolescent Literacy Resources: Linking Research and Practice (2002) and other research-based resources for professional development and technical assistance. A sought-after keynote speaker, author, reviewer, conference presenter, and workshop leader, Dr. Meltzer consistently seeks to help educators build their capacity to apply promising research-based practices that support the literacy development and learning needs of adolescents. A key focus of her current work is helping school leaders understand the roles, responsibilities, and actions associated with academic literacy development at the middle and high school levels. Dr. Meltzer can be reached at jmeltzer@pcgus.com.
