There's a flood of information available about student performance, and in the School District of Indian River County we were no exception. Our data included results from state tests, national tests, and district benchmark assessments. In the past, when we saw a student or a group of students struggling, we'd turn to those results to try to figure out why. But this data only told us what we already knew: students were struggling. We needed insight into the cause of their struggles.
We realized we needed to expand our view. In 2014, we launched an initiative to use early warning indicators (EWIs) to determine what other factors might affect a student's academic performance. When we paired academic data with student engagement data, answers emerged. We began to see not only why some students faltered but also how we could help.
Here are the five strategies we used to launch our Early Warning Indicators initiative, which ultimately helped us better collect, organize, examine, and translate student data into meaningful action.
1. Form a committee of diverse stakeholders to select EWIs.
In every school district, a variety of stakeholders examine student data, each from a unique perspective. To ensure these diverse perspectives are represented, assemble a committee of stakeholders from across the district. Include teachers, school leaders, student support specialists, guidance counselors, school psychologists, social workers, attendance officers, district administrators, and others. You'll be surprised at the collective expertise assembled and pleased with the range of perspectives on student achievement offered by this diverse group.
2. Collaborate to identify and refine EWIs.
At the committee's first meeting, ask participants to work together to identify which EWIs would be most useful in catching students before they slip academically. In subsequent meetings, refine those indicators by drawing upon each committee member's expertise. This will ensure the EWIs are meaningful to each stakeholder yet broad enough for districtwide use.
3. Make it easy to manage EWI data.
For an early warning initiative to work, each stakeholder must be able to easily access the information they need day-to-day. When our committee chose our EWIs—attendance, discipline, mobility, retentions, course failures, and academic measures—we realized that the data for these indicators was housed in four different systems. After looking at our data systems and how other districts were managing EWIs, we decided to go with the Early Warning System application from Performance Matters, the makers of our assessment and data management system. Using this customizable reporting and filtering module, we were able to set our own values and rules for student attendance, discipline, and academic measures. Now, we can track all of our EWIs in one place while pairing this information with our academic data, which makes it easier to analyze the data and act upon it in a timely way.
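To make the idea of "values and rules" concrete, here is a minimal sketch in Python of how threshold-based EWI flags might be computed once the data lives in one place. The field names and cutoffs (for example, a 90 percent attendance floor) are hypothetical illustrations, not the actual rules or interface of the Performance Matters Early Warning System.

```python
# Hypothetical sketch: flag students against district-defined EWI thresholds.
# Field names and cutoff values are illustrative, not the vendor's actual rules.

EWI_RULES = {
    "attendance": lambda s: s["attendance_rate"] < 0.90,    # chronic absence
    "discipline": lambda s: s["referrals"] >= 3,             # repeated referrals
    "course_failures": lambda s: s["failed_courses"] >= 1,   # any failed course
    "retentions": lambda s: s["retentions"] >= 2,            # two or more retentions
}

def flag_student(student):
    """Return the list of early warning indicators a student triggers."""
    return [name for name, rule in EWI_RULES.items() if rule(student)]

# Example: one consolidated student record drawn from several source systems.
student = {
    "attendance_rate": 0.86,
    "referrals": 1,
    "failed_courses": 2,
    "retentions": 0,
}
print(flag_student(student))  # ['attendance', 'course_failures']
```

Keeping the rules in one configurable table like this mirrors the appeal of a single system: when the committee refines an indicator, only the threshold changes, not the reporting process.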
4. Conduct beta testing.
Before pushing the initiative districtwide, it is essential to conduct beta testing to identify potential issues and correct them before they become full-blown problems. When we tested our initiative in spring 2014, we asked committee members to pair data from academic measures (e.g., student performance on state tests) with data for attendance, discipline, mobility, retentions, and course failures. Through this process, we discovered that some of our indicators needed further refining. For example, we found that two or more retentions in a student's academic history proved a more accurate performance indicator than only one retention.
We also discovered issues with the way we entered data into our management system. For example, when a discipline referral is created, it may include several different observed behaviors. Because of the way we'd set up our system, each of those behaviors showed up as a separate incident, which made the student's behavior look worse than it really was. To fix this, we worked with our system provider to create a new filter that allowed us to assign multiple behaviors within one referral.
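As a rough illustration of that data-entry issue, the sketch below contrasts counting every observed behavior as its own incident with counting distinct referrals. The field names are hypothetical and are not the vendor's actual data model.

```python
# Hypothetical illustration of the referral-counting issue.
# Each row is one observed behavior; 'referral_id' ties behaviors to a single referral.
rows = [
    {"student": "A", "referral_id": 101, "behavior": "disruption"},
    {"student": "A", "referral_id": 101, "behavior": "defiance"},
    {"student": "A", "referral_id": 102, "behavior": "tardiness"},
]

# Original setup: every behavior counted as a separate incident.
incidents_by_behavior = len(rows)  # 3

# Corrected view: count distinct referrals instead of individual behaviors.
incidents_by_referral = len({row["referral_id"] for row in rows})  # 2

print(incidents_by_behavior, incidents_by_referral)  # 3 2
```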
5. Use actual school data to make trainings more meaningful.
One way to help school leaders quickly see the value of EWIs is to give them their own data for analysis. In summer 2014, we introduced our initiative to a team of instructional leaders from each school. These teams included principals, assistant principals, teachers, and guidance counselors, among others. We used the Early Warning System to provide each school team with its own set of data and gave them a protocol for reviewing the data.
For the first time, they were able to see students' academic data from our state assessments side-by-side with engagement data such as attendance and behavior. When the teams looked at students whose scores placed them in the bottom two levels on our state math and reading tests, many saw that a significant percentage of those students had chronic attendance issues. In that instant, they understood that to improve students' academic performance, they first had to get them to school.
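A simple way to reproduce that side-by-side view is to join assessment results with attendance and look at the share of low-scoring students who are chronically absent. The sketch below uses made-up data, hypothetical achievement levels, and an assumed 90 percent attendance cutoff; the actual cut points come from the state assessment and district policy.

```python
import pandas as pd

# Hypothetical data: state test achievement levels (1-5) and attendance rates.
scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "math_level": [1, 2, 4, 2, 5],
})
attendance = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5],
    "attendance_rate": [0.82, 0.95, 0.97, 0.88, 0.99],
})

# Pair academic data with engagement data on a shared student identifier.
paired = scores.merge(attendance, on="student_id")

# Bottom two achievement levels, cross-referenced with chronic absence (<90%).
low_performers = paired[paired["math_level"] <= 2]
chronically_absent = low_performers["attendance_rate"] < 0.90

share = chronically_absent.mean()  # fraction of low performers with attendance issues
print(f"{share:.0%} of low-scoring students are chronically absent")  # 67%
```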
Next, we launched a series of trainings for staff members at each school. Instead of trying to "teach" them how to use the data management system, however, we simply showed them student data from each EWI and compared the data from different EWIs. As soon as they saw the relationship between academic data and student engagement data, they grasped the value of our initiative. They were then ready to learn the details of how to use the data management system.
Thanks to our early warning initiative, we can now use our data to find connections we couldn't see before. As a result, we're better able to connect students to the support they need to stay engaged and improve their performance.