November 1, 2015 | Vol. 73, No. 3

How Report Design Makes or Breaks Data Use

The design of data reports makes the difference between insight and irritation as teachers review data. See how your school's data system stacks up.

Why did your district purchase a computerized data system? Your colleagues likely hoped the data system would be easy to use, capable of handling large quantities of longitudinal data, and budget conscious. But I'd guess the main thing they hoped to get out of the data system was—feedback. A data system's main purpose is to provide feedback to help you help students. That's the whole reason you're putting all that information into the system, training staff to use it, and discussing data in your professional learning communities—to make decisions that impact students (VanWinkle, Vezzu, & Zapata-Rivera, 2011).
Data reports are a data system's main means of communicating feedback. Programmers make such reports available in the systems, but often educators and report design experts are left out of the process of designing how reports will look. That's a problem because education and report design aren't simple matters.
The design of data reports and the analysis tools that accompany those reports can make or break data use in a school district, affecting students in the process (Johnson & Rankin, 2014). This is because the design of a student data report can make or break its users' ability to understand the data, draw accurate conclusions, and respond appropriately. And teachers do need help interpreting data effectively. A national study by the U.S. Department of Education Office of Planning, Evaluation, and Policy Development (2011) of districts known for strong data use found teachers have difficulty with question posing, data comprehension, and data interpretation. For example, teachers in these districts correctly interpreted only 48 percent of given data. Other research upholds the trend: Educators use data to inform decisions, but they don't always use it correctly.
You can support your staff's data use with professional development and by providing supports like data coaches or leadership for data teams. But those supports will go further if your data system generates reports that conform to research-based recommendations.

Evaluate Your Data System's Fluency

When an educator looks at a report from your school or district's data system, does the report deliver information clearly and meaningfully? Looking at four key areas will give you a sense of ways your system might improve: (1) credibility; (2) the presence of key features that make the data easier to see and understand; (3) helpful design; and (4) analysis supports.
As you read, gather several different reports from your data system and keep a tally or checklist of whether these reports contain each of the features or follow each of the recommendations bulleted here. One article, of course, can't capture all design and data considerations that should go into each report, but these elements cover many basics.

Credibility

Some experts argue that credibility is data reporting's first priority. If a system user has reason to distrust the data (for instance, if a data item reports, "285 percent of female students are proficient"), he or she won't use the data. However, credibility is based on more than just accurate numbers. Reports should not contain
  • Inappropriate displays. We want data to be simple, but sometimes the simplest displays aren't accurate. For example, for many assessments, you can't subtract one year's test score from the next year's score and call it growth, because the tests vary in difficulty from one year to the next (the sketch following this list illustrates the problem). If your data system ignores recommended approaches, it risks its credibility. Worse, it may mislead your staff.
  • Sloppy displays. If a data system vendor can't take five minutes to clean up an accidental change in font size or typeface, cut-off text, or overlooked cell formatting, how can staff trust the accuracy of the data displayed? Sloppiness renders the data unworthy of trust.
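To make the first bullet concrete, here is a minimal sketch, in Python and with made-up numbers, of why raw year-to-year subtraction can mislead when the two tests differ in difficulty. The comparison to each year's cohort average is only an illustration, not a substitute for your assessment vendor's recommended growth model.

# A minimal sketch with hypothetical scores: raw subtraction suggests decline
# even though the student gained ground relative to peers on a harder test.

year1 = {"student_score": 72, "cohort_average": 70}   # easier test
year2 = {"student_score": 68, "cohort_average": 60}   # harder test

raw_change = year2["student_score"] - year1["student_score"]
print(f"Raw year-over-year change: {raw_change}")      # -4 looks like decline

# One illustrative alternative: compare each score to that year's cohort
# average before looking at change (real systems use vendor growth models).
relative_y1 = year1["student_score"] - year1["cohort_average"]   # +2
relative_y2 = year2["student_score"] - year2["cohort_average"]   # +8
print(f"Change relative to each year's average: {relative_y2 - relative_y1:+d}")  # +6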

Key Features for Easy Interpretation

When you view a report, it should give you answers to your questions as quickly and easily as possible through these features.
  • Summaries or averages for comparison. Most data have no meaning without a point of comparison. For example, if I'm looking at a list of 20 teachers that shows how their students, on average, performed on a test, and I find my own row of data on that list, how do I instantly know how my students performed in relation to my colleagues' students? The table should end with an average score for the whole group, to which I can compare my row's numbers. Typically, a student list should feature an average for the whole class, a site list should feature an average for the district, and so on. Comparisons give data added meaning. An educator shouldn't have to run a separate report to make a likely comparison.
  • Juxtaposition of subgroups' performance. If I disaggregate a report by the English learner subgroup and find my English learners averaged 76 percent correct on a test, how can I see how they performed in relation to students fluent in English? The 76 percent is meaningless without other subgroups' performance. While your data system should allow you to disaggregate its reports, it should also offer reports comparing multiple subgroups. You shouldn't have to run separate reports to make subgroup comparisons.
  • A way to track gaps. "Of course, my [insert name of any subgroup] students are behind my other students. They're always that way." This is a statement teachers viewing data often make, and it shows that the data presentation isn't shedding light on an important question. A group of students might have entered a classroom or grade level with achievement behind that of the rest of the class, but is that gap narrowing or widening? The whole point of No Child Left Behind was to bring struggling subgroups up to speed with other students. Displaying subgroup gaps on a series of assessments over time (in addition to tracking scores) gives all of a school's educators ownership over each group's performance.
  • All vital data. We all want data to be simple, but sometimes they aren't. Research has revealed that the report format users prefer is sometimes the opposite of the report format they can most accurately interpret (Hattie, 2010). For example, some domain scores have no meaning without state or national data that a viewer can use to determine domain strengths and weaknesses through comparison. Reports without the added data look easier to use, but they're useless (and misleading) without the data from the broader group. Look at the simpler reports in your data system and ask: Is any category of data missing that's needed for meaningful interpretation?
  • Graphics. Have you ever heard someone say, "I'm not a visual learner"? The questionable merit of learning styles aside, a mountain of research supports the use of graphics in reports. Using a graphic for everything defeats the purpose, but include charts and graphs for key comparisons.
  • Completed calculations. You shouldn't have to perform mental arithmetic when analyzing data. For example, if students' proficiency levels are broken down to show that 35 percent of students scored Proficient and 24 percent scored Advanced, the sum of these two percentages, showing what percentage of students met proficiency standards (59 percent), should also be displayed on the report. If you're looking at performance over multiple years, you should see the year-over-year growth calculated in its own row or column (see the sketch following this list). Research shows that missing sums and averages can discourage educators from using a report or open their analyses up to error. Even at school districts considered exemplars of data use, data analyses involving multiple calculations were among teachers' greatest and most common struggles (U.S. Department of Education Office of Planning, Evaluation, and Policy Development, 2011).
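As a rough illustration of the "completed calculations" point, here is a minimal Python sketch with hypothetical proficiency percentages. A report generated this way would already show the combined percentage meeting standards and the year-over-year change, so no reader has to do the arithmetic mentally. The field names and numbers are invented for the example.

# A minimal sketch (hypothetical data): pre-compute the sums and differences
# a reader would otherwise have to work out in his or her head.

rows = [
    # year, percentage of students at each proficiency level
    {"year": 2014, "below": 20, "basic": 27, "proficient": 33, "advanced": 20},
    {"year": 2015, "below": 16, "basic": 25, "proficient": 35, "advanced": 24},
]

previous_met = None
for row in rows:
    met = row["proficient"] + row["advanced"]          # % meeting standards
    change = "n/a" if previous_met is None else f"{met - previous_met:+d}"
    print(f"{row['year']}: met standard = {met}%   change vs. prior year = {change}")
    previous_met = met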

Helpful Design

Many educators are already intimidated by data; most educators' postgraduate programs didn't include a course on data analysis. Designers should take these steps to make analysis less daunting.
  • Avoid keys/legends. Every time a chart asks a user to look away, find a key, decode a color or symbol's meaning, and then look back at the chart to apply that knowledge, you risk losing the user. Work that information into the chart's title or labels whenever possible.
  • Put the most important data in prime locations. The last column or row in a table is prime real estate, as it's easy to see and less cluttered than spaces sandwiched in the middle. Reports shouldn't culminate with less important data. Vital data (often averages or sums, to which other data can be compared) should stand out. Any culminating graphics or information should summarize important concepts, not trifles.
  • Let users scan without obstacles. Data providers put more than one number on a report because they expect readers to view and compare numerous figures. Design shouldn't get in the way of this. For instance, suppose you want a table to show both the percentage of students scoring at each of five performance levels and the raw number of students at each level. If both figures sit side by side in each of the five cells across a row, even with the raw numbers in parentheses [4% (10), 9% (23), 29% (73), and so on], the clutter makes it hard to scan the row quickly and compare percentages. It's much better to put the raw numbers in small type underneath each percentage within each cell (see the sketch following this list). This simple change lets the viewer's eye scan easily from one percentage to the next; the raw numbers no longer vie for primary attention.
  • Use color with purpose. Color can add meaning to reports without adding clutter. Color can distinguish sections of the report and communicate which scores are problematic (in red) and which scores are what teachers want to see (green).
  • Center data items. Instead of "hugging the lines," numbers and percentages should be centered in cells and all text should be vertically centered within cells. The added space makes data items easier to read and removes clutter.
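To show what the "scan without obstacles" change might look like in practice, here is a minimal Python sketch with hypothetical performance levels and counts. It prints a plain-text row with the percentages on one line and the raw counts on a quieter line beneath them, rather than crowding both into the same cell line.

# A minimal sketch (hypothetical counts): keep percentages on the line the eye
# scans, and drop the raw counts to a second line beneath them.

levels = [("Far Below", 4, 10), ("Below", 9, 23), ("Basic", 29, 73),
          ("Proficient", 35, 87), ("Advanced", 23, 57)]

header   = " | ".join(f"{name:>10}" for name, pct, n in levels)
percents = " | ".join(f"{str(pct) + '%':>10}" for name, pct, n in levels)
counts   = " | ".join(f"{'(' + str(n) + ')':>10}" for name, pct, n in levels)

print(header)    # performance-level labels
print(percents)  # the row a reader scans and compares
print(counts)    # raw numbers available, but out of the scan path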

Analysis Supports

If you bought over-the-counter medication, would the bottle be merely marked "cold" or "flu"? Never. Rather, the label would outline the content's purpose, ingredients, dosage instructions, and dangers. To do less would jeopardize the well-being of those the medicine treats.
Yet data systems often hand educators marginally labeled data with which to tend to students' well-being. Busy educators shouldn't have to read a test's 200-page post-test guide to use its data when a data system can easily provide crucial guidance within a report. Instead, they need the following:
  • Footers. For nearly every data source, there are analysis errors teachers commonly make. A footer at the bottom of a report page can alert educators to a common misstep and provide report-specific guidance. For instance, a footer might read, "Caution: Each grade level's version of this assessment differs in difficulty. To compare grade levels, compare how they performed in relation to the national average." In a quantitative study I conducted involving 211 educators of varied roles and backgrounds, footers like this in data reports improved educators' data analysis accuracy by 307–336 percent (Rankin, 2013). Do your reports have a brief footer or other text alerting users to possible missteps?
  • Reference sheet. A one-page reference sheet, ideally just a click away, can explain the purpose, use, intended audience, and content of a particular report. Such reference sheets improved educators' data analysis accuracy by 205–300 percent in my 2013 study.
  • Reference guides. A reference guide can act as an expanded reference sheet. It guides the user while he or she views the report, walking an educator through analysis steps, such as how to use a data display to answer particular questions. Reference guides that accompanied data reports quadrupled educators' data analysis accuracy in my study.
  • Help system. Your data system likely has a built-in help system. Does it only cover how to use the technology, or does it also help users analyze the data in a report?

Your Next Steps

How did your data system measure up when you evaluated how fluently its reports communicate data? If it was found wanting, I recommend you work with your data system provider or vendor to push for any changes needed in the way it communicates. Be wary if your vendor trivializes the importance of your concerns. You might even refer your data system provider to Over-the-Counter Data Standards, on a website I've developed that summarizes more than 300 studies and sources into best practices for displaying data for educators.
With so many analysis errors involving student data taking place, your staff—and students—deserve a data system that supports educators in every way possible as they strive to use data well.*
References

Hattie, J. (2010). Visibly learning from reports: The validity of score reports. Online Educational Research Journal. Retrieved from http://www.oerj.org/View?action=viewPaper&paper=6

Johnson, M., & Rankin, J. (2014, July 31). Empowering users to make data-informed decisions. Presentation at the U.S. Department of Education Institute of Education Sciences (IES) National Center for Education Statistics (NCES) STATS-DC Data Conference, Washington, DC.

Rankin, J. G. (2013). Over-the-counter data's impact on educators' data analysis accuracy. ProQuest Dissertations and Theses, 3575082. Retrieved from http://pqdtopen.proquest.com/doc/1459258514.html?FMT=ABS

U.S. Department of Education Office of Planning, Evaluation, and Policy Development. (2011). Teachers' ability to use data to inform instruction: Challenges and supports. Washington, DC: U.S. Department of Education. (ERIC Document Reproduction Service No. ED516494)

VanWinkle, W., Vezzu, M., & Zapata-Rivera, D. (2011). Question-based reports for policymakers (ETS Research Memorandum No. RM-11-16). Princeton, NJ: Educational Testing Service.

End Notes

* This checklist assumes that your system's reports cover basic functionality, such as tying students to their student identification numbers and charting progress over time.

Jenny Grant Rankin, PhD, teaches the Post Doc Masterclass at the University of Cambridge. Dr. Rankin, who resides in California most of the year, earned a PhD in Education, with a specialization in School Improvement Leadership. She is an award-winning former junior high school teacher who earned such honors as being named Teacher of the Year and having the American flag flown over the US Capitol building in honor of her dedication to her students. As the majority of her students were socio-economically disadvantaged English learners, she specialized in using data, differentiation, and creative instruction (e.g., gamification, project-based learning, global learning) to ensure that her exceptional students were being challenged and engaged even as they learned alongside struggling and grade-level learners. Dr. Rankin, a Mensan who grew up in GT/GATE/TAG, is the assistant coordinator of her county's Mensa Gifted Youth Program and the author of numerous books and journal articles.
