May 1, 2021 | Vol. 78, No. 8

So You Want to Get Serious About Evidence-Based Practice?

Three factors make using research to inform school or classroom practice trickier than it looks.

Credit: Ammentorp Photography / Alamy Stock Photo
Everyone says they want more "evidence-based" practice in schools. Indeed, you hardly ever hear anyone arguing for "evidence-free" practice. Yet there's also a widespread sense that research too rarely makes its way into schools and classrooms. Why is that?
This is a tension I've wrestled with for decades, particularly as I compile my yearly RHSU Edu-Scholar Public Influence Rankings, which highlight the education researchers who have the most influence on K–12 education policy and practice. Let me unpack some of the forces that keep research evidence from being widely used in schools, then offer some thoughts on what school and system leaders might do about it.

What's Preventing a Research-Practice Connection?

The lack of evidence-based practice is often blamed on educators who fail to properly study the evidence, or on silver-tongued vendors who confuse and confound district leaders. These may each play a part. But I believe three often-ignored factors play a more crucial role: (1) the fact that "evidence" isn't always all it's cracked up to be, (2) choices about what research gets pursued, and (3) a lack of clarity on what it means to "follow" research.

1. "Evidence" Isn't Foolproof

When we think about "evidence-based" practice, the evidence in question is often less definitive than we might imagine. Even medical researchers, with their deep pockets, fancy lab equipment, and focus on straightforward physical phenomena, change their minds with distressing regularity on things like the risks of cholesterol or the effects of alcohol.
Twentieth-century education researchers once reported that head size was a good measure of intelligence, that girls were incapable of doing advanced math, and that developmental retardation was rampant among certain ethnic groups. Now, you're probably thinking, "That's so wrong, it couldn't have been 'real' research!" Well, it was conducted by university professors, published in scholarly journals, and discussed in textbooks. Other than the fact that the findings now seem wacky and offensive, that sounds like real research to me.
More generally, the fact that a study concludes something doesn't mean the conclusion is necessarily "true." This is why research shouldn't be followed uncritically. Consider: In 2015, an attempt to replicate 97 psychology studies with statistically significant results found that more than one-third couldn't be duplicated. That is, when researchers ran those studies again, the original results failed to reappear. Heck, a survey a few years back found that 90 percent of psychology researchers confessed to at least one behavior that might have compromised some of their research, such as stopping data collection early because they liked the results or failing to disclose all of a study's conditions. The bottom line: Not all "evidence" is reliable. Determining what to trust requires judgment, acumen, and patience.

2. Ed Research Tends to Focus on Broad Trends

Which research gets undertaken is often dictated less by educators than by policymakers and funders, who can provide resources, platforms, and professional opportunities for researchers and academics that practitioners can't match. Policymakers and funders want to know which policies and programs work; they have less use for prescriptions regarding classroom practice. And researchers are all too happy to use publicly funded state data that allow them to tackle broad policy questions—like whether preK or charter schooling boosts student achievement—and then garner grants and accolades from the convenience of their laptops.
On the other hand, designing field experiments to identify which elements of a preK program may matter for learners requires a lot of legwork: partnering with schools, gaining access to student data, and running the study in the field. And that work often yields less clear-cut results. Thus, there's a built-in temptation for systematic research to shortchange practice, which means much of the research focused on practice is more ethnographic or journalistic. Such research can be enormously informative, but it offers less clear prescriptions for practitioners.

3. The Complexity of "Following the Research"

It's not always clear what it means for educators to "follow" the research. Learning that something "works" and making it work successfully oneself can be two completely different things. Effectiveness often requires that the practice be implemented in a rather specific way. In health care, for instance, if a vaccine "worked" when patients got two 100-milligram doses 28 days apart, that's how doctors will seek to administer it. If that regimen isn't followed, the treatment may not work the way it's supposed to.
In education, there's the same need for educators to understand precisely how to implement a practice in order for it to work. Too often, "evidence-based" practice winds up—due to exigencies of staffing, scheduling, and sentiment—being only loosely modeled on the research. When research finds significant effects from a particular practice, researchers need to offer precise descriptions of what's involved—and educators must commit themselves to embracing the practice with fidelity.
Instead, much of the time, "following the evidence" is shorthand for trusting a researcher's judgment about how to mimic some complex strategy that reportedly "worked" in a handful of schools. Even if one trusts that judgment, the educators who participated in the research inevitably carried out a number of subtle, delicate practices carefully and well. Anyone hoping for similar results will need to do that same loosely defined array of things carefully and well, too. But even well-intentioned guidance rarely pins down the full set of essential actions or specifies precisely how to make them work in this school or that system.

What's a School Leader to Do?

So, what can school and district leaders do to give the term "evidence-based" more teeth in schools? At least three things.
First, bang the drum for more emphasis on practice-oriented research. Leadership organizations and associations should urge foundations, federal research officials, and higher education to prioritize rigorous, systematic explorations of instructional practices and school-level interventions—with the research done in schools. Given the burdens of such work, school and district leaders should push those associations and their state agencies to redouble their efforts to honor and support such projects.
Second, demand that published research focus more intently on the particulars of what "works." Conferences and publications should insist that researchers talk in detail about exactly what programs and practices entail. Right now, even potentially useful studies are often imprecise about what programs require, how they're structured, or how staff are trained. Until researchers explain how things work in far greater detail, their findings will be functionally useless for practitioners.
Finally, set forth new expectations governing discussions of evidence-based practice. Make it clear to staff, product vendors, funders, and reformers at the local, state, or federal level who advocate for a particular practice that any claims about a program or practice must be backed not by vague references to "the research," but by specific discussion of the relevant studies, the outcomes those studies have documented, and what the program or practice requires. Enforcing such a norm will push those making casual claims about "best" practices to invest more time and energy in the research—or to cease pretending that their "preferred" practices are more scientific than they are.
I can't promise these three steps will solve the research-practice disconnect. But I do believe they'll lead us to better conversations around evidence in education.
End Notes

1 Carey, B. (2015, August 27). Many psychology findings not as strong as claimed, study says. The New York Times.

2 John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5).

Rick Hess is a resident scholar and the director of education policy studies at the American Enterprise Institute (AEI), where he works on K–12 and higher education issues. He also founded and chairs AEI's Conservative Education Reform Network.

Hess's research and writings are found in many scholarly and popular periodicals, including Harvard Educational Review, Forbes, The Hill, Teachers College Record, Phi Delta Kappan, Education Week, Washington Post, and U.S. News and World Report. He also writes Education Week's blog "Rick Hess Straight Up" and serves as an executive editor of Education Next. Hess taught education and public policy at Harvard, Georgetown, and Rice Universities and at the universities of Pennsylvania and Virginia.

From our issue: From Research to Practice