May 1, 2021 • Vol. 78, No. 8

Generating Better Evidence on Ed Tech

When making decisions about education technology, educators shouldn't have to fly blind.

Leadership
Educators face a "paradox of choice" when it comes to selecting the best education technology tools for their students. In a ballooning market with approximately 8,000 ed tech tools in use and 3,250 ed tech companies (Censuswide, 2020; LearnPlatform, 2020), there are too many choices and too few ways to access useful information about what works where.
When it comes to ed tech, profound obstacles hobble busy educators trying to do right by their students. High-quality efficacy research on technology tools is scarce. Educators want advice from peers, but struggle to find guidance relevant to their specific contexts. Administrators find themselves taking a hit-or-miss approach to technology procurement (Morrison et al., 2014). So how can we change this status quo of flying blind when choosing education technologies? How can the education sector build a better bridge between research and practice, so knowledge about what works where can be documented and spread?

The Goal: Evidence-Informed Practice

Parameters set by the federal Every Student Succeeds Act for choosing instructional practices—including choosing technology tools—require practices to be evidence-based; they must show a statistically significant effect on improving outcomes, backed by strong, moderate, or promising evidence. But since so few education technologies have studies demonstrating evidence of their impact on student outcomes, almost all instructional tools are selected on the basis of educators' perceptions that the technology is needed and effective. This isn't to say that educators' expertise isn't important. Similar to evidence-based practice, evidence-informed practice calls for combining research evidence with educators' professional expertise, elevating the professional wisdom and rich contextual understanding educators develop through experience in the classroom (Nelson & Campbell, 2017).
Educators' wisdom is an incredible resource, but that wisdom depends on their having access to useful research evidence they can combine with their professional expertise to select and use ed tech tools effectively. Educators don't know what they don't know. We wouldn't expect an electric car buyer to make the wisest choice if they've only learned about one tiny slice of this expanding market. In reality, overtaxed administrators may make relatively uninformed, rapid ed tech decisions, trying to "keep up" with the pace of innovation (Webster, 2017). They often turn to teachers for information about what works (Gallup, 2019). Yet most teachers have only a narrow vantage on the ed tech market elephant.

The Ed Tech Research Gap

Our organization, the EdTech Evidence Exchange, wanted to go deeper in understanding educators' decision making so we could ultimately infuse more relevant evidence into the process. The Exchange is a nonprofit working closely with the University of Virginia School of Education and Human Development to help educators access the information they need to make better, more informed education technology decisions. In 2018, we brought educators together at two field-based convenings to discuss their beliefs about, use of, and need for research. Prior to the events, we didn't tell the educators we'd be specifically discussing research.
On a pre-event survey, we asked the 158 educators scheduled to attend: How do you make decisions about implementation of a new technology? We also asked: How do you make decisions about changes to your instructional practice or the instructional practice of those you coach/supervise? We coded each educator's response for the extent to which they used (1) scientific research evidence and (2) experiential professional expertise.
Only four percent of these educators indicated they used evidence-informed decision making—relying on research evidence and professional expertise to inform new technology implementations. Fifty-four percent said they use their professional expertise and knowledge of context, but only six percent used scientific research.
How does this compare to other decisions educators make about instructional practice? We were encouraged to find that 25 percent of participants we surveyed described using evidence-informed decision making for broader changes to their instructional practice; 27 percent said they use scientific research, and 89 percent use professional expertise.
In previous surveys, teachers and administrators have reported seeing value in research evidence (Barton, Tindle, & Fisher, 2020). But there may be social desirability bias at play here; appreciating research seems to be the "right" answer in an education policy context that calls for evidence-based practice. In many cases, aspiration likely does not match the reality of research utilization in schools.
As educators increase their reliance on technology, failure to consult research on what works becomes a more serious problem. Notably, educators report trusting their colleagues over research evidence when it comes to choosing ed tech tools (Gallup, 2019). There's likely a good reason for this: Educators can straightforwardly access their colleagues' opinions, while they typically can't access research about technologies. Solid research evidence that feels appropriate for their context is available for very few technologies.
Consider math programs. There's evidence that some technology tools hold promise for improving students' math achievement (Higgins, Huscroft-D'Angelo, & Crawford, 2019). As of spring 2021, high-quality efficacy or effectiveness evidence (i.e., impact evidence) is available for approximately 15 math tools across two reliable, publicly accessible evidence platforms: Evidence for ESSA and the What Works Clearinghouse. But considering the volume of ed tech tools listed in popular databases like the EdSurge Product Index or LearnPlatform, this means impact evidence is available for approximately one to three percent of the math ed tech tools in the marketplace. Educators are left without many clues about whether a technology is likely to support student achievement.
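The arithmetic behind that estimate can be made explicit. The short Python sketch below simply inverts the figures stated above; the market sizes it produces are rough implications of those approximations, not actual database counts.

```python
# Back-of-the-envelope arithmetic behind the "one to three percent" figure.
# The only input is the ~15 math tools with impact evidence noted above;
# the implied market sizes are rough, not actual database tallies.
tools_with_impact_evidence = 15

for share in (0.01, 0.03):  # 1% and 3% coverage of the marketplace
    implied_market = tools_with_impact_evidence / share
    print(f"{share:.0%} coverage implies roughly {implied_market:.0f} math tools on the market")

# Prints:
# 1% coverage implies roughly 1500 math tools on the market
# 3% coverage implies roughly 500 math tools on the market
```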

Three Problems—and a Tool

There are three interwoven problems here: a thin body of reliable impact evidence, limited evidence about what works in specific contexts, and limited distribution of the meager research that exists.
First, the total volume of reliable impact evidence is limited. Over half of products advertised as "research-based" lean on research that the company conducted on its own product and that may or may not be available for review (Hulleman et al., 2017). Unfortunately, the field can't produce independent, external research fast enough to keep up with the rapid release of new technologies, as illustrated by the number of math technologies without rigorous impact evidence. We have serious ground to cover.
Second, context matters. The available evidence was likely collected in highly controlled environments, limiting the influence of contextual features such as the devices available and the training provided. Unsurprisingly, educators are less likely to use research evidence when they don't perceive a connection to their context (Dagenais et al., 2012).
Third, educators can't easily find the evidence that is available. The highest-quality studies appear in the databases mentioned above, but other sources of moderate and promising evidence are less organized. And educators are busy! They lack the time to search through a wide variety of research sources, determine their reliability, and judge the applicability of that evidence to their own context and practice.
In response to these challenges, the Exchange launched the EdTech Evidence Exchange Platform in spring 2021 to crowdsource evidence about educators' education technology experiences, systematically aggregating and elevating their expertise. In developing this tool, which is currently in beta-testing, we are documenting educators' perceptions of what works in their contexts—including rich details about their experiences working with particular technologies—and coupling that with available evidence of each technology's impact on student achievement. As a result, decision makers will be able to access information about how ed tech tools have performed in contexts similar to their own.
The upshot, we hope, will be improved selection and implementation of ed tech tools—desperately needed, given how little measurable impact some technology shows on student learning and the incredible number of ed tech licenses wasted each year (Baker & Gowda, 2018).
The Exchange Platform positions educators as both the key producers and consumers of evidence, documenting the firsthand experiences of teachers and of leaders who select technologies for schools or districts. Participating educators will describe key features of their instructional context (like their infrastructure and operations) by responding to the EdTech Context Inventory, a survey developed by a coalition of researchers, practitioners, and experts participating in the EdTech Genome Project. (Launched in 2019, the EdTech Genome Project is a sector-wide initiative to identify, define, and measure the context variables most likely to be associated with ed tech implementation, so the education field has a common language for describing the individual and setting features of implementation contexts.) These data will be combined with demographic features to create context profiles.
Teachers and leaders will also be asked to respond to a second 30-minute survey detailing their experiences with the tools they've used, covering topics such as logistics, perceived success, and potential sources of bias. The Exchange Platform will use these data to automatically create user-friendly "implementation reports" that will eventually integrate with vetted research evidence of those technologies' impact. Based on the context profiles, the system will match educators to evidence from contexts similar to their own, so they can learn from the ed tech victories and failures of colleagues nationwide.
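To make the matching idea concrete, here is a minimal sketch in Python. Everything in it is hypothetical: the context variables (grade band, device access, locale), the similarity rule, and the data structures are illustrative assumptions, not the Exchange Platform's actual design or the EdTech Context Inventory's real variables.

```python
# Hypothetical sketch of context-profile matching: pairing an educator's
# context with implementation reports filed from similar contexts.
# All field names and the similarity rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class ContextProfile:
    grade_band: str        # e.g., "K-5", "6-8", "9-12"
    device_ratio: float    # devices per student, 0.0 to 1.0+
    locale: str            # e.g., "rural", "suburban", "urban"

@dataclass
class ImplementationReport:
    tool_name: str
    context: ContextProfile
    perceived_success: int  # e.g., 1 (poor) to 5 (strong)

def similarity(a: ContextProfile, b: ContextProfile) -> float:
    """Toy similarity score: shared categorical features plus
    closeness of device access, scaled to 0-1."""
    score = 1.0 if a.grade_band == b.grade_band else 0.0
    score += 1.0 if a.locale == b.locale else 0.0
    score += 1.0 - min(abs(a.device_ratio - b.device_ratio), 1.0)
    return score / 3.0

def match_reports(me: ContextProfile,
                  reports: list[ImplementationReport],
                  top_n: int = 3) -> list[ImplementationReport]:
    """Return the reports whose contexts most resemble `me`."""
    return sorted(reports, key=lambda r: similarity(me, r.context),
                  reverse=True)[:top_n]

# Example: a rural K-5 educator retrieves the closest-context reports.
my_context = ContextProfile("K-5", 0.8, "rural")
reports = [
    ImplementationReport("MathToolA", ContextProfile("K-5", 0.7, "rural"), 4),
    ImplementationReport("MathToolA", ContextProfile("9-12", 1.0, "urban"), 2),
    ImplementationReport("MathToolB", ContextProfile("K-5", 0.9, "suburban"), 5),
]
for r in match_reports(my_context, reports):
    print(r.tool_name, r.perceived_success)
```

A real system would weight many more context variables and integrate vetted impact research, but even this toy version shows the design choice at the platform's heart: evidence is filtered by resemblance of context, not just by tool.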
We also plan for the Exchange Platform to increase researchers' efficiency in studying products and producing evidence. The platform will highlight technologies that seem likely to support learning, based on educators' perceptions, but that haven't yet been rigorously studied, pointing researchers to promising areas for investigation. The timing is good for this: The U.S. Department of Education, through the Institute of Education Sciences, is expanding grants that might stimulate this type of time-sensitive, high-need research. Additionally, federal funding for rapid-cycle evaluations conducted by educators could grow the body of available evidence.
The burden of the ed tech evidence gap shouldn't fall only on educators' shoulders. It's the policy and research communities' responsibility to produce the evidence educators need—and to deliver it in usable ways. We believe the EdTech Evidence Exchange Platform offers a promising model for doing just this.
References

Baker, R. S., & Gowda, S. M. (2018). The 2018 technology & learning insights report: Towards understanding app effectiveness and cost. BrightBytes.

Barton, E. A., Tindle, K. P., & Fisher, C. (2020, Apr 17–21). Evidence-informed? Educator explanations of decision making [Roundtable Session]. AERA Annual Meeting, San Francisco, CA.

Censuswide. (2020). The edtech report: A guide to the current state of educational technology and where it is going.

Dagenais, C., Lysenko, L., Abrami, P. C., Bernard, R. M., Ramde, J., & Janosz, M. (2012). Use of research-based information by school practitioners and determinants of use: A review of empirical research. Evidence & Policy, 8(3), 285–310.

Gallup. (2019). Education technology use in schools: Student and educator perspectives.

Higgins, K., Huscroft-D'Angelo, J., & Crawford, L. (2019). Effects of technology in mathematics on achievement, motivation, and attitude: A meta-analysis. Journal of Educational Computing Research, 57(2), 283–319.

Hulleman, C. S., Burke, R. A., May, M., Charania, M., & Daniel, D. B. (2017). Merit or marketing? Evidence and quality of efficacy research in educational technology companies. White paper produced by Working Group D for the EdTech Academic Efficacy Symposium. Charlottesville, VA: University of Virginia.

LearnPlatform. (2020). Edtech engagement, access, and equity during the pandemic: Analysis of daily student edtech usage across 8,000 edtech tools used by 2.5 million students in 17 states.

Morrison, J. R., Ross, S. M., Corcoran, R. P., & Reid, A. J. (2014). Fostering market efficiency in K-12 ed-tech procurement: A report from Johns Hopkins University to Digital Promise in partnership with the Education Industry Association. Center for Research and Reform in Education (CRRE), Johns Hopkins University.

Nelson, J., & Campbell, C. (2017). Evidence-informed practice in education: Meanings and applications. Educational Research, 59(2), 127–135.

Webster, M. D. (2017). Philosophy of technology assumptions in educational technology leadership. Educational Technology & Society, 20(1), 25–36.

End Notes

1 For more information on the sample and analytic procedures, see Evidence-Informed? Educator Explanations of Decision Making (Barton, Tindle, & Fisher, 2020).

2 Educators' personally identifiable information will be protected at every step. The Exchange Platform is currently in beta-testing.

Emily Barton is a research assistant professor at the University of Virginia School of Education and Human Development and the director of implementation research at the EdTech Evidence Exchange.
