
May 1, 2016
Vol. 73
No. 8

The Benefits of edTPA

Despite criticisms, edTPA holds great promise to prepare teacher candidates for professional practice.

The idea of using a performance assessment to gauge a teacher candidate's readiness to teach is fairly new and—as with most significant changes—not without controversy. I serve as senior associate dean of the College of Education at Illinois State University (ISU), and my institution has been deeply involved in the implementation of one particular performance assessment, edTPA, since 2010. Informed by my institution's experiences using the assessment with nearly 3,000 candidates, my aim here is to discuss edTPA's background, some of the positive aspects we've observed, and frequent criticisms.

The Beginnings of edTPA

More than a decade ago, California legislation set the wheels in motion to shift from a traditional, multiple-choice test of pedagogical knowledge to the use of a summative performance assessment. Education faculty from various campuses in the University of California and California State University systems partnered with Stanford University's Center for Assessment, Learning, and Equity (SCALE) to develop and validate the Performance Assessment for California Teachers (PACT), a subject-specific portfolio. The portfolio required teacher candidates to collect evidence of their teaching practice and explain their instructional decisions from a brief learning segment (three to five lessons within a unit). Enthusiasm for PACT grew as more faculty worked with it, reviewed candidate performance, and connected results back to curriculum development. As a result, SCALE partnered with the American Association of Colleges for Teacher Education and the Council of Chief State School Officers in 2009 to transform PACT into a nationally available performance assessment, now known as edTPA.

edTPA Today

edTPA calls on candidates to demonstrate their performance in three tasks: planning, instruction, and assessment. As with PACT, candidates focus on their teaching in a learning segment (three to five connected lessons) at participating schools. Candidates compile instructional artifacts, including lesson plans, student work samples, and videos, as they engage students in learning. The candidates also respond to prompts to explain how their instructional choices reflect what they know about their students and the learning objectives.
Pearson manages the scoring of the portfolios. The company recruits, trains, certifies, and monitors the teachers and teacher education faculty who evaluate the assessments. The evaluators score the portfolios using 13–18 rubrics; the number of rubrics used and what exactly they measure vary by subject area, but the rubrics assess skills like planning to support varied student needs, engaging students in learning, and analyzing students' use of academic language. Rubrics are designed with five performance levels: 1 means the performance is unacceptable; 2 means the candidate demonstrates some skills but needs further practice; 3 means the performance is acceptable for beginning teaching; and 4 and 5 capture increasing levels of skill and sophistication. Results are reported per rubric and as a cumulative score across the rubrics. Although edTPA does not assess everything we find important before credentialing a candidate, it inarguably measures a commonly valued denominator: skillful instruction and assessment.
The developers of edTPA followed a theory of action to develop a sound prototype, field-test it, gather extensive user feedback, and convene subject-specific design teams to use that feedback to inform changes. As a result, hundreds of educators, including expert teachers, teacher education faculty, and National Board Certified Teachers, have contributed to the current iteration of edTPA handbooks. Changes have included honing the language of the rubrics and streamlining the commentary prompts to ensure clarity. The handbook for special education has received considerable review and revision over the last four years as members from that field worked to come to consensus about an appropriate standard for beginning professional practice.
Here we are in 2016, and 16 states are using or considering edTPA as part of their credentialing and/or program review systems. Programs in many other states have incorporated the assessment locally as part of their evaluation of candidates. With edTPA now in widespread use nationally, we should soon be able to draw firmer conclusions about its value and impact.

Positive Experiences

With more than 800 candidates in 29 specialized programs, my institution offers a large-scale model of comprehensive teacher education and is uniquely positioned to test edTPA and the system supporting submissions, scoring, and reporting.
We have experienced many positive results using edTPA. Our program faculty have found reviewing edTPA score reports to be extremely valuable for highlighting areas of strength and areas where we need additional focus. For example, when we analyze score summaries, we find our candidates perform very well in the planning and instruction tasks. However, we find their performances in the assessment task less robust, particularly in terms of providing feedback that students can use to deepen their understanding and using assessment results to inform instruction. After discussion, our faculty realized that we present assessment practices fairly late in the curriculum, meaning our candidates have far fewer opportunities to practice and receive feedback on one of the most technically complex aspects of teaching. We are now exploring ways to incorporate assessment practices in earlier methods courses.
Similarly, candidates are able to use their edTPA scores to chart their professional development agenda. The feedback they receive in the form of discrete scores on specific aspects of instructional practice is far richer than the summary scores they received with multiple-choice pedagogy tests. Candidates are able to sit down with a mentor, review their scores along with their portfolio, and reflect on rubric elements to understand how they scored and what they could have done differently. Also, one of our departments is developing a mentoring and induction process in which candidates and supervisors will review several performance indicators, including edTPA, student teaching evaluations, and dispositions assessments, to document current strengths and plan for areas of additional support.
We have found that edTPA brings us into closer alignment with teacher evaluation practices among our P–12 partners. Educator evaluation in Illinois is now largely based on Danielson's Framework for Teaching, and there is strong alignment between it and edTPA. For example, both call for instructional planning based on knowledge of students' strengths, needs, assets, and prior learning. Also, both set expectations for creating a culture that engages students in learning and for using assessments that provide useful feedback to students. We are all now looking at teaching in similar, evidence-grounded ways. As a result, professional dialogue translates more seamlessly across settings. It's worth noting, however, that the Framework for Teaching is more comprehensive than edTPA and addresses a wider range of professional responsibilities, whereas edTPA focuses specifically on effective beginning instructional practice.
Implementing edTPA, as ISU dean Perry Schoon has stated, is a team sport. Making a go of edTPA has led to an unprecedented degree of collegial conversation and collaboration across teacher education programs. California initially hosted an annual implementation conference to share lessons learned and to support collegial problem-solving. That has become a national effort, and several states using edTPA (Georgia, Illinois, and Wisconsin) have created their own implementation programming, including face-to-face meetings, periodic webinars, and regional meetings. Implementing edTPA has renewed our sense of shared enterprise in professional education.

Addressing the Criticisms

To say that edTPA has been met with skepticism is to put it mildly. The fact that Pearson serves as the operational partner is enough to lead many to throw up their hands. After all, many teachers are experiencing testing fatigue, and Pearson is the industry leader behind standardized assessments. Nevertheless, we must acknowledge the need for industry muscle to bring an effort like this to national scale.
I have also heard edTPA framed as "outsourcing," taking the decision of whom to recommend for teaching out of local university hands. I see it differently, as independent corroboration of our local judgment. Those who score edTPA portfolios must meet strict eligibility standards. They must be teachers or teacher educators with subject-specific expertise in the areas in which they score portfolios. They complete a rigorous training program to understand the rubric level progression and to learn to map evidence. Pearson also monitors scorers' performance to ensure high rates of inter-rater reliability.
There are also concerns that edTPA represents an effort to standardize teacher education. In four years of introducing the assessment to different constituents, I have yet to hear anyone argue that any one of its rubrics is irrelevant to effective practice. The edTPA is better understood as framing fundamentally necessary aspects of readiness to teach rather than as an effort to deny other competencies we also value, such as professional conduct.
Another set of concerns addresses the potential negative impact on candidates completing an edTPA portfolio, namely the time involved, the expense to candidates, and the potential bias against culturally and linguistically diverse candidates or candidates working in urban settings. We surveyed our candidates last year, and they reported that they devoted 28 hours, on average, over the course of five weeks to the preparation of their edTPA portfolio. This does not seem excessive. Given the strong alignment between edTPA's tasks and the work of student teaching—planning and enacting instruction, assessing student learning, and reflecting on the effectiveness of their practice—it also does not seem distracting.
Compared with other state-required exams in teacher education, edTPA is more expensive, but I argue it provides a greater return on investment. Candidates pay $300 for edTPA, versus $99 for the traditional pedagogy test, but the feedback they receive is far richer: it can guide candidates' early professional development priorities, and it prepares them for the professional evaluations they will experience as teachers.
The 2014 edTPA administrative report includes analysis of candidate performance across different demographic factors. The average candidate score within each racial, ethnic, and linguistic subgroup exceeded the national recommended cut score of 42. Comparing average performance between white and Hispanic candidates, there was no statistically significant difference. (Comparisons among other groups are limited by small subgroup sample sizes relative to the white candidate population.) Regarding teaching context and potential bias, SCALE reported that candidates in urban settings achieved the highest mean score (45.84) of all settings. Results to date do not support the criticism that the assessment is biased against certain subgroups or teaching contexts, though this will certainly continue to be monitored.
I suppose I would be among the critics if I saw edTPA as a wild departure or extraneous distraction from the core of work in teacher preparation, but I don't see it that way. Not at all. As one of our science education candidates shared,
Completing edTPA was key to preparing me for my professional evaluations. Although I didn't realize it at the time, going through edTPA prepared me to show evidence of my professional practice, assess my students' learning and respond appropriately, support my students' learning, and keep my students at the center of my instructional decisions. This assessment enabled me to show what I could do as a teacher and that I was ready to have my own classroom.
End Notes

1. SCALE. (2015). Educative assessment and meaningful support: 2014 edTPA administrative report.

From this issue: The Working Lives of Educators