July 23, 2018
ASCD Blog

Two Strategies for Assessing for Learning: The Partial Credit Scoring Key and the Scoring Guide

    Assessment
      Written by Brent Duckor and Carrie Holmberg 
      These two assessment strategies can help you with your next-steps decision making. Both work by drawing attention to patterns in student responses, and specifically to the categories you use as you make sense of those responses and discover patterns.
      You categorize student responses all the time. This is binning. The art and science of binning—of building bins—can play a powerful role in
      • assessing where students’ understanding is,
      • diagnosing where students might be stuck or hold misconceptions, and
      • generating the kinds of feedback students need: right-sized, next steps-oriented feedback they actually use to move their learning forward—formative feedback.
      How teachers build bins matters.
      Here are two ways of building bins that teachers we’ve worked with say improve the quality of information they take in about students’ performance. The first is a modification of a traditional answer key for scoring a multiple choice question: the partial credit scoring key. The second is an alternative to the rubric: the scoring guide. We present these and other strategies for binning, including binning for feedback, in chapter 7, “Binning,” of our Mastering Formative Assessment Moves: 7 High-Leverage Practices to Advance Student Learning (2017).

      The Partial Credit Scoring Key

      The partial credit scoring key can alert teachers to major and minor misconceptions. Compare the same multiple choice question in Figure 1, scored two different ways. On the left side, all responses are scored with only two bins: correct and incorrect. On the right, the same responses are scored with five bins that distinguish among different kinds of incorrect responses, thereby improving the teacher’s choice of “next steps” in the instructional cycle.
      Figure 1. A typical multiple choice question on a geography quiz scored with two bins (correct and incorrect) and an ordered, multiple-choice question with five bins (correct, partially correct, misconception, major misconception, incorrect).
      To help students learn, your attention to the different categories of—and class-wide patterns in—incorrect responses makes a difference. Imagine you are examining two class sets of responses to this question, one scored the first way (with two bins) and one scored the second way (using five bins). Which way is going to give you a better snapshot of where student understanding is?
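      If you tally class sets digitally, the contrast is easy to see in a few lines of code. The sketch below is a minimal illustration in Python; the option-to-bin assignments and the class responses are invented for the example, not taken from the actual item in Figure 1.

from collections import Counter

# Hypothetical partial credit scoring key: each answer choice maps to one of
# five ordered bins. (Illustrative only; not the actual Figure 1 item.)
PARTIAL_CREDIT_KEY = {
    "A": "correct",
    "B": "partially correct",
    "C": "misconception",
    "D": "major misconception",
    "E": "incorrect",
}

# The traditional two-bin key derived from the same item, for comparison.
TWO_BIN_KEY = {
    choice: ("correct" if label == "correct" else "incorrect")
    for choice, label in PARTIAL_CREDIT_KEY.items()
}

def bin_responses(responses, key):
    """Tally a class set of responses into the bins defined by a scoring key."""
    return Counter(key[r] for r in responses)

# An invented class set of responses to the same question.
class_responses = ["A", "C", "C", "B", "D", "A", "C", "E", "B", "C"]

print(bin_responses(class_responses, TWO_BIN_KEY))
# Counter({'incorrect': 8, 'correct': 2})  -> most students missed it, but why?
print(bin_responses(class_responses, PARTIAL_CREDIT_KEY))
# Counter({'misconception': 4, 'correct': 2, 'partially correct': 2,
#          'major misconception': 1, 'incorrect': 1})
# -> one specific misconception dominates, which points to what to reteach

      The same student work yields very different snapshots depending on the bins used to score it; that difference is what makes the five-bin tally actionable.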

      Benefits

      If used “formatively” in a quiz, results from the partial credit scoring key might allow you to bin and sort student misconceptions on a Friday, guide how you reteach certain information on Monday, and inform how you design new lesson activities through the next week to reinforce a concept (or facts) before the final unit test.

      Apply to Your Practice

      The next time you decide to check for understanding with a quiz, see how many of these types of items (questions) you can write and which ones will give you a breakdown of who is still laboring under a misunderstanding.
      If you cannot anticipate a range of misconceptions, set up a routine with an open-ended “entry” or “exit” slip. Ask the question. Scribe a word web with all student responses. Probe on productive misconceptions that might be shared. Ask for an explanation or justification. The next time you teach this unit you will have a few useful distractors, along with the correct answer, to use diagnostically or formatively “on the fly.”
      See whether or not these new, ordered multiple-choice items (Briggs, Alonzo, Schwab, & Wilson, 2006) offer more insight into student thinking. We like these sorts of “fixed-choice+” items—where the “+” is your asking for explanation or justification—because at least they open up the possibility for meaningful dialogue during classroom instruction.

      The Scoring Guide

      The scoring guide is an alternative to the rubric. It is a simpler binning strategy for teachers than the analytic rubric. Like rubrics, a scoring guide can be used by students for peer- and self-evaluation.
      Figure 2 is an example of a scoring guide. This particular scoring guide was used to bin student writing in response to the prompt, “After reading about the Dust Bowl, would you agree or disagree with this statement: The Dust Bowl was a man-made problem with man-made solutions?”
      Notice the ordering of the four bins employed. Note also the space for teachers and students to jot down what to do next, that is, to self-assess. The teacher sees patterns across the class; the students can identify their zone of proximal development and the reach required to revise successfully.
      Figure 2. Scoring guide for a performance task that asks students to take a position and provide evidence for that position.
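      For teachers who track this work in a spreadsheet or a short script, a scoring guide is easy to represent as an ordered set of levels plus a notes field for next steps. The Python sketch below is a hypothetical illustration; the level names, descriptors, and student entries are invented and do not reproduce the guide in Figure 2.

from collections import Counter
from dataclasses import dataclass

# Minimal sketch of a scoring guide: ordered levels, each with a descriptor.
# (Level names and descriptors are placeholders, not the Figure 2 text.)
@dataclass
class Level:
    name: str
    descriptor: str

@dataclass
class Entry:
    student: str
    level: str
    next_steps: str = ""  # space for the teacher or student to note what to do next

SCORING_GUIDE = [
    Level("4", "Takes a clear position and supports it with specific evidence"),
    Level("3", "Takes a position; evidence is present but general"),
    Level("2", "Position or evidence is unclear or missing"),
    Level("1", "Response does not address the prompt"),
]

# Invented class entries after binning a set of Dust Bowl responses.
class_entries = [
    Entry("S1", "3", "Add a quotation from the Dust Bowl reading"),
    Entry("S2", "2", "State your position in the first sentence"),
    Entry("S3", "3", "Explain how your evidence supports your claim"),
    Entry("S4", "4", "Try addressing a counterargument in your revision"),
]

# Scan "results" by level to see class-wide patterns before deciding next steps.
print(Counter(entry.level for entry in class_entries))
# Counter({'3': 2, '2': 1, '4': 1})

      The next_steps field is the part that matters most formatively: the level tally shows the class-wide pattern, while the notes carry the specific revision move each student is expected to make.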

      Scoring Guides Differ From Rubrics In Important Ways

      Compared to rubrics, scoring guides tend to contain less of the verbiage and dysfunctional detail that can easily overwhelm many students (see, e.g., Popham, 1997). We like how this tool reduces the extraneous cognitive load on students while increasing the germane load by reinforcing the importance of providing comments to explain their reasoning. Students using a tool for making their thinking visible and explaining their reasoning on a performance task should be able to focus quickly, not spend undue time and effort deciphering an incredibly busy rubric.
      The scoring guide, like the partial credit scoring key, is a good tool for awarding partial credit. But the scoring guide is much more.

      Use Scoring Guides To Generate Targeted, Right-Sized Feedback.

      Scoring guides have great potential for binning for feedback during instruction, when a teacher needs to check for understanding with a lighter touch. As with rubrics, teachers can use scoring guides to process, sort, and bin student response data. Scoring guides provide a clear, articulated justification for partial credit on constructed response tasks, short answer items, and “try now” activities.

      Try Using Without Giving Numeric Credit

      Note that even though it is called a scoring guide, teachers and students can use one meaningfully without ever awarding, assigning, or giving “credit” or points. Research backs this practice: when a goal is supporting students’ intrinsic motivation, separate points and grades from the use of the scoring guide.
      The two most important aspects of any scoring guide are
      • how (and why) student responses are binned and
      • what students and teachers do next based on what they have learned from the process of binning with the scoring guide.
      For the scoring guide to be formative, and for students to use it to self- and peer-evaluate their work, the focus should be on “levels,” not points, and, most importantly of all, on revising. Students need to know that answering “What do I do next?” means acting on the scoring guide notes and descriptions, whether prompted by a peer or teacher “request” or by their own self-generated next steps. Ownership and follow-through are key with this tool.
      Teachers who use scoring guides can also scan “results” by response level and see patterns and trends in a class. They can use this information to make decisions about students’ current levels of understanding and what to do next. We use quotation marks on the word “results” because the students’ “results” are, of course, temporary. The work, the performances, will be revised.

      A Key Benefit

      As students revise their work based on the feedback the scoring guide helped generate, they become more familiar with how their strategic effort links to improvements in the quality of their work/performance. What teacher doesn’t want their students to develop or hone this “lifelong learning skill”?
      To score or not to score: That is not the question for the formative assessor. Rather, we need assessment tools and items that send a consistent message: we are learning to revise our work and our “first draft” thinking in this unit. Whether through scoring guides or partial credit multiple-choice item formats, our goal is to shine a light on the current level of student understanding before the final test, performance task, or project.
      Tools that help you bin for feedback, tools such as the scoring guide, support your efforts to truly assess for student learning. This sort of disposition towards learning—as a process that requires constant calibration and consistent feedback—can help your students achieve while setting up a culture in your school and classroom that truly respects growth and development of the whole child into adulthood.

      Brent Duckor, Ph.D., is an associate professor in the Teacher Education Department and the Ed.D. program in Educational Leadership at San José State University. Carrie Holmberg, Ed.D., NBCT, is a lecturer in the Teacher Education Department at San José State University. Their first book together, Mastering Formative Assessment Moves: 7 High-Leverage Practices to Advance Student Learning, was published by ASCD in June 2017.

      Brent Duckor, an associate professor in the Department of Teacher Education at San José State University in California, teaches courses and workshops that provide historical, analytical, and empirical lenses to help students better understand the role of assessment, testing, and evaluation in education.

      He also serves as a researcher, studying teacher learning progressions in formative assessment. His research on teachers' understanding and use of formative assessment in the K–12 classroom, and on the validation of teacher licensure exams in state, national, and international contexts, seeks to integrate a developmental perspective on teachers' growth in the profession. Duckor's core research on teachers uses mixed methods to investigate trajectories of high-leverage assessment practices and to examine how to model progress variables using Item Response Theory (IRT).

      Prior to San José State University, Duckor taught government, economics, and history at Central Park East Secondary School in New York City in the 1990s before returning to the University of California, Berkeley, to study education measurement, testing, and assessment with the passage of No Child Left Behind.

       
