November 1, 2006 | Vol. 64, No. 3

A Tale of Unintended Consequences

An ESL teacher finds that pressure to raise reading rates has kept her from meeting students' true needs.

“On your marks, get set, go!” I started my timer and circulated around the room, listening as students hurriedly began reading aloud to their partners, who followed along on their own copies. After one minute, I told everyone to stop and asked the listening partners to point out any errors the speed-reader had made. Then these 4th graders added up the number of words they had read accurately in a minute and recorded the number in their personal reading-fluency logs. Murmurs of pride and frustration could be heard around the room, as some students saw evidence of growth and others found they were reading no faster than before.
This was a typical “fluency practice session,” a daily routine I instituted with a group of 4th graders at Gold Hills Elementary School, to whom I was providing English language arts instruction within a bilingual program. All 17 students were native Spanish speakers, and 12 were classified as Limited English Proficient. Their 4th grade year was the first time these students had received language arts instruction in English instead of Spanish.
The fluency drills were an attempt to raise students' scores on my school district's reading fluency assessment. This assessment was one of six assessments of language arts skills associated with the Open Court reading system that my district required teachers to administer five times during the year. (The other assessments tested reading comprehension, grammar and usage skills, writing, vocabulary, and spelling.) For the fluency assessment, I listened to students read two different unfamiliar grade-level passages for one minute each and recorded the number of words they read accurately. I then entered the students' average words per minute into a districtwide database.

Why Fluency—and Why Speed?

Administrators were closely watching my school's assessment results in 2004–2005. Gold Hills was a Year 4 Program Improvement School under No Child Left Behind (NCLB). This meant that our school was threatened with closure or conversion to a charter school if we didn't improve our math and language arts scores on the California Standards Test (CST). By midyear, when few of my students had achieved the benchmark for reading fluency, I responded to the pressure by instituting fluency practice sessions.
District leaders chose fluency as a focus because they considered reading rate (which is part of fluency) a reliable predictor of success on the English Language Arts section of the CST. Proponents of instruction in reading fluency point to the importance of automaticity, the ability to decode words quickly and effortlessly. Both Clark (1995) and Rasinski (2003) argue that becoming a fluent reader is a matter of making decoding routine so that the reader can direct his or her conscious attention to comprehending the text. Hudson, Lane, and Pullen (2005) cite a correlation between reading rate and comprehension and argue that automaticity frees the reader to focus on understanding the text and that reading rate is a useful measure of a student's automaticity.
However, fluency is not only about speed; it also involves phrasing, expressiveness, and the ultimate goal of improving comprehension. Moreover, Rasinski (2003) warns that

although reading rate appears to be a good measure of the decoding automaticity component of reading fluency and of reading achievement in general, it does not mean that students should receive overt and intensive instruction and practice in becoming fast readers .... If teachers provide the kind of instruction in fluency that works, then fluency, comprehension, and rate will improve. If teachers choose instead to focus primarily on developing students' reading rate at the expense of reading with expression, meaning, and comprehension, students may read fast but with insufficient comprehension. (p. 8)
Ironically, the message I received from my district—that I should explicitly teach my students to read faster—led me to institute just the sort of instruction Rasinski cautions against.

What the Focus on Speed Yielded

At the end of the 2004–05 school year, I looked back to see whether Rasinski's caution had been apt, or whether the speed drills had led my 4th graders to read better overall and perform better on the state test. I found that instruction aimed at helping students read more quickly did not have a significant effect on their overall reading skills. Most students made minimal progress in increasing their reading speed. Further, even the few who did meet the district's benchmark for 4th grade reading speed (117 words per minute) did not consistently score higher on other reading tests, including the language arts section of the CST.

The Fast Get Faster...

In June 2005, when I administered the final of my five rounds of timed fluency assessment, I celebrated my students' success: 6 of 17 had finally reached the district's benchmark. This represented a dramatic improvement over the previous four rounds, when only one student met the benchmark. As I looked more deeply into my data, however, some disturbing trends became clear. Those six students who met the district's goal started the year reading much more quickly than most of their classmates and had stronger English skills. I was even more upset to discover that those six students made twice as much growth in reading rate as did the rest of the class. As a group, their average change from round one to round five was 37 words per minute. By contrast, the average change among the below-benchmark students was only 18 words per minute, and five students improved fewer than 12 words per minute. Although an intense focus on reading speed had supported six students' dramatic growth, I had allowed many students to stagnate as I provided them with inappropriate instruction.

...But the Test Scores Don't Follow

Considering my district administrators' emphasis on boosting reading rate, I expected to see an obvious correlation between my students' success on the timed fluency test and their success on other reading assessments. Instead, when I examined how students' fluency scores related to their scores on the Open Court reading comprehension assessment and the CST, I found unpredictable results. The students who read the fastest were not consistently the best performers on other measures of reading, including the all-important CST for which these benchmark assessments were supposed to prepare them.
On the fifth round of the Open Court language arts assessments I gave my students (the last round for the year), six students from my class met the benchmark for reading fluency, but none met the target for reading comprehension. In fact, our class's average reading comprehension score fell from 6.2 (of 10) in the fourth round to 4 of 10 in the fifth round, whereas these students' average words read per minute jumped from 86 to 94 in the same time span. When I averaged each student's reading comprehension scores from all five assessment rounds, I found that the two students with the highest average scores on the comprehension tests were also two of the students who met the fluency benchmark, as one might hope. But two other students who met the fluency benchmark had among the five lowest average reading comprehension scores. Faster reading did not correlate with deeper understanding.
I saw little or no correlation between these students' fluency assessment results and their scores on the CST. Only 5 of my 17 students scored at the levels of Below Basic or Far Below Basic on the language section of the CST; two of those five students were high scorers on the fluency test. At the top end of the scale, I had three students who scored Proficient on the CST, and only one of them met the fluency benchmark. I found this particularly unsettling, given that I was pushed to implement Open Court, administer the corresponding skills assessments, and practice for speed in an effort to raise CST scores.

Unintended Harm

Although I had a small sample size for this observation of the effects of timed reading drills, my students' scores indicate there was little payoff from this strategy in terms of test results. Yet there were unintended negative consequences.
There are advantages to the Open Court fluency assessment. It requires a teacher to spend fewer than five minutes per student and provides an easy-to-understand numerical result. Unfortunately, it also ignores many important elements of reading fluency and paints an overly simplistic picture of students' reading abilities. One consequence of emphasizing such a superficial assessment in the name of efficiency—and using the results to judge students, teachers, and schools—is that teachers like me feel pressured to teach in ways that we know to be inappropriate for our students.
Practice in fast reading was not what my students needed; they really needed decoding and comprehension instruction. Interestingly, my students themselves knew this. In class discussions, I asked my students what makes a good reader and whether it's important to read fast. All except one told me that good readers read slowly, reread, and ask and answer questions about the text as they read to help them get meaning. As English language learners, many of them still struggle with decoding in English, particularly with unfamiliar vocabulary, and they realize it's important to read slowly so if they make a mistake they can go back. Students described getting nervous and reading words incorrectly when they read too fast. As Yolanda said, “I say, ‘I'm gonna lose, I'm gonna lose, I better read faster,’ and then I skip words or say the words wrong.” Explicitly teaching students to read quickly may have undermined their progress toward the underlying goal of fluency instruction—comprehension.
I felt compelled to use this strategy, however, because of the high stakes that school leaders attached to this assessment. I reported my students' results on the timed fluency assessment on report cards, and class results were publicly displayed at school. Administrators required my colleagues and me to submit plans for improving scores on the timed fluency test, and my principal discussed my students' scores with me during evaluation conferences. District administrators closely monitored our timed fluency scores as a primary indicator of student achievement. None of the other required Open Court assessments received as much attention.
The school did not reap many benefits from this focus on fluency. At the end of the 2004–05 school year, Gold Hills did not make adequate yearly progress in language arts. The school was closed, redesigned, and reopened as what our school district calls a “small autonomous school,” although with most of the same staff and students. I left Gold Hills after that year and moved to a nearby district that does not require teachers to use Open Court teaching or assessments. But I understand that Gold Hills is continuing along the same path.

A More Balanced Approach

I can envision a fluency assessment that would help support good literacy teaching. A better assessment might still measure reading rate and accuracy. But rather than just beating the clock, students would read aloud texts at their instructional levels and either retell the material or answer comprehension questions after reading. Such an assessment would give educators more accurate information about students' reading abilities and encourage teachers to diagnose and address strengths and weaknesses.
We could also avoid harm by lowering the stakes. Why not view the timed fluency test as just one measure among many that inform teachers' understanding of students' reading abilities? If my district had emphasized speed less, I could have focused more on other elements of reading, such as comprehension—which, ironically, might have been more effective in helping students read faster.
References

Clark, C. H. (1995). Teaching students about reading: A fluency example. Reading Horizons, 35(3), 251–265.

Hudson, R. F., Lane, H. B., & Pullen, P. C. (2005). Reading fluency assessment and instruction: What, why, and how? The Reading Teacher, 58(8), 702–714.

Rasinski, T. (2003). Beyond speed: Reading fluency is more than reading fast. The California Reader, 37(2), 5–11.

End Notes

1. Gold Hills Elementary School and the student names are pseudonyms.

From our issue: NCLB: Taking Stock, Looking Forward