Linking Progress Monitoring Results to Interventions

“...I’ve really gotten the necessity of assessment. Without assessing our students, we really don’t know what they know and we don’t know what they’re learning... Certain approaches that may be effective for most of the class may be leaving one or several students in the dark... I would suspect that they are often capable, but the instruction itself is not meeting their need. Maybe they need more practice, maybe they need to have the material presented in an alternative form, maybe they just need a little one-on-one instruction from a teacher or a peer.”
— Mr. Beckwith, a beginning special education teacher

Progress-monitoring assessment is becoming more widely adopted than ever before as a means of tracking the reading development of students with dyslexia and reading difficulties. Both general and special education teachers are incorporating frequent curriculum-based assessments into their routines and tracking the data to monitor growth. However, such assessment is of little use to the classroom teacher without continual analysis of data to determine the appropriate intervention and level of intensity for each student. Mr. Beckwith, the special education teacher quoted above, is quite aware of the importance of using assessment as he plans learning activities for his students. He knows that he has to think about the instruction that he provides to his students so that each one will learn.

Other articles in this issue discuss the why and how of progress-monitoring assessment, so the focus of this article is on using data to make meaningful instructional decisions. The linkage between assessment and instruction is an often-overlooked aspect of teaching that can make a critical impact on student outcomes. Teachers need to know how to select appropriate materials and instructional programs based on students’ skills. With data, teachers can answer such questions as, “What is Megan’s instructional reading level?” “What specific skills does Megan need to work on?” and “Is this decoding program that I am using helping Megan to make adequate progress?”

In recent years, our research has supported teachers in using curriculum-based measurement tools for conducting screening assessment to identify students at risk for reading difficulty. Reading intervention has been provided as part of this research to address the students’ specific learning needs and track their progress. Working with teachers participating in this study, the PLUS Project (Promoting Literacy in Urban Schools) has provided insight into how teachers can best use progress-monitoring data to decide the type and intensity of instruction struggling students may need.

What Can Be Learned from Progress Monitoring

With a screening assessment, teachers identify struggling readers and outline a plan for forming intervention groups, small groups of students with similar learning needs. Then it is important that they have a progress-monitoring assessment system in place. Progress-monitoring assessment fulfills two main purposes: to assess students’ academic progress and to evaluate the effectiveness of intervention. Both purposes require collecting data points frequently enough that teachers can continually monitor progress toward a specific learning goal. In reading, the progress-monitoring tool would be based on the student’s specific areas of need and the grade-level standard that is the target of intervention. Progress-monitoring assessments are typically short, fluency-based assessments that can be given frequently (once a week or every two weeks) to monitor the student’s learning.

One type of information provided by progress-monitoring assessment is comparative. Teachers can examine a round of data collection to see how an individual student’s performance compares to other students in the class, academic standards, or others in the same intervention group. For example, Ms. Roth is monitoring the progress of three students in her fourth grade class. She is collecting timed oral reading fluency checks once a week. Each week, she compares the students’ scores to the fluency rates of their classmates and asks herself if these students are within range of catching up. Her school district has also issued grade-level norms, based on the work of Hasbrouck and Tindal (2006). According to the norms chart, the average fourth grader would read about 112 words correct per minute (WCPM). When she looks at the three recent scores, she sees that two of the students have made significant gains from scores in the 80s to 100–105. The third, however, has not moved beyond 85. Ms. Roth uses this information to consider regrouping the two progressing students into other intervention groups in the class and arranging for the third student to receive a more intensive intervention with a reading specialist.
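Ms. Roth’s weekly comparison amounts to a simple calculation. As a minimal sketch (the function, the 15-WCPM margin, and the generic student labels are illustrative assumptions, not part of any published tool), assuming the fourth-grade norm of 112 WCPM from Hasbrouck and Tindal (2006):

```python
# Flag which monitored students appear within range of catching up,
# using their latest WCPM scores and a grade-level norm.
GRADE_NORM_WCPM = 112  # average fourth grader (Hasbrouck & Tindal, 2006)

def within_range(score, norm=GRADE_NORM_WCPM, margin=15):
    """Illustrative rule: a student is within range of catching up
    if the latest score is no more than `margin` WCPM below the norm."""
    return norm - score <= margin

latest_scores = {"student 1": 105, "student 2": 100, "student 3": 85}
for name, wcpm in latest_scores.items():
    status = "within range" if within_range(wcpm) else "consider more intensive intervention"
    print(f"{name}: {wcpm} WCPM, {status}")
```

With these illustrative numbers, the first two students fall within the margin while the third does not, mirroring Ms. Roth’s decision to regroup the first two and arrange more intensive help for the third.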

Using Progress Monitoring to Change Instruction: Two Case Studies

Ms. Reeder is an experienced general education teacher who has taught various elementary grades for 7 years at Applewood Elementary School. This year, Ms. Reeder is teaching second grade. A few weeks into the school year, it is apparent that two students, Miguel and George, are having difficulty with reading. Ms. Reeder needs to know about the students’ skills so that she can effectively place them in appropriate instructional groups and plan the best way to assist them in meeting grade-level goals.

The Case of Miguel

Miguel is a seven-year-old boy who came to the United States from Guatemala with his family when he had just turned five years old. He entered kindergarten at that time and has always had good attendance. Though Miguel’s family has lived in the same city since moving to the United States, they have only recently settled into a long-term housing situation. Miguel is currently in his fifth school in three years. At home, Miguel and his three siblings (two older and one younger) speak some English, but mostly Spanish. His parents, who are enrolled in an English as a second language class, encourage the children to speak English; however, Spanish is the language of choice in most of their family interactions. While Miguel’s parents and older siblings are able to read in Spanish, Miguel had not yet learned to read in his primary language when he entered kindergarten in the United States.

The Case of George

George is 8 years old and new to the Applewood School. His family just moved to the area because of his mother’s job. He lives with his mother, new stepfather, and an older sister, spending some holidays and part of the summer with his father, who lives about an hour away. School records indicate that George has struggled academically and socially since entering school. In kindergarten, George’s teacher and parents agreed that he lacked the maturity to complete the year successfully, citing his late birthday, which made him among the youngest children in the class. He repeated kindergarten and seemed to be improving until his parents separated toward the end of that year. George’s mother hopes that a fresh start in a new school will help him settle down behaviorally, make some friends, and gain proficiency with his basic academic skills.

Baseline Assessment Results for Miguel and George

Upon entering Ms. Reeder’s class, Miguel and George were given a baseline assessment of their oral reading fluency using DIBELS (Dynamic Indicators of Basic Early Literacy Skills; Good & Kaminski, 2001). Because Miguel and George were each reading at about the same rate, approximately 25 WCPM, Ms. Reeder knew they were both at risk for failing to attain grade-level proficiency. DIBELS provides benchmark goals for fall, winter, and spring. Using the fall benchmark scores for second grade, they would need to read about 44 WCPM to be on their way to meeting the end-of-year level of 90 WCPM. Initially, Ms. Reeder placed the two boys in the same reading intervention group and used a combination of phonics and reading fluency instruction to try to improve their ability to read meaningful, rich literature and expository texts.

An Instructional Plan Based on Data

Ms. Reeder knew that, according to reading research, weekly gains of 1.5–2 words per minute in reading fluency were feasible if the reading instruction were rich enough (Fuchs, Fuchs, Hamlett, Walz, & Germann, 1993). Ms. Reeder drew an aim line, indicating a slope of about two words improvement per week, onto the progress-monitoring graph for each student. This rate of growth, which is attainable during the 13 weeks available for intervention, would result in each boy reading about 75 WCPM. Although this rate of growth would not be enough to move the boys into a category of grade-level proficiency, it would move them out of the “at-risk” category and it is an achievable goal. It is important to help students reach and surpass grade-level expectations; however, Ms. Reeder did not want to frustrate her students or herself by setting goals that would be impossible to achieve.
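The aim line Ms. Reeder drew is simply a straight line from each boy’s baseline score with a fixed weekly slope. A minimal sketch of that arithmetic (the function names and the 2-WCPM on-track tolerance are illustrative assumptions), using the roughly two-words-per-week growth rate from Fuchs et al. (1993):

```python
def aim_line(baseline_wcpm, weekly_gain, weeks):
    """Return the aim-line target WCPM for each week, 0 through `weeks`."""
    return [baseline_wcpm + weekly_gain * week for week in range(weeks + 1)]

def on_track(score, target, tolerance=2):
    """Illustrative rule: a weekly score is on track if it falls no more
    than `tolerance` WCPM below the aim-line target for that week."""
    return score >= target - tolerance

# A baseline of 25 WCPM with a gain of 2 words per week over 13 weeks.
targets = aim_line(baseline_wcpm=25, weekly_gain=2, weeks=13)
print(targets[:4])               # targets for weeks 0 through 3
print(on_track(27, targets[3]))  # is a week-3 score of 27 WCPM on track?
```

Plotting each week’s score against these targets makes it easy to see, as Ms. Reeder did by the third week, when a student is falling below the line and the intervention needs to change.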

At first, George made progress in this intervention group; his weekly fluency scores, taken during progress monitoring, matched the aim line. Miguel’s fluency was not increasing as quickly, but Ms. Reeder’s real concern was that by the third week of intervention and progress monitoring, neither boy appeared to be on track to meet his fluency goal. Ms. Reeder decided to conduct further informal assessment to pinpoint the reason each student was continuing to struggle with reading fluency.

Further assessment. Ms. Reeder administered additional DIBELS probes to George and Miguel. She selected the Phoneme Segmentation Fluency (PSF) test, and neither boy had any trouble with segmenting and blending words. On the Nonsense Word Fluency (NWF) test, George was in the no-risk category, but Miguel struggled. Ms. Reeder analyzed Miguel’s errors on the NWF and then used an informal reading inventory to diagnose more precisely the problem he was having with phonics. Analysis of Miguel’s phonics skills showed that vowels were a significant weakness for him, particularly long-vowel patterns such as CVCe words and vowel teams (ea, ee, and ai), as well as diphthongs such as oi and ow.

Closer inspection of George’s fluency problems, using an informal reading inventory, indicated that he was not using context when he read, and that he rarely self-corrected his errors. In addition, unfamiliar vocabulary words appeared to slow him down. Ms. Reeder’s anecdotal records indicated that George tended to fidget, rock back and forth, and make excessive involuntary facial contortions during reading group time. He frequently left his seat during independent work time, preferring to walk around the room and talk to other children.

Armed with this information, Ms. Reeder moved Miguel into a phonics intervention group and put George in a fluency group working with text from a late first-grade level.

Intervening with Miguel. Ms. Reeder’s phonics and decoding group, which included three students, worked with her for 15 minutes each day while the rest of the class was engaged in independent reading and writing activities. Each child in this group had problems similar to Miguel’s in the areas of phonics and letter-sound correspondence. This group worked extensively on phonics activities that engaged the children in listening, speaking, reading, and writing. Each intensive intervention session included practice with letter-sound correspondence at the word level, both expressively and receptively, and the use of decodable text for reading practice in connected text (Menzies, Mahdavi, & Lewis, in press).

Miguel’s group enjoyed Making Words, the program by Cunningham and Hall (1994). Students were given eight letter tiles that they used to form words that Ms. Reeder dictated from a list. The words followed particular patterns: long and short vowels, blends, and word families. After using the tiles to make words, the children wrote the words on their individual whiteboards. They also sorted the words, written on cards, by sound and spelling patterns. A Making Words lesson might last only 15 minutes, but it provided many opportunities for students to hear, say, see, and spell words with phonetically regular patterns.


Books for Miguel’s group were selected with easily decoded text so that the students could practice their decoding skills within the context of connected and interesting passages. During each intervention session, the children read at least one of these short, decodable books. They participated in choral reading, as well as whisper reading (each individual reads independently in a whisper so that the teacher can hear mistakes and make sure that the students use decoding strategies). Ms. Reeder also sent these books home with the students for reading homework.

Progress-monitoring assessments in oral reading fluency were given to each child in Miguel’s group each week during the intervention period. Miguel’s chart shows that after two weeks of intensive phonics intervention, his reading fluency scores rose to meet the aim line (see weeks 9 and 10 on Miguel’s graph, Figure 1). Ms. Reeder decided to keep Miguel in the phonics group for now, working with him and the others on increasingly difficult phonetic patterns. She will also continue to monitor their progress each week with the DIBELS Oral Reading Fluency measure to evaluate the intervention and decide if it should be continued or changed.

Intervening with George. The fluency group that George joined consisted of four other children who struggled more with reading comprehension than decoding. As a group, they were unlikely to be able to identify the main idea of a passage they had just read. Like George, they did not seem to notice when, as they read, they substituted a nonsense word, or a word that made no sense in the context of the sentence, for the one that was printed. Rates of self-correction of errors when reading were very low in this group. In addition, the oral language skills of George and one other student in the group lagged behind those of their more typically developing peers. They lacked facility with academic vocabulary as well as the vocabulary used in grade-level fictional texts. Ms. Reeder knew that this group of children needed to increase their reading comprehension and widen their vocabularies to read more fluently and more accurately.


This comprehension group met with Ms. Reeder for about 15 minutes each day of the week. Ms. Reeder carefully selected books for this group to read and combed them for relevant and interesting vocabulary words. Once she selected words that her students needed to learn, she used various strategies to teach them. On some days, she would act out the words and then have the students do the same. Sometimes she would have the children draw pictures and write their own definitions of the words. Often, Ms. Reeder used photographs, cartoons, or hand and body motions to help the students understand the vocabulary (Gersten, Baker, Haager, & Graves, 2005).

In addition to vocabulary, Ms. Reeder taught the children to ask themselves, “Does that make sense?” and to stop and retell the story at frequent intervals. During these comprehension checks, Ms. Reeder refrained from telling students what they had just read. Rather, she created frequent opportunities for the children to talk among themselves as they read, explaining to one another what they had read and clarifying main ideas, supporting details, and other story features. As part of these activities, Ms. Reeder also asked the children to identify vocabulary words in the story that they did not understand. She would then ask one child to lead the others in activities to help them all understand the words. These strategies were all designed to make the children more responsible for their own learning and more independent in their reading.

Both the vocabulary and comprehension activities for this group were conducted in the context of rich pieces of late first-grade level literature, as well as expository text, selected to appeal to the children in the group. Progress was monitored based on the results of the DIBELS Oral Reading Fluency measure administered each week during the intervention period. George’s chart (see Figure 2) indicates that the vocabulary and comprehension activities began to show results in his reading fluency by week nine, after two weeks in that group. Ms. Reeder decided to continue her vocabulary and comprehension interventions with this group, particularly with George, for as long as his ORF scores continued to match the aim line.

Solving the Puzzle

Pulling together all the pieces of the reading puzzle is a difficult job. Teachers such as Ms. Reeder, Ms. Roth, and Mr. Beckwith have to make many decisions about which assessments to use, how to monitor progress on an ongoing basis, which materials will teach needed skills, and how to organize their time to intervene with students of very diverse levels of ability. In addition, juggling the demands of state standards and end-of-year testing with the needs of individual students who may be reading below grade level is complex and sometimes daunting. Often, teachers reject the idea of progress monitoring and intensive intervention without trying it because they cannot see how these activities can be incorporated into their already busy schedules.

Progress monitoring applied to focused and intensive intervention does not have to be overwhelming. Teachers must consider everything in their bag of teaching tricks when selecting interventions. More than likely, they have favorite phonics games, context-building strategies, vocabulary development activities, and much more that could very appropriately supplement the core reading curriculum in a classroom. In addition, core curriculum items can be used to reteach skills to small groups of two to five children.

Determining how to group children who need similar reading instruction is, like anything, complicated the first few times it is attempted. It helps to remember that students are not permanently assigned to a group. Groups are reformed as children make progress at different rates. No intervention group is a long-term commitment; if a child is not making progress in one group, then the teacher can move the student to another group. In fact, changes might be the norm rather than the exception when it comes to reading intervention based on progress-monitoring data.

Finding the time to assess and monitor progress can be tricky; however, the power to select and implement instruction with the precision of a laser beam (rather than the scattered approach of a shotgun) cannot be discounted. Teaching children to read can be accomplished much more effectively and efficiently if the teacher knows what skills the child already has and which ones need to be developed, enabling him or her to put together an intervention package exactly tailored to the child’s needs. The time lost to assessment may well be regained when the student progresses more quickly after receiving the appropriate intervention at the moment it is needed most.


References


Cunningham, P. M., & Hall, D. P. (1994). Making words. Parsippany, NJ: Good Apple.

Fuchs, L. S., Fuchs, D., Hamlett, C. L., Walz, L., & Germann, G. (1993). Formative evaluation of academic progress: How much growth can we expect? School Psychology Review, 22, 27–48.

Gersten, R., Baker, S. K., Haager, D., & Graves, A. W. (2005). Exploring the role of teacher quality in predicting reading outcomes for first-grade English learners: An observational study. Remedial and Special Education, 26, 197–206.

Good, R. H., & Kaminski, R. A. (2001). Dynamic Indicators of Basic Early Literacy Skills. Eugene: University of Oregon Center on Teaching and Learning.

Hasbrouck, J., & Tindal, G. (2006). Oral reading fluency norms: A valuable assessment tool for reading teachers. The Reading Teacher, 59(7), 636–644.

Menzies, H. M., Mahdavi, J. N., & Lewis, J. (in press). Early reading intervention: From research to practice. Remedial and Special Education.

This article was originally published in Perspectives on Language and Literacy, vol. 33, No. 2, Spring 2007, copyright by The International Dyslexia Association. Used with permission.
