By: Evelyn S. Johnson, Ed.D., Deborah R. Carter, Ph.D., and Juli Pool, Ph.D. | Published: August 30, 2010
The National Implementation Research Network writes extensively about the need for implementation studies to determine how research findings that establish the efficacy of a practice can move from a passive process approach of research to practice to an active process approach (Fixsen & Blase, 2009). Passive process approaches describe our traditional model of research to practice: researchers establish efficacy under tightly controlled conditions, publish the results, and expect practitioners to translate the findings into best practices in a school environment. Active process approaches include a focus on implementation research, including examination of the contexts in which research-based practices are successfully brought to scale.
Our 2-year Response to Intervention/Positive Behavior Supports (RTI/PBS) implementation project reflects an active process approach, in which our interests lie in examining a) the fidelity with which RTI and PBS can be implemented in a school setting, b) the impact of these implementation efforts on important student outcomes, and c) the social validity of the practices to the staff who are charged with implementation.
The first year of this 2-year implementation project ended in early June. Together with the Silver Sage Elementary staff, we recently reviewed our first-year project data in line with these three areas, and within this post we'll share some of the results across these three areas.
Fidelity of implementation. As we began this project, we focused on fidelity in two broad areas: a) fidelity of the individual components of the model (e.g., are teachers delivering instruction as intended?) and b) fidelity with which the system is implemented (e.g., do teachers use data to make decisions?). Our guiding questions included the following:
- Does the school have the resources, materials, and knowledge to implement the RTI/PBS model?
- What are the policy, funding, regulatory, accountability, and data supports necessary at the school and district level to promote implementation?
- Have policies and resource allocation changed? Do current policies support or detract from implementation?
- Given information about the implementation process, what recommendations exist for expanding implementation, allocating resources, and modifying the process?
- What are the necessary conditions that support implementation? What training and technical assistance is required? For whom?
To answer these questions we collected numerous types of data. The results we report in this post are not complete; rather, in this initial analysis we've highlighted select areas that might be of interest to other schools moving through similar processes.
First, through the collection and analysis of academic achievement, screening, and benchmark data, we discovered that the school needs to review its core mathematics program, especially the vertical alignment of the K–5 curriculum and instruction. Patterns of performance in student data across benchmark periods and various measures suggest that earlier grades require stronger instruction in math. Similarly, end-of-year data suggest that a review of the language arts curriculum should be a priority for the fall. In reading, the evidence collected suggests that the school has a strong, coherent framework for providing quality core instruction, one that is well supported by a system of assessment and intervention for students who require additional support to be successful.
Next, the school discovered that it requires district-level support for a variety of components, including adequate progress monitoring and screening tools in math and writing, as well as intervention programs in these areas. We worked on math interventions using the V-Math series published by Voyager and obtained good results. However, materials were provided by research staff, interventions were delivered by student teachers, and progress monitoring was not implemented with fidelity (student teachers did not routinely monitor progress, nor did they use progress-monitoring data to make instructional decisions). Though school staff understand the components of the RTI process, the logistics of carrying it out proved to trump adherence to the model. This is consistent with much of the implementation research on progress monitoring: even though teachers understand its importance, it can be difficult to implement in practice.
In the area of behavior, school staff ratings using tools from the PBIS Center, such as the School-wide Evaluation Tool (SET) and the Effective Behavior Support Survey (EBS), demonstrated good fidelity to implementation of a strong universal system of behavioral support. School staff maintained School-Wide Information System (SWIS) data, and the data (see our outcomes section below!) support strong fidelity of implementation.
In terms of systems, the school did have leadership teams that met routinely, but the process these teams followed in terms of focus and action planning was not always clear, especially for academic concerns. Data were not always reviewed, the team frequently spent the hour-long meeting "bird-walking" about a particular student's concerns, and communication from the team to the rest of the school staff was not always clear.
Finally, implementation at the school level involved strong coordination with the district. We discovered that ongoing dialogues about school-level versus district-level decisions are a necessary part of the implementation process. Negotiating which decisions are school level and which are district level will facilitate open communication and, ultimately, stronger buy-in from school staff. An example of a district-level decision is the selection of assessment tools to screen and benchmark. An example of a school-level decision is the selection of specific intervention materials that meet district guidelines but that also are responsive to the particular needs of the student population in that school. It may be possible that over time, what began as a school-level decision may require district-level review and vice versa, but these conversations do need to be a continued part of implementation and evaluation.
Student outcomes. A difference in student outcome data is often considered to be the "bottom line" of determining the ultimate success of a program. When outcome data show good improvement, staff can have more confidence that they have implemented the model correctly. When outcome data are equivocal, a review of fidelity and other factors is required. Is the lack of improvement due to an ineffective conceptualization of the model? Or to poor implementation? Or to other, unexplained factors that require further investigation?
In Year 1, Silver Sage saw a reduction in office discipline referrals from over 200 the prior year to 61 at the end of this year. The school met AYP targets in reading and math. Over the course of the year, there was an increase in the number of students who met benchmark targets in the areas of reading and math (i.e., more students met winter benchmarks than fall benchmarks, more students met spring benchmarks than winter benchmarks). For students who received intervention in reading and in math, more than 75% of them met performance targets on state assessments. The school did not meet AYP in the area of language arts, and this will be a focus for Year 2 implementation.
Overall, outcome data suggest that the implementation of the RTI/PBS model is having a good effect on important student performance metrics.
Social validity. In social services such as education, policy is often developed by people who are not directly charged with its implementation. Steven Maynard-Moody and Michael Musheno, in their book Cops, Teachers, Counselors: Stories From the Front Lines of Public Service, talk about the impact that "street-level workers" (in education, this means teachers!) can have on the success or failure of policy implementation. Street-level workers charged with implementing new policies can be a) workers, meaning they understand and accept their roles and responsibilities under the new policy and carry them out accordingly; b) shirkers, meaning they avoid meaningful changes related to the new policy, perhaps complying only in the strictest sense but never fully accepting their new charge; or c) saboteurs, meaning they actively seek to undermine the new policy.
As a part of this project, we conducted focus groups and collected data on different aspects of social validity. We found that most teachers fell into the first and second categories, and that a small minority fell into the third. Overall, we found only minor improvements in teacher acceptance of the RTI/PBS model, with most teachers understanding how the system can support student achievement but not necessarily agreeing that their school would be able to successfully and fully implement such a system. Additionally, teachers questioned the long-term adherence to the model, both at the school and district level, and wondered whether changes in leadership would lead to changes in policy and procedures, or whether the model was really here to stay. Despite strong efforts to explain the PBS model, some teachers still questioned the need to screen for behavior concerns and did not seem to buy in to the school-developed behavioral expectations.
A greater acceptance of the use of data was one notable area of improvement. By the end of the year, teachers understood not only the how but also the why of data use and seemed to understand that data are a tool to be used toward improving student outcomes, not a replacement for their professional judgment and expertise. Of the three areas of focus (fidelity, outcomes, social validity), we believe that an emphasis on improving the social validity of the model will have the greatest impact on improved fidelity and, ultimately, improved student outcomes.
Conclusions. Overall, the evidence collected to date supports the continued implementation of the RTI/PBS model at Silver Sage. As one teacher noted during a district-level meeting, after 5 years of trying to work with school staff to use data, she was excited to announce that Silver Sage is finally "rocking with RTI and PBS implementation." Our preliminary findings certainly support that conclusion, and we are excited to continue our efforts in the 2010–2011 school year.
References
Fixsen, D. L., & Blase, K. A. (2009, January). Implementation: The missing link between research and practice. NIRN Implementation Brief #1. Chapel Hill: University of North Carolina, FPG Child Development Institute, National Implementation Research Network.
Maynard-Moody, S., & Musheno, M. (2003). Cops, teachers, counselors: Stories from the front lines of public service. Ann Arbor: University of Michigan Press.