
Treatment Integrity Strategies

Citation: States, J., Detrich, R. & Keyworth, R. (2017). Treatment Integrity Strategies. Oakland, CA: The Wing Institute. https://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.

Student achievement scores in the United States remain stagnant despite repeated attempts to reform the system. New initiatives promising hope arise, only to disappoint after being adopted, implemented, and quickly found wanting. The cycle of reform followed by failure has had a demoralizing effect on schools, making new reform efforts more problematic. These efforts frequently fail because implementing new practices is far more challenging than expected and requires greater attention to how initiatives are implemented (Fixsen, Blase, Duda, Naoom, & Van Dyke, 2010).

Inattention to treatment integrity is a primary cause of implementation failure. Treatment integrity is defined as the extent to which an intervention is executed as designed, and the accuracy and consistency with which it is implemented (Detrich, 2014; McIntyre, Gresham, DiGennaro, & Reed, 2007).


Figure 1: Benefit From Evidence-Based Practices Implemented With Integrity

Figure 1 shows the relationship between implementation of empirically supported interventions and treatment integrity. If an empirically supported intervention is implemented with a high degree of treatment integrity, there is a high probability of benefit to the student. If that same intervention is implemented poorly, the probability of benefit is low. If an intervention is implemented with high integrity but lacks empirical support, the probability of benefit is still low because the intervention itself is ineffective. This is similar to taking placebo pills in a medication study: even if the placebo is taken exactly as prescribed, it is not likely to produce a medically important benefit. Implementing an unsupported intervention poorly is not likely to produce benefit either, because both empirical support and high integrity are absent.

When innovations are not implemented as designed, it should not be a surprise that anticipated benefits are not forthcoming. This raises the possibility that the issue lies with the quality of implementation and not with the practice itself (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). Research suggests that practices are rarely implemented as designed (Hallfors & Godette, 2002). Despite solid evidence of a relationship between the effectiveness of a practice and treatment integrity, only in the past 20 years have systematic efforts focused on how practitioners can influence the quality of treatment integrity (Durlak & DuPre, 2008; Fixsen et al., 2010). Measuring treatment integrity in schools continues to be rare, but it is the necessary starting point for efforts to improve it. The question remains: What strategies can educators employ to increase the likelihood that practices will be implemented as designed?

Two Types of Treatment Integrity Strategies

Strategies designed to increase treatment integrity fall into two main categories: antecedent-based strategies and consequence-based strategies.

Antecedent-based strategies involve any setting event or environmental factor that occurs before the new practice is implemented and that is intended to increase treatment integrity. These strategies also include actions designed to eliminate or reduce the impact of setting events or environmental conditions that impede treatment integrity. An example of an antecedent-based strategy that may increase the chances of a new practice being implemented as designed is staff development. An example of an antecedent-based strategy used to reduce undesirable events or environmental conditions is action taken to mitigate staff opposition to the new practice.

Consequence-based strategies, on the other hand, follow the implementation of a new practice and are designed to increase or maintain high treatment integrity in the future. These strategies reinforce the implementation of the elements of a new practice, thus improving the chance the practice will produce future desired outcomes. Consequence-based strategies may also be used to eliminate or reduce the impact of events and environmental factors that may interfere with the successful execution of the new practice in the future. An example of a consequence-based strategy that may increase future integrity is positive feedback for faithfully implementing the new practice elements. An example of a consequence-based strategy to eliminate or reduce factors that interfere with the successful execution of the new practice is corrective feedback coupled with coaching.

Antecedent-based strategies

Antecedent efforts begin well before rolling out a new innovation. They start with actively engaging staff to obtain their buy-in, an essential step before implementing a change process. Rogers (2003) suggested that the adoption and implementation of new practices is a social process and concluded that innovations will be adopted and implemented to the extent that they

  1. are compatible with the beliefs, values, and previous experience of individuals within a social system.
  2. solve a problem for the teacher/staff.
  3. have a relative advantage over the current practice.
  4. gain the support of opinion leaders.

Assessment is a key antecedent strategy. An assessment must examine the readiness of the site implementing the practice, evaluate staff development needs, and appraise the resources available for implementing the practice. Conducting an initial assessment to gauge how closely the new practice will align with the culture of the school or classroom is especially important. Change does not come easily. Studies suggest that interventions that slightly modify existing routines and practices are more likely to succeed, while large-scale shifts are more likely to be rejected (Detrich, 2014). Too often, new practices are mandated from above without regard for any negative impact on the teachers who must implement them. Measuring the degree of contextual fit allows the school administrator to identify areas of resistance and recognize how to adapt practices to better match the current values and skills of staff. According to Horner, Blitz, and Ross (2014), “Contextual fit is the match between the strategies, procedures, or elements of an intervention and the values, needs, skills, and resources of those who implement and experience the intervention.”

An example of a model that values staff buy-in is Positive Behavioral Interventions and Supports (PBIS), a research-based, schoolwide systems approach created to improve school climate and to create safer and more effective schools. PBIS is currently in place in more than 23,000 schools throughout the United States. Consent is required from 80% of staff before the PBIS framework is introduced into a school, a threshold PBIS considers essential for establishing and maintaining the degree of treatment integrity necessary for effective implementation of the program’s practices (Horner, Sugai, & Anderson, 2010). Achieving this threshold of support significantly increases the probability of an innovation’s sustainable implementation. Increasing staff motivation does not ensure treatment integrity, but it does increase the chances of success.

Another proven strategy for increasing staff acceptance of change is to provide teachers with choices about which practices they believe are best suited for their setting (Detrich, 1999; Hoier, McConnell, & Pallay, 1987). For a detailed analysis of the strategies and procedures needed for effective adoption of new practices, visit the National Implementation Research Network website and the organization’s research synthesis on the topic (Fixsen et al., 2005).

After staff commitment has been achieved, personnel must be effectively trained in the skills and supporting procedures required for the new practice. Staff development is clearly deemed important, as evidenced by the American education system’s average annual expenditure of $18,000 per teacher (Jacob & McGovern, 2015). Unfortunately, research suggests that schools receive little return on the substantial time and resources they spend on teacher development (Garet et al., 2008). Research reveals that most staff development in schools consists of in-service or professional development workshops, yet studies find these sessions to be ineffective (Joyce & Showers, 2002).


Figure 2: The Influence of Coaching on Whether a New Skill Is Used in the Classroom

Joyce and Showers (2002) found that when teachers receive workshop training

  • 0% of the teachers transfer a new practice to the classroom after instruction in the theory.
  • 0% of the teachers transfer a new practice to the classroom after instruction in the theory and observing a demonstration of the practice.
  • 5% of the teachers transfer a new practice to the classroom after theory, demonstration, and rehearsal.
  • 95% of the teachers transfer a new practice to the classroom after theory, demonstration, rehearsal, and coaching.

Additionally, only coached teachers were able to adapt strategies and overcome obstacles that arose during implementation.

To reverse the dependence on workshops, training should be organized around coaching and the use of written manuals that clearly and objectively outline the performance required of the teacher (Kauffman, 2012; Knight, 2013). The most effective staff development, resulting in consistent implementation of a practice, involves working with actual students in a classroom rather than didactic presentations or workshop simulations (Reinke, Sprick, & Knight, 2009). Evidence suggests that ongoing coaching is necessary before teachers consistently use newly taught skills in the classroom, and it is fundamental if teachers are to sustain the degree of treatment integrity needed to maximize desired outcomes.

Consequence-based strategies 

Studies find that treatment integrity declines almost immediately after training. It appears that acquiring staff commitment and providing training are not sufficient to maintain treatment integrity (Duhon, Mesmer, Gregerson, & Witt, 2009; Noell et al., 2000). Avoiding this near-certain decline requires additional components, namely monitoring implementation and providing feedback to the teacher. Ongoing systematic observations offer the best opportunities for trainers to know what is working and what is not (Wasik & Hindman, 2011). The applicable data suggest that performance feedback is not only important but essential for establishing and maintaining treatment integrity. Various authors have evaluated the impact of performance feedback on teacher behavior (see Figure 3).


Figure 3: Effects of Feedback on Performance

Performance feedback, usually based on direct sampling of performance, is a requisite feature of any effort to achieve or maintain treatment integrity. It has been demonstrated to reduce the degradation of integrity and is useful as a means of reestablishing integrity after it has declined (Duhon et al., 2009; Witt, Noell, LaFleur, & Mortenson, 1997). Research suggests that daily feedback is most powerful, although weekly feedback does improve performance (Detrich, 2014). Follow-up meetings that take place after observations and include a review of specific data with staff are most effective. Individual feedback sessions following observations produce better outcomes than telephone calls, emails, or written handouts (Easton & Erchul, 2011). Feedback provides opportunities to deliver accurate information that is both positive and corrective. Annual or biannual feedback used for high-stakes formal evaluation is less effective than constructive feedback focused on skills improvement and delivered throughout the school year; although acceptable, principal feedback is significantly less preferred by teachers (Hill & Grossman, 2013). Teachers are most receptive to feedback that is nonthreatening and focuses on improving the skills and practices they are being taught. Moreover, comments perceived as direct criticism of the person have consistently been shown to produce poorer results (Kluger & DeNisi, 1996).

Conclusion

The increasing pressure on schools to improve student achievement has resulted in teachers being bombarded with a myriad of school reforms. As educators look for solutions to stagnant student performance, demand has increased for schools to embrace evidence-based practices. These practices, vetted using rigorous research methods, are intended to increase confidence in a causal relationship between a practice and student outcomes. Practices must be implemented as they were designed if the predicted outcomes are to be achieved. Educators increasingly embrace the perspective that innovations must be implemented with integrity (Detrich, 2014).

Implementation with high levels of treatment integrity does not come easily and does have costs. Research and practice-based evidence strongly suggest that successful implementation of reform is a complex process requiring the investment of resources, time, and money. Key components of effective implementation must address the antecedent- and consequence-based strategies shown to be necessary for sustainable implementation. Paramount is the commitment to measure treatment integrity on an ongoing basis. Principals need to effectively arrange contingencies so that teachers accept this notion, and teachers need to adopt and sustain strategies that support implementation of practices with treatment integrity. Failure to implement with integrity decreases the likelihood that new practices will produce meaningful results. Often, past reform efforts have ignored the issue of treatment integrity, resulting in viable reforms being abandoned when they might actually have worked if implemented with integrity.

Citations

Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608–620.

Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258–271.

Detrich, R. (2015). Treatment integrity: A wicked problem and some solutions. Missouri Association for Behavior Analysis 2015 Conference. http://winginstitute.org/2015-MissouriABA-Presentation-Ronnie-Detrich

Duhon, G. J., Mesmer, E. M., Gregerson, L., & Witt, J. C. (2009). Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance. Journal of School Psychology, 47(1), 19–37.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3–4), 327–350.

Easton, J. E., & Erchul, W. P. (2011). An exploration of teacher acceptability of treatment plan implementation: Monitoring and feedback methods. Journal of Educational and Psychological Consultation, 21(1), 56–77.

Fixsen, D. L., Blase, K. A., Duda, M., Naoom, S. F., & Van Dyke, M. (2010). Sustainability of evidence-based programs in education. Journal of Evidence-Based Practices for Schools, 11(1), 30–46.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature (FMHI Publication No. 231). Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, and the National Implementation Research Network.

Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., ... Zhu, P. (2008). The impact of two professional development interventions on early reading instruction and achievement. NCEE 2008-4030. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.

Hill, H., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371–384.

Hoier, T. S., McConnell, S., & Pallay, A. G. (1987). Observational assessment for planning and evaluating educational transitions: An initial analysis of template matching. Behavioral Assessment, 9(1), 5–19.

Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf

Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1.

Jacob, A., & McGovern, K. (2015). The mirage: Confronting the hard truth about our quest for teacher development. Brooklyn, NY: TNTP. https://tntp.org/assets/documents/TNTP-Mirage_2015.pdf.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: Association for Supervision and Curriculum Development (ASCD).

Kauffman, J. M. (2012). Science and the education of teachers. In Education at the Crossroads: The State of Teacher Preparation (Vol. 2, pp. 47–64). Oakland, CA: The Wing Institute.

Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254–284.

Knight, J. (2013). Focus on teaching: Using video for high-impact instruction (pp. 8–14). Thousand Oaks, CA: Corwin.

McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.

Noell, G. H., Witt, J. C., LaFleur, L. H., Mortenson, B. P., Ranier, D. D., & LeVelle, J. (2000). Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies. Journal of Applied Behavior Analysis, 33(3), 271–284.

Reinke, W. M., Sprick, R., & Knight, J. (2009). Coaching classroom management. In J. Knight (Ed.), Coaching: Approaches and perspectives (pp. 91–112). Thousand Oaks, CA: Corwin Press.

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Wasik, B. A., & Hindman, A. H. (2011). Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development. Journal of Educational Psychology, 103(2), 455.

Witt, J. C., Noell, G. H., LaFleur, L. H., & Mortenson, B. P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30(4), 693–696.


Publications

TITLE
SYNOPSIS
CITATION
Treatment Integrity: Fundamental to Education Reform

To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions, and (2) those interventions implemented with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.

Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.

Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform

Schools are often expected to implement innovative instructional programs. Most often, these initiatives fail because findings from implementation science are not considered as part of implementing the initiative. This chapter reviews the contributions implementation science can make to improving outcomes for students.

Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. In M. Murphy, S. Redding, & J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.

Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs.

Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.

Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.

Approaches to Increasing Treatment Integrity

Strategies designed to increase treatment integrity fall into two categories: antecedent-based strategies and consequence-based strategies.

Detrich, R., States, J., & Keyworth, R. (2017). Approaches to Increasing Treatment Integrity. Oakland, CA: The Wing Institute.

 

Treatment Integrity in the Problem Solving Process

The usual approach to determining if an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can only be understood if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.

Detrich, R., States, J., & Keyworth, R. (2017). Treatment Integrity in the Problem Solving Process. Oakland, CA: The Wing Institute.

 

Treatment Integrity Strategies Overview

Student achievement scores in the United States remain stagnant despite repeated attempts to reform the education system. New initiatives promising hope arise, only to disappoint after being adopted, implemented, and quickly found wanting. The cycle of reform followed by failure has had a demoralizing effect on schools, making new reform efforts problematic. These efforts frequently fail because implementing new practices is far more challenging than expected and requires greater attention to how initiatives are implemented. Treatment integrity is increasingly recognized as an essential component of effective implementation in an evidence-based education model that produces results, and inattention to treatment integrity is seen as a primary reason new initiatives fail. The question remains: What strategies can educators employ to increase the likelihood that practices are implemented as designed? The Wing Institute overview on the topic of Treatment Integrity Strategies examines the essential practice elements indispensable for maximizing treatment integrity.

 

States, J., Detrich, R. & Keyworth, R. (2017). Overview of Treatment Integrity Strategies. Oakland, CA: The Wing Institute. http://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.

 

Presentations

TITLE
SYNOPSIS
CITATION
Treatment Integrity and Program Fidelity: Necessary but Not Sufficient to Sustain Programs
If programs are to be sustained, they must be implemented with integrity. If there is drift over time, it raises the question of whether the program is being sustained or has been substantially changed.
Detrich, R. (2008). Treatment Integrity and Program Fidelity: Necessary but Not Sufficient to Sustain Programs [Powerpoint Slides]. Retrieved from 2008-aba-presentation-ronnie-detrich.
Treatment Integrity: A Fundamental Component of PBS
School-wide initiatives have to be well implemented if there is to be any benefit. This talk describes methods for assuring high levels of treatment integrity.
Detrich, R. (2008). Treatment Integrity: A Fundamental Component of PBS [Powerpoint Slides]. Retrieved from 2008-apbs-txint-presentation-ronnie-detrich.
Toward a Technology of Treatment Integrity
If research-supported interventions are to be effective, they must be implemented with integrity. This paper describes approaches to assuring high levels of treatment integrity.
Detrich, R. (2011). Toward a Technology of Treatment Integrity [Powerpoint Slides]. Retrieved from 2011-apbs-presentation-ronnie-detrich.
Treatment integrity: A wicked problem and some solutions

Presentation by the Wing Institute with four goals: make the case that treatment integrity monitoring is a necessary part of service delivery; describe dimensions of treatment integrity; suggest methods for increasing treatment integrity; and place treatment integrity within a systems framework.

Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance.

This book is compiled from the proceedings of the sixth summit entitled “Performance Feedback: Using Data to Improve Educator Performance.” The 2011 summit topic was selected to help answer the following question: What basic practice has the potential for the greatest impact on changing the behavior of students, teachers, and school administrative personnel?

States, J., Keyworth, R. & Detrich, R. (2013). Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance. In Education at the Crossroads: The State of Teacher Preparation (Vol. 3, pp. ix-xii). Oakland, CA: The Wing Institute.

 

 

Program integrity in primary and early secondary prevention: Are implementation effects out of control?

Dane and Schneider propose treatment integrity as a multi-dimensional construct and describe five dimensions that constitute the construct.

Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.

Increasing treatment fidelity by matching interventions to contextual variables within the educational setting

The impact of an intervention is influenced by how well it fits into the context of a classroom. This paper suggests a number of variables to consider and how they might be measured prior to the development of an intervention.

Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608-620.


Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform

Over the last fifty years, there have been many educational reform efforts, most of which have had a relatively short lifespan and failed to produce the promised results. One possible reason is that, for the most part, these innovations have been poorly implemented. In this chapter, the author proposes a data-based decision-making approach to assuring high-quality implementation.

Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. In M. Murphy, S. Redding, & J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.


 

Overview of Treatment Integrity

For the best chance of producing positive educational outcomes for all children, two conditions must be met: (a) adopting effective, empirically supported (evidence-based) practices and (b) implementing those practices with sufficient quality that they make a difference (treatment integrity).

Detrich, R., States, J., & Keyworth, R. (2017). Overview of Treatment Integrity. Oakland, CA: The Wing Institute.

Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance.

This study evaluated the impact of public feedback in RtI team meetings on the quality of implementation.  Feedback improved poor implementation and maintained high level implementation.

Duhon, G. J., Mesmer, E. M., Gregerson, L., & Witt, J. C. (2009). Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance. Journal of School Psychology, 47(1), 19-37.

Implementation Matters: A Review of Research on the Influence of Implementation on Program Outcomes and the Factors Affecting Implementation

The first purpose of this review is to assess the impact of implementation on program outcomes, and the second purpose is to identify factors affecting the implementation process.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327-350.

An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods.

This paper summarizes survey results about the acceptability of different methods for monitoring treatment integrity and performance feedback.

Easton, J. E., & Erchul, W. P. (2011). An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods. Journal of Educational & Psychological Consultation, 21(1), 56-77. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/10474412.2011.544949?journalCode=hepc20.

The impact of two professional development interventions on early reading instruction and achievement

To help states and districts make informed decisions about the PD they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher institute series that began in the summer and continued through much of the school year (treatment A) and (2) the same institute series plus in-school coaching (treatment B).

Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., ... Zhu, P. (2008). The impact of two professional development interventions on early reading instruction and achievement. NCEE 2008-4030. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Does the Match Matter? Exploring Whether Student Teaching Experiences Affect Teacher Effectiveness

This descriptive study examines the relationship between student teaching experiences and a teacher’s future effectiveness on the job. The primary finding is that teachers are more effective when the student demographics of their current schools are similar to the student demographics of the schools in which they did their student teaching. This study suggests that further experimental research be conducted to determine if the data hold up. If they do, the implication is that, in recruiting new teachers, school principals would be well served by choosing candidates whose student teaching experiences were in schools whose demographics match those of their own schools. Teacher preparation programs can also assist by assessing candidates’ preferences for where they plan on working and match student teaching placements to schools with similar demographics where new teachers are likely to be employed.


Goldhaber, D., Krieg, J. M., & Theobald, R. (2017). Does the match matter? Exploring whether student teaching experiences affect teacher effectiveness. American Educational Research Journal, 54(2), 325–359.

Assessment of Treatment Integrity in School Consultation and Prereferral Intervention.

Technical issues involved in the measurement of treatment integrity are discussed, including specification of treatment components, deviations from treatment protocols, amount of behavior change, and psychometric issues in assessing treatment integrity.

Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37-50.

Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation.

This study evaluated differences in estimates of treatment integrity obtained by measuring its different dimensions.

Hagermoser Sanetti, L. M., & Fallon, L. M. (2011). Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation. Journal of Educational & Psychological Consultation, 21(3), 209-232.

Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study

This study examines adoption and implementation of the U.S. Department of Education's new policy, the "Principles of Effectiveness," from a diffusion of innovations theoretical framework. The report evaluates adoption in relation to Principle 3: the requirement to select research-based programs.

Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.

Observational Assessment for Planning and Evaluating Educational Transitions: An Initial Analysis of Template Matching

The authors used a direct observation-based approach to identify behavioral conditions in sending (i.e., special education) and receiving (i.e., regular education) classrooms and to identify intervention targets that might facilitate mainstreaming of behavior-disordered (BD) children.

Hoier, T. S., McConnell, S., & Pallay, A. G. (1987). Observational assessment for planning and evaluating educational transitions: An initial analysis of template matching. Behavioral Assessment.

Examining the evidence base for school-wide positive behavior support.

The purpose of this manuscript is to propose core features that may apply to any practice, or set of practices, claiming to be evidence-based in relation to School-wide Positive Behavior Support (SWPBS).

Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1.


The importance of contextual fit when implementing evidence-based interventions.

“Contextual fit” is based on the premise that the match between an intervention and local context affects both the quality of intervention implementation and whether the intervention actually produces the desired outcomes for children and families.

Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf

Student Achievement through Staff Development

This book provides research as well as case studies of successful professional development strategies and practices for educators.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. ASCD.

The Effects of Feedback Interventions on Performance: A Historical Review, a Meta-Analysis, and a Preliminary Feedback Intervention Theory

The authors proposed a preliminary FI theory (FIT) and tested it with moderator analyses. The central assumption of FIT is that FIs change the locus of attention among 3 general and hierarchically organized levels of control: task learning, task motivation, and meta-tasks (including self-related) processes.

Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254.

Focus on teaching: Using video for high-impact instruction

This book examines the use of video recording to improve teacher performance, showing how every classroom can benefit from setting up a camera and hitting "record."

Knight, J. (2013). Focus on teaching: Using video for high-impact instruction. (Pages 8-14). Thousand Oaks, CA: Corwin.

Treatment integrity of school‐based interventions with children

This paper examines school-based experimental studies with individuals 0 to 18 years of age published between 1991 and 2005. Only 30% of the studies provided treatment integrity data, and nearly half (45%) were judged to be at high risk for treatment inaccuracies.

McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.

Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies.

This study compared two follow-up strategies for increasing the integrity of implementation: discussing implementation challenges and providing performance feedback. Performance feedback was more effective than discussion.

Noell, G. H., & Witt, J. C. (2000). Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies. Journal of Applied Behavior Analysis, 33(3), 271.

Diffusion of innovations

This book looks at how new ideas spread via communication channels over time. Such innovations are initially perceived as uncertain and even risky. To overcome this uncertainty, most people seek out others like themselves who have already adopted the new idea. Thus the diffusion process typically takes months or years. But there are exceptions: use of the Internet in the 1990s, for example, may have spread more rapidly than any other innovation in the history of humankind. 

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Extending Use of Direct Behavior Rating Beyond Student Assessment.

This paper reviews options for treatment integrity measurement emphasizing how direct behavior rating technology might be incorporated within a multi-tiered model of intervention delivery.

Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending Use of Direct Behavior Rating Beyond Student Assessment. Assessment for Effective Intervention, 34(4), 251-258. 

Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009

The authors reviewed all intervention studies published in the Journal of Positive Behavior Interventions between 1999 and 2009 to determine the percentage that reported a measure of treatment integrity. Slightly more than 40% did.

Sanetti, L. M. H., Dobey, L. M., & Gritter, K. L. (2012). Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009. Journal of Positive Behavior Interventions, 14(1), 29-46.

Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008

The authors reviewed four school psychology journals between 1995 and 2008 to estimate the percentage of intervention studies that reported some measure of treatment integrity. About 50% did.

Sanetti, L. M. H., Gritter, K. L., & Dobey, L. M. (2011). Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008. School Psychology Review, 40(1), 72-84.

Coaching Classroom Management: Strategies & Tools for Administrators & Coaches

This book is written for school administrators, staff developers, behavior specialists, and instructional coaches to offer guidance in implementing research-based practices that establish effective classroom management in schools. The book provides administrators with practical strategies to maximize the impact of professional development. 

Sprick et al. (2010). Coaching Classroom Management: Strategies & Tools for Administrators & Coaches. Pacific Northwest Publishing.

Treatment Integrity Strategies Overview

Inattention to treatment integrity is a primary factor in implementation failure. Treatment integrity is defined as the extent to which an intervention is executed as designed, including the accuracy and consistency with which it is implemented.

States, J., Detrich, R. & Keyworth, R. (2017). Treatment Integrity Strategies. Oakland, CA: The Wing Institute. https://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.

Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development

In a randomized controlled study, Head Start teachers were assigned either to an intervention group that received intensive, ongoing professional development (PD) or to a comparison group that received the "business as usual" PD provided by Head Start. The PD intervention provided teachers with conceptual knowledge and instructional strategies that support young children's development of vocabulary, alphabet knowledge, and phonological sensitivity.

Wasik, B. A., & Hindman, A. H. (2011). Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development. Journal of Educational Psychology, 103(2), 455.

Teacher use of interventions in general education settings: Measurement and analysis of the independent variable

This study evaluated the effects of performance feedback on increasing the quality of implementation of interventions by teachers in a public school setting.

Witt, J. C., Noell, G. H., LaFleur, L. H., & Mortenson, B. P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30(4), 693.
