Education Drivers

Treatment Integrity

For the best chance of a positive impact on educational outcomes, two conditions must be met: (a) effective interventions must be adopted, and (b) those interventions must be implemented with sufficient quality (treatment integrity) to ensure benefit. To date, the emphasis in education has been on identifying effective interventions, with much less attention paid to implementing them. The research on the implementation of interventions is not encouraging: treatment integrity scores are often very low and, in practice, implementation is rarely assessed. If an intervention with a strong research base is not implemented with a high level of treatment integrity, then students do not actually experience the intervention and there is no reason to assume they will benefit from it. Under these circumstances, it is not possible to know whether poor outcomes are the result of an ineffective intervention or poor implementation of that intervention. Historically, treatment integrity has been defined as implementing an intervention as prescribed. More recently, it has been conceptualized as having multiple dimensions, among them dosage and adherence, which must be measured to ensure that implementation is occurring at adequate levels.

Treatment Integrity Overview 


 

For the best chance of producing positive educational outcomes for all children, two conditions must be met: (a) adopting effective, empirically supported (evidence-based) practices and (b) implementing those practices with sufficient quality that they make a difference (treatment integrity) (Detrich, 2014). Both are necessary; neither on its own is sufficient to produce positive outcomes. Figure 1 describes the relationship between empirically supported practices and treatment integrity.

        


Figure 1. Relationship between empirically supported practices and treatment integrity

 

If an intervention has strong empirical support and is implemented with high integrity, then there is a high probability that positive outcomes will be achieved (upper left quadrant). The other quadrants illustrate that the lack of either element reduces the probability of positive outcomes: a well-supported, research-based intervention that is not implemented with high integrity (lower left quadrant); an intervention that is implemented with high integrity but is not empirically supported (upper right quadrant); and an intervention that is neither empirically supported nor well implemented (lower right quadrant).

It should be noted that some interventions are not empirically supported because sufficient research has demonstrated that they do not produce positive outcomes. Alternatively, some interventions are not empirically supported because they have not yet been experimentally evaluated. This latter class of interventions is still in the experimental phase of development. Using these interventions is tantamount to conducting research, and all of the rules for engaging in research should be followed. Because these interventions are still experimental, their effectiveness is unknown, and they should be implemented only with the fully informed consent of both educators and parents.

The advent of the evidence-based practice movement in education has resulted in considerable effort to identify practices that are empirically supported. Organizations such as the What Works Clearinghouse (https://ies.ed.gov/ncee/wwc) and the Best Evidence Encyclopedia (http://www.bestevidence.org) have reviewed a large number of interventions to discern the research base for each. Until the past 20 years, treatment integrity did not receive significant scholarly attention. Even with this increased attention, treatment integrity measures are reported in only about half of published intervention research reports (Sanetti, Dobey, & Gritter, 2012; Sanetti, Gritter, & Dobey, 2011). When treatment integrity data are not reported, it is difficult to know whether the intervention the researchers described was actually implemented and was responsible for the outcomes, or whether some undocumented variation of the intervention actually accounted for them.

As important as treatment integrity is in research, it is equally important in practice settings. Measures of treatment integrity are fundamental to data-based decision making; effective interventions may be prematurely terminated if the level of treatment integrity is not known. Student performance data tell us only how well a student is responding to the intervention as it is implemented; treatment integrity measures tell us how well the intervention is being implemented. Without treatment integrity data, it is not possible to know whether a student's failure to benefit from an intervention is a function of an ineffective intervention or of ineffective implementation (see Treatment Integrity in the Problem-Solving Process for more discussion).

 

 

References

 

Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258–271.

Sanetti, L. M. H., Dobey, L. M., & Gritter, K. L. (2012). Treatment integrity of interventions with children in the Journal of Positive Behavior Interventions from 1999 to 2009. Journal of Positive Behavior Interventions, 14(1), 29–46.

Sanetti, L. M. H., Gritter, K. L., & Dobey, L. M. (2011). Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008. School Psychology Review, 40(1), 72–84.

 

 


 

 

Publications

TITLE
SYNOPSIS
CITATION
Treatment Integrity in the Problem-Solving Process Overview

Treatment integrity is a core component of data-based decision making (Detrich, 2013). The usual approach is to consider student data when making decisions about an intervention; however, if there are no data about how well the intervention was implemented, then meaningful judgments cannot be made about effectiveness.

Evidence-Based Practice in the Broader Context: How Can We Really Use Evidence to Inform Decisions?

This paper provides an overview of the considerations when introducing evidence-based services into established mental health systems.

Chorpita, B. F., & Starace, N. K. (2010). Evidence-Based Practice in the Broader Context: How Can We Really Use Evidence to Inform Decisions? Journal of Evidence-Based Practices for Schools, 11(1), 4-29.

Treatment Integrity: Fundamental to Education Reform

To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) those interventions implemented with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.

Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.

Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform

Schools are often expected to implement innovative instructional programs.  Most often these initiatives fail because what we know from implementation science is not considered as part of implementing the initiative.  This chapter reviews the contributions implementation science can make for improving outcomes for students.

Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. In M. Murphy, S. Redding, & J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.

Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs.

Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.

Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.

Approaches to Increasing Treatment Integrity

Strategies designed to increase treatment integrity fall into two categories: antecedent-based strategies and consequence-based strategies.

Detrich, R., States, J., & Keyworth, R. (2017). Approaches to Increasing Treatment Integrity. Oakland, CA: The Wing Institute.

 

Dimensions of Treatment Integrity Overview

Historically, treatment integrity has been defined as implementation of an intervention as planned (Gresham, 1989). More recently, treatment integrity has been reconceptualized as multidimensional (Dane & Schneider, 1998). In this conceptualization, four dimensions are relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness. These dimensions do not stand alone; they interact to determine the ultimate effectiveness of an intervention. Educators should assess all dimensions of treatment integrity to ensure that an intervention is being implemented as intended.

Detrich, R., States, J., & Keyworth, R. (2017). Dimensions of Treatment Integrity Overview. Oakland, CA: The Wing Institute.

 

Treatment Integrity in the Problem Solving Process

The usual approach to determining if an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can only be understood if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because, without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.

Detrich, R., States, J., & Keyworth, R. (2017). Treatment Integrity in the Problem Solving Process. Oakland, CA: The Wing Institute.

 

Overview of Treatment Integrity

For the best chance of a positive impact on educational outcomes, two conditions must be met: (a) effective interventions must be adopted, and (b) those interventions must be implemented with sufficient quality (treatment integrity) to ensure benefit. To date, the emphasis in education has been on identifying effective interventions, with much less attention paid to implementing them. The research on the implementation of interventions is not encouraging: treatment integrity scores are often very low and, in practice, implementation is rarely assessed. If an intervention with a strong research base is not implemented with a high level of treatment integrity, then students do not actually experience the intervention and there is no reason to assume they will benefit from it. Under these circumstances, it is not possible to know whether poor outcomes are the result of an ineffective intervention or poor implementation of that intervention. Historically, treatment integrity has been defined as implementing an intervention as prescribed. More recently, it has been conceptualized as having multiple dimensions, among them dosage and adherence, which must be measured to ensure that implementation is occurring at adequate levels.

Detrich, R., States, J., & Keyworth, R. (2017). Overview of Treatment Integrity. Oakland, CA: The Wing Institute.

Sustainability of evidence-based programs in education

This paper discusses common elements of successfully sustaining effective practices across a variety of disciplines.

Fixsen, D. L., Blase, K. A., Duda, M., Naoom, S. F., & Van Dyke, M. (2010). Sustainability of evidence-based programs in education. Journal of Evidence-Based Practices for Schools, 11(1), 30-46.

Roles and responsibilities of researchers and practitioners for translating research to practice

This paper outlines the best practices for researchers and practitioners translating research to practice as well as recommendations for improving the process.

Shriver, M. D. (2007). Roles and responsibilities of researchers and practitioners for translating research to practice. Journal of Evidence-Based Practices for Schools, 8(1), 1-30.

Treatment Integrity Strategies Overview

Student achievement scores in the United States remain stagnant despite repeated attempts to reform the education system. New initiatives promising hope arise, only to disappoint after being adopted, implemented, and quickly found wanting. The cycle of reform followed by failure has had a demoralizing effect on schools, making new reform efforts problematic. These efforts frequently fail because implementing new practices is far more challenging than expected and requires that greater attention be paid to how initiatives are implemented. Treatment integrity is increasingly recognized as an essential component of effective implementation in an evidence-based education model that produces results, and inattention to treatment integrity is seen as a primary reason new initiatives fail. The question remains: What strategies can educators employ to increase the likelihood that practices are implemented as designed? The Wing Institute overview on the topic of Treatment Integrity Strategies examines the essential practice elements indispensable for maximizing treatment integrity.

 

States, J., Detrich, R. & Keyworth, R. (2017). Overview of Treatment Integrity Strategies. Oakland, CA: The Wing Institute. http://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.

 

Data Mining

TITLE
SYNOPSIS
CITATION
How does performance feedback affect the way teachers carry out interventions?
This analysis examined the impact of performance feedback on the quality of implementation of interventions.
Detrich, R. (2015). How does performance feedback affect the way teachers carry out interventions? Retrieved from how-does-performance-feedback.
How long do reform initiatives last in public schools?
This analysis examined sustainability of education reforms in education.
Detrich, R. (2015). How long do reform initiatives last in public schools? Retrieved from how-long-do-reform.
How often are treatment integrity measures reported in published research?
This analysis examined the frequency that treatment integrity is reported in studies of research-based interventions.
Detrich, R. (2015). How often are treatment integrity measures reported in published research? Retrieved from how-often-are-treatment.

 

Presentations

TITLE
SYNOPSIS
CITATION
Roles and Responsibilities of Researchers and Practitioners Translating Research to Practice

This paper outlines the best practices for researchers and practitioners translating research to practice as well as recommendations for improving the process.

Shriver, M. (2006). Roles and Responsibilities of Researchers and Practitioners Translating Research to Practice [Powerpoint Slides]. Retrieved from 2006-wing-presentation-mark-shriver.

The Role of Data-based Decision Making

This paper identifies the critical features required for developing and maintaining a systemic data-based decision making model aligned at all levels, beginning with an individual student in the classroom and culminating with policy makers.

States, J. (2009). The Role of Data-based Decision Making [Powerpoint Slides]. Retrieved from 2009-campbell-presentation-jack-states.

The Role of Data-based Decision Making

This presentation examines the impact of data-based decision making in schools. It looks at the critical role data plays in building an evidence-based model for education that relies on monitoring for the reliable implementation of practices.

States, J. (2010). The Role of Data-based Decision Making [Powerpoint Slides]. Retrieved from 2010-hice-presentation-jack-states.

What the Data Tell Us

This paper offers an overview of issues practitioners must consider in selecting practices. Types of evidence, sources of evidence, and the role of professional judgment are discussed as cornerstones of effective evidence-based decision making.

States, J. (2010). What the Data Tell Us [Powerpoint Slides]. Retrieved from 2010-capses-presentation-jack-states.

Evolution of the Revolution: How Can Evidence-based Practice Work in the Real World?
This paper provides an overview of the considerations when introducing evidence-based services into established mental health systems.
Chorpita, B. (2008). Evolution of the Revolution: How Can Evidence-based Practice Work in the Real World? [Powerpoint Slides]. Retrieved from 2008-wing-presentation-bruce-chorpita.
Treatment Integrity and Program Fidelity: Necessary but Not Sufficient to Sustain Programs
If programs are to be sustained, they must be implemented with integrity. If there is drift over time, it raises questions about whether the program is being sustained or has been substantially changed.
Detrich, R. (2008). Treatment Integrity and Program Fidelity: Necessary but Not Sufficient to Sustain Programs [Powerpoint Slides]. Retrieved from 2008-aba-presentation-ronnie-detrich.
The Four Assumptions of the Apocalypse
This paper examines the four basic assumptions for effective data-based decision making in education and offers strategies for addressing problem areas.
Detrich, R. (2009). The Four Assumptions of the Apocalypse [Powerpoint Slides]. Retrieved from 2009-wing-presentation-ronnie-detrich.
Toward a Technology of Treatment Integrity
If research-supported interventions are to be effective, it is necessary that they be implemented with integrity. This paper describes approaches to assuring high levels of treatment integrity.
Detrich, R. (2011). Toward a Technology of Treatment Integrity [Powerpoint Slides]. Retrieved from 2011-apbs-presentation-ronnie-detrich.
Treatment Integrity: Necessary but Not Sufficient for Improving Outcomes
Treatment integrity is necessary to improve outcomes but it is not sufficient. It is also necessary to implement scientifically supported interventions.
Detrich, R. (2015). Treatment Integrity: Necessary but Not Sufficient for Improving Outcomes [Powerpoint Slides]. Retrieved from 2015-ebpindisabilities-txint-presentation-ronnie-detrich.

 

Student Research

TITLE
SYNOPSIS
CITATION
Effects of a problem solving team intervention on the problem-solving process: Improving concept knowledge, implementation integrity, and student outcomes.

This study evaluated the effects of a problem solving intervention package that included problem-solving information, performance feedback, and coaching in a student intervention planning protocol.

Vaccarello, C. A. (2011). Effects of a problem solving team intervention on the problem-solving process: Improving concept knowledge, implementation integrity, and student outcomes. Retrieved from student-research-2011.

Transporting an evidence-based school engagement intervention to practice: Outcomes and barriers to implementation.
Check and Connect is an intervention designed to increase student engagement in school. This study was a transportability study that evaluated the impact of Check and Connect when implemented by school personnel rather than researchers.
Pankow, C. (2009). Transporting an evidence-based school engagement intervention to practice: Outcomes and barriers to implementation. Retrieved from student-research-2009-a.
TITLE
SYNOPSIS
CITATION
Treatment Integrity in the Problem-Solving Process Overview

Treatment integrity is a core component of data-based decision making (Detrich, 2013). The usual approach is to consider student data when making decisions about an intervention; however, if there are no data about how well the intervention was implemented, then meaningful judgments cannot be made about effectiveness.

Treatment integrity: A wicked problem and some solutions

Presentation by the Wing Institute with the following goals: make the case that treatment integrity monitoring is a necessary part of service delivery; describe the dimensions of treatment integrity; suggest methods for increasing treatment integrity; and place treatment integrity within a systems framework.

Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance.

This book is compiled from the proceedings of the sixth summit entitled “Performance Feedback: Using Data to Improve Educator Performance.” The 2011 summit topic was selected to help answer the following question: What basic practice has the potential for the greatest impact on changing the behavior of students, teachers, and school administrative personnel?

States, J., Keyworth, R. & Detrich, R. (2013). Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance. In Education at the Crossroads: The State of Teacher Preparation (Vol. 3, pp. ix-xii). Oakland, CA: The Wing Institute.

 

 

Increasing pre-service teachers’ use of differential reinforcement: Effects of performance feedback on consequences for student behavior

Significant dollars are spent each school year on professional development programs to improve teachers’ effectiveness. This study assessed the integrity with which pre-service teachers used a differential reinforcement of alternate behavior (DRA) strategy taught to them during their student teaching experience.

Auld, R. G., Belfiore, P. J., & Scheeler, M. C. (2010). Increasing pre-service teachers’ use of differential reinforcement: Effects of performance feedback on consequences for student behavior. Journal of Behavioral Education, 19(2), 169-183.

The Use of E-Mail to Deliver Performance-Based Feedback to Early Childhood Practitioners.

This paper evaluates the effects of performance feedback as part of professional development across three studies.

Barton, E. E., Pribble, L., & Chen, C.-I. (2013). The Use of E-Mail to Deliver Performance-Based Feedback to Early Childhood Practitioners. Journal of Early Intervention, 35(3), 270-297.

Putting the pieces together: An Integrated Model of program implementation

One of the primary goals of implementation science is to ensure that programs are implemented with integrity. This paper presents an integrated model of implementation that emphasizes treatment integrity.

Berkel, C., Mauricio, A. M., Schoenfelder, E., Sandler, I. N., & Collier-Meek, M. (2011). Putting the pieces together: An Integrated Model of program implementation. Prevention Science, 12, 23-33.

Graphical Feedback to Increase Teachers' Use of Incidental Teaching.

Incidental teaching is often a component of early childhood intervention programs. This study evaluated the use of graphical feedback to increase the use of incidental teaching.

Casey, A. M., & McWilliam, R. A. (2008). Graphical Feedback to Increase Teachers’ Use of Incidental Teaching. Journal of Early Intervention, 30(3), 251-268. 

The impact of checklist-based training on teachers' use of the zone defense schedule.

One of the challenges for increasing treatment integrity is finding effective methods for doing so.  This study evaluated the use of checklist-based training to increase treatment integrity.

Casey, A. M., & McWilliam, R. A. (2011). The impact of checklist-based training on teachers’ use of the zone defense schedule. Journal of Applied Behavior Analysis, 44(2), 397-401. 

Using Performance Feedback To Decrease Classroom Transition Time And Examine Collateral Effects On Academic Engagement.

This study evaluated the use of performance feedback to decrease classroom transition time and examined collateral effects on students' academic engagement.

Codding, R. S., & Smyth, C. A. (2008). Using Performance Feedback To Decrease Classroom Transition Time And Examine Collateral Effects On Academic Engagement. Journal of Educational & Psychological Consultation, 18(4), 325-345.

Effects of Immediate Performance Feedback on Implementation of Behavior Support Plans.

This study investigated the effects of performance feedback to increase treatment integrity.

Codding, R. S., Feinberg, A. B., & Dunn, E. K. (2005). Effects of Immediate Performance Feedback on Implementation of Behavior Support Plans. Journal of Applied Behavior Analysis, 38(2), 205-219. 

 

Using Performance Feedback to Improve Treatment Integrity of Classwide Behavior Plans: An Investigation of Observer Reactivity

This study evaluated the effects of performance feedback on increasing treatment integrity. It also evaluated possible reactivity effects of being observed.

Codding, R. S., Livanis, A., Pace, G. M., & Vaca, L. (2008). Using Performance Feedback to Improve Treatment Integrity of Classwide Behavior Plans: An Investigation of Observer Reactivity. Journal of Applied Behavior Analysis, 41(3), 417-422. 

 

Barriers to Implementing Classroom Management and Behavior Support Plans: An Exploratory Investigation.

This study examines obstacles encountered by 33 educators along with suggested interventions to overcome impediments to effective delivery of classroom management interventions or behavior support plans. Having the right classroom management plan isn’t enough if you can’t deliver the strategies to the students in the classroom.

Collier-Meek, M. A., Sanetti, L. M., & Boyle, A. M. (2019). Barriers to implementing classroom management and behavior support plans: An exploratory investigation. Psychology in the Schools, 56(1), 5-17.

The effects of video modeling on staff implementation of a problem-solving intervention with adults with developmental disabilities.

This study evaluated the effects of video modeling on staff implementation of a problem solving intervention.

Collins, S., Higbee, T. S., & Salzberg, C. L. (2009). The effects of video modeling on staff implementation of a problem-solving intervention with adults with developmental disabilities. Journal of applied behavior analysis, 42(4), 849-854.

Program integrity in primary and early secondary prevention: Are implementation effects out of control?

Dane and Schneider propose treatment integrity as a multi-dimensional construct and describe five dimensions that constitute the construct.

Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23-45.

Test Driving Interventions to Increase Treatment Integrity and Student Outcomes.

This study evaluated the effects of allowing teachers to “test drive” interventions and then select the intervention they most preferred.  The result was an increase in treatment integrity.

Dart, E. H., Cook, C. R., Collins, T. A., Gresham, F. M., & Chenier, J. S. (2012). Test Driving Interventions to Increase Treatment Integrity and Student Outcomes. School Psychology Review, 41(4), 467-481.

Increasing treatment fidelity by matching interventions to contextual variables within the educational setting

The impact of an intervention is influenced by how well it fits into the context of a classroom. This paper suggests a number of variables to consider and how they might be measured prior to the development of an intervention.

Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608-620.

Treatment Integrity: Fundamental to Education Reform

To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) those interventions implemented with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.

Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.

Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform

Over the last fifty years, there have been many educational reform efforts, most of which have had a relatively short lifespan and failed to produce the promised results. One possible reason is that, for the most part, these innovations have been poorly implemented. In this chapter, the author proposes a data-based decision making approach to assuring high-quality implementation.

Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. In M. Murphy, S. Redding, & J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.

Dimensions of Treatment Integrity Overview

In this conceptualization of treatment integrity, there are four dimensions relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness.  It is important to understand that these dimensions do not stand alone but rather interact to impact the ultimate effectiveness of an intervention. 

Detrich, R., States, J., & Keyworth, R. (2017). Dimensions of Treatment Integrity Overview. Oakland, CA: The Wing Institute.

 

Treatment Integrity in the Problem Solving Process

The usual approach to determining if an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can only be understood if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.

Detrich, R., States, J., & Keyworth, R. (2017). Treatment Integrity in the Problem Solving Process. Oakland, CA: The Wing Institute.

 

Overview of Treatment Integrity

For the best chance of producing positive educational outcomes for all children, two conditions must be met: (a) adopting effective, empirically supported (evidence-based) practices and (b) implementing those practices with sufficient quality that they make a difference (treatment integrity).

Detrich, R., States, J., & Keyworth, R. (2017). Overview of Treatment Integrity. Oakland, CA: The Wing Institute.

A comparison of performance feedback procedures on teachers' treatment implementation integrity and students' inappropriate behavior in special education classrooms.

This study compared the effects of goal setting and feedback about student performance with a package of daily written feedback about student performance, feedback about accuracy of implementation, and cancellation of meetings when an integrity criterion was met.

DiGennaro, F. D., Martens, B. K., & Kleinmann, A. E. (2007). A comparison of performance feedback procedures on teachers' treatment implementation integrity and students' inappropriate behavior in special education classrooms. Journal of Applied Behavior Analysis, 40(3), 447-461. 

 

Increasing Treatment Integrity Through Negative Reinforcement: Effects on Teacher and Student Behavior

This study evaluated the impact of allowing teachers to miss coaching meetings if their treatment integrity scores met or exceeded a criterion.

DiGennaro, F. D., Martens, B. K., & McIntyre, L. L. (2005). Increasing Treatment Integrity Through Negative Reinforcement: Effects on Teacher and Student Behavior. School Psychology Review, 34(2), 220-231.

Effects of video modeling on treatment integrity of behavioral interventions.

This study evaluated the effects of video modeling on how well teachers implemented interventions. There was an increase in integrity, but it remained variable. More stable patterns of implementation were observed when teachers were given feedback about their performance.

Digennaro-Reed, F. D., Codding, R., Catania, C. N., & Maguire, H. (2010). Effects of video modeling on treatment integrity of behavioral interventions. Journal of Applied Behavior Analysis, 43(2), 291-295. 

 

Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance.

This study evaluated the impact of public feedback in RtI team meetings on the quality of implementation. Feedback improved poor implementation and maintained high levels of implementation.

Duhon, G. J., Mesmer, E. M., Gregerson, L., & Witt, J. C. (2009). Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance. Journal of School Psychology, 47(1), 19-37.

Implementation Matters: A Review of Research on the Influence of Implementation on Program Outcomes and the Factors Affecting Implementation

The first purpose of this review is to assess the impact of implementation on program outcomes, and the second purpose is to identify factors affecting the implementation process.

Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327-350.

An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods.

This paper summarizes survey results about the acceptability of different methods for monitoring treatment integrity and performance feedback.

Easton, J. E., & Erchul, W. P. (2011). An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods. Journal of Educational & Psychological Consultation, 21(1), 56-77. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/10474412.2011.544949?journalCode=hepc20.

Sustainability of evidence-based programs in education

This paper discusses common elements of successfully sustaining effective practices across a variety of disciplines.

Fixsen, D. L., Blase, K. A., Duda, M., Naoom, S. F., & Van Dyke, M. (2010). Sustainability of evidence-based programs in education. Journal of Evidence-Based Practices for Schools, 11(1), 30-46.

Implementation Research: A Synthesis of the Literature

This is a comprehensive literature review of the topic of Implementation examining all stages beginning with adoption and ending with sustainability.

Fixsen, D. L., Naoom, S. F., Blase, K. A., & Friedman, R. M. (2005). Implementation research: A synthesis of the literature.

Sustaining fidelity following the nationwide PMTO™ implementation in Norway

This paper describes the scaling up and dissemination of a parent training program in Norway while maintaining fidelity of implementation.

Forgatch, M. S., & DeGarmo, D. S. (2011). Sustaining fidelity following the nationwide PMTO™ implementation in Norway. Prevention Science, 12(3), 235-246. Retrieved from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3153633

 

Establishing Treatment Fidelity in Evidence-Based Parent Training Programs for Externalizing Disorders in Children and Adolescents.
This review evaluated methods for improving treatment integrity based on the National Institutes of Health Treatment Fidelity Workgroup.  Strategies related to treatment design produced the highest levels of treatment integrity.  Training and enactment of treatment skills resulted in the lowest level of treatment integrity across the 65 reviewed studies.

Garbacz, L., Brown, D., Spee, G., Polo, A., & Budd, K. (2014). Establishing Treatment Fidelity in Evidence-Based Parent Training Programs for Externalizing Disorders in Children and Adolescents. Clinical Child & Family Psychology Review, 17(3).

The impact of two professional development interventions on early reading instruction and achievement

To help states and districts make informed decisions about the PD they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher institute series that began in the summer and continued through much of the school year (treatment A) and (2) the same institute series plus in-school coaching (treatment B).

Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., ... Zhu, P. (2008). The impact of two professional development interventions on early reading instruction and achievement. NCEE 2008-4030. Washington, DC: National Center for Education Evaluation and Regional Assistance.

Supporting teacher use of interventions: effects of response dependent performance feedback on teacher implementation of a math intervention

This article evaluated the effects of response-dependent feedback on accurate implementation of an intervention. Whenever teachers failed to meet a 100% accuracy criterion, they were given feedback about their performance.

Gilbertson, D., Witt, J., Singletary, L., & VanDerHeyden, A. (2007). Supporting teacher use of interventions: effects of response dependent performance feedback on teacher implementation of a math intervention. Journal of Behavioral Education, 16(4), 311-326.

Strategies for Improving Treatment Integrity in Organizational Consultation

Organizations house many individuals, many of whom are responsible for implementing the same practice. If organizations are to meet their goals, it is important that they have systems for assuring high levels of treatment integrity.

Gottfredson, D. C. (1993). Strategies for Improving Treatment Integrity in Organizational Consultation. Journal of Educational & Psychological Consultation, 4(3), 275. 

Assessment of Treatment Integrity in School Consultation and Prereferral Intervention.

Technical issues involved in the measurement of treatment integrity (specification of treatment components, deviations from treatment protocols and amount of behavior change, and psychometric issues in assessing treatment integrity) are discussed.

Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37-50.

Treatment integrity in applied behavior analysis with children.

This study reviewed all intervention studies published in the Journal of Applied Behavior Analysis between 1980 and 1990 in which children were the subjects. The authors found that treatment integrity was reported in only 16% of the studies.

Gresham, F. M., Gansle, K. A., & Noell, G. H. (1993). Treatment integrity in applied behavior analysis with children. Journal of Applied Behavior Analysis, 26(2), 257-263.

Treatment integrity in learning disabilities intervention research: Do we really know how treatments are implemented?

The authors reviewed three learning disabilities journals between 1995 and 1999 to determine what percentage of intervention studies reported measures of treatment integrity. Only 18.5% reported treatment integrity measures.

Gresham, F. M., MacMillan, D. L., Beebe-Frankenberger, M. E., & Bocian, K. M. (2000). Treatment integrity in learning disabilities intervention research: Do we really know how treatments are implemented? Learning Disabilities Research & Practice, 15(4), 198-205.

Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study

This study examines adoption and implementation of the US Department of Education's new policy, the 'Principles of Effectiveness', from a diffusion of innovations theoretical framework. In this report, we evaluate adoption in relation to Principle 3: the requirement to select research-based programs.

Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.

Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems

This article discusses the current focus on using teacher observation instruments as part of new teacher evaluation systems being considered and implemented by states and districts. 

Hill, H., & Grossman, P. (2013). Learning from teacher observations: Challenges and opportunities posed by new teacher evaluation systems. Harvard Educational Review, 83(2), 371-384.

Observational Assessment for Planning and Evaluating Educational Transitions: An Initial Analysis of Template Matching

Used a direct observation-based approach to identify behavioral conditions in sending (i.e., special education) and in receiving (i.e., regular education) classrooms and to identify targets for intervention that might facilitate mainstreaming of behavior-disordered (BD) children.

Hoier, T. S., McConnell, S., & Pallay, A. G. (1987). Observational assessment for planning and evaluating educational transitions: An initial analysis of template matching. Behavioral Assessment.

Criteria for Evaluating Treatment Guidelines

This document presents a set of criteria to be used in evaluating treatment guidelines that have been promulgated by health care organizations, government agencies, professional associations, or other entities. The purpose of treatment guidelines is to educate health care professionals and health care systems about the most effective treatments available.

Hollon, D., Miller, I. J., & Robinson, E. (2002). Criteria for evaluating treatment guidelines. American Psychologist, 57(12), 1052-1059.

Examining the evidence base for school-wide positive behavior support.

The purposes of this manuscript are to propose core features that may apply to any practice or set of practices that proposes to be evidence-based in relation to School-wide Positive Behavior Support (SWPBS). 

Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1.

 

The importance of contextual fit when implementing evidence-based interventions.

“Contextual fit” is based on the premise that the match between an intervention and local context affects both the quality of intervention implementation and whether the intervention actually produces the desired outcomes for children and families.

Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf

The mirage: Confronting the hard truth about our quest for teacher development

This piece describes the widely held perception among education leaders that we already  know how to help teachers improve, and that we could achieve our goal of great teaching in far more classrooms if we just applied what we know more widely. 

Jacob, A., & McGovern, K. (2015). The mirage: Confronting the hard truth about our quest for teacher development. Brooklyn, NY: TNTP. https://tntp.org/assets/documents/TNTP-Mirage_2015.pdf.

 

Student Achievement through Staff Development

This book provides research as well as case studies of successful professional development strategies and practices for educators.

Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. ASCD.

The Effects of Feedback Interventions on Performance: A Historical Review, a Meta-Analysis, and a Preliminary Feedback Intervention Theory

The authors proposed a preliminary feedback intervention theory (FIT) and tested it with moderator analyses. The central assumption of FIT is that feedback interventions change the locus of attention among three general and hierarchically organized levels of control: task learning, task motivation, and meta-task (including self-related) processes.

Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254.

Focus on teaching: Using video for high-impact instruction

This book examines the use of video recording to improve teacher performance. It shows how every classroom can easily benefit from setting up a camera and hitting "record".

Knight, J. (2013). Focus on teaching: Using video for high-impact instruction. (Pages 8-14). Thousand Oaks, CA: Corwin.

Treatment integrity of school‐based interventions with children

This paper examines school-based experimental studies with individuals 0 to 18 years old published between 1991 and 2005. Only 30% of the studies provided treatment integrity data, and nearly half (45%) were judged to be at high risk for treatment inaccuracies.

McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.

Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies.

This study compared the effects of discussing issues of implementation challenges and performance feedback on increasing the integrity of implementation. Performance feedback was more effective than discussion in increasing integrity.

Noell, G. H., & Witt, J. C. (2000). Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies. Journal of Applied Behavior Analysis, 33(3), 271.

Does treatment integrity matter? A preliminary investigation of instructional implementation and mathematics performance

This study examined the impact of three levels of treatment integrity on students' responding on mathematics tasks.

Noell, G. H., Gresham, F. M., & Gansle, K. A. (2002). Does treatment integrity matter? A preliminary investigation of instructional implementation and mathematics performance. Journal of Behavioral Education, 11(1), 51-67.

Using Coaching to Support Teacher Implementation of Classroom-based Interventions.

This study evaluated the impact of coaching on the implementation of an intervention. Coaching with higher rates of performance feedback resulted in the highest levels of treatment integrity.

Reinke, W., Stormont, M., Herman, K., & Newcomer, L. (2014). Using Coaching to Support Teacher Implementation of Classroom-based Interventions. Journal of Behavioral Education, 23(1), 150-167.

Diffusion of innovations

This book looks at how new ideas spread via communication channels over time. Such innovations are initially perceived as uncertain and even risky. To overcome this uncertainty, most people seek out others like themselves who have already adopted the new idea. Thus the diffusion process typically takes months or years. But there are exceptions: use of the Internet in the 1990s, for example, may have spread more rapidly than any other innovation in the history of humankind. 

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Extending Use of Direct Behavior Rating Beyond Student Assessment.

This paper reviews options for treatment integrity measurement emphasizing how direct behavior rating technology might be incorporated within a multi-tiered model of intervention delivery.

Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending Use of Direct Behavior Rating Beyond Student Assessment. Assessment for Effective Intervention, 34(4), 251-258. 

Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009

The authors reviewed all intervention studies published in the Journal of Positive Behavior Interventions between 1999-2009 to determine the percent of those studies that reported a measure of treatment integrity. Slightly more than 40% reported a measure of treatment integrity.

Sanetti, L. M. H., Dobey, L. M., & Gritter, K. L. (2012). Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009. Journal of Positive Behavior Interventions, 14(1), 29-46.

Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008

The authors reviewed four school psychology journals between 1995-2008 to estimate the percent of intervention studies that reported some measure of treatment integrity. About 50% reported a measure of treatment integrity.

Sanetti, L. M. H., Gritter, K. L., & Dobey, L. M. (2011). Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008. School Psychology Review, 40(1), 72-84.

The effect of performance feedback on teachers’ treatment integrity: A meta-analysis of the single-case literature.

This study extracted and aggregated data from single-case studies that used performance feedback (PF) in school settings to increase teachers' use of classroom-based interventions.

Solomon, B. G., Klein, S. A., & Politylo, B. C. (2012). The effect of performance feedback on teachers' treatment integrity: A meta-analysis of the single-case literature. School Psychology Review, 41(2).

Coaching Classroom Management: Strategies & Tools for Administrators & Coaches

This book is written for school administrators, staff developers, behavior specialists, and instructional coaches to offer guidance in implementing research-based practices that establish effective classroom management in schools. The book provides administrators with practical strategies to maximize the impact of professional development. 

Sprick, et al. (2010). Coaching Classroom Management: Strategies & Tools for Administrators & Coaches. Pacific Northwest Publishing.

Treatment Integrity Strategies Overview

Inattention to treatment integrity is a primary factor in the failure of implementation efforts. Treatment integrity is defined as the extent to which an intervention is executed as designed and the accuracy and consistency with which the intervention is implemented.

States, J., Detrich, R. & Keyworth, R. (2017). Treatment Integrity Strategies. Oakland, CA: The Wing Institute. https://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.

Isolating the effects of active responding in computer‐based instruction

This experiment evaluated the effects of requiring overt answer construction in computer-based programmed instruction using an alternating treatments design.

Tudor, R. M. (1995). Isolating the effects of active responding in computer-based instruction. Journal of Applied Behavior Analysis, 28(3), 343-344.

Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development

In a randomized control study, Head Start teachers were assigned to either an intervention group that received intensive, ongoing professional development (PD) or to a comparison group that received the "business as usual" PD provided by Head Start. The PD intervention provided teachers with conceptual knowledge and instructional strategies that support young children's development of vocabulary, alphabet knowledge, and phonological sensitivity.

Wasik, B. A., & Hindman, A. H. (2011). Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development. Journal of Educational Psychology, 103(2), 455.

Teacher use of interventions in general education settings: Measurement and analysis of the independent variable

This study evaluated the effects of performance feedback on increasing the quality of implementation of interventions by teachers in a public school setting.

Witt, J. C., Noell, G. H., LaFleur, L. H., & Mortenson, B. P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30(4), 693.

TITLE
SYNOPSIS
Cambridge Center for Behavioral Studies
The mission of the organization is to advance the scientific study of behavior and its humane application to the solution of practical problems in the home, school, community, and workplace.
Center for Research and Reform in Education (CRRE)
CRRE is a research center whose major goal is to improve the quality of education through high-quality research and evaluation studies and the dissemination of evidence-based research.
Cochrane Collaboration
Cochrane is an independent network of health practitioners, researchers, patient advocates and others, responding to the challenge of making the vast amounts of evidence generated through research useful for informing decisions about health.
Daniel Willingham - Web Site
Daniel Willingham is a resource to help those interested in issues of education to find practical, helpful information on what works and what doesn’t. His videos are of special interest.
National Education Policy Center
The mission of the National Education Policy Center is to produce and disseminate high-quality, peer-reviewed research to inform education policy discussions.