Dimensions of Treatment Integrity Overview
Historically, treatment integrity has been defined as implementation of an intervention as planned (Gresham, 1989). More recently, treatment integrity has been reconceptualized as multidimensional (Dane & Schneider, 1998). This conceptualization identifies four dimensions relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness.
Exposure (dosage) refers to the amount (frequency and duration) of an intervention a student receives. For example, a student's supplemental reading support may call for the intervention to occur three times per week for 30 minutes per session. If delivery is less frequent or sessions are shorter, then the student's exposure to the intervention is less than optimal and the outcome may be compromised.
Adherence is the most commonly measured dimension of treatment integrity (Sanetti, Chafouleas, Christ, & Gritter, 2009). It is the extent to which those responsible for implementing an intervention are doing so as prescribed. Most interventions are multicomponent packages, and some are very complex. Adherence is usually measured by recording whether each specific feature of an intervention occurred as planned.
Quality of delivery is the degree to which the implementation is executed with enthusiasm and sincerity. This dimension is underrepresented in the scholarly literature primarily because of its subjective nature; however, it is an important facet of treatment integrity and warrants more research. Consider the following: Many interventions for challenging behavior include praising students when they are behaving appropriately. Some teachers are effusive with their praise and vary it in many ways so that it does not become rote. Others may praise in a monotone, rote manner. The differences in the way praise is delivered are likely to influence the impact of the intervention even if both individuals who are praising are doing so with 100% adherence to the intervention protocol.
Student responsiveness is the degree to which the student is engaged during the intervention. This dimension is a bit controversial. Some argue that it should not be a part of treatment integrity measures because it is a measure of student behavior and measures of treatment integrity should reflect what adult educators are doing. The counterargument is that even with high integrity for exposure, adherence, and quality of delivery, it is possible that the student’s lack of engagement with the intervention may negatively impact the intervention. For example, a student receiving an intervention to improve fluency in basic math may minimally participate in instruction even though the intervention is implemented with high integrity across all other dimensions of treatment integrity. This poor participation may be a function of placing the student in the instructional program at his or her failure level. Conversely, a student placed in the instructional program at his or her mastery level might not be engaged because the instruction is boring. Student responsiveness to an intervention can be an important indicator of the appropriateness of the instructional program.
Each of these dimensions can influence the impact of an intervention, but it is also important to be mindful of the interaction among them. Consider the previously mentioned intervention protocol that calls for a student to receive supplemental reading support three times a week for 30 minutes each session. Both of these measures are part of the exposure dimension. If the student receives only one session per week and that session lasts for 30 minutes, then he or she is exposed to the intervention a third of the time prescribed. Similarly, the student could receive the reading intervention three times a week but for only 10 minutes each session. The student is still exposed to the intervention a third of the prescribed time. To complicate matters, even if the instructor perfectly implements the intervention during each session (adherence), the outcome is likely to be degraded because exposure was limited. This example highlights the importance of measuring all of the dimensions of treatment integrity and not just adherence. If only adherence is assessed and the outcome is less than desired, the intervention might be judged ineffective even though the poor outcome actually reflected limited exposure. Failure to consider all dimensions of treatment integrity can result in errors in decision making.
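The dosage arithmetic above can be made concrete with a small calculation. The sketch below is purely illustrative (the function name and session values are assumptions for this example, not part of any published protocol); it simply expresses exposure as the fraction of prescribed weekly intervention time actually delivered.

```python
def exposure_ratio(sessions_delivered, minutes_per_session,
                   sessions_prescribed, minutes_prescribed):
    """Fraction of prescribed weekly intervention time actually delivered."""
    delivered_minutes = sessions_delivered * minutes_per_session
    prescribed_minutes = sessions_prescribed * minutes_prescribed
    return delivered_minutes / prescribed_minutes

# Prescribed: 3 sessions per week x 30 minutes = 90 minutes per week.
# One 30-minute session per week yields one third of prescribed exposure.
one_long_session = exposure_ratio(1, 30, 3, 30)

# Three 10-minute sessions per week also yield one third.
three_short_sessions = exposure_ratio(3, 10, 3, 30)

print(one_long_session, three_short_sessions)
```

Note that the two delivery patterns produce the same exposure ratio even though they differ in frequency and duration, which is why both measures belong to the exposure dimension.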
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23–45.
Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37–50.
Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending use of direct behavior rating beyond student assessment: Applications to treatment integrity assessment within a multitiered model of school-based intervention delivery. Assessment for Effective Intervention, 34(4), 251–258.
To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) implementation of those interventions with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on features of high-quality implementation.
Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258–271.
Schools are often expected to implement innovative instructional programs. Most often these initiatives fail because findings from implementation science are not considered as part of implementing the initiative. This chapter reviews the contributions implementation science can make toward improving outcomes for students.
Detrich, R. Innovation, Implementation Science, and Data-Based Decision Making: Components of Successful Reform. Handbook on Innovations in Learning, 31.
Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.
Detrich, R., Keyworth, R., & States, J. (2010). Treatment integrity: A fundamental unit of sustainable educational programs. Journal of Evidence-Based Practices for Schools, 11(1), 4–29.
Strategies designed to increase treatment integrity fall into two categories: antecedent-based strategies and consequence-based strategies.
Detrich, R., States, J., & Keyworth, R. (2017). Approaches to increasing treatment integrity. Oakland, CA: The Wing Institute.
Historically, treatment integrity has been defined as implementation of an intervention as planned (Gresham, 1989). More recently, treatment integrity has been reconceptualized as multidimensional (Dane & Schneider, 1998). This conceptualization identifies four dimensions relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness. It is important to understand that these dimensions do not stand alone but rather interact to determine the ultimate effectiveness of an intervention. Educators should assess all dimensions of treatment integrity to ensure that an intervention is being implemented as intended.
Detrich, R., States, J., & Keyworth, R. (2017). Dimensions of treatment integrity overview. Oakland, CA: The Wing Institute.
The usual approach to determining whether an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can be understood only if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because, without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or of poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.
Detrich, R., States, J., & Keyworth, R. (2017). Treatment integrity in the problem solving process. Oakland, CA: The Wing Institute.