Presentation by the Wing Institute with these goals: make the case that treatment integrity monitoring is a necessary part of service delivery; describe the dimensions of treatment integrity; suggest methods for increasing treatment integrity; and place treatment integrity within a systems framework.
This book is compiled from the proceedings of the sixth summit entitled “Performance Feedback: Using Data to Improve Educator Performance.” The 2011 summit topic was selected to help answer the following question: What basic practice has the potential for the greatest impact on changing the behavior of students, teachers, and school administrative personnel?
States, J., Keyworth, R. & Detrich, R. (2013). Introduction: Proceedings from the Wing Institute’s Sixth Annual Summit on Evidence-Based Education: Performance Feedback: Using Data to Improve Educator Performance. In Education at the Crossroads: The State of Teacher Preparation (Vol. 3, pp. ix-xii). Oakland, CA: The Wing Institute.
This Guide seeks to provide assistance to educational practitioners in evaluating whether an educational intervention is backed by rigorous evidence of effectiveness, and in implementing evidence-based interventions in their schools or classrooms.
Baron, J. (2004). Identifying and Implementing Education Practices Supported by Rigorous Evidence: A User Friendly Guide. Journal for Vocational Special Needs Education, 26, 40-54.
This policy brief lays out five components of a vision for the future and identifies opportunities to support teacher education reform. Examples of promising developments are also addressed that involve full-scale program redesign featuring collaboration across general and special education.
Blanton, L. P., Pugach, M. C., & Florian, L. (2011). Preparing general education teachers to improve outcomes for students with disabilities. Washington, DC: American Association of Colleges for Teacher Education; National Center for Learning Disabilities. Retrieved from https://www.ncld.org/wp-content/uploads/2014/11/aacte_ncld_recommendation.pdf
The purpose of this study was to examine the effectiveness of enhanced anchor-instruction and traditional problem instruction in improving problem-solving performance.
Bottge, B. A., Heinrichs, M., Mehta, Z. D., & Hung, Y. H. (2002). Weighing the benefits of anchored math instruction for students with disabilities in general education classes. The Journal of Special Education, 35(4), 186-200.
Dane and Schneider propose treatment integrity as a multi-dimensional construct and describe five dimensions that constitute the construct.
Dane, A. V., & Schneider, B. H. (1998). Program integrity in primary and early secondary prevention: Are implementation effects out of control? Clinical Psychology Review, 18(1), 23-45.
The impact of an intervention is influenced by how well it fits into the context of a classroom. This paper suggests a number of variables to consider and how they might be measured prior to the development of an intervention.
Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608-620.
To produce better outcomes for students, two things are necessary: (1) effective, scientifically supported interventions and (2) implementation of those interventions with high integrity. Typically, much greater attention has been given to identifying effective practices. This review focuses on the features of high-quality implementation.
Detrich, R. (2014). Treatment integrity: Fundamental to education reform. Journal of Cognitive Education and Psychology, 13(2), 258-271.
Over the last fifty years, there have been many educational reform efforts, most of which have had a relatively short lifespan and failed to produce the promised results. One possible reason is that, for the most part, these innovations have been poorly implemented. In this chapter, the author proposes a data-based decision-making approach to assuring high-quality implementation.
Detrich, R. Innovation, implementation science, and data-based decision making: Components of successful reform. In M. Murphy, S. Redding, & J. Twyman (Eds.), Handbook on Innovations in Learning, 31. Charlotte, NC: Information Age Publishing.
Reform efforts tend to come and go very quickly in education. This paper makes the argument that the sustainability of programs is closely related to how well those programs are implemented.
Detrich, R., Keyworth, R. & States, J. (2010). Treatment Integrity: A Fundamental Unit of Sustainable Educational Programs. Journal of Evidence-Based Practices for Schools, 11(1), 4-29.
In this conceptualization of treatment integrity, there are four dimensions relevant to practice: (a) exposure (dosage), (b) adherence, (c) quality of delivery, and (d) student responsiveness. It is important to understand that these dimensions do not stand alone but rather interact to impact the ultimate effectiveness of an intervention.
Detrich, R., States, J., & Keyworth, R. (2017). Dimensions of Treatment Integrity Overview. Oakland, CA: The Wing Institute.
The usual approach to determining if an intervention is effective for a student is to review student outcome data; however, this is only part of the task. Student data can only be understood if we know something about how well the intervention was implemented. Student data without treatment integrity data are largely meaningless because without knowing how well an intervention has been implemented, no judgments can be made about the effectiveness of the intervention. Poor outcomes can be a function of an ineffective intervention or poor implementation of the intervention. Without treatment integrity data, there is a risk that an intervention will be judged as ineffective when, in fact, the quality of implementation was so inadequate that it would be unreasonable to expect positive outcomes.
Detrich, R., States, J., & Keyworth, R. (2017). Treatment Integrity in the Problem Solving Process. Oakland, CA: The Wing Institute.
For the best chance of producing positive educational outcomes for all children, two conditions must be met: (a) adopting effective, empirically supported (evidence-based) practices and (b) implementing those practices with sufficient quality that they make a difference (treatment integrity).
Detrich, R., States, J., & Keyworth, R. (2017). Overview of Treatment Integrity. Oakland, CA: The Wing Institute.
This study evaluated the impact of public feedback in RtI team meetings on the quality of implementation. Feedback improved poor implementation and maintained high levels of implementation.
Duhon, G. J., Mesmer, E. M., Gregerson, L., & Witt, J. C. (2009). Effects of public feedback during RTI team meetings on teacher implementation integrity and student academic performance. Journal of School Psychology, 47(1), 19-37.
The first purpose of this review is to assess the impact of implementation on program outcomes, and the second purpose is to identify factors affecting the implementation process.
Durlak, J. A., & DuPre, E. P. (2008). Implementation matters: A review of research on the influence of implementation on program outcomes and the factors affecting implementation. American Journal of Community Psychology, 41(3-4), 327-350.
This paper summarizes survey results about the acceptability of different methods for monitoring treatment integrity and performance feedback.
Easton, J. E., & Erchul, W. P. (2011). An Exploration of Teacher Acceptability of Treatment Plan Implementation: Monitoring and Feedback Methods. Journal of Educational & Psychological Consultation, 21(1), 56-77. Retrieved from http://www.tandfonline.com/doi/abs/10.1080/10474412.2011.544949?journalCode=hepc20.
To help states and districts make informed decisions about the PD they implement to improve reading instruction, the U.S. Department of Education commissioned the Early Reading PD Interventions Study to examine the impact of two research-based PD interventions for reading instruction: (1) a content-focused teacher institute series that began in the summer and continued through much of the school year (treatment A) and (2) the same institute series plus in-school coaching (treatment B).
Garet, M. S., Cronen, S., Eaton, M., Kurki, A., Ludwig, M., Jones, W., ... Zhu, P. (2008). The impact of two professional development interventions on early reading instruction and achievement. NCEE 2008-4030. Washington, DC: National Center for Education Evaluation and Regional Assistance.
This paper discusses technical issues involved in measuring treatment integrity, including specification of treatment components, deviations from treatment protocols, amount of behavior change, and psychometric issues in assessing treatment integrity.
Gresham, F. M. (1989). Assessment of treatment integrity in school consultation and prereferral intervention. School Psychology Review, 18(1), 37-50.
This study evaluated the differences in estimates of treatment integrity obtained by measuring different dimensions of it.
Hagermoser Sanetti, L. M., & Fallon, L. M. (2011). Treatment Integrity Assessment: How Estimates of Adherence, Quality, and Exposure Influence Interpretation of Implementation. Journal of Educational & Psychological Consultation, 21(3), 209-232.
Used a direct observation-based approach to identify behavioral conditions in sending (i.e., special education) and in receiving (i.e., regular education) classrooms and to identify targets for intervention that might facilitate mainstreaming of behavior-disordered (BD) children.
Hoier, T. S., McConnell, S., & Pallay, A. G. (1987). Observational assessment for planning and evaluating educational transitions: An initial analysis of template matching. Behavioral Assessment.
The purpose of this manuscript is to propose core features that may apply to any practice, or set of practices, claiming to be evidence-based, in relation to School-wide Positive Behavior Support (SWPBS).
Horner, R. H., Sugai, G., & Anderson, C. M. (2010). Examining the evidence base for school-wide positive behavior support. Focus on Exceptional Children, 42(8), 1.
“Contextual fit” is based on the premise that the match between an intervention and local context affects both the quality of intervention implementation and whether the intervention actually produces the desired outcomes for children and families.
Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf
This book provides research as well as case studies of successful professional development strategies and practices for educators.
Joyce, B. R., & Showers, B. (2002). Student achievement through staff development. ASCD.
The authors proposed a preliminary feedback intervention (FI) theory (FIT) and tested it with moderator analyses. The central assumption of FIT is that FIs change the locus of attention among three general and hierarchically organized levels of control: task learning, task motivation, and meta-task (including self-related) processes.
Kluger, A. N., & DeNisi, A. (1996). The effects of feedback interventions on performance: A historical review, a meta-analysis, and a preliminary feedback intervention theory. Psychological Bulletin, 119(2), 254.
This book examines the use of video recording to improve teacher performance, showing how every classroom can benefit from setting up a camera and hitting “record.”
Knight, J. (2013). Focus on teaching: Using video for high-impact instruction. (Pages 8-14). Thousand Oaks, CA: Corwin.
Peer assessment has become a popular educational intervention. A review of the literature finds few studies on the impact of peer assessment on student outcomes. This meta-analysis examines the effect sizes found in 58 studies.
Li, H., Xiong, Y., Hunter, C. V., Guo, X., & Tywoniw, R. (2020). Does peer assessment promote student learning? A meta-analysis. Assessment & Evaluation in Higher Education, 45(2), 193-211.
This paper examines school-based experimental studies with individuals aged 0 to 18 years published between 1991 and 2005. Only 30% of the studies provided treatment integrity data. Nearly half of the studies (45%) were judged to be at high risk for treatment inaccuracies.
McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.
The purpose of this chapter is to describe a critical component of the response-to-intervention (RTI) process: monitoring student response to general education instruction.
McMaster, K. L., & Wagner, D. (2007). Monitoring response to general education instruction. In Handbook of response to intervention (pp. 223-233). Springer, Boston, MA.
This meta-analysis reports on the overall effectiveness of video analysis when used with special educators, as well as on moderator analyses related to participant and instructional characteristics.
Morin, K. L., Ganz, J. B., Vannest, K. J., Haas, A. N., Nagro, S. A., Peltier, C. J., … & Ura, S. K. (2019). A systematic review of single-case research on video analysis as professional development for special educators. The Journal of Special Education, 53(1), 3-14.
This study compared the effects of discussing issues of implementation challenges and performance feedback on increasing the integrity of implementation. Performance feedback was more effective than discussion in increasing integrity.
Noell, G. H., & Witt, J. C. (2000). Increasing intervention implementation in general education following consultation: A comparison of two follow-up strategies. Journal of Applied Behavior Analysis, 33(3), 271.
This book looks at how new ideas spread via communication channels over time. Such innovations are initially perceived as uncertain and even risky. To overcome this uncertainty, most people seek out others like themselves who have already adopted the new idea. Thus the diffusion process typically takes months or years. But there are exceptions: use of the Internet in the 1990s, for example, may have spread more rapidly than any other innovation in the history of humankind.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.
This paper reviews options for treatment integrity measurement, emphasizing how direct behavior rating technology might be incorporated within a multi-tiered model of intervention delivery.
Sanetti, L. M. H., Chafouleas, S. M., Christ, T. J., & Gritter, K. L. (2009). Extending Use of Direct Behavior Rating Beyond Student Assessment. Assessment for Effective Intervention, 34(4), 251-258.
The authors reviewed all intervention studies published in the Journal of Positive Behavior Interventions from 1999 to 2009 to determine the percentage of those studies that reported a measure of treatment integrity. Slightly more than 40% did so.
Sanetti, L. M. H., Dobey, L. M., & Gritter, K. L. (2012). Treatment Integrity of Interventions With Children in the Journal of Positive Behavior Interventions: From 1999 to 2009. Journal of Positive Behavior Interventions, 14(1), 29-46.
The authors reviewed four school psychology journals from 1995 to 2008 to estimate the percentage of intervention studies that reported some measure of treatment integrity. About 50% reported such a measure.
Sanetti, L. M. H., Gritter, K. L., & Dobey, L. M. (2011). Treatment integrity of interventions with children in the school psychology literature from 1995 to 2008. School Psychology Review, 40(1), 72-84.
This book is written for school administrators, staff developers, behavior specialists, and instructional coaches to offer guidance in implementing research-based practices that establish effective classroom management in schools. The book provides administrators with practical strategies to maximize the impact of professional development.
Sprick, R., et al. (2010). Coaching Classroom Management: Strategies & Tools for Administrators & Coaches. Pacific Northwest Publishing.
This paper examines a range of education failures: common mistakes in how new practices are selected, implemented, and monitored. The goal is not a comprehensive listing of all education failures but rather to provide education stakeholders with an understanding of the importance of vigilance when implementing new practices.
States, J., & Keyworth, R. (2020). Why Practices Fail. Oakland, CA: The Wing Institute. https://www.winginstitute.org/roadmap-overview
Inattention to treatment integrity is a primary factor in implementation failure. Treatment integrity is defined as the extent to which an intervention is executed as designed, and the accuracy and consistency with which the intervention is implemented.
States, J., Detrich, R. & Keyworth, R. (2017). Treatment Integrity Strategies. Oakland, CA: The Wing Institute. https://www.winginstitute.org/effective-instruction-treatment-integrity-strategies.
This summary of studies accumulated over the past 40 years on a key education driver, teacher competencies, offers practical strategies, practices, and rules to guide teachers in improving instruction in ways that boost student performance and the quality of the work experience.
States, J., Detrich, R. & Keyworth, R. (2017). Effective Instruction Overview. Oakland, CA: The Wing Institute. Retrieved from https://www.winginstitute.org/effective-instruction-overview
The present study was conducted to investigate the relationship between training procedures and treatment integrity.
Sterling-Turner, H. E., Watson, T. S., Wildmon, M., Watkins, C., & Little, E. (2001). Investigating the relationship between training type and treatment integrity. School Psychology Quarterly, 16(1), 56.
This meta-analysis of Direct Instruction (DI) curricula, reviewing research published between 1966 and 2016, reports that DI curricula produced moderate to large effect sizes across the curriculum areas of reading, math, language, and spelling. The review is notable because it covers a much larger body of DI research than earlier reviews and a wide range of experimental designs (from single subject to randomized trials); 328 studies were reviewed and almost 4,000 effects were considered. Given the variability in research designs and the breadth of effects considered, the findings suggest that DI curricula produce robust results. There was very little decline during maintenance phases, and greater exposure to the curricula resulted in greater effects.
Stockard, J., Wood, T. W., Coughlin, C., & Rasplica Khoury, C. (2018). The effectiveness of direct instruction curricula: A meta-analysis of a half century of research. Review of Educational Research, 88(4), 479-507.
In a randomized control study, Head Start teachers were assigned to either an intervention group that received intensive, ongoing professional development (PD) or to a comparison group that received the “business as usual” PD provided by Head Start. The PD intervention provided teachers with conceptual knowledge and instructional strategies that support young children’s development of vocabulary, alphabet knowledge, and phonological sensitivity.
Wasik, B. A., & Hindman, A. H. (2011). Improving vocabulary and pre-literacy skills of at-risk preschoolers through teacher professional development. Journal of Educational Psychology, 103(2), 455.
This study evaluated the effects of performance feedback on increasing the quality of implementation of interventions by teachers in a public school setting.
Witt, J. C., Noell, G. H., LaFleur, L. H., & Mortenson, B. P. (1997). Teacher use of interventions in general education settings: Measurement and analysis of the independent variable. Journal of Applied Behavior Analysis, 30(4), 693.