Acceptability is a proxy measure of how well an intervention fits the context of the setting in which it is implemented.
Allinder, R. M., & Oats, R. G. (1997). Effects of Acceptability on Teachers’ Implementation of Curriculum-Based Measurement and Student Achievement in Mathematics Computation. Remedial & Special Education, 18(2), 113. Retrieved from http://psycnet.apa.org/index.cfm?fa=search.displayRecord&UID=1997-03796-005
Benazzi and colleagues examined the contextual fit of interventions when they were developed by different configurations of individuals.
Benazzi, L., Horner, R. H., & Good, R. H. (2006). Effects of Behavior Support Team Composition on the Technical Adequacy and Contextual Fit of Behavior Support Plans. Journal of Special Education, 40(3), 160-170.
Observational data collected in ecologically valid measurement contexts are likely to be influenced by contextual factors irrelevant to the research question. Using multiple sessions and raters often improves the stability of scores for variables from such contexts.
Bruckner, C. T., Yoder, P. J., & McWilliam, R. A. (2006). Generalizability and decision studies: An example using conversational language samples. Journal of Early Intervention, 28(2), 139-153.
This article proposes a model for developing interventions so that they fit into the context of public schools.
Cappella, E., Reinke, W. M., & Hoagwood, K. E. (2011). Advancing Intervention Research in School Psychology: Finding the Balance Between Process and Outcome for Social and Behavioral Interventions. School Psychology Review, 40(4), 455-464.
The impact of an intervention is influenced by how well it fits into the context of a classroom. This paper suggests a number of variables to consider, and how they might be measured, prior to the development of an intervention.
Detrich, R. (1999). Increasing treatment fidelity by matching interventions to contextual variables within the educational setting. School Psychology Review, 28(4), 608-620.
Including cost-effectiveness data in the evaluation of programs is the next step in the evolution of evidence-based practice. Evidence-based practice is grounded in three complementary elements: best available evidence, professional judgment, and client values and context. In this article, I discuss some of the considerations that have to be addressed in the decision-making process and implications of including cost-effectiveness analyses in data-based decision making.
Detrich, R. (2020). Cost-effectiveness analysis: A component of evidence-based education. School Psychology Review, 1-8.
This paper provides an integrated understanding of contextual factors that affect implementation.
Han, S. S., & Weiss, B. (2005). Sustainability of Teacher Implementation of School-Based Mental Health Programs. Journal of Abnormal Child Psychology, 33(6), 665-679.
“Contextual fit” is based on the premise that the match between an intervention and local context affects both the quality of intervention implementation and whether the intervention actually produces the desired outcomes for children and families.
Horner, R., Blitz, C., & Ross, S. (2014). The importance of contextual fit when implementing evidence-based interventions. Washington, DC: U.S. Department of Health and Human Services, Office of the Assistant Secretary for Planning and Evaluation. https://aspe.hhs.gov/system/files/pdf/77066/ib_Contextual.pdf
This meta-analysis finds a positive relationship between school principals spending time on five commonly assigned roles and student achievement.
Liebowitz, D. D., & Porter, L. (2019). The Effect of Principal Behaviors on Student, Teacher, and School Outcomes: A Systematic Review and Meta-Analysis of the Empirical Literature. Review of Educational Research, 89(5), 785-827.
Research on data-based decision making has proliferated around the world, fueled by policy recommendations and the diverse data that are now available to educators to inform their practice. Yet, many misconceptions and concerns have been raised by researchers and practitioners. This paper surveys and synthesizes the landscape of the data-based decision-making literature to address the identified misconceptions and then to serve as a stimulus to changes in policy and practice as well as a roadmap for a research agenda.
Mandinach, E. B., & Schildkamp, K. (2021). Misconceptions about data-based decision making in education: An exploration of the literature. Studies in Educational Evaluation, 69, 100842.
Implementation of an intervention always occurs in a specific context. This paper considers the complexity that context contributes to implementation science.
May, C. R., Johnson, M., & Finch, T. (2016). Implementation, context and complexity. Implementation Science, 11(1), 141.
The national education goals express a systemic approach to reform which fosters coherence in the disparate elements of the education system. This report highlights the findings of research conducted by the Center for Research on the Context of Secondary School Teaching (CRC) in California and Michigan during the years 1987-1992 and the implications for policy strategies to achieve the national education goals.
McLaughlin, M. W., & Talbert, J. E. (1993). Contexts that matter for teaching and learning: Strategic opportunities for meeting the nation's educational goals.
This article provides an overview of contextual factors across the levels of an educational system that influence implementation.
Schaughency, E., & Ervin, R. (2006). Building Capacity to Implement and Sustain Effective Practices to Better Serve Children. School Psychology Review, 35(2), 155-166. Retrieved from http://eric.ed.gov/?id=EJ788242
This paper describes the relationships among the three cornerstones of evidence-based practice, one of which is context.
Slocum, T. A., Detrich, R., Wilczynski, S. M., Spencer, T. D., Lewis, T., & Wolfe, K. (2014). The Evidence-based Practice of Applied Behavior Analysis. The Behavior Analyst, 37, 41-56.
Evidence-based practice is a decision-making framework. This paper describes the relationships among the three cornerstones of this framework.
Spencer, T. D., Detrich, R., & Slocum, T. A. (2012). Evidence-based Practice: A Framework for Making Effective Decisions. Education & Treatment of Children (West Virginia University Press), 35(2), 127-151.
This paper examines the types of research to consider when evaluating programs, how to know what “evidence” to use, and continuums of evidence (quantity of the evidence, quality of the evidence, and program development).
Twyman, J. S., & Sota, M. (2008). Identifying research-based practices for response to intervention: Scientifically based instruction. Journal of Evidence-Based Practices for Schools, 9(2), 86-101.
In this video from Cool Reading Facts, Daniel Willingham, professor of psychology at the University of Virginia, discusses factors key to success in reading comprehension. His analysis suggests that educators frequently miss the critical role that background knowledge plays in successfully interpreting and understanding passages in reading texts, and that reading comprehension tests are actually knowledge tests in disguise. He makes three important points: (1) students must have the basic decoding skills to translate print into meaningful information; (2) basic familiarity with the subject matter is of prime importance in comprehending what the writer is trying to communicate; and (3) providing students with an enriched knowledge base through the school’s curriculum is especially important for students from disadvantaged circumstances, whose only source of essential background information is often school. In contrast, children from privileged circumstances may be introduced to essential background information away from school.
Willingham, D. (2017). Cool Reading Facts 5: Reading comprehension tests don’t test reading [Video file]. National Public Radio, Science Friday Educator Collaborative.