Education Drivers

Continuum of Evidence

At a time of diminishing student performance and constrained education resources, it is critical that policy makers, educators, and parents make informed decisions about education interventions on the basis of sound science and thoughtful social influence. The Wing Institute, an independent and non-profit organization, was created to promote evidence-based education policies and practices to address these serious challenges facing education today.

The Wing Institute is named in memory of Ernie Wing, who championed evidence-based education as an educator and child advocate. Ernie set the standards for professionalism, integrity, effectiveness, and caring as he helped thousands of children gain access to effective educational services.

Publications

TITLE
SYNOPSIS
CITATION
Evidence-Based Practice in the Broader Context: How Can We Really Use Evidence to Inform Decisions?

This paper provides an overview of the considerations when introducing evidence-based services into established mental health systems.

Chorpita, B. F., & Starace, N. K. (2010). Evidence-Based Practice in the Broader Context: How Can We Really Use Evidence to Inform Decisions? Journal of Evidence-Based Practices for Schools, 11(1), 4-29.

Evidence-Based, Empirically Supported, OR Best Practice?

Evidence-based, empirically-supported, and best practice are often used interchangeably. A case is made that for clarity each term should have a separate and distinct meaning.

Detrich, R. (2008). Evidence-Based, Empirically Supported, OR Best Practice? Effective practices for children with autism, 1.

A roadmap to evidence-based education: Building an evidence-based culture

Increasing education’s reliance on evidence to guide decisions requires a significant change in the culture of districts and schools. This paper reviews the implications of moving toward evidence-based education.

Detrich, R., Keyworth, R., & States, J. (2007). A Roadmap to Evidence-based Education: Building an Evidence-based Culture. Journal of Evidence-based Practices for Schools, 8(1), 26-44.

Evidence-Based Education and Best Available Evidence: Decision-Making Under Conditions of Uncertainty

Evidence-based practice is a framework for decision making. Even with high-quality evidence, there are likely to be sources of uncertainty that practitioners must confront.

Detrich, R., Slocum, T. A., & Spencer, T. D. (2013). Evidence-based education and best available evidence: decision-making under conditions of uncertainty. Evidence-Based Practices, 26, 21.

Roles and responsibilities of researchers and practitioners for translating research to practice

This paper outlines the best practices for researchers and practitioners translating research to practice as well as recommendations for improving the process.

Shriver, M. D. (2007). Roles and responsibilities of researchers and practitioners for translating research to practice. Journal of Evidence-Based Practices for Schools, 8(1), 1-30.

Evaluating the Validity of Systematic Reviews to Identify Empirically Supported Treatments

The systematic review process is an assessment and, as such, concerns about validity of the assessment are paramount.  In this paper, we review the considerations that are important in reaching conclusions about the adequacy of a systematic review.

Slocum, T. A., Detrich, R., & Spencer, T. D. (2012). Evaluating the validity of systematic reviews to identify empirically supported treatments. Education and Treatment of Children, 35(2), 201-233.

The Evidence-Based Practice of Applied Behavior Analysis

Applied behavior analysis emphasizes being scientifically based. In this paper, we discuss how the core features of evidence-based practice can be integrated into applied behavior analysis.

Slocum, T. A., Detrich, R., Wilczynski, S. M., Spencer, T. D., Lewis, T., & Wolfe, K. (2014). The Evidence-Based Practice of Applied Behavior Analysis. The Behavior Analyst, 37(1), 41-56.

Best Available Evidence: Three Complementary Approaches

The notion of best available evidence implies that some evidence is better than other evidence. This paper reviews different sources of evidence and the relative strengths and limitations of each type.

Slocum, T. A., Spencer, T. D., & Detrich, R. (2012). Best available evidence: Three complementary approaches. Education and Treatment of Children, 35(2), 153-181.

Evidence-based Practice: A Framework for Making Effective Decisions

Evidence-based practice is characterized as a framework for decision making that integrates best available evidence, clinical expertise, and client values and context. This paper reviews how these three dimensions interact to inform decisions.

Spencer, T. D., Detrich, R., & Slocum, T. A. (2012). Evidence-based practice: A framework for making effective decisions. Education and Treatment of Children, 35(2), 127-151.

Framework for Improving Education Outcomes

A multitiered system of support (MTSS) is a framework for organizing service delivery. At the core of MTSS is the adoption and implementation of a continuum of evidence-based interventions that result in improved academic and behavioral outcomes for all students. MTSS is a data-based decision-making approach built on frequent screening of all students' progress and intervention for students who are not making adequate progress.

States, J., Detrich, R., & Keyworth, R. (2017). Multitiered System of Support Overview. Oakland, CA: The Wing Institute.

Identifying research-based practices for response to intervention: Scientifically based instruction

This paper examines the types of research to consider when evaluating programs, how to know what “evidence” to use, and continuums of evidence (quantity of the evidence, quality of the evidence, and program development).

Twyman, J. S., & Sota, M. (2008). Identifying research-based practices for response to intervention: Scientifically based instruction. Journal of Evidence-Based Practices for Schools, 9(2), 86-101.

Presentations

TITLE
SYNOPSIS
CITATION
Roles and Responsibilities of Researchers and Practitioners Translating Research to Practice

This paper outlines the best practices for researchers and practitioners translating research to practice as well as recommendations for improving the process.

Shriver, M. (2006). Roles and Responsibilities of Researchers and Practitioners Translating Research to Practice [Powerpoint Slides]. Retrieved from 2006-wing-presentation-mark-shriver.

Evolution of the Revolution: How Can Evidence-based Practice Work in the Real World?
This paper provides an overview of the considerations when introducing evidence-based services into established mental health systems.
Chorpita, B. (2008). Evolution of the Revolution: How Can Evidence-based Practice Work in the Real World? [Powerpoint Slides]. Retrieved from 2008-wing-presentation-bruce-chorpita.
If We Want More Evidence-based Practice, We Need More Practice-based Evidence
This paper discusses the importance, strengths, and weaknesses of using practice-based evidence in conjunction with evidence-based practice.
Cook, B. (2015). If We Want More Evidence-based Practice, We Need More Practice-based Evidence [Powerpoint Slides]. Retrieved from 2015-wing-presentation-bryan-cook.
From Evidence-based Practice to Practice-based Evidence: Behavior Analysis in Special Education
Evidence-based practice is a decision making framework. This talk reviews the types of evidence that can be used in decision-making and when each source of evidence is best used.
Detrich, R. (2006). From Evidence-based Practice to Practice-based Evidence: Behavior Analysis in Special Education [Powerpoint Slides]. Retrieved from 2006-calstatefresnoaba-presentation-ronnie-detrich.
An Expanded Model of Evidence-based Practice in Special Education
This paper reviews the types of evidence that can be used to guide decision-making in special education, as well as the necessity for high-quality implementation and monitoring the effects of intervention.
Detrich, R. (2006). An Expanded Model of Evidence-based Practice in Special Education [Powerpoint Slides]. Retrieved from 2006-campbell-presentation-ronnie-detrich.
Single Subject Research and Evidence-based Interventions: Are SSDs Really the Ugly Stepchild?
In most discussions about high quality research, single participant designs have been relegated to a lower status. This paper reviews the characteristics of SSDs and the contributions they can make to the evidence-base.
Detrich, R. (2007). Single Subject Research and Evidence-based Interventions: Are SSDs Really the Ugly Stepchild? [Powerpoint Slides]. Retrieved from 2007-aba-presentation-ronnie-detrich.
Evidence-based Education: Can We Get There From Here?
This paper reviews the steps that will be necessary to make evidence-based education a reality.
Detrich, R. (2008). Evidence-based Education: Can We Get There From Here? [Powerpoint Slides]. Retrieved from 2007-calaba-ebe-presentation-ronnie-detrich.
IDEIA and Evidence-based Interventions: Implications for Practitioners
The reauthorization of special education law (IDEIA) emphasizes using scientifically supported programs. This talk reviews the implications for special education practitioners.
Detrich, R. (2008). IDEIA and Evidence-based Interventions: Implications for Practitioners [Powerpoint Slides]. Retrieved from 2008-apbs-txint-presentation-ronnie-detrich.
Evidence-based Practice: More than a List
Evidence-based practice has at least two meanings. This paper argues that it is best thought of as a decision-making framework.
Detrich, R. (2011). Evidence-based Practice: More than a List [Powerpoint Slides]. Retrieved from 2011-apbs-presentation-ronnie-detrich.
Workshop: Evidence-based Practice of Applied Behavior Analysis.
Evidence-based practice is a decision-making framework that integrates best available evidence, professional judgement, and client values and context. This workshop described the relationship across these three dimensions of decision-making.
Detrich, R. (2015). Workshop: Evidence-based Practice of Applied Behavior Analysis. [Powerpoint Slides]. Retrieved from 2015-missouriaba-workshop-presentation-ronnie-detrich.
TITLE
SYNOPSIS
CITATION
Creating Single-Subject Design Graphs in Microsoft Excel
The article provides task analyses for constructing various types of commonly used single-subject design graphs in Microsoft Excel.
Dixon, M. R., Jackson, J. W., Small, S. L., Horner-King, M. J., Lik, N. M. K., Garcia, Y., & Rosales, R. (2009). Creating single-subject design graphs in Microsoft Excel™ 2007. Journal of Applied Behavior Analysis, 42(2), 277-293.
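The article's steps are specific to Excel, but the graphing conventions it teaches are general. Below is a minimal sketch, in Python with matplotlib rather than Excel, of the kind of ABAB reversal graph such task analyses produce; all data and phase labels are invented for illustration.

```python
# A hypothetical ABAB reversal graph in the single-subject style:
# phases plotted as separate (unconnected) data paths, with dashed
# phase-change lines and phase labels. All numbers are invented.
import matplotlib.pyplot as plt

sessions = list(range(1, 17))
responses = [8, 9, 7, 8, 3, 2, 2, 1, 7, 8, 9, 8, 2, 1, 2, 1]
phases = [("Baseline", 0, 4), ("Intervention", 4, 8),
          ("Baseline", 8, 12), ("Intervention", 12, 16)]

fig, ax = plt.subplots()
for label, start, end in phases:
    ax.plot(sessions[start:end], responses[start:end], "ko-")
    if end < len(sessions):
        ax.axvline(sessions[end - 1] + 0.5, color="gray", linestyle="--")
    ax.text((sessions[start] + sessions[end - 1]) / 2, 10.3, label, ha="center")

ax.set_xlabel("Sessions")
ax.set_ylabel("Responses per minute")
ax.set_ylim(0, 11)
plt.show()
```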
Inequality and Economic Growth: The Perspective of the New Growth Theories

We analyze the relationship between inequality and economic growth from two directions. The first part of the survey examines the effect of inequality on growth. The second part analyzes several mechanisms whereby growth may increase wage inequality, both across and within education cohorts.

Aghion, P., Caroli, E., & Garcia-Penalosa, C. (1999). Inequality and economic growth: The perspective of the new growth theories. Journal of Economic Literature, 37(4), 1615-1660.

Synthesis of behavioral science learnings about technology transfer

This chapter reviews a set of behavioral science findings derived from the November 1993 NIDA Technical Review, “Reviewing the Behavioral Science Knowledge Base on Technology Transfer.” This is not intended to be a complete recapitulation of the arguments and conclusions drawn by the authors of the 14 papers presented in this monograph.

Backer, T. E., & David, S. L. (1995). Synthesis of behavioral science learnings about technology transfer. NIDA Research Monograph, 155, 262-279.

Effective Treatment for Mental Disorders in Children and Adolescents

As pressure increases for the demonstration of effective treatment for children with mental disorders, it is essential that the field has an understanding of the evidence base. To address this aim, the authors searched the published literature for effective interventions for children and adolescents and organized this review.

Burns, B. J., Hoagwood, K., & Mrazek, P. J. (1999). Effective treatment for mental disorders in children and adolescents. Clinical Child and Family Psychology Review, 2(4), 199-254.

Evidence-based mental health practice: A textbook.

This comprehensive textbook is an essential primer for all practitioners and students who are grappling with the new age of evidence-based practice. The contributors explore some of the complex challenges in implementing EBPs, and highlight the meaningful opportunities that are inherent in this paradigm shift.

Drake, R. E., Merrens, M. R., & Lynde, D. W. (Eds.). (2005). Evidence-based mental health practice: A textbook. New York, NY: W. W. Norton & Co.

Creating new realities: Program development and dissemination

In this paper we will review some of the examples from industrial innovation and dissemination, provide some data on replications of the Achievement Place/Teaching-Family Model over 20 years, and try to share some of the philosophical, practical, and technological guidelines we have come to accept.

Fixsen, D. L., & Blase, K. A. (1993). Creating new realities: Program development and dissemination. Journal of Applied Behavior Analysis, 26(4), 597-615.

What works for whom?: a critical review of treatments for children and adolescents

The standard reference in the field, this acclaimed work synthesizes findings from hundreds of carefully selected studies of mental health treatments for children and adolescents.

Fonagy, P., Cottrell, D., Phillips, J., Bevington, D., Glaser, D., & Allison, E. (2014). What works for whom? A critical review of treatments for children and adolescents. Guilford Publications.

Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study

This study examines adoption and implementation of the US Department of Education's new policy, the 'Principles of Effectiveness', from a diffusion of innovations theoretical framework. In this report, we evaluate adoption in relation to Principle 3: the requirement to select research-based programs.

Hallfors, D., & Godette, D. (2002). Will the “principles of effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17(4), 461–470.

Visible learning: A synthesis of over 800 meta-analyses relating to achievement

Hattie’s book is designed as a meta-meta-study that collects, compares and analyses the findings of many previous studies in education. Hattie focuses on schools in the English-speaking world, but most aspects of the underlying story should be transferable to other countries and school systems as well. Visible Learning is nothing less than a synthesis of more than 50,000 studies covering more than 80 million pupils. Hattie uses the statistical measure effect size to compare the impact of many influences on students’ achievement, e.g., class size, holidays, feedback, and learning strategies.

Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York, NY: Routledge.
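Since the synopsis turns on the effect size statistic, a worked version of the standard computation (Cohen's d, the standardized mean difference) may help; the test scores below are invented, and 0.40 is Hattie's own "hinge point" benchmark for a worthwhile effect.

```python
# Cohen's d: the standardized mean difference underlying syntheses
# like Visible Learning. All numbers here are invented examples.
def cohens_d(mean_treatment, mean_control, pooled_sd):
    """Difference between group means in pooled-standard-deviation units."""
    return (mean_treatment - mean_control) / pooled_sd

# A 6-point gain on a test with a pooled SD of 15:
print(cohens_d(106, 100, 15))  # 0.4 -- Hattie's "hinge point" for impact
```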

History of Behavior Modification

This chapter traces the history of behavior modification as a general movement. Individual conceptual approaches and techniques that comprise behavior modification are obviously important in tracing the history, but they are examined as part of the larger development rather than as ends in their own right. 

Kazdin, A. E. (1982). History of behavior modification. In International handbook of behavior modification and therapy (pp. 3-32). Springer, Boston, MA.

Facts are more important than novelty: Replication in the education sciences

Despite increased attention to methodological rigor in education research, the field has focused heavily on experimental design and not on the merit of replicating important results. The present study analyzed the complete publication history of the current top 100 education journals ranked by 5-year impact factor and found that only 0.13% of education articles were replications. Contrary to previous findings in medicine, but similar to psychology, the majority of education replications successfully replicated the original studies. However, replications were significantly less likely to be successful when there was no overlap in authorship between the original and replicating articles. The results emphasize the importance of third-party, direct replications in helping education research improve its ability to shape education policy and practice.

Makel, M. C., & Plucker, J. A. (2014). Facts are more important than novelty: Replication in the education sciences. Educational Researcher, 43(6), 304–316.

Measuring the fidelity of implementation of a mental health program model.

The authors developed a fidelity index of program implementation for assertive community treatment (ACT). In Study 1, 20 experts rated the importance of 73 elements proposed as critical ACT ingredients and indicated ideal model specifications for each element.

McGrew, J. H., Bond, G. R., Dietzen, L., & Salyers, M. (1994). Measuring the fidelity of implementation of a mental health program model. Journal of Consulting and Clinical Psychology, 62(4), 670-678.

Treatment integrity of school‐based interventions with children

This paper examines school-based experimental studies with individuals aged 0 to 18 years published between 1991 and 2005. Only 30% of the studies provided treatment integrity data. Nearly half of the studies (45%) were judged to be at high risk for treatment inaccuracies.

McIntyre, L. L., Gresham, F. M., DiGennaro, F. D., & Reed, D. D. (2007). Treatment integrity of school‐based interventions with children in the Journal of Applied Behavior Analysis 1991–2005. Journal of Applied Behavior Analysis, 40(4), 659–672.

Expanding the frontier of treatment research.

This article covers current efforts by the National Institute of Mental Health to bridge this gap. Included are discussions of problems with the current research portfolio and new efforts in expanding the research portfolio, innovative methodological research, and expansion of training programs. 

Norquist, G., Lebowitz, B., & Hyman, S. (1999). Expanding the frontier of treatment research. Prevention & Treatment, 2(1). Article ID 1a.

Psychology should list empirically supported principles of change (ESPs) and not credential trademarked therapies or other treatment packages

Current systems for listing empirically supported therapies (ESTs) provide recognition to treatment packages, many of them proprietary and trademarked, without regard to the principles of change believed to account for their effectiveness.

Rosen, G. M., & Davison, G. C. (2003). Psychology should list empirically supported principles of change (ESPs) and not credential trademarked therapies or other treatment packages. Behavior Modification, 27(3), 300-312.

Mental health: A report of the Surgeon General--Executive summary.

Two messages are conveyed in the report: Mental health is fundamental to health, and mental disorders are real health conditions. The Surgeon General's report summarizes the Office's detailed review of more than 3,000 research articles, plus first-person accounts from individuals who have been afflicted with mental disorders.

Satcher, D. (2000). Mental health: A report of the Surgeon General--Executive summary. Professional Psychology: Research and Practice, 31(1), 5-13.

Multisystemic Therapy: Monitoring Treatment Fidelity

The challenges of specifying a complex and individualized treatment model and measuring fidelity thereto are described, using multisystemic therapy (MST) as an example.

Schoenwald, S. K., Henggeler, S. W., Brondino, M. J., & Rowland, M. D. (2000). Multisystemic therapy: Monitoring treatment fidelity. Family Process, 39(1), 83-103.

The Impacts of Reading Recovery at Scale: Results From the 4-Year i3 External Evaluation

A recent large-scale evaluation of Reading Recovery, a supplemental reading program for young struggling readers, supports previous research that found it to be effective. In a 4-year, federally funded project involving almost 3,500 students in 685 schools, students generally benefited from the intervention. Students receiving Reading Recovery receive supplemental services in a 1:1 instructional setting for 30 minutes, 5 days a week, from an instructor trained in Reading Recovery. In the study reported here, students who received Reading Recovery showed effect sizes of .35-.37 relative to a control group across a number of measures of reading. These represent moderate effect sizes and correspond to about a 1.5-month increase in skill relative to the control group. Even though the research supports the efficacy of the intervention, it also raises questions about its efficiency. The schools that participated in the study served about 5 students each, and the estimated cost per student has ranged from $2,000 to $5,000. These data raise questions about the wisdom of spending this much money per student for growth of about a month and a half.

Sirinides, P., Gray, A., & May, H. (2018). The Impacts of Reading Recovery at Scale: Results From the 4-Year i3 External Evaluation. Educational Evaluation and Policy Analysis, 0162373718764828.
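The synopsis's efficiency question can be made concrete with back-of-envelope arithmetic using only the figures reported above ($2,000-$5,000 per student, roughly 1.5 months of added growth).

```python
# Cost per month of added reading growth, using only the synopsis's
# reported figures; this is illustrative arithmetic, not study data.
cost_low, cost_high = 2000, 5000   # reported per-student cost range ($)
extra_growth_months = 1.5          # reported gain relative to controls

print(cost_low / extra_growth_months)   # ~ $1,333 per month of added growth
print(cost_high / extra_growth_months)  # ~ $3,333 per month of added growth
```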

A Meta-Analysis of Direct Instruction

A soon-to-be-published meta-analysis of Direct Instruction (DI) curricula, covering research published between 1966 and 2016, reports that DI curricula produced moderate to large effect sizes across the curriculum areas of reading, math, language, and spelling. The review is notable because it covers a much larger body of DI research than past reviews and a wide range of experimental designs (from single subject to randomized trials). In all, 328 studies were reviewed and almost 4,000 effects were considered. Given the variability in research designs and the breadth of the effects considered, the review suggests that DI curricula produce robust results. There was very little decline during maintenance phases, and greater exposure to the curricula resulted in greater effects.

Stockard, J., Wood, T. W., Coughlin, C., & Khoury, C. R. (in press). The effectiveness of Direct Instruction curricula: A meta-analysis of a half century of research. Review of Educational Research. DOI: 10.3102/0034654317751919

Training in and Dissemination of Empirically-Validated Psychological Treatments: Report and Recommendations

At the request of David Barlow, President of Division 12, and under the aegis of Section III, this task force was constituted to consider methods for educating clinical psychologists, third party payors, and the public about effective psychotherapies.

Task Force on Promotion and Dissemination of Psychological Procedures, Division of Clinical Psychology, American Psychological Association. (1995). Training in and Dissemination of Empirically-Validated Psychological Treatments: Report and Recommendations. The Clinical Psychologist, 48, 3-23. 

Examining reproducibility in psychology: A hybrid method for combining a statistically significant original study and a replication

The unrealistically high rate of positive results within psychology has increased the attention to replication research. However, researchers who conduct a replication and want to statistically combine the results of their replication with a statistically significant original study encounter problems when using traditional meta-analysis techniques. The original study’s effect size is most probably overestimated because it is statistically significant, and this bias is not taken into consideration in traditional meta-analysis. We have developed a hybrid method that does take the statistical significance of an original study into account and enables (a) accurate effect size estimation, (b) estimation of a confidence interval, and (c) testing of the null hypothesis of no effect. We analytically approximate the performance of the hybrid method and describe its statistical properties. By applying the hybrid method to data from the Reproducibility Project: Psychology (Open Science Collaboration, 2015), we demonstrate that the conclusions based on the hybrid method are often in line with those of the replication, suggesting that many published psychological studies have smaller effect sizes than those reported in the original study, and that some effects may even be absent. We offer hands-on guidelines for how to statistically combine an original study and replication, and have developed a Web-based application (https://rvanaert.shinyapps.io/hybrid) for applying the hybrid method.

van Aert, R. C. M., & van Assen, M. A. L. M. (2018). Examining reproducibility in psychology: A hybrid method for combining a statistically significant original study and a replication. Behavior Research Methods, 50(4), 1515-1539.
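The bias the paper addresses is easy to demonstrate. The sketch below simulates it under invented assumptions (true d = 0.2, n = 25 per group); it illustrates why naively meta-analyzing a significant original study overstates the effect, not the hybrid method itself.

```python
# Conditioning on statistical significance inflates effect sizes:
# a toy simulation of the problem the hybrid method corrects.
import numpy as np

rng = np.random.default_rng(0)
true_d, n = 0.2, 25                      # invented true effect and per-group n
se = np.sqrt(2 / n)                      # approximate standard error of d
d_hat = rng.normal(true_d, se, 100_000)  # sampling distribution of observed d
significant = d_hat > 1.96 * se          # kept only if significant (pos. tail)

print(d_hat.mean())               # ~0.20: all estimates together are unbiased
print(d_hat[significant].mean())  # ~0.69: significant ones overestimate d
```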

The Innovation Journey

The Innovation Journey presents the results of a major longitudinal study that examined the process of innovation from concept to implementation of new technologies, products, processes, and administrative arrangements.

Van de Ven, A. H., Polley, D. E., Garud, R., & Venkataraman, S. (1999). The Innovation Journey. New York, NY: Oxford University Press.

Finding, evaluating, refining, and applying empirically supported treatments for children and adolescents

The Child Task Force report represents an important initial step in this direction. Here they offer both praise and critique, suggesting a number of ways the task force process and product may be improved. 

Weisz, J. R., & Hawley, K. M. (1998). Finding, evaluating, refining, and applying empirically supported treatments for children and adolescents. Journal of Clinical Child Psychology, 27(2), 206-216.

Bridging the Gap Between Laboratory and Clinic in Child and Adolescent Psychotherapy

This article addresses the gap between clinical practice and the research laboratory. We focus on the issue as it relates specifically to interventions for children and adolescents.

Weisz, J. R., Donenberg, G. R., Han, S. S., & Weiss, B. (1995). Bridging the gap between laboratory and clinic in child and adolescent psychotherapy. Journal of Consulting and Clinical Psychology, 63(5), 688.

More of what? Issues raised by the Fort Bragg study.

The study does suggest that "more is not always better" (L. Bickman, 1996), but more of what? Little is known about the specific interventions that were combined to form the Fort Bragg system of care, so the study does not really reveal what failed or what needs to be changed. 

Weisz, J. R., Han, S. S., & Valeri, S. M. (1997). More of what? Issues raised by the Fort Bragg study.

Stressing the (other) three Rs in the search for empirically supported treatments: Review procedures, research quality, relevance to practice and the public interest

The Society of Clinical Psychology's task forces on psychological intervention developed criteria for evaluating clinical trials, applied those criteria, and generated lists of empirically supported treatments. Building on this strong base, the task force's successor, the Committee on Science and Practice, now pursues a three-part agenda.

Weisz, J. R., Hawley, K. M., Pilkonis, P. A., Woody, S. R., & Follette, W. C. (2000). Stressing the (other) three Rs in the search for empirically supported treatments: Review procedures, research quality, relevance to practice and the public interest. Clinical Psychology: Science and Practice, 7(3), 243-258.

Randomized Trials and Quasi-Experiments in Education Research
This paper examines the benefits and challenges inherent in using randomized clinical trials and quasi-experimental designs in the field of education research.
Angrist, J. D. (2003). Randomized trials and quasi-experiments in education research. NBER Reporter Online, (Summer 2003), 11-14.
The Core Analytics of Randomized Experiments for Social Research
This paper examines the elements of randomized experiments for social research.
Bloom, H. S. (2006). The core analytics of randomized experiments for social research.
Randomized, Controlled Trials, Observational Studies, and the Hierarchy of Research Designs
A study comparing the results of randomized controlled trials with those of observational studies.
Concato, J., Shah, N., & Horwitz, R. I. (2000). Randomized, controlled trials, observational studies, and the hierarchy of research designs. New England Journal of Medicine, 342(25), 1887-1892.
Can Randomized Trials Answer the Question of What Works?
This article discusses the use of randomized controlled trials as required by the Department of Education in evaluating the effectiveness of educational practices.
EDUC, A. R. O. (2005). Can randomized trials answer the question of what works?
New Federal Policy Favors Randomized Trials in Education Research
This is an article from the Chronicle of Higher Education discussing the pros and cons of randomized controlled trials in education.
Glenn, D. (2005). New federal policy favors randomized trials in education research. The Chronicle of Higher Education. Retrieved March 25, 2005.
Implementing Randomized Field Trials in Education: Report of a Workshop
This book examines the use of randomized controlled trial (RCT) studies in education.
Hilton, M., & Towne, L. (Eds.). (2004). Implementing Randomized Field Trials in Education: Report of a Workshop. National Academies Press.
Single-Case Research: Documenting Evidence-based Practice
Making the case that single-subject design can and should be accepted as an alternative to randomized controlled trials in determining efficacy of practices.
Horner, R. (n.d.). Single-case research: Documenting evidence-based practice. University of Oregon.
Why Most Published Research Findings Are False
This essay discusses issues and concerns that too many research findings may be false. The paper examines reasons a study may prove inaccurate, including study power and bias, the number of other studies on the same question, and the ratio of true to no relationships. Finally, it considers the implications these problems create for conducting and interpreting research.
Ioannidis, J. P. (2005). Why most published research findings are false. PLoS medicine, 2(8), e124.
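Ioannidis's argument reduces to a simple formula: the post-study probability that a significant finding is true (the PPV) as a function of the pre-study odds R, power, and alpha. A minimal version, omitting the paper's bias term, is sketched below with invented inputs.

```python
# PPV of a "significant" finding, no-bias case, following the logic of
# Ioannidis (2005): true positives scale with power * R, false
# positives with alpha. Example inputs are invented.
def ppv(R, power, alpha=0.05):
    """Post-study probability that a significant finding is true."""
    return power * R / (power * R + alpha)

# A long-shot field (pre-study odds of 1 true per 10 tested), 80% power:
print(ppv(R=0.1, power=0.8))  # ~0.62
# The same field with underpowered studies (20% power):
print(ppv(R=0.1, power=0.2))  # ~0.29: most "findings" would be false
```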
Single-Case Research Designs: Methods for Clinical and Applied Settings
This book provides a comprehensive treatment of single-case experimental designs and their use in clinical and applied settings.
Kazdin, A. E. (2011). Single-case research designs: Methods for clinical and applied settings. Oxford University Press.
Single-Case Designs for Educational Research
This book provides a thorough summary of information about the use of single-subject experimental designs in educational research.
Kennedy, C. H. (2005). Single-case designs for educational research. Boston, MA: Pearson/Allyn & Bacon.
Evaluating Teacher Preparation Programs Using the Performance of their Graduates
This commentary addresses concerns about the use of value-added outcome measures commonly used to evaluate teachers and the implications of using these metrics to assess the effectiveness of preparation programs.
Koedel, C., & Parsons, E. (2014). Evaluating teacher preparation programs using the performance of their graduates. Teachers College Record. Retrieved November 18, 2014, from http://www.tcrecord.org/Content.asp?ContentID=17741
Single-case research design and analysis: New directions for psychology and education.
This book provides a thorough summary of information about the use of single-subject experimental designs.
Kratochwill, T. R., & Levin, J. R. (1992). Single-case research design and analysis: New directions for psychology and education. Lawrence Erlbaum Associates, Inc.
Single-Case Design Technical Documentation
Single-case design has made important contributions to identifying effective educational practices. Until recently, there have been no standards for evaluating the quality of studies across a topic area. These standards were developed for the Institute of Education Sciences.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M. & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved from What Works Clearinghouse website: http://ies.ed.gov/ncee/wwc/pdf/wwc_scd.pdf.
What Works Clearinghouse: Single-Case Design Technical Documentation
This paper by a What Works Clearinghouse panel provides an overview of single-case designs (SCDs), specifies the types of questions that SCDs are designed to answer, and discusses the internal validity of SCDs. The panel then proposes standards to be implemented by the WWC.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. What Works Clearinghouse.
Comparing Results of Systematic Reviews: Parallel Reviews of Research on Repeated Reading.
This paper demonstrates that different well-accepted methods for reviewing research on repeated readings produce different results.
O’Keeffe, B. V., Slocum, T. A., Burlingame, C., Snyder, K., & Bundock, K. (2012). Comparing results of systematic reviews: Parallel reviews of research on repeated reading. Education and Treatment of Children, 35(2), 333-366.
Combining estimates of effect size
This chapter provides an in-depth examination of methods for combining effect size estimates in a literature synthesis, along with useful advice for interpreting the results of a meta-analysis.
Shadish, W. R., & Haddock, C. K. (2009). Combining estimates of effect size. In H. Cooper, L. V. Hedges, & J. C. Valentine (Eds.), The handbook of research synthesis and meta-analysis (2nd ed., pp. 257-277). New York, NY: Russell Sage Foundation.
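The core operation the chapter covers can be shown in a few lines: a fixed-effect, inverse-variance-weighted combination of study effect sizes. The three effects and variances below are invented.

```python
# Fixed-effect meta-analysis: weight each effect by the inverse of its
# variance (w_i = 1/v_i); the pooled variance is 1/sum(w_i).
def combine_fixed_effect(effects, variances):
    """Inverse-variance-weighted mean effect and its variance."""
    weights = [1 / v for v in variances]
    mean = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    return mean, 1 / sum(weights)

mean_d, var_d = combine_fixed_effect([0.30, 0.45, 0.20], [0.04, 0.09, 0.02])
print(round(mean_d, 3), round(var_d ** 0.5, 3))  # pooled d (~0.261), SE (~0.108)
```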
The state of the science in the meta-analysis of single-case experimental designs
This is a review of the issues and methods for conducting a meta-analysis of single-case design research studies.
Shadish, W. R., Rindskopf, D. M. & Hedges, L. V. (2008). The state of the science in the meta-analysis of single-case experimental designs. Evidence-Based Communication Assessment and Intervention, 2(3), 188-196. doi:10.1080/17489530802581603
Evaluating the validity of systematic reviews to identify empirically supported treatments
Systematic reviews are a process for assessing the quality of the literature to determine whether a particular practice has met the criteria to be considered empirically supported. As with any assessment process, there are issues of validity. The concepts and methodological tools of measurement validity can be applied to systematic reviews to identify their strengths and weaknesses.
Slocum, T. A., Detrich, R., & Spencer, T. D. (2012). Evaluating the validity of systematic reviews to identify empirically supported treatments. Education and Treatment of Children, 35(2), 201-233.
TITLE
SYNOPSIS
Contemporary Clinical Trials
Contemporary Clinical Trials is an international journal that publishes manuscripts pertaining to the design, methods and operational aspects of clinical trials.
Logical Positivism
An overview of Logical Positivism and its impact on science and the issue of verifiability.
Spurious Correlations
An important rule of research is that correlation does not equal causation. Just because two events track each other over time does not mean that one caused the other. This website mines data and uses humor to make the point that such correlations are often "spurious correlations."
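The site's point is easy to reproduce: two series that merely share a time trend will correlate strongly with no causal link between them. The series below are pure inventions.

```python
# Two unrelated upward-trending series correlate strongly anyway:
# a toy demonstration that correlation does not equal causation.
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(20)                                # e.g., 20 years
series_a = 1.0 * t + rng.normal(0, 1.5, t.size)  # trend + noise
series_b = 3.0 * t + rng.normal(0, 4.0, t.size)  # unrelated trend + noise

print(np.corrcoef(series_a, series_b)[0, 1])  # ~0.9+, yet no causation
```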
What Works Clearinghouse (WWC)

The goal of the WWC is to be a resource for informed education decision making. The WWC identifies evidence-based practices, programs, and policies, and disseminates summary information on the WWC website.
