Categories for Societal Outcomes

Which strategies for increasing students’ opportunities to respond produce the best results?

August 15, 2019

Teacher-Delivered Strategies to Increase Students’ Opportunities to Respond: A Systematic Methodological Review. This systematic review of the literature examines the evidence behind teacher-delivered strategies to increase students’ opportunities to respond (OTR) during whole-group instruction. The results indicate that the OTR strategy of response cards in K-12 school settings is a potentially evidence-based practice according to the Council for Exceptional Children’s Standards for Evidence-Based Practices in Special Education.

Citation: Common, E. A., Lane, K. L., Cantwell, E. D., Brunsting, N. C., Oakes, W. P., Germer, K. A., & Bross, L. A. (2019). Teacher-delivered strategies to increase students’ opportunities to respond: A systematic methodological review. Behavioral Disorders, 0198742919828310.

Link: https://journals.sagepub.com/doi/abs/10.1177/0198742919828310

 


 

Which teaching practices produce the best elementary school writers?

August 15, 2019

Teaching Elementary School Students to Be Effective Writers: A Practice Guide. This What Works Clearinghouse (WWC) practice guide examines the research on teaching elementary students to write. The report analyzes the evidence supporting the teaching methods commonly employed to help students become fluent writers. The guide is for teachers, literacy coaches, principals, districts, curriculum developers, and other educators. It summarizes the available research and recommends activities and strategies teachers can use to increase student writing proficiency.

Citation: Graham, S., Bollinger, A., Olson, C. B., D’Aoust, C., MacArthur, C., McCutchen, D., & Olinghouse, N. (2012). Teaching Elementary School Students to Be Effective Writers: A Practice Guide. NCEE 2012-4058. What Works Clearinghouse.

Link: https://files.eric.ed.gov/fulltext/ED533112.pdf

 


 

Do later school starting times offer a cost-effective method for improving student performance?

July 29, 2019

Answering the Bell: High School Start Times and Student Academic Outcomes. Research on health and sleep has encouraged educators and policymakers to consider delaying school start times as an intervention with the potential to improve achievement and other student outcomes. Studies of later start times have so far shown mixed results, although enough studies exist to suggest that moving back the start of the school day can help improve lagging student performance. This research finds that starting school later is associated with reduced suspensions and higher course grades, and suggests that disadvantaged students may especially benefit from delayed start times. Because much of the earlier research relied on small sample sizes, Bastian and Fuller use statewide student-level data from North Carolina to estimate start time effects for all students and for traditionally disadvantaged students. Statewide achievement results were mixed, with both positive and negative associations found between start times and high school students’ test scores. Bastian and Fuller call for further research to increase confidence that later start times predictably produce desired outcomes. Rigorous studies, using multiple populations and conducted across different settings, are needed to address remaining questions and possible unintended consequences of changing start times.

Citation: Bastian, K. C., & Fuller, S. C. (2018). Answering the Bell: High School Start Times and Student Academic Outcomes. AERA Open, 4(4), 2332858418812424.

Link: https://journals.sagepub.com/doi/pdf/10.1177/2332858418812424

 


 

How effective are attention deficit hyperactivity disorder (ADHD) interventions?

July 29, 2019

Attention Deficit Hyperactivity Disorders and Classroom-Based Interventions: Evidence-Based Status, Effectiveness, and Moderators of Effects in Single-Case Design Research. Students with attention deficit hyperactivity disorder (ADHD) often struggle academically and behaviorally, and the issues associated with ADHD challenge teachers trying to meet these students’ needs. This meta-analysis of single-case design studies evaluates intervention effectiveness, evidence-based status, and moderators of effects for four intervention types (behavioral, instructional, self-management, and environmental) when implemented with students with ADHD in classroom settings. The study suggests that interventions targeting academic learning strategies and behavioral challenges produced medium effect sizes. The instructional and self-management interventions examined were deemed evidence-based by What Works Clearinghouse standards.

Citation: Harrison, J. R., Soares, D. A., Rudzinski, S., & Johnson, R. (2019). Attention Deficit Hyperactivity Disorders and Classroom-Based Interventions: Evidence-Based Status, Effectiveness, and Moderators of Effects in Single-Case Design Research. Review of Educational Research, 0034654319857038.

Link: https://journals.sagepub.com/doi/full/10.3102/0034654319857038

 


 

How can peers increase prosocial behavior in a high school classroom?

July 23, 2019

Tootling with a Randomized Independent Group Contingency to Improve High School Class-wide Behavior. Finding strategies and interventions that positively reinforce students for appropriate behavior while decreasing disruptive behavior is core to effective classroom management. This paper examines the practice of “tootling,” a peer-mediated classroom management practice in which students identify and report on peer prosocial behavior. Students are taught to watch for peer behavior that meets the criterion for reinforcement. When they witness prosocial behavior, they write it down on a piece of paper and turn it in to the teacher. At the end of class, three “tootles” are drawn from the lot and read aloud to the class. The results suggest that peer reinforcement increased appropriate student behavior and engagement while reducing disruptive conduct.

Citation: Lum, J. D., Radley, K. C., Tingstrom, D. H., Dufrene, B. A., Olmi, D. J., & Wright, S. J. (2019). Tootling With a Randomized Independent Group Contingency to Improve High School Classwide Behavior. Journal of Positive Behavior Interventions, 21(2), 93-105.

Link: https://journals.sagepub.com/doi/full/10.1177/1098300718792663

 


 

Does research on gaps in student learning caused by summer breaks hold up?

July 23, 2019

Do Test Score Gaps Grow Before, During, or Between the School Years? Measurement Artifacts and What We Can Know in Spite of Them. Gaps in achievement for students of lower socioeconomic status (SES) and students of color continue to concern educators and the public. One of the more influential studies to examine this issue was the Beginning School Study (BSS) of students in Baltimore City Public Schools in 1982 (Alexander and Entwisle, 2003). The authors found that an achievement gap already exists when students enter elementary school. More importantly, they concluded that the discrepancy in performance widened after each summer break, tripling in size by the end of middle school.

A more recent study published in 2019 by von Hippel and Hamrock offers evidence to counter the Alexander and Entwisle (2003) claims, suggesting that the growing gap is an artifact of the testing and measurement methods used in the earlier research. Von Hippel and Hamrock conclude that the scaling method, Thurstone scaling (frequently used in the 1960s and 1970s), is flawed and is responsible for the original findings. Thurstone scaling has since been replaced in research by more effective methods such as item response theory (IRT). When the data from the study were reanalyzed using IRT, the gaps shrank. The new study concludes that gaps are already significant by the time children start school and remain relatively stable until graduation.

The von Hippel and Hamrock research looked at test score gaps for a range of populations: between boys and girls; between black, white, and Hispanic children; between children whose mothers had different levels of education; between children in poor and nonpoor families; and between high-poverty and low-poverty schools. The researchers wanted to know whether gaps grow faster during summer or during the school year, but the results were inconclusive. Von Hippel and Hamrock did find, however, that the growth in the gap from kindergarten to eighth grade is substantially smaller than the gap that already exists when children enter school.

Von Hippel and Hamrock highlight two measurement artifacts that skewed Alexander and Entwisle’s results: test score scaling and changes of test content. Scaling is a mathematical method that transforms right and wrong answers into a test score, and not all scales produce the same results, which has important implications for whether and when score gaps appear. Along with concluding that the gap between SES populations tripled between first and eighth grade, Alexander and Entwisle found that summer vacations were when the real gap increased each year. Von Hippel and Hamrock note that the BSS used CAT Form C, a “fixed-form” paper test: in first grade, all BSS children took a test containing a fixed, unvarying set of questions in both fall and spring. This makes sense when you want to know whether students are meeting learning expectations within a specific grade.

But Alexander and Entwisle wanted to understand the impact of summer breaks on learning, not learning during a school year. To obtain this information, they compared a first-grade test taken at the end of the school year to a second-grade test given the following fall. Switching from the spring first-grade test to the fall second-grade test to measure the impact of summer break confounds the summer learning results; von Hippel and Hamrock propose that changing the test form may have distorted the findings. Theirs was not the only seasonal learning study to use fixed forms that changed after the summer: fixed-form tests were common practice in research from the 1960s into the 1990s. Von Hippel and Hamrock’s study suggests the summer learning literature as a whole was potentially vulnerable to artifacts related to scaling and changes of test form.

Fixed-form tests have since been replaced by adaptive tests, which are less vulnerable to artifacts that might affect summer learning estimates. Adaptive tests do not ask the same questions of all students; instead, they measure ability by adjusting the difficulty of the questions based on the student’s earlier responses. This makes adaptive tests a better tool for gauging the impact of summer on student achievement.
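To see why IRT scoring lets researchers compare students who took different test forms (for example, a spring first-grade form and a fall second-grade form), consider a minimal sketch of the simplest IRT variant, the one-parameter Rasch model. This is purely illustrative: the item difficulties and grid-search estimator below are invented for the example and are not the method, data, or software used by von Hippel and Hamrock.

```python
import math

def rasch_ability(responses, difficulties):
    """Estimate a student's ability (theta) under the one-parameter Rasch
    IRT model, where P(correct) = 1 / (1 + exp(-(theta - difficulty))).
    Uses a simple grid search for the maximum-likelihood theta.
    responses: list of 0/1 item scores; difficulties: matching item difficulties."""
    def log_likelihood(theta):
        ll = 0.0
        for score, b in zip(responses, difficulties):
            p = 1.0 / (1.0 + math.exp(-(theta - b)))  # probability of a correct answer
            ll += math.log(p) if score == 1 else math.log(1.0 - p)
        return ll
    grid = [i / 100.0 for i in range(-400, 401)]  # search theta in [-4, 4]
    return max(grid, key=log_likelihood)

# Two students each answer 3 of 4 items correctly, but on different forms.
# Percent correct treats them as identical; IRT does not, because it
# accounts for how hard the items were (difficulty values are made up).
easy_form = rasch_ability([1, 1, 1, 0], [-2.0, -1.5, -1.0, -0.5])
hard_form = rasch_ability([1, 1, 1, 0], [0.5, 1.0, 1.5, 2.0])
# hard_form > easy_form: same raw score, higher estimated ability,
# because the second student succeeded on more difficult items.
```

This is the core point of the passage above: a raw score of 75% means different things on an easy first-grade form and a harder second-grade form, so comparing raw or Thurstone-scaled scores across a form change can manufacture apparent summer gains or losses, while an ability scale like IRT's theta places both forms on a common metric.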

The von Hippel and Hamrock study concludes that gaps grow fastest in early childhood. They find no evidence of a gap doubling between first grade and eighth grade, and some disparities even shrank. The summer gap growth does not hold up when the flawed instrument is replaced with adaptive tests scored using IRT ability scales; where summer learning gaps are present, most are small and not easily detectable. The conclusion is that gaps emerge mostly in the first five years of life. Resources currently used to address a summer learning gap that does not appear to exist should be redirected toward early childhood education. Von Hippel and Hamrock’s study suggests that providing early remedial instruction to students who are behind their peers when they enter kindergarten is the most efficacious way to improve overall performance.

Citation: von Hippel, P. T., & Hamrock, C. (2019). Do test score gaps grow before, during, or between the school years? Measurement artifacts and what we can know in spite of them. Sociological Science, 6, 43-80.

Link: https://www.sociologicalscience.com/download/vol-6/january/SocSci_v6_43to80.pdf

 


 

States’ Identification of Low Performing Schools Under the Every Student Succeeds Act (ESSA)

May 31, 2019

The Number of Low-Performing Schools by State in Three Categories (CSI, TSI, and ATSI), School Year 2018-19. The Every Student Succeeds Act (ESSA) gives individual states significant flexibility in how they identify “low-performing schools.” This decision is extremely important because identifying a school as low performing triggers mandates for states and districts to invest resources to improve it. The more schools identified, the greater the obligations. ESSA identifies three categories of low-performing schools. From most intensive to least, they are: Comprehensive Support and Improvement (CSI) schools, Targeted Support and Improvement (TSI) schools, and Additional Targeted Support and Improvement (ATSI) schools.

Ideally, each state would apply consistent standards for identifying low-performing schools, but to date there is no formal system in place to monitor these new standards. This report, completed by the Center on Education Policy, provides an initial snapshot of the number and percentage of schools each state has identified as low performing. It has limitations: states are in the early stages of implementation and calibration, states offered varying degrees of cooperation, and some states had yet to complete implementation. Still, it offers an early look at a very diverse set of guidelines.

The following chart captures their results.  

Center on Education Policy (2019)

The data show a wide range of results in terms of the percentage of schools identified as low performing. The overall range is 3% to 99%, with individual states spread out fairly evenly in between. Eight states identified over 40% of their public schools as low performing, eleven states 20%–40%, fifteen states 11%–19%, and thirteen states 3%–10%. Even with the limitations listed above, these data suggest inconsistent standards across states.

Citation: Stark Rentner, D., Tanner, K., & Braun, M. (2019). The Number of Low-Performing Schools by State in Three Categories (CSI, TSI, and ATSI), School Year 2018-19. A Report of the Center on Education Policy.

Link: https://www.cep-dc.org/displayDocument.cfm?DocumentID=1504

 


 

U.S. College Graduation Rates

May 30, 2019

Persistence, Retention, and Attainment of 2011–12 First-Time Beginning Postsecondary Students as of Spring 2017: First Look. This First Look report provides findings from the 2012/17 Beginning Postsecondary Students Longitudinal Study, a national survey of undergraduate students entering postsecondary education for the first time. It follows a cohort of beginning students over a six-year period, examining persistence, retention, and attainment (degrees conferred).

The overall graduation rate for first-time, full-time undergraduate students who began seeking a bachelor’s degree at a 4-year degree-granting institution in fall 2012 was 60 percent. That is, by 2017 approximately 60 percent of students had completed a bachelor’s degree. The 6-year graduation rate was 59 percent at public institutions, 74 percent at private nonprofit institutions, and 14 percent at private for-profit institutions. Such low graduation rates are cause for concern for a host of reasons, not least because the 40 percent of students who did not graduate have acquired student debt that cannot be offset by the value of holding a degree.

The following data examine postsecondary graduation rates over the past seven years by institutional control: public, private nonprofit, and private for-profit.

Data from The Conditions of Education (2019, 2018, 2017, 2016, 2015, 2014, 2013)


Postsecondary graduation rates at public institutions have stayed virtually the same for seven years, with 57% of students graduating in 2011 and 59% in 2017. Private nonprofit institutions remained at 65-66% graduation for the first six years before increasing by 8 percentage points in 2017. Private for-profit institutions fared the worst, decreasing steadily from 42% in 2011 to 14% in 2017.

Citation: Chen, X., Elliott, B.G., Kinney, S.K., Cooney, D., Pretlow, J., Bryan, M., Wu, J., Ramirez, N.A., and Campbell, T. (2019). Persistence, Retention, and Attainment of 2011–12 First-Time Beginning Postsecondary Students as of Spring 2017 (First Look) (NCES 2019-401). U.S. Department of Education. Washington, DC: National Center for Education Statistics. 

Link: https://nces.ed.gov/pubs2019/2019401.pdf

 


 

Are current reading practices bridging the reading gap for students with disabilities?

May 30, 2019

Are Students with Disabilities Accessing the Curriculum? A Meta-analysis of the Reading Achievement Gap between Students with and without Disabilities. A critical goal of federal education policy is improving the participation of students with disabilities (SWD) in grade-level curriculum. This meta-analysis examines 23 studies of student access to the curriculum by assessing the gap in reading achievement between SWDs and their general education peers. The study finds that SWDs performed more than three years below their peers. It also discusses the implications for changing this picture and why current policies and practices are not achieving the desired results.

Citation: Gilmour, A. F., Fuchs, D., & Wehby, J. H. (2018). Are students with disabilities accessing the curriculum? A meta-analysis of the reading achievement gap between students with and without disabilities. Exceptional Children. Advanced online publication. doi:10.1177/0014402918795830

Link: https://journals.sagepub.com/doi/abs/10.1177/0014402918795830

Full text: https://www.researchgate.net/profile/Allison_Gilmour/publication/327653148_Are_Students_With_Disabilities_Accessing_the_Curriculum_A_Meta-Analysis_of_the_Reading_Achievement_Gap_Between_Students_With_and_Without_Disabilities/links/5b9bd83c299bf13e603155c5/Are-Students-With-Disabilities-Accessing-the-Curriculum-A-Meta-Analysis-of-the-Reading-Achievement-Gap-Between-Students-With-and-Without-Disabilities.pdf

 


 

What is the impact of classroom management for ethnically diverse student populations?

May 15, 2019

Classroom management for ethnic–racial minority students: A meta-analysis of single-case design studies. Demographic changes in the United States support the need to examine the impact of evidence-based classroom management interventions for students from ethnically and racially diverse backgrounds. Research consistently shows African American students receive harsher and more exclusionary discipline. Studies also reveal that African American, Latinx, and Native American children are subject to punitive consequences disproportionate to their numbers in the population. This meta-analysis of behavior management strategies includes 22 single-case design studies covering 838 students in K-12 classrooms. Half of the included studies met the What Works Clearinghouse design standards for single-case design methodology. The study finds the behavior management strategies are highly effective for improving student conduct; interventions that used an individual or group contingency demonstrated large effects and were the most common strategies used. The study also finds that few studies included diverse populations other than African American students. The authors conclude there is a need to increase the number of classroom management studies involving diverse student populations, and to improve the quality of the available studies on classroom management strategies.

Citation: Long, A. C. J., Miller, F. G., & Upright, J. J. (2019). Classroom management for ethnic–racial minority students: A meta-analysis of single-case design studies. School Psychology, 34(1), 1-13. http://dx.doi.org/10.1037/spq0000305

Link: https://psycnet.apa.org/record/2018-65318-001?_ga=2.74983442.1820096213.1557858091-255501757.1557858091