IES Blog

Institute of Education Sciences

Measuring the Homelife of Students During the Pandemic

As part of the IES-funded Prekindergarten Through Grade 12 Recovery Research Network, the Georgia Policy Labs has been working to gauge the effects of economic insecurity and health stressors on student engagement and achievement during and after COVID-19 era remote learning. In this guest blog, Dr. Tim Sass, principal investigator of the IES-funded project and Distinguished Professor at Georgia State University, discusses a challenge his research team faced while linking home environment data to student academic outcomes, highlighting an obstacle to collecting home background survey data from parents/legal guardians. Dr. Sass shares his team’s practical solution to this challenge.

The Challenge: Difficulty Collecting Data about Student Home Situation

A major challenge to studying out-of-school factors that contribute to student academic success is the lack of information about a student’s home situation. Standard administrative data barely scratch the surface, providing information on language spoken at home, eligibility for subsidized meals (an admittedly crude measure of poverty), and little else. This lack of information became even more problematic during the COVID-19 pandemic, when students were learning from home, and the pandemic had severe effects on many families’ health and economic well-being.

As part of our project, we focus on identifying the factors associated with student engagement and achievement growth in remote instruction. Our initial strategy was to gather survey data in three of our partner school districts in metro Atlanta. We created a questionnaire to measure student engagement as well as collect information on economic insecurity, health stressors, and protective factors like adult monitoring of student learning at home or hiring a tutor. However, we faced two major challenges that made it difficult for us to collect the information we needed.

The first challenge was creating a process for identifying respondents. Our partners agreed to send out the survey on our behalf, using their established systems for communicating with parents/guardians. Our intent was to make the responses identifiable so we could directly link information gathered from the survey to outcomes for specific students. While one of our partner districts was not comfortable with identifying individual respondents, it agreed to identify respondents’ school of enrollment. A second district agreed to identify respondents, but due to miscommunication within the district, their survey team made the survey anonymous. Finally, the third district agreed to allow linking individual responses to student ID numbers but left it up to parents/guardians to identify their student in the survey, and only about half of respondents filled in their student’s ID number.

The second challenge was the very low response rates: 192 respondents from District A (0.4% response rate), 1,171 respondents from District B (1.2% response rate), and 80 respondents from District C (0.1% response rate). While disappointing, the low response rates are not unique to our study. Other researchers have struggled to get parents/guardians to respond to surveys conducted during the pandemic or shortly after the resumption of in-person instruction.

The Solution: Using Non-School-Based Administrative Data from Multiple Agencies to Complement Survey Data

Given the low response rates and non-identifiable responses, we considered how we could use additional non-school-based administrative data to complement the survey evidence. Through a partnership with researchers from the Atlanta Regional Commission, the Federal Reserve Bank of Atlanta, and the Georgia Institute of Technology, we obtained data from court records on evictions in the Atlanta Metro Area. The data cover four of the five “core” counties in the Atlanta Metro Area, spanning calendar years 2019-21. Over this period, there were approximately 300,000 unique eviction filings across the four-county area, including both households with and without school-aged children. The court records contain a property address and filing date, which were used to match yearly eviction filings to students based on district student address files.

The following table provides counts of students whom we successfully linked to eviction filings, by district and year. “Experienced Eviction” refers to cases where we directly matched an individual dwelling unit to an eviction filing. In many cases, however, the street address in the eviction filing is for a multi-family structure, and there is not enough information in the filing or in the student address files to directly match a student living in the complex with the unit named in the filing. When this type of street-level match occurs, we designate it as “Exposed to Eviction.” The “Exposed to Eviction” counts include the instances of “Experienced Eviction.” The “Eviction Rate” is the ratio of “Experienced Eviction” to the total number of students in the district.

 

| Year | District | Experienced Eviction | Exposed to Eviction | Eviction Rate |
|------|----------|----------------------|---------------------|---------------|
| 2019 | A | 2,608 | 12,555 | 0.043 |
| 2019 | B | 3,249 | 22,520 | 0.030 |
| 2019 | C | 32 | 321 | <0.001 |
| 2020 | A | 3,412 | 13,408 | 0.057 |
| 2020 | B | 4,467 | 25,503 | 0.042 |
| 2020 | C | 1,246 | 10,520 | 0.013 |
| 2021 | A | 2,251 | 10,789 | 0.041 |
| 2021 | B | 2,929 | 19,514 | 0.029 |
| 2021 | C | 2,323 | 9,842 | 0.024 |

 

While an eviction filing does not mean that a family was actually removed from their dwelling, it does indicate that they were at risk of losing their home. Moving forward, we plan to use the eviction filing information as a measure of housing insecurity and economic stress. We will incorporate this metric when estimating models of student achievement growth during the pandemic, as well as models of absenteeism and student behavior after students returned to in-person learning. This will give us a sense of the degree to which external factors affected student performance in remote learning, as well as the influence of housing and economic insecurity on achievement, engagement, and behavior once students returned to classrooms. Our findings will provide school districts and social-service providers with information on student exposure to economic stress, helping ensure that in-school supports and "wraparound" services are offered to the students who need them most.
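The street- versus unit-level address matching described above can be sketched in a few lines of Python. This is an illustrative sketch only: the field names, normalization rules, and sample records are hypothetical, not the research team's actual pipeline, which must handle far messier real-world addresses.

```python
# Illustrative sketch of matching eviction filings to student addresses.
# Field names, normalization rules, and sample records are hypothetical.

def normalize(addr):
    """Crude normalization: uppercase, drop commas/periods, collapse whitespace."""
    return " ".join(addr.upper().replace(",", " ").replace(".", " ").split())

def street_part(addr):
    """Strip a trailing unit designator (e.g., 'APT 4B') to get the street-level address."""
    tokens = normalize(addr).split()
    for marker in ("APT", "UNIT", "STE"):
        if marker in tokens:
            return " ".join(tokens[:tokens.index(marker)])
    return " ".join(tokens)

def classify_students(students, filings, enrollment):
    """Count 'experienced' (exact dwelling-unit match) and 'exposed'
    (street-level match; includes experienced) students, plus the
    eviction rate = experienced / total district enrollment."""
    exact_addresses = {normalize(f["address"]) for f in filings}
    street_addresses = {street_part(f["address"]) for f in filings}
    experienced = sum(1 for s in students if normalize(s["address"]) in exact_addresses)
    exposed = sum(1 for s in students if street_part(s["address"]) in street_addresses)
    return {"experienced": experienced, "exposed": exposed, "rate": experienced / enrollment}

students = [{"id": 1, "address": "10 Main St, Apt 4B"},
            {"id": 2, "address": "10 Main St, Apt 2A"},  # same complex, different unit
            {"id": 3, "address": "55 Oak Ave"}]
filings = [{"address": "10 Main St Apt 4B"}, {"address": "99 Elm Rd"}]
print(classify_students(students, filings, enrollment=100))
```

In this toy example, student 1 matches a filing exactly ("Experienced Eviction"), while student 2 lives in the same multi-family structure and is therefore only "Exposed to Eviction."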


This blog was produced by Haigen Huang (Haigen.Huang@ed.gov), program officer at NCER.

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools in recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs that are making a positive impact for students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to the research, members of the team at NCDPI developed a toolkit to help local leaders make decisions about what programs to continue or discontinue in the face of the upcoming expiration of federal funding to help schools with learning recovery post-pandemic. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making based on unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment (ROI) Framework and Calculator that draws on detailed data on ESSER expenditures as well as the impacts of various investments on student outcomes. By using this toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that ensure effective programs continue regardless of funding.
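To give a concrete, deliberately simplified flavor of the kind of comparison an ROI calculator supports, the sketch below ranks programs by effect-size units gained per $1,000 spent per student. The metric, program names, and figures are invented for illustration; they are not NCDPI's actual formula or data.

```python
# Simplified illustration of a cost-effectiveness ("ROI") comparison.
# Metric, program names, and figures are hypothetical.

def cost_effectiveness(effect_size, total_cost, n_students):
    """Effect-size units gained per $1,000 spent per student."""
    cost_per_student = total_cost / n_students
    return effect_size / (cost_per_student / 1000)

programs = {
    "high-dosage tutoring": {"effect_size": 0.25, "total_cost": 600_000, "n_students": 500},
    "summer program":       {"effect_size": 0.05, "total_cost": 300_000, "n_students": 400},
}

# Rank programs so leaders can see which investments deliver the most per dollar.
ranked = sorted(programs, key=lambda name: cost_effectiveness(**programs[name]), reverse=True)
for name in ranked:
    print(f"{name}: {cost_effectiveness(**programs[name]):.3f} SD per $1,000 per student")
```

A real calculator would also account for implementation quality, ingredient-level costs, and uncertainty in the effect estimates, which is where expert collaborators add value.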

Knowing the far-reaching implications of this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. Additionally, they consulted leaders, including superintendents and chief financial officers of North Carolina school districts and charter schools, in the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure its validity.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR Team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the “ESSER puzzle” to allow them to work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic. 

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR Team organized a comprehensive rollout plan, which began with a series of introductory webinars that provided an overview of the toolkit and its components. These webinars were followed by nine in-person sessions held across the eight state board of education regions in North Carolina, which over 400 leaders attended. Building upon the initial learning from the informational webinars, the in-person sessions featured interactive presentations that allowed district teams to practice using the tool with simulated data as well as their own. By the end of each session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding the use of the toolkit and working with district and charter schools to apply the ROI framework to help districts make evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how they applied the ROI framework to their afterschool tutoring programs. Other districts have shared how they plan to use the ROI framework and funding cliff toolkit to guide conversations with principals who receive Title I funds in their schools to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we also hope that this supportive approach can be replicated in other SEAs across the nation. We are honored to have our toolkit featured in the National Comprehensive Center’s upcoming Communities of Practice (CoP) Strategic Planning for Continued Recovery (SPCR) and believe that cross-SEA collaboration in this CoP will improve the usefulness of the toolkit. 


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

Celebrating the ECLS-K:2024: Providing Key National Data on Our Country’s Youngest Learners

It’s time to celebrate!

This spring, the Early Childhood Longitudinal Study, Kindergarten Class of 2023–24 (ECLS-K:2024) is wrapping up its first school year of data collection with tens of thousands of children in hundreds of schools across the nation. You may not know this, but NCES is congressionally mandated to collect data on early childhood. We meet that charge by conducting ECLS program studies like the ECLS-K:2024 that follow children through the early elementary grades. Earlier studies looked at children in the kindergarten classes of 1998–99 and 2010–11. We also conducted a study, the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), that followed children from birth through kindergarten entry.

As the newest ECLS program study, the ECLS-K:2024 will collect data from both students and adults in these students’ lives (e.g., parents, teachers, school administrators) to help us better understand how different factors at home and at school relate to children’s development and learning. In fact, the ECLS-K:2024 allows us to provide data not only on the children in the cohort but also on kindergarten teachers and the schools that educate kindergartners.

What we at NCES think is worthy of celebrating is that the ECLS-K:2024—like other ECLS program studies—

  • provides the statistics policymakers need to make data-driven decisions to improve education for all;
  • contributes data that researchers need to answer today’s most pressing questions related to early childhood and early childhood education; and
  • allows us to produce resources for parents, families, teachers, and schools to better inform the public at large about children’s education and development.

Although smaller-scale studies can answer numerous questions about education and development, the ECLS-K:2024 allows us to provide answers at a national level. For example, you may know that children arrive at kindergarten with different skills and abilities, but have you ever wondered how those skills and abilities vary for children who come from different areas of the country? How they vary for children who attended prekindergarten programs versus those who did not? How they vary for children who come from families of different income levels? The national data from the ECLS-K:2024 allow us to dive into these—and other—issues.

The ECLS-K:2024 is unique in that it’s the first of our early childhood studies to provide data on a cohort of students who experienced the coronavirus pandemic. How did the pandemic affect these children’s early development, and how did it change the schooling they received? By comparing the experiences of the ECLS-K:2024 cohort with those of children who were in kindergarten nearly 15 and 25 years ago, we’ll be able to answer these questions.

What’s more, the ECLS-K:2024 will provide information on a variety of topics not fully examined in previous national early childhood studies. The study is including new items on families’ kindergarten selection and choice; availability and use of home computers and other digital devices; parent-teacher association/organization contributions to classrooms; equitable school practices; and a myriad of other constructs.

Earlier ECLS program studies have had a huge impact on our understanding of child development and early education, with hundreds of research publications produced using their data (on topics such as academic skills and school performance; family activities that promote learning; and children’s socioemotional development, physical health, and well-being). ECLS data have also been referenced in media outlets and in federal and state congressional reports. With the launch of the ECLS-K:2024, we cannot wait to see the impact of research using the new data.

Want to learn more? 

Plus, be on the lookout late this spring for the next ECLS blog post celebrating the ECLS-K:2024, which will highlight children in the study. Future blog posts will focus on parents and families and on teachers and schools. Stay tuned!

 

By Jill McCarroll and Korrie Johnson, NCES

Innovating Math Education: Highlights from IES Learning Acceleration Challenges

[Photo: A teacher and students work on math problems on a whiteboard.]

The Institute of Education Sciences (IES) held two Learning Acceleration Challenges during the 2022–23 school year, designed to incentivize innovation in math and science. According to the most recent data from the National Assessment of Educational Progress, an unprecedented number of students are performing below grade level in core academic subjects. In response, the Math Prize sought school-based, digital interventions to significantly improve math outcomes, specifically in fractions, for upper elementary school students with or at risk for a disability that affects math performance. To win the grand prize, interventions had to reach an effect size equal to or exceeding 0.77 on a broad measure of math achievement, the NWEA® MAP™ Growth math assessment. The challenge included two phases: In Phase 1, intervention providers submitted information on their interventions and research plans for implementing and testing them under routine conditions. In Phase 2, selected research teams (finalists) were given $25,000 to implement and test their interventions, with a shot at receiving the grand prize.

Four submissions were scored by a panel of judges during Phase 1. Two teams were selected to proceed to Phase 2 of the challenge and implement their interventions in schools: the DRUM (Digital Rational Number) Intervention and ExploreLearning’s Reflex + Frax intervention. These two interventions were implemented in schools between November 2022 and April 2023, and participating students completed the NWEA MAP Growth math assessment before and after implementation. At the completion of Phase 2, the judging panel scored the Phase 2 submissions against a rigorous set of criteria that included impact (as evaluated by a randomized controlled trial), cost effectiveness, scalability, and sustainability. Based on the scores received by the finalists, the panel did not recommend awarding any Phase 2 Prizes.

We recognize this challenge was an ambitious and rapid effort to improve math achievement, and with the knowledge gained from it, we hope to continue designing opportunities that encourage transformative, innovative change within education. While disappointing, the results shed light on some of the challenges of targeting ambitious improvements in student math achievement:

  • The implementation hurdles experienced by both teams reinforce the difficulties of conducting research in schools, especially in the current post-pandemic climate. Many schools face extra strains that may make it challenging to implement new interventions, as is required during an RCT.
  • It has historically been, and continues to be, difficult to create accelerated growth in math achievement for students with or at risk for disabilities that affect math performance. An improvement in line with the challenge’s 0.77 effect size criterion for the grand prize would substantially lessen the average achievement gap between students with disabilities and their nondisabled peers—and would be no small feat!
  • Barriers still exist to implementation of a technology-based intervention. For intervention developers, the cost and time required to create a digital intervention can be very large. For schools, the necessary infrastructure and acceptance of digital interventions is not always present.
  • Researching interventions within schools takes a lot of time and resources. Sometimes getting answers to our most pressing educational problems takes time, despite the best efforts of those involved to accelerate this process. The results of this competition underscore the continued need for research to support the significant difficulties of this population of learners.
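For readers unfamiliar with effect sizes, the challenge's 0.77 criterion is a standardized mean difference between treatment and control groups. The sketch below shows a textbook Cohen's d computation with a pooled standard deviation; the scores are invented for illustration, and the actual challenge evaluation used NWEA MAP Growth scores from a randomized controlled trial.

```python
import statistics

# Textbook standardized mean difference (Cohen's d) with a pooled standard
# deviation. The scores below are hypothetical.

def cohens_d(treatment, control):
    """(mean_t - mean_c) / pooled sample standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = (((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

treatment_scores = [212, 215, 220, 224, 230]  # hypothetical post-test scores
control_scores = [205, 208, 212, 214, 221]
d = cohens_d(treatment_scores, control_scores)
print(f"d = {d:.2f}; meets the 0.77 grand-prize criterion: {d >= 0.77}")
```

For reference, 0.77 standard deviations is several times larger than the effects typically reported for successful school-based math interventions, which underscores how ambitious the criterion was.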

Thank you to all who participated. We would also like to thank Luminary Labs, the contractor providing support for the IES Learning Acceleration Challenges, and the two strong partners they included in the work: NWEA and Abt Associates. We appreciate NWEA’s support in conducting the evaluation of the interventions’ effects on the MAP Growth assessment and Abt Associates’ technical assistance during the Phase 2 implementation. We also appreciate all their work to collect and summarize data to understand what we can learn from the challenges, along with recommendations from other open innovation initiatives, to inform similar future work at IES.

If you have an intervention or an idea for an intervention that could accelerate math achievement for students with or at risk for disabilities, you are encouraged to learn more about additional funding opportunities at IES, and contact Sarah Brasiel, program officer for NCSER’s STEM topic area.

This blog was written by Britta Bresina, NCSER program officer.

The 2023 IES PI Meeting: Building on 20 Years of IES Research to Accelerate the Education Sciences

On May 16-18, 2023, NCER and NCSER hosted our second virtual Principal Investigators (PI) Meeting. Our theme this year was Building on 20 Years of IES Research to Accelerate the Education Sciences. Because IES marked its 20th anniversary this past year, we used the meeting as an opportunity to reflect on and celebrate the successes of IES and the education research community. Another goal was to explore how IES can further advance the education sciences and improve education outcomes for all learners.

Roddy Theobald (American Institutes for Research) and Eunsoo Cho (Michigan State University) graciously agreed to be our co-chairs this year. They provided guidance on the meeting theme and session strands and also facilitated our plenary sessions on Improving Data on Teachers and Staffing Challenges to Inform the Next 20 Years of Teacher Workforce Policy and Research and the Disproportionate Impact of COVID-19 on Student Learning and Contributions of Education Sciences to Pandemic Recovery Efforts. We want to thank them for their incredible efforts in making this year’s meeting a big success!

Here are a few highlights:

The meeting kicked off with opening remarks from IES Director Mark Schneider and a welcome from Secretary of Education Miguel Cardona. Director Schneider spoke about the importance of timely research and the translation of evidence to practice. IES is thinking about how best to support innovative approaches to education research that are transformative, embrace failure, turn around quickly, and have an applied focus. He also discussed the need for data to move the field forward, specifically big data that researchers can use to address important policy questions and improve interventions and education outcomes. Secretary Cardona acknowledged the robust and useful evidence base that IES-funded researchers have generated over the last 20 years and emphasized the need for continued research to address historic inequities and accelerate pandemic recovery for students.

This year’s meeting fostered connections and facilitated deep conversations around meaningful and relevant topic areas. Across the three-day PI Meeting, we had over 1,000 attendees engaged in virtual room discussions around four main topic areas (see the agenda for a complete list of this year’s sessions):

  • Diversity, Equity, Inclusion, and Accessibility (DEIA)—Sessions addressed DEIA in education research
  • Recovering and Learning from the COVID-19 Pandemic—Sessions discussed accelerating pandemic recovery for students and educators, lessons learned from the pandemic, and opportunities to implement overdue changes to improve education
  • Innovative Approaches to Education Research—Sessions focused on innovative, forward-looking research ideas, approaches, and methods to improve education research in both the short- and long-term
  • Making Connections Across Disciplines and Communities—Sessions highlighted connections between research and practice communities and between researchers and projects across different disciplines and methodologies

We also had several sessions focused on providing information and opportunities to engage with IES leadership, including the NCER Commissioner’s Welcome; the NCSER Acting Commissioner’s Welcome; Open Science and IES; NCEE at 20: Past Successes and Future Directions; and The IES Scientific Review Process: Overview, Common Myths, and Feedback.

Many sessions also had a strong focus on increasing the practical impacts of education research by getting research into the hands of practitioners and policymakers. For example, the session on Beyond Academia: Navigating the Broader Research-Practice Pipeline highlighted the unique challenges of navigating the pipeline of information that flows between researchers and practitioners and identified strategies that researchers could implement in designing, producing, and publishing research-based products that are relevant to a broad audience. The LEARNing to Scale: A Networked Initiative to Prepare Evidence-Based Practices & Products for Scaling and The Road to Scale Up: From Idea to Intervention sessions centered around challenges and strategies for scaling education innovations from basic research ideas to applied and effective interventions. Finally, the Transforming Knowledge into Action: An Interactive Discussion focused on identifying and capturing ways to strengthen dissemination plans and increase the uptake of evidence-based resources and practices.

We ended the three-day meeting with trivia and a celebration. Who was the first Commissioner of NCSER? Which program officer started the same day the office closed because of the pandemic? Which program officer has dreams of opening a bakery? If you want to know the answers to these questions and more, we encourage you to look at the Concluding Remarks.  

Finally, although we weren’t in person this year, we learned from last year’s meeting that a real benefit of having a virtual PI meeting is our ability to record all the sessions and share them with the public. A part of IES’s mission is to widely disseminate IES-supported research. We encourage you to watch the recorded sessions and would be grateful if you shared them with your networks.

We want to thank the attendees who made this meeting so meaningful and engaging. This meeting would not have been a success without your contributions. We hope to see our grantees at the next PI Meeting, this time in-person!

If you have any comments, questions, or suggestions for how we can further advance the education sciences and improve education outcomes for all learners, please do not hesitate to contact NCER Commissioner Liz Albro (Elizabeth.Albro@ed.gov) or NCSER Acting Commissioner Jackie Buckley (Jacquelyn.Buckley@ed.gov). We look forward to hearing from you.