IES Blog

Institute of Education Sciences

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools in recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs that are making a positive impact for students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to the research, members of the team at NCDPI developed a toolkit to help local leaders make decisions about what programs to continue or discontinue in the face of the upcoming expiration of federal funding to help schools with learning recovery post-pandemic. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making based on unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment Framework and Calculator that uses detailed data on ESSER expenditures as well as the impacts on student outcomes of various investments. By using this toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that ensure effective programs continue regardless of funding.
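The post does not spell out the calculator's internal formula, but the kind of comparison an ROI framework supports can be sketched as a simple cost-effectiveness calculation. Everything below — program names, costs, enrollment figures, and effect sizes — is invented for illustration and is not drawn from the NCDPI tool:

```python
# Illustrative sketch only: the actual NCDPI ROI Calculator's formula is not
# described in this post. All program names and numbers are hypothetical.

def cost_effectiveness(total_cost, students_served, outcome_gain):
    """Outcome gain (e.g., test-score growth in SD units) per dollar spent per student."""
    cost_per_student = total_cost / students_served
    return outcome_gain / cost_per_student

programs = {
    "High-dosage tutoring": cost_effectiveness(1_200_000, 800, 0.12),
    "Summer learning camp": cost_effectiveness(600_000, 1_000, 0.04),
}

# Rank the hypothetical programs by outcome gain per dollar per student
for name, ratio in sorted(programs.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {ratio:.6f} SD per dollar per student")
```

A comparison like this is only a starting point; a real framework would also weigh implementation quality, local context, and the strength of the underlying evidence.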

Knowing the far-reaching implications for this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. Additionally, they consulted leaders including superintendents and chief financial officers of North Carolina school districts and charter schools in the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure the validity of the tool.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR Team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the “ESSER puzzle” to allow them to work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic. 

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR Team organized a comprehensive rollout plan, which began with a series of introductory webinars that provided an overview of the toolkit and its components. These webinars were followed by nine in-person sessions held across the eight state board of education regions in North Carolina, which over 400 leaders attended. Building upon the initial learning from informational webinars, in-person learning sessions featured interactive presentations that allowed district teams to practice using the tool with simulated data as well as their own. By the end of the session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding the use of the toolkit and working with districts and charter schools to apply the ROI framework to help districts make evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how they applied the ROI framework to their afterschool tutoring programs. Other districts have shared how they plan to use the ROI framework and funding cliff toolkit to guide conversations with principals who receive Title I funds in their schools to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we also hope that this supportive approach can be replicated in other state education agencies (SEAs) across the nation. We are honored to have our toolkit featured in the National Comprehensive Center’s upcoming Strategic Planning for Continued Recovery (SPCR) community of practice (CoP) and believe that cross-SEA collaboration in this CoP will improve the usefulness of the toolkit.


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

Introducing NCER’s Federation of American Scientists Fellows

We are excited to welcome Katherine McEldoon and Alexandra Resch, two Federation of American Scientists (FAS) Impact Fellows who joined the center in December 2023 to support the Accelerate, Transform, Scale (ATS) Initiative. The ATS Initiative supports advanced education research and development (R&D) to create scalable solutions to improve education outcomes for all learners and eliminate persistent achievement and attainment gaps.

Both of our FAS Fellows have experiences that reinforce the need to start with the science and to use the right methods at the right time to build solutions. They’ve observed that while researchers are great at producing insights about education and learning, and developers are great at building education solutions and technologies, the broader field isn’t yet great at doing the two together. Through their careers, they’ve come to see rigorous research and development happening together as the path forward to build effective, evidence-based solutions.

In this blog, Alex and Katherine share their career paths and explain how their unique experiences and perspectives are suited to help grow the ATS Initiative.

Alexandra Resch

I’ve always been driven by an urge to try to improve our education systems. I often felt bored in school and could see huge gaps in resources and opportunities among classrooms in my school and between my district and others nearby. I studied economics because the quantitative and analytic tools came naturally to me and because I could see the importance that incentives and resource constraints play in understanding how our systems work and how to improve them. I love the lens that economics provides to make sense of the world.

When I finished my PhD in 2008, I got my dream job as a researcher at Mathematica. Among other things, I worked on the What Works Clearinghouse, interesting methods papers, and national studies. I enjoyed these projects, but I started to worry that while I was doing great research, it wasn’t answering the questions practitioners had and wasn’t timely enough to inform their decisions. I gradually started shifting my work to be closer to decisions and decision makers, eventually building out a portfolio of work on rapid cycle evaluation and ways to be opportunistic about generating strong evidence. I also started thinking about how we talk about evidence and whether we’re framing questions and findings to privilege the status quo. I’ve come to believe the questions we ask, the methods we use, and how we describe our results all need to be different if we want to affect how the education system works and make a difference for student learning.

Over the last decade, I’ve developed expertise in R&D, learning about and applying tools and processes for human centered design, continuous improvement, product development, and product management. I haven’t put aside the tools I had from economics, but I have a bigger toolbox and am better able to use the right tool at the right time. I’ve seen progress in recent years in bringing more rigor to product development and more speed and agility to education research. I’m excited to support the work that the ATS initiative is doing to bring researchers and developers closer together into productive partnerships in the service of solving genuine problems for educators and students. 

Katherine McEldoon

Early in my career, I set connecting scientific insights and education practice as my north star, and I haven’t looked back since. I was intrigued by what cognitive sciences could unlock: clear explanatory mechanisms of certain behaviors and beliefs—empirically validated, no less! There were so many insights ripe for the classroom, but why weren’t they being used?

Through my doctoral work at Vanderbilt University and the IES-funded Experimental Education Research Training (ExpERT) program, I grounded myself in cognitive theories of learning and designed instruction using those insights while measuring impact. This cross-training equipped me with the skillset I’d need to conduct a range of efficacy studies and honed my ability to speak multiple academic dialects—a skill that became more important as I grew in my career.

Next, I set my sights on scale-up: first at Arizona State University, where we incorporated a theory of active learning into teacher practice; then by running a state-level evaluation study for an EdTech start-up company; and finally by supporting a networked improvement community with the Tennessee Department of Education. I learned firsthand how many layers we had to work through to bring the “active ingredients” into the learner experience. I also developed an appreciation for the multifaceted collaborations it takes to bring these efforts together.

In 2019, I joined Pearson’s Efficacy and Learning division, where we collaborated with product development teams, providing research-based insights to inform learning design and outcome measurement. We started with insights from the learning sciences and conducted iterative R&D with end-users from ideation, to prototypes and designs, to mature product evaluations. The research perspective kept our eye on conducting development work in a careful, measured, and learning outcomes-focused way. The development perspective kept us centered on researching applied and immediate problems and keeping practical significance at the fore. When done well, the balance of research and development hummed into harmony, and resulted in effective, enjoyable experiences that really worked.

Through my career I’ve learned that instead of asking how do we connect research to practice, the better question is how do we intertwine the research and development process? Not only should we be starting with research-based insights, but we should also be integrating research methods and development processes to build a high quality and useful solution from the start. That’s precisely what we’re working to achieve with the ATS Initiative.


This blog was written by Alex Resch and Katherine McEldoon, Accelerate, Transform, Scale Initiative, NCER.

Evidence on CTE: A Convening of Consequence

In 2018, NCER funded a research network to build the evidence base for career and technical education (CTE). As with other research networks, the CTE Research Network comprises a set of research teams and a lead team, which is responsible for collaboration and dissemination among the research teams. On March 20, 2024, the Network held a final convening to present its findings to the field. In this blog, Network Lead Member Tara Smith, Network Director Kathy Hughes, and NCER Program Officer Corinne Alfeld reflect on the success of the final convening and share thoughts about future directions.

Insights From the Convening

An audience of CTE Research Network members, joined by educators, administrators, policymakers, and other CTE stakeholders, gathered for a one-day convening to hear about the Network’s findings. Several aspects of the meeting contributed to its significance.

  • The presentations highlighted an important focus of the Network – making research accessible to and usable for practitioners. The agenda included presentations from four Network member researchers and their district or state partners from New York City and North Carolina. Each presentation highlighted positive impacts of CTE participation, but more importantly, they demonstrated the value of translating research findings into action. Translation involves collaboration between researchers and education agency staff to develop joint research questions and discuss the implications of findings for improving programs to better serve students or to take an innovative practice and scale it to other pathways and schools.
  • Brand-new and much-anticipated research was released at the convening. The Network lead announced a systematic review of all of the rigorous causal research on secondary-level CTE from the last 20 years. This is an exciting advancement for building the evidence base for CTE, which was the purpose of the Network. The meta-analysis found that CTE has statistically significant positive impacts on several high school outcomes, such as academic achievement, high school completion, employability skills, and college readiness. The review found no statistically significant negative impacts of CTE participation. The evidence points to the power of CTE to transform lives, although more research is needed. To guide future research, the review provided a “gap analysis” of where causal research is lacking, such as any impacts of high school CTE participation on academic achievement in college or attainment of a postsecondary degree.
  • National CTE leaders and experts put the research findings into a policy context and broadcast their importance. These speakers commented on the value of research for CTE advocacy on Capitol Hill, in states, and in informing decisions about how to target resources. Luke Rhine, the deputy assistant secretary of the Office of Career, Technical, and Adult Education (OCTAE), said, “The best policy is informed by practice [...] and the best practice is informed by research.” Kate Kreamer, the executive director of Advance CTE, emphasized the importance of research in dispelling myths, saying that “if the data are not there, that allows people to fill the gaps with their assumptions.” However, she noted, as research increasingly shows the effectiveness of CTE, we must also guard against CTE programs becoming selective, and thus limiting equitable access.

New Directions

In addition to filling the critical gaps identified by the Network lead’s review, other future research questions suggested by researchers, practitioners, and policymakers at the convening include:

  • How can we factor in the varied contexts of CTE programs and the wide range of experiences of CTE students to understand which components of CTE really matter? What does it look like when those are done well? What does it take to do them well? Where is it happening?
  • How can we learn more about why students decide to participate in CTE generally and in their chosen pathway? What are the key components of useful supports that schools can provide to help them make these decisions?
  • How do we engage employers more deeply and actively in CTE programs and implement high quality work-based learning to ensure that students are acquiring skills and credentials that are valued in the labor market?
  • What are evidence-based practices for supporting special student populations, such as students with disabilities, or English language learners?
  • How can we harness state longitudinal data systems that link education and employment data to examine the long-term labor market outcomes of individuals from various backgrounds who participated in different career clusters or who had access to multiple CTE experiences?

While IES alone will not be able to fund all the needed research, state agencies, school districts, and even individual CTE programs can partner with researchers to study what works in their context and identify where more innovation and investment is needed. The work of the CTE Research Network has provided a good evidence base with which to start, and a good model for additional research that improves practice and policy. Fortunately, the CTE research field will continue to grow via the support of a new CTE Research Network – stay tuned for more information!


This blog was co-written by CTE Network Lead Member Tara Smith of Jobs for the Future, CTE Network Director Kathy Hughes of AIR, and NCER Program Officer Corinne Alfeld.

Questions can be addressed to Corinne.Alfeld@ed.gov.

NCES Releases Updated 2022–23 Data Table on School District Structures

The National Center for Education Statistics (NCES) has released an updated data table (Excel) on local education agencies (LEAs)[1] that serve multiple counties. This new data table—which was updated with 2022–23 data—can help researchers examine LEA structures and break down enrollment by LEA and county. Read this blog post to learn more about the table and how it can be used to understand structural differences in school districts.

The data table—which compiles data from both the Common Core of Data (CCD) and Demographic and Geographic Estimates (EDGE)—provides county and student enrollment information on each LEA in the United States (i.e., in the 50 states and the District of Columbia) with a separate row for each county in which the agency has a school presence. The table includes all LEA types, such as regular school districts, independent charter school districts, supervisory union administrative centers, service agencies, state agencies, federal agencies, specialized public school districts, and other types of agencies.

An LEA’s presence within a county is determined by whether it has at least one operating school in that county, as identified in the CCD school-level membership file. For example, an LEA that is coterminous with a county has one record (row) in the listing. A charter school LEA that serves a region of a state and has a presence in five counties has five records. LEA administrative units, which do not operate schools, are listed in the county in which the agency is located.

In the 2022–23_LEA_List tab, column D shows the “multicnty” (i.e., multicounty) variable. LEAs are assigned one of the following codes:

1 = School district (LEA) is in a single county and has reported enrollment.

2 = School district (LEA) is in more than one county and has reported enrollment.

8 = School district (LEA) reports no schools and no enrollment, and the county reflects county location of the administrative unit. 

9 = School district (LEA) reports schools but no enrollment, and the county reflects county location of the schools.

In the Values tab, the “Distribution of local education agencies, by enrollment and school status: 2022–23” table shows the frequency of each of the codes (1, 2, 8, and 9) (i.e., the number of districts that are marked with each of the codes in the 2022–23_LEA_List tab):

  • 17,042 LEAs had schools in only one county.
  • 754 LEAs had schools located in more than one county and reported enrollment for these schools (note that in the file there are 1,936 records with this characteristic since each LEA is listed once for every county in which it has a presence).
  • 1,008 LEAs had no schools of their own and were assigned to a single county based on the location of the LEA address. (Typically, supervisory union administrative centers are examples of these LEAs.)
  • 262 LEAs had schools but did not report enrollment for these schools (note that in the file there are 384 records with this characteristic since each LEA is listed once for every county in which it has a presence).
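The one-record-per-LEA-per-county layout is why the record counts above can exceed the LEA counts. A miniature sketch of how a reader might tabulate the file (the field names and values here are hypothetical, not the file's actual column names):

```python
# Hypothetical miniature of the NCES multicounty LEA list: one record per
# LEA-county pair. Field names and values are illustrative only.
from collections import defaultdict

records = [
    {"leaid": "A1", "county": "Wake",     "multicnty": 1},
    {"leaid": "B2", "county": "Wake",     "multicnty": 2},
    {"leaid": "B2", "county": "Durham",   "multicnty": 2},
    {"leaid": "B2", "county": "Orange",   "multicnty": 2},
    {"leaid": "C3", "county": "Guilford", "multicnty": 8},
]

leas_by_code = defaultdict(set)     # distinct LEAs per code
records_by_code = defaultdict(int)  # raw record (row) counts per code
for r in records:
    leas_by_code[r["multicnty"]].add(r["leaid"])
    records_by_code[r["multicnty"]] += 1

# Code-2 LEAs appear once per county served, so their record count exceeds
# their LEA count -- the same reason 754 LEAs account for 1,936 records
# in the real file.
for code in sorted(leas_by_code):
    print(code, len(leas_by_code[code]), records_by_code[code])
```

Counting distinct `leaid` values, rather than rows, is the step that recovers the LEA-level frequencies shown in the Values tab.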

This data table is part of our effort to meet emerging data user needs and provide new products in a timely manner. Be sure to follow NCES on X, Facebook, LinkedIn, and YouTube and subscribe to the NCES News Flash to stay informed when these new products are released.

By Tom Snyder, AIR


[1] Find the official definition of an LEA.

[2] See Number and enrollment of public elementary and secondary schools, by school level, type, and charter, magnet, and virtual status: Selected years, 1990–91 through 2018–19; Enrollment of public elementary and secondary schools, by school level, type, and charter, magnet, and virtual status: School years 2010–11 through 2021–22; Number of public elementary and secondary education agencies, by type of agency and state or jurisdiction: 2004–05 and 2005–06; and Number of public elementary and secondary education agencies, by type of agency and state or jurisdiction: School years 2020–21 and 2021–22.

[3] See Education Governance for the Twenty-First Century: Overcoming Structural Barriers to School Reform.

[4] The annual School District Finance Survey (F-33) is collected by NCES from state education agencies and the District of Columbia. See Documentation for the NCES Common Core of Data School District Finance Survey (F-33) for more information.

 

IES Releases a New Public Access Plan for Publications and Data Sharing: What You Need to Know

In 2011, IES took a first step towards supporting what was then a burgeoning open science movement—publication and data sharing requirements for awardees. This growing movement found its first government-wide footing in 2013 with the release of a memo from the White House Office of Science and Technology Policy (OSTP) that provided guidance on the need for federally funded researchers to share publications and develop plans for sharing data.

Since that time, infrastructure and informational support for open science practices have continued to grow across federal funding agencies, and adherence to open science principles has evolved with them. In August 2022, OSTP released a new memo providing updated guidance on open science practices. The memo focused on equity, increasing public access to and discoverability of research, and establishing new data and metadata standards for shared materials.

In this blog post, Dr. Laura Namy, associate commissioner of the Teaching and Learning Division at NCER, and Erin Pollard, project officer for the Education Resources Information Center (ERIC) at NCEE, describe IES’s new Public Access Plan and address some important changes in requirements resulting from the new White House guidance for researchers receiving federal funding.

IES, in collaboration and consultation with other funding agencies, has been developing and implementing new policies and guidance to extend our commitment to open science principles. These new policies serve to support broader access among researchers, educators, and policymakers, as well as the general public whose tax dollars subsidize federally funded research. The resulting changes will certainly require some adjustments and some learning, and IES will be offering guidance and support as these requirements are implemented.

IES’s commitment to open science practices is already reflected in our Standards for Excellence in Education Research (SEER principles) and other expectations for awardees. These include—

  • Pre-registering studies
  • Uploading full text of published articles to ERIC
  • Submitting (and adhering to) a data management plan
  • Sharing published data
  • Including the cost of article processing charges (APCs) in project budgets to support publishing open access (OA)

The new policies reflect dual priorities: increasing both immediacy and equity of access. For current grant and contract awards, the requirements in place at the time that awards were made will still apply for the duration of those current awards. For each future award, Requests for Applications/Proposals (RFAs and RFPs), Grant Award Notices (GANs), and contracts will indicate the relevant public access/sharing requirements to identify which requirements are in place for the specific award.

Below are some important changes and what they mean for our IES-funded research community.

All publications stemming from federally funded work will have a zero-day public access embargo.

This means that an open access version must be available in ERIC immediately upon publication for all articles proceeding from federal research funding. The current 12-month grace period before articles become fully available will be gone. Although we’ve seen this change coming, publishers of journals that are not already open access will need to adapt to this new normal, as will universities and many researchers who do not already routinely publish OA.

What does this mean for IES-funded researchers? 

IES-funded researchers are already required to upload the full text of all articles to ERIC immediately after acceptance. Until now, ERIC released the full text within 12 months of publication. However, for all NEW grants awarded in fiscal year 2025 (as of October 1, 2024) and beyond, this zero-day public access embargo requirement will be in effect. Note that the relevant public access requirement depends on the year that the award was made, not the publication date of the article (for example, articles published in 2025 and beyond based on data collected through grants awarded before 2025 will still be under the 12-month embargo). IES awardees will need to ensure (either through their publisher or their own efforts) that a full-text version of the accepted manuscript or published article is uploaded to ERIC for release as soon as it is available online. To facilitate the transition, we encourage all awardees to publish their work in OA journals where feasible, and to budget for APCs accordingly. IES will provide additional guidance to support researchers in complying with this new requirement.

Data sharing will be required at time of publication, or if unpublished, after a certain time interval, whichever comes first.

This means that data curation and identification of an appropriate data repository will need to occur in advance of publication so that data can be shared immediately upon publication rather than as a follow-up activity. Although funding agencies will vary in their sharing timelines, IES anticipates requiring data to be shared at the time of publication or (for unpublished work) no later than 5 years after award termination.
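The "whichever comes first" rule amounts to a small date computation. A minimal sketch, with the caveat that the dates, the exact 5-year convention, and the final IES policy details are illustrative assumptions:

```python
# Sketch of the anticipated sharing rule: data are due at publication, or no
# later than 5 years after award termination, whichever comes first.
# The 5-year convention modeled here is an assumption, not official policy.
from datetime import date

def data_sharing_deadline(award_end, publication_date=None):
    """Return the earlier of the publication date and award_end + 5 years."""
    try:
        fallback = award_end.replace(year=award_end.year + 5)
    except ValueError:
        # award_end fell on Feb 29; fall back to Feb 28 in the target year
        fallback = award_end.replace(year=award_end.year + 5, day=28)
    if publication_date is None:
        return fallback
    return min(publication_date, fallback)

print(data_sharing_deadline(date(2027, 6, 30), date(2026, 3, 15)))  # publication comes first
print(data_sharing_deadline(date(2027, 6, 30)))                     # unpublished work
```

The practical takeaway is the same either way: curation has to be finished before the earlier of the two dates arrives.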

What does this mean for IES-funded researchers? 

All awardees who publish findings based on data collected under a new award made in fiscal year 2025 and beyond will need to release the reported data into a data repository at the time of publication. This calls for a change in data curation practices for many researchers who have focused on preparing their data for sharing post-publication. As noted above, any data that remain unshared 5 years post-award will need to be shared, even if publications are still pending. One best practice is to set up data file templates and curation plans in anticipation of sharing prior to data collection so that data are ready for sharing by the time data collection is complete (see Sharing Study Data: A Guide for Education Researchers). When multiple publications stem from the same data set, we recommend planning to share a single master data set to which additional data may be added as publications are released. Researchers should budget for data curation in their applications to support this activity.

Applications for IES funding have shifted from including a data management plan (DMP) to a data sharing and management plan (DSMP) to foreground the shift in emphasis to routine data sharing. Specific plans for sharing data, documentation, and analytic codes in particular repositories will need to be included. In anticipation of new requirements, we encourage researchers to move away from hosting data sets on personal websites or making them available solely upon request. DSMPs should identify an appropriate publicly available data repository. There is now guidance on Desirable Characteristics of Data Repositories for Federally Funded Research that should be followed whenever feasible. IES will be providing additional guidance on repository selection in the coming year. Principal investigators (PIs) and Co-PIs must be in compliance with data sharing requirements from previous IES awards in order to receive new awards from IES.

Unique digital persistent identifiers (PIDs) will need to be established for all key personnel, publications, awards, and data sets.

Digital object identifiers (DOIs) for journal articles are PIDs that uniquely identify a single version of a single publication and can be used to identify and reference that specific publication. This same concept is now being extended to other aspects of the research enterprise including individual researchers, grant and contract awards, and data sets. Unique PIDs for individuals facilitate tracking of individual scholars across name changes, institution changes, and career-stage changes. Having universal conventions across federal funding agencies for individuals, awards, and data sets in addition to publications will not only facilitate discoverability but will help to link data sets to publications, investigators to grants, grants to publications, etc. This will help both researchers and funders to connect the dots among the different components of your important research activities.
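As a concrete example of the structure a PID can carry, ORCID iDs end in a check character computed with the ISO 7064 MOD 11-2 algorithm, which lets software catch mistyped iDs before they enter a system. A minimal validator sketch:

```python
# Minimal validator for the ORCID iD check digit (ISO 7064 MOD 11-2).
# An ORCID iD is 16 characters (ignoring hyphens): 15 digits plus a final
# check character that is a digit or 'X'.

def orcid_check_digit(base15: str) -> str:
    """Compute the check character for the first 15 digits of an ORCID iD."""
    total = 0
    for ch in base15:
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid: str) -> bool:
    """Check length and check digit; assumes a plain hyphenated iD string."""
    digits = orcid.replace("-", "")
    if len(digits) != 16 or not digits[:15].isdigit():
        return False
    return orcid_check_digit(digits[:15]) == digits[15]

# The sample iD from ORCID's own documentation:
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```

DOIs, award PIDs, and data set PIDs each have their own conventions; the point is that a well-formed PID is machine-checkable and resolvable, not just a label.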

What does this mean for IES-funded researchers? 

All key personnel on new IES-funded projects are now required to establish an individual digital PID (such as ORCID) prior to award. DOIs will continue to be the PID assigned by publishers for publications. Authors reporting on IES-funded data should be vigilant about acknowledging their IES funding in all publications stemming from their IES grant awards. Coming soon, IES-funded researchers should be prepared for new digital PIDs (in addition to the IES-specific award numbers) associated with their grants to ensure consistency of PID conventions across funding agencies. New guidance for PID conventions for awards and data linked to IES-funding is forthcoming. 

The Bottom Line

These changes constitute an important step forward in increasing equitable access to and transparency about IES-funded research activities, and other federal funding agencies are making similar changes. The immediate changes at IES (establishing an individual PID and preparing a DSMP) are not onerous, and the bigger changes still to come (immediate sharing of publications and supporting data, using PIDs to refer to awards and data sets) will be rolled out with guidance and support. 

Please don’t hesitate to reach out to us with questions or concerns at Laura.Namy@ed.gov or Erin.Pollard@ed.gov. To learn more, please view the presentation and discussion of Open Science at IES that took place at the 2023 PI Meeting.