IES Blog

Institute of Education Sciences

Public State and Local Education Job Openings, Hires, and Separations for January 2023

As the primary statistical agency of the U.S. Department of Education, the National Center for Education Statistics (NCES) is mandated to report complete statistics on the condition of American education. While the condition of an education system is often assessed through indicators of achievement and attainment, NCES is also mandated to report on the conditions of the education workplace.

As such, NCES has reported timely information from schools. For example, this past December, NCES released data indicating that public schools have experienced difficulty filling positions throughout the COVID-19 pandemic.1 To understand the broader labor situation, NCES is using the Job Openings and Labor Turnover Survey to describe the tightness of the job market.

JOLTS Design

The Job Openings and Labor Turnover Survey (JOLTS), conducted by the U.S. Bureau of Labor Statistics (BLS), provides monthly estimates of job openings, hires, and total separations. The purpose of JOLTS data is to serve as demand-side indicators of labor shortages at the national level.2

The JOLTS program reports labor demand and turnover estimates by industry, including education.3 As such, this analysis focuses on the public state and local education industry (“state and local government education” as referred to by JOLTS),4 which includes all persons employed by public elementary and secondary school systems and postsecondary institutions.

The JOLTS program does not produce estimates by Standard Occupational Classification.5 When reviewing these findings, please note that occupations6 within the public state and local education industry vary7 (e.g., teachers and instructional aides, administrators, cafeteria workers, transportation workers). Furthermore, because the JOLTS data are tabulated at the industry level, the estimates are inclusive of the elementary, secondary, and postsecondary education levels.

Analysis

In this blog post, we present selected estimates on the number and rate of job openings, hires, and total separations (quits, layoffs and discharges, and other separations). The job openings rate is computed by dividing the number of job openings by the sum of employment and job openings. All other metric rates (hires, total separations, quits, layoffs and discharges, and other separations) are defined by taking the number of each metric and dividing it by employment. Fill rate is defined as the ratio of the number of hires to the number of job openings, and the churn rate is defined as the sum of the rate of hires and the rate of total separations.8
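The rate definitions above can be sketched in a few lines of code. This is an illustrative computation only: the openings, hires, and separations figures match the January 2023 estimates discussed below, but the employment level is a hypothetical placeholder, not a published JOLTS figure.

```python
# Illustrative sketch of the JOLTS rate definitions, in thousands.
employment = 10_700   # hypothetical employment level (placeholder, not a BLS figure)
openings = 303        # job openings on the last business day of the month
hires = 218           # hires during the month
separations = 127     # total separations during the month

# Job openings rate: openings divided by the sum of employment and openings
openings_rate = openings / (employment + openings)

# All other metric rates divide the metric by employment
hires_rate = hires / employment
separations_rate = separations / employment

# Fill rate: ratio of hires to openings; churn rate: hires rate plus separations rate
fill_rate = hires / openings
churn_rate = hires_rate + separations_rate

print(f"openings rate: {openings_rate:.1%}")
print(f"fill rate:     {fill_rate:.2f}")
print(f"churn rate:    {churn_rate:.1%}")
```

With these inputs the fill rate is below 1 (about 0.72), which is the pattern the analysis below interprets as openings not being filled completely.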


Table 1. Number of job openings, hires, and separations and net change in employment in public state and local education, in thousands: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
1 Net employment changes are calculated by taking the difference between the number of hires and the number of separations. When the number of hires exceeds the number of separations, employment rises—even if the number of hires is steady or declining. Conversely, when the number of hires is less than the number of separations, employment declines—even if the number of hires is steady or rising.
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Table 2. Rate of job openings, hires, and separations in public state and local education and fill and churn rates: January 2020 through January 2023

*Significantly different from January 2023 (p < .05).
NOTE: Data are not seasonally adjusted. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Labor, Bureau of Labor Statistics, Job Openings and Labor Turnover Survey (JOLTS), 2020–2023, based on data downloaded April 5, 2023, from https://data.bls.gov/cgi-bin/dsrv?jt.


Overview of January 2023 Estimates

The number of job openings in public state and local education was 303,000 on the last business day of January 2023, which was higher than in January 2020 (239,000) (table 1). In percentage terms, 2.8 percent of jobs had openings in January 2023, which was higher than in January 2020 (2.2 percent) (table 2). The number of hires in public state and local education was 218,000 in January 2023, which was higher than in January 2020 (177,000) (table 1). This suggests that demand for public state and local education employees was greater in January 2023 than before the pandemic (January 2020) and that more people were hired in January 2023 than before the pandemic. The number of job openings at the end of January 2023 (303,000) was nearly 1.4 times the number of staff hired that month (218,000). In addition, the fill rate for that month was less than 1, which suggests a need for public state and local government education employees that was not being filled completely as of January 2023.

The number of total separations in the state and local government education industry in January 2023 was not measurably different from the number of separations observed in January 2020 or January 2022. However, there was a higher number of total separations in January 2023 (127,000) than in January 2021 (57,000), which was nearly a year into the pandemic. In January 2023, the number of quits (76,000) was higher than the number of layoffs and discharges (36,000). Layoffs and discharges accounted for 28 percent of total separations in January 2023 (which was not measurably different from the corresponding percentage in January 2021), while quits accounted for 60 percent of total separations (also not measurably different from January 2021). These data suggest that the distribution of reasons for separations within the state and local government education industry was similar in January 2021 and January 2023.

 

By Josue DeLaRosa, NCES

 


[1] U.S. Department of Education, National Center for Education Statistics. Forty-Five Percent of Public Schools Operating Without a Full Teaching Staff in October, New NCES Data Show. Retrieved March 28, 2023, from https://nces.ed.gov/whatsnew/press_releases/12_6_2022.asp.
 

[2] U.S. Bureau of Labor Statistics. Job Openings and Labor Turnover Survey. Retrieved March 28, 2023, from https://www.bls.gov/jlt/jltover.htm.

[3] For more information about these estimates, see https://www.bls.gov/news.release/jolts.tn.htm.

[4] JOLTS refers to this industry as state and local government education, which is designated as ID 92.

[5] For more information on the reliability of JOLTS estimates, see https://www.bls.gov/jlt/jltreliability.htm.

[6] North American Industry Classification System (NAICS) is a system for classifying establishments (individual business locations) by type of economic activity. The Standard Occupational Classification (SOC) classifies all occupations for which work is performed for pay or profit. To learn more on the differences between NAICS and SOC, see https://www.census.gov/topics/employment/industry-occupation/about/faq.html.

[7] JOLTS data are establishment based, and there is no distinction between occupations within an industry. If a teacher and a school nurse were hired by an establishment coded as state and local government education, both would fall under that industry. (From email communication with JOLTS staff, April 7, 2023.)

[8] Skopovi, S., Calhoun, P., and Akinyooye, L. Job Openings and Labor Turnover Trends for States in 2020. Beyond the Numbers: Employment & Unemployment, 10(14). Retrieved March 28, 2023, from https://www.bls.gov/opub/btn/volume-10/jolts-2020-state-estimates.htm.

Bilingüe, Educación y Éxito: Learning from Dual Language Education Programs

April is National Bilingual/Multilingual Learner Advocacy Month! As part of the IES 20th Anniversary celebration, we are highlighting NCER’s investments in field-initiated research. In this guest blog, Drs. Doré LaForett and Ximena Franco-Jenkins (University of North Carolina at Chapel Hill) and Adam Winsler (George Mason University) discuss their IES-funded exploration study, some challenges they encountered due to the COVID-19 pandemic, and how their study contributes to supporting multilingual students.

The BEE Project

Our IES-funded study, called the Bilingualism, Education, and Excellence (BEE) project, was born out of a research partnership initiated by a principal of a Spanish-English dual language education (DLE) elementary school. She noticed that student engagement in DLE classrooms seemed to differ depending on the student’s home language and the language of instruction. This got us thinking about how little we as a field know about what goes on in two-way immersion (TWI) classrooms in terms of teacher language use, student-teacher relationships, student engagement, and learning outcomes for students who speak Spanish or English at home. Therefore, we were excited for the opportunity to dig deeper into links between language of instruction and academic outcomes for students in a relatively new immigrant community like North Carolina. Specifically, we were interested in whether and how the amount of instruction in English and Spanish is related to improvements in student academic outcomes in English and Spanish.

We conducted extensive individual direct student assessments at the beginning and end of the school year, as well as intensive classroom observations to assess both language of instruction and student on-task engagement during both English and Spanish instruction. Although we are still analyzing the data, preliminary findings suggest that language model (90% Spanish/10% English vs. 50% Spanish/50% English), type of 50/50 model used (switching language of instruction mid-day vs alternating days), and initial student language proficiency all matter for student engagement and academic outcomes assessed in English and Spanish. For some outcomes, students with low language proficiency had lower average spring scores when in the 50/50 model compared with students in the 90/10 model. In contrast, students with high language proficiency had higher average spring scores when in the 50/50 model compared with the 90/10 model. In addition, students who speak mostly English at home have a hard time staying engaged on the Spanish day in 50/50 alternate programs.

Impact of COVID-19 on Our Research and Pivots Made

Although we are excited about these findings, like many other studies, we encountered challenges with conducting our study when the pandemic hit. While some studies may have been able to pivot and resume data collection using a remote platform, we had to pause data collection activities during spring 2020 and the 2020-21 school year given our study design and the context in which our research was being conducted. For instance, we used gold-standard, English/Spanish, parallel direct assessments of children, which required in-person administration because online versions were not available. Also, classroom- and student-level observations were not possible when instruction was remote: cameras were often turned off, some students lacked access to remote or hybrid learning platforms, and available contactless video recording technologies prioritize the speech of a single individual in the classroom rather than the entire class and do not allow for focused observation of individual student behavior.

Therefore, our top priority was maintaining our partnerships with the school districts during the ‘sleeper year.’ We kept in touch and followed our partners’ lead as to when and how we could resume. Meanwhile, we tried to understand what school districts were doing for DLE instruction (in-person, hybrid, remote) during the pandemic. The research team found it necessary to shift tasks during the pandemic, and our efforts were centered on data management and dissemination activities. Once schools started to reopen in 2021-22, our team continued to be patient and flexible to address the health and visitor regulations of the various school districts. In the end, we had one year of data pre-pandemic, one pandemic year without spring data, and one year of data post-pandemic.

Despite these challenges, we used this opportunity to gather information about the learning experiences of students enrolled in the final year of our study, who had been exposed to remote or hybrid learning during the 2020-21 school year. So, when schools reopened in fall 2021, we asked our schools about what instruction was like during the pandemic, and we also asked teachers and parents what they thought about dual language progress during the 2020-21 school year. Teachers were more likely to report that students made good gains in their language skills over that year compared to parents. Further, parents who reported greater English-speaking learning opportunities during remote instruction tended to speak primarily English at home and have more education. Parents who reported that their child had difficulties participating in remote instruction due to technology tended to speak more Spanish at home and have less education.

These findings show how inequities in the home environment, such as those experienced during the pandemic, may have reduced learning opportunities for some students in DLE programs. This is particularly noteworthy because the social experience of language learning is critical in DLE programs, so reduced opportunities to speak in English and Spanish, particularly for students who are not yet fully bilingual or do not live in bilingual homes, can really undermine the goals of DLE programs. These reduced learning opportunities also give us pause as we consider how best to test for cohort effects, choose appropriate procedures for dealing with the missing data, and proceed cautiously with generalizing findings.

A Focus on Diversity, Equity, and Inclusion

Our research is grounded in cultural mismatch theory, under which DLE programs are hypothesized to produce greater alignment, or match, with English learners’ (ELs’) home environments compared to non-DLE programs. By design, DLE programs that support heritage languages seek to promote bilingualism, bi-literacy, and biculturalism, which bolster ELs’ social capital, increase academic performance, and reduce the achievement gap for ELs. Thus, effective DLE programs are examples of anti-racist policies and practices. However, some have suggested that DLE programs may be conferring more benefits for White, native English speakers (that is, the Matthew effect, where the rich get richer) compared to the students whose heritage language and culture is being elevated in DLE programs. This is especially concerning given our data showing a potential exacerbation of the Matthew effect during the pandemic due to a variety of factors (lack of access to technology, less-educated families struggling to support their children during remote instruction), suggesting not only learning loss but also language loss. Our research is attempting to open the black box of DLE programs in such classrooms and examine whether experiences, engagement, and outcomes are similar across language backgrounds. We hope that information from our study about the intersection of language proficiency and language of instruction will facilitate decisions regarding how students are assigned to different language models and ultimately support equitable learning opportunities for students attending DLE programs.


Ximena Franco-Jenkins is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

Adam Winsler is an Associate Chair Professor at George Mason University.

Doré R. LaForett is an Advanced Research Scientist at the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill.

This blog was produced by Helyn Kim (Helyn.Kim@ed.gov), Program Officer for the English Learners Portfolio, NCER.

 

Pennsylvania Student Proficiency Rates Rebound Partially from COVID-19-Related Declines

Given the magnitude of the disruption the COVID-19 pandemic caused to education practices, there has been considerable interest in understanding how the pandemic may have affected student proficiency. In this guest blog, Stephen Lipscomb, Duncan Chaplin, Alma Vigil, and Hena Matthias of Mathematica discuss their IES-funded grant project, in partnership with the Pennsylvania Department of Education (PDE), that is looking at the pandemic’s impacts in Pennsylvania.  

The onset of the COVID-19 pandemic in spring 2020 brought on a host of changes to K–12 education and instruction in Pennsylvania. Many local education agencies (LEAs) instituted remote learning and hybrid schedules as their primary mode of educating students, while others maintained in-person learning. Statewide assessments, which were suspended in spring 2020, resumed in 2021 with low participation rates, particularly among students with lower performance before the pandemic. Furthermore, test administration dates varied from spring 2021 to fall 2021. Pennsylvania statewide assessment data reveal that student proficiency rates may have rebounded in 2022, despite remaining below pre-pandemic levels. In grades 5–8, there was a marked increase in proficiency in English language arts (ELA) and a slightly smaller increase in proficiency in math compared to 2021 proficiency rates predicted in recent research. Despite these gains, increasing student proficiency rates to pre-pandemic levels will require additional efforts.

The Pennsylvania Department of Education (PDE) has been committed to providing LEAs with the resources and support necessary to help students achieve pre-pandemic academic proficiency rates. To learn more about how changes in those rates may have been associated with the pandemic, PDE and Mathematica partnered to explore trends in student proficiency data for students in grades 5–8. Given the lower and nonrepresentative participation in the 2021 statewide assessments, as well as the differences in when LEAs administered the assessments, we developed a predictive model of statewide proficiency rates for spring 2021 to produce predicted proficiency rates that would be more comparable to previous and future years. The results revealed that steep declines in proficiency likely occurred between 2019 and 2021 (see Figure 1 below). By spring 2022, proficiency rates in grades 5–8 regained 6 percentage points of their 10 percentage point drop in ELA and nearly 5 percentage points of their 13 percentage point drop in math. Taken together, these results suggest that although the pandemic may have originally been associated with declines in students’ academic proficiency, over time, student proficiency might move back towards pre-pandemic levels.

 

Figure 1. Actual and predicted proficiency rates in grades 5–8 in Pennsylvania, 2015–2022

The figure shows actual proficiency rates from the Pennsylvania System of School Assessment, averaged across grades 5–8, unless marked by either an open or closed circle. Notes: An open circle indicates the statewide assessment was cancelled; a closed circle indicates a predicted proficiency rate.

Source: Data from 2015–2019 and 2022 are from the Pennsylvania Department of Education. The 2021 data are predicted proficiency rates from Lipscomb et al. (2022a). The figure originally appeared in Lipscomb et al. (2022b).  

 

The next steps for this project will include a strong focus on dissemination of our findings. For example, we will develop a research brief that describes the role of remote learning in shaping academic outcomes beyond proficiency rates, as well as community health outcomes, during the pandemic. The findings will help PDE and LEAs refine strategies for supporting vulnerable students and help state policymakers and educators learn from the COVID-19 pandemic—specifically how it might have affected student outcomes and educational inequities.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner for Policy and Systems Division, NCER.

Measuring In-Person Learning During the Pandemic

Some of the most consequential COVID-19-related decisions for public education were those that modified how much in-person learning students received during the 2020-2021 school year. As part of an IES-funded research project in collaboration with the Virginia Department of Education (VDOE) on COVID’s impact on public education in Virginia, researchers at the University of Virginia (UVA) collected data to determine how much in-person learning students in each grade in each division (what Virginia calls its school districts) were offered over the year. In this guest blog, Erica Sachs, an IES predoctoral fellow at UVA, shares brief insights into this work.

Our Process

COVID-19 has caused uncertainty and disruptions in public education for nearly three years. The purpose of the IES-funded study is to describe how Virginia’s response to COVID-19 may have influenced access to instructional opportunities and equity in student outcomes over multiple time periods. This project is a key source of information for the VDOE and Virginia schools’ recovery efforts. An important first step of this work was to uncover how the decisions divisions made impacted student experiences during the 2020-21 school year. This blog focuses on the processes that were undertaken to identify how much in-person learning students could access.

During 2020-21, students were offered school in three learning modalities: fully remote (no in-person learning), fully in-person (only in-person learning), and hybrid (all students could access some in-person learning). Hybrid learning often occurred when schools split a grade into groups and assigned attendance days to each group. For the purposes of the project, we used the term “attendance rotations” to identify whether and which student group(s) could access in-person school on each day of the week. Each attendance rotation is associated with a learning modality.

Most divisions posted information about learning modality and attendance rotations on their official websites, social media, or board meeting documents. In June and July of 2021, our team painstakingly scoured these sites and collected detailed data on the learning modality and attendance rotations of every grade in every division on every day of the school year. We used these data to create a division-by-grade-by-day dataset.

A More Precise Measure of In-Person Learning

An initial examination of the dataset revealed that the commonly used approach of characterizing student experiences by time in each modality masked potentially important variations in the amount of in-person learning accessible in the hybrid modality. For instance, a division could offer one or four days of in-person learning per week, and both would be considered hybrid. To supplement the modality approach, we created a more precise measure of in-person learning using the existing data on attendance rotations. The new variable counts all in-person learning opportunities across the hybrid and fully in-person modalities, and, therefore, captures the variation obscured in the modality-only approach. To illustrate, when looking only at the time in each modality, just 6.7% of the average student’s school year was in the fully in-person modality. However, using the attendance rotations data revealed that the average student had access to in-person learning for one-third of their school year.
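The difference between the two measures can be sketched with a toy division-by-grade-by-day dataset. The record structure and field names below are assumed for illustration; they are not the study's actual schema.

```python
# Hypothetical sketch of the two measures described above.
# Each record is one division-grade-day; "group_in_person" marks whether a
# given student's group could attend in person that day (field names assumed).
records = [
    {"division": "A", "grade": 5, "modality": "hybrid",    "group_in_person": True},
    {"division": "A", "grade": 5, "modality": "hybrid",    "group_in_person": False},
    {"division": "A", "grade": 5, "modality": "in_person", "group_in_person": True},
    {"division": "A", "grade": 5, "modality": "remote",    "group_in_person": False},
]

# Modality-only view: share of days spent in the fully in-person modality
fully_in_person_share = sum(r["modality"] == "in_person" for r in records) / len(records)

# Attendance-rotation view: share of days the student's group could attend in
# person, which also counts hybrid days on which that group was scheduled
in_person_access_share = sum(r["group_in_person"] for r in records) / len(records)

print(fully_in_person_share)   # 0.25
print(in_person_access_share)  # 0.5
```

In this toy example the modality-only measure reports 25% fully in-person, while the rotation-based measure shows the student could access in-person learning on 50% of days, mirroring the 6.7% versus one-third gap reported above.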

Lessons Learned

One of the biggest lessons I learned working on this project was that we drastically underestimated the scope of the data collection and data management undertaking. I hope that sharing some of the lessons I learned will help others doing similar work.

  • Clearly define terminology and keep records of all decisions with examples in a shared file. It will help prevent confusion and resolve disagreements within the team or with partners. Research on COVID-19 in education was relatively new when we started this work. We encountered two terminology-related issues. First, sources used the same term for different concepts, and second, sources used different terms for the same concept. For instance, the VDOE’s definition of the “in-person modality” required four or more days of access to in-person learning weekly, but our team classified four days of access as hybrid because we define “fully in-person modality” as five days of access to in-person learning weekly. Without agreed-upon definitions, people could categorize the same school week under different modalities. Repeated confusion in discussions necessitated a long meeting to hash out definitions, examples, and non-examples of each term and compile them in an organized file.
  • Retroactively collecting data from documents can be difficult if divisions have removed information from their web pages. We found several sources especially helpful in our data collection, including the Wayback Machine, a digital archive of the internet, to access archived division web pages, school board records, including the agenda, meeting minutes, or presentation materials, and announcements or letters to families via divisions’ Facebook or Twitter accounts.
  • To precisely estimate in-person learning across the year, collect data at the division-by-grade-by-day level. Divisions sometimes changed attendance rotations midweek, and the timing of these changes often differed across grades. Consequently, we found that collecting data at the day level was critical to capture all rotation changes and accurately estimate the amount of in-person learning divisions offered students.

What’s Next?

The research brief summarizing our findings can be downloaded from the EdPolicyWorks website. Our team is currently using the in-person learning data as a key measure of division operations during the reopening year to explore how division operations may have varied depending on division characteristics, such as access to high-speed broadband. Additionally, we will leverage the in-person learning metric to examine COVID’s impact on student and teacher outcomes and assess whether trends differed by the amount of in-person learning divisions offered students.


Erica N. Sachs is an MPP/PhD Student, IES Pre-doctoral Fellow, & Graduate Research Assistant at UVA’s EdPolicyWorks.

This blog was produced by Helyn Kim (Helyn.Kim@ed.gov), Program Officer, NCER.

NCER’s Investments in Education Research Networks to Accelerate Pandemic Recovery Network Lead Spotlight: Dr. Susan Therriault, RESTART Network

We hope you enjoyed yesterday’s network lead spotlight! Today, we would like to introduce Dr. Susan Therriault, director, K–12 Systemic Improvement Portfolio at the American Institutes for Research. Dr. Therriault’s network, the PreK-12 Research on Education Strategies to Advance Recovery and Turnaround (RESTART) Network, aims to coordinate activities across research teams and provide national leadership on learning acceleration and recovery from pandemic-induced learning loss, sharing findings from the network with education agencies across the United States. Happy reading!

 

NCER: What are the mission and goals of the PreK–12 RESTART (Research on Education Strategies to Advance Recovery and Turnaround) Network? 

Dr. Therriault: The PreK–12 RESTART Network is an opportunity to develop a coherent and connected research community that speaks directly to the needs of policymakers, leaders, and practitioners. The network focuses on identifying and disseminating evidence-based strategies aligned with the needs of policymakers, leaders, and educators who are serving and supporting accelerated student recovery efforts. This requires the network to identify critical needs of the field and support the research community in developing coherent and coordinated research strategies that build evidence for practices that ensure student recovery—especially among students who have disproportionately struggled in the pandemic context. The PreK–12 RESTART Network will achieve this by need sensing, synthesizing evidence, and building a community that makes meaningful connections between the research community and policymakers, leaders, and educators.

NCER: Why is the PreK–12 RESTART Network important to you? 

Dr. Therriault: The PreK–12 RESTART Network is important to me because, as a researcher, I have watched how the pandemic and subsequent aftershocks of the pandemic have created disruptions to our lives and our public pre-K to 12 education system. The pandemic created fragmentation and division as communities responded and supported individuals in a context marked by social distance and separation. While there are many common challenges across communities, limited social connection affected our ability to share evidence-based solutions and to equitably address the needs of all members of our communities, especially those communities most adversely impacted by COVID-19, including Black and Latinx communities and those marked by poverty and housing and food insecurity.

NCER: How do you think the PreK–12 RESTART Network will impact the pre-K to 12 community?

Dr. Therriault: The PreK–12 RESTART Network is an opportunity to develop a coherent and connected research community that speaks directly to the needs of policymakers, leaders, and practitioners. The key differentiator of the network is that it is purposefully designed to assess needs and engage the research community in providing insight and building evidence for solutions to address those needs.

The network has an important role to play in drawing researchers together to develop measurement solutions and build consensus for approaches to conducting and making meaning of research so that it informs the field. These solutions will be shared with the field to create a more coherent research agenda informed by needs.

The network will ensure that policymakers, leaders, and educators are able to easily access network evidence syntheses and research-team findings through multiple communication formats. Information will be shared through actionable guidance and recommendations purposefully designed for these audiences. A combination of strategies will ensure accessibility, including digital tools and dashboards that school leaders can easily adapt to their circumstances, as well as access to evidence-based strategies through recorded webinars, tutorials, videos, and other learning formats.

NCER: What’s one thing you wish more people knew about recovery in pre-K to 12 education? 

Dr. Therriault: The magnitude of the challenge of pandemic recovery in the pre-K to 12 education system is vast and will require new ways of approaching education and support for students. In turn, with the investment of American Rescue Plan funds in schools, this will likely lead to evidence-based innovation and deeper understanding of how to design an education system, district, and school to meet student needs.

The needs of students are varied and highly connected to their experiences during and after the pandemic; thus, family and community factors are highly relevant and critical to understanding student needs and strategies to address those needs. Recently released NAEP scores provide evidence of the variation and suggest more significant losses in mathematics and English language arts for students living in low-income households compared to their peers who are not. Further, students living in low-income households were more likely to report not having a place to do work, access to a computer, or adequate uninterrupted time compared to their peers. These are critical factors in a remote and even a hybrid learning environment. These differences in experience exacerbated differences in outcomes during the pandemic.

NCER: What are some of the biggest challenges to recovery in pre-K to 12 education? 

Dr. Therriault: The amount of need among students and their families and the fatigued pre-K to 12 education system workforce are the biggest challenges to recovery. Adding to this challenge is the focus on expanding learning time through summer school or longer school days in an effort to accelerate learning. This requires teachers and leaders at a time when, like many of us, they are experiencing burnout.

Finally, we know that most students suffered learning loss, among other harms, during the pandemic. Supporting students emotionally as well as academically is necessary for recovery. While many schools provided emotional support to students prior to the pandemic, the current need is far greater than schools have experienced. This will require extensive outreach to community support and services and additional interventions.

NCER: What are some effective ways to translate education research into practice so that your work will have a direct impact on states, districts, and schools? 

Dr. Therriault: There are several ways we plan to support the translation of research to practice through the PreK–12 RESTART Network. These include:

  • Understanding and sharing needs of the field.
    • Conducting needs assessments of the field to examine the evolving needs and share these with the research community.
  • Identifying, sharing, and amplifying evidence-based strategies that respond to the needs of the field.
    • Exploring existing and emerging research to identify evidence-based strategies and interventions that align evidence syntheses with the needs of the field.
    • Providing actionable guidance and recommendations that can be easily implemented across different schools and are customized based on need.
  • Creating digital tools that school leaders and educators can use to adapt to their circumstances.
  • Building a coherent and coordinated research community focused on pandemic recovery research.
    • Connecting and building consensus among researchers through convenings and solutions working groups that address challenges to conducting research in the pandemic context.
    • Empowering research teams to build studies that align with needs in the field through meaningful connections with policymakers, leaders, educators, and other members of the research community.
    • Supporting engagement and preparation of early-career researchers through trainings and networking opportunities.

NCER: What are some barriers to the uptake of the research outcomes by these organizations?

Dr. Therriault: One of the critical barriers to uptake is timing. States, districts, and schools cannot wait for findings and results—they must act to support the students they have in their classrooms right now. The syntheses and reviews of prior research will help point educators in the direction of interventions and other supports that have a strong evidence base. The researchers participating in the RESTART Network will be supported in rapidly sharing and disseminating findings over the course of their studies to inform decision-making about how best to help students’ academic recovery.


Thank you for reading our conversation with Dr. Susan Therriault! We hope you’ve enjoyed getting to know NCER’s network leads throughout our grantee spotlight series. Let us know your thoughts on the series on Twitter at @IESResearch.