Inside IES Research

Notes from NCER & NCSER

Measuring the Homelife of Students During the Pandemic

As part of the IES-funded Prekindergarten Through Grade 12 Recovery Research Network, the Georgia Policy Labs has been working to gauge the effects of economic insecurity and health stressors on student engagement and achievement during and after COVID-19-era remote learning. In this guest blog, Dr. Tim Sass, principal investigator of the IES-funded project and Distinguished Professor at Georgia State University, discusses a challenge his research team faced in linking students' home environments to their academic success: the difficulty of collecting home-background survey data from parents and legal guardians. Dr. Sass shares his team's practical solution to this challenge.

The Challenge: Difficulty Collecting Data about Students' Home Situations

A major challenge to studying out-of-school factors that contribute to student academic success is the lack of information about a student's home situation. Standard administrative data barely scratch the surface, providing information on language spoken at home, eligibility for subsidized meals (an admittedly crude measure of poverty), and little else. This lack of information became even more problematic during the COVID-19 pandemic, when students were learning from home and the pandemic was having severe effects on many families' health and economic well-being.

As part of our project, we focus on identifying the factors associated with student engagement and achievement growth in remote instruction. Our initial strategy was to gather survey data in three of our partner school districts in metro Atlanta. We created a questionnaire to measure student engagement as well as collect information on economic insecurity, health stressors, and protective factors like adult monitoring of student learning at home or hiring a tutor. However, we faced two major challenges that made it difficult for us to collect the information we needed.

The first challenge was creating a process for identifying respondents.  Our partners agreed to send out the survey on our behalf, using their established systems for communicating with parents/guardians. Our intent was to make the responses identifiable so we could directly link information gathered from the survey to outcomes for specific students. While one of our partner districts was not comfortable with identifying individual respondents, it agreed to identify respondents’ school of enrollment. A second district agreed to identify respondents, but due to miscommunication within the district, their survey team made the survey anonymous. Finally, the third district agreed to allow linking individual responses to student ID numbers but left it up to parents/guardians to identify their student in the survey, and only about half of respondents filled in their student’s ID number.   

The second challenge was the very low response rates: 192 respondents from District A (0.4% response rate), 1,171 respondents from District B (1.2% response rate), and 80 respondents from District C (0.1% response rate). While disappointing, the low response rates are not unique to our study. Other researchers have struggled to get parents/guardians to respond to surveys conducted during the pandemic or shortly after the resumption of in-person instruction.

The Solution: Using Non-School-Based Administrative Data from Multiple Agencies to Complement Survey Data

Given the low response rates and non-identifiable responses, we considered how we could use additional non-school-based administrative data to complement the survey evidence. Through a partnership with researchers from the Atlanta Regional Commission, the Federal Reserve Bank of Atlanta, and the Georgia Institute of Technology, we obtained data from court records on evictions in the Atlanta Metro Area. The data cover four of the five “core” counties in the Atlanta Metro Area, spanning calendar years 2019-21. Over this period, there were approximately 300,000 unique eviction filings across the four-county area, including both households with and without school-aged children. The court records contain a property address and filing date, which were used to match yearly eviction filings to students based on district student address files.

The following table provides counts of students that we successfully linked to eviction filings by district and year.  “Experienced Eviction” refers to cases where we directly matched an individual dwelling unit to an eviction filing.  In many cases, however, the street address in the eviction filing is for a multi-family structure, and there is not enough information in the filing or in the student address files to directly match a student living in the complex with the unit in which the eviction filing occurred.  When this type of match occurs, we designate it as being “Exposed to Eviction.”  The “Exposed to Eviction” counts include the instances of “Experienced Eviction.” The “Eviction Rate” is the ratio of “Experienced Eviction” to the total number of students in the district.

 

| District   | Year | Experienced Eviction | Exposed to Eviction | Eviction Rate |
|------------|------|----------------------|---------------------|---------------|
| District A | 2019 | 2,608                | 12,555              | 0.043         |
| District A | 2020 | 3,412                | 13,408              | 0.057         |
| District A | 2021 | 2,251                | 10,789              | 0.041         |
| District B | 2019 | 3,249                | 22,520              | 0.030         |
| District B | 2020 | 4,467                | 25,503              | 0.042         |
| District B | 2021 | 2,929                | 19,514              | 0.029         |
| District C | 2019 | 32                   | 321                 | <0.001        |
| District C | 2020 | 1,246                | 10,520              | 0.013         |
| District C | 2021 | 2,323                | 9,842               | 0.024         |
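
To make the two match tiers concrete, below is a minimal sketch of the matching logic in Python. The record layout, field names, and normalization step are illustrative assumptions, not the research team's actual procedure for matching court records to district address files.

```python
# Illustrative sketch of the two-tier eviction match described above.
# Field names ("street", "unit") and the normalization are assumptions.

def normalize(addr):
    """Crude address normalization so string comparison is plausible."""
    return " ".join(addr.upper().split())

def classify_match(student, filings):
    """Classify one student record against a list of eviction filings.

    Returns "experienced" if a filing names the student's dwelling unit,
    "exposed" if a filing hits the same street address (e.g., another
    unit in the same multi-family complex) but the unit cannot be
    confirmed, and None if no filing matches. As in the table above,
    "exposed" counts would also include "experienced" matches.
    """
    same_building = [
        f for f in filings
        if normalize(f["street"]) == normalize(student["street"])
    ]
    if any(f.get("unit") and f["unit"] == student.get("unit")
           for f in same_building):
        return "experienced"
    if same_building:
        return "exposed"
    return None

# Example: a filing that names the student's specific unit.
student = {"street": "100 Main St", "unit": "2B"}
filings = [{"street": "100 main st", "unit": "2B", "year": 2020}]
print(classify_match(student, filings))  # -> "experienced"
```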

While an eviction filing does not mean that a family was removed from their dwelling, it does indicate that they were at risk of losing their home. Moving forward, we plan to use the eviction filing information as a measure of housing insecurity and economic stress. We will incorporate this metric into models of student achievement growth during the pandemic and of absenteeism and student behavior after students returned to in-person learning. This will give us a sense of the degree to which external factors affected student performance in remote learning, as well as the influence of housing and economic insecurity on achievement, engagement, and behavior once students returned to classrooms. Our findings will provide school districts and social-service providers with information on student exposure to economic stress, helping ensure that in-school supports and "wraparound" services are offered to the students who need them most.


This blog was produced by Haigen Huang (Haigen.Huang@ed.gov), program officer at NCER.

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools' recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs with a positive impact on students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to the research, members of the team at NCDPI developed a toolkit to help local leaders decide which programs to continue or discontinue as the federal funds that have supported post-pandemic learning recovery expire. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making based on unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment (ROI) Framework and Calculator that combines detailed data on ESSER expenditures with the impacts of those investments on student outcomes. By using this toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that keep effective programs running regardless of funding source.
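
While the toolkit's internal formulas are not reproduced in this post, the core idea of an ROI comparison can be sketched in a few lines. In the hypothetical Python fragment below, every program name, cost, and effect size is invented for illustration; the actual calculator may define and weight these quantities differently.

```python
# Hypothetical ROI-style comparison across ESSER-funded programs.
# All names, costs, and outcome gains are invented for illustration.

programs = [
    {"name": "High-dosage tutoring", "cost": 250_000, "effect": 0.12},
    {"name": "Summer learning",      "cost": 400_000, "effect": 0.05},
    {"name": "Device refresh",       "cost": 300_000, "effect": 0.02},
]

for p in programs:
    # Outcome gain (e.g., effect size on achievement) per $100,000 spent.
    p["roi"] = p["effect"] / (p["cost"] / 100_000)

# Rank programs so leaders can see which investments to sustain first.
for p in sorted(programs, key=lambda x: x["roi"], reverse=True):
    print(f"{p['name']}: {p['roi']:.3f} effect-size points per $100k")
```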

Knowing the far-reaching implications of this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. Additionally, they consulted leaders, including superintendents and chief financial officers of North Carolina school districts and charter schools, during the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure its validity.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR Team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the “ESSER puzzle” to allow them to work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic. 

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR Team organized a comprehensive rollout plan, which began with a series of introductory webinars that provided an overview of the toolkit and its components. These webinars were followed by nine in-person sessions held across the eight State Board of Education regions in North Carolina, which drew more than 400 leaders. Building on the initial learning from the informational webinars, the in-person sessions featured interactive presentations that let district teams practice using the tool with simulated data as well as their own. By the end of a session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding use of the toolkit and working with districts and charter schools to apply the ROI framework to evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how it applied the ROI framework to its afterschool tutoring programs. Other districts plan to use the ROI framework and funding cliff toolkit to guide conversations with principals who receive Title I funds in their schools and to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we hope that this supportive approach can be replicated in other state education agencies (SEAs) across the nation. We are honored to have our toolkit featured in the National Comprehensive Center's upcoming Community of Practice (CoP), Strategic Planning for Continued Recovery (SPCR), and believe that cross-SEA collaboration in this CoP will improve the usefulness of the toolkit.


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

IES is Investing in Research on Innovative Financial Aid Programs in Five States

State financial aid programs have the potential to substantially augment the support that students receive from the federal Pell Grant. Federal programs, most notably the Pell Grant program, have historically played the lead role in providing a solid foundation of financial support to students, with states playing the supporting role of providing additional aid to students who meet specific eligibility requirements. In recent years, states have moved to innovate their financial aid programs in ways that have the potential to increase total aid packages, meet a wider range of needs, and serve a broader population of students. The effects of these recent innovations are mostly unknown yet of great interest to state legislators and policymakers. To address this gap, IES is funding a set of five research projects that assess the scope and effects of innovative financial aid programs in California, Connecticut, Michigan, Tennessee, and Washington state. This blog describes how the five projects are contributing to the evidence base.

State financial aid program eligibility rules differ in ways that can substantially alter total aid awards, the scope of the population that can be served, and the ways in which students can use aid funds to meet their various needs while enrolled in college. For example, one key policy attribute that affects the total aid award is whether awards are calculated independently of the Pell Grant ("first-dollar" awards that add to the Pell award if state eligibility requirements are met) or as "last-dollar" awards that supplement Pell awards conditional upon eligibility and appropriate-use requirements. Policies that require recent high school graduation within the state tend to limit aid access for older and returning students. In addition, financial need requirements can limit or broaden the pool of eligible recipients, depending on family income thresholds. Policies that require completion of the federal FAFSA form without offering an alternative state application tend to close off access to aid for undocumented immigrants. Merit and high school GPA requirements can close off aid access to students who are otherwise ready for college. Moreover, appropriate-use requirements in some states limit aid usage to tuition and registration expenses, while other states allow aid usage for living expenses such as housing and transportation.
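
A stylized numeric example clarifies the first-dollar/last-dollar distinction. The dollar amounts in this Python sketch are invented; actual award rules vary by state.

```python
# Hypothetical illustration of first-dollar vs. last-dollar state aid.
# All dollar amounts are invented; real award rules vary by state.

PELL = 6_000        # federal Pell Grant award
COST = 10_000       # tuition and fees
STATE_MAX = 5_000   # maximum state award

# First-dollar: the state award is calculated independently of Pell and
# stacks on top of it (the excess can go toward living expenses where
# appropriate-use rules allow).
first_dollar_total = PELL + STATE_MAX            # 11,000

# Last-dollar: the state covers only the tuition gap Pell leaves behind.
gap = max(0, COST - PELL)                        # 4,000
last_dollar_total = PELL + min(gap, STATE_MAX)   # 10,000

print(first_dollar_total, last_dollar_total)
```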

Given these variations in program eligibility rules, state officials want to know if their aid programs are reaching targeted student groups, meeting their needs in ways that allow them to focus on their studies, and making a difference in their academic and subsequent labor market outcomes. In an effort to support decision making, IES is funding five projects that are each working closely with state officials to understand the features of their programs and conducting research to assess which students are accessing the programs, the extent of support provided by the programs, and their effects on enrollment in and progression through college. Below is the list of the IES-funded projects.

We are excited to fund these projects and look forward to the findings they will be sharing, starting in fall 2024.


This blog was written by James Benson (James.Benson@ed.gov), program officer in the Policy and Systems team at NCER.

Unlocking Opportunities: Understanding Connections Between Noncredit CTE Programs and Workforce Development in Virginia

With rapid technological advances, the U.S. labor market exhibits a growing need for more frequent and ongoing skill development. Community college noncredit career and technical education (CTE) programs that allow students to complete workforce training and earn credentials play an essential role in providing workers with the skills they need to compete for jobs in high-demand fields. Yet, there is a dearth of research on these programs because noncredit students are typically not included in state and national postsecondary datasets. In this guest blog for CTE Month, researchers Di Xu, Benjamin Castleman, and Betsy Tessler discuss their IES-funded exploration study in which they build on a long-standing research partnership with the Virginia Community College System and leverage a variety of data sources to investigate the Commonwealth’s FastForward programs. These programs are noncredit CTE programs designed to lead to an industry-recognized credential in one of several high-demand fields identified by the Virginia Workforce Board.

In response to the increasing demand for skilled workers in the Commonwealth, the Virginia General Assembly passed House Bill 66 in 2016 to establish the New Economy Workforce Credential Grant Program (WCG), with the goal of providing a pay-for-performance model for funding noncredit training. The WCG specifically funds FastForward programs that lead to an industry-recognized credential in a high-demand field in the Commonwealth. Under this model, funding is shared among the state, students, and training institutions based on student performance, with the goal of ensuring that workforce training is affordable for Virginia residents. An important implication of the WCG is that it led to the systematic, statewide collection of student-level data on FastForward program enrollment, program completion, industry credential attainment, and labor market performance. Drawing on these unique data, coupled with interviews with key stakeholders, we generated findings on the characteristics of FastForward programs as well as the academic and labor market outcomes of students enrolled in them. We describe our preliminary descriptive findings below.

FastForward programs enroll a substantially different segment of the population from credit-bearing programs and offer a vital alternative route to skill development and workforce opportunities, especially for demographic groups often underrepresented in traditional higher education. FastForward programs in Virginia enroll a substantially higher share of Black students, male students, and older students than short-duration, credit-bearing programs at community colleges that typically require one year or less to complete. Focus groups conducted with FastForward students at six colleges indicate that the students were a mix of workers sent by their employers to learn specific new skills and students who signed up for a FastForward program on their own. Among the latter group were older career changers and recent high school graduates, many of whom had no prior college experience and were primarily interested in landing their first job in their chosen field. Moreover, 61% of FastForward participants have neither prior nor subsequent enrollment in credit-bearing programs, highlighting the program’s unique role in broadening access to postsecondary education and career pathways.

FastForward programs offer an alternative path for students who are unsuccessful in credit-bearing programs. The vast majority of students (78%) enrolled in only one FastForward program, with an average enrollment duration of 1.5 quarters, notably shorter than most traditional credit-bearing programs. While 36% of FastForward students have prior credit-bearing enrollment, fewer than 20% of these students earned a degree or certificate from that enrollment, and fewer than 12% of FastForward enrollees transitioned to credit-bearing training afterward. Interviews with administrators and staff indicated that while some colleges facilitate noncredit-to-credit pathways by granting credit for prior learning, others prioritize employment-focused training and support over stackable academic pathways because students are primarily interested in seeking employment after training.

FastForward programs have a remarkable completion rate and are related to high industry credential attainment rates. Over 90% of students complete their program, with two-thirds of students obtaining industry credentials. Student focus groups echoed this success. They praised the FastForward program and colleges for addressing both their tuition and non-tuition needs. Many students noted that they had not envisioned themselves as college students and credited program staff, financial aid, and institutional support with helping them to be successful.

Earning an industry credential through FastForward increases quarterly earnings by approximately $1,000 on average. Industry credentials also increase the probability of being employed by 2.4 percentage points on average. We find substantial heterogeneity in economic returns across fields of study: transportation (for example, commercial driver's license) and precision production (for example, gas metal arc welding) are associated with particularly pronounced earnings premiums. Within programs, we do not observe significant heterogeneity in economic returns across student subgroups.

What’s Next?

In view of the strong economic returns associated with earning an industry credential and the noticeable variation in credential attainment across training institutions and programs, our future work will unpack the sources of variation in credential attainment rates across programs and institutions and identify program-level factors that are within an institution's control and associated with higher credential rates and smaller equity gaps. Specifically, we will collect additional survey data from the ten most highly enrolled programs in the Virginia Community College System (VCCS) to provide more nuanced program-level information and identify which malleable program factors predict higher credential attainment rates, better labor market outcomes, and smaller equity gaps in these outcomes.


Di Xu is an associate professor in the School of Education at the University of California, Irvine, and the faculty director of UCI's Postsecondary Education Research & Implementation Institute.

Ben Castleman is the Newton and Rita Meyers Associate Professor in the Economics of Education at the University of Virginia.

Betsy Tessler is a senior associate at MDRC in the Economic Mobility, Housing, and Communities policy area.

Note: A team of researchers, including Kelli Bird, Sabrina Solanki, and Michael Cooper contributed jointly to the quantitative analyses of this project. The MDRC team, including Hannah Power, Kelsey Brown, and Mark van Dok, contributed to qualitative data collection and analysis. The research team is grateful to the Virginia Community College System (VCCS) for providing access to their high-quality data. Special thanks are extended to Catherine Finnegan and her team for their valuable guidance and support throughout our partnership.

This project was funded under the Postsecondary and Adult Education research topic; questions about it should be directed to program officer James Benson (James.Benson@ed.gov).

This blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER program officer for the CTE research topic.

Developing the Vanderbilt Assessment of Leadership in Education (VALED)

As education accountability policies continue to hold school leaders responsible for the success of their schools, it is crucial to assess and develop leadership throughout the school year. In honor of the IES 20th Anniversary, we are highlighting NCER’s investment in leadership measures. This guest blog discusses the Vanderbilt Assessment of Leadership in Education (VALED). The VALED team was led by Andy Porter and included Ellen Goldring, Joseph Murphy and Steve Elliott, all at Vanderbilt University at the time. Other important contributors to the work are Xiu Cravens, Morgan Polikoff, Beth Minor Covay, and Henry May. The VALED was initially developed with funding from the Wallace Foundation and then further developed and validated with funding from IES.

What motivated your team to develop VALED?

There is currently widespread agreement that school principals have a major impact on schools and student achievement. However, at the time we developed VALED, there were few research-based instruments for measuring principal leadership effectiveness that were both aligned to licensure standards and rooted in the evidence base. Prior to the VALED, principal leadership evaluation focused primarily on managerial tasks. In contrast, we believed that principal leadership centered on improving teaching and learning, school culture, and community and parent engagement (often called learning-centered leadership) is at the core of leadership effectiveness.

What does VALED measure?

The VALED is a multi-rater assessment of learning-centered leadership behaviors. The principal, his/her supervisor, and teachers in the school complete it, which is why VALED is sometimes referred to as a 360 assessment or multi-source feedback.

VALED measures six core components and six key processes that define learning-centered leadership. The core components are high standards for student learning, rigorous curriculum, quality instruction, culture of learning and professional behavior, connections to external communities, and performance accountability. The key processes are planning, implementing, supporting, communicating, monitoring, and advocating.
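
Because each assessed leadership behavior pairs a core component with a key process, results can be organized as a 6 x 6 grid. The Python sketch below shows one hypothetical way multi-rater scores could be aggregated into such a grid; it is an assumption for illustration, not VALED's published scoring procedure.

```python
# Hypothetical aggregation of multi-rater ratings into a grid of core
# components x key processes. Invented for illustration; this is not
# VALED's published scoring procedure.

from statistics import mean

# ratings[(component, process)] pools all ratings for that cell across
# the principal's self-assessment, the supervisor, and teachers.
ratings = {
    ("quality instruction", "monitoring"): [4, 5, 3, 4],
    ("quality instruction", "planning"):   [3, 4, 4],
    ("rigorous curriculum", "planning"):   [5, 4, 4, 5],
}

def cell_means(ratings):
    """Mean rating for each (component, process) cell with any data."""
    return {cell: mean(vals) for cell, vals in ratings.items() if vals}

def component_profile(ratings):
    """Average each core component across its key-process cells."""
    profile = {}
    for (component, _), value in cell_means(ratings).items():
        profile.setdefault(component, []).append(value)
    return {c: mean(vals) for c, vals in profile.items()}

print(component_profile(ratings))
# -> approximately {'quality instruction': 3.83, 'rigorous curriculum': 4.5}
```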

How is the VALED different from other school leadership assessments?

The VALED is unique because it focuses on school leadership behaviors aligned to school improvement and school effectiveness, incorporates feedback and input from those who collaborate closely with the principal, includes a self-assessment, acknowledges the distributed work of leadership in a school, and has strong psychometric properties. Several elements contribute to the uniqueness of the instrument.

First, VALED is based on what we have learned from scholarship and academic research rather than on less robust foundations such as personal opinion or unrepresentative samples. The VALED was crafted from concepts that this body of research identifies as important. Because the instrument rests on established knowledge about the connections between leadership and learning, that knowledge provides much of the required support for the accuracy, viability, and stability of the instrument.

Second, principals rarely receive data-based feedback, even though feedback is essential for growth and improvement. The rationale behind multi-source or 360-degree feedback is that information about leadership efficacy resides in the shared experiences of the teachers and supervisors who collaborate with the principal, rather than in any one source alone. Data that pinpoint gaps between principals' own self-assessments and their teachers' and supervisors' ratings of their leadership effectiveness can serve as powerful motivators for change.

Finally, in contrast to some other leadership measures, VALED has undergone extensive psychometric development and testing. We conducted a sorting study to investigate content validity, a pilot study to address ceiling effects, and cognitive interviews to refine wording. We also conducted a known-groups study that demonstrated the tool's ability to reliably distinguish among principals, and we examined test-retest reliability, convergent-divergent validity, and relationships with principals' value-added to student achievement. As part of this testing, we identified several key properties of VALED. The measure—

  • Works well in a variety of settings and circumstances
  • Is construct valid
  • Is reliable
  • Is feasible for widespread use
  • Provides accurate and useful reporting of results
  • Is unbiased
  • Yields a diagnostic profile for summative and formative purposes
  • Can be used to measure progress over time in the development of leadership
  • Predicts important outcomes
  • Is part of a comprehensive assessment of the effectiveness of a leader's behaviors

What is the influence of VALED on education leadership research and practice?

VALED is used in schools and districts across the US and internationally for both formative and evaluative purposes to support school leadership development. For example, Baltimore City Public Schools uses VALED as a component of its school leader evaluations. VALED has also spurred studies on principal evaluation, including the associations between evaluation, feedback, and important school outcomes; the implementation of principal evaluation; and its use to support principal growth and development. In addition, it provides a reliable and valid instrument that scholars can use as a measure of leadership effectiveness in their own studies.


Andy Porter is professor emeritus of education at the Pennsylvania State University. He has published widely on psychometrics, student assessment, education indicators, and research on teaching.

Ellen Goldring is Patricia and Rodes Hart Chair, professor of education and leadership at Vanderbilt University. Her research interests focus on the intersection of education policy and school improvement with emphases on education leadership.

Joseph Murphy is an emeritus professor of education and the former Frank W. Mayborn Chair of Education at Peabody College, Vanderbilt University. He has published widely on school improvement, with special emphasis on leadership and policy, and has led national efforts to develop leadership standards.

Produced by Katina Stapleton (Katina.Stapleton@ed.gov), program officer for NCER’s education leadership portfolio.