NCEE Blog

National Center for Education Evaluation and Regional Assistance

IES Resources for Supporting Student Engagement and Attendance

The United States is facing a chronic absenteeism crisis. During the 2021–22 school year, over 14 million students nationwide were chronically absent, meaning they missed at least 10 percent of school days—equivalent to approximately 18 days in the year. Missing this much instructional time creates significant learning challenges for students and adversely affects student wellbeing. School systems across the nation are looking for ways to address this crisis and the problems that accompany it.
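The 10 percent threshold is simple to check against attendance records. As a minimal illustration (the function name and the 180-day year are assumptions for the example, not taken from any IES system):

```python
def is_chronically_absent(days_absent, days_enrolled, threshold=0.10):
    """Flag a student as chronically absent if they missed at least
    `threshold` (10 percent) of the days they were enrolled."""
    if days_enrolled <= 0:
        raise ValueError("days_enrolled must be positive")
    return days_absent / days_enrolled >= threshold

# In a typical 180-day school year, 18 missed days crosses the threshold.
print(is_chronically_absent(18, 180))  # True
print(is_chronically_absent(17, 180))  # False
```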

IES has created four handouts that discuss research findings and research-based tools from across IES that educators and policymakers can use to improve student attendance and engagement:

  • Text Messaging with Families to Support Student Attendance
  • Using Data and Early Warning Systems to Improve Student Attendance and Engagement
  • Partnering with Families to Support Student Attendance and Learning
  • Promoting a Positive School Climate and Safe Learning Conditions

These resources from IES can help educators and policymakers consider different research-based approaches to improving student engagement and attendance. They include ways to partner with families, promote a positive and safe learning environment, use data and early warning systems, and apply cycles of evidence-based continuous improvement. Before selecting any particular strategy to address chronic absenteeism, we recommend all educators consult Applying a Cycle of Evidence-Based Continuous Improvement when Selecting Interventions and Project Components to Improve Attendance. Educators can also go to the REL program page on the IES website to learn more about the program and search for other REL products and resources.

We hope you find these resources helpful. Please send any feedback or questions you may have to ncee.feedback@ed.gov.


Text Messaging with Families to Support Student Attendance

A smiling parent and child sit on a couch looking at a smart phone

Findings from IES-funded research suggest that text messaging can be effective in reducing rates of chronic absenteeism.

What is the text messaging practice?

As schools and districts work to decrease chronic absenteeism rates, text messaging has emerged as an evidence-based practice for increasing engagement with families and supporting their efforts to get students to school regularly. It involves schools sending messages to parents or guardians informing them of their child’s attendance record and encouraging them to get the child back in school. The texts can take several forms, discussed below, but generally the approach is to let families know how many total days their student has missed. The texts also often emphasize how important school attendance is and where families can turn for support if there is an issue the school should be aware of, such as chronic health challenges or transportation problems. These texts aim to engage families as partners with schools in increasing attendance.

How easy is it for schools to adopt this practice?

Text messaging is a low-cost practice that districts can adopt to encourage family engagement. The Institute of Education Sciences (IES) at the U.S. Department of Education developed a toolkit that provides information on how districts can develop their own text messaging approach. The toolkit encourages districts to form an attendance team to determine the priorities of the text messaging approach and then to develop the system that can be automated and easily implemented. The toolkit provides guidance to districts on how to incorporate existing student information systems to develop the texts.

What are the different types of text messaging?

IES’s evaluation of the text messaging strategy involved different types of text messaging to determine whether certain features improved attendance.[1] The “basic” texts involved a weekly message every Sunday to families about how important attendance is and different ways to overcome potential challenges to attendance. In addition, schools sent automated same-day texts when a child missed a day of school, personalized to include the student’s name and the total number of days the student had been absent that school year. These basic texts can be framed to emphasize the benefits of attending school (e.g., “Going to school every day can help [the child’s name] learn math and reading.”) or the consequences of missing school (e.g., “Children who miss 2 or more days a month starting in elementary school are less likely to graduate from high school.”).
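As a rough illustration of how a district’s IT team might assemble such a same-day text from attendance records, here is a hypothetical sketch; the function name, fields, and message wording are illustrative stand-ins, not the study’s actual system:

```python
# Hypothetical sketch of an automated same-day attendance text.
# Field names and framing copy are illustrative, not the IES study's exact wording.
def same_day_text(student_name, total_absences, framing="benefit"):
    base = (f"{student_name} was absent today and has missed "
            f"{total_absences} day(s) of school this year. ")
    if framing == "benefit":
        return base + (f"Going to school every day can help "
                       f"{student_name} learn math and reading.")
    # "consequence" framing
    return base + ("Children who miss 2 or more days a month starting in "
                   "elementary school are less likely to graduate from high school.")

print(same_day_text("Alex", 5))
print(same_day_text("Alex", 5, framing="consequence"))
```

A production version would pull the name and absence count from the district’s student information system and hand the message to an SMS gateway, as the toolkit describes.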

The IES evaluation also included “intensified texts,” which were targeted to families of students who had already missed many school days, despite receiving the basic texts in the previous school semester. These texts either consisted of school staff reaching out directly to families to increase family engagement and to provide individualized support, or an automated text message asking families to set goals for their child to attend school every day in the upcoming school week.

How effective is the text messaging practice at improving attendance?

The IES evaluation found that, regardless of the type or framing of text messaging the district used, the percentage of students identified as chronically absent decreased by 12 to 18 percent when schools implemented the text messaging strategy. On average, basic text messaging was sufficient to increase overall attendance in schools. Among students with a prior history of chronic absenteeism, intensified text messages further decreased the chronic absenteeism rate. Schools might therefore benefit from implementing a basic text messaging strategy for all students while targeting students with records of chronic absenteeism to receive the intensified texts. Since effectiveness did not vary by framing or approach, districts and schools can choose a strategy that meshes with their mission and approach.

Where can I learn more about how to implement text messaging?

The IES toolkit on using text messaging provides step-by-step information on how districts and schools can adopt and implement the text messaging strategy to support their families and students. The toolkit contains strategies for both the leadership team and the IT team developing the text messaging system from the district’s existing student information system. It also contains many examples of different versions of texts that a district can employ, based on the age of students, how frequently the district decides to send text messages, and the framing the district decides to use.

Resources

Heppen, J.B., Kurki, A., & Brown, S. (2020). Can texting parents improve attendance in elementary school? A test of an adaptive message strategy (NCEE 2020-006). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from https://ies.ed.gov/ncee

Kurki, A., Heppen, J.B., & Brown, S. (2021). How to text message parents to reduce chronic absence using an evidence-based approach (NCEE 2022-001). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved from https://ies.ed.gov/ncee


Using Data and Early Warning Systems to Improve Student Attendance and Engagement

Six students with their backs facing the camera walk toward a school entrance

Findings from IES-funded research suggest that early warning systems can help school systems improve student attendance.

To learn more about designing and implementing an early warning system:

  • Read the Forum Guide to Early Warning Systems (2018), published by the National Forum on Education Statistics. The guide provides information and best practices to help education agencies plan, develop, implement, and use an early warning system to inform interventions that improve student outcomes. It reviews early warning systems and their use in education agencies and explains the role of early warning indicators, quality data, and analytical models. It also describes how to adopt an effective system planning process and recommends best practices for early warning system development, implementation, and use. The guide highlights seven case studies from state and local education agencies that have implemented, or are in the process of implementing, an early warning system.
  • Read the transcript and accompanying materials from the REL webinar, Using Attendance Data for Decisionmaking: Strategies for State and Local Education Agencies (2018, REL West). The webinar includes a discussion about the Forum Guide to Early Warning Systems and Attendance Works’ Key Ingredients for Systemic Change. Presenters Sue Fothergill (Attendance Works) and Laura Hansen (Metro Nashville Public Schools) share highlights from their work conducting “deep dives” into student attendance data, including understanding the reasons students are absent and building effective interventions to address them directly. They discuss the importance of accurately tracking student attendance data and how it can inform policy and practice decisions that help students who are chronically absent get back on track.
  • Watch the State Longitudinal Data System (SLDS) program’s webinar, Supporting LEA Early Warning Systems with SEA Support and Infrastructure (2023, SLDS). This webinar includes presentations by representatives from three state education agencies (SEAs) about the SEA role in supporting local education agency (LEA) early warning systems.
  • Watch the REL webinar, Connecting with Parents about Early Warning Systems (2016, REL Midwest). This webinar is intended for a state education agency audience and discusses strategies for communicating with parents about early warning systems.
  • Read the REL report, Using Data from Schools and Child Welfare Agencies to Predict Near-Term Academic Risks (REL Mid-Atlantic, 2020), to learn about an approach for developing a model that predicts near-term academic problems such as absenteeism, suspensions, poor grades, and low performance on state tests. Written for administrators, researchers, and student support staff in local education agencies, the report describes an approach for developing a predictive model and assesses how well the model identifies at-risk students using data from two local education agencies. It also examines which types of predictors (in-school variables such as performance, behavior, and consequences, and out-of-school variables such as human services involvement and public benefit receipt) are individually related to each type of near-term academic problem, to better understand why the model might flag students as at risk and how best to support them. The study finds that predictive models using machine learning algorithms identify at-risk students with moderate to high accuracy.
  • Read the REL report, Comparing methodologies for developing an early warning system (REL Southeast, 2015). The purpose of this report was to explicate the use of logistic regression and classification and regression tree (CART) analysis in the development of early warning systems. It was motivated by state education leaders' interest in maintaining high classification accuracy while simultaneously improving practitioner understanding of the rules by which students are identified as at-risk or not at-risk readers. 
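To make the two modeling approaches in these reports concrete, here is a toy sketch contrasting a logistic-regression-style risk score with a CART-style decision rule; the variables, coefficients, and cutoffs are all invented for illustration and are not taken from either report:

```python
import math

def logistic_risk(absence_rate, gpa, b0=-1.0, b_abs=8.0, b_gpa=-0.8):
    """Logistic-regression-style score: estimated probability of a
    near-term academic problem. Coefficients are illustrative placeholders."""
    z = b0 + b_abs * absence_rate + b_gpa * gpa
    return 1 / (1 + math.exp(-z))

def cart_style_flag(absence_rate, gpa):
    """A two-split decision rule of the kind CART produces: easy for
    practitioners to read, at some cost in nuance."""
    if absence_rate >= 0.10:   # first split: attendance
        return True
    return gpa < 2.0           # second split: grades

student = {"absence_rate": 0.12, "gpa": 2.5}
print(round(logistic_risk(**student), 2))
print(cart_style_flag(**student))  # True: flagged by the attendance split
```

The tradeoff the REL Southeast report examines is visible even here: the logistic score is more nuanced, while the decision rule makes the “why was this student flagged?” question trivially easy to answer.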

Partnering with Families to Support Student Attendance and Learning

A parent and child speak with a teacher holding a tablet

Resources from the Regional Educational Laboratory (REL) Program

Schools and districts can use the following REL tools and resources to support family engagement broadly:

  • Toolkit of Resources for Engaging Families and the Community as Partners in Education (REL Pacific, 2016) This four-part resource brings together research, promising practices, and useful tools and resources to guide educators in strengthening partnerships with families and community members to support student learning. The toolkit defines family and community engagement as an overarching approach to support family well-being, strong parent-child relationships, and students' ongoing learning and development. The primary audiences are administrators, teachers, teacher leaders, and trainers in diverse schools and districts.
  • Pillars for Family Engagement: Foundation for Meaningful & Equitable School & Family Partnerships (REL Mid-Atlantic, 2021) This video highlights a research-based family engagement framework that identifies practices that are meaningful for schools in Delaware. The goal is to help support school districts in adopting and implementing research-based family engagement practices.
  • Go-Learn-Grow Toolkit: Improving the School Attendance of New Jersey’s Youngest Learners (REL Mid-Atlantic, 2019) The New Jersey Department of Education and REL Mid-Atlantic created this toolkit of simple, easy-to-use resources and handouts to support districts, schools, and early childhood providers in improving school attendance in pre-kindergarten and kindergarten. These materials aim to: help educators and families understand the importance of attendance in the early grades; encourage schools to gather and include data on preschool students when reporting chronic absenteeism rates on school report cards; help schools collect information from families to identify reasons for absenteeism in the early grades; and provide guidance on selecting and implementing research-based strategies to improve attendance in pre-kindergarten and kindergarten, based on the identified challenges.

Other REL tools and resources can support family engagement with specific types of academic content:


Promoting a Positive School Climate and Safe Learning Conditions

A teacher holds the door as smiling students leave school

All students should be afforded safe, supportive, and fair learning environments. Reducing exclusionary discipline actions is one strategy leaders may seek to use in service of that larger goal. Schools and districts can use the following REL tools and resources to support more equitable and less punitive discipline practices.

Two REL tools to support schools and districts with analyzing disciplinary data:

  • School discipline data indicators: A guide for districts and schools (REL Northwest, 2017) This guide is designed to help educators identify whether disproportionality in discipline practice exists across student groups in their schools or districts, such as groups defined by gender, race/ethnicity, or disability status. It also aims to help educators use data to reduce disproportionality in suspensions and expulsions.
  • Analyzing student-level disciplinary data: A guide for districts (REL Northeast & Islands, 2017) This guide explains how to examine student-level disciplinary data and explores differences in student academic outcomes across the types of disciplinary actions students receive. It serves as a blueprint to assist districts in designing and carrying out their own analyses and in engaging with external researchers doing the same.
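A common starting point for the kind of disproportionality analysis these guides describe is the risk ratio: a group’s discipline rate divided by the rate for all other students. A minimal sketch with invented numbers (the function and data are illustrative, not from either guide):

```python
def risk_ratio(group_suspended, group_enrolled,
               others_suspended, others_enrolled):
    """Risk ratio: one group's suspension rate divided by the rate for
    all other students. Values well above 1.0 suggest disproportionality."""
    group_rate = group_suspended / group_enrolled
    other_rate = others_suspended / others_enrolled
    return group_rate / other_rate

# Invented example: 30 of 200 students in one group were suspended,
# versus 40 of 800 among all other students.
rr = risk_ratio(30, 200, 40, 800)
print(round(rr, 2))  # 3.0
```

Here the group’s suspension rate (15 percent) is three times the rate for other students (5 percent), the kind of gap the guides suggest investigating further.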

One REL resource supports using that data to improve discipline policies: 

  • Using Data to Promote Equity in School Discipline (REL Northwest, 2019) REL Northwest developed this training series to help schools and districts improve their school discipline policies and practices. The series provides resources to help school and district teams use data to identify areas of concern related to the overuse of exclusionary discipline or disproportionality in assigning discipline to student groups, such as students of color or students with disabilities. The training series also helps teams use evidence to identify interventions, develop an action plan, track their effectiveness, and inform improvement decisions.

The following REL resources can be used to support schools and districts with improving school climate. 

Check out these REL resources on trauma and student mental health.

This blog was produced by Casey Archer (casey.archer@ed.gov), education research scientist and contracting officer’s representative for the WWC program; Liz Eisner (elizabeth.eisner@ed.gov), associate commissioner for NCEE's Knowledge Use Division; and Janelle Sands (janelle.sands@ed.gov), research analyst and contracting officer’s representative for the REL program.


[1] The study was conducted with 26,843 elementary school students during the 2017–18 school year, prior to the COVID-19 pandemic. While absenteeism rates have increased nationally post-pandemic, the practice may still help schools increase family engagement and encourage student attendance.

Getting to Know ED: My Journey as a STEM Next Fellow at IES

This guest blog was contributed by Dr. Holly Miller, who currently serves as a STEM Next Opportunity Fund Fellow at the Institute of Education Sciences’ National Center for Education Evaluation.

Since August 2022, I’ve been serving as the STEM Next Opportunity Afterschool and Summer Learning Fellow at the U.S. Department of Education (ED). More specifically, I work within the Department’s Institute of Education Sciences (IES).

Upon arriving at IES, I was charged with a specific challenge: amplify how evidence-based practice in out-of-school time (OST) can support student learning and development. This mission was made all the more relevant by the need for states and districts to respond to the consequences of the COVID-19 pandemic, which, at the time, remained an official national emergency.

Perhaps naively, I hoped to walk in on Day One and find “The Official Compendium of Evidence-based Practices in Global Pandemics and Related Crises” that I could pull off the shelf and hand to educators. Unfortunately, I quickly discovered no such tome existed. And I began to realize that one of the biggest challenges I’d face in my new role was getting to know ED itself! To an outsider, the Department can seem like a huge machine. Getting to know it, though, can pay incredible dividends. As I came to learn, there are tons of great resources—if only you know where to look.

One of OST educators’ first stops in getting to know ED should be IES. For the uninitiated, IES is the Department’s statistics, research, and evaluation arm. The mission of IES is to provide scientific evidence on which to ground education practice and policy and to share this information in formats that are useful and accessible to educators, parents, policymakers, researchers, and the public. It is independent and non-partisan.

Across its four centers—the National Centers for Education Statistics, Education Evaluation, Education Research, and Special Education Research—IES conducts six broad types of work (http://ies.ed.gov):

1. Providing data to describe the “condition of education,” including students’ academic proficiency.

2. Conducting surveys and sponsoring research projects to understand where education needs improvement and how these improvements might be made.

3. Funding development and rigorous testing of new approaches for improving education outcomes for all students.

4. Conducting large-scale evaluations of federal education programs and policies.

5. Providing resources to increase the use of data and research in education decision-making, including independent reviews of research on “what works” in education through the What Works Clearinghouse.

6. Supporting the advancement of statistics and research through specialized training and development of methods and measures.

I could see that this work had the potential to benefit a variety of stakeholders—teachers, administrators, students, researchers, and policymakers. Still, I had so many unanswered questions. As a middle school teacher, I frequently told students, “The only dumb question is the one you don’t ask.” Therefore, as I surveyed the education research landscape at IES, I asked lots and lots of questions. My presence at IES was akin to a toddler at the zoo for the first time: “What are those? Why is that so big? Why don’t we have more of these? When do we eat?” After months of asking, my queries have been distilled into two essential questions:

  1. What has been the impact of the COVID-19 pandemic on students and educators?

  2. How can education research, like that conducted or sponsored by IES, help us understand—and address—those impacts?

What has been the impact of the COVID-19 pandemic?

The pandemic disrupted nearly every aspect of daily life in the United States, including the education system. One of the most alarming impacts of the pandemic on education has been the widening of pre-existing gaps in student achievement and the resources that students need to be successful.

We all know the statistics: students have lost tons of learning. The Report on the Condition of Education is a congressionally mandated annual report from the National Center for Education Statistics (NCES). Using the most recent data available from NCES and other sources, the report contains key indicators on the condition of education in the United States at all levels, from prekindergarten through postsecondary, as well as labor force outcomes and international comparisons. For example, the recently released Report on the Condition of Education 2023 shows that on both the 4th- and 8th-grade NAEP mathematics assessments, higher percentages of students performed below NAEP Basic in 2022 than in 2019 (Irwin et al., 2023). These declines have been particularly pronounced among students who have historically been underserved. The average NAEP mathematics scores in 2022 were generally lower for English learner (EL) students than for non-EL students; lower for students identified as having disabilities than for their peers without disabilities; and higher for students in low-poverty schools than for students in high-poverty schools. These patterns were similar to those observed for reading (Irwin et al., 2023).

This is surely due, at least in part, to differences in the resources students can access. Even before the pandemic, huge gaps in resources existed; the pandemic only made matters worse. According to a report by the U.S. Department of Education’s Office for Civil Rights (OCR) (2021), low-income students and students of color were disproportionately harmed by school shutdowns and remote learning practices. These students often lack access to reliable technology and internet service, making it difficult for them to participate fully in online classes and complete assignments. Additionally, many students rely on meals provided by schools, so the closure of physical school buildings led to food insecurity for some.

Also of note is the dramatic effect on student wellbeing. During the pandemic, mental health concerns such as fear, anxiety, and depression were common among the general public, especially children and older adults (Brooks et al., 2020; Pfefferbaum & North, 2020). Research on the pandemic’s impact on students’ mental health finds that they “showed increased fear, stress, and decreased happiness, and these were associated with their learning quality change” (Hu et al., 2022).

Furthermore, the impact of COVID-19 on educators is increasingly well known. Educators had to make changes in short order, often with limited resources, and this had consequences. Educators faced increased stress due to the shift to remote instruction, and many reported struggling to maintain a work-life balance while working from home. Findings indicate that teachers reported greater mental health concerns than those in many other professions, and that remote teachers reported significantly higher levels of distress than those teaching in person (Kush et al., 2021). For some, it was too much, and they decided to leave the profession. Forty percent of public schools hiring for open teaching positions in special education in 2020–21 reported difficulty filling the opening, compared with 17 percent in 2011–12 (Irwin et al., 2023). Not only were teachers leaving the workforce, but potential teachers were second-guessing their career choice: the number of persons enrolled in traditional teacher preparation programs decreased by 30 percent between 2012–13 and 2019–20, and the number completing such programs decreased by 28 percent over the same period (Irwin et al., 2023).

All of us are looking for solutions to all these problems. Given that I entered IES during the pandemic, I wanted to know how I could leverage its resources to help.

How can education research help?

First, I had to understand how IES, as a science agency, was structured to do the work of education research. My college textbook on education research (Newby, 2010) asserted that it should have three objectives: to explore issues and find answers to questions, to collect and disseminate information that shapes policy and decision-making, and to improve practice for practitioners.

It’s easy to see how the six broad areas of work at IES listed above fit within those three objectives. For example, in normal (that is, pre-COVID) times, it’s the job of the National Center for Education Statistics (NCES) to collect and disseminate education-related statistics and information about student achievement to inform the work of researchers, policymakers, and other education decision-makers. IES’s two research centers, the National Center for Education Research (NCER) and the National Center for Special Education Research (NCSER), support researchers’ exploration of a wide range of education topics and their use of high-quality methods to answer important questions of policy and practice. Finally, the National Center for Education Evaluation and Regional Assistance (NCEE) conducts its own rigorous evaluations of federal policies and programs; supports states and districts in the use of data, evidence, and applied research to improve local practice; and disseminates information about “what works” through its What Works Clearinghouse (WWC). In the wake of the pandemic, IES had to quickly refocus its activities and resources to meet new demands across the education system. Here are just a few of the new questions IES had to address amid the pandemic.

  • What’s happening in schools, and who is learning in-person versus virtually or in hybrid settings? In late 2021, NCES leveraged work being done as part of the National Assessment of Educational Progress (NAEP) to meet an immediate need to better understand schools’ policies about learning mode, masking, and social distancing. In the weeks that followed, the School Pulse Panel was created (https://ies.ed.gov/schoolsurvey/spp/). Initially, the School Pulse focused on collecting monthly information on the impact of the COVID-19 pandemic from a national sample of elementary, middle, high, and combined-grade public schools. Over time, its focus has broadened. While some survey questions are asked repeatedly to observe trends over time, others are unique each month. IES is now able to provide regular and near-real-time snapshots into “what’s happening” in the nation’s schools on a wide range of topics that matter to educators, policymakers, and families.

  • How can educators and caregivers support student learning in online, hybrid, and at-home settings? With schools closed and remote learning becoming the norm, educators and caregivers had to adapt their teaching methods and find new ways to engage students. As part of a mandate to provide assistance about “what works” in education, NCEE supported a series of efforts to bring together information for teachers navigating online and hybrid teaching environments and for caregivers who were providing instruction at home. NCEE commissioned work leading to the development of the “Best Practice in K-12 Online Teaching” minicourse (here), freely available from North Carolina State University, to support teachers new to online education in their transition to the medium. (The literature review on which the mini-course is based can be found here). NCEE’s Regional Educational Laboratories developed nearly 200 pandemic-related resources. Notable examples include “Supporting Your Child’s Reading at Home” (https://ies.ed.gov/ncee/rel/Products/Region/southeast/Resource/100679), which focuses on the development of early literacy skills, and “Teaching Math to Young Children for Families and Caregivers” (https://ies.ed.gov/ncee/rel/Products/Region/central/Resource/100652).


Since its inception in 2002, IES and its centers have supported decision-makers—be they federal, state, or local—and educators in making use of high-quality evidence in their practice. The pandemic showed just how important IES, its resources, and its infrastructure can be.

In the pandemic’s wake, though, it seems to me that building even more evidence about “what works” is vital. The American Rescue Plan (ARP) provided historic levels of resources to expand educational opportunities and to ensure that education is better able to address the wide-ranging needs of students and their families – especially those who were disproportionately impacted by the pandemic. Many ARP investments, including those related to OST, have the requirement that programs be rooted in evidence-based practices. Because there are still things to learn about what makes strong programs, we can strengthen the field by building evidence that can address key problems of practice.

Conclusion

When I came to ED and IES, searching for information on how to use evidence-based practices to support COVID recovery within the context of OST, I was lost. As I’ve come to better understand the organization, I’ve learned that vast resources are available. Half of the battle was just figuring out “what lives where” within the Department! I hope this blog has given OST practitioners a bit of a roadmap to make their own process of discovery easier.

In Part Two of this series, I will explore how OST learning fits into ED, education research, and the post-pandemic education system. That system has been profoundly affected, creating an opportunity for innovation and transformation in the delivery of education, and the value of research in this context cannot be overstated. My next blog will therefore pose two questions. First, I’ll ask what role OST can play in learning recovery in the years ahead. Then I’ll consider what evidence needs to be built to make the most of what OST can offer. I hope you’ll read it!

I’d love to hear your thoughts on this blog. Send them my way at holly.miller@ed.gov.

 

Citations

Brooks, S. K., Webster, R. K., Smith, L. E., Woodland, L., Wessely, S., Greenberg, N., & Rubin, G. J. (2020). The psychological impact of quarantine and how to reduce it: Rapid review of the evidence. The Lancet, 395(10227), 912–920.

Hu, K., Godfrey, K., Ren, Q., Wang, S., Yang, X., & Li, Q. (2022). The impact of the COVID-19 pandemic on college students in USA: Two years later. Psychiatry Research, 315, 114685.

Huck, C., & Zhang, J. (2021). Effects of the COVID-19 pandemic on K-12 education: A systematic literature review. New Waves-Educational Research and Development Journal, 24(1), 53–84.

Irwin, V., Wang, K., Tezil, T., Zhang, J., Filbey, A., Jung, J., ... & Parker, S. (2023). Report on the Condition of Education 2023 (NCES 2023-144). National Center for Education Statistics.

Kush, J. M., Badillo-Goicoechea, E., Musci, R. J., & Stuart, E. A. (2021). Teacher mental health during the COVID-19 pandemic: Informing policies to support teacher well-being and effective teaching practices.

Newby, P. (2010). Research Methods for Education. Pearson Education.

Pfefferbaum, B., & North, C. S. (2020). Mental health and the Covid-19 pandemic. New England Journal of Medicine, 383(6), 510–512.

U.S. Department of Education, Office for Civil Rights. (2021). Education in a Pandemic: The Disparate Impacts of COVID-19 on America's Students.

An open letter to Superintendents, as summer begins

If the blockbuster attendance at last month’s Summer Learning and Enrichment Collaborative convening is any sign, many of you are in the midst of planning—or have already started to put in place—your plans for summer learning. As you take time to review resources from the Collaborative and see what’s been learned from the National Summer Learning Project, I’d like to add just one more consideration to your list: please use this summer as a chance to build evidence about “what works” to improve outcomes for your students. In a word: evaluate!

Given all the things that need to be put in place to even make summer learning happen, it’s fair to ask why evaluation merits even a passing thought.   

I’m urging you to consider building evidence about the outcomes of your program through evaluation because I can guarantee you that, in about a year, someone to whom you really want to give a thoughtful, complete answer will ask “so, what did we accomplish last summer?” (Depending upon who they are, and what they care about, that question can vary. Twists can include “what did students learn” or business officers’ pragmatic “what bang did we get for that buck.”)

When that moment comes, I want you to be able to smile, take a deep breath, and rattle off the sort of polished elevator speech that good data, well-analyzed, can help you craft. The alternative—mild to moderate panic followed by an unsatisfying version of “well, you know, we had to implement quickly”—is avoidable. Here’s how.

  1. Get clear on outcomes. You probably have multiple goals for your summer learning programs, including those that are academic, social-emotional, and behavioral. Nonetheless, there’s probably a single word (or a short phrase) that completes the following sentence: “The thing we really want for our students this summer is …” It might be “to rebuild strong relationships between families and schools,” “to be physically and emotionally safe,” or “to get back on track in math.” Whatever it is, get clear on two things: (1) the primary outcome(s) of your program and (2) how you will measure that outcome once the summer comes to an end. Importantly, you should consider outcome measures that will be available for both program participants and non-participants so that you can tell the story about the “value add” of summer learning. (You can certainly also include measures relevant only to participants, especially ones that help you track whether you are implementing your program as designed.)
  2. Use a logic model. Logic models are the “storyboard” of your program, depicting exactly how its activities will come together to cause improvement in the student outcomes that matter most. Logic models force program designers to be explicit about each component of their program and its intended impact. Taking time to develop a logic model can expose potentially unreasonable assumptions and missing supports that, if added, would make it more likely that a program succeeds. If you don’t already have a favorite logic model tool, we have resources available for free!  
  3. Implement evidence-based practices aligned to program outcomes. A wise colleague (h/t Melissa Moritz) recently reminded me that a summer program is the “container” (for lack of a better word) in which other learning experiences and educationally purposeful content are packaged, and that there are evidence-based practices for the design and delivery of both. (Remember: “evidence-based practices” run the gamut from those that demonstrate a rationale to those supported by promising, moderate, or strong evidence.) As you are using the best available evidence to build a strong summer program, don’t forget to ensure you’re using evidence-based practices in service of the specific outcomes you want those programs to achieve. For example, if your primary goal for students is math catch-up, then the foundation of your summer program should be an evidence-based Tier I math curriculum. If it is truly important that students achieve the outcome you’ve set for them, then they’re deserving of evidence-based educational practices supported by an evidence-based program design!
  4. Monitor and support implementation. Once your summer program is up and running, it’s useful to understand just how well your plan—the logic model you developed earlier—is playing out in real life. If staff trainings were planned, did they occur and did everyone attend as scheduled? Are activities occurring as intended, with the level of quality that was hoped for? Are attendance and engagement high? Monitoring implementation alerts you to where things may be “off track,” flagging where more supports for your team might be helpful. And, importantly, it can provide useful context for the outcomes you observe at the end of the summer. If you don't already have an established protocol for using data as part of continuous improvement, free resources are available!
  5. Track student attendance. If you don’t know who—specifically—participated in summer learning activities, describing how well those activities “worked” can get tricky. Whether your program is full-day, half-day, in-person, hybrid, or something else, develop a system to track (1) who was present, (2) on what days, and (3) for how long. Then, store that information in your student information system (or another database) where it can be accessed later. 
  6. Analyze and report your data, with an explicit eye toward equity. Data and data analysis can help you tell the story of your summer programming. Given the disproportionate impact COVID has had on students that many education systems have underserved, placing equity at the center of your planned analyses is critical. For example:
    • Who did—and who did not—participate in summer programs? Data collected to monitor attendance should allow you to know who (specifically) participated in your summer programs. With that information, you can prepare simple tables that show the total number of participants and that total broken down by important student subgroups, such as gender, race/ethnicity, or socioeconomic status. Importantly, those data for your program should be compared with similar data for your school or district (as appropriate). Explore, for example, whether there are one or more populations disproportionately underrepresented in your program and the implications for the work both now and next summer.
    • How strong was attendance? Prior research has suggested that students benefit the most from summer programs when they are “high attenders.” (Twenty or more days out of programs’ typical 25 to 30 total days.) Using your daily, by-student attendance data, calculate attendance intensity for your program’s participants overall and by important student subgroups. For example, what percentage of students attended 0 to 24%, 25% to 49%, 50% to 74%, or 75% or more of program days?
    • How did important outcomes vary between program participants and non-participants? At the outset of the planning process, you identified one or more outcomes you hoped students would achieve by participating in your program and how you’d measure them. In the case of a “math catch-up” program, for example, you might be hoping that more summer learning participants get a score of “on-grade level” at the outset of the school year than their non-participating peers, potentially promising evidence that the program might have offered a benefit. Disaggregating these results by student subgroups when possible highlights whether the program might have been more effective for some students than others, providing insight into potential changes for next year’s work.     
    • Remember that collecting and analyzing data is just a means to an end: learning to inform improvement. Consider how involving program designers and participants—including educators, parents, and students—in discussions about what was learned as a result of your analyses can be used to strengthen next year’s program.
  7. Ask for help. If you choose to take up the evaluation mantle to build evidence about your summer program, bravo! And know that you do not have to do it alone. First, think locally. Are you near a two-year or four-year college? Consider contacting their education faculty to see whether they’re up for a collaboration. Second, explore whether your state has a state-wide “research hub” for education issues (e.g., Delaware, Tennessee) that could point you in the direction of a state or local evaluation expert. Third, connect with your state’s Regional Educational Lab or Regional Comprehensive Center for guidance or a referral. Finally, consider joining the national conversation! If you would be interested in participating in an Evaluation Working Group, email my colleague Melissa Moritz at melissa.w.moritz@ed.gov.
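The attendance-intensity bands described in step 6 can be computed directly from the daily, by-student attendance records collected in step 5. Below is a minimal sketch in Python; the student IDs, the 30-day program length, and the band cutoffs are illustrative assumptions, not a prescribed format, and a real system would pull these records from your student information system:

```python
from collections import Counter

def attendance_band(days_attended, total_days):
    """Place one student's attendance in an intensity band (cutoffs are illustrative)."""
    pct = 100 * days_attended / total_days
    if pct < 25:
        return "0-24%"
    elif pct < 50:
        return "25-49%"
    elif pct < 75:
        return "50-74%"
    return "75%+"

def band_counts(records, total_days):
    """Count students per band; records are (student_id, days_attended) pairs."""
    return Counter(attendance_band(days, total_days) for _, days in records)

# Hypothetical roster for a 30-day program:
# s01 and s03 land in 75%+, s02 in 25-49%, s04 in 0-24%, s05 in 50-74%.
roster = [("s01", 28), ("s02", 12), ("s03", 30), ("s04", 6), ("s05", 20)]
print(band_counts(roster, 30))
```

The same per-student records can be joined to subgroup fields (gender, race/ethnicity, socioeconomic status) to produce the disaggregated tables discussed above.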

Summer 2021 is shaping up to be one for the record books. For many students, summer is a time for rest and relaxation. But this year, it will also be a time for reengaging students and families with their school communities and, we hope, a significant amount of learning. Spending time now thinking about measuring that reengagement and learning—even in simple ways—will pay dividends this summer and beyond.

My colleagues and I at the U.S. Department of Education are here to help, and we welcome your feedback. Please feel free to contact me directly at matthew.soldner@ed.gov.

Matthew Soldner
Commissioner, National Center for Education Evaluation and Regional Assistance

NCEE is hiring!

The U.S. Department of Education’s Institute of Education Sciences (IES) is seeking professionals in education-related fields to apply for an open position in the National Center for Education Evaluation and Regional Assistance (NCEE). Located in NCEE’s Evaluation Division, this position would support impact evaluations and policy implementation studies. Learn more about our work here: https://ies.ed.gov/ncee.

If you are even potentially interested in this sort of position, you are strongly encouraged to set up a profile in USAJobs (https://www.usajobs.gov/) and to upload your information now. As you build your profile, include all relevant research experience on your resume whether acquired in a paid or unpaid position. The position will open in USAJobs on July 15, 2019 and will close as soon as 50 applications are received, or on July 29, 2019, whichever is earlier. Getting everything in can take longer than you might expect, so please apply as soon as the position opens in USAJobs (look for vacancy number IES-2019-0023).


Regional Educational Laboratories: Connecting Research to Practice

By Joy Lesnick, Acting Commissioner, NCEE

Welcome to the NCEE Blog! 


We look forward to using this space to provide information and insights about the work of the National Center for Education Evaluation and Regional Assistance (NCEE). A part of the Institute of Education Sciences (IES), NCEE’s primary goal is providing practitioners and policymakers with research-based information they can use to make informed decisions. 

We do this in a variety of ways, including large-scale evaluations of education programs and practices supported by federal funds; independent reviews and syntheses of research on what works in education; a searchable database of research citations and articles (ERIC); and reference searches from the National Library of Education. We will explore more of this work in future blogs, but in this post I’d like to talk about an important part of NCEE—the Regional Educational Laboratories (RELs).

It’s a timely topic. Last week, the U.S. Department of Education released a solicitation for organizations seeking to become REL contractors beginning in 2017 (the five-year contracts for the current RELs will conclude at the end of 2016). The REL program is an important part of the IES infrastructure for bridging education research and practice. Through the RELs, IES seeks to ensure that research does not “sit on a shelf” but rather is broadly shared in ways that are relevant and engaging to policymakers and practitioners. The RELs also involve state and district staff in collaborative research projects focused on pressing problems of practice. An important aspect of the RELs’ work is supporting the use of research in education decision making – a charge that the Every Student Succeeds Act has made even more critical.

The RELs and their staff must be able to navigate comfortably between the two worlds of education research and education practice, and understand the norms and requirements of both.  As part of this navigating, RELs focus on: (1) balancing rigor and relevance; (2) differentiating support to stakeholders based on need; (3) providing information in the short term, and developing evidence over the long term; and (4) addressing local issues that can also benefit the nation.

While the RELs are guided by federal legislation, their work reflects – and responds to – the needs of their communities. Each REL has a governing board composed of state and local education leaders that sets priorities for REL work. Also, nearly all REL work is conducted in collaboration with research alliances, which are ongoing partnerships in which researchers and regional stakeholders work together over time to use research to address an education problem.

Since the current round of RELs was awarded in 2012, these labs and their partners have conducted meaningful research resulting in published reports and tools, held hundreds of online and in-person seminars and training events that have been attended by practitioners across the country, and produced videos of their work that you can find on the REL Playlist on the IES YouTube site. Currently, the RELs have more than 100 projects in progress. RELs do work in nearly every topic that is crucial to improving education—kindergarten readiness, parent engagement, discipline, STEM education, college and career readiness, teacher preparation and evaluation, and much more.

IES’s vision is that the 2017–2022 RELs will build on and extend the current priorities of high-quality research, genuine partnership, and effective communication, while also tackling high-leverage education problems.  High-leverage problems are those that: (1) if addressed could result in substantial improvements in education outcomes for many students or for key subgroups of students; (2) are priorities for regional policymakers, particularly at the state level; and (3) require research or research-related support to address well. Focusing on high-leverage problems increases the likelihood that REL support ultimately will contribute to improved student outcomes.

Visit the IES REL website to learn more about the 2012-2017 RELs and how you can connect with the REL that serves your region.  Visit the FedBizOpps website for information about the competition for the 2017-2022 RELs.