Inside IES Research

Notes from NCER & NCSER

Navigating the ESSER Funding Cliff: A Toolkit for Evidence-Based Financial Decisions

As the federal Elementary and Secondary School Emergency Relief (ESSER) funds approach their expiration date in September 2024, schools and districts across the nation are facing a budgeting season like no other. ESSER funding has played a pivotal role in supporting schools in recovery from the COVID-19 pandemic, but with the deadline looming, districts and charters must take stock of their investments and ensure that programs that are making a positive impact for students continue in a post-ESSER world.

A team at the North Carolina Department of Public Instruction (NCDPI) has been examining COVID-19 learning recovery in the state as part of their Using Longitudinal Data to Support State Education Policymaking project, which is part of the IES-funded RESTART network. In addition to this research, members of the NCDPI team developed a toolkit to help local leaders decide which programs to continue or discontinue as the federal funding that has supported post-pandemic learning recovery expires. Through their work in the Office of Learning and Research (OLR) and the Division of Innovation at NCDPI, Rachel Wright-Junio, Jeni Corn, and Andrew Smith are responsible for managing statewide programs, conducting research on innovative teaching practices, and sharing insights using modern data analysis and visualization techniques. In this guest blog, they describe the need for the toolkit, how they developed it, how it is being used, and next steps.

The ESSER Funding Cliff Toolkit: A Data-Driven Approach

To help district and school leaders navigate the financial uncertainties following the end of ESSER funding, the OLR team created a Funding Cliff Toolkit as a starting point for data-driven decision-making based on unique local contexts. The toolkit provides a comprehensive set of resources, including a Return on Investment (ROI) Framework and Calculator that uses detailed data on ESSER expenditures as well as data on how various investments affected student outcomes. By using this toolkit, schools and districts can assess what worked during the ESSER funding period, identify areas for improvement, and create sustainable financial plans that ensure effective programs continue regardless of funding.
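
To make the ROI idea concrete, here is a minimal sketch of the kind of per-pupil cost-effectiveness comparison such a calculator supports. The program names, costs, and effect sizes are invented for illustration, and the NCDPI toolkit's own formulas may differ.

```python
# Minimal sketch of a generic per-pupil cost-effectiveness comparison for
# programs competing for post-ESSER dollars. All names, costs, and effect
# sizes are hypothetical; the NCDPI toolkit's own formulas may differ.

programs = [
    # (program name, total annual cost in $, students served, outcome gain in SD units)
    ("High-dosage tutoring",  450_000, 300, 0.20),
    ("Summer learning camp",  300_000, 500, 0.05),
    ("After-school homework", 120_000, 400, 0.03),
]

for name, cost, students, effect in programs:
    cost_per_pupil = cost / students
    # Outcome gain per $1,000 spent per pupil: higher means more cost-effective.
    effect_per_1k = effect / (cost_per_pupil / 1_000)
    print(f"{name:22s} ${cost_per_pupil:7,.0f}/pupil   {effect_per_1k:.3f} SD per $1,000")
```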

Knowing the far-reaching implications of this type of tool, the OLR team worked with federal programs and finance leaders across NCDPI. Additionally, they consulted leaders, including superintendents and chief financial officers of North Carolina school districts and charter schools, in the design process to ensure that the tool met their immediate needs. Finally, Drs. Brooks Bowden, associate professor at the Graduate School of Education at the University of Pennsylvania, and Nora Gordon, professor at the Georgetown University McCourt School of Public Policy, served as collaborators on the design of the ROI toolkit to ensure its validity.

Rolling Out the Toolkit: Engaging Leaders Across the State

In rolling out the toolkit, the OLR Team intentionally invited diverse stakeholders to the table, including district staff from finance, federal programs, academics, and cabinet-level leadership. It was crucial to bring together the financial, compliance, and programmatic pieces of the “ESSER puzzle” so that these leaders could work collaboratively to take stock of their ESSER-funded investments and explore academic progress post-pandemic.

To ensure that the ESSER Funding Cliff Toolkit reached as many district leaders as possible, the OLR Team organized a comprehensive rollout plan, which began with a series of introductory webinars that provided an overview of the toolkit and its components. These webinars were followed by nine in-person sessions held across the eight state board of education regions of North Carolina and attended by more than 400 leaders. Building upon the initial learning from the informational webinars, the in-person sessions featured interactive presentations that allowed district teams to practice using the tool with simulated data as well as their own. By the end of each session, participants left with new, personalized data sets and tools to tackle the impending ESSER funding cliff. After each session, the team collected feedback that improved the toolkit and subsequent learning sessions. This process laid the groundwork for continued support and collaboration among district and school leaders.

What's Next: Expanding the Toolkit's Reach

Next steps for the OLR Team include expanding the use of the toolkit and working with districts and charter schools to apply the ROI framework to evidence-based financial decisions across all funding sources. Districts are already using the toolkit beyond ESSER-funded programs. One district shared how it applied the ROI framework to its afterschool tutoring programs. Other districts have shared how they plan to use the ROI framework and funding cliff toolkit to guide conversations with principals who receive Title I funds in their schools to determine potential tradeoffs in the upcoming budget year.

As North Carolina schools inch closer to the end of ESSER, the goal is to continue to encourage districts and charters to incorporate evidence-based decision-making into their budgeting and program planning processes. This ensures that districts and schools are prioritizing those programs and initiatives that deliver the most significant impact for students.

In addition to expanding support to North Carolina districts and schools, we also hope that this supportive approach can be replicated by other state education agencies (SEAs) across the nation. We are honored to have our toolkit featured in the National Comprehensive Center’s upcoming Community of Practice (CoP), Strategic Planning for Continued Recovery (SPCR), and believe that cross-SEA collaboration in this CoP will improve the usefulness of the toolkit.


Rachel Wright-Junio is the director of the Office of Learning and Research (OLR) at the North Carolina Department of Public Instruction (NCDPI); Jeni Corn is the director of research and evaluation in OLR; and Andrew Smith is the deputy state superintendent in the NCDPI Division of Innovation.

Contact Rachel Wright-Junio at Rachel.WrightJunio@dpi.nc.gov for the webinar recording or copies of the slide deck from the in-person sessions.

This guest blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), NCER Program Officer.

Evidence on CTE: A Convening of Consequence

In 2018, NCER funded a research network to build the evidence base for career and technical education (CTE). As with other research networks, the CTE Research Network comprises a set of research teams and a lead team, which is responsible for collaboration and dissemination among the research teams. On March 20, 2024, the Network held a final convening to present its findings to the field. In this blog, Network Lead Member Tara Smith, Network Director Kathy Hughes, and NCER Program Officer Corinne Alfeld reflect on the success of the final convening and share thoughts about future directions.

Insights From the Convening

An audience of CTE Research Network members, joined by educators, administrators, policymakers, and other CTE stakeholders, gathered for a one-day convening to hear about the Network’s findings. Several aspects of the meeting contributed to its significance.

  • The presentations highlighted an important focus of the Network – making research accessible to and usable for practitioners. The agenda included presentations from four Network member researchers and their district or state partners from New York City and North Carolina. Each presentation highlighted positive impacts of CTE participation, but more importantly, they demonstrated the value of translating research findings into action. Translation involves collaboration between researchers and education agency staff to develop joint research questions and to discuss how findings can be used to improve programs to better serve students or to take an innovative practice and scale it to other pathways and schools.
  • Brand-new and much-anticipated research was released at the convening. The Network lead announced a systematic review of all of the rigorous causal research on secondary-level CTE from the last 20 years. This is an exciting advancement for building the evidence base for CTE, which was the purpose of the Network. The meta-analysis found that CTE has statistically significant positive impacts on several high school outcomes, such as academic achievement, high school completion, employability skills, and college readiness. The review found no statistically significant negative impacts of CTE participation (for readers curious about how effects are pooled across studies, see the illustrative sketch after this list). The evidence points to the power of CTE to transform lives, although more research is needed. To guide future research, the review provided a “gap analysis” of where causal research is lacking, such as the impact of high school CTE participation on academic achievement in college or on attainment of a postsecondary degree.
  • National CTE leaders and experts put the research findings into a policy context and broadcast their importance. These speakers commented on the value of research for CTE advocacy on Capitol Hill and in states, and for informing decisions about how to target resources. Luke Rhine, the deputy assistant secretary of the Office of Career, Technical, and Adult Education (OCTAE), said, “The best policy is informed by practice [...] and the best practice is informed by research.” Kate Kreamer, the executive director of Advance CTE, emphasized the importance of research in dispelling myths, saying that “if the data are not there, that allows people to fill the gaps with their assumptions.” However, she noted, as research increasingly shows the effectiveness of CTE, we must also guard against CTE programs becoming selective and thus limiting equitable access.
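
For readers curious about the mechanics behind pooled estimates like those in the Network lead’s review, the sketch below shows a standard random-effects (DerSimonian-Laird) pooling calculation. The study effect sizes and standard errors are invented; they are not the review’s actual data.

```python
# Illustrative sketch of random-effects (DerSimonian-Laird) pooling, the
# standard machinery behind a meta-analysis like the one described above.
# The effect sizes and standard errors below are invented, not the review's data.
import math

# (study effect size in standard-deviation units, standard error)
studies = [(0.15, 0.06), (0.22, 0.08), (0.05, 0.04), (0.30, 0.10)]

# Fixed-effect (inverse-variance) weights and heterogeneity statistic Q
w = [1 / se**2 for _, se in studies]
fe_mean = sum(wi * es for wi, (es, _) in zip(w, studies)) / sum(w)
q = sum(wi * (es - fe_mean) ** 2 for wi, (es, _) in zip(w, studies))

# DerSimonian-Laird estimate of between-study variance tau^2
df = len(studies) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects pooled mean and 95% confidence interval
w_re = [1 / (se**2 + tau2) for _, se in studies]
re_mean = sum(wi * es for wi, (es, _) in zip(w_re, studies)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
lo, hi = re_mean - 1.96 * se_re, re_mean + 1.96 * se_re
print(f"pooled effect = {re_mean:.3f} SD, 95% CI [{lo:.3f}, {hi:.3f}]")
```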

New Directions

In addition to filling the critical gaps identified by the Network lead’s review, other future research questions suggested by researchers, practitioners, and policymakers at the convening include:

  • How can we factor in the varied contexts of CTE programs and the wide range of experiences of CTE students to understand which components of CTE really matter? What does it look like when those are done well? What does it take to do them well? Where is it happening?
  • How can we learn more about why students decide to participate in CTE generally and in their chosen pathway? What are the key components of useful supports that schools can provide to help them make these decisions?
  • How do we engage employers more deeply and actively in CTE programs and implement high quality work-based learning to ensure that students are acquiring skills and credentials that are valued in the labor market?
  • What are evidence-based practices for supporting special student populations, such as students with disabilities or English language learners?
  • How can we harness state longitudinal data systems that link education and employment data to examine the long-term labor market outcomes of individuals from various backgrounds who participated in different career clusters or who had access to multiple CTE experiences?

While IES alone will not be able to fund all the needed research, state agencies, school districts, and even individual CTE programs can partner with researchers to study what works in their context and identify where more innovation and investment is needed. The work of the CTE Research Network has provided a good evidence base with which to start, and a good model for additional research that improves practice and policy. Fortunately, the CTE research field will continue to grow via the support of a new CTE Research Network – stay tuned for more information!


This blog was co-written by CTE Network Lead Member Tara Smith of Jobs for the Future, CTE Network Director Kathy Hughes of AIR, and NCER Program Officer Corinne Alfeld.

Questions can be addressed to Corinne.Alfeld@ed.gov.

ED/IES SBIR: Advancing Research to Practice at Scale in Education


The Department of Education and Institute of Education Sciences Small Business Innovation Research Program (known as ED/IES SBIR) funds projects to develop and evaluate new education technology products that are ready to be widely deployed to address pressing educational needs.

In advance of IES Innovation Day at the ED Games Expo on September 21, 2023, at the Kennedy Center REACH in Washington, DC, this blog features a series of ED/IES SBIR awards funded to ready previously developed, evidence-based products for use at scale. Two of the projects highlighted below, one led by Jay Connor of Learning Ovations and the other by Clark McKown of xSEL Labs, will be featured as part of panels. This event is open to the public. Register for the Expo here.


Over its 20-year history, ED/IES SBIR has been well known for stimulating pioneering firms, such as Filament Games, Future Engineers, PocketLab, and Schell Games, to create entrepreneurial and novel education technology products. ED/IES SBIR has also established a track record for investing in a different set of projects—ones that facilitate the uptake of innovations originally developed in university or laboratory settings. This is important because even when researcher-developed innovations (for example, models, programs, and tools) are shown to have evidence for impact, many are not delivered at scale, preventing learners from fully benefiting from these innovations.

Examples of ED/IES SBIR Research to Practice Projects

Over the past two decades, ED/IES SBIR projects have provided useful models for how researchers can navigate and overcome the research-to-practice gap. ED/IES SBIR has made several awards to projects that were originally researcher-initiated, many through IES research grants. These researchers either founded a small business or partnered with an existing small business to develop and commercialize new education technology products to advance research to practice at scale in education.

The short descriptions of these projects below include links to IES website pages with additional information on the unique project models. These projects converted findings from research into scalable, technology-delivered interventions; added new components to existing research-based prototypes to enable feasible implementation and improve the user experience; and upgraded technology systems to handle large numbers of users across numerous sites.

  • Learning Ovations: Through a series of IES- and NIH-funded studies, Dr. Carol Connor led an academic team to develop a personalized early learning assessment, the A2i, and demonstrated its efficacy for improving literacy outcomes through multiple trials. To ready the A2i for use in larger numbers of settings and to improve data processing and reporting, Learning Ovations won an ED/IES SBIR award to upgrade the underlying data architecture and create automated supports and functionalities. In 2022, Scholastic acquired Learning Ovations, with plans for the A2i to be integrated into its suite of products. See the Learning Ovations Success Story for more information.
  • Mindset Works: Through an IES research grant in 2002 and with funding from other sources, Dr. Carol Dweck led a research team to develop the concept of the growth mindset—the understanding that ability and intelligence can develop with effort and learning. Lisa Blackwell, a member of the research team, founded Mindset Works and won a 2010 ED/IES SBIR award to develop training modules and animated lessons to deploy this instructional model through a multi-media website. A research grant funded in 2015 tested and demonstrated the efficacy of the technology-delivered Growth Mindset Intervention to improve outcomes of struggling learners. See the Mindset Works Success Story for more information.
  • Nimble Assessment Systems: Through IES and other grants, Dr. Michael Russell led a team of researchers to conduct foundational research and to develop and validate new forms of assessment. Informed by this research, Nimble Assessment Systems won an ED/IES SBIR award to develop NimbleTools, a set of universally designed accommodation tools to improve the accessibility of assessments for students with disabilities. Measured Progress acquired Nimble Assessment Systems, and the product was integrated into its suite of products for state and district assessments. See the Nimble Tools Success Story for more information.
  • Children’s Progress: Through NIH grants, Dr. Eugene Galanter led a research team to create a computer-based assessment that adapted to how a student responded to each question and delivered individualized narratives for each student. With awards from NIH SBIR and ED/IES SBIR, Children’s Progress developed a commercial version of the computer-adaptive dynamic assessment, the Children’s Progress Academic Assessment (CPAA), for early childhood literacy and math. In 2012, Northwest Evaluation Association (NWEA) acquired Children’s Progress, and the assessment technology was incorporated into NWEA’s assessment platform and used at scale. See the Children’s Progress Success Story for more information.
  • Teachley: Through IES- and NSF-funded research, Dr. Herb Ginsburg led an academic team to develop prototype software programs for children from preschool to grade 3 to practice mathematics. In 2011, three members of the research team founded a small business, Teachley, which won ED/IES SBIR awards to extend the research model into easily playable, engaging, and widely used math game apps. See the Teachley Success Story for more information.
  • Analytic Measures: With funding from IES, Dr. Jared Bernstein led a research team to develop prototypes of automated oral reading fluency assessments that were administered to students during NAEP and other national assessments by IES’s National Center for Education Statistics. Analytic Measures won ED/IES SBIR awards (here and here) to develop the school-ready version of these assessments. In 2022, Google acquired the intellectual property of the assessments with plans to incorporate the tools into its suite of products for education. See the Analytic Measures Success Story for more information.
  • Lightning Squad: Through awards from ED’s Office of Educational Research and Improvement (now IES) and the Office of Elementary and Secondary Education, Drs. Nancy Madden and Bob Slavin led a research team to develop a model to make tutoring more cost-effective. With awards from ED/IES SBIR, Sirius Thinking partnered with Success For All to develop a mixed online and face-to-face multimedia intervention for struggling readers in grades 1 to 3. The program is now in wide-scale use in schools and in tutoring programs. See the Lightning Squad Success Story for more information.
  • Apprendis: With research grants from IES and other sources, Dr. Janice Gobert led teams at Worcester Polytechnic Institute and Rutgers University to develop and evaluate Inq-ITS (Inquiry Intelligent Tutoring System) virtual labs for students in grades 4 to 10. Apprendis was founded to commercialize Inq-ITS and won an ED/IES SBIR award to develop a teacher alert system that generates real-time insights to inform instruction. Inq-ITS is currently in wide-scale use.
  • Common Ground Publishing: Through IES and other grants, Drs. Bill Cope and Mary Kalantzis led a team of researchers to conduct research on new forms of technology-delivered formative assessment for student writing. Common Ground Publishing, a technology-based company spun out of a university tech-transfer office, won ED/IES SBIR awards (here and here) to develop CGScholar based on this research. CGScholar is an AI-based digital media learning management system designed to support student writing, learning, and formative assessment, and it has been in wide-scale use for several years. See the CGScholar Success Story for more information.
  • xSEL Labs: With funding from IES, Dr. Clark McKown led a team to develop screening assessments for social and emotional learning and conducted research to demonstrate the efficacy of the tool. xSEL Labs was founded to commercialize the assessments and, with an ED/IES SBIR award, is developing a platform to support educators and administrators using research-based SEL assessments. In 2023, 7 Mindsets acquired xSEL Labs to commercialize the platform at scale.

A New Program Area at ED/IES SBIR to Continue Advancing Research to Practice

With a history of awards to advance research to practice, ED/IES SBIR created a new program area in 2022 called Direct to Phase II to invest in more projects to develop commercially viable education technology products to ready existing evidence-based research for use at scale. The program resulted in one award (see here) in 2022. Please see the ED/IES SBIR solicitation page for information on the next opportunity for funding through its FY2024 program.


Stay tuned for updates on Twitter, Facebook, and LinkedIn as ED/IES SBIR continues to support projects to advance research to practice at scale.

Edward Metz (Edward.Metz@ed.gov) is a research scientist and the program manager for the Small Business Innovation Research Program at the US Department of Education’s Institute of Education Sciences.


Adult Foundational Skills Research: Reflections on PIAAC and Data on U.S. Adult Skills

In this blog, NCER program officer Dr. Meredith Larson interviews Dr. Holly Xie from NCES about the Program for the International Assessment of Adult Competencies (PIAAC), an OECD-developed international survey of adult skills in literacy, numeracy, and digital problem solving administered at least once a decade. PIAAC also collects information on adult activities (such as skill use at home or work, civic participation, etc.), demographics (such as level of education, race), and other factors (such as health outcomes). To date, NCER has funded three research grants (here, here, and here) and one training grant that relied on PIAAC data.

NCES has led the U.S. efforts in administering PIAAC and has been sharing results for over a decade. PIAAC Cycle I (PIAAC I) included three waves of data collection in the United States, with the first data released in 2013. From PIAAC I, we learned a wealth of information about the skills of U.S. adults. For example, the 2017 wave of data collection found that the percentages of U.S. adults performing at the lowest levels were 19 percent in literacy, 29 percent in numeracy, and 24 percent in digital problem solving. As we look forward to learning from PIAAC II, Dr. Xie reflects on the products from PIAAC I and possibilities for PIAAC II (to be released in 2024).

What is your role at NCES and with PIAAC specifically?

I am the PIAAC national program manager and oversee all aspects of the PIAAC study in the United States, including development and design, data collection, analysis and reporting, and dissemination/outreach. I also represent the United States at PIAAC international meetings.

What is something you’re particularly excited about having produced during PIAAC I?

I am most excited about the U.S. PIAAC Skills Map. The Skills Map provides information on adult skills at the state and county levels. Users can explore adult skills in literacy and numeracy in their state or county and get estimates of literacy or numeracy proficiency overall and by age and education levels. Or they can compare a county to a state, a state to the nation, or compare counties (or states) to each other. The map also has demographic and socioeconomic data from the American Community Survey (ACS) to provide context for the state or county estimates. This YouTube video demonstrates what the map can do.


We also have other PIAAC web products and publications such as national and international reports, Data Points, and PIAAC publications that provide invaluable information on U.S. adult skills and interrelationships of those skills to other social, economic, and demographic factors.

Do you have examples of how information from PIAAC I has been used?

PIAAC data cover results at the national, state, and county levels, and as such, they can be useful for policymakers or decision makers who would like to know where things stand in terms of the skills of their adult population and where they need to allocate resources at these different levels of the system. In other words, PIAAC data can be useful for drafting targeted policies and programs that will benefit their population and constituencies.

For instance, at the national level, former Vice President Biden used information from PIAAC I in his report Ready to Work for the June 2014 reauthorization of the Workforce Innovation and Opportunity Act, known as WIOA. PIAAC was also cited in the discussion of extending the Second Chance Pell experiment as identified in the 2019 report titled Prisoners’ Eligibility for Pell Grants: Issues for Congress.

The Digital Equity Act of 2021 also leveraged the PIAAC. This legislation identifies particular populations that determine the funding formula. The quick guide to these populations uses PIAAC to estimate one of these populations: Individuals with a language barrier, including individuals who are English learners and have low levels of literacy.

Local governments have also used PIAAC products. For example, the Houston Mayor’s Office for Adult Literacy in collaboration with the Barbara Bush Foundation used the PIAAC Skills Map data to inform the Adult Literacy Blueprint.

And the adult education advocacy group ProLiteracy also used the PIAAC and the Skills Map to develop a toolkit for local adult education and adult literacy program advocacy.

When will the results of PIAAC II be available, and how does this cycle differ from PIAAC I?

PIAAC II data collection began in 2022, and results, to be released in December 2024, will include information on the literacy, numeracy, and adaptive problem-solving skills of adults in the United States. The numeracy assessment now includes a measure of “numeracy components,” which focuses on number sense, smaller/bigger number values, measurement, etc. This information will help us learn more about the skills of adults who have very low numeracy skills. The adaptive problem-solving component is a new PIAAC module and will measure the ability to achieve one’s goals in a dynamic situation in which a method for reaching a solution is not directly available.

PIAAC II will also include, for the first time, questions about financial literacy in the background questionnaire, using items on managing money and tracking spending and income, savings methods, and budgeting. These additional questions will allow people to explore relationships between foundational skills, financial literacy, and other constructs in PIAAC.

What types of research could you imagine stemming from the PIAAC II?

One of the most distinctive features of PIAAC (both PIAAC I and II) is the direct assessment of literacy, numeracy, and problem-solving skills (information that no other large-scale assessment of adults provides). Thirty-one countries, including the United States, participated in PIAAC II (2022/23), so researchers will be able to compare adult skills at the international level and also study trends between PIAAC I and PIAAC II.

It’s worth noting that the data collection took place while we were still experiencing the effects of the COVID-19 pandemic. This may provide researchers opportunities to explore how the pandemic is related to adults’ skills, health, employment, training, and education status.

Where can the public access data from PIAAC?

Researchers can find information about the available data from the national U.S. PIAAC 2017 Household, PIAAC 2012/14 Household, and PIAAC 2014 Prison datasets, and international and trend datasets on the NCES Data Files page. PIAAC restricted-use data files contain more detailed information, such as continuous age and earnings variables, that can be used for more in-depth analysis. Accessing the restricted-use data requires a restricted-use license from NCES.
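
For researchers planning analyses, one practical wrinkle is that PIAAC reports each respondent’s proficiency as ten plausible values, which must be combined with the sampling weights. Below is a minimal sketch under the assumption of a public-use file with the standard PIAAC column names (PVLIT1 through PVLIT10 and the final sample weight SPFWT0); the file name is hypothetical, and a complete analysis would also use the replicate weights to estimate the sampling-variance component.

```python
# Minimal sketch: estimating mean adult literacy from PIAAC microdata.
# Assumes the standard PIAAC column names (plausible values PVLIT1..PVLIT10
# and final sample weight SPFWT0); the file name is hypothetical. A complete
# analysis would also use the replicate weights to estimate sampling variance.
import numpy as np
import pandas as pd

df = pd.read_csv("piaac_us_household_puf.csv")  # hypothetical file name

pv_cols = [f"PVLIT{i}" for i in range(1, 11)]
weights = df["SPFWT0"]

# Weighted mean for each of the ten plausible values, then average them.
pv_means = np.array([np.average(df[c], weights=weights) for c in pv_cols])
mean_literacy = pv_means.mean()

# Between-plausible-value (imputation) variance component from Rubin's rules.
between_var = pv_means.var(ddof=1)
imputation_var = (1 + 1 / len(pv_cols)) * between_var

print(f"estimated mean literacy score: {mean_literacy:.1f}")
print(f"imputation variance component: {imputation_var:.3f}")
```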

NCES also has an easy-to-use online analysis tool: the International Data Explorer (IDE). The IDE allows users to work directly with the PIAAC data and produce their own analyses, tables, regressions, and charts. An IDE tutorial video provides comprehensive, step-by-step instructions on how to use this tool. It contains detailed information about the content and capabilities of the PIAAC IDE, as well as how the PIAAC data are organized in the tool.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), research analyst and program officer for postsecondary and adult education, NCER.

Recipe for High-Impact Research

The Edunomics Lab at Georgetown University’s McCourt School of Public Policy has developed NERD$, a national school-by-school spending data archive of per-pupil expenditures using the financial data states publish as required by the Every Student Succeeds Act (ESSA). In this guest blog post, Laura Anderson, Ash Dhammani, Katie Silberstein, Jessica Swanson, and Marguerite Roza of the Edunomics Lab discuss what they have learned about making their research more usable by practitioners.


When it comes to getting research and data used, it’s not just a case of “build it and they will come” (apologies to the movie “Field of Dreams”). In our experience, we’ve found that state, district, and school leaders want—and need—help translating data and research findings to inform decision making.

Researchers frequently use our IES-funded school-by-school spending archive called NERD$: National Education Resource Database on Schools. But we knew the data could have immediate, effective, and practical use for education leaders as well, helping them make spending decisions that advance equity and leverage funds to maximize student outcomes. Funding from the U.S. Department of Education’s National Comprehensive Center enabled us to expand on the IES-funded NERD$ by building the School Spending & Outcomes Snapshot (SSOS), a customizable, research-tested data visualization tool. We published a related guide on leading productive conversations on resource equity and outcomes and conducted numerous trainings for federal, state, and local leaders on using SSOS. The data visualizations we created drew on more than two years of pilot efforts with 26 school districts to find what works best to drive strategic conversations.

 
We see this task of translating research to practice as an essential element of our research efforts. Here, we share lessons learned from designing data tools with end users in mind, toward helping other researchers maximize the impact of their own work.

Users want findings converted into user-friendly data visualizations. Before seeing the bar chart below, leaders of Elgin Area School District U-46 in Illinois did not realize that they were not systematically allocating more money per student to schools with higher shares of economically disadvantaged students. It was only when they saw their schools lined up from lowest to highest by per pupil spending and color coded by the share of economically disadvantaged students (with green low and red high) that they realized their allocations were all over the map.


Note. This figure provides two pieces of information on schools in Elgin Area School District U-46. It shows the spending per pupil for each school in dollars, and it shows the share of the students in each school who are categorized as economically disadvantaged, from 0 to 100%, using the colors green to red. The schools are lined up from lowest to highest per-pupil spending. When lined up this way, schools with more economically disadvantaged students show no pattern, falling across the spectrum from low- to high-spending schools. The figure made it easy for users to see that there is little correlation between a school's per-pupil spending and the percentage of economically disadvantaged students it serves.
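
For readers who want to build a similar display from their own NERD$ download, a minimal sketch follows. The school names, spending levels, and poverty shares are invented for illustration.

```python
# Minimal sketch of the ranked, color-coded bar chart described above.
# School names, spending, and poverty shares are invented for illustration.
import matplotlib.pyplot as plt

schools = [("A", 9800, 0.82), ("B", 11200, 0.35), ("C", 10400, 0.67),
           ("D", 12900, 0.48), ("E", 8900, 0.91), ("F", 11800, 0.22)]

# Sort schools from lowest to highest per-pupil spending.
schools.sort(key=lambda s: s[1])
names = [s[0] for s in schools]
spending = [s[1] for s in schools]
poverty = [s[2] for s in schools]

# Color each bar from green (low poverty share) to red (high poverty share).
colors = plt.cm.RdYlGn_r(poverty)

plt.bar(names, spending, color=colors)
plt.ylabel("Per-pupil spending ($)")
plt.title("Schools ranked by per-pupil spending,\ncolored by share economically disadvantaged")
plt.show()
```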


Users want research converted into locally relevant numbers. Embedding district-by-district figures into visualizations takes effort but pays off. Practitioners and decisionmakers can identify what the research means for their own context, making the data more immediately actionable in their local community.

That means merging lots of data for users. We merged demographic, spending, and outcomes data for easy one-stop access in the SSOS tool. In doing so, users could then do things like compare peer schools with similar demographics and similar per-student spending levels, surfacing schools that have been able to do more for students with the same amount of money. Sharing with lower-performing schools what those standout schools are doing can open the door for peer learning toward improving schooling.
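
As a rough sketch of that peer-comparison logic, the snippet below buckets schools by poverty share and per-pupil spending and ranks outcomes within each bucket. The column names and values are hypothetical stand-ins for merged SSOS-style data, not the tool’s actual implementation.

```python
# Minimal sketch of surfacing "standout" peer schools: bucket schools by
# poverty share and per-pupil spending, then rank outcomes within buckets.
# Column names and values are hypothetical stand-ins for merged SSOS-style data.
import pandas as pd

df = pd.DataFrame({
    "school":   ["A", "B", "C", "D", "E", "F"],
    "poverty":  [0.82, 0.80, 0.35, 0.33, 0.67, 0.65],      # share econ. disadvantaged
    "spend_pp": [9800, 9900, 11200, 11100, 10400, 10500],  # per-pupil spending ($)
    "outcome":  [0.42, 0.58, 0.71, 0.55, 0.49, 0.63],      # composite math/reading score
})

# Define peer groups: 10-point poverty bins crossed with $1,000 spending bins.
df["peer_group"] = (
    (df["poverty"] * 100 // 10).astype(int).astype(str)
    + "-"
    + (df["spend_pp"] // 1000).astype(str)
)

# Within each peer group, rank schools by outcome to surface the standouts.
df["rank_in_group"] = df.groupby("peer_group")["outcome"].rank(ascending=False)
print(df.sort_values(["peer_group", "rank_in_group"]))
```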

Data displays need translations to enable interpretation. In our pilot effort, we learned that at first glance, the SSOS-generated scatterplot below could be overwhelming or confusing. In focus groups, we found that by including translation statements, such as labeling the four quadrants clearly, the information became more quickly digestible.


Note. This figure provides three pieces of information on schools in Elgin Area School District U-46. It shows the spending per pupil for each school in dollars, the share of the students in each school who are categorized as economically disadvantaged from 0 to 100% using the colors green to red, and the achievement level of each school based on a composite of its students’ math and reading scores. Schools are placed into 1 of 4 categories on the figure, and a translation statement is put in each category to make clear what each category represents. These four translation statements are: 1) spend fewer dollars than peers but get higher student outcomes, 2) spend fewer dollars than peers but get lower student outcomes, 3) spend more dollars than peers but get higher student outcomes, and 4) spend more dollars than peers but get lower student outcomes. These translation statements were found to make it easier for users to understand the data presented in the figure.
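
A minimal sketch of such a quadrant-labeled scatterplot appears below. The spending and outcome values are invented, and the real SSOS tool preloads each district’s actual figures.

```python
# Minimal sketch of the quadrant-labeled scatterplot described above.
# Spending and outcome values are invented for illustration.
import matplotlib.pyplot as plt

spend = [9800, 11200, 10400, 12900, 8900, 11800]
outcome = [0.42, 0.71, 0.49, 0.55, 0.58, 0.38]

fig, ax = plt.subplots()
ax.scatter(spend, outcome)

# Split the plot into quadrants at the averages.
x_mid = sum(spend) / len(spend)
y_mid = sum(outcome) / len(outcome)
ax.axvline(x_mid, linestyle="--")
ax.axhline(y_mid, linestyle="--")

# Translation statements, one per quadrant, as in the SSOS displays.
ax.text(0.02, 0.98, "Fewer dollars,\nhigher outcomes", transform=ax.transAxes, va="top")
ax.text(0.98, 0.98, "More dollars,\nhigher outcomes", transform=ax.transAxes, va="top", ha="right")
ax.text(0.02, 0.02, "Fewer dollars,\nlower outcomes", transform=ax.transAxes, va="bottom")
ax.text(0.98, 0.02, "More dollars,\nlower outcomes", transform=ax.transAxes, va="bottom", ha="right")

ax.set_xlabel("Per-pupil spending ($)")
ax.set_ylabel("Composite outcome score")
plt.show()
```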


Short webinar trainings (like this one on SSOS) greatly enhanced usage. Users seem willing to come to short tutorials (preloaded with their data). Recording these tutorials meant that attendees could share them with their teams.

Users need guidance on how and when to use research findings. We saw usage increase when leaders were given specific suggestions on when and where to bring their data. For instance, we advised that school board members could bring NERD$ data to early-stage budget workshops held in the spring. That way, the data could inform spending decisions before district budgets get finalized and sent to the full board for approval in May or June.

It's worth the extra effort to make research usable and useful. These efforts to translate data and findings so they are accessible to end users have helped make the federally supported school-by-school spending dataset an indispensable resource for research, policy, and practice. NERD$ makes transparent how much money each school gets from its district. SSOS helps move the conversation beyond “how much” to “what is the money doing” for student outcomes and equity, encouraging stakeholders to dig into how spending patterns are or are not related to performance. State education agencies are using the displays to conduct ESSA-required resource allocation reviews in districts that serve low-performing schools. The tool has more than 5,000 views, and we have trained more than 2,000 education leaders on how to use the data displays to improve schooling.

IES has made clear it wants research to be used, not to sit on a shelf. In our experience, designing visualizations and other tools around user needs can make data accessible, actionable, and impactful.


This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner, NCER.