IES Blog

Institute of Education Sciences

Investing in Next Generation Technologies for Education and Special Education

The Department of Education’s (ED) Small Business Innovation Research (SBIR) program, administered by the Institute of Education Sciences (IES), funds entrepreneurial developers to create the next generation of technology products for students, teachers, and administrators in education and special education. The program, known as ED/IES SBIR, emphasizes an iterative design and development process and pilot research to test the feasibility, usability, and promise of new products to improve outcomes. The program also focuses on planning for commercialization so that the products can reach schools and end-users and be sustained over time.

In recent years, millions of students in tens of thousands of schools around the country have used technologies developed through ED/IES SBIR, including more than a million students and teachers who used products for remote teaching and learning during the COVID-19 pandemic.

ED/IES SBIR Announces 2022 Awards

IES has made ten 2022 Phase I awards of $250,000 each.* During these eight-month projects, teams will develop and refine prototypes of new products and test their usability and initial feasibility. All awardees who complete a Phase I project will be eligible to apply for a Phase II award in 2023.

IES has made nine 2022 Phase II awards, which support further research and development of education technology prototypes that were created under 2021 ED/IES SBIR Phase I awards. In these Phase II projects, teams will complete product development and conduct pilot studies in schools to demonstrate the usability, feasibility, and fidelity of implementation of the products, as well as their promise for improving the intended outcomes.

IES also made one Direct to Phase II award, which supports the research, development, and evaluation of a new education technology product that readies an existing researcher-developed, evidence-based intervention for use at scale, along with planning for commercialization. A Direct to Phase II project is awarded without a prior Phase I award. All Phase II and Direct to Phase II awards are for $1,000,000 over two years. Across all awards, projects address a range of student ages and content areas.

The list of all 2022 awards is posted here. This page will be updated with the two additional Phase I awards after the contracts are finalized.


The 2022 ED/IES SBIR awards highlight three trends that continue to emerge in the field of education technology.

Trend 1: Projects Are Employing Advanced Technologies to Personalize Learning and Generate Insights to Inform Tailored Instruction

About two-thirds of the new projects are developing software components that personalize teaching and learning, whether through artificial intelligence, machine learning, natural language processing, automated speech recognition, or other algorithms. All these projects will use functionalities afforded by modern technology to personalize learning by adjusting content to the level of the individual learner, offering feedback and prompts to scaffold learning as students progress through the systems, and generating real-time, actionable information that educators can use to track and understand student progress and adjust instruction accordingly. For example:

  • Charmtech Labs and Literably are fully developing reading assessments that provide feedback to inform instruction.
  • Sirius Thinking and studio:Sckaal are developing prototypes to formatively assess early grade school students in reading.
  • Sown To Grow and xSEL Labs are fully developing platforms to facilitate student social and emotional assessments and provide insights to educators.
  • Future Engineers is fully developing a platform for judges to provide feedback to students who enter STEM and educational challenges and contests.
  • Querium and 2Sigma School are developing prototypes to support math and computer science learning, respectively.
  • Soterix is fully developing a smart walking cane and app for children with visual impairments to learn to navigate.
  • Alchemie is fully developing a product to provide audio cues to blind or visually impaired students learning science.
  • Star Autism Support is developing a prototype to support practitioners and parents of children with autism spectrum disorder.

Trend 2: Projects Focusing on Experiential and Hands-On Learning

Several new projects are combining hardware and software solutions to engage students through pedagogies employing game-based, hands-on, collaborative, or immersive learning:

  • Pocketlab is fully developing a matchbox-sized car with a sensor to collect physical science data as middle school students play.
  • GaiaXus is developing a prototype sensor used for environmental science field experiments.
  • Mind Trust is developing a virtual reality escape room for biology learning.
  • Smart Girls is developing a prototype science game and accompanying real-world hands-on physical activity kits.
  • Indelible Learning is developing a prototype online multi-player game about the electoral college.
  • Edify is fully developing a school-based program for students to learn about, create, and play music.

Trend 3: Projects to Advance Research to Practice at Scale

Several new awards will advance existing education research-based practices into new technology products that are ready to be delivered at scale:

  • INSIGHTS is fully developing a new technology-delivered version to ready an NIH- and IES-supported social and emotional intervention for use at scale.
  • xSEL Labs and Charmtech Labs (noted above) are building on prior IES-funded, research-based interventions to create scalable products.
  • Scrible is developing an online writing platform in partnership with the National Writing Project, based on prior Department of Education-funded research.

*Note: Two additional 2022 Phase I awards are forthcoming; those contracts are delayed due to a backlog in the SAM registration process.

Stay tuned for updates on Twitter and Facebook as IES continues to support innovative forms of technology.

Edward Metz (Edward.Metz@ed.gov) is the Program Manager of the ED/IES SBIR program.

Michael Leonard (Michael.Leonard@ed.gov) is the Program Analyst of the ED/IES SBIR program.


Improving Academic Achievement through Instruction in Self-Regulated Strategy Development: The Science Behind the Practice

Self-Regulated Strategy Development (SRSD) is an evidence-based instructional approach characterized by active, discussion-based, scaffolded, and explicit teaching of the writing process; general and genre-specific knowledge; academic vocabulary; and validated strategies for reading and writing. IES has supported multiple research studies on SRSD for students with learning disabilities in K-12 and postsecondary general education settings. SRSD is used in as many as 10,000 classrooms across the United States and in 12 other countries. In this interview blog, we spoke with Dr. Karen Harris, the developer of SRSD, to learn more about this effective instructional approach, the IES research behind it, and next steps for further scaling SRSD so that more students can benefit.

What led you to develop the Self-Regulated Strategy Development model?


I began developing what became the SRSD model of instruction in the 1980s, based on my experiences tutoring and teaching. No one theory could address all of what I needed to do as a teacher, or all that my students needed as learners. SRSD instruction pulls together what has been learned from research across theories of learning and teaching. It is a multicomponent instructional model that addresses affective, behavioral, and cognitive aspects of learning. Further, SRSD instruction is intended to take place in inclusive classrooms, is discourse-driven, integrates social-emotional supports, and involves learning in whole class and group settings with peer collaboration. SRSD research started in writing because Steve Graham (my husband and colleague) was deeply interested in writing, and we co-designed the initial studies. Today, SRSD instruction research exists across a wide variety of areas, such as reading comprehension, mathematical problem solving, fractions, social studies, and science.

What are some of the key findings about this instructional strategy?

SRSD has been recognized by the What Works Clearinghouse (WWC) as an evidence-based practice with consistently positive effects on writing outcomes. A 2013 meta-analysis of SRSD for writing found that SRSD was effective across different research teams, different methodologies, differing genres of writing (such as narrative or persuasive), and students with diverse needs, including students with learning disabilities and emotional and behavioral disorders. Effect sizes in SRSD research are typically large, exceeding 0.85 in meta-analyses and commonly ranging from 1.0 to 2.55 across writing and affective outcome measures.
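
For readers less familiar with the metric, effect sizes of this kind are standardized mean differences: the treatment-control difference in average outcomes divided by a pooled standard deviation. A minimal sketch of the usual calculation, with generic symbols rather than figures from the meta-analyses above:

    % Standardized mean difference (Cohen's d), with pooled SD
    d = \frac{\bar{X}_{\text{SRSD}} - \bar{X}_{\text{control}}}{s_{\text{pooled}}},
    \qquad
    s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}

On this scale, a value of 1.0 means the average SRSD student scored a full standard deviation above the average comparison student.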

Over the years, IES has supported a number of studies on SRSD, which has led to some key findings that have practical implications for instruction from elementary school through college.

Do you know how many teachers use SRSD in their classrooms?

It is hard to be sure how prevalent SRSD instruction is in practice, but there are two groups dedicated to scaling up SRSD in schools—thinkSRSD and SRSD Online—both of which I voluntarily advise. Together, they have reached over 300,000 students and their teachers in the United States. In addition, I am following or in touch with researchers or teachers in 12 countries across Europe, North America, Australia, Africa, Asia, and the Middle East.

What’s next for research on SRSD?  

Many students have difficulty writing by the time they get to upper elementary school. An ongoing development project is adapting and testing SRSD for children in the lower elementary grades to support their oral language, transcription, and writing strategy skills. The research team is conducting a small-scale randomized controlled study and will have findings soon.

Beyond this study, there are many future directions for SRSD research, including further work in different genres of writing, different grades, and involving families in the SRSD process. More work on how to integrate SRSD strategies into instruction across content areas, such as social studies or science, is also needed. Despite the evidence base for and interest in SRSD, a major challenge is scaling up SRSD in schools. We and other researchers have identified numerous barriers to this goal. We also need research on working with administrators, schools, and teachers to use writing more effectively as a tool for self-expression, self-advocacy, and social and political engagement. Writing can also be an important and effective means of addressing issues of equity and identity, and little SRSD research has been done in these areas.

Dr. Karen Harris is Regents Professor and the Mary Emily Warner Professor at Arizona State University’s Mary Lou Fulton Teachers College. Her current research focuses on refining a web-based intelligent tutor to augment SRSD instruction with elementary students in persuasive writing, integrating SRSD with reading to learn and writing to inform, developing a Universal Design for Learning Science Notebook, and developing practice-based professional development for SRSD.

This blog was produced by Julianne Kasper, Virtual Student Federal Service intern at IES and graduate student in education policy & leadership at American University.

Unexpected Value from Conducting Value-Added Analysis

This is the second of a two-part blog series from an IES-funded partnership project. The first part described how the process of cost-effectiveness analysis (CEA) provided useful information that led to changes in practice for a school nurse program and restorative practices at Jefferson County Public Schools (JCPS) in Louisville, KY. In this guest blog, the team discusses how the process of conducting value-added analysis provided useful program information over and above the information they obtained via CEA or academic return on investment (AROI).

Since we know you loved the last one, it’s time for another fun thought experiment! Imagine that you have just spent more than a year gathering, cleaning, assembling, and analyzing a dataset of school investments for what you hope will be an innovative approach to program evaluation. Now imagine the only thing your results tell you is that your proposed new application of value-added analysis (VAA) is not well-suited for these particular data. What would you do? Well, sit back and enjoy another round of schadenfreude at our expense. Once again, our team of practitioners from JCPS and researchers from Teachers College, Columbia University and American University found itself in a very unenviable position.

We had initially planned to use rigorous VAA (and CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions on existing school- and district-level investments. Although the three methods—VAA, CEA, and AROI—vary in rigor and address slightly different research questions, we expected that their results would be both complementary and comparable for informing decisions to reinvest, discontinue, expand/contract, or make other implementation changes to an investment. To that end, we set out to test our hypothesis by comparing results from each method across a broad spectrum of investments. Fortunately, as with CEA, the process of conducting VAA provided additional, useful program information that we would not otherwise have obtained via CEA or AROI. This unexpected information, combined with what we'd learned about implementation from our CEAs, led to even more changes in practice at JCPS.
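
For readers unfamiliar with the method, the core of a value-added model is a regression of a current outcome on prior achievement plus an indicator for the program being evaluated, so that the program coefficient estimates its contribution to student growth. A minimal sketch in Python, assuming hypothetical column names and data rather than JCPS's actual model:

    # Minimal value-added sketch: regress the current-year outcome on
    # prior achievement plus a program-participation indicator.
    # Column names and the data file are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("student_outcomes.csv")
    # posttest: current-year score; pretest: prior-year score;
    # program: 1 if the student participated in the investment, else 0
    model = smf.ols("posttest ~ pretest + program", data=df).fit()
    print(model.params["program"])  # estimated value-added of the program

Real applications typically add student and school covariates and account for clustering, but the program coefficient plays the same role.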

Data Collection for VAA Unearthed Inadequate Record-keeping, Mission Drift, and More

Our AROI approach uses existing student and budget data from JCPS’s online Investment Tracking System (ITS) to compute comparative metrics for informing budget decisions. Budget request proposals submitted by JCPS administrators through ITS include information on target populations, goals, measures, and the budget cycle (1-5 years) needed to achieve the goals. For VAA, we needed similar, but more precise, data to estimate the relative effects of specific interventions on student outcomes, which required us to contact schools and district departments to gather the necessary information. Our colleagues provided us with sufficient data to conduct VAA. However, during this process, we discovered instances of missing or inadequate participant rosters; mission drift in how requested funds were actually spent; and mismatches between goals, activities, and budget cycles. We suspect that JCPS is not alone in this challenge, so we hope that what follows might be helpful to other districts facing similar scenarios.

More Changes in Practice 

The lessons learned during the school nursing and restorative practice CEAs discussed in the first blog, and the data gaps identified through the VAA process, informed two key developments at JCPS. First, we formalized our existing end-of-cycle investment review process by including summary cards for each end-of-cycle investment item (each program or personnel position in which district funds were invested) indicating where insufficient data (for example, incomplete budget requests or unavailable participation rosters) precluded AROI calculations. We asked specific questions about missing data to elicit additional information and to encourage more diligent documentation in future budget requests. 

Second, we created the Investment Tracking System 2.0 (ITS 2.0), which now requires budget requesters to complete a basic logic model. The resources (inputs) and outcomes in the logic model are auto-populated from information entered earlier in the request process, but requesters must manually enter activities and progress monitoring (outputs). Our goal is to encourage and facilitate development of an explicit theory of change at the outset and continuous evidence-based adjustments throughout the implementation. Mandatory entry fields now prevent requesters from submitting incomplete budget requests. The new system was immediately put into action to track all school-level Elementary and Secondary School Emergency Relief (ESSER)-related budget requests.
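
As an illustration of the guardrail ITS 2.0 adds, here is a minimal sketch of mandatory-field validation for a budget request. The field names mirror the logic model components described above but are hypothetical, not the actual ITS 2.0 schema:

    # Hypothetical sketch of ITS 2.0-style mandatory fields.
    from dataclasses import dataclass, fields

    @dataclass
    class BudgetRequest:
        resources: str             # inputs; auto-populated in ITS 2.0
        outcomes: str              # auto-populated in ITS 2.0
        activities: str            # entered manually by the requester
        progress_monitoring: str   # outputs; entered manually

    def missing_fields(request: BudgetRequest) -> list[str]:
        """Return the names of any empty mandatory fields."""
        return [f.name for f in fields(request)
                if not getattr(request, f.name).strip()]

    req = BudgetRequest("Title I funds", "Higher reading scores", "", "")
    print(missing_fields(req))  # ['activities', 'progress_monitoring']

A request would be accepted only when the list of missing fields is empty.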

Process and Partnership, Redux

Although we agree with the IES Director’s insistence that partnerships between researchers and practitioners should be a means to (eventually) improving student outcomes, our experience shows that change happens slowly in a large district. Yet, we have seen substantial changes as a direct result of our partnership. Perhaps the most important change is the drastic increase in the number of programs, investments, and other initiatives that will be evaluable as a result of formalizing the end-of-cycle review process and creating ITS 2.0. We firmly believe these changes could not have happened apart from our partnership and the freedom our funding afforded us to experiment with new approaches to addressing the challenges we face.   


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Robert Shand is an Assistant Professor at American University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.


Unexpected Benefits of Conducting Cost-Effectiveness Analysis

This is the first of a two-part guest blog series from an IES-funded partnership project between Teachers College, Columbia University, American University, and Jefferson County Public Schools in Kentucky. The purpose of the project is to explore academic return on investment (AROI) as a metric for improving decision-making around education programs that lead to improvements in student education outcomes. In this guest blog entry, the team showcases cost analysis as an integral part of education program evaluation.

Here’s a fun thought experiment (well, at least fun for researcher-types). Imagine you just discovered that two of your district partner’s firmly entrenched initiatives are not cost-effective. What would you do? 

Now, would your answer change if we told you that the findings came amidst a global pandemic and widespread social unrest over justice reform, and that those two key initiatives were a school nurse program and restorative practices? That’s the exact situation we faced last year in Jefferson County Public Schools (JCPS) in Louisville, KY. Fortunately, the process of conducting rigorous cost analyses of these programs unearthed critical evidence to help explain mostly null impact findings and inform very real changes in practice at JCPS.

Cost-Effectiveness Analysis Revealed Missing Program Components

Our team of researchers from Teachers College, Columbia University and American University, and practitioners from JCPS had originally planned to use cost-effectiveness analysis (CEA) to evaluate the validity of a practical measure of academic return on investment for improving school budget decisions. With the gracious support of JCPS program personnel in executing our CEAs, we obtained a treasure trove of additional quantitative and qualitative cost and implementation data, which proved to be invaluable.
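
At its core, CEA compares alternatives on incremental cost per unit of effect. A minimal sketch of the ratio, with generic symbols rather than JCPS figures:

    % Cost-effectiveness ratio: incremental cost per unit of effect
    \mathrm{CER} = \frac{C_{\text{program}} - C_{\text{alternative}}}
                        {E_{\text{program}} - E_{\text{alternative}}}

A program is relatively cost-effective when it delivers each unit of effect (for example, a gain in attendance or test scores) at a lower incremental cost than the alternatives.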

Specifically, for the district’s school nurse program, the lack of an explicit theory of change, of standardized evidence-based practices across schools, and of a monitoring plan was identified as a potential explanation for our null impact results. In one of our restorative practices cost interviews, we discovered that a key element of the program, restorative conferences, was not being implemented at all due to time constraints and staffing challenges, which may help explain the disappointing impact results.

Changes in Practice

In theory, our CEA findings indicated that JCPS should find more cost-effective alternatives to school nursing and restorative practices. In reality, however, both programs were greatly expanded: school nursing in response to COVID, and restorative practices because JCPS leadership has committed to moving away from traditional disciplinary practices. Our findings regarding implementation, however, lead us to believe that key changes can improve student outcomes for both.

In response to recommendations from the team, JCPS is developing a training manual for new nurses, a logic model illustrating how specific nursing activities can lead to better outcomes, and a monitoring plan. For restorative practices, while we still have a ways to go, the JCPS team is continuing to work with program personnel to improve implementation.

One encouraging finding from our CEA was that, despite imperfect implementation, suspension rates for Black students were lower in schools that had implemented restorative practices for two years than in schools implementing the program for only one year. Our hope is that further research will identify the aspects of restorative practices most critical for equitably improving school discipline and climate.

Process and Partnership

Our experience highlights unexpected benefits that can result when researchers and practitioners collaborate on all aspects of cost-effectiveness analysis, from collecting data to applying findings to practice. In fact, we are convinced that the ongoing improvements discussed here would not have been possible apart from the synergistic nature of our partnership. While the JCPS team included seasoned evaluators and brought front-line knowledge of program implementation, information systems, data availability, and district priorities, our research partners brought additional research capacity, methodological expertise, and a critical outsider’s perspective.

Together, we discovered that the process of conducting cost-effectiveness analysis can provide valuable information normally associated with fidelity of implementation studies. Knowledge gained during the cost analysis process helped to explain our less-than-stellar impact results and led to key changes in practice. In the second blog of this series, we’ll share how the process of conducting CEA and value-added analysis led to changes in practice extending well beyond the specific programs we investigated.


Stephen M. Leach is a Program Analysis Coordinator at JCPS and PhD Candidate in Educational Psychology Measurement and Evaluation at the University of Louisville.

Dr. Fiona Hollands is a Senior Researcher at Teachers College, Columbia University.

Dr. Bo Yan is a Research and Evaluation Specialist at JCPS.

Dr. Robert Shand is an Assistant Professor at American University.

If you have any questions, please contact Corinne Alfeld (Corinne.Alfeld@ed.gov), IES-NCER Grant Program Officer.

Student-Led Action Research as a School Climate Intervention and Core Content Pedagogy

Improving the social and emotional climate of schools has become a growing priority for educators and policymakers in the past decade. The prevailing strategies for improving school climate include social and emotional learning, positive behavioral supports, and trauma-informed approaches. Many of these strategies foreground the importance of students having a voice in intervention, as students are the foremost experts on their own social and emotional milieus.

Parallel to this trend has been a push toward student-centered pedagogical approaches in high schools that are responsive to cultural backgrounds and that promote skills aligned with the demands of the modern workplace, like critical thinking, problem-solving, and collaboration. Culturally responsive and restorative teaching and problem- and project-based learning are prominent movements. In this guest blog, Dr. Adam Voight at Cleveland State University discusses an ongoing IES-funded Development and Innovation project taking place in Cleveland, Ohio that aims to develop and document the feasibility of a school-based youth participatory action research intervention.


Our project is exploring how youth participatory action research (YPAR) may help to realize two objectives—school climate improvement and culturally-restorative, engaged learning. YPAR involves young people leading a cycle of problem identification, data collection and analysis, and evidence-informed action. It has long been used in out-of-school and extracurricular spaces to promote youth development and effect social change. We are field testing its potential to fit within more formal school spaces.

Project HighKEY

The engine for our project, which we call Project HighKEY (High-school Knowledge and Education through YPAR), is a design team composed of high school teachers and students, district officials, and university researchers. It is built from the Cleveland Alliance for Education Research, a research-practice partnership between the Cleveland Metropolitan School District, Cleveland State University, and the American Institutes for Research. The design team meets monthly to discuss YPAR theory and its fit with high school curriculum and standards, and to plan YPAR field tests in schools. We have created a crosswalk between the documented competencies that students derive from YPAR and high school standards in English language arts (ELA), mathematics, science, and social studies in Ohio. For example, one state ELA standard is “Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence”; through YPAR, students collect and analyze survey and interview data and use their findings to advocate for change related to their chosen topic. A state math standard is “Interpret the slope and the intercept of a linear model in the context of data”; this process may be applied to survey data students collect through YPAR, making an otherwise abstract activity more meaningful to students.
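
To make the math side of the crosswalk concrete, here is a minimal sketch of the slope-and-intercept interpretation applied to YPAR-style survey data. The variables and numbers are invented for illustration:

    # Hypothetical YPAR survey data: weekly hours of extracurricular
    # involvement vs. a 1-10 school-belonging rating.
    import numpy as np

    hours = np.array([0, 1, 2, 3, 5, 6, 8, 10])
    belonging = np.array([3, 4, 4, 5, 6, 6, 8, 9])

    # Fit a linear model; np.polyfit returns the slope first.
    slope, intercept = np.polyfit(hours, belonging, deg=1)

    # Interpretation in context: the intercept is the predicted rating
    # for a student with no extracurricular hours; the slope is the
    # predicted change in rating per additional weekly hour.
    print(f"intercept={intercept:.2f}, slope={slope:.2f}")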

Assessing the Effectiveness of YPAR

Remaining open-minded about the various ways in which YPAR may or may not fit in different high school courses, we are currently testing its implementation in a pre-calculus course, a government course, an English course, and a life-skills course. For example, a math teacher on our design team has built her statistics unit around YPAR. Students in three separate sections of the course have worked in groups of two or three to identify an issue and create a survey that is being administered to the broader student body. These issues include the lack of extracurricular activities, poor school culture, and unhealthy breakfast and lunch options. Their survey data will be used as the basis for learning about representing data with plots, distributions, measures of center, frequencies, and correlation after the winter holiday. Our theory is that students will be more engaged when using their own data on topics of their choosing and toward the goal of making real change. Across all of our project schools, we are monitoring administrative data, student and teacher survey data, and interview data to assess the feasibility, usability, and student and school outcomes of YPAR.
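
As a sketch of what that unit might look like once the survey results are in hand, the computations below mirror the listed topics using pandas. The survey items and responses are hypothetical:

    # Hypothetical survey responses: measures of center, frequencies,
    # and correlation, as in the statistics unit described above.
    import pandas as pd

    survey = pd.DataFrame({
        "climate_rating": [2, 3, 3, 4, 4, 5, 4, 5],  # 1-5 scale
        "lunch_rating":   [1, 2, 2, 3, 3, 4, 3, 5],  # 1-5 scale
    })

    print(survey["climate_rating"].mean())        # measure of center
    print(survey["climate_rating"].median())
    print(survey["lunch_rating"].value_counts())  # frequencies
    print(survey["climate_rating"].corr(survey["lunch_rating"]))  # correlation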

Impact of COVID-19 and How We Adapted

We received notification of our grant award in March 2020, the same week that COVID-19 shut down K-12 schools across the nation. When our project formally began in July 2020, our partner schools were planning for a wholly remote school year, and we pivoted to hold design team meetings virtually and loosen expectations for teacher implementation. Despite these challenges, several successful YPAR projects during that first year—all of which were conducted entirely remotely—taught all of us much about how YPAR can happen in online spaces. This school year, students and staff are back to in-person learning, but, in addition to the ongoing pandemic, the crushing teacher shortage has forced us to continue to adapt. Whereas we once planned our design team meeting during the school day, we now meet after school due to a lack of substitute teachers, and we use creative technology to allow for mixed virtual and in-person attendance. Our leadership team is also spending a great deal of time in classrooms with teachers to assist those implementing for the first time. Our goal is to create a resource that teachers anywhere can use to incorporate YPAR into their courses. The product will be strengthened by the lessons we have learned from doing this work during these extraordinary times and the resulting considerations for how to deal with obstacles to implementation.


Adam Voight is the Director of the Center for Urban Education at Cleveland State University.

For questions about this grant, please contact Corinne Alfeld, NCER Program Officer, at Corinne.Alfeld@ed.gov.