# IES Blog

### Institute of Education Sciences

During spring 2020, the COVID-19 pandemic forced U.S. schools to close, disrupting learning for millions of students. As schools reopened this fall, conversations have revolved around using this unique situation as a chance to rethink education and how students learn. When we think about innovative ways to improve education, ideas tend to gravitate toward radical changes to the classroom experience, expensive interventions, and costly professional development. Everyone is looking for the next “big” idea, but part of the solution may lie in a more subtle, inexpensive, and less disruptive change that could be as impactful as an entirely new education approach: strategic revisions to the materials teachers and students already use in their classrooms (whether in person or virtual).

Textbooks (or ebooks) and supplemental education materials are central to providing students with the content knowledge and practice experiences to support mastery of academic skills. Textbook developers spend significant time and effort to ensure that the content in those textbooks aligns to standards and provides students with the information and examples needed to understand key concepts. However, even with age-appropriate content and high-quality practice exercises, textbooks may not be effective as learning tools if they present and sequence information in a way that is not aligned to what we know about how people learn.

You may be wondering how much room there is for improvement—textbooks seem pretty good at delivering content as is, right? Actually, findings from three IES-funded projects demonstrate that there are multiple ways to improve texts and student understanding of key concepts. Here are a few of those ways:

Present a wide range of fraction practice problems. Textbooks focused on fractions learning tend to present more equal denominator problems for addition and subtraction than for multiplication. Why does this matter? In IES-funded research, David Braithwaite and Bob Siegler showed that students pick up on this bias. As a result, students are more likely to make errors on equal denominator fraction multiplication problems because they have come to associate equal denominators with addition and subtraction strategies. The recommended minor change is to include a wider range of fraction practice problems, including equal denominator multiplication problems, so that students do not form irrelevant associations between superficial features of a practice problem and the solution strategies they are practicing.

Provide students with a mix of practice problems that require different strategies rather than practice problems of the same type. Typical math practice involves solving the same type of problem repeatedly to rehearse the specific solution strategy a student just learned. However, across numerous IES-funded studies, Douglas Rohrer and his research team have shown that students benefit substantially more from math practice that mixes problems requiring different strategies (those learned in previous lessons interleaved with those just learned). One of the major benefits of this approach is that students get practice choosing which strategy to use for a particular problem. Rohrer and his team found that of 13,505 practice problems across six popular math textbooks, only 9.7% were interleaved in this way. The recommended minor change is simply to mix up the problem sets so that students encounter different types of problems in a single sitting.
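To make the interleaving idea concrete, here is a minimal sketch of how a blocked problem set could be reshuffled so that consecutive problems demand different strategies. The problem sets and strategy names below are hypothetical illustrations, not drawn from Rohrer’s materials:

```python
import random

# Hypothetical blocked problem sets, one list per solution strategy.
blocked = {
    "add_fractions":      ["1/2 + 1/3", "2/5 + 1/5", "3/4 + 1/8"],
    "multiply_fractions": ["1/2 * 1/3", "2/5 * 1/5", "3/4 * 1/8"],
    "solve_proportion":   ["x/4 = 3/12", "2/x = 6/9", "x/5 = 4/10"],
}

def interleave(sets, seed=0):
    """Shuffle problems from all strategies together so that a problem's
    position in the set no longer signals which strategy to apply."""
    rng = random.Random(seed)
    mixed = [(name, prob) for name, probs in sets.items() for prob in probs]
    rng.shuffle(mixed)
    return mixed

for strategy, problem in interleave(blocked):
    # The student must first decide which strategy fits each problem.
    print(problem)
```

The point of the shuffle is exactly the benefit described above: before solving, the student must first identify the problem type, rather than mechanically reapplying whatever strategy the current lesson just taught.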

Where and how you place visuals on textbook pages matters, especially when you want students to compare them. Textbooks typically use visuals such as diagrams and photos to help reinforce key concepts. In an IES-funded study, Bryan Matlen and colleagues examined anatomy and evolution chapters within three popular middle school science textbooks and found an average of 1.8 visuals per page. Students were expected to make comparisons using about a third of those visuals. Of those they had to compare, about half were positioned in suboptimal ways—that is, the images were not presented in a way that made it easy to identify how the elements of one image compare to the elements of the other. For example, imagine a student is asked to compare two x-ray images of hands to identify a bone that is missing from one of them. This task is much harder if one hand is shown upside down and the other is right-side up or perpendicular to the first image. Consistent with this example, Matlen and colleagues have conducted studies showing that visual comparisons are more effective when the features of the visuals that need to be compared are spatially aligned. The recommended minor change is to be intentional about the placement of visuals that students are supposed to be comparing; make sure they are placed in optimal alignment to each other so that it is easier for students to see how the features of one correspond to those of the other.

In sum, transformative, radical ideas about how to improve education are interesting to brainstorm about, but sometimes the key to improvement is identifying small changes that can deliver big results.

Written by Erin Higgins (Erin.Higgins@ed.gov), Program Officer for the Cognition and Student Learning Program, National Center for Education Research.

Our nation continues to navigate a unique and challenging year due to the COVID-19 pandemic. In our first blog post in this series, we highlighted how educators, students, families, and researchers are adapting while trying to engage in opportunities to support learning. COVID-19 has created numerous challenges in education research with many studies needing to be modified or put on hold. At the same time, new research questions arise focusing on the impact of the pandemic on student learning, engagement, and achievement. Here, we highlight two IES-funded projects that are conducting timely and relevant research exploring the impact of COVID-19 on learning and critical thinking.

Guanglei Hong, Lindsey Richland, and their research team at the University of Chicago and the University of California, Irvine have received supplemental funds to build on their current grant, Drawing Connections to Close Achievement Gaps in Mathematics. The research team will conduct a study during the 2020-21 school year to explore the relationship between students’ anxiety about the health risks associated with COVID-19 and their math learning experiences. They predict that pressure and anxiety, like that induced by COVID-19, draw on the same executive function resources that students need to engage in higher order thinking and reasoning during math instruction, impairing their ability to learn. Through this study, the research team will also test whether particular instructional approaches reduce the effects of pressure and anxiety on learning. These findings will be useful to teachers and students in the near term as they navigate the COVID-19 pandemic, and in the longer term for students who experience anxiety for a variety of other reasons.

In addition, IES has funded an unsolicited grant to Clarissa Thompson at Kent State University to investigate whether an education intervention aimed at decreasing whole number bias errors can help college-aged students and adults more accurately interpret health statistics about COVID-19. During the pandemic, the public receives daily updates about the number of people locally, nationally, and globally who are infected with and die from COVID-19. Beliefs about the risk of getting a disease are a key predictor of engagement in prevention behaviors, and understanding the magnitude of one’s risk may require making sense of numerical health information, often presented as rational numbers such as fractions, whole number frequencies, and percentages. An intervention that decreases whole number bias errors and improves understanding of rational numbers has the immediate and pressing benefit of helping people reason accurately about the risk of COVID-19 and other health threats. This skill is also critical for success in science, technology, engineering, and mathematics (STEM) fields.

Both of these projects offer opportunities to better understand learning and critical thinking in the midst of the pandemic. They will also provide the field with generalizable information about ways to improve learning in STEM fields. Stay tuned for more COVID-19 related education research discussions as we continue this series on our blog.

Written by Christina Chhin (christina.chhin@ed.gov) and Erin Higgins (erin.higgins@ed.gov), National Center for Education Research (NCER).

This is the third in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19 and Measuring Attendance during COVID-19: Considerations for Synchronous and Asynchronous Learning Environments.

The National Center for Rural Education Research Networks (NCRERN) is an IES-funded R&D Center that has established a continuous improvement network of 50 rural districts in New York and Ohio. The purpose of the Network is to build the capacity of rural school districts and supporting state agencies to use their own data to improve the education of their students. Districts are currently tackling the problem of student absenteeism through piloting, evaluating, and improving various interventions.  Katherine Kieninger, David Hersh, and Jennifer Ash describe how the Network is tackling the problem of measuring attendance during COVID-19, taking into consideration the various learning environments.

NCRERN has been working to develop a viable attendance construct given that districts and schools are currently struggling with how to define and track attendance for remote or blended learning models. When students are not physically present, the typical observe-and-log model of attendance tracking is not an option. However, not tracking attendance is not an option either given the importance of attendance for identifying at-risk students, predicting key student outcomes, and acting during the pandemic as a proxy for the general safety and well-being of students.

We considered several possible attendance constructs and assessed each by the degree to which it met the following criteria. First, a viable construct should be measurable equitably across all students and learning environments: in-person; synchronous virtual; asynchronous virtual with internet access; and asynchronous without internet access. The construct should also be simple to understand, easy to capture, and quick to collect. Finally, access to technology and to reliable, low-cost, high-speed internet must be considered, especially in rural areas lacking such infrastructure.

We concluded that tracking student exposure to instructional content best meets these criteria, as seen in the table below. While not without its own challenges, exposure to content is the least complicated option, can be tracked consistently across learning environments, and is the closest in principle to what in-person attendance captures.

| Attendance Construct | Simple | Easy to Capture | All Students | High Frequency | Reliable & Valid | Consistent Across Grade Levels | Consistent Across Virtual or In-Person |
| --- | --- | --- | --- | --- | --- | --- | --- |
| In-Person Attendance | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | X |
| Exposure to Instructional Content | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Participation | X | ✅ | ✅ | ✅ | ? | ? | ✅ |
| Assignment Submission | X | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ |
| Engagement | X | ✅ | ✅ | ✅ | X | X | ✅ |
| Mastery | X | ? | ✅ | X | X | X | ✅ |

In guidance provided to Network districts, we use the table below to define exposure to content in each learning environment, suggest capture options, and provide a non-exhaustive list of considerations for school district stakeholders. Districts should recognize that a student can move between learning environments. For example, an in-person student who is quarantined but healthy enough to continue classwork becomes a virtual learner; depending on the student’s home context, this could place them in any of the three virtual environments. Creating a plan for seamless attendance tracking across learning environments is key to measuring attendance with fidelity.

**Attendance Construct: Exposure to Instructional Content**

| Learning Environment | Definition | Capture Options | Considerations |
| --- | --- | --- | --- |
| In-Person | Student is present | Student Information System (SIS) | Will in-person students be able to log in for remote learning if they cannot come to school (for example, if they must quarantine for an extended period)? |
| Virtual: Synchronous | Student is present for virtual class | Student Information System (SIS) | Can you avoid concurrent classes for students in the same family? If a student loses internet, do you have an asynchronous back-up option for course content? |
| Virtual: Asynchronous with Internet Access | Student affirmatively accessed content | Learning Management System (LMS) log-in with a minimum time threshold, OR daily form completion (the form asks students what content they worked on) | How and when will teachers capture results in the SIS? How do you count daily attendance for different class periods? If using the LMS log-in option, what is the minimum amount of time a student needs to be logged in? If using a daily form, what question(s) will you ask? We recommend a low threshold: something any student who was present could answer regardless of their level of engagement. |
| Virtual: Asynchronous without Internet Access | Student affirmatively accessed content; a student is absent only if they have not worked on any instructional content | Contact each student for whom the above guidance does not or cannot apply | How will you know when a student does not have internet access and therefore needs a call? How do you contact students who may not have consistent cell service or a landline? What time of day will you contact students or caregivers? How many attempts must a teacher or staff member make per day before a student is marked absent? How will you address unresponsive caregivers? How will you count daily attendance for different class periods in MS/HS? If students have multiple content teachers, who will reach out to students? |
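The decision logic in the guidance above can be sketched as a small function. The environment labels, field names, and the 15-minute LMS threshold here are illustrative assumptions, not NCRERN’s actual implementation:

```python
# Sketch of marking daily attendance under the "exposure to instructional
# content" construct. Labels, record fields, and the minimum LMS time
# threshold are hypothetical placeholders, not NCRERN's real system.
MIN_LMS_MINUTES = 15

def is_present(env, record):
    """Return True if the day's record shows exposure to instructional content."""
    if env == "in_person":
        return record.get("physically_present", False)
    if env == "virtual_synchronous":
        return record.get("attended_live_class", False)
    if env == "virtual_async_internet":
        # LMS log-in above a minimum time threshold, OR a completed daily form.
        return (record.get("lms_minutes", 0) >= MIN_LMS_MINUTES
                or record.get("daily_form_completed", False))
    if env == "virtual_async_no_internet":
        # Outreach-based: absent only if the student worked on no content.
        return record.get("worked_on_content", False)
    raise ValueError(f"unknown learning environment: {env}")
```

A student who floats between environments, such as an in-person student who enters quarantine, would simply be evaluated under whichever environment applies on a given day, which is what makes a single construct consistent across settings.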

In the guidance, we also considered assignment submission as a potentially viable attendance construct. An equitable implementation of an assignment submission construct across all learning environments, however, would result in one unique challenge: Would a school district be willing to mark an in-person student absent for the day if the student failed to submit an assignment? While surmountable, addressing this issue would be challenging in the short-term.

As school districts finalize their attendance measurement plans, they will need to ensure that any selected attendance measures are feasible and sustainable for the duration of the school year for the individuals capturing attendance. This includes considering how long daily attendance tracking will take for teachers and additional staff members. Gathering feedback from teachers and staff on the day-to-day process of collecting attendance data is key to ensuring reliable attendance tracking within a district.

We welcome individuals to reach out to NCRERN with additional recommendations or considerations. We are also interested in hearing how attendance is being measured in practice at school districts across the country. Connect with NCRERN via email at ncrern@gse.harvard.edu.

Katherine Kieninger, M.P.A. is the Ohio State Network Manager for the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

David Hersh, J.D., Ph.D. is the Director of Proving Ground at the Center for Education Policy Research at Harvard University.

Jennifer Ash, Ph.D. is the Director of the National Center for Rural Education Research Networks (NCRERN) at the Center for Education Policy Research at Harvard University.

This is the second in a series of blog posts focusing on conducting education research during COVID-19. Other blog posts in this series include Conducting Education Research During COVID-19.

As schools and school districts plan instruction amid the current coronavirus pandemic, the use of technology and digital resources for student instruction is a key consideration.

In this post, the final in a three-part series, we present results from the NAEP TEL and ICILS educator questionnaires (see the first post for information about the results of the two assessments and the second post for the results of the student questionnaires). The questionnaires ask about the focus of technology instruction in schools, school resources to support technology instruction, and the use of technology in teaching practices.

It is important to note that NAEP TEL surveys the principals of U.S. eighth-grade students, while ICILS surveys a nationally representative sample of U.S. eighth-grade teachers.

Emphasis in technology instruction

According to the 2018 NAEP TEL principal questionnaire results, principals[1] of 61 percent of U.S. eighth-grade students reported that prior to or in eighth grade, much of the emphasis in information and communication technologies (ICT) instruction was placed on teaching students how to collaborate with others. In addition, principals of 51 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to find information or data to solve a problem. In comparison, principals of only 10 percent of eighth-grade students reported that a lot of emphasis was placed on teaching students how to run simulations (figure 1).

According to the 2018 ICILS teacher questionnaire results, 40 percent of U.S. eighth-grade teachers reported a strong emphasis on the use of ICT instruction to develop students’ capacities to use computer software to construct digital work products (e.g., presentations). In addition, 35 percent of eighth-grade teachers reported a strong emphasis on building students’ capacities to access online information efficiently. In comparison, 17 percent reported a strong emphasis on developing students’ capacities to provide digital feedback on the work of others (figure 2).

Resources at school

NAEP TEL and ICILS used different approaches to collect information about technology-related school resources. NAEP TEL asked about hindrances that limited schools’ capabilities to provide instruction in technology or engineering concepts. According to NAEP TEL, principals of 5 percent of U.S. eighth-grade students indicated that a lack or inadequacy of internet connectivity was a “moderate” or “large” hindrance in their schools. However, principals of 61 percent of eighth-grade students indicated that a lack of time due to curriculum content demands was a “moderate” or “large” hindrance. Principals of 44 percent of eighth-grade students indicated that a lack of qualified teachers was a “moderate” or “large” hindrance (figure 3).

ICILS asked about the adequacy of school resources to support ICT use in teaching. Eighty-six percent of U.S. teachers “agreed” or “strongly agreed” that technology was considered a priority for use in teaching. Nearly three-quarters of teachers “agreed” or “strongly agreed” that their schools had access to sufficient digital learning resources and had good internet connectivity (74 and 73 percent, respectively) (figure 4).

Use of technology in teaching

Teachers of U.S. eighth-grade students reported that they often used technology in their teaching practices. ICILS found that 64 percent of U.S. teachers regularly (i.e., “often” or “always”) used technology to present class instruction. Fifty-four percent of teachers regularly used technology to communicate with parents or guardians about students’ learning. In addition, 45 percent of teachers regularly used technology to provide remedial or enrichment support to individual or small groups of students, and a similar percentage (44 percent) regularly used technology to reinforce skills through repetition of examples (figure 5).

ICILS also reported results from U.S. eighth-grade teachers about how they collaborated on technology use. About three-quarters “agreed” or “strongly agreed” that they talked to other teachers about how to use technology in their teaching. Similarly, about three-quarters “agreed” or “strongly agreed” that they shared technology resources with other teachers in the school. More than half of the teachers “agreed” or “strongly agreed” that they collaborated with colleagues on the development of technology-based lessons.

Overall, the responses of teachers and principals suggested that schools emphasized different aspects of technology instruction for eighth-grade students, that the majority of schools had sufficient digital resources and adequate internet access, and that technology was used in a variety of teaching practices.

It should be noted that the data presented here were collected in 2018; any changes since then due to the coronavirus pandemic are not reflected in the results reported here. The NAEP TEL and ICILS samples both include public and private schools. The 2018 ICILS also included a principal questionnaire, but the questions are not directly related to the topics included in this blog. Data reported in the text and figures are rounded to the nearest integer.

By Yan Wang, AIR, and Taslima Rahman, NCES

[1] The unit of analysis for TEL principal responses is student.

Since the start of the pandemic, we have all heard about the unprecedented changes to schooling in the U.S. and the ways that educators, students, and families have been adapting to the new reality.

Education researchers have also been adapting their work due to school closings, canceled testing, and different school reopening plans in the 2020-21 school year.

How have education researchers handled the new reality?

Some researchers have been busy compiling and disseminating research findings to support districts and schools to continue instruction during the pandemic. For example, evidence-based recommendations were made available to help parents and schools pivot to a virtual environment (from very young children up to the postsecondary level), maintain engagement, address mental health (including in rural areas), protect against learning loss, and decide how to prioritize needs when considering re-opening. And many education technology researchers and developers have provided online resources to schools.

Other researchers have been working hard to understand the overall disruption to schooling due to COVID-19 and the ramifications on student learning around the world.  For example, there have been efforts to keep track of school closures, document what is happening in schools across the country (including in rural districts), study the switch to online learning and attend to unequal access to technology for remote learning, forecast funding scenarios, and examine changes in teacher recruitment.

In addition, education researchers are thinking about new ways to conduct research in light of the changes to schooling. They are looking at alternatives to standardized testing, new approaches to teaching and learning to strengthen schools moving forward, and ways to rebuild our education systems after the pandemic. Indeed, there are myriad ways that education researchers can and are using their skills to continue to support education during this unprecedented time.

How has COVID-19 impacted IES-funded education research studies?

IES realizes that the pandemic has changed things in ways that may complicate education research – both how it is conducted and how it is interpreted. So, we are actively working with grantees to help ensure the integrity of their work and to respond to the needs, interests, and concerns of the schools and colleges they are working with and the communities they are trying to help. In a follow-up to an IES-funded study on students in foster care, a researcher-practitioner partnership in Colorado is examining the implications of challenging circumstances such as COVID-19 on the postsecondary education of vulnerable youth.

Many IES-funded researchers have had to alter their research plans to accommodate the needs of their partner schools and overcome the challenges posed by the abrupt transition to virtual learning. Because of continued uncertainty, they may need to change plans again. Program officers at IES have been working with grantees on a case-by-case basis to adapt their timelines and, in some cases, their research designs.

IES’s priority is to help researchers maintain scientific rigor while holding a realistic view of what can and cannot be done this year. As we work with our grantees, we take into consideration where the project is in its overall timeline. For example, if a project has collected all of its data and is in the final analysis stage, the remaining work may not be affected. Or, if a project has not yet begun an intervention in schools, it can pause during the 2020-2021 academic year and resume in 2021-2022. Still other projects may find themselves unable either to continue or to pause; these projects may not be able to achieve their initial purpose and may need to end.

Despite some of the challenges, the pandemic offers a unique natural experiment for learning and instruction, as well as opportunities for innovation that can ultimately benefit education. IES, our funded researchers, and the communities that rely on research evidence continue to pull in the same direction: building strong evidence to inform policy and practice. Through collaboration and dialog, we will work together to ensure that data and results are meaningful, valid, and as timely as possible. IES will continue to focus on high-quality education research to improve student learning and achievement both now and in the future.

Stay tuned for future blog posts on what our researchers are doing to address some of the challenges that face educators, families, and policymakers during this unprecedented time!

Written by Corinne Alfeld (Corinne.Alfeld@ed.gov), National Center for Education Research.