Inside IES Research

Notes from NCER & NCSER

Six Strategies for Effectively Communicating Research Findings to Decision Makers

Researchers must be able to communicate research findings clearly to non-technical audiences, including decision makers who may have limited time and varying levels of tolerance for data-rich reports. We and our colleagues recently honed these skills while preparing research briefs for the Virginia Department of Education (VDOE) as part of an IES-funded partnership project between VDOE and the University of Virginia exploring the impacts of the COVID-19 pandemic on students and teachers. These briefs are a key mechanism through which our project serves the purpose of IES’s Using Longitudinal Data to Support State Education Policymaking grant program: to generate useful findings that inform the decision making of policymakers and education leaders at the state, district, and school levels.

In their initial feedback, VDOE staff described the briefs as “too technical.” When we led with the numbers, our intended audience quickly became overwhelmed by having to interpret the findings on their own. Our conversations with VDOE provided helpful direction on how to revise the briefs to better reach non-technical, decision-making audiences in Virginia and beyond. Below, we share six strategies we have since applied to all our research briefs.

  • Yes, briefs need a summary too: The draft briefs were short (4-7 pages) inclusive of figures and endnotes, and they began with a list of key findings. Based on the feedback, we morphed this list into a proper summary of the brief. Many of the decision makers we want to reach only have time to read a one-page summary, and that summary needs to be self-contained. Without additional context, the initial list of key findings would have had minimal impact.
  • Lead with the headline: Numbers are a powerful storytelling tool; however, too many numbers can be hard for many people—researchers and non-researchers alike—to consume. We therefore edited each paragraph to lead with a numbers-free sentence that conveys the main takeaway from the analysis, followed by the supporting evidence (the numbers).
  • Answer the question: Our initial groundwork to develop solid relationships with agency staff allowed us to identify priority questions on which to focus the briefs. While our analysis also produced several tangential but interesting findings, the briefs focused only on answering the priority research questions. Tangential findings can be explored in more depth in future research projects.
  • Accurate but not over-caveated: All research makes some assumptions and has some limitations. The average non-technical reader is unlikely to want a thorough accounting of each of these; however, some are too important to exclude. We chose to include those most vital to helping the reader interpret the findings correctly.
  • A picture is worth a thousand words: This was something at which our initial drafts succeeded. Rather than providing tables of statistics, we included simple, well-labeled figures that presented the key findings graphically to visually tell the story.
  • Conclude by summarizing, not extrapolating: The purpose of these briefs was to describe the changes the pandemic wrought in Virginia’s public schools and convey that knowledge to decision makers charged with plotting a course forward. The briefs were not intended to provide explicit guidance or recommendations to those decision makers.

These strategies, of course, are also useful when writing for technical audiences. While their training and experience may equip them to consume research that doesn’t follow these six strategies, applying the strategies will enhance the impact of your research findings with even the most technical of audiences.


Luke C. Miller is a Research Associate Professor at the University of Virginia’s School of Education and Human Development. He is the lead researcher and co-Principal Investigator on the IES-funded project led by VDOE in partnership with UVA.

Jennifer Piver-Renna is the Director of the Office of Research in the Department of Data, Research and Technology at the Virginia Department of Education. She is the state education agency (SEA) co-Investigator on the IES-funded project.

This blog was produced by Allen Ruby (Allen.Ruby@ed.gov), Associate Commissioner of the Policy and Systems Division, NCER.

Lessons Learned as the Virginia Education Science Training (VEST) Program Creates Pathways for Diverse Students into Education Science

Since 2004, the Institute of Education Sciences has funded predoctoral training programs to increase the number of well-trained PhD students who are prepared to conduct rigorous and relevant education research. In addition to providing training to doctoral students, all IES-funded predoctoral programs are encouraged to help broaden participation in the education sciences as part of their leadership activities. In this guest blog post, the leadership team of the University of Virginia predoctoral training program discusses their continuing efforts to create diverse pathways for students interested in education research.

In 2008, the IES-funded Virginia Education Science Training (VEST) Program began the Summer Undergraduate Research Program (SURP) with the goal of recruiting more students from traditionally marginalized groups into education science research. Each year, 8–10 students from around the United States traveled to the University of Virginia to receive faculty mentorship in independent research. In doing so, they experienced facilitated opportunities to develop new research skills and reflect on their own identities as scholars and as students of color, first-generation college students, and/or students from families with low income. They became active members of research groups, visited IES program officers in Washington, DC, and presented their own research at the Leadership Alliance National Symposium.

Quite fortuitously, at an IES principal investigator meeting, we connected with the leadership of the IES-funded Research Institute for Scholars of Equity (RISE) program at North Carolina Central University (NCCU). As a result, for four years, we collaborated with RISE leadership to host two-day RISE fellow visits to UVA. During these visits, RISE fellows shared their projects and ideas with VEST fellows and faculty. The RISE and SURP fellows also mingled and attended workshops on graduate school admissions.

We had three goals for these efforts:

  • Provide IES predoctoral fellows with the opportunity to apply leadership skills by working with undergraduates
  • Increase the diversity of education scientists
  • Increase the diversity of our IES-sponsored PhD program

Enter COVID. In 2020, bringing students to UVA for the summer wasn’t feasible or wise. Instead, we reflected on our past successful experiences with NCCU and realized we could improve the quality of student experiences if we also worked closely with faculty at other universities. To start, we engaged with Virginia State University (VSU) and Norfolk State University (NSU), two Virginia HBCUs, to create the Open Doors Program.

Initially, eight faculty and administrators from NSU and VSU met with the UVA team, which included a post-doctoral fellow and a PhD student who coordinated discussions, helped design the curriculum, and built an Open Doors handbook. The design team built a program in which 12 rising juniors at NSU and VSU would:

  • Engage in a research and writing process leading to a research product and presentation that reflects their strengths, interests, and goals
  • Gain a deeper understanding of the opportunities available to them in graduate school
  • Have the opportunity to examine the complexities and multiple layers of their intersectional identities, identify assets and cultural wealth, and identify academic strengths and areas of growth
  • Build relationships with faculty and graduate student mentors

Due to the pandemic, the program was offered virtually over four weeks with a combination of seminars and mentoring sessions. The program exceeded our expectations. The students all indicated that Open Doors was a useful learning experience and gave them a better understanding of the opportunities available in graduate school. The faculty valued the opportunity to work with each other. We will be offering Open Doors 2.0 next June with another cohort of 12 students from NSU and VSU. We learned a lot from our first year and have planned several modifications to the program. For example, this year, we anticipate that students and some NSU and VSU faculty will be on campus at UVA for two of the four weeks; the other two weeks will be virtual.

These efforts have been true learning experiences for UVA faculty and VEST fellows. We have several recommendations for other programs eager to create pathways programs.

  • Clarify your goals and organize the program around the key outcomes you are trying to achieve. For SURP and Open Doors, we focused on four outcomes: preparation to conduct education research, preparation for graduate school, expansion of networks, and access to new mentoring relationships.
  • Teach skills as well as knowledge. Our evaluation of SURP points to the importance of teaching skills so students can formulate research questions, recognize research designs, analyze and interpret data, and write about research. Students reported gaining skills in these areas, which are critical to success in graduate school in education research.
  • Identify ways to enhance cultural capital. Students benefit from knowledge, familiarity, and comfort with university life. In Open Doors, we wanted to build an authentic collaboration that allowed faculty, graduate students, and undergraduate students at the HBCUs and UVA to learn from each other, extending the cultural capital of all participants.

Our efforts have been exciting yet humbling. Above all, we enjoy listening to and learning from the SURP and Open Doors students. In Open Doors, we also enjoyed building relationships with faculty at other institutions. We have become increasingly aware of the challenges we face in efforts to increase the diversity of our programs. Recruitment is just a first step. Creating graduate school experiences that are conducive to learning and engagement for students from diverse groups is an important second step. And a third critical step is to transform life at our universities so that students (and faculty) from traditionally marginalized groups can thrive and flourish. In doing so, we expect that universities will be better able to meet the full range of new challenges that lie ahead in education science.

Sara Rimm-Kaufman is the Commonwealth Professor of Education in the Educational Psychology-Applied Developmental Science program at the University of Virginia School of Education and Human Development.

Jim Wyckoff is the Memorial Professor of Education and Public Policy in the Education Policy program and directs the Center on Education Policy and Workforce Competitiveness at the University of Virginia.

Jamie Inlow is the Coordinator for the VEST Predoctoral Training Program in the University of Virginia School of Education and Human Development.

This blog post is part of an ongoing series featuring IES training programs as well as our blog series on diversity, equity, inclusion, and accessibility (DEIA) within IES grant programs.

Produced by Katina Stapleton (Katina.Stapleton@ed.gov), co-Chair of the IES Diversity and Inclusion Council and predoctoral training program officer.

Student-Led Action Research as a School Climate Intervention and Core Content Pedagogy

Improving the social and emotional climate of schools has become a growing priority for educators and policymakers in the past decade. The prevailing strategies for improving school climate include social and emotional learning, positive behavioral supports, and trauma-informed approaches. Many of these strategies foreground the importance of giving students a voice in intervention, as students are experts in their own social and emotional milieus.

Parallel to this trend has been a push toward student-centered pedagogical approaches in high schools that are responsive to students’ cultural backgrounds and that promote skills aligned with the demands of the modern workplace, like critical thinking, problem-solving, and collaboration. Culturally responsive and restorative teaching and problem- and project-based learning are prominent movements. In this guest blog, Dr. Adam Voight at Cleveland State University discusses an ongoing IES-funded Development and Innovation project taking place in Cleveland, Ohio, that aims to develop and document the feasibility of a school-based youth participatory action research intervention.

Our project is exploring how youth participatory action research (YPAR) may help to realize two objectives: school climate improvement and culturally restorative, engaged learning. YPAR involves young people leading a cycle of problem identification, data collection and analysis, and evidence-informed action. It has long been used in out-of-school and extracurricular spaces to promote youth development and effect social change. We are field testing its potential to fit within more formal school spaces.

Project HighKEY

The engine for our project, which we call Project HighKEY (High-school Knowledge and Education through YPAR), is a design team composed of high school teachers and students, district officials, and university researchers. It grew out of the Cleveland Alliance for Education Research, a research-practice partnership between the Cleveland Metropolitan School District, Cleveland State University, and the American Institutes for Research. The design team meets monthly to discuss YPAR theory, its fit with high school curricula and standards, and plans for YPAR field tests in schools. We have created a crosswalk between the documented competencies that students derive from YPAR and Ohio’s high school standards in English language arts (ELA), mathematics, science, and social studies. For example, one state ELA standard is “Write arguments to support claims in an analysis of substantive topics or texts, using valid reasoning and relevant and sufficient evidence,” and through YPAR, students collect and analyze survey and interview data and use their findings to advocate for change related to their chosen topic. A state math standard is “Interpret the slope and the intercept of a linear model in the context of data,” and this process may be applied to survey data students collect through YPAR, making an otherwise abstract activity more meaningful to students.
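
To make that crosswalk concrete, here is a minimal sketch, in Python, of the slope-and-intercept interpretation the math standard describes, applied to the kind of survey data students might collect through YPAR. The variables and numbers below are invented for illustration; they are not data from Project HighKEY.

```python
# Hypothetical YPAR survey data: weekly hours of extracurricular
# participation (x) and a 0-100 school-belonging score (y) for ten
# students. All values are invented for illustration.
import numpy as np

hours = np.array([0, 1, 1, 2, 3, 3, 4, 5, 6, 8])
belonging = np.array([41, 48, 45, 55, 58, 62, 60, 70, 74, 83])

# Ordinary least-squares fit of a degree-1 polynomial (a line).
slope, intercept = np.polyfit(hours, belonging, 1)

# Interpreting the model in the context of the data, per the standard:
# the intercept is the predicted belonging score for a student with no
# extracurricular hours; the slope is the predicted change in the score
# for each additional hour of participation.
print(f"intercept: {intercept:.1f} (predicted score at 0 hours)")
print(f"slope: {slope:.1f} (predicted change per additional hour)")
```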

Assessing the Effectiveness of YPAR

Remaining open-minded about the various ways in which YPAR may or may not fit in different high school courses, we are currently testing its implementation in a pre-calculus course, a government course, an English course, and a life-skills course. For example, a math teacher on our design team has built her statistics unit around YPAR. Students in three separate sections of the course have worked in groups of two or three to identify an issue and create a survey that is being administered to the broader student body. These issues include the lack of extracurricular activities, poor school culture, and unhealthy breakfast and lunch options. After the winter holiday, their survey data will serve as the basis for learning about representing data with plots, distributions, measures of center, frequencies, and correlation. Our theory is that students will be more engaged when using their own data, on topics of their choosing, toward the goal of making real change. Across all of our project schools, we are monitoring administrative data, student and teacher survey data, and interview data to assess the feasibility, usability, and student and school outcomes of YPAR.
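
As a hypothetical sketch of what that statistics unit might look like in code, the example below computes frequencies, measures of center, and a correlation from invented Likert-scale responses (1 = strongly disagree, 5 = strongly agree). The survey items and values are made up; they are not data from the classrooms in our project.

```python
from collections import Counter
import numpy as np

# Invented responses from twelve students to two survey items.
lunch_quality = [1, 2, 2, 3, 1, 2, 4, 2, 3, 1, 2, 3]
school_culture = [2, 3, 2, 4, 1, 3, 4, 2, 3, 2, 2, 4]

# Frequencies: how many students chose each response option.
print("lunch_quality frequencies:", Counter(lunch_quality))

# Measures of center for one item.
print("mean:", np.mean(lunch_quality), "median:", np.median(lunch_quality))

# Pearson correlation between the two items.
r = np.corrcoef(lunch_quality, school_culture)[0, 1]
print(f"correlation between items: r = {r:.2f}")
```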

Impact of COVID-19 and How We Adapted

We received notification of our grant award in March 2020, the same week that COVID-19 shut down K-12 schools across the nation. When our project formally began in July 2020, our partner schools were planning for a wholly remote school year, so we pivoted to holding design team meetings virtually and loosened expectations for teacher implementation. Despite these challenges, several successful YPAR projects during that first year—all conducted entirely remotely—taught us a great deal about how YPAR can happen in online spaces. This school year, students and staff are back to in-person learning, but, in addition to the ongoing pandemic, a crushing teacher shortage has forced us to continue to adapt. Whereas we once planned design team meetings during the school day, we now meet after school due to a lack of substitute teachers, and we use technology creatively to allow for mixed virtual and in-person attendance. Our leadership team is also spending a great deal of time in classrooms assisting teachers who are implementing YPAR for the first time. Our goal is to create a resource that teachers anywhere can use to incorporate YPAR into their courses. The product will be strengthened by the lessons we have learned doing this work during these extraordinary times and the resulting considerations for how to deal with obstacles to implementation.


Adam Voight is the Director of the Center for Urban Education at Cleveland State University.

For questions about this grant, please contact Corinne Alfeld, NCER Program Officer, at Corinne.Alfeld@ed.gov.

How Remote Data Collection Enhanced One Grantee’s Classroom Research During COVID-19

Under an IES grant, Michigan State University, in collaboration with the Michigan Department of Education, the Michigan Center for Educational Performance and Information, and the University of Michigan, is assessing the implementation, impact, and cost of Michigan’s “Read by Grade 3” law, which is intended to improve early literacy outcomes for Michigan students. In this guest blog, Dr. Tanya Wright and Lori Bruner discuss how they quickly pivoted to a remote data collection plan when COVID-19 disrupted their initial research plan.

The COVID-19 pandemic began while we were planning a study of early literacy coaching for the 2020-2021 academic year. It soon became abundantly clear that restrictions on in-person research would pose a major hurdle for our research team, as we had planned to enter classrooms and record videos of literacy instruction in the fall. We found ourselves faced with a difficult choice: we could pause our study until it became safer to visit classrooms and miss the opportunity to learn about literacy coaching and in-person classroom instruction during the pandemic, or we could quickly pivot to a remote data collection plan.

Our team chose the second option. We found that multiple technologies are available for remote data collection. We chose one of them, a device known as the Swivl, which includes a robotic mount that holds a tablet or smartphone to record video, a 360-degree rotating platform that works in tandem with a handheld or wearable tracker, and an app that instantly uploads videos to cloud-based storage for easy access.

Over the course of the school year, we captured over 100 hours of elementary literacy instruction in 26 classrooms throughout our state. While remote data collection looks and feels very different from visiting a classroom to record video, we learned that it offers many benefits to researchers and educators alike. We also learned a few important lessons along the way.

First, we learned that remote data collection provides greater flexibility for both researchers and educators. In our original study design, we planned to hire data collectors to visit classrooms, which restricted our recruitment of schools to those within a reasonable driving distance of Michigan State University (MSU). Recording devices, however, allow us to capture video anywhere, including rural areas of our state that are often excluded from classroom research because of their remote locations. Furthermore, we found that the cost of purchasing and shipping equipment to schools is significantly less than paying for travel and for people’s time to visit classrooms. In addition, using devices in place of data collectors allowed us to adapt easily to last-minute schedule changes and to offer teachers the option of recording video over multiple days to accommodate shifts in instruction due to COVID-19.

Second, we discovered that we could capture more classroom talk than with a typical video camera. After some trial and error, we settled on a device with three external wireless microphones: one for the teacher and two to place around the classroom. Not only did the extra microphones record audio beyond what the teacher was saying, but we also learned that we could isolate each microphone during data analysis to hear what was happening in specific areas of the classroom (even when the teacher and children were wearing masks). We also purchased a wide-angle lens, which clipped over the camera on our tablet and allowed us to capture a wider video angle.

Third, we found remote data collection to be less intrusive than sending a research team into schools. The device is compact and can be placed on any flat surface in the classroom or mounted on a basic tripod. The teacher has the option to wear the microphone on a lanyard as a hands-free tracker that signals the device to rotate automatically to follow the teacher’s movements. At the end of the lesson, the video uploads to password-protected cloud storage with one touch of a button, making it easy for teachers to share videos with our research team. We then download the videos to the MSU server and delete them from our cloud account. This set-up allowed us to collect data with minimal disruption, especially compared to sending a person with a video camera to spend time in the classroom.

As with most remote work this year, we ran into a few unexpected hurdles during our first round of data collection. After gathering feedback from teachers and members of our research team, we made adjustments that led to a better experience during the second round of data collection this spring. We hope the following suggestions help others who are considering using such a device to collect classroom data in the future:

  1. Consider providing teachers with a brief informational video or offering after-school training sessions to help answer questions and address concerns ahead of your data collection period. We initially provided teachers with a detailed user guide, but we found that this extra support was key to ensuring teachers had a positive experience with the device. You might also consider appointing a member of your research team to serve as a contact person who can answer questions during data collection periods.
  2. Remember that members of your research team will not be collecting the data themselves, so it is critical to provide teachers with clear directions ahead of time: what exactly do you want them to record? Our team found it helpful to send teachers a two-minute video outlining our goals and then follow up with a printable checklist they could use on the day they recorded instruction.
  3. Finally, we found it beneficial to scan the videos for content at the end of each day. By doing so, we were able to spot a few problems, such as missing audio or a device that stopped rotating during a lesson. While these instances were rare, it was helpful to catch them right away, while teachers still had the device in their schools so that they could record missing parts the next day.

Although restrictions to in-person research are beginning to lift, we plan to continue using remote data collection for the remaining three years of our project. Conducting classroom research during the COVID-19 pandemic has proven challenging at every turn, but as we adapted to remote video data collection, we were pleased to find unanticipated benefits for our research team and for our study participants.


This blog is part of a series focusing on conducting education research during COVID-19. For other blog posts related to this topic, please see here.

Tanya S. Wright is an Associate Professor of Language and Literacy in the Department of Teacher Education at Michigan State University.

Lori Bruner is a doctoral candidate in the Curriculum, Instruction, and Teacher Education program at Michigan State University.

Educating English Learner Students During the Pandemic: Remote and In-Person Instruction and Assessment

The IES-funded R&D Center, the Center for the Success of English Learners (C-SEL), is undertaking a focused program of research aimed at improving access and outcomes for English Learners. One of C-SEL’s recent activities has been to develop resources to aid policymakers and practitioners working with middle school and secondary English learners. The research team at the Center, including Drs. Diane August and Coleen Carlson, along with Maria Yolanda Cieslak and Kenneth Michael Nieser, recently released a brief on Educating English Learner Students During the Pandemic: Remote & In-person Instruction & Assessment - Recommendations and Resources for States and Districts. In this guest blog, the researchers provide an overview of the brief.

English Learners (ELs) benefit from specialized support to help them acquire second language proficiency and core content knowledge in ways that build on their cultural and linguistic assets. This specialized support is required by law, and the U.S. Department of Education has reminded states that this is the case even when learning is remote. Based on a review of the existing literature, the brief provides detailed information on the impact of remote learning on ELs and their teachers during the pandemic and on the potential and limitations of using digital learning resources (DLRs) to educate these students. For instance, while DLRs have the potential to support learning and engagement for ELs, districts report barriers to their use, such as lack of home access to DLRs, teachers’ levels of expertise and technology skills, and lack of knowledge about which DLRs are appropriate for ELs.

The brief also describes current legislation that authorizes funds for a variety of activities that could be used to support ELs and their families when instruction is delivered remotely. Some of these federal resources include:

  • Formula Grant Programs Under the Every Student Succeeds Act
  • CARES Act
  • Coronavirus Response and Relief Supplemental Appropriations Act

In addition, the brief includes information about the policies and resources in place in each of the 50 states as of December 2020 that support districts and schools in instructing and assessing EL students remotely. It also includes a reading list of recent resources focused on remote learning for ELs, with a brief description of and link to each resource.

In March 2021, the Center hosted a webinar to discuss recent recommendations for states and districts. One set of recommendations focuses on methods to enhance the learning and emotional well-being of EL students who have lost ground during the pandemic. Recommendations are also made for assessing ELs when schooling is or has been remote or hybrid. These recommendations can be found in the brief.  

In the upcoming months, C-SEL investigators look forward to preparing future blog posts and research briefs on the Center’s research and the students and teachers we serve. Next up on the agenda is an overview of these students, highlighting their diversity and some too-often ignored, forgotten, or simply unknown characteristics of this important subgroup. Stay tuned!


Dr. Diane August is Principal at D. August and Associates and a Senior Research Scientist at the Center for Applied Linguistics.

Dr. Coleen Carlson is an Associate Research Professor and Associate Director at the Texas Institute for Measurement, Evaluation, and Statistics (TIMES) at the University of Houston.

Ms. Maria Yolanda Cieslak is a Professional Development Specialist at the Center for Applied Linguistics.

Mr. Kenneth Michael Nieser is a Researcher at the Texas Institute for Measurement, Evaluation, and Statistics at the University of Houston.