IES Blog

Institute of Education Sciences

Designing Culturally Responsive and Accessible Assessments for All Adult Learners

Dr. Meredith Larson, program officer for adult education at NCER, interviewed Dr. Javier Suárez-Álvarez, associate professor and associate director at the Center for Educational Assessment, University of Massachusetts Amherst. Dr. Suárez-Álvarez has served as the project director for the Adult Skills Assessment Project: Actionable Assessments for Adult Learners (ASAP) grant and was previously an education policy analyst in France for the Organisation for Economic Co-operation and Development (OECD), where he was the lead author of the PISA report 21st-Century Readers: Developing Literacy Skills in a Digital World. He and the ASAP team are working on an assessment system to meet the needs of adult education learners, educators, and employers that leverages validated, culturally responsive online banks of literacy and numeracy tasks. In this interview, Dr. Suárez-Álvarez discusses the importance of attending to learners’ goals and cultural diversity in assessment.

How would you describe the current context of assessment for adult education, and how does ASAP fit in it?

In general, the adult education field lacks assessments that meet the—sometimes competing—needs and goals of educators and employers and that attend to and embrace learner characteristics, goals, and cultural diversity. There is often a disconnect where different stakeholders want different things from the same assessments. Educators ask for curriculum-aligned assessments, learners want assessments to help them determine whether they have job-related skills for employment or promotion, and employers want to determine whether job candidates are trained in high-demand skills within their industries.

Despite these differing needs and interests, everyone involved needs assessment resources for lower-skilled and culturally diverse learners that are easy to use, affordable or free, and provide actionable information for progress toward personal or occupational goals. ASAP is one of the first attempts to respond to these needs by developing an assessment system that delivers real-time customizable assessments to measure and improve literacy and numeracy skills. ASAP incorporates socioculturally responsive assessment principles to serve the needs of all learners by embracing the uniqueness of their characteristics. These principles involve ensuring that stakeholders from diverse socioeconomic, cultural, linguistic, racial, and ethnic groups are represented in our test design and development activities.

Why is attending to cultural diversity important to ASAP and assessment, and how are you incorporating this into your work?

U.S. Census projections for 2045 predict a shift in the demographic composition of the population from a White majority to a racially mixed majority. This suggests that we should prepare for cultural shifts and ensure our assessments fully embrace socioculturally responsive assessment practices. Without these practices, assessments limit the ability of adults from varied demographic backgrounds to demonstrate their capabilities adequately. Socioculturally responsive assessments are pivotal for representing the growing diversity in the learner population and for uncovering undetected workforce potential.

In ASAP, we are conducting focus groups, interviews, and listening sessions with learners, educators, and employers to understand their needs. We are also co-designing items in collaboration with key stakeholders and building consensus across adult education, workforce, and policy experts. We are developing use cases to understand hypothetical product users and conducting case studies to establish linkages between instruction and assessment as well as across classroom and workplace settings.

How has your background informed your interest in and contributions to ASAP?

As a teenager growing up in Spain, I saw first-hand the possible negative impact assessments could have when they don’t attend to learner goals and circumstances. When I was 15, my English teacher, based on narrow assessments, told my parents I was incapable of learning English, doubted my academic potential, and suggested I forgo higher education for immediate employment. Defying this with the support of other teachers and my family, I pursued my passion. I became proficient in English at the age of 25 when I needed it to be a researcher, and I completed my PhD in psychology (psychometrics) at the age of 28.

Many adult students may have heard similar messages from prior teachers based on assessment results. And even now, many of the assessments the adult education field currently uses for these learners are designed by and for a population that no longer represents most learners. These adult learners may be getting advice or feedback that does not actually reflect their abilities or doesn’t provide useful guidance. Unfortunately, not all students are as lucky as I was. They may not have the support of others to counterbalance narrow assessments, and that shouldn’t be the expectation.

What are your hopes for the future of assessments for this adult population and the programs and employers that support them?

I hope we switch from measuring what we know generally how to measure (such as math and reading knowledge on a multiple-choice test) to measuring what matters to test takers and those using assessment results so that they can all accomplish goals in ways that honor individuals’ circumstances. Knowledge and skills—like the real world—are much more than right and wrong responses on a multiple-choice item. I also hope that as we embrace the latest developments in technology, such as AI, we can use them to deliver more flexible and personalized assessments.

In addition, I hope we stop assuming every learner has the same opportunities to learn or the same goals for their learning and that we start using assessments to empower learners rather than just as a measure of learning. In ASAP, for example, the adult learner will decide the type of test they want to take, when to take it, the context within which the assessment will be framed, and when, where, and to whom the assessment result will be delivered.


This blog was produced by Meredith Larson (Meredith.Larson@ed.gov), program officer for adult education at NCER.

 

Creating a Community of Writers

The Predoctoral Interdisciplinary Research Training Program in the Education Sciences was established by IES to increase the number of well-trained PhD students who are prepared to conduct rigorous and relevant education research. IES encourages our predoctoral fellows to develop strong writing skills in addition to subject-matter and methodological expertise. In this guest blog, we asked IES predoctoral fellow Todd Hall, co-chair of the Black Scholars in Education and Human Development Writing Group at the University of Virginia, to discuss how participating in this writing group has helped his development as an education researcher. Todd is part of the IES-funded Virginia Education Science Training (VEST) program and studies early childhood education policy as well as school discipline in both early childhood and K-12 settings.

How did you become involved in the Black Scholars in Education and Human Development Writing Group?

I started my PhD in education policy in August 2020. The COVID-19 pandemic made networking and simply making friends awkward. During my first week in Charlottesville, VA, I watched wistfully from my window as a Black person jogged past my house. For me, the jogger represented communities of color at UVA that I did not know how to connect with.

Enter Dr. Edward Scott and Dr. Miray Seward, then students and co-chairs of the Black Scholars in Education and Human Development Writing Group. They sent me a personal email invitation to join the group’s first virtual writing retreat. When I joined the Zoom room, I found the affinity space I was looking for. I connected with graduate students whom I later turned to for informal mentorship, course recommendations, tips on navigating the hidden curriculum of grad school, insights from job market experiences, and examples of successful written proposals. The laughs shared virtually during check-ins between writing blocks helped ward off the pandemic blues.

I resolved to pay it forward, so I began shadowing Edward and Miray. When they graduated, I stepped into a leadership role alongside my co-chair, Sasha Miller-Marshall.

How has participating in the writing group helped you develop as a scholar?

The writing group has reminded me that I am not the only one who experiences writer’s block and has provided me with writing process role models. The professional development sessions we host have been one of the few opportunities that I have found to see faculty expose and reflect on their own writing challenges, from protecting their time for writing to incorporating critical feedback. This provides a unique perspective on the writing process—I often see faculty discuss works in progress, but the format is usually an oral presentation with slides rather than something written.

In the Black Scholars Writing Group sessions, speakers often share candidly about their own process, including writer’s block and how they overcome it. For example, a senior faculty member shared that they used voice memos to process their thoughts when they felt stuck. That disclosure normalized my experience of writer’s block and made me feel comfortable sharing that I write memos on my phone when I feel stuck. Moments like these have provided tools to overcome resistance in my writing process and normalized the experience of strategizing about writing rather than expecting words to flow effortlessly.

The presenters who lead sessions with our group have diverse racial/ethnic backgrounds, but the focus of the group on creating affinity space for Black doctoral students allows me to be less concerned about stereotype threat. Whereas I am often the only Black person in other rooms, I am never the only Black person in this writing group. That alleviates any concern about being perceived as a token representative of Black people, or worse, as less capable if I choose to share my difficulties. In one session, I was able to unpack with the faculty speaker that a particular piece of writing was difficult because I had not yet answered the simple question of why the work was important. I got to that realization because the speaker modeled vulnerability about their own writing process, and I felt at ease to discuss my own.

How can the broader education research community help graduate student researchers develop as writers?

Where appropriate and feasible, education researchers can share their successful conference proposals, grant applications, budgets, reviewer response letters, and perhaps even dissertation chapters. If it does not make sense to post them publicly, researchers could offer to share materials with graduate students that they meet at speaking engagements, conferences, etc.

Successful models have given me helpful guidance, especially when tackling a new format. Beyond the writing group, I am immensely grateful to the alumni of my IES predoctoral fellowship who have provided many of their materials for current students to reference.

What advice can you give other student researchers who wish to further develop their writing skills?

Cultivate authentic relationships with a network of mentors who are willing to share examples of their successful writing and review your work. My advisor is amazing and thorough with her feedback. That said, it has been useful to strategically ask others who bring in complementary perspectives to review my work. For example, my advisor is a quantitative researcher, and I recently proposed a mixed methods study. Researchers who do qualitative and mixed methods work were able to challenge and strengthen the qualitative aspects of my proposal based on their expertise. You might also be applying for opportunities or submitting to journals that other mentors have succeeded with or reviewed for. They may help you anticipate what that audience might be looking for.

In addition, when you receive feedback, do so graciously, weigh it seriously, and ask yourself if there’s a broader piece of constructive criticism to apply to your other writing.


This blog was produced by training program officer Katina Stapleton (Katina.Stapleton@ed.gov) and is part of a larger series on the IES research training programs.

Experimenting with Science Education to Improve Learner Opportunities and Outcomes

The NAEP science assessment measures science knowledge and the ability to engage in scientific inquiry and conduct scientific investigations. According to results from the 2019 NAEP science assessment, only one-third of grade 4 and grade 8 students, and less than one-quarter of grade 12 students, scored at or above proficient. In addition, the science performance of middle- and low-performing grade 4 students declined from 2015. While IES has a history of investing in high-quality science education research to improve science teaching and learning, these data suggest that much more work is needed.

To that end, during the 2022-23 school year, IES held two Learning Acceleration Challenges designed to incentivize innovation to significantly improve learner outcomes in math and science. Under the Challenge for the Science Prize, IES sought interventions to significantly improve science outcomes for middle school students with low performance in science. Unfortunately, the judging panel for the Challenge did not recommend any finalists for the Science Prize (more information about the Math Prize results can be found here). IES recognized this Challenge was an ambitious and rapid effort to improve science achievement. Feedback from potential Science Prize entrants indicated that the rapid cycle for evaluating the intervention along with the lack of resources to implement the intervention were barriers to this competition.

With the knowledge gained from the Science Prize, IES is continuing to design opportunities that encourage transformative, innovative change to improve teaching and learning in science. In our newest opportunity, the National Center for Education Research (NCER) at IES, in partnership with the National Science Foundation (NSF), released a Request for Applications for a National Research and Development Center (R&D Center) on Improving Outcomes in Elementary Science Education. Results from the most recent NAEP science assessment and the lessons learned from the Science Prize suggest that efforts to improve teaching and learning in science education need to begin early and that more resources are needed to conduct high-quality research in science education. Through this R&D Center, IES and NSF will provide greater resources (a grant award of up to $15 million over 5 years) to tackle persistent challenges in elementary science education, including measuring elementary science learning outcomes and generating evidence of the impact of elementary science interventions on learners’ science achievement. In doing so, the new Elementary Science R&D Center will provide national leadership on elementary science education and build capacity for conducting high-quality science education research.


This blog was written by NCER program officer Christina Chhin. For more information about the Elementary Science R&D Center competition, contact NCER program officers Jennifer Schellinger or Christina Chhin, take a look at the 84.305C RFA, and/or attend one of our virtual office hours.

Risk and Resilience in Children Experiencing Homelessness

In celebration of National Homeless Youth Awareness Month, Dr. Ann Masten, Regents Professor at the Institute of Child Development, University of Minnesota, reflects on her research with children in families experiencing homelessness, highlighting what inspired her, key findings, and advice for the future. Her research on homelessness has been supported by IES, NSF, NIH, her university, and local foundations. She underscores the power of a resilience lens for research with high-risk families and the vital role of research-practice partnerships.  

What inspired you to study homelessness?

In 1988, the issue of homelessness among children surged onto the front pages of newspapers and magazines as communities were confronted with rapidly growing numbers of unhoused families. That year, Jonathan Kozol published his book, Rachel and Her Children: Homeless Families in America, about the desperate lives of families without homes crowded into hotels in New York City. Kozol gave a compelling talk I attended at the University of Minnesota. At the time, I was doing part-time, pro bono clinical work with children at a mental health clinic run by the Wilder Foundation. The foundation president requested that I help them learn about the needs of children and families experiencing homelessness in the Twin Cities.

Digging into the literature as I visited shelters and interviewed school personnel who were faced with the surge of family homelessness, I quickly learned that there was little information to guide educators or service providers. Shelters could barely keep track of the numbers and ages of children in residence each day, and schools were struggling to accommodate the overwhelming needs of kids in emergency shelter.

This search inspired me to launch research that might be helpful. I was deeply moved by the plight of these families who were trying to care for their children without the security of a stable home, income, food, healthcare, or emotional support. I had grown up in a military family, frequently moving and dealing with parental deployment, which was stressful even with adequate resources.

As an early career scholar, I had funding to start new work aligned with my research focus on resilience in child development. In 1989, I initiated my first study of homelessness, surveying parents and their children residing in emergency shelter compared with similar but housed families. Although I knew from the outset that homelessness was not good for children, I also realized it was important to document the risks and resilience of these families.

How has your research on homelessness evolved in the past three decades?

Initially, my research with students and community collaborators was descriptive, focused on discovering the nature of adversities children and parents had faced, variations in how well they were doing, barriers to school access, and what made a positive difference—the protective factors in their lives. Over the years, we learned that children and parents in emergency shelter had much in common with other impoverished families, although they often had faced higher cumulative risk, as well as more acute trauma, and their children had more education issues. Administrative longitudinal data provided strong evidence of academic risk among children identified as homeless, with significantly worse achievement than housed children who qualified for free lunch. Poor attendance was an issue but did not account for the striking range of academic achievement we observed. Importantly, there was ample evidence of resilience: warm, effective parents and sociable, high-achieving children, eager to play and learn.

Research on homelessness aligned with a broader story of risk and resilience in development, revealing the importance of multisystem processes and protections for children as well as the hazards of high adversity in contexts of low resources and structural inequality. Results pointed to three basic intervention strategies: (1) lowering risks and toxic stress exposure, (2) increasing resources for healthy child development, and (3) nurturing resilience at multiple levels in children, their families, schools, and communities.

Given the range of school readiness and achievement of children experiencing homelessness, we focused on malleable protective factors for school success, particularly during the preschool years. Parenting quality and executive function (EF) skills were strong candidates. We tested EF skills that reflect neurocognitive processes involved in goal-directed behavior that are vital to learning. Many of the children in shelters struggled with self-regulation and related learning skills, which predicted how well they did at school, both in the short-term and over time.  

With funding from a local foundation and IES, we developed an intervention to boost EF skills among young highly mobile children. Ready? Set. Go! (RSG) was designed to foster EF skills through practice embedded in routine preschool activities led by teachers, educating parents about brain development and how to encourage EF skills, and training parents and children with games, books, and music. Given family mobility, RSG was intended to be brief, appealing, and easy to implement. Pilot results were promising, indicating appeal to parents and teachers, fidelity of implementation, and encouraging changes in EF skills among the children.

In recent years, I have co-directed the Homework Starts with Home Research Partnership, a “grand challenge” project focused on ending student homelessness with a dedicated group of university, state, and community partners. This project integrates long-term administrative data to study the effects of housing and other interventions on the educational success of students. Our work has underscored for me the power of collaborative partnerships and integrated data.

What advice do you have for researchers interested in conducting research on homelessness?

Connect with multisystem partners! Homelessness is a complex issue that calls for research-practice partnerships spanning multiple systems and perspectives, including lived experience.  Integrated data systems that include multisystem administrative data are particularly valuable for understanding and following mobile populations. Sign up for updates from Federal and state agencies, as well as NGOs that disseminate research updates about homelessness. And aim for positive goals! Our focus on resilience and positive outcomes as well as risks and adversity was key to engaging families and our collaborators. 


This blog was produced by Haigen Huang (Haigen.Huang@ed.gov), program officer at NCER.

Training the Next Generation of CTE Researchers: A Conversation with the CTE Research Network

IES funded the Expanding the Evidence Base for Career and Technical Education (CTE) Research Network (CTERN) in FY 2018 to increase the quality and rigor of CTE research, specifically by (1) coordinating IES-funded researchers studying CTE using causal designs and (2) training new researchers in causal methods to address CTE-related research questions. In this guest blog, the Network Lead’s PI, Katherine Hughes, and Training Lead, Jill Walston, from the American Institutes for Research (AIR), discuss how the institute evolved across four years of training supported by the grant and what they learned about the components of effective training, in hopes of informing future IES-funded trainings.

About the Summer Training Institute

Each summer since 2020, CTERN has held a summer training institute on causal research methods in CTE. Across four summers, we had 81 trainees, including junior faculty, researchers in state or university research offices or institutes, doctoral students, and researchers in nonprofit organizations. During the institutes, expert CTE researchers and national and state CTE leaders delivered presentations about CTE history, policies, theories, and recent research.

The major focus of the training was on research designs and statistical methods for conducting research that evaluates the causal impact of CTE policies and practices on student outcomes. The participants learned about conducting randomized controlled trials—considered the gold standard for causal research—as well as two quasi-experimental approaches, regression discontinuity and comparative interrupted time series designs. After presentations about the approaches, students worked with data in small groups to complete data analysis assignments designed to provide practical experience with the kinds of data and analyses common in CTE research. The small groups had dedicated time to meet with one of the instructors to discuss their analyses and interpret findings together. The combination of presentations and practical applications of data analysis with real data, and time in small groups for troubleshooting and discussion with CTE researchers, made for a rich experience that students found engaging and effective. The students received an IES certificate of course completion to mark their accomplishment.
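The quasi-experimental designs named above can be illustrated with a small sketch. The following is a minimal, hypothetical example (not drawn from CTERN's actual training materials) of a sharp regression discontinuity analysis on simulated data: students at or above a cutoff on a running variable (such as a placement score) are assigned to a program, and the treatment effect is estimated as the jump in outcomes at the cutoff, using a local linear regression with separate slopes on each side.

```python
# Hypothetical sketch of a sharp regression discontinuity (RD) estimate.
# All variable names and values are illustrative, not real CTE data.
import numpy as np

rng = np.random.default_rng(0)
n, cutoff, true_effect = 2000, 50.0, 5.0

score = rng.uniform(0, 100, n)             # running variable
treated = (score >= cutoff).astype(float)  # sharp assignment rule
outcome = 20 + 0.3 * score + true_effect * treated + rng.normal(0, 2, n)

# Local linear regression within a bandwidth around the cutoff,
# allowing the slope to differ on each side of the threshold.
bw = 15.0
mask = np.abs(score - cutoff) <= bw
x = score[mask] - cutoff   # centered running variable
d = treated[mask]
y = outcome[mask]
X = np.column_stack([np.ones_like(x), d, x, d * x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

effect = coef[1]  # estimated jump in the outcome at the cutoff
print(f"Estimated treatment effect at cutoff: {effect:.2f}")
```

The key design choice is the bandwidth: only observations near the cutoff inform the estimate, which trades statistical precision for a more credible comparison of otherwise-similar students on either side of the threshold.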

Making Continuous Improvements Based on Lessons Learned

We brought a continuous improvement mindset to our summer institute. After each week-long session, the CTE Research Network director, training coordinator, and instructors met to review their perceptions of the training and, most importantly, the feedback students provided at the end of the week. We applied the lessons learned to improve the agenda, communications, and student grouping approaches for the following summer.

Over the course of the four years of the summer institute training, we made a number of adjustments in response to feedback.

  • We continued to offer the institute virtually. The institute was originally intended to be held in person; an earlier blog describes our necessary pivot to the online format. While we could have safely changed to an in-person institute in 2022 and 2023, feedback from our students showed that the virtual institute was more accessible to a geographically diverse group. Many trainees said they would not even have applied to the institute if they had had to travel, even with a stipend to help cover those costs.
  • We added more time for the students to get to know one another with virtual happy hours. Compared to in-person trainings, virtual trainings lack the natural opportunities for informal communication among students and with instructors that can foster engagement, trust, and joint purpose. While we couldn’t replicate in-person networking opportunities, we were able to improve the experience for the students by being intentional with informal gatherings.
  • We expanded the time for the small groups to meet with their instructors. Students reported that this office hour time was very valuable for their understanding of the material and in interpreting the output of the analyses they ran. We extended this time to optimize opportunities for discussion and problem solving around their data analysis assignments.   
  • We made improvements to the data assignment guidance documents. In the first year, students reported that they spent more time than expected figuring out initial tasks with the data, which left less time for running analyses and interpreting the output. We modified the guidance documents that accompanied the assignments to spell out some of the initial steps more explicitly, shortening the time students spent on setup and maximizing their time doing the important work of coding the analyses and examining output. We also provided links to resources about the statistical packages used in the course for those who needed to brush up on their skills before the training began.
  • We doubled down on efforts to stay connected with the trainees and supported ways to have them stay connected to each other. For example, we let them know when CTERN’s researchers are presenting at conferences and invite them to connect with us and each other at these conferences. We’re now organizing a LinkedIn group to try to develop a community for our training alumni.

Our summer training institutes were a great success. We look forward to continuing this opportunity for researchers into the future, with a new version to be offered in the summer of 2025 by the CTE Research Network 2.0.


Jill Walston, Ph.D., is a principal researcher at the American Institutes for Research with more than 20 years of experience conducting quantitative research, developing assessments and surveys, and providing technical support to researchers and practitioners to apply rigorous research and measurement practices. Dr. Walston is the lead for training initiatives for the IES-funded Career and Technical Education Research Network.

Katherine Hughes, Ph.D., is a principal researcher at the American Institutes for Research and the principal investigator and director of the CTE Research Network and CTE Research Network 2.0. Dr. Hughes’ work focuses on career and technical education in high schools and community colleges, college readiness, and the high school-to-college transition.

This blog was produced by Corinne Alfeld (Corinne.Alfeld@ed.gov), a Program Officer in the National Center for Education Research (NCER).