The National Center for Education Statistics (NCES) is committed to using the latest research and cutting-edge technologies in developing assessments, and collaborates with a wide range of advisory groups on content, modeling, methodology, and reporting.
Current and future initiatives for the National Assessment of Educational Progress (NAEP) include the following.
Starting with the NAEP 2020–2030 assessment schedule, NCES will implement a more efficient, two-subject assessment design. Most students selected to participate in NAEP will be assessed in two subjects instead of one. NAEP's design will continue to feature its hallmark matrix sampling structure, in which each student receives only a portion of the content, but the new design will assess most students in two subjects, making for a more efficient psychometric and operational design.
Traditionally, NAEP used a focused, one-subject design administered to students in two cognitive blocks. The new design adds a third cognitive block from another subject: most students will take two blocks of one subject, followed by a break, and then one block of another subject. The additional block results in 30 more minutes of testing time for most individual students.
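The block-assignment logic described above can be sketched in a few lines of code. This is an illustrative simplification, not NAEP's actual sampling machinery: the block pools, subject names, and random assignment scheme here are invented for the example.

```python
import random

# Hypothetical block pools; real NAEP block content and counts differ.
BLOCKS = {
    "mathematics": ["M1", "M2", "M3", "M4"],
    "reading": ["R1", "R2", "R3", "R4"],
}

def assign_booklet(rng, primary, secondary):
    """Assign one student two blocks of a primary subject and one block
    of a secondary subject (the two-subject, three-block design)."""
    first, second = rng.sample(BLOCKS[primary], 2)  # two distinct primary blocks
    third = rng.choice(BLOCKS[secondary])           # one secondary block
    # The student takes the two primary blocks, a break, then the third block.
    return [first, second, third]

booklet = assign_booklet(random.Random(0), "mathematics", "reading")
print(booklet)  # three block IDs: two mathematics blocks, then one reading block
```

Because every student still sees only a small slice of the full item pool, the matrix sampling property is preserved; the third block simply lets each sampled student contribute data to a second subject.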
A multi-subject, matrix block design is characteristic of most large-scale assessment indicator systems, and, in practice, is not new to the industry. Well-established assessments, such as PISA, TIMSS, and PIAAC, all make effective use of unfocused block designs of two or more subjects administered to the same student. In refining its analytic procedures, NAEP will benefit from decades of experience accumulated by the suite of international assessments.
While the three-block design will mean a longer testing session for any given student sampled to participate in NAEP, the overall testing footprint in American schools will be significantly reduced, and the new design offers several other notable advantages.
In addition, the new NAEP design affords the possibility to report performance relationships across subjects, resulting in statistical information that can be used to improve measurement efficiency.
Furthermore, potential threats to NAEP's trend lines have been addressed by embedding the current single-subject, two-block design within the two-subject, three-block design.
The new three-block design will debut in 2021 with the administration of reading and mathematics at grades 4 and 8. Also in 2021, U.S. history and civics will be administered in a three-block design to 8th-graders, and in 2023, science and technology and engineering literacy will be administered similarly. NCES will conduct special studies to identify the analytic changes necessary to safeguard the reporting of achievement trends at the national, state, or Trial Urban District Assessment (TUDA) level.
NAEP digitally based assessments use innovative testing methods and interactive item types to capture information about students’ problem-solving methods. Some questions might use audio and video multimedia. Other questions use embedded technological features such as an onscreen calculator, drag-and-drop items, and slider controls to form a response. Questions also might engage students in solving problems using realistic scenarios.
To ensure that the assessment development process remains innovative, NCES applies evidence-centered design (ECD) principles to NAEP assessments. ECD provides an evidence-driven framework for designing, producing, and delivering assessments, and serves as a tool for developing assessments with clear links between measurement goals and reporting goals.
In NAEP, multidisciplinary design teams establish clear goals and assessment designs. Teams include cognitive scientists, user experience professionals, assessment developers, and psychometric staff. The ECD process requires documented, explicit links among the purpose of a test, the claims made about test takers, the evidence supporting those claims, and the test takers' responses to the tasks that provide that evidence. It offers a logical and systematic method of developing NAEP assessments, tasks, and questions based on the knowledge and skills outlined in subject frameworks.
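The chain of documented links that ECD requires can be pictured as a simple data structure. The sketch below is an assumption-laden illustration, not NAEP's actual design documentation schema: the class names, fields, and the example claim are all invented to show how purpose, claims, evidence, and tasks connect.

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    description: str          # an assessment task or question

@dataclass
class Evidence:
    observation: str          # what in a response supports a claim
    tasks: list = field(default_factory=list)

@dataclass
class Claim:
    statement: str            # what we want to say about test takers
    evidence: list = field(default_factory=list)

@dataclass
class AssessmentDesign:
    purpose: str              # why the test exists
    claims: list = field(default_factory=list)

# Invented example: a single claim traced down to one task.
design = AssessmentDesign(
    purpose="Measure grade 8 mathematics proficiency",
    claims=[Claim(
        statement="Student can reason proportionally",
        evidence=[Evidence(
            observation="Correct setup and solution of a proportion",
            tasks=[Task("Scale a recipe for a different number of servings")],
        )],
    )],
)

# Every claim can be walked down to the tasks that supply its evidence.
for claim in design.claims:
    for ev in claim.evidence:
        for task in ev.tasks:
            print(design.purpose, "->", claim.statement, "->", task.description)
```

The point of the structure is traceability: each question in the assessment exists because it elicits evidence for a stated claim, which in turn serves the test's stated purpose.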
A Brief Introduction to Evidence-Centered Design offers additional information.
The Survey Assessment Innovations Lab (SAIL) was formed as a result of the 2013 Future of NAEP conference. SAIL, part of an expanded assessment research and development initiative, supports and oversees a portfolio of innovative research studies essential to keeping NAEP at the forefront of innovation and best practices.
This page will include links to the results of current NAEP SAIL projects as those studies become available.
NAEP’s survey data helps policymakers, researchers, educators, and the public understand the context of student achievement results and enables meaningful comparisons among student groups.
Historically, NAEP designed its survey questionnaires around single questions, and questionnaire results were therefore reported question by question. In 2014, the program enhanced its survey questionnaire design and reporting approach to examine information of key interest to NAEP audiences. While some survey questions are still analyzed and reported as single items (for example, gender), several questions on the same topic are now combined into indices, each measuring a single underlying construct or concept.
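A minimal sketch of how several related questions can be combined into a single index follows. NAEP's operational indices are built with more sophisticated scaling methods; here, purely for illustration, each item is standardized (z-scored) and a student's index is the mean of their standardized responses. The item names and response data are invented.

```python
from statistics import mean, stdev

# Invented Likert-style responses (1-4) from five students to three
# related questions about the same underlying construct.
responses = {
    "enjoys_reading":    [4, 3, 2, 4, 1],
    "reads_for_fun":     [3, 3, 1, 4, 2],
    "talks_about_books": [4, 2, 2, 3, 1],
}

def zscores(values):
    """Standardize a list of responses to mean 0, sample stdev 1."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

standardized = {item: zscores(vals) for item, vals in responses.items()}
n_students = len(next(iter(responses.values())))

# Each student's index score is the mean of their standardized responses
# across all items in the index.
index = [mean(standardized[item][i] for item in responses)
         for i in range(n_students)]
```

Under this toy scheme, a student who answers consistently high across the three items lands above zero on the index, and a student who answers consistently low lands below it; the index summarizes the shared construct rather than any single question.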
NCES collaborates with the NAEP Validity Studies (NVS) Panel, the Design and Analysis Committee (DAC), and attendees at NAEP conferences. Since 1995, the American Institutes for Research (AIR) has maintained the NVS Panel, an independent panel of experts that meets to commission and discuss research addressing validity considerations for NAEP. The DAC has been instrumental in assisting NCES through the transition to digitally based assessment.
In addition, NCES periodically convenes larger conferences where experts from a variety of fields provide input on the direction of the program five, ten, and twenty years into the future.