NAEP Technical Documentation: Development of the Science Cognitive Items and Instruments

  • Student test forms: 2019
  • Student booklets: 2015
  • Student booklets: 2011
  • Student booklets: 2009
  • Student booklets: 2005
  • Student booklets: 2000

  • Number of items: 2019
  • Number of items: 2015
  • Number of items: 2011
  • Number of items: 2009
  • Number of items: 2005
  • Number of items: 2000

  • New and common blocks: 2019
  • Common blocks: 2015
  • Common blocks: 2011
  • Common blocks: 2005
  • Common blocks: 2000

The instruments used in the NAEP science assessment are usually composed of blocks of cognitive items from previous NAEP assessments, as well as blocks newly developed for the current year's assessment. Administering the same blocks of items across years allows for the reporting of trends in science performance. Developing new blocks of items makes it possible to release some items for public use. In some assessment years, one or more blocks at each grade are released to the public and can be accessed via the NAEP Questions Tool.

The NAEP science framework and specifications documents guide the item development efforts. In 2009, a new science framework was introduced, and most of the items used were newly developed; there were no common blocks between the 2009 assessment and those used in previous years. Because of resulting changes to the assessment, 2009 marked the start of a new NAEP science trend line; therefore, beginning with the 2009 assessment, performance results cannot be compared to those from previous assessment years. Whenever changes are made to a framework, efforts are made to maintain the trend lines that permit the reporting of changes in student achievement over time. If, however, the nature of the changes made to an assessment is such that the results would not be comparable to earlier assessments, a new trend line is started. See a comparison of the two frameworks. This same framework guided the development of the 2011, 2015, and 2019 assessments.

Items are written by NAEP test development staff, with input from members of the NAEP Science Standing Committee as well as from elementary, secondary, and postsecondary teachers around the United States. All assessment materials are reviewed by specialists in science education, measurement, and assessment development for accessibility and potential bias. Scenario-based tasks (SBTs), which include Interactive Computer Tasks (ICTs) and Hands-on Hybrid Tasks (HHOTs), undergo reviews for functionality via tryouts and cognitive labs in addition to these content reviews. HHOTs also undergo kit reviews to confirm that the equipment and the procedures called for in the questions are accurate. The cognitive items are then assembled into 30-minute blocks.

Each science cognitive block contains a range of questions covering three science content areas:

  • Physical Science;
  • Life Science; and
  • Earth and Space Sciences.

See what the NAEP science assessment measures for more information.

Following approval from the National Center for Education Statistics (NCES), the cognitive blocks are assembled into test booklets or digital test forms for computer delivery and administration of the assessment.

The NAEP science assessment was administered for the first time in 2019 as a digitally based assessment at grades 4, 8, and 12. The science digital assessment was designed to continue reporting trends in student performance while keeping pace with the new generation of classroom environments in which digital technology has increasingly become a part of students' learning. The science digital assessments at all three grades were administered on tablets.

Assessment content at all three grades consisted of standalone, discrete questions that were transadapted from the previous paper-based assessment, as well as newly developed discrete questions designed to take advantage of the digital delivery system. Questions previously used in the paper-based format were adapted to fit a tablet screen; while the presentation changed, the content itself did not. Most of the 2015 paper-based assessment content at all three grades was also used in the 2019 digitally based assessment. A key goal of adapting questions from the paper-based to the digital format was to retain the same measurement targets in the digital version as in the original paper-based version of each question. At grade 4, of the ten question blocks administered in 2019, seven were transadapted from the 2015 paper-based assessment and three were newly developed for the digital assessment. At grade 8, of the 11 blocks administered in 2019, eight were transadapted from the paper-based format and three were newly developed. At grade 12, of the 12 question blocks administered in 2019, nine were transadapted from the paper-based format and three were newly developed.

In addition to standalone, discrete questions, the science digitally based assessment included interactive SBTs that were designed to engage students in solving real-world scientific problems in a digital environment. The 2019 science assessment included two types of SBT question blocks: ICTs and HHOTs. At grades 4, 8, and 12, students could be administered different combinations of discrete and SBT question blocks.

To estimate overall trend scores and evaluate the effectiveness of the digital transition, the 2015 science paper-based assessment at grades 4, 8, and 12 was re-administered in 2019. A multistep process was used for the transition in order to preserve the trend lines that show student performance over time. In 2019, NCES administered the assessment in both modes, paper-based and digitally based, to randomly assigned groups of students in all sampled schools at each grade to investigate potential differences in performance between students taking the assessment on a tablet and students taking it on paper. The results from the digital assessment can therefore be compared to those from previous assessment years, showing how students' performance in science has changed over time. Read more about the science transition and mode evaluation.
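
The logic of the mode evaluation rests on random assignment: because the paper and tablet groups are statistically equivalent, a difference in their average performance estimates the effect of the delivery mode itself. The sketch below illustrates that idea with hypothetical scores only; it is not the operational NAEP analysis, which additionally accounts for the complex sample design, plausible values, and measurement error.

```python
import numpy as np

# Hypothetical scale scores for two randomly assigned mode groups at one grade.
rng = np.random.default_rng(1)
paper_scores = rng.normal(150, 35, size=2500)   # paper-based group
tablet_scores = rng.normal(149, 35, size=2500)  # digitally based (tablet) group

# With random assignment, a simple difference in mean scores estimates the
# mode effect; this sketch ignores the survey weights and design effects that
# an operational analysis would include.
mode_effect = tablet_scores.mean() - paper_scores.mean()
std_error = np.sqrt(paper_scores.var(ddof=1) / paper_scores.size
                    + tablet_scores.var(ddof=1) / tablet_scores.size)
print(f"Estimated mode effect: {mode_effect:.2f} points (SE = {std_error:.2f})")
```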

After the administration of the assessments at each grade, NCES conducted rigorous analyses of the data and aligned the 2019 results to previous assessment years using a two-step process: the first step used common item linking, and the second used common population linking. These analyses, common item linking based on paper results followed by common population linking of paper results to digital results, enabled NCES to maintain the science trend line while transitioning to the digital assessment in 2019. The 2019 science assessment results at each grade are based on the combined performance of students who took the assessment on paper and students who took the assessment on tablets.
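
As an illustration of that two-step logic, the sketch below substitutes a simple mean-sigma (linear) transformation for the IRT-based scaling and linking procedures actually used operationally: step 1 places the 2019 paper results on the existing trend scale, and step 2 places the digital results on that same scale by matching the randomly equivalent 2019 paper group. All scores, sample sizes, and scale values are hypothetical.

```python
import numpy as np

def linear_link(scores, target_mean, target_sd):
    """Mean-sigma linking: linearly transform scores so their mean and SD
    match a target scale (a simplified stand-in for operational linking)."""
    slope = target_sd / scores.std()
    intercept = target_mean - slope * scores.mean()
    return slope * scores + intercept

rng = np.random.default_rng(0)

# Step 1: common item linking.
# Hypothetical provisional estimates for the 2019 paper sample, linked to
# illustrative reporting-scale moments implied by blocks shared with 2015.
paper_2019 = rng.normal(0.0, 1.0, size=5000)
trend_mean, trend_sd = 150.0, 35.0
paper_2019_linked = linear_link(paper_2019, trend_mean, trend_sd)

# Step 2: common population linking.
# Students were randomly assigned to modes, so the digital group is treated as
# drawn from the same population and matched to the linked paper distribution.
digital_2019 = rng.normal(0.1, 1.05, size=5000)
digital_2019_linked = linear_link(digital_2019,
                                  target_mean=paper_2019_linked.mean(),
                                  target_sd=paper_2019_linked.std())

# Reported 2019 results combine both groups on the common scale.
combined = np.concatenate([paper_2019_linked, digital_2019_linked])
print(f"Combined 2019 mean: {combined.mean():.1f}, SD: {combined.std():.1f}")
```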


Last updated 09 October 2023 (SK)