


NAEP Technical Documentation: Processing Schedule and Overall Counts for the 2018 Assessment



The National Center for Education Statistics (NCES) conducted the 2018 National Assessment of Educational Progress (NAEP) with several organizations under the umbrella of the NAEP Alliance, which provided alliance coordination, item development, sample design, materials processing, design, analysis, and reporting, web technology, and scheduling staff.

The NAEP 2018 assessment comprised the following student assessments and special studies:

    • civics, geography, and U.S. history, at grade 8, paper-based and digitally based;
    • science pilot (discrete tasks, interactive computer tasks, hybrid hands-on tasks), at grades 4, 8, and 12, digitally based;
    • reading scenario-based task study, at grades 4, 8, and 12, digitally based;
    • oral reading fluency study, at grade 4, digitally based;
    • technology and engineering literacy operational assessment, at grade 8, digitally based;
    • mathematics and reading pilots, at grade 12, digitally based;
    • mathematics assessment delivery study, at grades 4, 8, and 12, digitally based; and
    • national teachers and principals survey, at grades 4, 8, and 12, digitally based.

Materials preparation, distribution, processing, and scoring staff were responsible for the following tasks in the NAEP 2018 assessment:

    • printing test booklets and questionnaires;
    • packaging and distributing materials;
    • receipt control;
    • data capture through image and optical mark recognition scanning;
    • data editing and validation;
    • upload of open-ended responses into NAEP's electronic scoring system, including scanned images from paper booklets and digital responses completed on tablets;
    • creation of training sets for new items;
    • training and performance scoring of constructed-response items;
    • quality control;
    • data file creation; and
    • inventory control and materials storage.

For NAEP 2018, a total of 109 types of scannable student booklets were designed and printed, along with the non-scannable large-print and braille student booklets used in the paper-based assessment.

School and teacher questionnaires were offered online through a secure file transfer protocol (FTP) site, and scannable paper questionnaires were provided to respondents upon request. The site was available from early December 2017 through mid-March 2018.

Approximately 9,600 questionnaires were completed online and 320 scannable questionnaires were completed on paper for the operational assessment, digitally based pilot, and special studies.

In NAEP 2018, approximately 22,600 students were assessed in the paper-based operational assessments in civics, geography, and U.S. history; 21,100 in the digitally based operational assessments in civics, geography, and U.S. history; 6,000 in the reading scenario-based task study; 1,900 in the oral reading fluency study; 10,500 in the mathematics pilot; 4,600 in the reading pilot; 45,700 in the science pilot; 2,600 in the mathematics assessment delivery study; and 15,400 in the technology and engineering literacy assessment.

Scoring of NAEP 2018 occurred in Iowa City, IA, for the science pilot (discrete tasks, interactive computer tasks, hybrid hands-on tasks); Norfolk, VA, for the reading pilot, the oral reading fluency and reading scenario-based task studies, and the technology and engineering literacy assessment; and Mesa, AZ, for the mathematics pilot and for civics, geography, and U.S. history. As in past cycles, project and content specialists were on site to monitor scoring. There was daily communication between staff at the scoring sites and project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ. Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. Reports included the Trend Reliability and Mean Comparison Report, which highlighted items in need of further discussion, and the Completion Report, which showed detailed scoring information for each item, as well as cumulative current and historical monitoring statistics for both trend and non-trend items. More than 1.2 million responses were scored from approximately 880 items. All of the NAEP 2018 open-ended constructed responses were scored from April 2, 2018, through May 14, 2018.

        Last updated 08 December 2022 (PG)