
NAEP Technical Documentation: Processing Schedule and Overall Counts for the 2017 Assessment



The National Center for Education Statistics (NCES) conducted the 2017 National Assessment of Educational Progress (NAEP) with staff from several organizations under the umbrella of the NAEP Alliance. Item development, sample design, materials processing, data analysis, and program staff were responsible for:

  • alliance coordination, item development, and design, analysis, and reporting;
  • sampling and data collection as well as the NAEP State Service Center;
  • web technology, design, operation, and maintenance;
  • maintenance of a master schedule and earned value reporting; and
  • materials preparation, distribution, processing, and scoring.

The NAEP 2017 assessment comprised the following student assessments:

  • mathematics and reading operational, at grades 4 and 8, paper-based and digitally based;
  • writing operational, at grades 4 and 8, digitally based;
  • integrated mathematics pilot and multistage testing (MST), at grades 4 and 8, digitally based;
  • integrated reading pilot, at grades 4 and 8, digitally based;
  • integrated civics pilot, U.S. history pilot, and geography pilot, at grade 8, digitally based;
  • writing comparability study, at grade 8, to compare the mode effect of assessing students on laptops vs. tablets, digitally based; and
  • a special equating study of KaSA mathematics, at grades 4 and 8, both nationally and in Puerto Rico, paper-based and digitally based.

Materials processing and scoring staff were responsible for the following tasks in the NAEP 2017 assessment:

  • printing test booklets and questionnaires;
  • packaging and distributing materials;
  • receipt control;
  • data capture through image and optical mark recognition scanning;
  • data editing and validation;
  • upload of open-ended responses into NAEP's electronic scoring system, including scanned images from paper booklets and digital responses completed on tablets;
  • creation of scoring training sets for new items;
  • training and performance scoring of constructed-response items;
  • quality control;
  • data file creation; and
  • inventory control and materials storage.

For NAEP 2017, a total of 348 types of scannable student booklets were designed and printed, along with the non-scannable large-print and braille student booklets used in the paper-based assessments.

School and teacher questionnaires were offered online through a secure file transfer protocol (FTP) site; scannable paper questionnaires were provided to respondents upon request. The site was available from early December 2016 through mid-March 2017.

Approximately 115,000 questionnaires were entered online and 9,000 scannable questionnaires were completed on paper for the operational, digitally based pilot, and Puerto Rico proof-of-concept assessments.

In NAEP 2017, approximately 176,000 students were assessed in the paper-based mathematics and reading operational assessments; 585,000 in the digitally based mathematics and reading operational assessments; 115,000 in the integrated mathematics, mathematics KaSA, reading, civics, U.S. history, and geography pilots; 47,000 in the writing operational assessment; 6,000 in mathematics KaSA in Puerto Rico; and 3,000 in the writing comparability study.
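The per-assessment counts above can be totaled for a rough overall figure. The sketch below simply sums the approximate, rounded numbers reported in the text, so the result is an estimate rather than an official count:

```python
# Approximate student counts as reported above (rounded figures);
# the sum is therefore only a rough overall total, not an official count.
counts = {
    "mathematics and reading operational, paper-based": 176_000,
    "mathematics and reading operational, digitally based": 585_000,
    "integrated pilots (math, KaSA, reading, civics, U.S. history, geography)": 115_000,
    "writing operational": 47_000,
    "mathematics KaSA in Puerto Rico": 6_000,
    "writing comparability study": 3_000,
}

total = sum(counts.values())
print(f"approximate total students assessed: {total:,}")  # 932,000
```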

Scoring of NAEP 2017 responses occurred in Iowa City, IA, for writing, civics, U.S. history, and geography; in Norfolk, VA, for the reading operational assessment and integrated pilot; and in Mesa, AZ, for the mathematics operational assessment and integrated pilot. As in past cycles, project and content specialists were on site to monitor scoring. There was daily communication between staff at the scoring sites and project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ. Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. These included the Trend Reliability and Mean Comparison (TRMC) Report, which highlighted items in need of further discussion, and the Completion Report, which showed detailed scoring information for each item as well as cumulative current and historical monitoring statistics for both trend and non-trend items. More than 7 million responses were scored from approximately 1,200 items. All NAEP 2017 responses were scored from April 3 through June 23, 2017.

        Last updated 12 May 2022 (PG)