
NAEP Technical Documentation: Processing Schedule and Overall Counts for the 2019 Assessment



2019 Student Participation and Session Information

2018 Student Participation and Session Information

2017 Student Participation and Session Information

2016 Student Participation and Session Information

2015 Student Participation and Session Information

The National Center for Education Statistics (NCES) conducted the 2019 National Assessment of Educational Progress (NAEP) with several organizations under the umbrella of the NAEP Alliance: alliance planning and coordination (PC); item development (ID); sampling and data collection (SDC); materials, distribution, processing, and scoring (MDPS); design, analysis, and reporting (DAR); and web technology development, operations, and maintenance (WTDOM) staff.

The NAEP 2019 assessment comprised the following student assessments and special studies:

    • reading, mathematics, and science, at grades 4, 8, and 12, digitally based;
    • mathematics and reading, at grade 12, paper-based;
    • science, at grades 4, 8, and 12, paper-based;
    • reading and mathematics pilots, at grades 4 and 8, digitally based;
    • mathematics, at grades 4 and 8, Puerto Rico, digitally based;
    • meaning vocabulary, pilot, at grades 4 and 8, digitally based;
    • National Indian Education Study (NIES), at grades 4 and 8, digitally based;
    • High School Transcript Study (HSTS), at grade 12;
    • Middle School Transcript Study (MSTS), at grade 8;
    • Computer Access and Familiarity Study (CAFS), at grades 4, 8, and 12, digitally based;
    • Extended Student Questionnaire study (ESQ), at grades 4 and 8, digitally based.

MDPS staff were responsible for the following tasks in the NAEP 2019 assessment:

    • printing test booklets and questionnaires;
    • packaging and distributing materials;
    • receipt control;
    • data capture through image and optical mark recognition scanning;
    • data editing and validation;
    • uploading of open-ended responses into NAEP's electronic scoring system, including scanned images from paper booklets and digital responses completed on tablets;
    • creation of scoring training sets for new items;
    • training and performance scoring of constructed-response items;
    • quality control;
    • data file creation; and
    • inventory control and materials storage.

For NAEP 2019, MDPS staff designed and printed a total of 325 types of scannable student booklets, along with non-scannable large-print student booklets, for use in the paper-based assessment (PBA). Braille student booklets were created for use in both the PBA and digitally based assessment (DBA) sessions.

School and teacher questionnaires were offered online through a secure file transfer protocol (FTP) site, which was available from early December 2018 through mid-March 2019. Scannable questionnaires were provided to respondents upon request.

Approximately 110,000 questionnaires were completed online and 17,000 scannable questionnaires were completed on paper for the operational assessments, digitally based pilot, and special studies.

Approximately 65,700 students were assessed in the paper-based operational assessments in reading, mathematics, and science; 744,600 in the digitally based operational assessments in reading, mathematics, and science; 26,300 in the mathematics pilot; 9,800 in the reading pilot; 7,400 in mathematics in Puerto Rico; and 4,600 in the meaning vocabulary pilot.

Scoring of NAEP 2019 operational items took place in the spring of 2019 in Norfolk, VA, and Columbus, OH, for reading, and in Mesa, AZ, for mathematics. Scoring of grade 12 mathematics, reading, and science; grades 4 and 8 science; and the mathematics and reading pilots was moved out of the normal schedule to the fall of 2019.

As in past cycles, project and content scoring specialists were on-site to monitor scoring. Staff at the scoring sites communicated daily with project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ. Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. These included the Trend Reliability and Mean Comparison (TRMC) Report, which flagged items that needed further discussion, and the Completion Report, which showed detailed scoring information for each item as well as cumulative current and historical monitoring statistics for both trend and non-trend items. More than 5.2 million responses were scored from approximately 383 items.


Last updated 02 June 2023 (PG)