2019 Student Participation and Session Information
The NAEP 2019 assessment consisted of the following student assessments and special studies:
MDPS staff were responsible for the following tasks in the NAEP 2019 assessment:
For NAEP 2019, MDPS designed and printed a total of 325 types of scannable student booklets, along with non-scannable large-print student booklets, for use in the paper-based assessment (PBA). Braille student booklets were created for use in both the PBA and digitally based assessment (DBA) sessions.
School and teacher questionnaires were offered online through a secure file transfer protocol (FTP) site; scannable questionnaires were provided to respondents upon request. The site was available from early December 2018 through mid-March 2019.
Approximately 110,000 questionnaires were entered online and 17,000 scannable questionnaires were completed on paper for the operational assessments, the digitally based pilot, and the special studies.
Approximately 65,700 students were assessed in the paper-based operational assessments in reading, mathematics, and science; 744,600 in the digitally based operational assessments in reading, mathematics, and science; 26,300 in the mathematics pilot and 9,800 in the reading pilot; 7,400 in mathematics in Puerto Rico; and 4,600 in the meaning vocabulary pilot.
Scoring of NAEP 2019 operational items occurred in Norfolk, VA, and Columbus, OH, for reading, and in Mesa, AZ, for mathematics in the spring of 2019. Scoring of grade 12 mathematics, reading, and science; grade 4 and 8 science; and the mathematics and reading pilots was moved out of the normal schedule to the fall of 2019.
As in past cycles, project and content scoring specialists were on-site to monitor scoring. There was daily communication between staff at the scoring sites and project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ. Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. Reports included the Trend Reliability and Mean Comparison (TRMC) Report, which highlighted items that needed further discussion, and the Completion Report, which showed detailed scoring information for each item as well as cumulative current and historical monitoring statistics for both trend and non-trend items. More than 5.2 million responses were scored from approximately 383 items.
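To illustrate the kinds of monitoring statistics such reports summarize, the sketch below computes two common measures for a constructed-response item: exact agreement between two raters on double-scored responses (a reliability check) and the shift in mean item score relative to a prior cycle (a trend mean comparison). This is a minimal, hypothetical illustration; the function names, score scales, and data are assumptions and do not reflect NAEP's actual report formats or computations.

```python
# Hypothetical sketch of scoring-monitoring statistics; all names and
# data are illustrative, not NAEP's actual formats or methods.

def exact_agreement(first_scores, second_scores):
    """Percent of double-scored responses on which both raters agree exactly."""
    assert len(first_scores) == len(second_scores) and first_scores
    matches = sum(1 for a, b in zip(first_scores, second_scores) if a == b)
    return 100.0 * matches / len(first_scores)

def mean_shift(current_scores, trend_scores):
    """Difference between the current-cycle and prior-cycle mean item score."""
    current_mean = sum(current_scores) / len(current_scores)
    trend_mean = sum(trend_scores) / len(trend_scores)
    return current_mean - trend_mean

# Illustrative scores on a 0-3 item for eight double-scored responses.
first = [2, 1, 3, 2, 0, 2, 1, 3]
second = [2, 1, 3, 1, 0, 2, 1, 3]
trend = [2, 2, 3, 2, 1, 2, 1, 3]  # prior-cycle scores for the same item

print(f"exact agreement: {exact_agreement(first, second):.1f}%")  # 87.5%
print(f"mean shift vs. trend: {mean_shift(first, trend):+.2f}")   # -0.25
```

In practice a report like the TRMC would flag items whose agreement falls below a threshold or whose mean shift exceeds a tolerance, so scoring supervisors know which items need discussion.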