The National Center for Education Statistics (NCES) conducted the 2013 National Assessment of Educational Progress (NAEP) with staff from several organizations under the umbrella of the NAEP Alliance. Item development, sample design, materials processing, data analysis, and program staff were responsible for:
Alliance coordination, item development, and design, analysis, and reporting;
sampling and data collection as well as the NAEP State Service Center;
web technology, design, operation, and maintenance;
maintenance of a master schedule and earned value reporting; and
materials preparation, distribution, processing, and scoring.
NAEP 2013 comprised the following student assessments.
pilot assessment at grade 8 (technology-based assessment)
In addition, several special studies were conducted.
NAEP-HSLS (High School Longitudinal Study)
Students who were part of the HSLS took the grade 12 mathematics assessment
NAEP-PISA Linking Study
Sampled students in grades 9, 10, and 11 took two grade 8 and/or grade 12 mathematics blocks
Reading Accessible Study (NAEP Validity Studies Panel (NVS))
Sampled students in grades 4 and 8 took a regular NAEP reading block and a revised (by NVS) NAEP reading block
Reading Read Aloud Study
Sampled students with disabilities (SD) and/or English language learners (ELL) in grades 4 and 8 reading used one of two Read Aloud accommodations:
Read aloud the story, directions, and the questions
Read aloud the directions and questions
Reading MetaMetrics Lexile Study
Sampled grade 8 reading students took one Lexile (all multiple choice) and one NAEP block
Materials processing and scoring staff were responsible for the following tasks in this assessment:
printing test booklets and questionnaires;
packaging and distributing materials;
data capture through image and optical mark recognition scanning;
data editing and validation;
upload of open-ended responses into ePEN, including scanned images from paper booklets, text files from technology-based items, and image files from one technology-based item;
creation of training sets for new items;
training and performance scoring of constructed response items;
data file creation; and
inventory control and materials storage.
For NAEP 2013, 854 documents were designed, along with the following non-scannable documents: Braille versions of student test booklets, companion booklets for each Braille version, large-print student booklets, and six non-scannable worksheets.
The assessment school questionnaire and teacher questionnaires were offered in traditional paper form or online at www.naepq.org. This site was available January 3 through March 16, 2013; of the 105,000 total completed questionnaires, 28,000 were entered online.
There were 878,000 students assessed in 2013.
Scoring of NAEP 2013 occurred at three scoring sites.
Mesa, AZ - mathematics and technology and engineering literacy (TEL)
Virginia Beach, VA - reading
Columbus, OH - reading
As in past cycles, project and content specialists were on-site to monitor scoring. There was daily communication between the scoring sites and project staff in the District of Columbia; Iowa City, IA; and Princeton, NJ. Reports were available to a variety of staff from NCES, the NAEP program, and organizations within the Alliance. Reports included the Trend Reliability and Mean Comparison (TRMC) Report, which highlighted items in need of further discussion, and the Completion Report. More than 7 million responses were scored from 800 items. There were several scoring windows for NAEP 2013.
Mathematics operational: March 18 - April 19
Mathematics pilot and a special mathematics assessment in Puerto Rico: May 13 - May 24