
NAEP Technical Documentation: Processing Schedule and Overall Counts for the 2009 Assessment


2009 Processing Schedule

2009 Student Participation and Session Information

In the spring of 2009, the National Assessment of Educational Progress (NAEP) assessed a nationally representative sample of students in mathematics, reading, and science at grades 4, 8, and 12. Probes were conducted at all three grades for science hands-on tasks (HOT) and science interactive computer tasks (ICT), as were pilot tests for civics, geography, and U.S. history. In addition, some students were asked to respond to an Extended Student Background Questionnaire (ESBQ), and information was collected for the National Indian Education Study (NIES).

Materials staff was responsible for the following tasks:

  • printing test booklets and questionnaires,
  • packaging and distributing materials,
  • receipt control,
  • data capture through image and optical mark recognition scanning,
  • data editing and validation,
  • creating training sets for new items,
  • training and performance scoring of constructed response items,
  • data file creation, and
  • inventory control and materials storage.

For the assessment, NAEP staff designed 1,060 student booklet types, 30 questionnaires, and 10 tracking forms. Almost five million forms were printed; approximately 1,300,000 student documents and more than 450,000 questionnaires were scanned. Most questionnaires were offered either in traditional paper form or online at www.naepq.com. The site was available from January 2 through March 20, 2009.

Scoring of the 2009 NAEP Assessment occurred at five sites:

  • Columbus, Ohio (reading)
  • Lansing, Michigan (science)
  • Mesa, Arizona (mathematics)
  • Tucson, Arizona (pilots in civics, geography, and U.S. history)
  • Virginia Beach, Virginia (reading)

As in past cycles, project and content specialists were on-site to monitor scoring. There was daily communication between the scoring sites and project staff in the District of Columbia; Iowa City, Iowa; and Princeton, New Jersey. Reports included the Trend Reliability and Mean Comparison (TRMC) Report and the Completion Report. A macro applied to the TRMC Report highlighted items that needed further discussion. More than 13 million responses were first, second, or trend scored during March, April, and June.
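The documentation does not describe the logic of the TRMC highlighting macro. The sketch below is only an illustration of how such a flagging rule might work, assuming hypothetical field names (trend_reliability, current_mean, trend_mean) and made-up thresholds; none of these values come from the NAEP materials.

```python
# Hypothetical sketch of a rule that flags items for further discussion,
# similar in spirit to the macro applied to the TRMC Report.
# Field names, thresholds, and example values are assumptions, not NAEP specifications.

from dataclasses import dataclass

@dataclass
class ItemStats:
    item_id: str
    trend_reliability: float   # agreement between current and trend scoring
    current_mean: float        # mean item score in the current cycle
    trend_mean: float          # mean item score from the prior (trend) cycle

def needs_discussion(stats: ItemStats,
                     min_reliability: float = 0.85,
                     max_mean_shift: float = 0.10) -> bool:
    """Return True if an item should be highlighted for review."""
    low_agreement = stats.trend_reliability < min_reliability
    large_shift = abs(stats.current_mean - stats.trend_mean) > max_mean_shift
    return low_agreement or large_shift

# Example usage with made-up values
items = [
    ItemStats("R0123", trend_reliability=0.92, current_mean=1.41, trend_mean=1.38),
    ItemStats("R0456", trend_reliability=0.78, current_mean=1.10, trend_mean=1.32),
]
flagged = [s.item_id for s in items if needs_discussion(s)]
print(flagged)  # ['R0456']
```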

 


Last updated 17 March 2016 (GF)