
Processing Schedule and Overall Counts for the 2008 Assessment

        

2008 Processing Schedule

2008 Student Participation and Session Information

In the spring of 2008, NAEP assessed a national sample of students in arts (music and visual arts) at grade 8. Pilot tests based on new frameworks were conducted in science at grades 4, 8, and 12 and in reading and mathematics at grade 12. Field tests were conducted at grades 4 and 8 in mathematics and reading, with the reading assessment based on a new framework. In addition, a special study on inclusion was conducted at grade 4 in reading and mathematics. Long-term trend studies in mathematics and reading were conducted in the fall of 2008 for students at age 13, in the winter for students at age 9, and in the spring for students at age 17. Pilot test materials are not normally described or reported on; in this case, however, they are integral to the processing of materials.

Materials staff were responsible for the following tasks:

  • printing test booklets and questionnaires,
  • materials packaging and distribution,
  • receipt control,
  • data capture through image and optical mark recognition scanning,
  • data editing and validation,
  • creation of training sets for new items,
  • training and performance scoring of trend constructed-response items,
  • preparation of materials for scoring of constructed-response items,
  • data file creation, and
  • inventory control and materials storage.

More than 1.2 million forms were printed, resulting in approximately 200,000 student documents and more than 36,000 questionnaires scanned. Most questionnaires were offered either in traditional paper form or online at www.naepq.org. The site was available from January 2 through March 17, 2008.

Scoring of the 2008 NAEP Assessment occurred at four sites:

  • Iowa City, Iowa (arts);
  • Mesa, Arizona (mathematics);
  • Virginia Beach, Virginia (reading); and
  • Lansing, Michigan (science).

As in past cycles, project and content specialists were on-site to monitor scoring. There was daily communication between the scoring sites and project staff in Washington, D.C.; Iowa City, Iowa; and Princeton, New Jersey. Reports included the Trend Reliability and Mean Comparison (TRMC) Report and the Completion Report. A macro was applied to the TRMC Report to highlight those items that needed more discussion. Approximately one million responses were first-scored, second-scored, or trend-scored during January, March, April, May, and June.


Last updated 24 August 2010 (JL)
National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education