
The NAEP 1996 Technical Report

July 1999   Errata Notice

Authors: Nancy L. Allen, James E. Carlson, and Christine A. Zelenak



Introduction[1]

by James E. Carlson (Educational Testing Service)

The 1996 National Assessment of Educational Progress (NAEP) monitored the performance of students in American schools in the subject areas of reading, mathematics, science, and writing. The national sample involved nearly 124,000 public and nonpublic-school students who were 9, 13, or 17 years old or in grades 4, 8, or 12.

The purpose of this technical report is to provide details on the instrument development, sample design, data collection, and data analysis procedures of the 1996 assessment. Detailed substantive results are not presented here but can be found in a series of NAEP reports on the status of and trends in student performance; several additional reports provide information on how the assessment was designed and implemented. The reader is directed to the following reports for 1996 results and supporting documentation:

  • NAEP 1996 Mathematics Report Card for the Nation and the States: Findings from the National Assessment of Educational Progress (Reese, Miller, Mazzeo, & Dossey, 1997)

  • NAEP 1996 Science Report Card for the Nation and the States: Findings from the National Assessment of Educational Progress (O'Sullivan, Reese, & Mazzeo, 1997)

  • The NAEP Guide: A Description of the Content and Methods of the 1994 and 1996 Assessments (NAEP, 1996)

  • NAEP 1996 Trends in Academic Progress: Achievement of U.S. Students in Science, 1969 to 1996; Mathematics, 1973 to 1996; Reading, 1971 to 1996; and Writing, 1984 to 1996 (Campbell, Voelkl, & Donahue, 1997)

  • NAEP 1996 Mathematics Cross-State Data Compendium for the Grade 4 and Grade 8 Assessment (Shaughnessy, Nelson, & Norris, 1997)

  • NAEP 1996 Science Cross-State Data Compendium for the Grade 8 Assessment (Keiser, Nelson, Norris, & Szyszkiewicz, 1998)

  • Mathematics Framework for the 1996 National Assessment of Educational Progress (National Assessment Governing Board, 1994)

  • Science Framework for the 1996 National Assessment of Educational Progress (National Assessment Governing Board, 1993)

  • Technical Report of the NAEP 1996 State Assessment Program in Mathematics (Allen, Jenkins, Kulick, & Zelenak, 1997)

  • Technical Report of the NAEP 1996 State Assessment Program in Science (Allen, Swinton, Isham, & Zelenak, 1998)

  • NAEP 1996 National Assessment Secondary-Use Data Files User Guide (Rogers, Kline, & Schoeps, 1999)

  • NAEP 1996 State Assessment Program in Mathematics Secondary-Use Data Files User Guide (O'Reilly, Zelenak, Rogers, & Kline, 1999)

  • NAEP 1996 State Assessment Program in Science Secondary-Use Data Files User Guide (O'Reilly, Zelenak, Rogers, & Kline, 1999)

  • NAEP 1996 Science Performance Standards: Achievement Results for the Nation and the States (Bourque, Champagne, & Crissman, 1997)

  • School Policies Affecting Instruction in Mathematics: Findings from the National Assessment of Educational Progress (Hawkins, Stancavage, & Dossey, 1998)

  • Student Work and Teacher Practices in Mathematics (Mitchell, Hawkins, Jakwerth, Stancavage, & Dossey, 1999)

  • Estimation Skills, Mathematics-in-context, and Advanced Skills in Mathematics (Hawkins, Mitchell, Stancavage, & Dossey, 1999)

  • Students Learning Science: A Report on Policies and Practices in U.S. Schools (O'Sullivan, Weiss, & Askew, 1998)

  • Student Work and Teacher Practices in Science: A Report on What Students Know and Can Do (O'Sullivan & Weiss, 1999)

  • The 1996 NAEP Sampling and Weighting Report (Wallace & Rust, 1999)

  • Report on Data Collection Activities for the 1996 National Assessment of Educational Progress (Westat, Inc., 1996)

  • Report of Processing and Professional Activities (National Computer Systems, 1996)

The Report Card publications highlight results for the nation, states, and selected subgroups. Reports on student work and teacher practices focus on instructional variables related to mathematics and science education and are designed to meet the information needs of teachers and curriculum specialists. The aim of the reports on school policies, which focus on instruction-relevant variables from the school or community level, is to meet the information needs of principals, school boards, and interested citizens. Technical and other reports listed above provide more detailed information on the NAEP data and analysis procedures. Many of the NAEP reports, including the almanacs (summary data tables), are also available on the Internet at http://nces.ed.gov/naep. For ordering information on printed copies of these reports, go to the Department of Education web page http://edpubs.ed.gov, call toll free 1-877-4ED PUBS (877-433-7827), or write to:

Education Publications Center (ED Pubs)
U.S. Department of Education
P.O. Box 1398
Jessup, MD 20794-1398

The Frameworks are designed to guide the assessment of the outcomes of students' education in mathematics and science in grades 4, 8, and 12 as part of NAEP. For ordering information on these reports, write:

National Assessment Governing Board
800 North Capitol Street NW
Suite 825
Washington, DC 20002

The Frameworks and other NAGB documents are also available through the web at http://www.nagb.org.

Additional samples of approximately 125,000 fourth-graders and 125,000 eighth-graders in 48 jurisdictions were assessed in the 1996 state assessment in mathematics. In addition, a sample of approximately 125,000 eighth-graders in 47 states and jurisdictions was assessed as part of the 1996 state assessment in science. A representative sample of about 2,500 students was selected in each jurisdiction for each subject at each grade level. The state-level sampling plan allowed for cross-state comparisons, and comparisons with the nation, in eighth-grade science and fourth- and eighth-grade mathematics achievement. Technical details of the state assessments are not presented in this technical report but can be found in the state technical reports.


AN OVERVIEW OF NAEP IN 1996

For the 1996 assessment, NAEP researchers continued to build on the original design technology outlined in A New Design for a New Era (Messick, Beaton, & Lord, 1983). In order to maintain its links to the past and still implement innovations in measurement technology, NAEP continued its multistage sampling approach. Long-term trend and short-term trend samples use the same methodology and population definitions as in previous assessments. Main assessment samples use innovations associated with new NAEP technology and address current educational issues. Long-term trend data are used to estimate changes in performance from previous assessments; main assessment sample data are used primarily for analyses involving the current student population, but also to estimate short-term trends for a small number of recent assessments. In continuing to use this two-tiered approach, NAEP reaffirms its commitment to maintaining long-term trends while at the same time implementing the latest in measurement technology.

A major new design feature introduced in 1996 permitted new inclusion rules for students with disabilities (SD) and limited English proficient (LEP) students, together with testing accommodations for those students. The 1996 national NAEP incorporated a multiple sampling plan that allowed for the study of changes in NAEP inclusion and accommodation procedures. To provide for studies of the effects of these changes, students from different samples were administered the NAEP instruments under different sets of inclusion rules and accommodation procedures. Testing accommodations were provided in certain samples for SD and LEP students who could be assessed, but not with standard instruments or administration procedures.

In the 1996 assessment, many of the innovations first implemented in 1988 were continued and enhanced. For example, a variant of the focused balanced incomplete block (focused-BIB) booklet design, first used in 1988 and retained in subsequent assessment years, was used in the 1996 main assessment samples in mathematics and science. In the focused-BIB design, an individual receives blocks of cognitive items in the same subject area. The focused-BIB design allows for improved estimation within a particular subject area, while estimation continues to be optimized for groups rather than individuals. A small worked example of the block-to-booklet balance appears below.
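
As a concrete illustration of this balance, the following minimal sketch (in Python, with an invented layout rather than the operational 1996 one, which is documented in Part I) assigns seven hypothetical item blocks to seven booklets of three blocks each and verifies the two defining BIB properties: every pair of blocks appears together in exactly one booklet, and every block appears once in each booklet position.

    from itertools import combinations

    # Hypothetical focused-BIB layout: 7 blocks of cognitive items from one
    # subject area spread over 7 booklets of 3 blocks each. This (7, 3, 1)
    # design is illustrative only; the 1996 NAEP layouts differ in size.
    BOOKLETS = [
        (1, 2, 4), (2, 3, 5), (3, 4, 6), (4, 5, 7),
        (5, 6, 1), (6, 7, 2), (7, 1, 3),
    ]

    def check_bib(booklets):
        """Verify the two balance properties of this layout."""
        # Every pair of blocks appears together in exactly one booklet,
        # which supports estimating relationships among all block pairs.
        pair_counts = {}
        for booklet in booklets:
            for pair in combinations(sorted(booklet), 2):
                pair_counts[pair] = pair_counts.get(pair, 0) + 1
        assert all(n == 1 for n in pair_counts.values())
        # Every block appears once in each serial position, balancing
        # fatigue and context effects across blocks.
        for position in range(3):
            assert sorted(b[position] for b in booklets) == list(range(1, 8))

    check_bib(BOOKLETS)
    print("Balanced: block pairs co-occur once; positions are balanced.")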

In 1996, NAEP continued to apply the plausible values approach to estimating means for demographic as well as curriculum-related subgroups. Proficiency estimates were based on draws from a posterior distribution formed by optimally weighting two sets of information: the student's responses to cognitive items, and his or her demographic and associated educational process variables. This Bayesian procedure was developed by Mislevy (see Chapter 11 or Mislevy, 1991). The 1996 procedures continued to use an improvement first implemented in 1988 and refined for the 1994 assessment: a multivariate procedure that uses information from all scales within a given subject area in estimating the proficiency distribution on any one scale in that subject area. A simplified sketch of the plausible values idea follows.
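
The following minimal, unidimensional sketch uses hypothetical item and conditioning parameters (the operational procedure is multivariate, as described in Chapter 11). It combines the IRT likelihood of a student's responses with a normal prior predicted from background variables, then draws plausible values from the resulting posterior.

    import numpy as np

    rng = np.random.default_rng(0)

    def item_prob(theta, a, b):
        """2PL IRT probability of a correct response."""
        return 1.0 / (1.0 + np.exp(-1.7 * a * (theta - b)))

    def draw_plausible_values(responses, a, b, cond_mean, cond_sd, n_draws=5):
        """Draw plausible values from a grid approximation to the posterior."""
        theta = np.linspace(-4.0, 4.0, 401)             # proficiency grid
        log_like = np.zeros_like(theta)
        for x, ai, bi in zip(responses, a, b):          # likelihood of responses
            p = item_prob(theta, ai, bi)
            log_like += x * np.log(p) + (1 - x) * np.log(1 - p)
        # Normal prior centered on the conditioning-model prediction
        # from the student's demographic and educational process variables.
        log_prior = -0.5 * ((theta - cond_mean) / cond_sd) ** 2
        post = np.exp(log_like + log_prior)
        post /= post.sum()
        return rng.choice(theta, size=n_draws, p=post)  # the plausible values

    # Invented example: six dichotomous items; conditioning prediction 0.3 (SD 0.8).
    pvs = draw_plausible_values(
        responses=[1, 1, 0, 1, 0, 1],
        a=[1.0, 0.8, 1.2, 0.9, 1.1, 1.0],
        b=[-0.5, 0.0, 0.5, -1.0, 1.0, 0.2],
        cond_mean=0.3, cond_sd=0.8,
    )
    print(pvs)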

A major improvement used in the 1992 and 1994 assessments, and continued in 1996, was the use of the generalized partial credit model for item response theory (IRT) scaling. This model allowed constructed-response questions scored on a multipoint rating scale to be incorporated into the NAEP scales in a way that uses the information available in each response category; the sketch below shows how the model turns proficiency and item parameters into category probabilities.
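
As an illustration, this sketch computes generalized partial credit model category probabilities for a hypothetical constructed-response item scored 0 to 3; the parameter values are invented, not taken from any 1996 item.

    import numpy as np

    def gpcm_probs(theta, a, b):
        """Category probabilities under the generalized partial credit model.

        theta: examinee proficiency; a: item slope; b: step parameters
        b_1..b_m for an item scored 0..m.
        """
        steps = 1.7 * a * (theta - np.asarray(b))
        z = np.concatenate(([0.0], np.cumsum(steps)))  # category 0: empty sum
        p = np.exp(z - z.max())                        # stabilize, then normalize
        return p / p.sum()

    # Invented 0-3 point item: probabilities shift toward higher categories
    # as proficiency increases, using the information in each score level.
    for theta in (-1.0, 0.0, 1.0):
        print(theta, gpcm_probs(theta, a=0.9, b=[-0.8, 0.2, 1.1]).round(3))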

One important innovation in reporting the 1990 assessment data, continued through 1996, was the use of simultaneous comparison procedures in carrying out significance tests for differences across assessment years. Methods such as the Bonferroni procedure control the familywise Type I error rate for a fixed number of comparisons. In 1996, a new, more powerful procedure that controls the false discovery rate was implemented for some comparisons. Tests for linear and quadratic trends were also applied to the national trend data in reading, mathematics, science, and writing.
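
The sketch below implements the Benjamini-Hochberg step-up procedure, a standard method of this false-discovery-rate-controlling type, applied to invented p-values; it illustrates the class of procedure rather than reproducing the exact 1996 comparisons.

    def fdr_reject(p_values, q=0.05):
        """Benjamini-Hochberg step-up test controlling the false discovery rate.

        Returns reject/retain decisions in the original order. Bonferroni,
        by contrast, would compare every p-value against q / m.
        """
        m = len(p_values)
        order = sorted(range(m), key=lambda i: p_values[i])
        k_max = 0
        for rank, i in enumerate(order, start=1):
            if p_values[i] <= rank * q / m:
                k_max = rank               # largest rank meeting its bound
        reject = [False] * m
        for i in order[:k_max]:            # reject all hypotheses up to k_max
            reject[i] = True
        return reject

    # Invented p-values from a set of cross-year comparisons.
    print(fdr_reject([0.001, 0.008, 0.039, 0.041, 0.20, 0.74]))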


ORGANIZATION OF THE TECHNICAL REPORT

Part I of this report presents the details of the design of the 1996 National Assessment, summarized in Chapter 1. Chapters 2 through 8 describe the development of the objectives and the items used in the assessment, the sample selection procedures, the assessment booklets and questionnaires, the administration of the assessment in the field, the processing of the data from the assessment instruments into computer-readable form, the professional scoring of constructed-response items, and the methods used to create a complete NAEP database.

The 1996 NAEP data analysis procedures are described in Part II of the report. Chapter 9 provides a summary of the analysis steps. Subsequent chapters provide a general discussion of the weighting and variance estimation procedures used in NAEP, an overview of NAEP scaling methodology, and details of the trend and main assessment analyses performed for each subject area in the 1996 assessment.

Chapter 19 presents basic data from the 1996 assessment, including the properties of the measuring instruments and characteristics of the sample.


  1. James E. Carlson is responsible for psychometric and statistical analyses of NAEP.


Download sections of the report (or the complete report) in a PDF file for viewing and printing:

PDF Introduction and Part I: The Design and Implementation of the 1996 NAEP (also includes front cover, title page, and other front matter; Table of Contents, List of Tables, and List of Figures; and Acknowledgments).   741K

PDF Part II: The Analysis of 1996 NAEP Data, Chapters 9 through 12.   1,014K

PDF Part II (continued): The Analysis of 1996 NAEP Data, Chapters 13 through 19.   660K

PDF Appendices and References (also includes the back cover).   902K

PDF The complete NAEP 1996 Technical Report.   3,285K

NCES 1999-452 Ordering information

Suggested Citation
U.S. Department of Education. Office of Educational Research and Improvement. National Center for Education Statistics. The NAEP 1996 Technical Report, NCES 1999-452, by Allen, N.L., Carlson, J.E., & Zelenak, C.A. (1999). Washington, DC: National Center for Education Statistics.

