
Technical Report of the NAEP 1996 State Assessment Program in Mathematics

August 1997

Authors: Nancy L. Allen, Frank Jenkins, Edward Kulick, and Christine A. Zelenak

Download the complete report in a PDF file for viewing and printing (2,227K).


OVERVIEW [1]

In April 1988, Congress reauthorized the National Assessment of Educational Progress (NAEP) and added a new dimension to the program--voluntary state-by-state assessments on a trial basis in 1990 and 1992, in addition to continuing the national assessments that NAEP had conducted since its inception. Congress subsequently authorized a third Trial State Assessment for administration in 1994. It should be noted that the word trial in Trial State Assessment refers to the Congressionally mandated trial to determine whether such assessments can yield valid, reliable, state-representative data. Enough experience had been gained for Congress to authorize State Assessments, rather than Trial State Assessments, to be conducted in 1996. In this report, we refer to the voluntary state-by-state assessment program as the State Assessment program. The State Assessment program, which is designed to provide representative data on achievement for participating jurisdictions, is distinct from the assessment designed to provide nationally representative data, referred to in this report as the national assessment. (This terminology is also used in all other reports of the 1996 assessment results.) All instruments and procedures used in the 1990, 1992, 1994, and 1996 state and national assessments were piloted in field tests conducted in the year prior to each assessment.

The 1990 Trial State Assessment program collected information on the mathematics knowledge, skills, understanding, and perceptions of a representative sample of eighth-grade students in public schools in 37 states, the District of Columbia, and two territories. The second phase of the Trial State Assessment program, conducted in 1992, collected information on the mathematics knowledge, skills, understanding, and perceptions of a representative sample of fourth- and eighth-grade students and the reading skills and understanding of a representative sample of fourth-grade students in public schools in 41 states, the District of Columbia, and two territories.

The 1994 Trial State Assessment program once again assessed the reading skills and understanding of representative samples of fourth-grade students, this time in 44 participating jurisdictions. The 1994 program broke new ground in two ways. First, the 1994 NAEP authorization called for the assessment of samples of both public- and nonpublic-school students. Thus, for the first time in NAEP, jurisdiction-level samples of students from Catholic schools, other religious schools, private schools, Department of Defense Domestic Dependent Elementary and Secondary Schools (DDESS), and Bureau of Indian Affairs (BIA) schools were added to the Trial State Assessment program. Second, samples of students from the Department of Defense Dependents Schools (DoDDS) participated as a jurisdiction, along with the states and territories that have traditionally had the opportunity to participate in the Trial State Assessment program.

The 1996 State Assessment program, described in this report, collected information on the mathematics knowledge, skills, understanding, and perceptions of representative samples of fourth- and eighth-grade students for the third time. In addition, grade 8 public- and nonpublic-school students were assessed in science (see the Technical Report of the NAEP 1996 State Assessment Program in Science, Allen, Swinton, & Zelenak, 1996).

A special feature of the 1996 State Assessments was the introduction of new rules for student inclusion in NAEP assessments. In order to ensure that the mathematics results for the state assessments in 1990, 1992, and 1996 are comparable, half of the schools selected for participation in the 1996 assessment used the old inclusion rules to determine whether students should be included in the assessment, and the other half used the new inclusion rules. In addition to these two groups of schools, which used the old and new inclusion rules without offering students special testing accommodations, the 1996 national assessment included a third group of schools that used the new inclusion rules and offered students within those schools accommodations to the standard NAEP administration procedures. More details on the procedures used for student exclusion are presented in the report on field procedures for the 1996 State Assessment program (Westat, Inc., 1996).

The accommodations provided by NAEP in the national assessments were meant to match those specified in the student’s individualized education plan (IEP) or those ordinarily provided in the classroom for testing situations. The most common accommodation was extended time. In the State Assessment, no special accommodations were offered.

The old and new inclusion rules apply only when a student has been categorized in his or her IEP as a student with disabilities (SD) or as a student with limited English proficiency (LEP); all other students are asked to participate in the assessment. For this reason, the sample of students selected for most analysis and reporting purposes consisted of students from schools using either set of inclusion rules who were not categorized as SD or LEP, together with SD or LEP students from the schools using the old inclusion rules. The advantage of this reporting sample is that it preserves trends with previous assessments and makes use of most of the data from the assessment.
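The composition of the reporting sample can be illustrated with a short sketch. The following Python fragment is purely illustrative: the record layout and field names (inclusion_rules, sd_or_lep) are assumptions made for this example, not NAEP's actual data files or variable names.

    # Minimal sketch of the reporting-sample rule described above.
    # Field names are hypothetical, not NAEP's actual variable names.

    def in_reporting_sample(student: dict) -> bool:
        """Return True if a student record belongs to the reporting sample.

        Rule, as described in the text:
          * students not categorized as SD or LEP are kept regardless of
            which inclusion rules their school used;
          * SD or LEP students are kept only if their school used the
            old inclusion rules.
        """
        if not student["sd_or_lep"]:
            return True
        return student["inclusion_rules"] == "old"

    students = [
        {"id": 1, "inclusion_rules": "old", "sd_or_lep": False},
        {"id": 2, "inclusion_rules": "new", "sd_or_lep": False},
        {"id": 3, "inclusion_rules": "old", "sd_or_lep": True},   # kept
        {"id": 4, "inclusion_rules": "new", "sd_or_lep": True},   # excluded
    ]

    reporting_sample = [s for s in students if in_reporting_sample(s)]
    print([s["id"] for s in reporting_sample])  # [1, 2, 3]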

Special analyses that used the national mathematics assessment data to compare the old and new inclusion rules and to examine the effect of offering testing accommodations indicated little difference in the proportions of students included in the assessment under the old and new inclusion rules. More students were included in the assessment when they were offered accommodations; however, a portion of students who would have participated in the assessment under standard conditions were instead assessed with accommodations when they were offered. As a result, fewer students were assessed under standard conditions when accommodations were offered.

Table 1-1 lists the jurisdictions that participated in the 1996 State Assessment program. Over 125,000 students at each grade participated in the 1996 State Assessments in the jurisdictions shown. Students were administered the same assessment booklets that were used in either NAEP’s 1996 national mathematics or national science assessments.

Table 1-1

Jurisdictions Participating in the 1996 State Assessment Program in Mathematics


Jurisdictions

Alabama Georgia Mississippi [2] Pennsylvania [4]
Alaska [1] Guam Missouri [2] Rhode Island
Arizona Hawaii Montana [3] South Carolina [2]
Arkansas Indiana Nebraska Tennessee [2]
California Iowa Nevada [1] Texas
Colorado Kentucky New Hampshire [5] Utah [2]
Connecticut Louisiana New Jersey Vermont [1]
Delaware Maine [2] New Mexico Virginia
DoDEA/DDESS [1/6] Maryland New York Washington [1]
DoDEA/DoDDS [1/6] Massachusetts [2] North Carolina West Virginia
District of Columbia Michigan North Dakota Wisconsin
Florida Minnesota Oregon [3] Wyoming

[1] Participated in the 1996 mathematics assessment program only.
[2] Participated in the 1992 and 1996 mathematics assessment programs but not in the 1990 program.
[3] Participated in the 1990 and 1996 mathematics assessment programs but not in the 1992 program.
[4] Grade 4 only.
[5] Grade 8 only.
[6] DoDEA is the Department of Defense Education Activity schools, DDESS is the Department of Defense Domestic Dependent Elementary and Secondary Schools, and DoDDS is the Department of Defense Dependents Schools.

The 1996 NAEP mathematics assessments were based on the same framework that was used to construct the 1990 and 1992 assessments. The mathematics framework and assessment specifications were developed for NAEP through a consensus project conducted by the Council of Chief State School Officers (CCSSO) under funding from the National Assessment Governing Board (NAGB). Subsequent to the 1992 assessment, the assessment specifications were refined to bring the assessment more in line with the Curriculum and Evaluation Standards for School Mathematics, published by the National Council of Teachers of Mathematics (NCTM). Research conducted as part of the 1995 NAEP field test indicated that, despite these refinements, the measurement constructs associated with the 1992 and 1996 instruments were sufficiently similar to justify the continuation of the current NAEP scale. Hence, for grade 8, 1996 provides an opportunity to report jurisdiction-level trend data for a NAEP mathematics instrument for those states and territories that participated in the 1990, 1992, and 1996 State Assessment programs. In addition, questionnaires completed by the students, their mathematics teachers, and principals or other school administrators provided an abundance of contextual data within which to interpret the mathematics results.

The purpose of this report is to provide technical information about the 1996 State Assessment in mathematics. It provides a description of the design for the State Assessment and gives an overview of the steps involved in the implementation of the program from the planning stages through to the analysis and reporting of the data. The report describes in detail the development of the cognitive and background questions, the field procedures, the creation of the database and data products for analysis, and the methods and procedures used for sampling, analysis, and reporting. It does not provide the results of the assessment--rather, it provides information on how those results were derived.

This report is one of several documents that provide technical information about the 1996 State Assessment. For those interested in performing their own analyses of the data, this report and the user guide for the secondary-use data should be used as primary sources of information about NAEP. Information for lay audiences is provided in the procedural appendices to the mathematics subject-area reports; theoretical information about the models and procedures used in NAEP can be found in the special NAEP-related issue of the Journal of Educational Statistics (Summer 1992/Volume 17, Number 2).

Under a cooperative agreement with the National Center for Education Statistics (NCES), Educational Testing Service (ETS) was responsible for the development, analysis, and reporting of the 1996 NAEP programs, including the State Assessment. ETS was responsible for overall management of the programs as well as for development of the overall design, the items and questionnaires, data analysis, and reporting. National Computer Systems (NCS) was a subcontractor to ETS on both the national and State NAEP programs. NCS was responsible for printing, distribution, and receipt of all assessment materials, and for data processing, scanning, and professional scoring. All aspects of sampling and field operations for both the national and State Assessments were the responsibility of Westat, Inc. NCES contracted directly with Westat for these services for the national and state assessments.

This technical report provides information about the technical bases for a series of reports that have been prepared for the 1996 State Assessment program in mathematics. They include:

  • A State Report for each participating jurisdiction that describes the mathematics scale scores of the fourth- and eighth-grade public- and nonpublic-school students in that jurisdiction and relates their scale scores to contextual information about mathematics policies and instruction.

  • The NAEP 1996 Mathematics Report Card for the Nation and the States, which provides both public- and nonpublic-school data for major NAEP reporting subgroups for all of the jurisdictions that participated in the State Assessment program, as well as selected results from the 1996 national mathematics assessment.

  • The Cross-State Data Compendium from the NAEP 1996 Mathematics Assessment, which includes jurisdiction-level results for all the demographic, instructional, and experiential background variables included in the Mathematics Report Card and State Report.

  • Two Data Almanacs for each jurisdiction, one for grade 4 and one for grade 8, distributed only in electronic form, that contain a detailed breakdown of the mathematics scale-score data according to the responses to the student, teacher, and school questionnaires for the public school, nonpublic school, and combined populations as a whole and for important subgroups of the public-school population. There are six sections to each almanac:

    • The Distribution Data Section provides information about the percentages of students at or above the three composite scale achievement levels (and below basic). For the composite scale and each mathematics content strand scale,[2] this almanac also provides selected percentiles for the public school, nonpublic school, and combined populations and for the standard demographic subgroups of the public-school population. Mathematics was previously assessed in 1990 and 1992 for grade 8 and in 1992 for grade 4 in the State Assessment program. For items that are common to 1990 and/or 1992, trend results are presented, as applicable.

    • The Student Questionnaire Section provides a breakdown of the composite scale score data according to the students’ responses to questions in the three student questionnaires included in the assessment booklets.

    • The Teacher Questionnaire Section provides a breakdown of the composite scale score data according to the teachers’ responses to questions in the mathematics teacher questionnaire.

    • The School Questionnaire Section provides a breakdown of the composite scale score data according to the principals’ (or other administrators’) responses to questions in the school characteristics and policies questionnaire.

    • The Scale Section provides a breakdown of selected items from the questionnaires according to each of the scales measuring mathematics content strands in the assessment.

    • The Mathematics Item Section provides the response data for each mathematics item in the assessment.

The state reports and the Mathematics Report Card will be available on the World Wide Web as they are publicly released; the almanacs will be placed on the web about a month after they are released on CD-ROM.


  1. Nancy L. Allen is the Director of Data Analysis and Scaling, NAEP Research, Educational Testing Service. John Mazzeo is the Director of NAEP Reporting, Educational Testing Service.

  2. Scales were created for five content strands: number sense, properties, and operations; measurement; geometry and spatial sense; data analysis, statistics, and probability; and algebra and functions.


NCES 97-951