January 1998
In April 1988, Congress reauthorized the National Assessment of Educational Progress (NAEP) and added a new dimension to the program -- voluntary state-by-state assessments on a trial basis in 1990 and 1992, in addition to continuing the national assessments that NAEP had conducted since its inception. Congress subsequently authorized a third Trial State Assessment for administration in 1994. The word "trial" in Trial State Assessment refers to the Congressionally mandated trial to determine whether such assessments can yield valid, reliable, state-representative data. Enough experience had been gained for Congress to authorize State Assessments, rather than Trial State Assessments, to be conducted in 1996. In this report, we refer to the voluntary state-by-state assessment program as the State Assessment program. The State Assessment program, which is designed to provide representative data on achievement for participating jurisdictions, is distinct from the assessment designed to provide nationally representative data, referred to in this report as the national assessment. (This terminology is also used in all other reports of the 1996 assessment results.) All instruments and procedures used in the 1990, 1992, 1994, and 1996 state and national assessments were piloted in field tests conducted in the year prior to each assessment.
The 1990 Trial State Assessment program collected information on the mathematics knowledge, skills, understanding, and perceptions of a representative sample of eighth-grade students in public schools in 37 states, the District of Columbia, and two territories. The second phase of the Trial State Assessment program, conducted in 1992, collected information on the mathematics knowledge, skills, understanding, and perceptions of a representative sample of fourth- and eighth-grade students and the reading skills and understanding of a representative sample of fourth-grade students in public schools in 41 states, the District of Columbia, and two territories.
The 1994 Trial State Assessment program once again assessed the reading skills and understanding of representative samples of fourth-grade students, this time in 44 participating jurisdictions. The 1994 program broke new ground in two ways. First, the 1994 NAEP authorization called for the assessment of samples of both public- and nonpublic-school students. Thus, for the first time in NAEP, jurisdiction-level samples of students from Catholic schools, other religious schools, private schools, Department of Defense Domestic Dependent Elementary and Secondary Schools (DDESS), and Bureau of Indian Affairs (BIA) schools were added to the Trial State Assessment program. Second, samples of students from Department of Defense Dependents Schools (DoDDS), operated by the Office of Dependents Education, participated as a jurisdiction, along with the states and territories that have traditionally had the opportunity to participate in the Trial State Assessment program.
The 1996 State Assessment program, described in this report, collected information on the science knowledge, skills, understanding, and perceptions of a representative sample of eighth-grade students in the jurisdictions shown in Table 1-1; Department of Defense Education Activity (DoDEA) school students were assessed at both grades 4 and 8. The grade 4 assessment of DoDEA students was a special assessment supported by the National Center for Education Statistics (NCES). In addition, grade 4 and grade 8 students were assessed for a third time in mathematics (see the Technical Report of the NAEP 1996 State Assessment Program in Mathematics, Allen, Jenkins, Kulick, & Zelenak, 1996).
A special feature of the 1996 State Assessments was the introduction of new rules for student inclusion in NAEP assessments. Half of the schools selected for participation in the 1996 assessment used the old inclusion rules to determine whether students should be included in the assessment and the other half used the new inclusion rules. In addition to the two groups of schools using the old and new inclusion rules without offering students special testing accommodations, the 1996 national assessment included a third group of schools that used the new inclusion rules and offered students within those schools accommodations to the standard NAEP administration procedures.
The accommodations provided by NAEP in the national assessments were meant to match those specified in the student's individualized education plan (IEP) or those ordinarily provided in the classroom for testing situations. The most common accommodation was extended time. In the State Assessment, no special accommodations were offered.
The new inclusion rules are applied only when a student has been categorized in his or her IEP as a student with disabilities (SD) or as a student with limited English proficiency (LEP); all other students are asked to participate in the assessment. For this reason, the sample selected for most science analysis and reporting purposes consisted of two groups: students who were not categorized as SD or LEP, drawn from all schools regardless of which set of inclusion rules was used, and students who were categorized as SD or LEP, drawn only from the schools using the new inclusion rules. The advantage of this reporting sample is that it makes use of most of the data from the assessment and begins a science trend line for the State Assessment program under the new inclusion rules.
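To make the selection rule concrete, the following is a minimal sketch of how such a reporting sample might be assembled from student records. It is illustrative only: the record fields (`sd_or_lep`, `school_uses_new_rules`) and the function name are hypothetical and are not part of any NAEP data product.

```python
from dataclasses import dataclass

@dataclass
class StudentRecord:
    """Hypothetical assessed-student record; field names are illustrative only."""
    student_id: str
    sd_or_lep: bool              # categorized as SD or LEP in the student's IEP
    school_uses_new_rules: bool  # school assigned to the new inclusion rules

def in_reporting_sample(s: StudentRecord) -> bool:
    """Reporting-sample rule described above: keep every non-SD/LEP student
    (from schools under either set of rules), and keep SD/LEP students only
    when their school used the new inclusion rules."""
    return (not s.sd_or_lep) or s.school_uses_new_rules

# Example: filter a list of records down to the reporting sample.
records = [
    StudentRecord("a", sd_or_lep=False, school_uses_new_rules=False),  # kept
    StudentRecord("b", sd_or_lep=True,  school_uses_new_rules=True),   # kept
    StudentRecord("c", sd_or_lep=True,  school_uses_new_rules=False),  # excluded
]
reporting_sample = [s for s in records if in_reporting_sample(s)]
```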
Special analyses of the national science and mathematics assessment data that compared the old and new inclusion rules and examined the effect of offering testing accommodations indicated little difference in the proportions of students included under the two sets of rules. More students were included in the assessment when accommodations were offered; however, a portion of the students who would have participated under standard conditions were instead assessed with accommodations when accommodations were available. As a result, fewer students were assessed under standard conditions when accommodations were offered.
Table 1-1 lists the jurisdictions that participated in the 1996 State Assessment program. Over 125,000 students participated in the 1996 State Assessment in science in the jurisdictions shown. Students were administered the same assessment booklets that were used in NAEP's 1996 national science assessment.
The 1996 NAEP science framework and assessment specifications were developed for NAEP through a consensus project conducted by the Council of Chief State School Officers (CCSSO) under funding from the National Assessment Governing Board (NAGB). During this development process, input and reactions were continually sought from a wide range of educators and professionals both within and outside the field of science. For grade 8, the assessment provides the first opportunity to report jurisdiction-level data from a NAEP science instrument for the states and territories that participated in the 1996 State Assessment program. In addition, questionnaires completed by the students, their science teachers, and their principals or other school administrators provided an abundance of contextual data within which to interpret the science results.
Table 1-1. Jurisdictions Participating in the 1996 State Assessment Program in Science

Alabama | Georgia | Mississippi | Rhode Island |
Alaska | Guam | Missouri | South Carolina |
Arizona | Hawaii | Montana | Tennessee |
Arkansas | Indiana | Nebraska | Texas |
California | Iowa | Nevada | Utah |
Colorado | Kentucky | New Hampshire | Vermont |
Connecticut | Louisiana | New Jersey | Virginia |
Delaware | Maine | New Mexico | Washington |
DoDEA/DDESS | Maryland | New York | West Virginia |
DoDEA/DoDDS | Massachusetts | North Carolina | Wisconsin |
District of Columbia | Michigan | North Dakota | Wyoming |
Florida | Minnesota | Oregon |
The purpose of this report is to provide technical information about the 1996 State Assessment in science. It provides a description of the design for the State Assessment and gives an overview of the steps involved in the implementation of the program from the planning stages through to the analysis and reporting of the data. As stated previously, the 1996 State Assessment in science was conducted at grade 8 only, although, as part of a special assessment, DoDEA students in grade 4 were also assessed. The report describes in detail the development of the cognitive and background questions, the field procedures, the creation of the database and data products for analysis, and the methods and procedures used for sampling, analysis, and reporting. It does not provide the results of the assessment -- rather, it provides information on how those results were derived.
This report is one of several documents that provide technical information about the 1996 State Assessment. For those interested in performing their own analyses of the data, this report and the user guide for the secondary-use data should be used as primary sources of information about NAEP (O'Reilly, Zelenak, Rogers, & Kline, 1997). Information for lay audiences is provided in the procedural appendices to the science subject-area reports; theoretical information about the models and procedures used in NAEP can be found in the special NAEP-related issue of the Journal of Educational Statistics (Summer 1992/Volume 17, Number 2) and in previous national technical reports. Further, the Science Framework for the 1996 National Assessment of Educational Progress includes a discussion of the processes and specifications by which the framework was developed (National Assessment Governing Board, 1993). For more information about the science assessment and the characteristics of the items in the assessment, see The NAEP 1996 Technical Report (Allen, Carlson, & Zelenak, 1998).
Under a cooperative agreement with NCES, Educational Testing Service (ETS) was responsible for the development, analysis, and reporting of the 1996 NAEP programs, including the State Assessment. ETS was responsible for overall management of the programs as well as for development of the overall design, the items and questionnaires, data analysis, and reporting. National Computer Systems (NCS) was a subcontractor to ETS on both the national and State NAEP programs. NCS was responsible for printing, distribution, and receipt of all assessment materials, and for data processing, scanning, and professional scoring. All aspects of sampling and field operations for both the national and State Assessments were the responsibility of Westat, Inc.; NCES contracted directly with Westat for these services.
This technical report provides information about the technical bases for a series of reports that have been prepared for the 1996 State Assessment program in science, including the individual state reports, the Science Report Card, and the data almanacs.
The state reports and the Science Report Card will be available on the World Wide Web as they are publicly released; the almanacs will be placed on the web about a month after they are released on CD-ROM.
NCES 98-480
Suggested Citation
Allen, N. L., Swinton, S. S., Isham, S. P., & Zelenak, C. A. (1997). Technical report of the NAEP 1996 state assessment program in science. Washington, DC: National Center for Education Statistics.