Office for Civil Rights Survey Redesign: A Feasibility Survey
NCES 92-130
September 1992

Survey Methodology and Data Reliability

Sample Selection

A stratified sample of 843 districts was drawn from the 1989-90 list of public school districts compiled by the National Center for Education Statistics (NCES). This file contains over 16,000 listings and is part of the NCES Common Core of Data (CCD) School Universe. Local school districts in outlying territories, as well as supervisory union administrative centers, regional service agencies, and state- or federally operated institutions providing services to special needs populations, were excluded from the frame prior to sampling. With these exclusions, the final sampling frame consisted of approximately 15,400 eligible districts. The districts were stratified by size of district (in terms of total enrollment), metropolitan status, and region. Districts were sampled at rates that depended on the size and metropolitan status of the district. These rates were obtained by initially allocating the sample to strata in proportion to the aggregate square root of enrollment of the districts in each stratum, and then adjusting the rates for urban districts to increase their representation in the sample.
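The square-root allocation described above can be sketched in a few lines. This is a hypothetical illustration only: the stratum labels, enrollments, and the simple proportional rounding are invented for clarity, and the actual survey applied further adjustments for urban districts.

```python
import math

# Hypothetical strata with invented district enrollments; the real frame
# had ~15,400 districts stratified by size, metropolitan status, and region.
strata = {
    "urban":    [12000, 8500, 22000],
    "suburban": [3000, 4200, 5100, 2600],
    "rural":    [400, 650, 900, 300, 550],
}

total_sample = 843  # total districts to sample, from the survey design

# Aggregate square root of enrollment within each stratum
agg_sqrt = {s: sum(math.sqrt(e) for e in d) for s, d in strata.items()}
grand_total = sum(agg_sqrt.values())

# Allocate the sample to strata in proportion to aggregate sqrt(enrollment)
# (before any adjustment to oversample urban districts)
allocation = {s: round(total_sample * v / grand_total)
              for s, v in agg_sqrt.items()}
```

Square-root allocation is a common compromise between equal allocation and allocation proportional to enrollment; it keeps large districts from dominating the sample while still sampling them at higher rates.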

Response Rates

In late September 1991, questionnaires (see Appendix B) were mailed to superintendents of the 843 districts in the sample. Superintendents were asked to have the questionnaire completed by the person most knowledgeable about reporting civil rights information. Two of the districts were found to be out of scope (because of closings), leaving 841 districts in the sample. Telephone followup of nonrespondents was initiated in late October, and data collection was completed by the end of November. For the eligible districts that received surveys, a response rate of 96 percent (809 responding districts divided by the 841 eligible districts in the sample) was obtained (see Table A). Item nonresponse ranged from 0.0 percent to 2.0 percent.

Sampling and Nonsampling Errors

The response data were weighted to produce national estimates. The weights were designed to adjust for the variable probabilities of selection and differential nonresponse. A final post-stratification adjustment was made so that the weighted district counts equaled the corresponding CCD frame counts within cells defined by district size, metropolitan status, and region. The findings in this report are estimates based on the sample selected and, consequently, are subject to sampling variability.
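The post-stratification step described above can be sketched as a simple ratio adjustment. The cell labels and counts below are invented for illustration; the actual cells were defined by district size, metropolitan status, and region, and the adjustment followed nonresponse weighting.

```python
# Weighted counts of responding districts per cell (after base weighting
# and nonresponse adjustment) -- hypothetical values
weighted_counts = {"large_urban_NE": 95.0, "small_rural_W": 2100.0}

# Corresponding district counts from the CCD sampling frame -- hypothetical
frame_counts = {"large_urban_NE": 100.0, "small_rural_W": 2000.0}

# Post-stratification factor applied to every district weight in the cell,
# so weighted counts equal the frame counts exactly
ps_factor = {c: frame_counts[c] / weighted_counts[c] for c in weighted_counts}
```

Multiplying each district's weight by its cell's factor forces the weighted totals to agree with the known frame counts, which reduces coverage and nonresponse bias for characteristics correlated with the stratification variables.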

The survey estimates are also subject to nonsampling errors that can arise because of nonobservation (nonresponse or noncoverage) errors, errors of reporting, and errors made in collection of the data. These errors can sometimes bias the data. Nonsampling errors may include such problems as the differences in the respondents' interpretation of the meaning of the questions; memory effects; misrecording of responses; incorrect editing, coding, and data entry; differences related to the particular time the survey was conducted; or errors in data preparation. While general sampling theory can be used in part to determine how to estimate the sampling variability of a statistic, nonsampling errors are not easy to measure and, for measurement purposes, usually require that an experiment be conducted as part of the data collection procedures or that data external to the study be used.

To minimize the potential for nonsampling errors, the questionnaire was pretested with administrators like those who completed the survey. During the design of the survey and the survey pretest, an effort was made to check for consistency of interpretation of questions and to eliminate ambiguous items. The questionnaire and instructions were extensively reviewed by the National Center for Education Statistics, and the Office for Civil Rights in the U.S. Department of Education. Manual and machine editing of the questionnaires were conducted to check the data for accuracy and consistency. Cases with missing or inconsistent items were recontacted by telephone. Imputations for item nonresponse were not implemented, as item nonresponse rates were less than 5 percent (for nearly all items, nonresponse rates were less than 1 percent). Data were keyed with 100 percent verification.

Variances

The standard error is a measure of the variability of estimates due to sampling. It indicates the variability of a sample estimate that would be obtained from all possible samples of a given design and size. Standard errors are used as a measure of the precision expected from a particular sample. If all possible samples were surveyed under similar conditions, intervals of 1.96 standard errors below to 1.96 standard errors above a particular statistic would include the true population parameter being estimated in about 95 percent of the samples. This is a 95 percent confidence interval. For example, the estimated percentage of districts that chose a paper questionnaire as one of their preferred methods for providing data reported on the OCR Elementary and Secondary School Civil Rights Survey is 66 percent, and the estimated standard error is 2.3 percent. The 95 percent confidence interval for the statistic extends from 66 - (2.3 times 1.96) to 66 + (2.3 times 1.96), or from approximately 61 to 70 percent.
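The arithmetic behind this interval is straightforward; the sketch below reproduces it using the point estimate and standard error quoted in the text.

```python
# Confidence-interval arithmetic for the example in the text
estimate = 66.0  # percent preferring a paper questionnaire
se = 2.3         # estimated standard error, in percentage points
z = 1.96         # normal critical value for 95 percent confidence

lower = estimate - z * se  # 66 - 4.508
upper = estimate + z * se  # 66 + 4.508
```

Rounded to whole percentage points, the interval covers roughly 61 to 70 percent, matching the figures reported above.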

Estimates of standard errors were computed using a technique known as jackknife replication. As with any replication method, jackknife replication involves constructing a number of subsamples (replicates) from the full sample and computing the statistic of interest for each replicate. The mean square error of the replicate estimates around the full sample estimate provides an estimate of the variance of the statistic (see Wolter, 1985, Chapter 4). To construct the replications, 30 stratified subsamples of the full sample were created and then dropped one at a time to define 30 jackknife replicates (see Wolter, 1985, page 183). A proprietary computer program (WESVAR), available at Westat, Inc., was used to calculate the estimates of standard errors. The software runs under IBM/OS and VAX/VMS systems.
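The drop-one-group jackknife described above can be illustrated with a minimal sketch. The data and group count below are invented (the survey used 30 stratified subsamples and the proprietary WESVAR program); the formula is the standard scaled mean square of replicate estimates around the full-sample estimate.

```python
# Invented observations; the statistic of interest here is the mean
values = [60.0, 72.0, 58.0, 75.0, 65.0, 63.0, 70.0, 67.0, 61.0, 69.0]
G = 5  # number of jackknife groups (the survey used 30)

# Assign observations to groups systematically: 0, 1, ..., G-1, 0, 1, ...
groups = [i % G for i in range(len(values))]

full_estimate = sum(values) / len(values)

# Replicate estimates: recompute the statistic with one group dropped
replicates = []
for g in range(G):
    kept = [v for v, grp in zip(values, groups) if grp != g]
    replicates.append(sum(kept) / len(kept))

# Jackknife variance: scaled mean square of replicates around full estimate
variance = (G - 1) / G * sum((r - full_estimate) ** 2 for r in replicates)
std_error = variance ** 0.5
```

In a stratified design the replicate weights would also carry the nonresponse and post-stratification adjustments, which is what replication software such as WESVAR automates.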

Background Information

The survey was performed under contract with Westat, Inc., using the Fast Response Survey System (FRSS). FRSS was established in 1975 by NCES. It was designed to collect small amounts of issue-oriented data quickly and with minimum burden on respondents. Over 40 surveys have been conducted through FRSS. Recent FRSS reports (available through the Government Printing Office) include the following:

  • Public School District Survey on Safe, Disciplined, and Drug-Free Schools, E.D. TABS (NCES 92-008).
  • Public School Principal Survey on Safe, Disciplined, and Drug-Free Schools, E.D. TABS (NCES 92-007).
  • Teacher Survey on Safe, Disciplined, and Drug-Free Schools, E.D. TABS (NCES 91-091).
  • College-Level Remedial Education in the Fall of 1989 (NCES 91-191).
  • Services and Resources for Children in Public Libraries, 1988-89 (NCES 90-098).
  • Use of Educational Research and Development Resources by Public School Districts (NCES 90-084).

Westat's Project Director was Elizabeth Farris, and the Survey Manager was Wendy Mansfield. Judi Carpenter was the NCES Project Officer. The data requester was Sharon Tuchman, Office for Civil Rights.

The report was reviewed by David Hunt, Assistant Superintendent, Rochester City School District, New York; and Edward B. Penry, Director of Student Information Management, School District of Philadelphia, Pennsylvania. Within NCES, report reviewers were Susan Broyles, Postsecondary Education Statistics Division; John J. Mathews, Education Assessment Division; and Edie McArthur, Data Development Division.

For more information about the Fast Response Survey System or the Office for Civil Rights Feasibility Survey, contact Judi Carpenter, Elementary/Secondary Education Statistics Division, Special Surveys and Analysis Branch, Office of Educational Research and Improvement, National Center for Education Statistics, 555 New Jersey Avenue NW, Washington, DC 20208-5651, telephone (202) 219-1333.

References

Westat, Inc. 1989. The WESVAR Procedures. Rockville, MD: Westat, Inc.

Wolter, K. 1985. Introduction to Variance Estimation. New York: Springer-Verlag.

Definitions

Common Core of Data Public Education Agency Universe - A data tape containing 16,987 records, one for each public elementary and secondary education agency in the 50 states, District of Columbia, and 5 outlying areas, as reported to the National Center for Education Statistics by the state education agencies for 1989-90. Records on this file contain the state and federal identification numbers, name, address, and telephone number of the agency, county name and FIPS code, agency type code, student counts, graduates and other completers counts, and other codes for selected characteristics of the agency.

Disciplinary actions - Corporal punishment, in-school suspensions, out-of-school suspensions, and expulsions (definitions of these actions were not provided on the questionnaire; interpretation was left to the respondents who are familiar with these actions).

Metropolitan status

Urban - Primarily serves a central city of a Metropolitan Statistical Area (MSA).
Suburban - Serves an MSA, but not primarily its central city.
Rural - Does not serve an MSA.

Region

Northeast region - Connecticut, Delaware, District of Columbia, Maine, Maryland, Massachusetts, New Hampshire, New Jersey, New York, Pennsylvania, Rhode Island, and Vermont.  
Central region - Illinois, Indiana, Iowa, Kansas, Michigan, Minnesota, Missouri, Nebraska, North Dakota, Ohio, South Dakota, and Wisconsin. 
Southeast region - Alabama, Arkansas, Florida, Georgia, Kentucky, Louisiana, Mississippi, North Carolina, South Carolina, Tennessee, Virginia, and West Virginia. 
West region - Alaska, Arizona, California, Colorado, Hawaii, Idaho, Montana, Nevada, New Mexico, Oklahoma, Oregon, Texas, Utah, Washington, and Wyoming.

Special academic programs - Magnet, gifted and talented, advanced placement, and honors programs (definitions of these programs were not provided on the questionnaire; interpretation was left to the respondents who are familiar with these programs).
