
Schools and Staffing Survey (SASS)

5. Data Quality and Comparability

Sampling Error

The estimators of sampling variances for SASS statistics take the SASS complex sample design into account. For an overview of the calculation of sampling errors, see the Quality Profile reports (Jabine 1994; Kalton et al. 2000).

Direct Variance Estimators. The balanced half-sample replication (BHR) method, also called balanced repeated replication (BRR), was used to estimate the sampling errors associated with estimates from the 1987–88 and 1990–91 SASS. Given the replicate weights, the statistic of interest (e.g., the number of 12th grade teachers from the School Survey) can be estimated from the full sample and from each replicate. The mean square error of the replicate estimates around the full sample estimate provides an estimate of the variance of the statistic.
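The mean-square-error computation described above can be sketched as follows; the full-sample and replicate estimates here are invented for illustration, not actual SASS values:

```python
import numpy as np

# Hypothetical full-sample estimate of a total (e.g., the number of
# 12th-grade teachers) and four replicate estimates of the same total.
full_estimate = 100_000.0
replicate_estimates = np.array([101_200.0, 98_400.0, 99_900.0, 100_700.0])

# BRR variance: the mean squared deviation of the replicate estimates
# around the full-sample estimate.
variance = np.mean((replicate_estimates - full_estimate) ** 2)
standard_error = np.sqrt(variance)
```

In production use, each replicate estimate would be recomputed from the full data file using that replicate's weights; only the final averaging step is shown here.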

A bootstrap variance estimator was used for the 1993–94, 1999–2000, 2003–04, 2007–08, and 2011–12 SASS. The bootstrap variance reflects the increase in precision due to large sampling rates because the bootstrap is done systematically without replacement, as was the original sampling. Bootstrap samples can be selected from the bootstrap frame, replicate weights computed, and variances estimated with standard BHR software. The bootstrap replicate basic weights (inverse of the probability of selection) were subsequently reweighted. More information on the bootstrap variance methodology and how it applies to SASS is contained in the following sources: “A Bootstrap Variance Estimator for Systematic PPS Sampling” (U.S. Department of Education 2000), which describes the methodology used in the 1999–2000 SASS; “A Bootstrap Variance Estimator for the Schools and Staffing Survey” (U.S. Department of Education 1994); “Balanced Half-Sample Replication With Aggregation Units” (U.S. Department of Education 1994); “Comparing Three Bootstrap Methods for Survey Data” (Sitter 1990); “Properties of the Schools and Staffing Survey Bootstrap Variance Estimator” (U.S. Department of Education 1996); and “The Jackknife, the Bootstrap and Other Resampling Plans” (Efron 1982).
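As a rough sketch of the replicate-weight idea, the following uses the simpler Rao-Wu with-replacement rescaling within a single stratum, not the systematic without-replacement scheme SASS actually uses, and all of the weights and data values are made up:

```python
import numpy as np

rng = np.random.default_rng(12345)

# One illustrative stratum of n sampled units with base weights
# (inverse selection probabilities) and an analysis variable y.
base_weights = np.array([10.0, 12.0, 9.0, 11.0, 10.0, 8.0])
y = np.array([30.0, 25.0, 40.0, 35.0, 28.0, 33.0])
n = len(y)
full_total = np.sum(base_weights * y)

# Rao-Wu rescaled bootstrap: draw n-1 units with replacement, count
# how often each unit is hit, and rescale its weight accordingly.
B = 200
reps = np.empty(B)
for b in range(B):
    draws = rng.integers(0, n, size=n - 1)
    m = np.bincount(draws, minlength=n)      # selection counts per unit
    rep_weights = base_weights * (n / (n - 1)) * m
    reps[b] = np.sum(rep_weights * y)

# As with BRR, the variance is the mean squared deviation of the
# replicate estimates around the full-sample estimate.
boot_variance = np.mean((reps - full_total) ** 2)
```

Once the replicate weights are attached to the data file, the variance step is identical to the BRR computation, which is why standard BHR software can process bootstrap replicates.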

SASS variances can be calculated using the replicates of the full sample that are available in the data files with software such as WesVarPC. For examples of other software that support BRR, see Introduction to Variance Estimation (Wolter 1985).

Average Design Effects. Design effects (Deffs) measure the impact of the complex sample design on the accuracy of a sample estimate, in comparison to the alternative simple random sample design. For the 1990–91 SASS, an average design effect was derived for groups of statistics and, within each group, for a set of subpopulations. Standard errors for 1990–91 and 1993–94 SASS statistics of various groups for various subpopulations can then be calculated approximately from the standard errors based on the simple random sample (using SAS or SPSS) in conjunction with the average design effects provided. For example, for the 1990–91 SASS, average design effects for selected variables in the School Survey are 1.60 (public sector) and 1.36 (private sector); in the Principal Survey, 4.40 (public sector) and 4.02 (private sector); and in the Teacher Survey, 3.75 (public sector) and 2.52 (private sector). Examples illustrating the use of SASS average design effect tables are provided in Design Effects and Generalized Variance Functions for the 1990–91 Schools and Staffing Survey (SASS), Volume I, User’s Manual (Salvucci and Weng 1995).
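In practice the adjustment is a single multiplication of the SRS standard error by the square root of the average design effect. For instance, taking the public-sector Teacher Survey design effect of 3.75 cited above and a hypothetical SRS standard error:

```python
import math

# Approximate a complex-design standard error from an SRS standard
# error and a published average design effect.
deff = 3.75      # 1990-91 average Deff, public-sector Teacher Survey
se_srs = 0.8     # hypothetical SRS standard error (e.g., from SAS/SPSS)

se_complex = math.sqrt(deff) * se_srs
```

The resulting standard error is roughly 1.55, nearly double the SRS value, illustrating why ignoring the complex design understates sampling error.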

Generalized Variance Functions (GVFs). GVF tables were developed for use in the calculation of standard errors of totals, averages, and proportions of interest in the 1990–91 SASS components. The 1990–91 GVFs can be used for the 1993–94 SASS because no major design changes were adopted between 1990–91 and 1993–94. Note that the GVF approach, unlike the design effect approach described above, involves no need to calculate the simple random sample variance estimates. Examples illustrating the use of the GVF tables are provided in Design Effects and Generalized Variance Functions for the 1990–91 Schools and Staffing Survey (SASS), Volume I, User’s Manual (Salvucci and Weng 1995).
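A GVF is typically a simple fitted curve relating the size of an estimate to its variance, so a standard error can be read off directly from the estimate. One common two-parameter form can be sketched as follows; the a and b parameters below are hypothetical placeholders, not values from the published SASS tables:

```python
import math

def gvf_standard_error(total, a, b):
    """Standard error of an estimated total from a two-parameter GVF.

    Uses the common functional form SE(X) = sqrt(a*X**2 + b*X), where
    a and b are taken from published GVF tables for the survey and
    domain of interest (the values passed below are illustrative).
    """
    return math.sqrt(a * total ** 2 + b * total)

# Hypothetical parameters and an estimated total of 50,000.
se = gvf_standard_error(50_000, a=-0.000034, b=2_000.0)
```

Because the GVF supplies the standard error as a function of the estimate alone, no SRS variance computation is needed, which is the convenience noted above.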


Nonsampling Error

Coverage Error. SASS surveys are subject to any coverage error present in the CCD and PSS data files, which serve as their principal sampling frames. The report Coverage Evaluation of the 1994–95 Common Core of Data: Public Elementary/Secondary Education Agency Universe Survey (Owens 1997) found that overall coverage in the 1994–95 CCD Local Education Agency Universe Survey was 96.2 percent (in a comparison to state education directories). “Regular” agencies, those traditionally responsible for providing public education, had almost total coverage in the 1994–95 agency universe survey. Most coverage discrepancies were attributed to nontraditional agencies that provide special education, vocational education, and other services. However, there is potential for undercoverage bias associated with the absence of schools built between the time when the sampling frame is constructed and the time of the SASS survey administration. Further research on coverage can be found in Evaluating the Coverage of the U.S. National Center for Education Statistics’ Public Elementary/Secondary School Frame (Hamann 2000) and Evaluating the Coverage of the U.S. National Center for Education Statistics’ Public and Private School Frames Using Data from the National Assessment of Educational Progress (Lee, Burke, and Rust 2000).

A capture-recapture methodology was used to estimate the number of private schools in the United States and to estimate the coverage of private schools in the 1999–2000 PSS; the study found that the PSS school coverage rate is equal to 97 percent. (See CCD and PSS chapters for a more thorough discussion.)
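The capture-recapture logic can be illustrated with the two-list Lincoln-Petersen estimator; all counts below are invented for illustration and are not the actual PSS figures:

```python
# Two-list capture-recapture (Lincoln-Petersen) estimate of the total
# number of private schools; all counts are hypothetical.
n1 = 27_000   # schools on the first list (e.g., the PSS list frame)
n2 = 1_800    # schools found by an independent second search
m = 1_746     # schools appearing on both lists

# Lincoln-Petersen: N-hat = n1 * n2 / m
estimated_total = n1 * n2 / m

# Frame coverage: the share of the estimated population that the
# first list captures (algebraically equal to m / n2 here).
coverage_rate = n1 / estimated_total
```

With these invented counts the coverage rate works out to 97 percent, matching the order of magnitude of the published PSS result.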

Nonresponse Error.
Unit Nonresponse. The weighted unit response rates for public schools have been higher than the weighted unit response rates for private schools in all six rounds of SASS. (See table SASS-1 for response rates from selected years.) For more information on the analysis of nonresponse rates, refer to An Analysis of Total Nonresponse in the 1993–94 Schools and Staffing Survey (SASS) (Monaco et al. 1997) and An Exploratory Analysis of Response Rates in the 1990–91 Schools and Staffing Survey (SASS) (Scheuren et al. 1996).
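A weighted unit response rate is simply the base-weighted share of eligible sampled units that responded; a minimal sketch with invented weights and response flags:

```python
import numpy as np

# Hypothetical base weights for six eligible sampled schools and a
# flag marking which ones completed the interview.
base_weights = np.array([120.0, 95.0, 110.0, 80.0, 130.0, 105.0])
responded = np.array([1, 1, 0, 1, 1, 0])   # 1 = completed interview

# Weighted unit response rate: responding weight over eligible weight.
rate = np.sum(base_weights * responded) / np.sum(base_weights)
```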

Item Nonresponse. For the 2011–12 SASS, the percentage of items with weighted response rates at or above 85 percent, by individual survey, was: 100 percent for public school districts; 96 percent for public schools; 94 percent for private schools; 99 percent for public school principals; 98 percent for private school principals; 94 percent for public school teachers; and 93 percent for private school teachers. Only the private school teacher survey had items with response rates below 70 percent.

Measurement Error. Results reported in An Analysis of Total Nonresponse in the 1993–94 Schools and Staffing Survey (SASS) (Monaco et al. 1997) support the contention that, without follow-up to mail surveys, nonresponse error would be much greater than it is and that the validity and reliability of the data would be considerably reduced. However, because of the substantial amount of telephone follow-up, there is concern about possible bias due to differences in the mode of survey collection. Other possible sources of measurement error include long, complex instructions that respondents either do not read or do not understand, navigation problems related to the format of the questionnaires, and definitional and classification problems. See also Measurement Error Studies at the National Center for Education Statistics (Salvucci et al. 1997).

Several NCES working papers also address measurement error. Reports on the 1993–94 SASS include Cognitive Research on the Teacher Listing Form for the Schools and Staffing Survey (Jenkins and Von Thurn 1996); Further Cognitive Research on the Schools and Staffing Survey (SASS) (Zukerberg and Lee 1997); Report of Cognitive Research on the Public and Private School Teacher Questionnaires for the Schools and Staffing Survey 1993–94 School Year (Jenkins 1997); and Response Variance in the 1993–94 Schools and Staffing Survey: A Reinterview Report (Bushery, Schreiner, and Sebron 1998). Reports on the 1991–92 SASS include the 1991 Schools and Staffing Survey (SASS) Reinterview Response Variance Report (Royce 1994) and The Results of the 1991–92 Teacher Follow-up Survey (TFS) Reinterview and Extensive Reconciliation (Jenkins and Wetzel 1995).

Table SASS-1. Summary of weighted unit response rates for selected SASS questionnaires: 1993–94 through 2011–12

Questionnaire              1993–94  1999–2000  2003–04  2007–08  2011–12
School District Survey        93.9       88.6     82.9     87.8     80.6
Public Principal Survey       96.6       90.0     82.2     79.4     72.7
Public School Survey          92.3       88.5     80.8     80.4     72.5
Public Teacher Survey1        88.2       92.2     89.2     86.2     79.6
Private Principal Survey      87.6       84.8     74.9     72.2     64.7
Private School Survey         83.2       79.8     75.9     75.9     65.7
Private Teacher Survey1       80.2       87.0     85.4     85.1     71.6
BIE Principal Survey          98.7       93.3     90.7     79.2        —
BIE School Survey             99.3       96.7     89.5     77.1        —
BIE Teacher Survey            86.5       97.8     93.8     87.3        —

—Not available. Data were not collected.
1The overall teacher response rates are the percentage of teachers responding in schools that provided teacher lists for sampling.
SOURCE: SASS methodology reports; available at