The SASS underwent a content redesign in preparation for the 1999–2000 round. The redesigned questionnaires continued to measure the same five major policy issues: teacher shortage and demand; characteristics of elementary and secondary teachers; teacher workplace conditions; characteristics of school principals; and school programs and policies. New coverage included school and district performance reports, standards for home-schooled students, charter schools, migrant students, and the availability and use of computers and the Internet. The 1999–2000 SASS discontinued the student records survey and the library media specialist survey but retained the library media center survey.
The sampling frame for the public school sample was the 1997–98 Common Core of Data (CCD) school file. CCD is a universe file that includes all elementary and secondary schools in the United States. Schools operated by the Department of Defense, as well as schools offering only kindergarten, prekindergarten, or adult education, were excluded from the SASS sample.
The list frame used the 1997–98 Private School Universe Survey (PSS) list, updated with association lists. An area frame supplement was based on the canvassing of private schools within specific geographical areas.
A separate universe of schools funded by the Bureau of Indian Affairs (BIA) was drawn from the Program Education Directory maintained by the BIA. To avoid duplication, BIA-funded schools that also appeared in the CCD school file were treated as public schools.
The sample design for the School Survey met the objectives for SASS and took into consideration the response burden for schools. The main design objective of the School Survey was to provide estimates of school characteristics by the following key analytical domains: the nation; elementary and secondary levels by public and private sectors; BIA schools and schools with a student population which is at least 25 percent American Indian or Alaska Native; school levels of public schools by state; and private schools by association group, region and school level.
Another objective was to balance the requirements of the samples in SASS. The 1999–2000 SASS sampled schools first and LEAs afterward. To obtain a suitable teacher sample, schools were selected with a probability proportionate to the square root of the number of teachers. Teachers within schools were then sampled at a rate that made the overall selection probability approximately constant within strata, subject to the constraints of sampling at least one and no more than 20 teachers per school. The SASS sample design also sought to control sample overlap between SASS and other NCES school surveys.
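The relationship between school selection probability and the within-school teacher sampling rate can be sketched as follows. This is an illustrative model only: the function names, the stratum data, and the target overall rate are invented, not taken from the survey documentation.

```python
import math

def school_selection_prob(n_teachers, stratum_sqrt_total, sample_size):
    """Probability proportional to the square root of the teacher count."""
    return min(1.0, sample_size * math.sqrt(n_teachers) / stratum_sqrt_total)

def within_school_rate(n_teachers, school_prob, overall_rate):
    """Choose a within-school rate so that the overall selection probability
    (school prob x within-school rate) is approximately constant, subject to
    sampling at least 1 and at most 20 teachers per school."""
    n_to_sample = overall_rate / school_prob * n_teachers
    n_to_sample = max(1, min(20, round(n_to_sample)))
    return n_to_sample / n_teachers

# Illustrative stratum: schools with varying teacher counts
schools = [12, 30, 55, 80]
sqrt_total = sum(math.sqrt(t) for t in schools)
sample_size = 2          # schools drawn from this stratum (illustrative)
overall_rate = 0.05      # target overall teacher selection probability

for t in schools:
    p_school = school_selection_prob(t, sqrt_total, sample_size)
    rate = within_school_rate(t, p_school, overall_rate)
    # the product p_school * rate stays roughly constant across schools
    print(t, round(p_school, 3), round(rate, 3), round(p_school * rate, 3))
```

Sampling schools proportional to the square root of size is a compromise: it keeps teacher-heavy schools from dominating the school sample while still yielding enough teachers per selected school for the teacher survey.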
The U.S. Census Bureau performed the data collection and began by sending advance letters to the sampled LEAs and schools in August and September, respectively. School questionnaires were mailed in October and a reminder postcard was sent several weeks later. Follow-up for nonresponding teachers was conducted using Computer-Assisted Telephone Interviewing (CATI).
The U.S. Census Bureau performed the data processing. Each questionnaire was coded according to its status: for example, whether it contained a completed interview, the respondent refused to complete it, a school district had merged with another district, or a school had closed. The next step was a preliminary determination of each case's interview status: interview, noninterview, or out-of-scope (for example, if a sampled school had closed). A computer pre-edit program generated a list of cases with problems, as defined by the edit specifications for each survey. After pre-edit corrections were made, each file was subjected to another computer edit consisting of a range check, a consistency edit, and a blanking edit.
After the completion of the range check, consistency edit, and blanking edit, the records were put through another edit. That edit made a final determination of whether the case was eligible for the survey and, if so, whether there were sufficient data for the case to be classified as an interview. A final interview status recode value was assigned to each case as a result of the edit.
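The three edit types described above can be illustrated with a simplified sketch. The field names and rules below are invented for illustration; the actual edit specifications varied by questionnaire.

```python
def run_edits(record):
    """Apply simplified range, consistency, and blanking edits to one record.
    Returns a list of flagged problems; fields and rules are illustrative."""
    problems = []

    # Range check: each value must fall within plausible bounds.
    if not (0 <= record.get("enrollment", 0) <= 10000):
        problems.append("enrollment out of range")

    # Consistency edit: related items must agree with one another.
    if record.get("full_time_teachers", 0) > record.get("total_teachers", 0):
        problems.append("full-time teachers exceed total teachers")

    # Blanking edit: items skipped by a filter question must be blank.
    if record.get("has_library") == "no" and record.get("library_books") is not None:
        problems.append("library items reported for a school with no library")

    return problems

record = {"enrollment": 450, "total_teachers": 25, "full_time_teachers": 30,
          "has_library": "no", "library_books": 1200}
print(run_edits(record))  # flags the consistency and blanking problems
```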
SASS used four methods to impute values for questionnaire items that respondents did not answer. These include the following: (1) using data from other items on the questionnaire, (2) extracting data from a related component of SASS, (3) extracting data from the sample frame (PSS or CCD), and (4) extracting data from the record for a sample case with similar characteristics (commonly known as the "hot deck" method for imputing item response).
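A minimal illustration of the hot-deck approach (method 4): a missing value is filled from the record of a donor case with matching characteristics. The matching variables and data here are invented; the actual SASS imputation used survey-specific matching rules.

```python
def hot_deck_impute(records, target, match_keys):
    """Fill missing values of `target` from the most recently seen donor
    record that matches on `match_keys` (a simple sequential hot deck)."""
    donors = {}  # last observed value per combination of matching characteristics
    for rec in records:
        key = tuple(rec[k] for k in match_keys)
        if rec[target] is not None:
            donors[key] = rec[target]     # update the deck with a donor value
        elif key in donors:
            rec[target] = donors[key]     # impute from a similar case
    return records

schools = [
    {"sector": "public", "level": "elementary", "enrollment": 400},
    {"sector": "public", "level": "elementary", "enrollment": None},  # missing
    {"sector": "private", "level": "secondary", "enrollment": 220},
]
hot_deck_impute(schools, "enrollment", ["sector", "level"])
print(schools[1]["enrollment"])  # → 400, taken from the matching donor
```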
Weighting of the sample units was carried out to produce national and state estimates for public schools, LEAs, administrators and teachers. Private schools, administrators and teachers were weighted to produce national and affiliation estimates. The weighting procedures used in the School Survey have three purposes: to take account of the school's selection probabilities; to reduce biases that may result from unit nonresponse; and to make use of available information from external sources to improve the precision of sample estimates.
Weighted response rates are defined as the number of in-scope responding questionnaires divided by the number of in-scope sample cases, using the basic weight (the inverse of the probability of selection) of each record. For all components other than the teacher survey, only one sampling stage was involved; therefore, for these components, the weighted overall response rate and the weighted response rate are the same.
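Under that definition, a weighted response rate can be computed as follows. The case data are invented for illustration; the basic weight is the inverse of each case's selection probability, and out-of-scope cases (such as closed schools) are excluded from both numerator and denominator.

```python
def weighted_response_rate(cases):
    """Weighted response rate: sum of basic weights of in-scope respondents
    divided by the sum of basic weights of all in-scope sample cases."""
    in_scope = [c for c in cases if c["in_scope"]]
    total = sum(1.0 / c["selection_prob"] for c in in_scope)
    responded = sum(1.0 / c["selection_prob"] for c in in_scope if c["responded"])
    return responded / total

cases = [
    {"selection_prob": 0.10, "in_scope": True,  "responded": True},
    {"selection_prob": 0.25, "in_scope": True,  "responded": False},
    {"selection_prob": 0.50, "in_scope": True,  "responded": True},
    {"selection_prob": 0.20, "in_scope": False, "responded": False},  # e.g. closed school
]
print(weighted_response_rate(cases))  # → 0.75 (weights 10 + 2 out of 10 + 4 + 2)
```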
| Component | Sample size | Weighted response rate |
|---|---|---|
| **Public Schools** | | |
| Teacher Listing Form | — | 92.2% |
| Library Media Center | 9,893 | 94.7% |
| **Private Schools** | | |
| Teacher Listing Form | — | 87.0% |
| Library Media Center | 3,558 | 87.7% |
| **BIA Schools** | | |
| Teacher Listing Form | — | 97.8% |
| Library Media Center | 124 | 95.4% |
| **Public Charter Schools** | | |
| Teacher Listing Form | — | 91.4% |
SASS conducted a reinterview of about 10 percent of the schools and principals in the sample. Reinterview questionnaires were sent three to four weeks after the original interview; CATI reinterviews took place one to two weeks later. Reinterview results were analyzed using an index of inconsistency (the proportion of total variance that is accounted for by response variance) for items with a dichotomous response; a separate index of inconsistency was used for items with more than two response categories.
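For a dichotomous item, one common formulation of the index of inconsistency divides the gross difference rate (the proportion of cases answered differently in the interview and reinterview) by its expected value under independent responses. The sketch below uses that formulation with invented data; the exact computation used in the SASS reinterview analysis may differ.

```python
def index_of_inconsistency(pairs):
    """Index of inconsistency for a dichotomous item: the gross difference
    rate divided by p1*(1-p2) + p2*(1-p1), where p1 and p2 are the
    proportions positive in the interview and reinterview, respectively."""
    n = len(pairs)
    g = sum(1 for a, b in pairs if a != b) / n   # gross difference rate
    p1 = sum(a for a, b in pairs) / n            # proportion positive, interview
    p2 = sum(b for a, b in pairs) / n            # proportion positive, reinterview
    return g / (p1 * (1 - p2) + p2 * (1 - p1))

# Illustrative interview/reinterview response pairs (1 = yes, 0 = no)
pairs = [(1, 1), (1, 1), (0, 0), (1, 0), (0, 0), (0, 1), (1, 1), (0, 0)]
print(index_of_inconsistency(pairs))  # → 0.5
```

Values near 0 indicate that repeated measurements agree almost perfectly; values near 1 indicate that response variance accounts for most of the total variance of the item.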
- NCES 2002313: Schools and Staffing Survey, 1999–2000: Overview of the Data for Public, Private, Public Charter, and Bureau of Indian Affairs Elementary and Secondary Schools (Appendix B contains technical notes)
- NCES 9712: Measuring School Reform: Recommendations for Future SASS Data Collection
- NCES 9808: The Redesign of the Schools and Staffing Survey for 1999–2000: A Position Paper
- NCES 200010: A Research Agenda for the 1999–2000 Schools and Staffing Survey
- NCES 97596: The Schools and Staffing Survey: Recommendations for the Future