
Schools and Staffing Survey (SASS)



4. SURVEY DESIGN

TARGET POPULATION


The SASS target population includes:

  • LEAs that employ elementary- and/or secondary-level teachers (e.g., public school districts and state agencies that operate schools for special student populations, such as inmates of juvenile correctional facilities or students in Department of Defense schools);
  • cooperative agencies that provide special services to more than one school district;
  • public, private, BIE, and charter schools with students in any of grades 1–12;
  • the principals of these schools;
  • library media centers; and
  • teachers in public, private, BIE, and charter schools who teach students in grades K–12 in a school with at least a 1st grade.

SAMPLE DESIGN

SASS uses a stratified probability sample design. Details of stratification variables, sample selection, and frame sources are provided below.

Public School Sample. In the public school sample, schools are selected first. The first level of stratification is by type of school: (a) BIE schools (all BIE schools are automatically in the sample); (b) schools with a high percentage of American Indian students (i.e., schools with 19.5 percent or more American Indian students); (c) schools in Delaware, Florida, Maryland, Nevada, and West Virginia (where it is necessary to implement a different sampling methodology to select at least one school from each LEA in the state); (d) charter schools; and (e) all other schools. Schools falling into more than one group are assigned to types (a), (b), (d), (c), and (e), in that order. The second level of stratification varies within school type. All BIE schools are automatically selected for the sample, so no stratification is needed. Schools with a high percentage of American Indian students are stratified by state (Arizona; California; Montana; New Mexico; Washington; the remaining western states; Minnesota; North Dakota; South Dakota; the remaining midwestern states; North Carolina; Oklahoma; and the remaining states except Alaska, since most Alaskan schools have a high Native American enrollment). Schools in Delaware, Florida, Maryland, Nevada, and West Virginia are stratified first by state and then by LEA. Charter schools and schools not placed in another category are stratified by state. Within each second-level stratum, there are three grade-level strata (elementary, secondary, and combined schools).

Within each stratum, all non-BIE schools are systematically selected using a probability proportionate to size algorithm. The measure of size used for schools in the CCD is the square root of the number of teachers in the school as reported in the CCD file. Any school with a measure of size larger than the sampling interval is excluded from the probability sampling operation and included in the sample with certainty.
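The selection step just described can be illustrated in code. The following minimal Python sketch assumes a simple list of school records with a reported teacher count; the field names, the fixed random start, and the single-stratum setup are illustrative assumptions rather than the Census Bureau's production procedure.

    import math

    def pps_systematic_sample(schools, n_sample, random_start=0.5):
        """Systematic probability-proportionate-to-size (PPS) selection.

        The measure of size is the square root of the reported number of
        teachers. Any school whose measure of size is at least as large as
        the sampling interval is taken with certainty; the remaining schools
        are selected systematically. Illustrative sketch only.
        """
        for s in schools:
            s["mos"] = math.sqrt(s["teachers"])

        certainty = []
        remainder = list(schools)
        while True:
            interval = sum(s["mos"] for s in remainder) / (n_sample - len(certainty))
            # Removing certainty schools changes the interval, so repeat
            # until no further certainty selections are found.
            new_certainties = [s for s in remainder if s["mos"] >= interval]
            if not new_certainties:
                break
            certainty.extend(new_certainties)
            remainder = [s for s in remainder if s["mos"] < interval]

        # Systematic selection of the noncertainty schools: step through the
        # cumulated measures of size in increments of one sampling interval.
        sampled, cumulative = [], 0.0
        threshold = random_start * interval  # a random start in practice
        for s in remainder:
            cumulative += s["mos"]
            if cumulative >= threshold:
                sampled.append(s)
                threshold += interval
        return certainty + sampled

Under this scheme, a noncertainty school's selection probability is proportional to the square root of its teacher count, which dampens the influence of very large schools relative to using the teacher count itself.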

The CCD Public Elementary/Secondary School Universe Survey serves as the public school sampling frame. (See the CCD chapter for a more thorough discussion.) The frame includes regular public schools, Department of Defense-operated military base schools, and special purpose schools (such as special education, vocational, and alternative schools). Schools outside the United States and schools that teach only prekindergarten, kindergarten, or postsecondary students are deleted from the file. The following years of the CCD were used as the public school frame for the last six rounds of SASS:

  • 2009–10 CCD for the 2011–12 SASS;
  • 2005–06 CCD for the 2007–08 SASS;
  • 2001–02 CCD for the 2003–04 SASS;
  • 1997–98 CCD for the 1999–2000 SASS;
  • 1991–92 CCD for the 1993–94 SASS; and
  • 1988–89 CCD for the 1990–91 SASS.

In the 1987–88 SASS, the 1986 Quality Education Data (QED) survey was used as the sampling frame.

Private School Sample. For private schools, the sample is stratified within each of two types of frames: (1) a list frame, which is the primary private school frame; and (2) an area frame, which is used to identify schools not included in the list frame and to compensate for its undercoverage. Private schools in the list frame are stratified by affiliation, grade level, and region. Within each stratum, schools are sampled systematically using a probability proportionate to size algorithm. Any school with a measure of size larger than the sampling interval is excluded from the probability sampling process and included in the sample with certainty. All schools in the area frame within noncertainty primary sampling units (PSUs) that are not already on the list frame are included in the sample with certainty.

The most recent PSS, updated with the most recent association lists, serves as the private school sampling frame. The 2011–12 SASS was based on the 2009–10 PSS. For the 2007–08 SASS, the private school list frame was based on the 2005–06 PSS, updated with lists from private school organizations and states collected by the U.S. Census Bureau in the summer of 2006. For the 2003–04 SASS, the frame was the 2001–02 PSS, updated with 26 lists of private schools provided by private school associations as well as 51 lists of private schools from the 50 states and the District of Columbia. The 1991–92, 1989–90, and 1997–98 PSS were the basis for the private school frame for the 1993–94, 1990–91, and 1999–2000 SASS, respectively. The 1986 QED survey was used as the sampling frame for the 1987–88 SASS.

BIE school selection. Since the 1993–94 SASS, all BIE schools have been selected with certainty; in 1990–91, 80 percent of BIE schools were sampled. The BIE school frame for the 2003–04 SASS consisted of a list of schools that the BIE operated or funded during the 2001–02 school year. (The list was obtained from the U.S. Department of the Interior.) The BIE list was matched against the CCD, and the schools on the BIE list that did not match the CCD were added to the universe of schools.

For the 2007–08 SASS data collection, a separate universe of schools operated or funded by the BIE in the 2005–06 school year was drawn from the Program Education Directory maintained by the BIE. (The CCD now defines the BIE as its own "territory," similar to Puerto Rico and other non-state territories, and does not permit duplicates to be reported by the states.) All BIE schools meeting the SASS definition of a school were included in the sample.

After the 2007–08 SASS data collection, however, BIE data collection was discontinued; as a result, no BIE schools, principals, teachers, or library media centers were sampled for the 2011–12 SASS.

Charter school selection. In the 1999–2000 SASS, a charter school sample was added. All charter schools were selected with certainty from the frame, which consisted of a list of charter schools developed for the U.S. Department of Education's Institute of Education Sciences. The list included only charter schools that were open (teaching students) during the 1998–99 school year. This changed in the 2003–04 SASS, when a nationally representative sample of public charter schools was included as part of the public school sample. In the 2011–12 SASS, charter schools continued to be included as part of the public school sample.

Each school sampled for SASS receives a school questionnaire, and the principal of each sampled school receives a principal questionnaire.


Teacher selection. Within each sampled school, a sample of teachers is selected. First, the sampled schools are asked to provide a list of their teachers and selected characteristics. For example, in the 2007–08 SASS data collection, the Teacher Listing Form was collected as early as possible in the 2007–08 school year at all public (including public charter), private, and BIE-funded schools in the SASS sample to obtain a complete list of all the teachers employed at each school.

In the 2007–08 SASS, teachers were stratified into one of two teacher types: new and experienced. In public schools, oversampling of new teachers was not required because of the large number of sampled schools with new teachers; teachers were therefore allocated to the new and experienced categories in proportion to their numbers in the school. In private schools, however, new teachers were oversampled. Before teachers were allocated to the new or experienced strata, each school was first allocated an overall number of teachers to be selected.

For the 2011–12 SASS, Teacher Listing Forms were collected for sampled schools and districts and compiled by the Census Bureau throughout the collection period. Sampled schools provided information on teachers' teaching experience, with strata of beginning, early career, mid-career, and experienced teachers. Beginning and early career teachers were oversampled to improve survey estimates for these subpopulations; within each teacher stratum within each school, teachers were selected systematically with equal probability.

Teacher records within a school are sorted by the teacher stratum code, the teacher subject code, and the teacher line number code. The teacher line number code is a unique number assigned to identify the teacher within the list of teachers keyed by the field representative. Within each teacher stratum in each school, teachers are selected systematically with equal probability. The within-school probabilities of selection are computed so as to give all teachers within a school stratum the same overall probability of selection (self-weighted) within teacher and school strata, but not across strata. However, since the school sample size of teachers is altered due to the minimum constraint (i.e., at least one teacher per school) or maximum constraint (i.e., no more than either twice the average stratum allocation or 20 teachers per school), the goal of achieving self-weighting for teachers is lost in some schools. Each sampled teacher receives a teacher questionnaire.
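As a rough illustration of the allocation constraints and the equal-probability systematic selection described above, consider the following sketch. The function names, record fields, and target-rate argument are illustrative assumptions; the constraint values (at least one teacher per school, and no more than twice the average stratum allocation or 20 teachers) come from the text.

    import random

    def within_school_allocation(n_listed, target_rate, avg_stratum_allocation):
        """Apply the minimum and maximum constraints on the number of
        teachers sampled in one school (illustrative sketch)."""
        n = round(n_listed * target_rate)            # self-weighting allocation
        n = max(n, 1)                                # at least one teacher per school
        cap = min(2 * avg_stratum_allocation, 20)    # at most twice the average
                                                     # stratum allocation, or 20
        return int(min(n, cap, n_listed))

    def select_teachers(teachers, n):
        """Sort by stratum, subject, and line number, then draw a systematic
        equal-probability sample of n teachers (illustrative sketch)."""
        ordered = sorted(teachers,
                         key=lambda t: (t["stratum"], t["subject"], t["line_number"]))
        interval = len(ordered) / n
        start = random.uniform(0, interval)
        return [ordered[int(start + i * interval)] for i in range(n)]

When the minimum or maximum constraint overrides the proportional allocation in the first function, the equal overall selection probability (self-weighting) described above is no longer achieved for that school.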

Library media center selection. For the 2003–04 and 2007–08 SASS, all library media centers in public, public charter, and BIE-funded schools in the SASS sample were asked to complete the School Library Media Center Questionnaire. For 2011–12, all library media centers in sampled public and public charter schools were included, yielding roughly 10,250 public and 750 public charter library media centers in the sample.

School district selection. In most states, once public schools are selected, the districts associated with these schools are placed in the sample as well. However, in Delaware, Florida, Maryland, Nevada, and West Virginia, all districts are defined as school sampling strata, placing all districts in each of these states in the district sample. (In some SASS administrations, a sample of districts not associated with schools is taken, but not in the 2007–08 SASS.) For the 2011–12 SASS sampling frame, public charter schools were classified as either dependent (governed by a school district) or independent (not associated with a school district). The district sample is selected using a probability proportionate to size algorithm. Each sampled school district receives a school district questionnaire. The approximate sample sizes for the 2011–12 SASS were 51,100 public school teachers; 7,100 private school teachers; 14,000 school principals; and 5,800 school districts.


DATA COLLECTION AND PROCESSING

In the 2011–12 SASS, teachers were mailed an invitation to complete a web-based questionnaire, although they could also request a paper questionnaire; 67 percent of public school teachers and 59 percent of private school teachers chose to complete the questionnaire online. Unlike in 1999–2000, the School Library Media Center Survey did not offer an Internet reporting option in 2003–04 and 2007–08. All survey modes used in SASS are administered by the U.S. Census Bureau.

Reference Dates. Data for SASS components are collected during a single school year. Most data items refer to that school year. Questions on enrollment and staffing refer to October 1 of the school year. Questions for teachers about current teaching loads refer to the most recent full week that school was in session, and questions on professional development refer to the past 12 months.

Data Collection. The data collection procedures begin with advance mailings to school districts explaining the nature and purpose of SASS. Field staff attempt to establish a contact person for the School District Questionnaire and determine whether the district is willing and able to provide an electronic list of teachers for their selected school(s) in the fall. If the district agrees to provide an electronic list, field staff determine the appropriate contact person to receive the request. Field staff verify the selected schools' names, grade ranges, and operational statuses. Finally, field staff attempt to collect the names of the selected schools' principals and their e-mail addresses.

The school district questionnaires are mailed out first. Then, the school, principal, and library media center surveys are delivered to schools in person. The teacher questionnaires are delivered last. Follow-up efforts begin approximately 2 weeks after questionnaires are distributed. They consist of telephone calls and personal visits to schools to obtain completed questionnaires or to verify that they have been mailed back. Field staff record the status of each questionnaire and, if necessary, supply additional blank questionnaires.

Processing. During the check-in phase, each questionnaire is assigned an outcome code: completed interview, out-of-scope, or noninterview. A combination of manual data keying and imaging technology is used to enter the data. Interview records in the data files then undergo a round of primary data review, in which analysts examine the frequencies of each data item to identify suspicious values. Census staff review the problem cases and make corrections whenever possible.

After the primary data review, all records (i.e., records from all survey components) classified as interviews are subject to a set of computer edits: a range check, a consistency edit, and a blanking edit. After the completion of these edits, the records are put through another edit to make a final determination of whether the case is eligible for the survey, and, if so, whether sufficient data have been collected for the case to be classified as an interview. A final interview status recode (ISR) value is assigned to each case as a result of the edit.
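To make the three edit types concrete, the following is a schematic sketch; the item names, valid ranges, and skip rules are hypothetical examples, not actual SASS questionnaire items or the Census Bureau's edit specifications.

    def apply_edits(record):
        """Apply range, consistency, and blanking edits to one interview
        record, blanking failing items so they can be imputed later.
        Item names, ranges, and skip logic are hypothetical."""
        flags = []

        # Range check: values outside plausible bounds are blanked.
        years = record.get("years_teaching")
        if years is not None and not 0 <= years <= 60:
            record["years_teaching"] = None
            flags.append("range: years_teaching")

        # Consistency edit: related items must agree with one another.
        enrollment = record.get("total_enrollment")
        fte = record.get("fte_teachers")
        if enrollment is not None and fte is not None and fte > enrollment:
            record["fte_teachers"] = None
            flags.append("consistency: fte_teachers exceeds total_enrollment")

        # Blanking edit: items that should have been skipped are cleared.
        if record.get("is_charter") == "no" and record.get("charter_authorizer") is not None:
            record["charter_authorizer"] = None
            flags.append("blanking: charter_authorizer reported by a noncharter school")

        return record, flags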


ESTIMATION METHODS

Sample units are weighted to produce national and state estimates for public elementary and secondary school surveys (i.e., schools, teachers, administrators, school districts, and school library media centers) and national estimates for BIE, charter school, and public combined school surveys (i.e., schools, teachers, administrators, and school library media centers). The private school sector is weighted to produce national and affiliation-group estimates. These estimates are produced through the weighting and imputation procedures discussed below.

Weighting. Estimates from SASS sample data are produced by using weights. The weighting process for each component of SASS includes adjustments for nonresponse using respondents' data and adjustments of the sample totals to the frame totals to reduce sampling variability. The exact formula representing the construction of the weight for each component of SASS is provided in each administration's sample design report (e.g., 1993–94 Schools and Staffing Survey: Sample Design and Estimation [Abramson et al. 1996]). The construction of weights is also discussed in the Quality Profile reports (Jabine 1994; Kalton et al. 2000) and in the documentation for the 2003–04 administration (Tourkin et al. 2007). Since SASS and PSS data were collected at the same time in 1993–94 and 1999–2000, in both years the number of private schools reported in SASS was made to match the number of private schools reported in PSS.
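In general terms, the weight for a sampled unit combines a base weight reflecting the unit's probability of selection with a nonresponse adjustment computed within adjustment cells and a ratio adjustment of weighted sample totals to frame totals. The sketch below shows this multiplicative structure; the argument names are illustrative, and the exact factors differ by survey component and administration, as documented in the reports cited above.

    def final_weight(base_weight,
                     cell_eligible_weight_total, cell_respondent_weight_total,
                     frame_total, weighted_sample_total):
        """General structure of a survey weight: base weight (inverse of the
        selection probability), a nonresponse adjustment computed within an
        adjustment cell, and a ratio adjustment of weighted sample totals to
        frame totals. Argument names are illustrative; see the sample design
        reports for the exact factors used in each SASS administration."""
        nonresponse_adjustment = cell_eligible_weight_total / cell_respondent_weight_total
        ratio_adjustment = frame_total / weighted_sample_total
        return base_weight * nonresponse_adjustment * ratio_adjustment

    # Example: a school selected with probability 1/25, in a cell where 90 percent
    # of the weighted eligible schools responded, with a frame-to-sample ratio of 1.02.
    # final_weight(25.0, 100.0, 90.0, 1020.0, 1000.0) -> about 28.3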

Imputation. In all administrations of SASS, all items with missing values are imputed for records classified as interviews. SASS uses a two-stage imputation procedure. The first-stage imputation uses a logical or deductive method, such as:

  • Using data from other items in the same questionnaire;
  • Extracting data from a related SASS component (different questionnaire); or
  • Extracting information about the sample case from the PSS or CCD, the sampling frames for private and public schools, respectively.

In addition, some inconsistencies between items are corrected by ratio adjustment during the first-stage imputation.

The second-stage imputation process is applied to all items with missing values that were not imputed in the first stage. This stage uses a hot-deck imputation method, extracting data from a respondent (i.e., a donor) with characteristics similar to those of the nonrespondent. Donors are sought among records that match the nonrespondent on a set of characteristics; if no donor is found, the matching criteria are collapsed (i.e., relaxed). If there is still no suitable donor value after collapsing to a certain point, the missing value is imputed clerically or by an automated algorithm.
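A minimal sketch of this hot-deck step, with collapsing of the matching criteria, appears below. The matching variables and record fields are hypothetical; the actual SASS procedure uses item-specific matching criteria and sort orders.

    def hot_deck_impute(recipient, donors, match_keys, item):
        """Impute a missing item by copying it from a donor record with
        similar characteristics. If no donor matches on all criteria, the
        criteria are collapsed (relaxed) one variable at a time. Matching
        variables are hypothetical; SASS uses item-specific criteria."""
        keys = list(match_keys)
        while keys:
            for donor in donors:
                if donor.get(item) is not None and all(
                        donor.get(k) == recipient.get(k) for k in keys):
                    return donor[item]
            keys.pop()  # collapse: drop the least important matching variable
        return None     # no donor found; left for clerical or algorithmic imputation

    # Example: match on sector and grade level first, then on sector alone.
    # value = hot_deck_impute(recipient, donors, ["sector", "grade_level"], "enrollment")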


RECENT CHANGES

Several changes have been made over time, largely for budgetary reasons.

Design Changes from 1999–2000 to 2011–12: 

  • After the 2007–08 SASS, BIE schools, principals, teachers, and library media centers were no longer sampled.
  • Rather than surveying all public charter schools, as was done in the 1999–2000 SASS, some 300 public charter schools were sampled for the 2003–04 SASS.
  • The separate questionnaire for public charter schools was discontinued. The reduction in the public charter school sample size from 1,100 in the 1999–2000 SASS to about 300 in the 2003–04 SASS meant it was no longer feasible to produce a separate questionnaire, since public charter school data could not be published with as much detail (for the 2003–04 SASS, only at the national and regional levels). Public charter school data are now included with traditional public school data.
  • Affiliation for private schools was redefined and stratified into 17 groups rather than the previous 20 groups in the 2003–04 SASS. Catholic schools were split into three groups based on typology. Other religious schools were divided into five groups corresponding to the four largest non-Catholic religious organizations (by number of schools) and a catch-all "other." Nonsectarian schools were divided into three groups by typology.
  • Grade-level stratification in public and private schools has been defined purely on the basis of the grade level of the school since the 2003–04 SASS. Schools classified as a type other than "regular school" were no longer placed by default in the combined school category, which includes schools with some elementary and some secondary grades. Many nonregular schools (i.e., special education, alternative, and vocational schools) cover a specific grade range; to the extent this grade range is known, stratifying by it is a more appropriate method than placing all such schools in the combined school strata. Nonregular schools with a grade range that is ungraded or unknown remain in the combined school strata.
  • Beginning with the 2003–04 SASS, public school records from the CCD were collapsed, prior to stratification, into units perceived to be a better fit with the SASS definition of a school. The sample allocation was revised to avoid undersampling schools now classified at the combined grade level; that is, the revised allocation ensured that the newly combined schools were sampled at approximately the same rate as they would have been prior to the collapsing procedure. In general, the combined school sample size was increased to the point at which the combined school sampling rate equaled the overall state-level sampling rate. For example, if one in five schools was sampled in a particular state, then one in five of the combined schools was sampled, rather than using the default sample size of 10 combined schools.
  • The sort order for the public and private school sampling was altered in the 2003–04 SASS to sort on enrollment in a serpentine fashion (instead of always sorting in descending order). Serpentine sorting sorts enrollment in ascending order within one group of the higher-level sort variables, in descending order within the next group, and so on. This reduces the variation in enrollment between adjacent sampled schools and thus reduces the overall sampling error (see the sketch after this list).
  • Florida and Maryland were added to the list of states where at least one school is selected in each school district. This was done in the 2003–04 SASS to decrease the standard error of the state-level school district estimates.
  • Oversampling of bilingual/English as a Second Language (ESL) teachers was discontinued in the 2003–04 SASS, since a sufficient number of bilingual teachers could be sampled without oversampling to produce estimates of the desired reliability.
  • Teacher sampling was automated to speed up the distribution of the teacher questionnaires. This, however, reduced the level of control over the sample sizes for the remaining oversampled teacher strata (Asian/Pacific Islander and American Indian/Alaska Native). The automation no longer allowed the sampling rate for these teachers to be periodically revised during the sampling process. Thus, if the number of these teachers listed differed from the expected number, the sample size goal would no longer be met.
  • The School Library Media Center Questionnaire was not administered to private schools for budget reasons as of the 2003–04 SASS.
  • The School Questionnaire (with district items) combines the public school questions and most of the school district questions from the 1999–2000 SASS. It was administered to public charter, state-operated (often schools for the blind or schools located in juvenile detention facilities), and BIE-funded schools, as well as to public schools in one-school districts. This change was made to ease respondent burden in cases where the respondent for the school and school district questionnaires was expected to be the same.
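
The serpentine enrollment sort mentioned in the bullet on sort order above can be sketched as follows; the grouping field and record layout are illustrative assumptions.

    from itertools import groupby

    def serpentine_sort(schools, group_key="stratum", size_key="enrollment"):
        """Sort schools by a higher-level variable, alternating the direction
        of the enrollment sort from one group to the next, so that adjacent
        schools in the sampling order have similar enrollments
        (illustrative sketch)."""
        ordered = sorted(schools, key=lambda s: s[group_key])
        result, descending = [], False
        for _, group in groupby(ordered, key=lambda s: s[group_key]):
            result.extend(sorted(group, key=lambda s: s[size_key],
                                 reverse=descending))
            descending = not descending  # flip direction for the next group
        return result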

FUTURE PLANS

SASS was redesigned as the National Teacher and Principal Survey (NTPS), which was administered for the first time in 2015–16.
