Public and Private School Principals in the United States: A Statistical Profile, 1987-88 to 1993-94 / Appendix C

Technical Notes

The Schools and Staffing Survey (SASS), an integrated survey of American schools, school districts, principals, teachers, and student records, is funded by the National Center for Education Statistics (NCES) of the U.S. Department of Education. First conducted during the 1987-88 school year, SASS is designed to provide periodic data on public and private schools in the United States. Since the 1990-91 school year, SASS has also included Indian schools supported by the Bureau of Indian Affairs, U.S. Department of the Interior. Major categories of data collected in SASS include the characteristics of schools and principals, school programs and policies, and the opinions and attitudes of principals about policies and working conditions.

The analytical power of the data is enhanced by the ability to link survey data for individual local education agencies (LEAs), schools, principals, teachers, and, since the 1993-94 school year, student-level records. The use of comparable questions in each round of SASS makes it possible to monitor changes in the nation's educational system. The first SASS was administered during the 1987-88 school year, with a teacher followup survey conducted during 1988-89. The two subsequent SASS administrations were at three-year intervals (1990-91 and 1993-94). The next SASS round (1998-99) and subsequent administrations are planned for five-year intervals.

The 1993-94 SASS consisted of separate surveys administered simultaneously to linked samples of respondents./1 These instruments included:

The analyses for this report on public and private school principals use five SASS instruments: the Public School, Private School, Public School Principal, and Private School Principal Questionnaires, and the Teacher Demand and Shortage Questionnaire for Public School Districts. Data collected with the Student Records Questionnaire and the Teacher Followup Survey were excluded from analyses. Additionally, since less than 0.1 percent of American students attend Indian schools operated by the Bureau of Indian Affairs, these schools (170 schools in 1993-94) were excluded from this report./3

Overview of the Design of SASS

The Schools and Staffing Survey continues to be the largest and most thorough integrated national survey of districts, schools, principals, and teachers undertaken in the United States. The target populations for the SASS surveys include elementary and secondary schools, principals and classroom teachers in these schools, former teachers, and the LEAs that are responsible for administering the public schools. The 1993-94 sample consisted of approximately 9,000 public schools and 3,000 private schools.

Evolution of the SASS Design

The first administration of SASS in 1987-88 integrated three existing NCES survey programs: the Teacher Demand and Shortage Survey, the Public and Private School Surveys, and the Teacher Surveys. The 1987-88 SASS included a public school sample of 9,317 schools selected from the Quality Education Data (QED) file of public schools. The private school sample included 3,513 schools selected primarily from the QED file of private schools supplemented with private school association lists and targeted area samples from telephone directories.

Since that first administration, NCES has implemented a number of changes in the survey design and content to improve study estimates and to better reflect changes in the educational environment. Some of the most important changes that relate to this report are highlighted below:/4

Notably for public schools, the QED and CCD data sources apply slightly different definitions of the school unit. The QED file defined schools in terms of their physical location; the CCD file used for subsequent SASS surveys described schools as "administrative units with principals." Thus, in instances where multiple schools share a single campus, the estimated number of schools increases using the CCD definition.

Sample Selection

The initial sampling units for SASS were schools./6 The sampling structure was designed to provide separate data for public and private schools, with detail by state for the public sector and by private school association for the private sector. After schools were selected, each public and private school in the sample was sent a letter requesting that school personnel provide a list of all teachers in the school. The returned lists, supplemented by telephone followup, served as the sampling frame for the teacher survey. The same sample was used for the public and private school principal survey. Each LEA that administered one or more of the sample schools in the public sector became part of the sample for the Teacher Demand and Shortage Questionnaire.

Selection of schools. Since the 1990-91 SASS, the public school sampling frame has been the CCD file. The CCD is based on census data collected annually by NCES from state education agencies and is believed to be the most complete list of public schools available. The frame includes regular public schools, military base schools operated by the Department of Defense, and nonregular schools such as special education, vocational, and alternative schools. The public school sampling frame for the 1987-88 SASS was the school file developed by QED.

For private schools in the 1987-88 SASS, the QED private school frame was supplemented with lists obtained from several private school associations and by an area sample of 123 counties or groups of counties in which telephone directories, government offices, and other local sources were utilized to identify omitted private schools. The sampling frame for private schools in the 1990-91 SASS was the NCES 1989-90 Private School Universe Survey, augmented with state lists and private school association lists./7 The 1993-94 SASS private school frame consisted of the 1991-92 Private School Universe Survey, augmented with private school association lists and lists from an area frame./8

Selection of local education agencies. All LEAs that had at least one school selected for the school sample were included in the LEA sample for the Teacher Demand and Shortage Questionnaires.

Survey Operations Procedures

Survey operations for the 1987-88, 1990-91, and 1993-94 SASS, including sample selection, data collection, and data processing, were carried out under an interagency agreement by the U.S. Bureau of the Census, according to specifications provided by NCES. At the start of each school year, introductory letters containing a Teacher Listing Sheet were mailed to sample schools. These Teacher Listing Sheets, designed to enumerate the instructional staff at each school, served as the sampling frame for the teacher sample. Shortly after the listing sheets were distributed, Teacher Demand and Shortage Questionnaires were mailed to the local education agencies representing the sampled public schools and School Principal Questionnaires were sent to the principals of the selected public and private schools. School Teacher Questionnaires for teachers selected from lists provided by the sample public and private schools were also mailed at that time. Completed questionnaires were returned by mail to the Census Bureau. Telephone followup interviews of nonrespondents to the questionnaires were conducted by Census Bureau field representatives.


Weights of the sample units were developed to produce national and state estimates for teachers, administrators, schools, and local education agencies./9 The basic weights were the inverse of the probability of selection. The weights were also adjusted for nonresponse and to ensure that sample totals (based on responding, nonresponding, and out-of-scope cases) were comparable to the frame totals.
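The weighting logic described above can be sketched in a few lines. This is an illustrative simplification, not NCES production code; the selection probabilities and response flags are hypothetical.

```python
# Illustrative sketch of SASS-style weighting (hypothetical inputs): the
# basic weight is the inverse of the probability of selection, and
# respondents' weights are ratio-adjusted so that weighted totals remain
# comparable to the full-sample (frame) totals despite nonresponse.

def base_weight(selection_prob):
    """Basic weight: the inverse of the probability of selection."""
    return 1.0 / selection_prob

def nonresponse_adjusted(base_weights, responded):
    """Inflate respondents' weights to preserve the full-sample total.
    `responded` is a parallel list of booleans."""
    full_total = sum(base_weights)
    resp_total = sum(w for w, r in zip(base_weights, responded) if r)
    factor = full_total / resp_total
    return [w * factor if r else 0.0 for w, r in zip(base_weights, responded)]

weights = [base_weight(p) for p in (0.1, 0.2, 0.5, 0.5)]   # hypothetical
adjusted = nonresponse_adjusted(weights, [True, True, False, True])
# sum(adjusted) equals sum(weights): the weighted total is preserved
```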

Standard Errors

The estimates presented in the text and tables of this report are based on samples and are subject to sampling variability. Standard errors were estimated using a balanced repeated replications procedure that incorporated the design features of this complex sample survey./10 The standard errors indicate the accuracy of each estimate. If all possible samples of the same size were surveyed under the same conditions, an interval of 1.96 standard error units below to 1.96 standard error units above a particular statistic would include the true population value in approximately 95 percent of the cases. Note, however, that the standard errors do not take into account the effects of biases due to item nonresponse, measurement error, data processing error, or other possible systematic errors. Standard errors for the estimates presented in the text and tables of this report are included in appendix B.
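The interval described above can be illustrated in a few lines of code; the estimate and standard error used here are hypothetical.

```python
# Minimal sketch: an approximate 95 percent confidence interval built
# from an estimate and its standard error, using the 1.96 multiplier
# described above. The inputs are hypothetical.

def confidence_interval_95(estimate, standard_error):
    margin = 1.96 * standard_error
    return (estimate - margin, estimate + margin)

low, high = confidence_interval_95(52.3, 1.4)
# interval: (49.556, 55.044)
```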

Accuracy of Estimates

Some principals, schools, and districts did not return questionnaires, which resulted in missing data. These missing data, however, should have relatively little impact on the estimates of percentages, means, and counts that this report presents because of nonresponse adjustment strategies employed by SASS./11 Where analyses required examining data across questionnaires (e.g., when analyses included variables from both the Principal Questionnaire and the Teacher Demand and Shortage Questionnaire for Public School Districts), district nonresponse reduced the sample size of respondents because principals located in districts that did not return a questionnaire could not be included in the analyses. Thus, in these cases, the principal totals, and all other estimates presented, were based on a subset of the total number of principals.

The accuracy of any statistic is determined by the joint effects of sampling and nonsampling errors. Both types of error affect the estimates presented in this report./12

Nonsampling Error

Both universe and sample surveys are subject to nonsampling errors. Two types of nonsampling error occur, nonobservation error and measurement error, and both are extremely difficult to estimate.

Nonobservation error may be due to noncoverage, which occurs when members of the population of interest are excluded from the sampling frame and, therefore, are not included in the survey sample. Nonobservation error also occurs when sampled units (for example, teachers or administrators) refuse to answer some or all of the survey questions. These types of error are referred to as questionnaire nonresponse (where the entire questionnaire is missing) and item nonresponse (where only some items of the questionnaire are missing). Sample weight adjustment techniques were used to compensate for questionnaire nonresponse; imputation procedures were used to compensate for item nonresponse in SASS./13

Measurement error occurs when mistakes are made when data are edited, coded, or entered into computers (processing errors), when the responses that subjects provide differ from the "true" responses (response errors), and when measurement instruments fail to measure the characteristics they are intended to measure. Sources of response errors include differences in the ways that respondents interpret questions, faulty respondent memory, and mistakes respondents make when recording their answers. Because estimating the magnitude of these various types of nonsampling errors would require special experiments or access to independent data, information on the scope of these errors is seldom available.

Sampling Error

Sampling error occurs when members of a population are selected (sampled), and only sample members respond to survey questions. Estimates that are based on sample responses will differ somewhat from the data that would have been obtained if a complete census of the relevant population had been taken using the same survey instruments, instructions, and procedures. The estimated standard error of a statistic is a measure of the variation due to sampling and can be used to examine the precision obtained in a particular sample. All estimates and standard errors were calculated using a balanced repeated replications variance estimation program developed to calculate standard errors based upon complex survey designs.
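The replication idea behind the variance estimation program can be sketched briefly. This is a simplified illustration of the BRR variance formula, not the actual WESVAR implementation, and the replicate estimates are hypothetical.

```python
# Simplified sketch of balanced repeated replication (BRR) variance
# estimation: the standard error of a statistic is approximated from
# the spread of replicate estimates, each computed from a re-weighted
# half-sample of the data. The replicate values below are hypothetical.

def brr_standard_error(full_estimate, replicate_estimates):
    g = len(replicate_estimates)
    variance = sum((rep - full_estimate) ** 2 for rep in replicate_estimates) / g
    return variance ** 0.5

se = brr_standard_error(10.0, [9.0, 11.0, 10.5, 9.5])
```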

Response Rates and Imputation

The final weighted questionnaire response rates are reported in table C1 for the various SASS years. Table C2 provides the item-response rates for the SASS instruments by year. Values were imputed for items with missing data by (1) using data from other items on the questionnaire or a related component of the SASS (e.g., a school record to impute district data); (2) extracting data from the sample frame such as the CCD; or (3) extracting data from a respondent with similar characteristics./14

The reader should note that all data on principals in this report are imputed. For earlier reports, imputed data on principals for 1987-88 were not available, so unimputed data were used. Thus, differences may exist between the 1987-88 estimates reported here and those in previous reports.

Statistical Procedures

The comparisons in the text were tested for statistical significance to ensure that the differences are larger than might be expected from sampling variation. These statistical tests were based on Student's t statistic. Generally, whether a difference is considered significant is determined by calculating a t value for the difference between a pair of means or percentages, and comparing this value to published tables of values at certain critical levels, called alpha levels. The alpha level is an a priori statement of the probability of inferring that a difference exists when, in fact, it does not (i.e., the observed difference results from sample variation rather than a true difference between two means).

Table C1.-Weighted and unweighted percent response rates, by SASS
          instrument: 1987-88, 1990-91, and 1993-94
                                   Unweighted                Weighted
Questionnaire                1987-88 1990-91 1993-94    1987   1990   1993
                                                         -88    -91    -94
Teacher demand and shortage
 for public school districts   89.4    93.7    93.1     90.8   93.5   93.9
Public school principal        94.2    96.9    96.6     94.4   96.7   96.6
Private school principal       81.2    91.1    90.3     79.3   90.0   87.6
Public school                  91.9    95.0    92.0     91.9   95.3   92.3
Private school                 79.6    85.1    84.1     78.6   83.9   83.2
Public school teacher*         86.5    91.5    88.9     86.4   90.3   88.2
Private school teacher*        77.0    83.1    80.6     79.1   84.3   80.2
* The response rates for public and private school teachers exclude the
schools that did not provide teacher lists. The overall or effective
response rates for public school teachers, including those that could not
be sampled from nonresponding schools, were 83 percent, 86 percent, and 85
percent, respectively, for the 1987-88, 1990-91, and 1993-94 SASS. The
corresponding overall response rates for private school teachers were 70
percent, 75 percent, and 73 percent.

In order to make proper inferences and interpretations from the statistics, several points must be kept in mind. First, comparisons resulting in large t statistics may appear to merit special note. However, this is not always the case because the size of the t statistic depends not only on the observed difference in means or percentages being compared, but also on the standard error of the difference. Thus, a small difference between two groups with a much smaller standard error could result in a large t statistic, but this small difference is not necessarily noteworthy. Second, when multiple statistical comparisons are made on the same data, it becomes increasingly likely that an indication of a population difference is erroneous. Even when there is no difference in the population, at an alpha level of .05, there is still a 5 percent chance of concluding that an observed t value representing one comparison in the sample is large enough to be statistically significant. As the number of comparisons increases, so does the risk of making such an error in inference.

To guard against errors of inference based upon multiple comparisons, the Bonferroni procedure to correct significance tests for multiple contrasts was used. This method corrects the significance (or alpha) level for the total number of contrasts made with a particular classification variable. For each classification variable, there are (K*(K-1)/2) possible contrasts (or nonredundant pairwise comparisons), where K is the number of categories. For example, race-ethnicity has five categories (i.e., American Indian/Alaska Native, Asian/Pacific Islander, Hispanic, White non-Hispanic, and Black non-Hispanic). With K=5, there are 5*(5-1)/2 or 10 possible comparisons among the race-ethnicity categories. The Bonferroni procedure divides the alpha level for a single t test by the number of possible pairwise comparisons in order to provide a new alpha that is corrected for the fact that multiple contrasts are being made.
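The Bonferroni calculation just described can be sketched as follows:

```python
# Sketch of the Bonferroni adjustment described above: K categories yield
# K*(K-1)/2 pairwise contrasts, and the single-test alpha level is
# divided by that count.

def bonferroni(k_categories, alpha=0.05):
    n_contrasts = k_categories * (k_categories - 1) // 2
    return n_contrasts, alpha / n_contrasts

contrasts, adjusted_alpha = bonferroni(5)   # five race-ethnicity categories
# contrasts == 10, adjusted_alpha == 0.005
```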

Table C2.-Unweighted item-response rates for SASS questionnaires, by year
                                                             Percent of items with response  Percent of items with a response
Questionnaire                Range of item-response rates         rate ≥ 90 percent               rate < 75 percent
                           -------------------------------- -------------------------------- ---------------------------------
                             1987-88  1990-91  1993-94       1987-88  1990-91 1993-94        1987-88  1990-91 1993-94
Teacher demand and shortage  40-100%  85-100%   67-100%        74%      90%     91%             12%      0%      1%
Public school principal       70-100   90-100    65-100         86      100      92               2       0       4
Private school principal      72-100   80-100    55-100         89       98      90               2       0       6
Public school                 43-100   56-100    83-100         64       77      83              11       1       0
Private school                11-100   67-100    61-100         56       77      77               8       5       3
Public school teacher         64-100   76-100    71-100         90       84      91               1       0       0
Private school teacher        60-100   71-100    69-100         89       79      89               1       1       1
Table C3.-Decision rules for estimate suppression
For Total Columns:
1. If n < 10, then suppress all totals and counts;
2. If 10 ≤ n < 30, then do
   A. If C.V. of the N < 20%, then report all totals and counts;
   B. If C.V. of the N ≥ 20%, then suppress all totals and counts;
3. If n ≥ 30, then report all totals and counts.
For Percentages and Proportions:
1. If nden ≥ 30, then do
   A. If nnum ≤ 2, then suppress percentage;
   B. If nnum > 2, then report percentage;
2. If 10 ≤ nden < 30, then do
   A. If nnum ≤ 2, then suppress percentage;
   B. If nnum > 2, then do
      1. If C.V. of the denominator < 20%, then report percentage;
      2. If C.V. of the denominator ≥ 20%, then suppress percentage;
3. If nden < 10, then suppress.
Where n = unweighted count for cell,
      N = weighted number,
      NU = universe total for that cell (or the weighted count if not available), and
      C.V. = coefficient of variation for the estimate (i.e., the ratio of
             the standard error to the value of the statistic).

The formula used to compute the t statistic was as follows:

t = (P1 - P2) / sqrt(se1^2 + se2^2)

where P1 and P2 are the estimates to be compared and se1 and se2 are their corresponding standard errors. This formula is valid only for independent estimates. When the estimates were not independent (for example, when comparing the percentages of respondents in different age groups), a covariance term was added to the denominator of the t-test formula. Because the actual covariance terms were not known, it was assumed that the estimates were perfectly negatively correlated. Consequently, 2*(se1*se2) was added under the square root in the denominator of the t-test formula.
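The computation can be sketched as follows; the percentages and standard errors used here are hypothetical.

```python
# Sketch of the t statistic described above: the independent-estimates
# formula, plus the conservative dependent case in which 2*se1*se2 is
# added under the square root (perfect negative correlation assumed).
# The percentages and standard errors below are hypothetical.

import math

def t_statistic(p1, p2, se1, se2, independent=True):
    variance = se1 ** 2 + se2 ** 2
    if not independent:
        variance += 2 * se1 * se2    # conservative covariance adjustment
    return (p1 - p2) / math.sqrt(variance)

t_ind = t_statistic(45.0, 40.0, 1.5, 2.0)                      # -> 2.0
t_dep = t_statistic(45.0, 40.0, 1.5, 2.0, independent=False)   # -> 5/3.5
```

Note that the dependent-case t value is always smaller in magnitude than the independent-case value, so the adjustment makes the significance test more conservative.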

The standard errors were calculated using the WESVAR program, a user-written procedure for the Statistical Analysis System (SAS)./15 This analytic software uses a balanced repeated replications method to calculate standard errors based upon complex survey designs.

Decision Rules for Suppression of Estimates

Estimates based on very small sample sizes are highly sensitive to sampling and measurement error and tend to have large standard errors. Since many of the cross-tabular presentations in this report include cells based on small numbers of respondents, we suppressed estimates based on very small sample sizes and footnoted each cell with the note "Too few cases for a reliable estimate." To protect respondent confidentiality, we also suppressed cells with fewer than three responses. The decision rules for estimate suppression are presented in table C3.

Variable Definitions

Public School District

A public school district (or LEA) was defined as a government agency administratively responsible for providing public elementary and/or secondary instruction and educational support services. The agency or administrative unit was required to operate under a public board of education. Districts that did not operate schools but hired teachers for other districts were included. A district was considered out of scope if it did not employ elementary or secondary teachers of any kind.

Public School

A public school is defined as an institution that provides educational services for at least one of grades 1-12 (or comparable ungraded), has one or more teachers to give instruction, is located in one or more buildings, receives public funds as primary support, has an assigned administrator, and is operated by an education agency. Schools in juvenile detention centers and schools located on military bases and operated by the Department of Defense were included; schools funded by the Bureau of Indian Affairs were not included.

Private School

A private school was defined as a school not in the public system that provides instruction for any of grades 1 through 12 where the instruction was not given in a private home.

Community Type

Community type was derived from the seven-category urbanicity code (locale) developed by Johnson./16 The locale code was based on the school's mailing address matched to Bureau of the Census data files containing population density data, SMSA codes, and a Census code defining urban and rural areas. This approach, first employed during the 1990-91 SASS, is believed to provide a more accurate description of the community than the respondent's self-reported community type used during initial analyses of the 1987-88 SASS./17 For this report, community type for the 1987-88 instruments was updated to reflect the current locale definition. The locale codes were aggregated into three community types.

Central city. A large central city (the central city of an SMSA with population ≥ 400,000 or a population density ≥ 6,000 persons per square mile) or a mid-size city (a central city of an SMSA, but not designated as a large central city).

Urban fringe and large town. This category was defined to include the urban fringe of a large or mid-size city or a large town (a place not within an SMSA, but with a population ≥ 25,000 and defined as urban by the U.S. Census Bureau).

Rural area and small town. This category was defined to include both rural areas (population < 2,500 and defined as rural by the U.S. Census Bureau) and small towns (a place not within an SMSA, with a population of < 25,000 but ≥ 2,500).

School Level

Elementary. A school that had grade 6 or lower, or ungraded, and had no grade higher than the 8th.

Secondary. A school that had no grade lower than the 7th, or ungraded, and had grade 7 or higher.

Combined. A school that had grades higher than the 8th and lower than the 7th. Schools in which students are ungraded (i.e., not classified by standard grade levels) are also classified as combined.

School Size

Less than 150, 150 to 499, 500 to 749, and 750 or more. Size categories were based on the number of students (by head count) who were enrolled in grades K-12 in the school on the Public and Private School Questionnaires.

Private School Typology

Private schools were assigned to one of three major categories, and within each major category, to one of three subcategories. The categories and subcategories are:/18


Region

Four geographic regions corresponding to areas defined by the U.S. Bureau of the Census were employed in the report. The areas and states are defined below.

Highest Degree Earned

Less than bachelor's degree, bachelor's degree, master's degree, education specialist/professional diploma, doctoral degree. The highest degree earned by administrators is a recoding of the various academic degrees reported in the Public and Private School Principal Questionnaires. The education specialist or professional diploma (Ed.S.) typically involves one additional year of study beyond the master's degree.

Years of Teaching Experience

Averages for years of teaching experience prior to becoming a principal include data for principals with no teaching experience. Based on weighted data, 1.1 percent, 1.3 percent, and 1.2 percent of public school principals reported zero years of teaching experience for the respective survey years 1987-88, 1990-91, and 1993-94. The corresponding percentages for private school principals were 11.0, 13.0, and 12.2.

Average Salary

Average salary for public and private school principals is defined as the weighted mean annual salary for the positions, before taxes and deductions, as reported by the survey respondents. Unpaid positions (i.e., annual salary = $0) were excluded from the calculations; all other salaries were included. Based on weighted data, 0.1 percent of public school principals (69) reported receiving no salary for survey year 1987-88. No public school principals reported receiving no salary in years 1990-91 or 1993-94. The corresponding percentages for private school principals were 8.5, 9.9, and 7.4. Respondent-reported salaries, whether for 12 months, 10 months, or other periods, were considered annual salaries in our calculations.

In-kind Benefits

In-kind benefits are a composite of the employment benefits public and private school principals report receiving from their schools. The benefits include free housing or housing contributions, meals including free and reduced price lunches, tuition for children and reimbursement for personal tuition and course fees, child care, and car and transportation expenses. General medical, dental, and life insurance and pension contributions are not included as in-kind benefits.

District Size

Less than 1,000, 1,000 to 4,999, 5,000 to 9,999, and 10,000 or more. This four-category measure of district size is based on the district head count estimates reported in the Teacher Demand and Shortage Questionnaire for Public School Districts.


Race-Ethnicity

Responses to two questions on the Public and Private School Principal Questionnaires determined the race-ethnicity of principals. The first question asked respondents for their race: American Indian or Alaska Native, Asian or Pacific Islander, Black, or White. Respondents were then asked, "Are you of Hispanic Origin?" For this report, the five race-ethnicity categories resulted from combining responses to the two questions. Respondents who indicated they were of Hispanic origin were classified as Hispanic regardless of their race. Respondents who indicated they were not of Hispanic origin were partitioned into one of the four race categories.


Age

Principal age was calculated by subtracting the respondent's year of birth from the base year of the respective survey (i.e., 1987 for the 1987-88 SASS, 1990 for the 1990-91 SASS, and 1993 for the 1993-94 SASS). However, all NCES CD-ROMs (public use and restricted use for all three survey years) calculated respondent age using the later year (1988, 1991, and 1994). Therefore, use of the age variable on the CD-ROMs in constructing age categories will yield results different from those in this report; average age will differ by one year.

Minority Enrollment

Less than 20 percent, 20 to 50 percent, and 50 percent or more. Based on the student demographic information contained in the Public and Private School Questionnaires, the variable is the sum of all racial-ethnic groups other than white non-Hispanic calculated as a percentage of students of all racial-ethnic groups.

New Administrator

"New" administrators were defined as having been administrators for three years or less.


[1] Since 1987, NCES has published several reports that include these instruments (e.g., NCES Report 94-674, SASS and PSS Questionnaires, 1993-94). Copies of the questionnaires may be obtained by writing to the NCES Education Surveys Program at the address given at the end of chapter 1.

[2] Because private and Indian schools do not typically operate under a district-like administrative structure, these instruments also contained several items on personnel policies and administrative practices that were included in the Teacher Demand and Shortage Questionnaire for Public School Districts.

[3] Readers are referred to the following report on Indian schools for additional information: Pavel, D.M., & Curtin, T.R. (1996). Characteristics of American Indian and Alaska Native Education: Results from the 1990-91 and 1993-94 Schools and Staffing Survey [NCES 97-451]. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[4] Additional information on changes in SASS design can be found in Abramson, R., Cole, C., Jackson, B., Parmer, R., & Kaufman, S. (1996). 1993-94 Schools and Staffing Survey: Sample Design and Estimation (Technical Report NCES 96-089). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement, or Jabine, T.B. (1994). Quality Profile for SASS: Aspects of the Quality of Data in the Schools and Staffing Surveys (SASS) (NCES 94-340). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[5] District and school files for the 1987-88 SASS were imputed before they were released. Administrator and teacher files were imputed during 1994, and the imputed values were added to the 1987-88 SASS database.

[6] For a detailed description of the sample design for the 1993-94 SASS, see Abramson et al. (1996).

[7] Gerald, E., McMillen, M., & Kaufman, S. (1993). Private School Universe Survey, 1989-90 [NCES 93-122]. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[8] Broughman, S., Gerald, E. Bynum L.T., & Stoner, K. (1993). Private School Universe Survey, 1991-92 [NCES 94-350]. Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[9] For a detailed description of the weighting process for 1993-94, see Abramson et al. (1996).

[10] See, e.g., Wolter, K.M. (1985). Introduction to Variance Estimation. New York: Springer-Verlag.

[11] Sampling weights are adjusted for instrument nonresponse.

[12] A summary of the data quality for SASS is presented by Jabine (1994).

[13] A discussion of these nonresponse adjustment procedures is presented in the following references: Gruber, K.J., Rohr, C.L., & Fondelier, S.E. (1994). 1990-91 Schools and Staffing Survey: Data File User's Manual (Vol. 1: Survey Documentation) (NCES 93-144-I). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement. Gruber, K.J., Rohr, C.L., & Fondelier, S.E. (1996). 1993-94 Schools and Staffing Survey: Data File User's Manual (Vol. 1: Survey Documentation) (NCES 96-142). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[14] For a description of the imputation procedures, see Abramson et al. (1996) (pp. 90-108) and Gruber et al. (1994) (pp. 71-78).

[15] WESVAR is a proprietary computer program available from Westat, Inc., 1650 Research Boulevard, Rockville, MD 20850.

[16] Johnson, F. (1989). Assigning Type of Local Codes to the 1987-88 CCD Public Schools Universe (Data Series TechnicalReport SP-CCD-87188-7.4). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.

[17] Johnson, F.H. (1993). Comparisons of School Locale Settings: Self-Reported Versus Assigned. American Statistical Association, Proceedings of the Section on Survey Research Methods, 2, 686-691.

[18] For additional information, see McMillen, M., & Benson, P. (1991). Diversity of Private Schools (NCES 92-082). Washington, DC: U.S. Department of Education, Office of Educational Research and Improvement.
