Features of Occupational Programs at the Secondary and Postsecondary Education Levels
NCES: 2001018
June 2001

Appendix A. Methodology and Technical Notes

Surveys and Samples

The tabular statistics in this report present data collected from the 1999 "Survey on Vocational Programs in Secondary Schools" and the 1999 "Survey on Occupational Programs in Postsecondary Education Institutions." The surveys were conducted through the National Center for Education Statistics (NCES) Fast Response Survey System (FRSS) and Postsecondary Education Quick Information System (PEQIS), respectively.

The lists of occupational program areas used in the surveys were developed through an iterative process. First, NCES, in consultation with the Office of Vocational and Adult Education (OVAE, the U.S. Department of Education office that sponsored the surveys), identified vocational occupations from the list of occupations in the Bureau of Labor Statistics' (BLS's) 1998 Occupational Outlook Quarterly (Bureau of Labor Statistics, 1998); vocational occupations were defined as occupations that do not typically require a bachelor's or higher degree, and/or for which secondary schools typically provide vocational education. Second, at OVAE's request, NCES selected from the overall list of vocational occupations a short list of relatively large and fast-growing occupations. These were defined as occupations that the Occupational Outlook Quarterly listed as (1) relatively large (at least 100,000 jobs in 1996) and (2) fast growing (projected to increase in size by 10 percent or 500,000 jobs from 1996 to 2006) or with a shortage of skilled workers in 1996. Subsequent instrument review and pilot testing, however, revealed that both researchers and practitioners preferred to have information on a broader range of occupational areas for which vocational education trains students. Thus, vocational program areas recommended by these individuals were added to the survey instruments, using the occupations listed in the Occupational Outlook Quarterly that most closely corresponded to the recommended program areas. There is one exception to this rule. The occupation of "agriscience technician" was included on the surveys to indicate the preparation provided by agriculture programs, even though this occupation was not listed in the 1998 Occupational Outlook Quarterly.

To select a nationally representative sample of public secondary schools for the FRSS survey, a stratified sample of 1,200 public secondary schools, including 600 vocational schools and 600 comprehensive (regular) schools, was selected from the 1996-97 Quality Education Data (QED) National Education Database. The QED database is compiled from a variety of sources, including the NCES Common Core of Data (CCD) public school universe file. Almost 16,000 comprehensive secondary schools and 1,300 vocational schools met the eligibility requirement for this study; that is, they had 11th and 12th grades. Excluded from the sampling frame were private schools, nonregular schools such as special education and alternative schools, and schools in the outlying U.S. territories.

The coverage of comprehensive public schools in the QED database was equivalent to that of the CCD universe file. However, the QED database appeared to have better coverage of vocational schools than did the CCD file for 1996-97. For example, the counts of vocational schools in the QED file by state were generally higher than the corresponding counts in the CCD file. In particular, the CCD file did not contain any listings of vocational schools in three states (Oklahoma, California, and Kansas). In contrast, the QED file contained numerous vocational schools in these states. For this reason, the QED database was used to develop the sampling frame for the FRSS survey.

For the PEQIS survey, the sample of postsecondary institutions was restricted to 2-year and less-than-2-year institutions that were eligible to participate in federal financial aid programs under Title IV of the Higher Education Act of 1965 (as amended). A stratified random sample of 1,289 institutions was selected, including 689 2-year institutions and 600 less-than-2-year institutions. The sample of 2-year postsecondary institutions was drawn from the Postsecondary Education Quick Information System (PEQIS) panel, which contains a stratified random sample of 2,000 4-year and 2-year postsecondary institutions. The PEQIS panel was constructed from NCES' 1995-96 Integrated Postsecondary Education Data System (IPEDS) Institutional Characteristics file. The PEQIS frame included 5,353 4-year, 2-year, and less-than-2-year institutions of higher education located in the 50 states and the District of Columbia. Only 2-year institutions that were eligible for Title IV financial aid participation were included for selection from the PEQIS panel.

The sampling frame for the supplementary sample of less-than-2-year institutions was the 1996-97 IPEDS Institutional Characteristics file. The institutions eligible for the supplementary sample were all less-than-2-year institutions in the 50 states and the District of Columbia (the same geographic area used for the PEQIS panel) that reported eligibility for Title IV participation. A total of 1,898 institutions met these requirements.

Respondents and Response Rates

For the FRSS survey, questionnaires with letters explaining the purpose of the study were mailed to school principals in early April 1999. The questionnaires were to be completed by the person who was most knowledgeable about vocational education at the school. Telephone followup of nonrespondents was conducted during May and June 1999. Of the 1,200 schools selected for the survey, 50 were found to be out of scope for the study (29 of these were postsecondary institutions). A total of 1,078 eligible schools completed the survey for an overall unweighted response rate of 94 percent. The weighted response rate was 95 percent.

The postsecondary questionnaires were mailed in mid-April 1999 to PEQIS coordinators at 2-year institutions and chief executive officers at less-than-2-year institutions. As with the FRSS survey, the questionnaire was to be completed by the person most knowledgeable about occupational programs at the institution. Telephone followup of nonrespondents started in late May, and data collection ended in early July 1999. Of the 1,289 postsecondary institutions sampled for the study, 103 were out of scope for the study; 57 of these institutions were closed, and 38 did not have Title IV eligibility. The survey was completed by 1,100 2-year and less-than-2-year postsecondary institutions, yielding an overall unweighted response rate of 94 percent. The weighted response rate was also 94 percent.

Sampling and Nonsampling Errors

Survey responses were weighted to produce national estimates. The weights were designed to adjust for the variable probabilities of selection and differential nonresponse. The findings in this report are based on the sample selected and, consequently, are subject to sampling variability.
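The two-step weighting logic described above (inverse-probability base weights, then a nonresponse adjustment) can be sketched as follows. This is an illustrative simplification under assumed inputs, not the actual FRSS or PEQIS weighting specification; the selection probabilities and weighting classes here are hypothetical.

```python
from collections import defaultdict

def base_weight(selection_prob: float) -> float:
    """The design (base) weight is the inverse of the selection probability."""
    return 1.0 / selection_prob

def nonresponse_adjusted_weights(units):
    """units: list of dicts with keys 'prob' (selection probability),
    'responded' (bool), and 'cell' (hypothetical weighting class).
    Within each class, respondents' weights are inflated so that they
    also represent the nonrespondents in that class."""
    sampled = defaultdict(float)   # sum of base weights, all sampled units
    resp = defaultdict(float)      # sum of base weights, respondents only
    for u in units:
        w = base_weight(u["prob"])
        sampled[u["cell"]] += w
        if u["responded"]:
            resp[u["cell"]] += w
    # Final weight for each respondent: base weight times the class
    # adjustment factor (sampled weight total / respondent weight total).
    return [base_weight(u["prob"]) * sampled[u["cell"]] / resp[u["cell"]]
            for u in units if u["responded"]]
```

For example, if two schools in the same class were each selected with probability 0.5 and only one responded, that respondent's base weight of 2 is doubled to 4, so it stands in for both schools.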

The survey estimates are also subject to nonsampling errors that can arise because of nonobservation (nonresponse and noncoverage) errors, errors of reporting, and errors made in data collection. These errors can sometimes bias the data. Nonsampling errors may include such problems as misrecording of responses; incorrect editing, coding, and data entry; differences related to the particular time the survey was conducted; or errors in data preparation. While general sampling theory can be used to determine how to estimate the sampling variability of a statistic, nonsampling errors are not easy to measure and, for measurement and adjustment purposes, usually require that an experiment be conducted as part of the data collection procedures or that data external to the study be used.

A number of actions were taken to minimize nonsampling error. The questionnaire was pretested with respondents like those who completed the survey. During the design of the survey and survey pretest, an effort was made to check for consistency of interpretation of questions and to eliminate ambiguous items. The questionnaire and instructions were extensively reviewed by NCES and the Office of Vocational and Adult Education, U.S. Department of Education. Manual and machine editing of the questionnaire responses was conducted to check the data for accuracy and consistency. Cases with missing or inconsistent items were recontacted by telephone. Data were keyed with 100 percent verification.

Standard Errors and Statistical Tests

The standard error is a measure of the variability of estimates due to sampling. It indicates the variability of a sample estimate that would be obtained from all possible samples of a given design and size. Standard errors are used as a measure of the precision expected from a particular sample. If all possible samples were surveyed under similar conditions, intervals of 1.96 standard errors below to 1.96 standard errors above a particular statistic would include the true population parameter being estimated in about 95 percent of the samples. This is a 95 percent confidence interval. Estimates of standard errors for this report were computed using the jackknife replication method, and confidence intervals were constructed at the 95 percent level. All statistical tests, except those in the "Relationships Among Program Characteristics" section, were based on t-tests conducted at the 95-percent confidence level.
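The jackknife idea and the 1.96-standard-error interval can be illustrated with a minimal delete-one jackknife for a simple random sample. Note that the surveys themselves use jackknife *replication* over a complex stratified design with replicate weights; the version below is only a conceptual sketch.

```python
import math

def jackknife_se(sample, statistic):
    """Delete-one jackknife estimate of the standard error of `statistic`:
    recompute the statistic with each observation left out in turn, then
    measure the spread of those replicate estimates."""
    n = len(sample)
    replicates = [statistic(sample[:i] + sample[i + 1:]) for i in range(n)]
    mean_rep = sum(replicates) / n
    return math.sqrt((n - 1) / n * sum((r - mean_rep) ** 2 for r in replicates))

def ci_95(estimate, se):
    """95 percent confidence interval: estimate +/- 1.96 standard errors."""
    return (estimate - 1.96 * se, estimate + 1.96 * se)
```

For the sample mean under simple random sampling, this jackknife standard error reproduces the familiar s/sqrt(n) formula exactly, which is a useful sanity check.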

Bonferroni adjustments were made to control for multiple comparisons where appropriate. For example, for an "experiment-wise" comparison involving g pairwise comparisons, each difference was tested at the 0.05/g significance level to control for the fact that g differences were simultaneously tested. The Bonferroni adjustment results in a more conservative critical value being used when judging statistical significance. This means that comparisons that would have been significant with a critical value of 1.96 may not be significant with the more conservative critical value. For example, the critical value for comparisons between any two of the six broad program areas is 2.64, rather than 1.96. This means that there must be a larger difference between the estimates being compared for there to be a statistically significant difference.
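The Bonferroni-adjusted critical value can be computed directly. The sketch below assumes a normal (large-sample) reference distribution rather than a t distribution with the surveys' degrees of freedom; with that assumption, g = 6 simultaneous two-tailed tests yield a critical value of about 2.64, matching the figure quoted above, and g = 1 recovers the unadjusted 1.96.

```python
from statistics import NormalDist

def bonferroni_critical_value(g: int, alpha: float = 0.05) -> float:
    """Two-tailed critical value when each of g simultaneous differences
    is tested at the alpha/g significance level (Bonferroni adjustment).
    Uses the standard normal quantile function."""
    return NormalDist().inv_cdf(1 - alpha / (2 * g))
```

As g grows, alpha/g shrinks and the critical value rises, which is exactly why Bonferroni-adjusted comparisons require a larger observed difference to reach significance.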

In the "Relationships Among Program Characteristics" section, the relationships between programs' quality-control structures were tested for statistical significance using the Wilcoxon signed-ranks test on the t-values obtained for each of the 28 program areas (see Darlington, 1975, for a description of the Wilcoxon test). The Wilcoxon test is a relatively powerful nonparametric test; nonetheless, it is less powerful than a parametric test and therefore less likely than a parametric test to detect significant relationships between variables. Thus, the findings in this report based on the Wilcoxon test should be viewed as exploratory.
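A minimal version of the Wilcoxon signed-ranks test applied to a set of t-values might look like the following. This sketch uses the large-sample normal approximation without continuity or tie-variance corrections, so it is an illustration of the technique, not the exact procedure used in the report.

```python
from statistics import NormalDist

def wilcoxon_signed_rank(values):
    """Wilcoxon signed-ranks test of whether `values` (e.g. t-values for
    a set of program areas) are symmetrically distributed about zero.
    Returns (W+, two-tailed p) using the normal approximation."""
    vals = [v for v in values if v != 0.0]          # drop exact zeros
    n = len(vals)
    ranked = sorted(vals, key=abs)
    ranks = {}                                      # abs value -> average rank
    i = 0
    while i < n:
        j = i
        while j < n and abs(ranked[j]) == abs(ranked[i]):
            j += 1                                  # extend the tie group
        avg = (i + 1 + j) / 2                       # mean of ranks i+1 .. j
        for k in range(i, j):
            ranks.setdefault(abs(ranked[k]), avg)
        i = j
    w_plus = sum(ranks[abs(v)] for v in vals if v > 0)
    mu = n * (n + 1) / 4                            # mean of W+ under H0
    sigma = (n * (n + 1) * (2 * n + 1) / 24) ** 0.5 # sd of W+ under H0
    z = (w_plus - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return w_plus, p
```

Because the test uses only the signs and ranks of the values, not their magnitudes, it makes no normality assumption, which is the source of both its robustness and its lower power relative to a parametric test.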

Terms and Variables

For the secondary school survey, a vocational program was defined as a sequence of courses designed to prepare students for an occupation (e.g., nurses' aide) or occupation area (e.g., health care) that typically requires education below the baccalaureate level. Because the focus of the surveys is on preparation for jobs within specific occupations, the definition of vocational programs did not include career exploration or other introductory courses that prepare students for adult life or for work in general (e.g., consumer and homemaking, industrial arts). At the postsecondary level, an occupational program was defined as a sequence of courses designed to prepare students for an occupation (e.g., nurses' aide) that typically requires education below the baccalaureate level. To allow institutions to report noncredit courses, a noncredit occupational program could have included only one course or more than one course.

In theory, a skill competency is defined as a concept, skill, or attitude that is essential to an occupation; a skill standard is the level of attainment or performance established for a skill competency. However, survey pretesting revealed that respondents typically use the term "skill competency" to refer to both competencies and standards. Because these terms tend to be used interchangeably in practice, skill competency was defined in the survey to include both the concept, skill, or attitude that is essential to an occupation and the level of attainment or performance established for that competency.

The term comprehensive school has exactly the same meaning as regular school. Comprehensive, or regular, schools do not focus primarily on special, vocational, or alternative education, although they may offer these programs in addition to the regular curriculum. A vocational school focuses primarily on vocational, technical, or career education and provides education or training in at least one semiskilled or technical occupation.

The main classification variable was school type (vocational, comprehensive) for the secondary school survey, and level of institution (2-year, less-than-2-year) for the postsecondary survey. For secondary schools, school type was determined based on self-reported responses on the FRSS survey. At the postsecondary level, school type was determined based on IPEDS classifications (which were also based on self-report on IPEDS).

For the tabular analyses, several variables were constructed to measure the number of programs offered, skill competencies used for the programs offered, and skill certificates or industry-related credentials available for programs offered. The variables were constructed for all programs offered and for programs offered within the six broad occupation areas examined in the study.

Background Information

Both surveys were conducted under contract with Westat. The secondary survey was conducted using the Fast Response Survey System (FRSS), and the postsecondary survey was conducted using the Postsecondary Education Quick Information System (PEQIS). Westat's Project Director was Elizabeth Farris; Basmat Parsad was the Survey Manager; Ed Heaton, the Systems Analyst; Catherine Marshall, the Text and Graphics Processor; and Carol Litman, the Editor. Bernard Greene was the NCES Project Officer. The data were requested by the Office of Vocational and Adult Education, U.S. Department of Education.

The following individuals reviewed this report:

Outside NCES

  • David Miller, Education Statistics Services Institute
  • Irma Berry, Office of Vocational and Adult Education
  • Vickie Schray, Office of Vocational and Adult Education

Inside NCES

  • Steve Broughman
  • Shelley Burns
  • Frank Johnson
  • Kristin Perry
  • Bruce Taylor

For more information about the surveys, "Survey of Vocational Programs in Secondary Schools" and "Survey of Occupational Programs in Postsecondary Education Institutions,"