This article was originally published as the Executive Summary of the Technical Report of the same name. The sample survey data are from the NCES National Household Education Survey (NHES) and the October Current Population Survey (CPS), conducted by the U.S. Census Bureau.
Introduction
Home schooling in the United States has become a topic of interest to education policymakers, administrators, and the general public. Currently, published estimates of the number of children who are home schooled vary by hundreds of thousands of children and are of uncertain reliability. Informed discussions of home-schooling policy are compromised without accurate estimates of how many children are educated at home and whether the proportion of children who are so educated is changing.

Estimates of the number and proportion of students who were home schooled also vary between two sets of national survey data from the mid-1990s: the October 1994 Current Population Survey (CPS:Oct94) Education Supplement and the 1996 National Household Education Survey (NHES:1996) "Parent and Family Involvement/Civic Involvement" (PFI/CI) component. The point estimates of the number of children ages 6 to 17 who were home schooled ranged from 345,000 in CPS:Oct94 to 636,000 in NHES:1996 (figure A). Taking estimated sampling variance into account, the 95 percent confidence interval around the CPS:Oct94 point estimate ranges from 287,000 to 402,000, and the 95 percent confidence interval around the NHES:1996 point estimate ranges from 515,000 to 757,000. According to CPS:Oct94, 0.8 percent of children were home schooled, and according to NHES:1996, 1.4 percent of children were home schooled. Although the differences between these surveys' estimates may reflect growth in the number and proportion of students who are home schooled, it seems unlikely that the number of home-schooled children nearly doubled in less than 2 years (Lines 1998; Ray 1999).

This report explores differences in survey design and execution that may have contributed to these two different estimates. The report is based on the premise that for any given year, there is some "true" number of home-schooled children in the population. Point estimates derived from CPS and NHES depart from this true value by some amount of error. Errors in surveys include errors of nonobservation, errors of observation, and data processing errors (Groves 1991). After describing the data sources, this report examines each type of error.

Figure A. CPS and NHES point estimates and their 95 percent confidence intervals of the number of home-schooled 6- to 17-year-olds: 1994 and 1996
SOURCE: U.S. Department of Commerce, Bureau of the Census, Current Population Survey (CPS), October 1994. U.S. Department of Education, National Center for Education Statistics, National Household Education Survey (NHES), "Parent and Family Involvement/Civic Involvement" component (PFI/CI), 1996.
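As a rough illustration of the sampling-error reasoning above (not part of the original analysis), the following sketch assumes the published confidence intervals were formed with the usual normal approximation (point estimate plus or minus 1.96 standard errors) and backs out the standard errors implied by the reported interval endpoints. The 1.96 multiplier and the helper function are assumptions added here for illustration only.

```python
# Illustrative sketch: standard errors implied by the published 95 percent
# confidence intervals, assuming a symmetric normal-approximation interval.
# The estimates and interval endpoints are the published CPS:Oct94 and
# NHES:1996 figures cited in the text.

Z95 = 1.96  # normal critical value for a 95 percent confidence interval


def implied_standard_error(lower, upper, z=Z95):
    """Back out the standard error implied by a symmetric confidence interval."""
    return (upper - lower) / (2 * z)


surveys = {
    "CPS:Oct94": {"estimate": 345_000, "ci": (287_000, 402_000)},
    "NHES:1996": {"estimate": 636_000, "ci": (515_000, 757_000)},
}

for name, s in surveys.items():
    se = implied_standard_error(*s["ci"])
    print(f"{name}: estimate {s['estimate']:,}, implied standard error ~{se:,.0f}")

# The two intervals (287,000-402,000 and 515,000-757,000) do not overlap,
# which is why sampling error alone is unlikely to explain the gap between
# the two surveys' estimates.
```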
Data Sources
For decades the U.S. Census Bureau has conducted CPS each month on behalf of the Bureau of Labor Statistics in order to study labor force participation and unemployment. CPS includes a set of basic labor force and demographic questions that are repeated each month and a supplement whose topic varies from month to month. Each October's supplement focuses on participation in education programs for civilians age 3 and older, and in 1994 the October supplement included questions related to home schooling. CPS samples households using addresses from the most recent Decennial Census, and updates to it, as the sampling frame. Each sampled household is part of CPS for 8 months. The 1st- and 5th-month interviews are conducted in person: a Census Bureau interviewer visits the home and conducts the interview with a laptop computer. With the household's permission, the remaining six interviews are conducted by telephone. Interviewers attempt to speak with the most knowledgeable person in the household, although any household member 15 years old or older may serve as the respondent. Respondents answer questions regarding all household members.

The National Center for Education Statistics (NCES) has conducted NHES five times since the first administration in 1991. NCES uses NHES to collect data on education issues on which households, rather than education institutions, are best able to provide information. Each time NHES is fielded, a Screener interview is used to determine whether the household includes members who are eligible for either of two extended topical interviews. In 1996, one of these interviews, the PFI/CI component, included questions on children's schooling, including home schooling. The PFI/CI component sampled children from age 3 through 12th grade, with a maximum age of 20. NHES is a random-digit-dial (RDD) telephone survey; that is, it samples households via telephone numbers. Interviewers in telephone centers use computer-assisted telephone interviewing (CATI) to conduct interviews from January through April of the administration year. Interviewers ask to speak with a household member at least 18 years old, who responds to the Screener questions. In 1996, once the interviewer determined through the Screener that a child in the household was eligible for the PFI/CI, the interviewer asked to speak with the parent or guardian who knew the most about the sampled child's care and education.

Impact of Nonobservation Errors
Errors of nonobservation occur when members of the target population are excluded from the sampling frame or when sampled members of the population fail to participate in the survey or some part thereof. This report discusses both of these sources of nonobservation error: sample coverage and nonresponse.
Sample coverage
Both the CPS and NHES sampling frames undercover some groups within the U.S. noninstitutionalized population, although each undercovers different segments. The Census Bureau estimates that CPS undercovers between 7 and 13 percent of infants through 19-year-olds in the population. Among children, males, blacks, and older children are more likely than females, nonblacks, and younger children to be missed. Sampling weights adjust for undercoverage with respect to these demographic characteristics, but to the extent that undercovered groups home school at rates different from the general population, these weights may not eliminate error in estimates related to home schooling. However, because home schooling is a rare event and the rates of undercoverage are low, even if the relatively small undercovered groups were home schooled at rates considerably higher or lower than the general population, the error in the estimates would be small.

NHES has two primary sources of undercoverage: the exclusion of nontelephone households and the exclusion of some residential telephone numbers due to the particular method of random digit dialing used to sample households. CPS:Oct94 data indicate that approximately 6 percent of households did not have telephones. Sampling weights adjust NHES estimates to population controls derived from the Census, and therefore adjust for the undercoverage of households without telephones. CPS:Oct94 data indicate that children in nontelephone households were home schooled at the same rate as children in telephone households, and therefore there is no evidence of error due to the exclusion of nontelephone households. To reduce costs, NHES uses the list-assisted method of random digit dialing, and studies of this method indicate that 3 to 4 percent of residential telephone numbers are excluded from the sampling frame when it is used. It is not possible to determine empirically whether children in these households are more or less likely to be home schooled than are children in included households. However, the rate of home schooling is generally low and the proportion of excluded households is small. Therefore, even if the rate of home schooling were considerably different among excluded households compared with included households, the potential error in the estimated number and percentage of home-schooled children would be small.

Although there is some potential for error in the studies' sampling frames, neither of the studies' sample designs appears to be biased. Both studies sample randomly from households within their frames and oversample some minority groups to collect sufficient data for reliable estimates concerning those groups. NHES:1996 PFI/CI randomly sampled children within households, depending on the number of children who were eligible to participate within a household.
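To make the "small undercoverage, small bias" argument concrete, the sketch below uses the simple decomposition of an overall rate into covered and excluded parts. The 4 percent exclusion figure and the 1.4 percent home-schooling rate come from the text above; the alternative rates tried for excluded households are hypothetical values chosen only to show the size of the possible effect, not estimates from either survey.

```python
# Illustrative sketch: how much could frame undercoverage move the estimated
# home-schooling rate? The true overall rate can be written as
#   (1 - p_excluded) * rate_covered + p_excluded * rate_excluded,
# so the bias from observing only covered households is
#   p_excluded * (rate_excluded - rate_covered).

def coverage_bias(p_excluded, rate_covered, rate_excluded):
    """Difference between the true overall rate and the covered-only rate."""
    return p_excluded * (rate_excluded - rate_covered)


rate_covered = 0.014   # NHES:1996 estimate of 1.4 percent (from the text)
p_excluded = 0.04      # ~3-4 percent of residential numbers excluded by list-assisted RDD

# Hypothetical rates among excluded households (assumptions, not survey data):
for rate_excluded in (0.0, 0.014, 0.042):  # no home schooling, same rate, triple the rate
    bias = coverage_bias(p_excluded, rate_covered, rate_excluded)
    print(f"excluded-household rate {rate_excluded:.1%}: bias {bias:+.3%} points")

# Even tripling the rate among the roughly 4 percent of excluded households
# shifts the overall percentage by only about a tenth of a percentage point.
```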
Response rates
Response rates were calculated at three levels for each survey: household, supplement or extended interview, and item. The CPS:Oct94 household response rate (94 percent) was considerably higher than the NHES:1996 Screener response rate (70 percent). The lower household response rate in NHES allows for the possibility that home-schooling families, who may not wish to be identified or involved in government-related research (Kaseman and Kaseman 1991), participated at a lower rate than other families. However, because families with children in grades K-12 make up approximately 30 percent of U.S. households, only about 9 percent of households (roughly 30 percent of the 30 percent that did not respond), rather than the entire 30 percent, might include children in the desired age/grade range who were home schooled.

At the second level (the supplement in CPS and the PFI/CI interview in NHES), CPS again had a higher response rate than NHES (97 percent compared with 89 percent). In both surveys the item response rates were high for the items used in these analyses. Among the items that identify home-schooled children in CPS:Oct94, all had response rates of at least 92 percent, and nearly all relevant items in NHES:1996 had item completion rates approaching 100 percent. It appears, therefore, that families who participated in the surveys were not unwilling to discuss home schooling.

However, because missing data for many items were not imputed in the CPS:Oct94 data set, some cases had to be excluded from the CPS analyses because it was not possible to determine whether they met the criteria that defined the sample or whether they were home schooled. The excluded cases represented about 2 million of the 46 million 6- to 17-year-olds in the United States. If the excluded children were home schooled at the same rate as children who were included, approximately 30,000 additional children would be home schooled. However, the characteristics of excluded children, especially age, suggest that excluded children may well be home schooled at a lower rate than included children. Thus, although missing data may bias the CPS:Oct94 estimate, they are not likely to affect it greatly. The effect of the lower NHES:1996 household response rate cannot be estimated.
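The back-of-the-envelope reasoning about the NHES Screener nonresponse can be written out explicitly. The sketch below simply restates the arithmetic in the paragraph above (30 percent household nonresponse, roughly 30 percent of households containing K-12 children); treating the two as independent is an assumption made only for this illustration.

```python
# Illustrative sketch of the nonresponse arithmetic described above.
# Assumes, for illustration, that having K-12 children is roughly independent
# of whether a household responded to the NHES:1996 Screener.

screener_response_rate = 0.70       # NHES:1996 household (Screener) response rate
share_households_with_k12 = 0.30    # approximate share of U.S. households with K-12 children

nonresponse = 1 - screener_response_rate
nonresponding_with_children = nonresponse * share_households_with_k12

print(f"Nonresponding households:                    {nonresponse:.0%} of all households")
print(f"Nonresponding and containing K-12 children: ~{nonresponding_with_children:.0%} of all households")

# So roughly 9 percent of households, not the full 30 percent that did not
# respond, could even contain children in the age/grade range of interest.
```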
Impact of Observation Errors

Observation errors can be introduced by data collection procedures, survey instruments, and respondents.
Data collection
The surveys differ with respect to data collection procedures in at least three ways. First, although both surveys are conducted with computer-assisted interviewing (CAI), CPS interviewers use both computer-assisted personal interviewing (CAPI) and computer-assisted telephone interviewing (CATI), whereas NHES interviews are conducted entirely by telephone. Whether and how personal interviewing, compared with telephone interviewing, might produce different results with respect to home schooling is unknown. Second, CPS is a panel survey, whereas NHES is not. The effects of this difference, aside from potential differences in response rates (which were examined separately), cannot be assessed with available data. Third, the surveys differ with respect to timing. In addition to the 15- to 18-month span between the surveys' administrations, the two surveys differ in the time of year at which they were administered: NHES is administered from January through April, in contrast to the October administration of the CPS Education Supplement. To the degree that parents are more likely to home school their children at some times of the year than at others, the difference in survey timing may contribute to the difference between the estimates.
Instrument error
This report examines question wording, question sequencing, respondent fatigue, and the location of home-schooling items as potential sources of instrument error.

Question wording and sequencing. The questions regarding home schooling were worded differently between the two studies and even among interviews in CPS:Oct94. CPS:Oct94 interviews varied depending on the age and enrollment status of the person about whom the interview was being conducted. Regarding enrolled children, the question as to whether the child was "schooled primarily at home" allows for the possibility that children who were schooled partly at home and partly at school were not identified as home schooled. NHES:1996 PFI/CI interviewers first asked whether children were enrolled and then asked, regardless of enrollment status, whether children were schooled at home. When respondents indicated that a child was home schooled, the interviewer clarified the response by asking whether the child was schooled at home "instead of at school." It is not clear how parents who schooled their children partly at home and partly at school might have responded to these items. In addition to the difference in wording discussed above, the number of items and the complexity of their sequence are considerably greater in CPS:Oct94 than in NHES, creating more opportunities for missing or inaccurate responses. Although the greater number of items and the complexity of sequencing in CPS:Oct94 do not appear to have affected response rates, which were consistently high, whether they affected the quality of responses cannot be determined with the available data.

Respondent fatigue. When surveys become too long, respondents often begin to tire or lose interest, a phenomenon known as "respondent fatigue." As a consequence, questions near the end of a long survey often have higher rates of nonresponse, and responses to these questions can be less accurate than responses to questions near the beginning of the survey. The issue of respondent fatigue is addressed because the CPS Education Supplement questions regarding children's schooling occur near the end of the interview, after the basic labor force and supplement items for adults are asked. In contrast, the NHES items regarding children's schooling occur at the very beginning of the PFI/CI interview. It appears unlikely that this difference has affected these data. As noted above, the response rates for the supplement items regarding home schooling are high, which indicates that fatigue did not affect response rates greatly. In addition, in CPS:Oct94, household interviews that included supplement interviews for children ages 6 to 17 averaged 15 minutes in length. Given this relatively short duration, fatigue is not likely to have been a problem. However, whether fatigue occurred and affected the quality of responses cannot be determined with these data.*
Respondent error
Respondents' knowledge of the survey topic affects their ability to answer questions accurately. Therefore, respondents' relationships to the children about whom the home-schooling questions were asked may affect the accuracy of their answers. In addition, the political/legal and cognitive contexts within which questions are asked and answered may affect respondents' answers.

Respondents' relationships to children. The CPS:Oct94 respondents could be different from the respondents to the NHES PFI/CI interviews because the instructions given to interviewers for choosing respondents differed between the two surveys. In CPS:Oct94, any household member 15 years old or older was eligible to respond for all household members, although interviewers were instructed to interview the most knowledgeable adult in the household if possible. In the NHES:1996 PFI/CI, interviewers asked to speak to the parent or guardian who knew the most about the sampled child's education, and respondents were required to be at least 18 years old. It is not possible to establish empirically whether and how the respondents for the two studies differed. Although data regarding the relationship of the respondent to the child are available for all children in NHES, these data are available only for 15- to 17-year-olds in CPS. The available data indicate that parents were the most frequent respondents in both surveys, and it seems quite likely that if parents were the most common respondents for 15- to 17-year-olds in CPS:Oct94, they would also be so for younger children.

Political/legal and cognitive contexts. The political/legal and cognitive contexts within which surveys are conducted can affect respondents' answers to particular questions. Home-schooling researchers have suggested that home-schooling families may be more reticent than others to participate in government research, particularly research that might address the issue of home schooling, because of the often ambiguous legal status of home schooling (Kaseman and Kaseman 1991; Ray 1997). On the other hand, to the degree that in recent years parents have become more interested in home schooling and in working with schools and districts to facilitate it, there may be less reason for concern in this regard. The household- and item-level response rates provide relevant but conflicting evidence. The household response rate for NHES:1996 (which respondents were told was sponsored by the U.S. Department of Education and concerned education issues) was lower than the corresponding rate for CPS:Oct94 (which was conducted by the Census Bureau and which respondents were told covered labor force participation issues). This is consistent with the hypothesis that home-schooling parents may be more reluctant to discuss education issues, although the impact of the lower household response rate is somewhat mitigated because about 30 percent of households, not 100 percent, are likely to include school-aged children. The high item response rates in both surveys indicate that respondents in participating households were no less likely to discuss home schooling than other issues. Unfortunately, whether the political/legal context of home schooling affected the quality of responses cannot be determined with the existing data. The cognitive context may also have been affected by the different sponsors and purposes of the two surveys.
In general, participating respondents want to cooperate with interviewers, and in their attempts to do so, they use all available information to determine what the interviewer wants to know so that they can provide the best answers. Therefore, respondents are likely to have considered the different sponsors when they responded to questions, although any particular effects of these considerations on their responses cannot be predicted or measured.

Impact of Data Processing Errors
Whereas the NHES:1996 PFI/CI interview included online edits and all NHES:1996 data were edited after data collection concluded, the CPS:Oct94 supplement did not include online edits, and the home-schooling items were not edited after data collection. As noted above, without editing, some cases could not be included in the CPS:Oct94 analysis due to missing information. Furthermore, not correcting errors that could have been identified through consistency and plausibility checks in the CPS data may have contributed additional error to the CPS:Oct94 estimates relative to the NHES:1996 estimates. The available data do not permit estimation of the direction or magnitude of this potential error.

Conclusion
This report examines several differences between the methods used in CPS:Oct94 and NHES:1996 that may have contributed to the observed difference in the two surveys' estimates of the number and proportion of home-schooled children. For most of these methodological differences, however, the potential direction and magnitude of their effects on the estimates could not be predicted.

This report raises a number of research questions regarding survey research and home schooling. First, it would be useful for researchers to address whether and how the political context of home schooling or other factors affect respondents' willingness to participate in the respective surveys and the accuracy of their answers to questions about home schooling. Second, research should explore the variety of schooling arrangements that parents make for their children (exclusively at home, exclusively at school, and various combinations thereof), the frequency of these arrangements, and the factors that affect the kind of arrangement parents choose. Third, the results of cognitive laboratory research into parents' understanding of the term "home schooling" would aid in the interpretation of responses to survey questions. Future research, using NHES:1999 data or cognitive laboratory studies of alternative question wording, for example, may address some of the issues raised in this report.
Footnotes
*Although the NHES:1996 PFI/CI interviews were longer (19 minutes in addition to the 6-minute Screener interview), the home-schooling questions were asked at the beginning of the extended interview and are thus relatively safe from the effects of respondent fatigue.
References
Groves, R.M. (1991). Measurement Errors Across the Disciplines. In P.B. Biemer, R.M. Groves, L.E. Lyberg, N.A. Mathiowetz, and S. Sudman (Eds.), Measurement Errors in Surveys (pp. 1-25). New York: John Wiley and Sons.

Kaseman, S., and Kaseman, L. (1991). Does Homeschooling Research Help Homeschooling? Home Education Magazine, 8(1): 26-27, 46-49.

Lines, P.M. (1998). Homeschoolers: Estimating Numbers and Growth. U.S. Department of Education. Washington, DC: Office of Educational Research and Improvement.

Ray, B.D. (1997). Strengths of Their Own. Salem, OR: National Home Educational Research Institute Publications.

Ray, B.D. (1999). Facts on Home Schooling. Salem, OR: National Home Educational Research Institute Publications. Available: http://www.nheri.org/98/research/general.html