American Community Survey (ACS)
The Census Bureau introduced the American Community Survey (ACS) in 1996. Fully implemented in 2005, it provides a large monthly sample of demographic, socioeconomic, and housing data comparable in content to the Long Form of the Decennial Census. Aggregated over time, these data will serve as a replacement for the Long Form of the Decennial Census. The survey includes questions mandated by federal law, federal regulations, and court decisions.
Since 2005, the survey has been mailed to approximately 250,000 addresses in the United States and Puerto Rico each month, or about 2.5 percent of the population annually. A larger proportion of addresses in small governmental units (e.g., American Indian reservations, small counties, and towns) also receive the survey. The monthly sample size is designed to approximate the ratio used in the 2000 Census, requiring more intensive distribution in these areas. The ACS covers the U.S. resident population, which includes the civilian noninstitutionalized population, those incarcerated, those otherwise institutionalized, and active duty military personnel in the United States. In 2006, the ACS began interviewing residents in group quarters facilities. Institutionalized group quarters include adult and juvenile correctional facilities, nursing facilities, and other health care facilities. Noninstitutionalized group quarters include college and university housing, military barracks, and other noninstitutional facilities such as workers' and religious group quarters and temporary shelters for the homeless.
National-level data from the ACS are available starting with the year 2000. Under the current timetable, annual results were or will be available for areas with populations of 65,000 or more beginning in the summer of 2006; for areas with populations of 20,000 or more in the summer of 2008; and for all areas—down to the census tract level—by the summer of 2010. This schedule is based on the time it will take to collect data from a sample size large enough to produce accurate results for different size geographic units.
Common Core of Data (CCD)
The NCES Common Core of Data (CCD), the Department of Education's primary database on public elementary and secondary education in the United States, is a comprehensive, annual, national statistical database of information concerning all public elementary and secondary schools (approximately 97,000) and school districts (approximately 18,000). The database contains data that are designed to be comparable across all states. The CCD consists of five surveys that state education departments complete annually from their administrative records. The database includes a general description of schools and school districts; data on students and staff, including demographics; and fiscal data, including revenues and current expenditures.
Early Childhood Longitudinal Study, Birth Cohort (ECLS-B)
The Early Childhood Longitudinal Study, Birth Cohort (ECLS-B) was designed to provide detailed information on children's development, health, and early learning experiences in the years leading up to and including entry into school. The ECLS-B is the first nationally representative study within the United States to directly assess children's early cognitive and physical development, the quality of their early care and education settings, and the contributions of their fathers, as well as their mothers, to their lives. The children participating in the ECLS-B were followed from birth through entry into kindergarten. Information was collected from children and their parents during multiple rounds of data collection: rounds were conducted when the children were about 9 months old (2001); about 2 years old (2003); about preschool age, or about 4 years old (2005); and in kindergarten (2006-2007). Data were collected on a nationally representative sample of 14,000 children born in 2001. Their experiences are representative of the experiences of the approximately 4 million children born in the United States in 2001.
In the data collections when the children were 9 months, 2 years, and of preschool age, parents were asked about themselves, their families, and their children; fathers were asked about themselves and their roles in their children's lives; and children were observed and participated in assessment activities. Trained assessors visited children in their homes. With the parents' permission, children participated in activities designed to measure important developmental skills in the cognitive, language, social, emotional, and physical domains. Trained assessors also conducted a computer-assisted interview with the sampled child's primary caregiver, most frequently the mother. In addition, when the children were about 2 years old and in preschool (about 4 years old), early care and education providers were asked to provide information about their own experience and training as well as information about the setting's learning environment. With the permission of the child's parents, trained staff conducted computer-assisted telephone interviews with the individuals and organizations that provided regular care for the child. For home-based care settings, the primary provider was interviewed about the care setting and the sampled child's experiences there. For center-based care programs, the center director was first interviewed for general information about the program; the sampled child's primary provider in the center was then interviewed about the group care environment and the child's experiences. Child care settings were subsampled, then observed and rated.
Each variable corresponds with the year of the estimate for that variable: estimates for 9-month-olds, 2-year-olds, and 4-year-olds reflect the percentages of children with the given characteristic at the time of the 9-month, 2-year, and 4-year data collections, respectively. Estimates from the 9-month wave of collection use the cross-sectional weight W1R0; estimates from the 2-year wave use the cross-sectional weight W2R0; and estimates from the preschool (4-year) wave use the cross-sectional weight W3R0.
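The weighting described above amounts to computing weighted percentages within each wave. The sketch below illustrates the calculation with hypothetical records carrying a W1R0-style weight; the variable names and values are illustrative, not actual ECLS-B data.

```python
# Illustrative sketch (hypothetical data): computing a weighted percentage
# using a cross-sectional weight such as the ECLS-B's W1R0.
def weighted_percentage(records, has_characteristic, weight_key):
    """Percentage of children with a characteristic, weighted by weight_key."""
    total = sum(r[weight_key] for r in records)
    matching = sum(r[weight_key] for r in records if has_characteristic(r))
    return 100.0 * matching / total

# Hypothetical 9-month-wave records; "in_center_care" is an invented flag.
sample = [
    {"in_center_care": True,  "W1R0": 120.5},
    {"in_center_care": False, "W1R0": 95.0},
    {"in_center_care": False, "W1R0": 84.5},
]
pct = weighted_percentage(sample, lambda r: r["in_center_care"], "W1R0")
```

The same function would apply to the 2-year and 4-year waves by substituting W2R0 or W3R0 as the weight key.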
For indicator 2, family type categories were collapsed as follows: two parents (includes biological mother and biological father or biological mother and other father [step-, adoptive, foster] or biological father and other mother [step-, adoptive, foster] or two adoptive parents); single parent (includes biological mother only or biological father only or single adoptive parent or adoptive parent and stepparent); and other parent type (includes related guardian(s) or unrelated guardian(s)). "Adoptive parent and stepparent" is included in the "single parent" category because, in the ECLS-B, "single adoptive parent" and "single adoptive plus step-parent" are collapsed into one category, and in almost all cases it is only a single adoptive parent.
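The collapsing scheme above is a many-to-one recode. The sketch below expresses it as a lookup table; the detailed category labels are paraphrased from the text, not the exact ECLS-B variable codes.

```python
# Sketch of the indicator 2 family-type collapsing described above.
# Detailed labels are paraphrased; actual ECLS-B codes may differ.
COLLAPSED = {
    "biological mother and biological father": "two parents",
    "biological mother and other father":      "two parents",
    "biological father and other mother":      "two parents",
    "two adoptive parents":                    "two parents",
    "biological mother only":                  "single parent",
    "biological father only":                  "single parent",
    "single adoptive parent":                  "single parent",
    # Included with "single parent" because the ECLS-B collapses this
    # category with "single adoptive parent".
    "adoptive parent and stepparent":          "single parent",
    "related guardian(s)":                     "other parent type",
    "unrelated guardian(s)":                   "other parent type",
}

def collapse_family_type(detailed):
    """Map a detailed family-type label to its collapsed category."""
    return COLLAPSED[detailed]
```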
For indicator 3, parents participating in the ECLS-B were asked whether they currently had regular early care and education arrangements for their child, and, if so, how many hours per week their child spent in that setting. The primary care arrangement is the type of nonparental care and education in which the child spent the most hours. If a child spent equal time in each of two or more types of arrangements, care was coded as "multiple care arrangements." Children with no regular nonparental care arrangements were coded into the "no child care" category. "Regular" refers to arrangements that occurred on a routine schedule (i.e., occurring at least weekly or on some other schedule), not including occasional babysitting or "back-up" arrangements. "Relative care" refers to care provided in the child's home or in another private home by a relative (excluding parents). "Nonrelative care" refers to care provided in the child's home or in another private home by a person unrelated to the child. "Head Start" refers to services received at a public or private school, religious center, or private home, as reported by the parent. "Center-based care" refers to care provided in places such as early learning centers, nursery schools, and preschools. Information about Head Start enrollment was not obtained until the 2- and 4-year follow-ups. For 2-year-olds, "Head Start" is included with other types of center-based care because few children were in Head Start at the time of the 2-year follow-up. Separate estimates are provided for 4-year-olds enrolled in either "Head Start" or "Other center-based care."
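The coding rules above (most hours wins; ties become "multiple care arrangements"; no arrangements become "no child care") can be sketched as follows. The function name and the shape of the input are assumptions for illustration only.

```python
# Sketch of the primary care arrangement coding described above.
# hours_by_type is a hypothetical mapping from care type to weekly hours.
def primary_care_arrangement(hours_by_type):
    """Code the nonparental arrangement type with the most weekly hours."""
    # Keep only arrangement types the child actually spends time in.
    active = {t: h for t, h in hours_by_type.items() if h > 0}
    if not active:
        return "no child care"
    top = max(active.values())
    leaders = [t for t, h in active.items() if h == top]
    # Equal time across two or more types is coded as multiple arrangements.
    if len(leaders) > 1:
        return "multiple care arrangements"
    return leaders[0]
```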
Children, their parents, their child care providers, their teachers, and their school administrators provided information on children's cognitive, social, emotional, and physical development across multiple settings (e.g., home, child care, school). A child's age at the time of the assessment may be related to certain child and family characteristics (e.g., certain groups of children may be older when assessed in a given wave). Thus, it is appropriate to analyze the ECLS-B cognitive and motor data in view of a child's age at the time of the assessment, as is done for indicator 3. The ECLS-B assessment provides information on the probability that a child would have achieved proficiency in a selected set of skills. The probabilities of proficiency are expressed as percentages.
Cognitive skills assessed at the 9-month data collection included
Motor skills assessed at the 9-month data collection included
Cognitive skills assessed at the 2-year data collection included
Motor skills assessed at the 2-year data collection included
Skills assessed at the 4-year data collection are classified as literacy and language skills, mathematics knowledge and skills, color identification, and fine motor skills. With the exception of fine motor skills, each of those skills is considered to be a cognitive skill.
Literacy and language skills assessed at the 4-year data collection included
Mathematics skills assessed at the 4-year data collection included
Color identification assessed at the 4-year data collection included
Fine motor skills assessed at the 4-year data collection included
Further information about the ECLS-B can be found at http://nces.ed.gov/ecls/birth.asp.
Integrated Postsecondary Education Data System (IPEDS)
The Integrated Postsecondary Education Data System (IPEDS) is the core program that the National Center for Education Statistics (NCES) uses for collecting data on postsecondary education. IPEDS is a single, comprehensive system that encompasses all identified institutions whose primary purpose is to provide postsecondary education. Before IPEDS, some of the same information was collected through the Higher Education General Information Survey (HEGIS). Indicator 43 uses data from HEGIS.
IPEDS consists of institution-level data that can be used to describe trends in postsecondary education at the institution, state, and/or national levels. For example, researchers can use IPEDS to analyze information on (1) enrollments of undergraduates, first-time freshmen, and graduate and first-professional students by race/ethnicity and sex; (2) institutional revenue and expenditure patterns by source of income and type of expense; (3) completions (awards) by type of program, level of award, race/ethnicity, and sex; (4) characteristics of postsecondary institutions, including tuition, room and board charges, and calendar systems; (5) status of career and technical education programs; and (6) other issues of interest.
Participation in IPEDS was a requirement for the 6,800 institutions that participated in Title IV federal student financial aid programs, such as Pell Grants or Stafford Loans, during the 2007-08 academic year. Title IV institutions include traditional colleges and universities, 2-year institutions, and for-profit degree- and non-degree-granting institutions (such as schools of cosmetology), among others. Each of these three categories is further disaggregated by financial control (public, private not-for-profit, and private for-profit), resulting in nine institutional categories, or sectors. In addition, 84 administrative offices (central and system offices) listed in the IPEDS universe were expected to provide minimal data through a shortened version of the Institutional Characteristics component. Four of the U.S. service academies are included in the IPEDS universe as if they were Title IV institutions. Institutions that do not participate in Title IV programs may participate in the IPEDS data collection on a voluntary basis.
National Household Education Surveys Program (NHES)
The National Household Education Surveys Program (NHES), conducted in 1991, 1993, 1995, 1996, 1999, 2001, 2003, 2005, and 2007, collects data on educational issues that cannot be addressed by school-level data. Each survey collects data from households on at least two topics, such as adult education, early childhood program participation, parental involvement in education, and before- and after-school activities.
NHES surveys the civilian, noninstitutionalized U.S. population in the 50 states and the District of Columbia. Interviews are conducted using computer-assisted telephone interviewing. Data are collected from adults and occasionally from older children (grades 6-12). Whether older or younger children are sampled, data about them are collected from the parent or guardian who is most knowledgeable.
Although NHES is conducted primarily in English, provisions are made to interview persons who speak Spanish. Questionnaires are translated into Spanish, and bilingual interviewers, who are trained to complete the interview in either English or Spanish, are employed. NHES only conducts interviews in English and Spanish, so if no respondent in the household can speak at least one of these two languages, then the interview is not completed.
Indicator 18 excludes homeschooled students.
Over time, NHES has had different response options for race/ethnicity. In 1991 and 1995, the response options were limited to White, Black, Hispanic, Asian/Pacific Islander, American Indian/Alaska Native, and Other. In 1999 and 2001, the response options were White, Black, Hispanic, Asian/Pacific Islander, American Indian/Alaska Native, Other, and More than one race. In addition to these categories, in 2005 and 2007, Asian and Native Hawaiian or other Pacific Islander were separated into two race options. These categories are presented in indicator 18. For more information on race/ethnicity, see supplemental note 1.
Indicators 6, 18, 30, and 32 use data from the NHES. Further information about the program is available at http://nces.ed.gov/nhes/.
Open Doors International Student Census
About the Annual Census of International Students
Since its founding in 1919, the Institute of International Education (IIE) has conducted an annual census of international students in the United States. For the purposes of the Census, an international student is defined as an individual who is enrolled for courses at a higher education institution in the United States on a temporary visa, and who is not an immigrant (permanent resident with an I-151 or "Green Card"), a citizen, an illegal alien (undocumented immigrant), or a refugee. The data collection process changed in 1974-75; thus, refugees were counted from 1975-76 to 1990-91. After 1990-91, refugees were no longer counted. Since Open Doors 2004, individuals participating in Optional Practical Training (OPT) have been counted separately, although they are still included in the totals since these individuals are considered students in the Department of Homeland Security's Student and Exchange Visitor Information System (SEVIS). For more information on OPT, see http://www.uscis.gov/.
The International Student Census is made available to respondents as a detailed survey downloadable on the Open Doors website (http://opendoors.iienetwork.org/), along with detailed instructions and institutional codes. For Open Doors 2008, the Census was administered in fall 2007 to 2,657 institutions, with follow-ups continuing through summer 2008. Closed institutions and long-term nonrespondents were excluded. Some 1,714 institutions responded to the survey for a 64.5 percent response rate. The response rate was obtained through four rounds of mailings, as well as several rounds of email and telephone follow-ups by IIE, with the assistance of NAFSA: Association of International Educators (formerly known as the National Association of Foreign Student Advisers) and the American Association of Community Colleges (AACC). Although response rates have declined somewhat in recent years, the response rates remain very high for a voluntary survey. These declines parallel the introduction of other campus-based data collection on international enrollments, in particular the phasing in of mandatory campus reporting to SEVIS. When compared with SEVIS totals that have been adjusted for differences in the data collection schedule and response categories, Open Doors figures are closely congruent.
Some 1,648 institutions (96 percent of responding institutions) reported enrolling international students in 2007-08. Of these, 1,614 institutions (98 percent) provided detailed information on student characteristics. Key variables including students' place of origin, field of study, academic level, sex, and enrollment status had response rates ranging from 89 to 93 percent.
Fields of Study
The fields of study used in this book are from A Classification of Instructional Programs (CIP), 2000, published by the National Center for Education Statistics (NCES) of the U.S. Department of Education. For detailed information about CIP codes, see http://www.nces.ed.gov/pubs2002/cip2000/. See also supplemental note 9. In addition to the CIP 2000 codes, IIE created a separate category for Intensive English Programs. Optional Practical Training (OPT) has also been listed as a separate IIE category since Open Doors 2005.
Imputation and Estimation
For Open Doors, total international student enrollments and the various percentages cited are calculated directly from campus-based survey responses. Other student counts are determined by IIE using imputation, since not all campuses are able to provide detailed breakdowns for all variables, such as place of origin and field of study. Estimates of the number of students for each of the variables collected by the various surveys are imputed from the total number of students reported. For each imputation, base or raw counts are multiplied by a correction factor that reflects the ratio of the total number of students reported by the institutions to the sum of the categories being imputed. For this reason, student totals may vary slightly within a given year. While most institutions report academic level breakdowns by place of origin, others are unable to do so. Open Doors does not adjust further for this discrepancy and uses the overall academic level breakdowns, not the academic level by place of origin, as the basis for calculating changes from year to year and for analyses. In addition, due to rounding, percentages may not always add up to 100 percent (regardless of whether or not numbers are imputed). Estimates from Open Doors may also differ from those derived from the Integrated Postsecondary Education Data System (IPEDS) because of differences in data collection and categorization procedures. See the preceding section on IPEDS for more information.
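This ratio-based imputation amounts to scaling raw category counts so they sum to the reported total. The sketch below illustrates the arithmetic with hypothetical numbers; the category names and totals are invented for the example, not actual Open Doors figures.

```python
# Sketch of the Open Doors ratio imputation described above: raw category
# counts are multiplied by (reported total / sum of raw counts) so that the
# imputed categories sum to the total reported by institutions.
def impute_categories(raw_counts, reported_total):
    """Scale each raw category count by reported_total / sum(raw_counts)."""
    factor = reported_total / sum(raw_counts.values())
    return {category: count * factor for category, count in raw_counts.items()}

# Hypothetical example: institutions report 1,000 students in total, but
# field-of-study breakdowns sum to only 800.
raw = {"Engineering": 320, "Business": 280, "Other": 200}
imputed = impute_categories(raw, 1000)
```

Because each variable is scaled independently against its own reported subtotal, imputed student totals can differ slightly across variables within a given year, as the text notes.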
The data collection methodology was designed to produce stable, national estimates of international education activity. Analyses of units that reflect relatively small numbers of students (such as certain places of origin, fields of study, or sources of financial support), and especially units that are cross-tabulated by other variables, may reflect greater error variation than variables with a larger response base. In addition, to account for potential instability in annual institution-level counts, estimates based on counts from the previous reporting year are sometimes used to account for nonreporting institutions that have a history of reporting to the Open Doors surveys and whose previous year's figures were not themselves estimated. While estimation refinements were made for the 2008 edition and will continue to be made for future editions, the general practice of estimating based on previous years' numbers is entirely consistent with past years' Open Doors analysis protocols.
In the past, the reporting of students on post-completion Optional Practical Training (OPT) in the International Student Census was optional and left to each reporting institution. While these students are no longer enrolled in classes, they are still under the visa sponsorship of their (former) school and are reported as such in SEVIS. In order to make the reporting of OPT more consistent and more closely matched with the data reported to SEVIS, all institutions are now asked to break out their students on OPT and report them as a separate group (although they are still reported as part of the totals). Modest adjustments were made to the OPT data to account for inconsistencies in reporting due to this change.
Indicator 39 features data from the Open Doors International Student Census.
Private School Universe Survey (PSS)
The Private School Universe Survey (PSS) was established in 1988 to ensure that private school data dating back to 1890 would be collected on a more regular basis. With the help of the Census Bureau, the PSS is conducted biennially to provide the total number of private schools, students, and teachers, and to build a universe of private schools in the 50 states and the District of Columbia to serve as a sampling frame of private schools for NCES sample surveys.
The PSS groups elementary and secondary schools according to one of seven types of program emphasis:
Regular: The PSS questionnaire does not provide a definition of this term. Regular schools do not specialize in special, vocational/technical, early childhood, or alternative education, or in having a Montessori or special program emphasis, although they may offer these programs in addition to the regular curriculum.
Montessori: The PSS questionnaire does not provide a definition of this term. Montessori schools provide instruction using Montessori teaching methods.
Special program emphasis: A science/mathematics school, a performing arts high school, a foreign language immersion school, and a talented/gifted school are examples of schools that offer a special program emphasis.
Special education: Special education schools primarily serve students with disabilities.
Vocational: Vocational schools primarily serve students who are being trained for occupations. For indicator 5, vocational schools are included with special program emphasis schools.
Alternative: Alternative schools provide nontraditional education. They fall outside the categories of regular, Montessori, special education, early childhood and vocational education.
Early childhood: Early childhood program schools serve students in prekindergarten, kindergarten, transitional (or readiness) kindergarten, and/or transitional first (or prefirst) grade.
In the most recent PSS data collection, conducted in 2007-08, the survey was sent to 39,147 qualified private schools, with a response rate of 91.8 percent.
School Survey On Crime And Safety (SSOCS)
The School Survey on Crime and Safety (SSOCS) focuses on incidents of specific crimes and offenses and a variety of specific discipline issues in public schools. SSOCS was administered in the spring of the 1999-2000, 2003-04, and 2005-06 school years. The survey also covers characteristics of school policies, school violence prevention programs and policies, and school characteristics that have been associated with school crime. The survey was conducted with a nationally representative sample of regular public primary, middle, high, and combined schools in the 50 states and the District of Columbia.
In the 2005-06 school year, a total of 3,565 schools were selected for the study. In March 2006, questionnaires were mailed to school principals, who were asked to complete the survey or to have it completed by the person most knowledgeable about discipline issues at the school. "At school" was defined for respondents to include activities that happen in school buildings, on school grounds, on school buses, and at places that hold school-sponsored events or activities. Respondents were instructed to provide information on the total number of recorded incidents and the number of incidents reported to the police or other law enforcement. Respondents were instructed to provide information on the number of incidents, not the number of victims or offenders, regardless of whether any disciplinary action was taken or whether students or nonstudents were involved. Further information about the SSOCS is available at http://nces.ed.gov/surveys/ssocs/.