The National Center for Education Statistics (NCES) administers the Integrated Postsecondary Education Data System (IPEDS), which is a large-scale survey that collects institution-level data from postsecondary institutions in the United States (50 states and the District of Columbia) and other U.S. jurisdictions. IPEDS defines a postsecondary institution as an organization that is open to the public and has the provision of postsecondary education or training beyond the high school level as one of its primary missions. This definition includes institutions that offer academic, vocational and continuing professional education programs and excludes institutions that offer only avocational (leisure) and adult basic education programs. Definitions for other terms used in this report may be found in the IPEDS online glossary.
NCES annually releases national-level statistics on postsecondary institutions based on the IPEDS data. National statistics include tuition and fees, number and types of degrees and certificates conferred, number of students applying and enrolled, number of employees, financial statistics, graduation rates, student outcomes, student financial aid, and academic libraries.
Currently, 12 survey components comprise the annual IPEDS data collection cycle. Each component falls into one of three seasonal reporting periods: fall, winter, or spring. The fall data collection period consists of the Institutional Characteristics (IC), Completions (C), and 12-Month Enrollment (E12) survey components. The winter data collection consists of the Admissions (ADM), Graduation Rates (GR), 200 Percent Graduation Rates (GR200), Outcome Measures (OM), and Student Financial Aid (SFA) survey components. Finally, the spring collection consists of the Academic Libraries (AL), Fall Enrollment (EF), Finance (F), and Human Resources (HR) survey components. During the current IPEDS collection, some data are collected for the prior year to allow institutions to report the most current and complete data.
The IPEDS survey is a web-based collection, called the IPEDS Data Collection System (DCS). When respondents enter data, the DCS automatically calculates totals, averages, and percentages and compares the responses with the institution’s prior-year submission to ensure the data are consistent. Adding another level of data quality control, the DCS also compares data reported across and within survey components. If data are still missing following these quality assurance procedures, or if an institution (unit) does not respond to a survey component, NCES conducts imputations so that a complete database is available for analysis.
As required by the Higher Education Act (HEA) of 1965, as amended (20 USC 1094(a)(17)), the submission of data to IPEDS is mandatory for any institution that participates in, or is an applicant for participation in, any federal financial assistance program authorized by Title IV of HEA. This mandatory participation results in a nearly 100 percent response rate for each IPEDS survey component. The IPEDS database is used as the principal sampling frame for other NCES postsecondary surveys. In addition to the mandatory participants, the IPEDS database also includes institutions that do not participate in Title IV financial aid programs. These institutions may participate in the IPEDS data collection program, and if they voluntarily respond to the surveys, they are included in the Department of Education’s college search tool, College Navigator. College Navigator, which is maintained by NCES, is designed to help college students, prospective students, and their parents learn about admission requirements, programs of study, degrees offered, costs, graduation rates, and other characteristics of institutions that they may find helpful in selecting among postsecondary institutions.
The following section, which is organized by the three seasonal reporting periods, offers a brief description of each survey component in the IPEDS data collection. For more information about each IPEDS survey component, visit IPEDS Survey Components. In addition, NCES recommends reading The History and Origins of Survey Items for the Integrated Postsecondary Education Data System (2016-17 Update) for further detail on each IPEDS survey item and the legislative origins and requirements.
In addition to the number of degrees and other recognized postsecondary credentials, this component also collects the number of students receiving degrees or other postsecondary credentials by gender, race/ethnicity, age, and award level. The student count data from this component reflect awards received from July 1 to June 30.
GR helps institutions comply with the requirements of the Student Right-to-Know and Campus Security Act of 1990 (P.L. 101-542). Institutions operating on standard academic terms (semester, trimester, quarter, or 4-1-4) report on a fall cohort; all other institutions report on a full 12-month cohort (September 1 through August 31). Furthermore, for 4-year institutions, the cohort consists of those students who entered six years ago. For 2-year and less-than-2-year institutions, the cohort is made up of those students who entered three years ago.
For 4-year institutions, the cohort consists of those students who started eight years ago, and for less-than-4-year institutions (2-year and less-than-2-year institutions), the cohort is made up of those students who started four years ago. For 4-year institutions, the information collected is limited to bachelor’s-degree-seeking students, while less-than-4-year institutions report on the entire cohort (i.e., all degree/certificate-seeking students). Institutions operating on standard academic terms (semester, trimester, quarter, or 4-1-4) report on a fall cohort; all other institutions report on a full 12-month cohort (September 1 through August 31).
In addition to the total students in each of the four main cohorts, OM also collects subcohorts by Pell Grant recipient status (Pell Grant recipients and non-Pell Grant recipients), for a total of eight undergraduate subcohorts. The cohorts consist of all entering students who began their studies between July 1 and June 30. Student completion status is collected as of August 31st at 4, 6, and 8 years after students entered the institution. At each status point, institutions report the highest level of award students earned as of that status point.
In addition to completion status, the OM component collects enrollment status as of 8 years after students entered the reporting institution (August 31). For students who do not complete an award, institutions report the number of students who remain enrolled at the reporting institution, who leave the reporting institution and enroll at another institution, or who are excluded from the cohort.
The AL component consists of two sections:
Part A collects the number, race/ethnicity, gender, and attendance status (full- or part-time) of students enrolled in the fall, including the number who are first-time degree/certificate-seeking undergraduate students; the number who are degree/certificate-seeking undergraduates; total undergraduates; and total graduate students. In addition, Part A collects data on the number of students enrolled exclusively in distance education courses, at least one but not all distance education courses, or no distance education courses. These data are reported by student level, undergraduate degree/certificate-seeking status, and student residence location (i.e., in the same state or jurisdiction as the institution; in a different state or jurisdiction than the institution; outside the United States; or unknown).
Part B is required when data correspond to the fall of an odd-numbered year but optional in an even-numbered year. This part collects summary data on student age category, by gender and attendance status, for undergraduate and graduate students enrolled in the fall.
Part C is required when data correspond to the fall of an even-numbered year but optional in an odd-numbered year. This part collects summary data on the residence of first-time degree/certificate-seeking undergraduate students enrolled in the fall, and the number of those students who completed high school in the previous 12 months, by state or other U.S. jurisdiction of residence.
Part D collects data on the total number of undergraduate students who enter the institution for the first time in the fall term. This includes both full-time and part-time undergraduate students new to the institution, whether degree/certificate-seeking or not, and any students who transfer into the institution.
Part E collects data on retention rates, which quantify the proportion of the first-time student population enrolled during the previous fall term who returned to the same institution in the following fall term. Four-year institutions report their retention data for first-time, bachelor’s degree-seeking undergraduate students attending at the full-time and part-time levels. Less-than-4-year institutions report their retention data for first-time undergraduate students attending at the full-time and part-time levels.
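The retention rate described in Part E reduces to a simple ratio; as a minimal illustration (the function and variable names are ours, not IPEDS terminology):

```python
def retention_rate(returned_students, prior_fall_cohort):
    """Share of last fall's first-time cohort that returned to the same
    institution the following fall (illustrative calculation)."""
    if prior_fall_cohort == 0:
        return None  # no cohort to measure
    return returned_students / prior_fall_cohort
```

For example, 800 of 1,000 first-time students returning yields a retention rate of 0.8 (80 percent).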
Part F gathers an estimated undergraduate program student-to-faculty ratio. The survey instrument includes a worksheet to assist the institution in calculating the ratio requested.
This component is designed to follow the format of institutional financial statements suggested by the Financial Accounting Standards Board (FASB) and Governmental Accounting Standards Board (GASB). Different versions of the Finance component are available based mainly on control of the institution: public, private nonprofit, and private for-profit. Public institutions choose between two versions of the component depending on which standards they used for their internal accounting: (1) GASB Statements 34 and 35 reporting standards or (2) FASB reporting standards.
Public institutions that use GASB reporting standards to prepare their financial statements report data on their statement of financial position (Part A), revenues and other additions (Part B), expenses and other deductions (Part C), summary of changes in net position (Part D), scholarships and fellowships (Part E), endowment assets (Part H), and pension information (Part M). Additionally, they report certain data for the U.S. Census Bureau, including revenue data (Part J), expenditure data (Part K), and debts and assets (Part L).
Private nonprofit institutions and public institutions that use FASB reporting standards to prepare their financial statements report data on their statement of financial position (Part A), summary of changes in net assets (Part B), scholarships and fellowships (Part C), revenues and investment return (Part D), expenses by functional and natural classification (Part E), and endowment assets (Part H).
Private for-profit institutions use a form that is similar to the private nonprofit form but adjusted to account for differences between private nonprofit and private for-profit institutions (e.g., the restricted/unrestricted status of revenues is not collected from private for-profit institutions). Private for-profit institutions report data on balance sheet information (Part A), summary of changes in equity (Part B), student scholarships and fellowships (Part C), revenues and investment return (Part D), expenses by function (Part E), and income tax expenses (Part F).
The following parts constitute the Human Resources component:
NCES releases the annual data collection in three levels: preliminary, provisional, and final.
NCES provides IPEDS keyholders with access to IPEDS preliminary data through the IPEDS DCS soon after the close of the data collection period. Preliminary data undergo an initial review and validation process, including follow-up with institutions. However, the data have not been extensively reviewed or edited, so blank responses are left blank (i.e., data are unimputed).
NCES completes an extensive quality control process before the provisional data are released to the public. Along with additional follow-up with institutions, blank data are imputed using the Nearest Neighbor method to estimate missing data (see Imputation Procedures section). The review process takes approximately 3 additional months after the preliminary data are released.
In the following collection year, institutions may revise their data should an error be detected. Institutions update their data through the IPEDS Prior Year Revision System, and the incorrect provisional data are updated with the revisions. Final data for each component are released approximately 12 months after the provisional release of the data.
IPEDS has been a web-based data collection since 2000. Each institution’s Chief Executive Officer (CEO) appoints a keyholder who is responsible for ensuring that the institution’s submitted survey data are correct and complete. To access the web-based collection, known as the IPEDS DCS, the keyholder can create UserIDs and passwords for up to 16 additional survey respondents who can also enter or review data. For many institutions, keyholders also edit and “lock” the data; locking indicates to NCES that the data submitted are accurate, true, and complete.
Many states or systems have one or more IPEDS coordinators who are responsible for ensuring that all data are entered correctly for a specified group of institutions. Some coordinators are responsible for a system of institutions (e.g., SUNY, the State University of New York); others coordinate all or some institutions in a state or jurisdiction. Coordinators may elect to provide different levels of review. For example, some may only review data provided by their institutions, while others may upload data from state or jurisdiction databases, review the data, and/or lock data for their institutions.
Each August, NCES sends letters to CEOs at institutions without preexisting keyholders, requesting that they appoint a keyholder for the current collection year. The package includes a letter for the keyholder and a registration certificate with the institution’s UserID for the entire current collection year, along with a temporary password enabling the keyholder to register and establish a permanent password. In addition, NCES sends e-mail messages to keyholders and coordinators who are continuing in their respective roles, providing them with their UserID and a temporary password and requesting that they update or confirm their registration information.
Typically, IPEDS data collection cycles require some follow-up for nonresponse. These activities begin at the end of August of the current collection in an effort to prompt remaining keyholders to register.
NCES sends a follow-up letter to CEOs of institutions whose keyholder has not registered, and also calls institutions to prompt registration. The result of these efforts is the eventual registration of a keyholder or locking coordinator at all institutions. Additional follow-ups with CEOs, coordinators, and keyholders for survey nonresponse are conducted via mail, e-mail, and telephone throughout the collection period. At the beginning of the winter and spring collections (in early December), NCES sends registered keyholders and coordinators e-mail messages alerting them to the collection opening and requesting that they update or confirm their registration contact information, if needed.
The web-based survey component forms offer many features to ensure the quality and timeliness of the data. As indicated above, the DCS requires survey respondents to register before entering data to provide a point of contact between NCES/IPEDS and the institution.
Online data entry forms are tailored to each institution based on characteristics such as institutional control (public, private nonprofit, private for-profit), level of institution (4-year, 2-year, and less-than-2-year), type of awards offered (degree-granting or non-degree-granting), and calendar system (standard academic terms or enrollment by program).
When available, a customized survey form contains preloaded data from previous years for easy reference and comparison. Once the data are entered, either manually or through file upload, the keyholders run edit checks and resolve all errors before locking their data. Once locked, the data are considered submitted, regardless of whether a coordinator has reviewed the submission.
After the locking process, the IPEDS Help Desk staff conducts its final review. The staff contacts the institution’s keyholder or coordinator to resolve any remaining questions. When all problems are resolved, the staff migrates the final data to the IPEDS Data Center, where the preliminary data become available to other responding institutions for comparison purposes.
The web-based survey instrument contains edit checks to detect major reporting errors. The system automatically generates percentages and totals for each collection component and compares current responses to data reported the previous year. As edit checks are conducted through the DCS, keyholders are alerted to correct data reporting errors. If accurate data fail the edit checks, the survey respondents either confirm the response or explain why the data fall outside the expected range. All edit checks have to be resolved (confirmed or explained) before each survey is permitted to be locked. In some cases, respondents cannot confirm or explain the edit failures, in which case they contact the IPEDS Help Desk for edit overrides. Many IPEDS survey component instruments also contain one or more context boxes that respondents can, at their discretion, use to explain any special circumstances that might not be evident in their reported data. In addition, IPEDS Help Desk staff manually reviews the data for additional errors. When necessary, the staff contacts keyholders to verify the accuracy of the data.
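A minimal sketch of such a year-over-year edit check might look like the following; the function name and the 50 percent threshold are illustrative assumptions, not the DCS’s actual implementation:

```python
def year_over_year_check(current, prior, threshold=0.50):
    """Flag a value whose relative change from the prior year exceeds the
    threshold; flagged values must be confirmed or explained by the
    keyholder before the survey can be locked (illustrative sketch)."""
    if prior == 0:
        return None  # no basis for comparison
    relative_change = abs(current - prior) / prior
    return relative_change > threshold  # True means the edit fails
```

In practice, the DCS applies component-specific thresholds and requires a confirmation or written explanation for every flagged value rather than returning a boolean.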
The following are a few of the tailored edits for each IPEDS survey component.
Institutional Characteristics. IC edits examine the types of educational offerings (occupational, academic, continuing professional, avocational, adult basic, or secondary) and whether the institution qualifies as offering postsecondary programs and thus should be considered in scope for IPEDS. For all levels of offering and levels of award, edits compare application fees, tuition and fees, and room and board charges with the prior year’s data for consistency. The system flags large changes in the student charges section for follow-up. For example, the percentage increase or decrease of current year versus prior year data is not expected to exceed 50 percent for application fees, 30 percent for tuition and fees, and 40 percent for room and board charges.
Completions. The DCS preloads previously reported CIP codes into the current Completions component. IPEDS requires institutions to report Completions data using the most recent CIP taxonomy. The system checks the award levels reported for each CIP code against a predetermined list (of valid award levels for each 6-digit CIP code) developed by subject matter experts.
Edits also check the award levels against those indicated on the prior year’s Institutional Characteristics component and the prior year’s Completions component. For each award level, an edit compares the gender totals for each two-digit CIP with the information from the prior year. For large current year and prior year values, current year values are expected to be within 50 percent of prior year values. Small values (numbers less than 20 for both years) are not compared. Within each award level, an edit compares the number of awards for each race/ethnicity and gender combination with the corresponding value from the prior year. Finally, the total number of completers (students) earning an award is expected to be less than or equal to the total number of completions (awards) reported.
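The count comparison with its small-value exemption can be sketched as follows (the names and the handling of a zero prior count are our simplification of the stated rule):

```python
def completions_count_edit(current, prior):
    """Apply the described comparison: counts under 20 in both years are
    not compared; otherwise the current count is expected to fall within
    50 percent of the prior count (illustrative sketch)."""
    if current < 20 and prior < 20:
        return True  # small values are exempt from comparison
    if prior == 0:
        return False  # a large count appearing from zero needs review
    return abs(current - prior) / prior <= 0.50
```

For example, a program moving from 15 to 10 awards passes without comparison, while one moving from 100 to 160 awards fails and must be confirmed or explained.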
12-month Enrollment. The E12 survey component also has several automated edit checks. The edits compare student counts, by level, with prior year counts to ensure consistency. They also check instructional activity hours to ensure that hours are reported if the institution reported students at the same level. Total instructional activity is also compared with the unduplicated headcount, for each student level, to ensure that the reported activity is appropriate for the number of students reported. That is, the contact and credit hours reported are expected to fall within a specific range defined by the institution’s calendar system and unduplicated headcount enrollment. Keyholders must explain any discrepancies or data reported outside the expected ranges.
Admissions. Edit checks for the ADM component ensure that there is a response to each item on the Admissions Consideration page and that “Required” is selected for at least one of the considerations. On the Applicants/Admissions/Enrollment page, edit checks ensure that the total for each field is greater than zero and also greater than or equal to the sum of the values separately reported for men and women. The total number of admissions is expected to be less than a percentage of the number of applicants; the percentage used in this edit varies by institutional sector. In addition, the number of admissions is required to be greater than or equal to the total number of students who enrolled. On the Test Score page (which is applicable only when SAT or ACT scores are required for admission), the edit checks ensure that the total number of test scores (both SAT and ACT scores) submitted by enrolled students is greater than or equal to the total number of enrolled students. In addition, the edit checks ensure that data are entered for each of the fields on the page. Edit checks ensure that test scores are within the range of valid scores for each test and test component. Additionally, if a 25th percentile score is reported, a 75th percentile score is required to be reported for that test score component, and vice versa. Edit checks also ensure that the 75th percentile scores are greater than the corresponding 25th percentile scores.
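The ordering constraints among applicants, admissions, and enrollment might be expressed as below; the 95 percent cap is a placeholder, since the actual percentage varies by institutional sector:

```python
def admissions_ordering_edits(applicants, admitted, enrolled, admit_cap=0.95):
    """Return a list of failed-edit messages for the ADM ordering rules
    described above; an empty list means all checks pass (illustrative)."""
    failures = []
    if admitted > applicants * admit_cap:
        failures.append("admissions exceed expected share of applicants")
    if enrolled > admitted:
        failures.append("enrolled students exceed admissions")
    return failures
```

In the DCS, each such failure would surface as an edit that the keyholder must resolve before locking.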
Graduation Rates. The GR component uses preloaded data from the Fall Enrollment component of the applicable year for the initial cohort of full-time, first-time degree/certificate-seeking students to ensure consistent reporting. Revisions to the initial cohort are permitted if better data have become available. To ensure that the sum of individual cells does not exceed the revised cohort for any race/ethnicity or gender classification, the system sums the individual cells and compares the result to the appropriate revised cohort values. The edits require institutions reporting very high or very low numbers of completers (as a percentage of the total cohort) to explain this anomaly. Finally, if any cohort members (i.e., the bachelor’s or equivalent degree-seeking cohort or the other-than-bachelor’s or equivalent degree-seeking cohort) are reported for either section of the Graduation Rates component, then data are expected to be reported in each applicable cohort section.
200 Percent Graduation Rates. The DCS preloads data to GR200 component on the cohort of full-time, first-time degree/certificate-seeking students; exclusions from the cohort; and completers within 150 percent of normal program completion time from the GR component covering the appropriate cohort year. Edit checks compare the sums of individual cells with the revised cohort. Additionally, the edit rules require institutions that report very high or very low numbers of completers within 151 to 200 percent of normal program completion time, or report high numbers of additional cohort exclusions (as a percentage of the cohort), to explain extreme anomalies and make necessary corrections.
Outcome Measures. The OM cohorts of full-time, first-time and part-time, first-time degree/certificate-seeking students are required to be greater than or equal to the corresponding cohort(s) reported in the EF component for the appropriate cohort year in order to ensure consistent reporting. To ensure that the sum of individual cells does not exceed the revised cohort for any group, the DCS sums the individual cells and compares the result to the appropriate revised cohort values. Additionally, cross-component comparisons with the appropriate GR, GR200, and SFA components are conducted to ensure consistency between OM reported data and prior reported data on full-time, first-time students from the applicable cohort year.
Student Financial Aid. The number of full-time, first-time students in the SFA component must be less than or equal to the total number of undergraduate students enrolled. The number of full-time, first-time students who received any financial aid during the full academic year must be less than or equal to the number of full-time, first-time undergraduate students reported in Part B. The number of full-time, first-time undergraduate students receiving federal grants cannot exceed the number of full-time, first-time undergraduate students who received any financial aid during the full academic year; the same criteria apply to state/local grants, institutional grants, and loans to students. In Part D, the average amount of aid received by full-time, first-time students is compared with the average amount of aid from the previous year, and the keyholder must justify large discrepancies (typically 15 percent or greater) in the edit explanations. In Part E, the average aid received in each income category is compared with that for the next lower income category, and the keyholder must justify (via edit explanations) instances where higher average aid is received by students with higher incomes.
Academic Libraries. Edit checks in the AL component ensure that a value is entered for all fields in Section I: Library Collections/Circulation and Interlibrary Loan Services. In Section II, Expenses, edit checks ensure that a value is entered for all applicable fields. If the institution indicates that fringe benefits are paid out of the library budget, a value greater than zero is required for Total Fringe Benefits. In addition, if the institution indicates that fringe benefits are not paid out of the library’s budget, a value of zero is required for Total Fringe Benefits.
Fall Enrollment. The EF component has several automated edit checks designed to ensure internal consistency. Among them, the number of full-time, first-time degree/certificate-seeking undergraduate students must be less than or equal to the total number of students. The checks compare student counts, by level, with activity hours reported in other components to ensure that the numbers of undergraduate and graduate students are consistent with previously reported data. Total students from Part B must equal the number reported in Part A. For this collection cycle, Part C data (reported by state or jurisdiction of residence) are optional. However, if reported, total first-time degree/certificate-seeking students in Part A (reported by race/ethnicity) must equal total first-time degree/certificate-seeking students in Part C. If the DCS detects discrepancies in the numbers reported in Parts A, B, and C, it generates balance amounts and enters the data into “unknown” fields. For all sections, where large discrepancies (typically 25 percent or greater) exist between current year responses and data from previous years, the keyholder must justify the discrepancy via edit explanations.
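The balancing behavior, in which the DCS moves any shortfall into the “unknown” category, can be sketched as follows (the flooring at zero is our assumption about how negative balances would be handled):

```python
def balance_unknown(reported_total, known_category_counts):
    """Place the difference between the reported total and the sum of
    known categories into the 'unknown' field (illustrative of the DCS
    balancing step; negative balances are floored at zero here)."""
    return max(reported_total - sum(known_category_counts), 0)
```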
Finance. If the DCS detects large changes in the reported Finance data when comparing current year data with the previous year’s data, the keyholder explains the reasons for the differences. In the version of the Finance component for private nonprofit institutions, total net assets has to equal total unrestricted net assets plus total restricted net assets. Total net assets also have to equal total assets minus total liabilities. For all versions of the Finance component, the DCS generates selected fields using predetermined formulas—such as other sources of revenue and other expenses. Institutions are instructed to review the generated totals and resolve any data entry errors.
Human Resources. The HR component has edit checks that compare the current year data for the full-time and part-time staff sections with the previous year’s data. If an edit fails, the keyholder has to explain any large discrepancies. Within the full-time staff section, Part A, the total number of full-time instructional staff has to be greater than or equal to the number of newly hired full-time permanent instructional staff (by gender and race/ethnicity). In addition, the total number of other full-time staff has to be greater than or equal to the number of newly hired full-time staff in the corresponding occupational category (by gender and race/ethnicity). Within Part G, the sum of the full-time instructional staff reported across the contract lengths has to be less than or equal to the corresponding total number of full-time instructional staff reported in Part A for each of the academic ranks, by gender. For each occupational category, monthly weighted average salaries are calculated, and the system performs checks to detect unusually high or unusually low averages. Total part-time staff data reported in Part D are checked for consistency with the total part-time staff data reported in Part E, by occupational category.
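A monthly weighted average salary calculation of the kind described might look like the following; the single 9-month contract length is an assumption for illustration, not an IPEDS rule, since HR reporting is rank- and contract-specific:

```python
def monthly_weighted_average_salary(salary_outlays, staff_counts, months=9):
    """Weighted average salary across staff groups, expressed per month.

    Assumes one contract length (months) for all staff, a simplification
    of the rank- and contract-length-specific HR reporting."""
    total_staff = sum(staff_counts)
    if total_staff == 0:
        return None
    return sum(salary_outlays) / total_staff / months
```

The system would then flag averages falling well outside the expected range for the occupational category.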
All components of the annual IPEDS collection are subject to imputation for nonresponse—both institutional (unit) nonresponse and item nonresponse—should any exist within the component. With the exception of the IC survey component, all items in each component are eligible for imputation. Within the IC component, only cost of attendance and other institutional charges data are eligible for imputation.
Only institutions with the following characteristics are candidates for imputation or to serve as donors:
In addition to these general criteria, the following conditions also need to be satisfied by institutions in the indicated component for the institution to be considered as an imputee or donor. Note that three survey components (IC, HR, and Finance) do not require that any additional criteria be satisfied.
IPEDS may apply one of three imputation methods for both unit and item nonresponse, depending on the data available. These methods are described briefly below.
Carry Forward: This method is used for institutions that have reported component data in prior years. In this procedure, a prior year’s data are used as a substitute for current data. Adjustments for year-to-year changes are applied, based on the values of respondent institutions within the imputee institution’s imputation group. Imputees that have complete data in either of the previous two years receive Carry Forward imputation.
Nearest Neighbor: The Nearest Neighbor procedure identifies data related to the key statistics of interest for each component (the distance measure), then uses those data to identify a responding institution similar to the nonresponding institution and uses the respondent’s data as a substitute for the nonrespondent’s missing items. Depending upon the component and the relationships between the distance measure and the key statistics of interest, an adjustment to account for dissimilarity between the imputee and donor may be applied.
Group Median: A median institution is identified within each imputation group, and all imputee institutions that cannot be imputed via the Carry Forward or Nearest Neighbor methods receive the median institution’s reported data as their imputed values. No adjustments are made to the donor institution’s data prior to assigning it to the imputee.
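Under simplifying assumptions (one numeric distance variable and one reported value per institution; the production system’s distance measures and adjustments vary by component), the three methods can be sketched as:

```python
from statistics import median

def carry_forward_impute(prior_value, group_current_total, group_prior_total):
    """Carry Forward: reuse the imputee's prior-year value, scaled by the
    year-to-year change observed among respondents in its imputation group."""
    return prior_value * (group_current_total / group_prior_total)

def nearest_neighbor_impute(imputee_size, donors):
    """Nearest Neighbor: borrow the reported value of the respondent closest
    to the imputee on the distance measure (a single 'size' variable here,
    purely illustrative)."""
    donor = min(donors, key=lambda d: abs(d["size"] - imputee_size))
    return donor["value"]

def group_median_impute(respondent_values):
    """Group Median: assign the median respondent value within the
    imputation group, with no donor adjustment."""
    return median(respondent_values)
```

The methods are attempted in the order listed: Carry Forward when prior-year data exist, Nearest Neighbor when a suitable donor can be found, and Group Median otherwise.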
Information on response rates and any imputations conducted for each component are included in NCES documentation accompanying the provisional data release of each collection period.