Topics

  • Attendance and Enrollment
    • Enrollment Intensity
    • Patterns
  • Education History
    • Academic Experiences
    • Academic Performance
    • Admissions
    • Assessments
    • Field of Study
    • Outcomes
    • Persistence and Attainment
    • Programs and Courses
    • STEM
    • Transcripts
    • Transfer
  • Educational Transitions
    • High School to College
    • Preschool to Elementary School
  • Employment
    • Employment Characteristics
    • History
    • Status
    • While Enrolled
  • Faculty and Staff
    • Compensation and Benefits
    • Education and Training
    • Experiences and Attitudes
    • Faculty Characteristics
    • Institutional Characteristics
    • Tenure
  • Finances
    • Application
    • Borrowing and Debt
    • Cost and Net Price
    • Debt
    • Expenses
    • Federal Aid
    • Grants
    • Income
    • Loans
    • Non-Federal Aid
    • Support
    • Work Study
  • Parents and Family
    • Dependency and Marital Status
    • Parent Expectations, Attitudes, and Beliefs
    • Parental Involvement
    • Student's Parents
    • Student's Spouse and Dependents
  • Pre-K and K-12 Staff
    • K-12 Staff
    • Pre-K Staff
    • School Principals
    • Security Staff
  • School and Institutional Characteristics
    • Admissions and Tuition
    • Attendance and Enrollment
    • Classroom Settings, Sizes, and Organization
    • Crime and Safety
    • Demographics
    • Facilities
    • Institution/School Type, Level, and Sector
    • School Practices and Programs
    • Technology Use
  • School Districts
    • District Characteristics
    • Hiring and Compensation
  • Special Education
    • Programs and Services
    • Teachers and Staffing
  • Staffing
    • Number of Teachers and Staff
    • Vacancies
  • Student Characteristics
    • Demographics
    • Disabilities
    • Military or Public Service
    • Residence and Migration
  • Teachers and Teaching
    • Compensation and Benefits
    • Credentials
    • Demographics
    • Education and Training
    • Experiences, Performance, and Attitudes
    • Professional Development
All Datasets
  • National Postsecondary Student Aid Study, Undergraduate
  • National Postsecondary Student Aid Study, Graduate
Pre-Elementary Education Longitudinal Study
PEELS
Pre-elementary students who received preschool special education services, as they progressed through the early elementary years
Preschool special education, Programs and services received, Transitions between preschool and elementary school, Function and performance in preschool, kindergarten, and elementary school
https://ies.ed.gov/ncser/projects/peels
Dataset ID 48 · Years: 2003/2008 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 3,000

Imputation

Imputation was conducted for selected items in the teacher questionnaire and parent interview data. In general, the item missing rate was low, and the risk of imputation-related bias was judged to be minimal. Variance inflation due to imputation was likewise modest, reflecting the low imputation rate of 10 percent. Imputation for the supplemental sample increased the amount of data usable for analysis, offsetting the potential risk of bias.

The imputation methods included hot-deck imputation, regression, use of an external data source, and derivation based on the internal consistency of interrelated variables.
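As an illustration of the regression method: a missing item can be predicted by ordinary least squares from fully observed, closely related items. The sketch below uses NumPy and synthetic data; the `regression_impute` helper and the missingness pattern are illustrative, not part of the PEELS processing code.

```python
import numpy as np

def regression_impute(y, X):
    """Fill missing values in y by OLS prediction from complete predictors X.

    y: 1-D array with np.nan marking missing items.
    X: 2-D array of fully observed predictor columns.
    """
    observed = ~np.isnan(y)
    # Add an intercept column and fit OLS on the observed cases only.
    Xd = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Xd[observed], y[observed], rcond=None)
    y_filled = y.copy()
    y_filled[~observed] = Xd[~observed] @ beta  # predict the missing cases
    return y_filled

# Example: impute one questionnaire item from two related items,
# with roughly 10 percent item missingness as reported for PEELS.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = 2.0 + X @ np.array([1.5, -0.5]) + rng.normal(scale=0.1, size=100)
y[::10] = np.nan
filled = regression_impute(y, X)
assert not np.any(np.isnan(filled))
```

In production imputation, a random residual is often added to each prediction to avoid understating variance; the sketch omits that step.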

View methodology report · peels_subject.pdf (6.71 MB) · peels_varname.pdf (6.63 MB)
Schools and Staffing Survey, Teachers
SASS
Public and private school teachers
Class Organization, Education and Training, Certification, Professional Development, Working Conditions, School Climate and Teacher Attitudes, Employment and Background Information
https://nces.ed.gov/surveys/sass
Dataset ID 62 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 42,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.
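The data-swapping idea can be illustrated with a toy sketch: a small fraction of records exchange an identifying attribute with a randomly matched partner, which preserves the attribute's marginal distribution while breaking record-level linkage. This is not the actual NCES procedure (whose parameters are confidential); the `swap_attribute` helper and record layout are hypothetical.

```python
import random
from collections import Counter

def swap_attribute(records, key, swap_rate=0.05, seed=42):
    """Toy data swapping: exchange `key` between random pairs of records.

    Univariate counts of `key` are unchanged, but the link between a
    record and its swapped attribute is broken for the chosen pairs.
    """
    rng = random.Random(seed)
    out = [dict(r) for r in records]          # leave the input untouched
    n_pairs = max(1, int(len(out) * swap_rate / 2))
    idx = rng.sample(range(len(out)), 2 * n_pairs)
    for a, b in zip(idx[::2], idx[1::2]):
        out[a][key], out[b][key] = out[b][key], out[a][key]
    return out

recs = [{"id": i, "state": s} for i, s in enumerate("ABCD" * 25)]
swapped = swap_attribute(recs, "state")
# Central-tendency estimates survive: the marginal distribution is identical.
assert Counter(r["state"] for r in recs) == Counter(r["state"] for r in swapped)
```

The closing assertion shows why perturbation preserves aggregate estimates while still introducing the record-level inconsistencies the text warns about.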


Imputation

Three types of edits were performed on the SASS data: blanking, consistency, and logic edits. Blanking edits deleted extraneous entries that resulted from respondents failing to follow skip patterns correctly and assigned “missing” codes to items that respondents should have answered but did not. Consistency edits ensured that responses to related items were consistent and did not contradict other survey data. Finally, logic edits used information collected from the same questionnaire, from associated questionnaires in the same school or district, or from the sampling frame to fill missing items where possible.

After blanking, consistency, and logic edits were completed, any missing items that remained were filled using imputation. Data were imputed from items found on questionnaires of the same type that had certain characteristics in common, or from the aggregated answers of similar questionnaires. These records are called “donor records”[1], and the method of imputation that draws on donor records is called “hot-deck”[2] imputation.
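The donor mechanism described above can be sketched in a few lines: donor pools are keyed by the matching variables, and a recipient's missing item is filled from a randomly chosen donor in its pool. The `hot_deck_impute` helper and the variable names (`sector`, `level`, `salary`) are hypothetical, not actual SASS variables.

```python
import random
from collections import defaultdict

def hot_deck_impute(records, target, matching_vars, seed=1):
    """Fill missing `target` values from donor records that share the
    same values on `matching_vars` (the donor pool)."""
    rng = random.Random(seed)
    # Build donor pools keyed by the matching-variable combination.
    pools = defaultdict(list)
    for r in records:
        if r[target] is not None:
            pools[tuple(r[v] for v in matching_vars)].append(r[target])
    out = []
    for r in records:
        r = dict(r)
        if r[target] is None:
            key = tuple(r[v] for v in matching_vars)
            if pools[key]:                      # donors available for this cell
                r[target] = rng.choice(pools[key])
        out.append(r)
    return out

teachers = [
    {"sector": "public",  "level": "elem", "salary": 52000},
    {"sector": "public",  "level": "elem", "salary": 55000},
    {"sector": "public",  "level": "elem", "salary": None},   # recipient
    {"sector": "private", "level": "sec",  "salary": 48000},
]
filled = hot_deck_impute(teachers, "salary", ["sector", "level"])
assert filled[2]["salary"] in (52000, 55000)
```

Because imputed values are real donor responses rather than model predictions, hot-deck imputation preserves the item's observed distribution within each matching cell.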


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely. Please consult the survey methodology for more information.
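Analysts encounter these substituted values as reserve codes in the data files, one code per reason for missingness. A sketch of separating them from substantive answers follows; the specific code values and the `decode` helper are illustrative, not the actual NCES codebook values.

```python
# Illustrative reserve codes (NOT the actual NCES codebook values).
MISSING_CODES = {
    -1: "Legitimate skip (item not applicable)",
    -7: "Item skipped by respondent",
    -9: "Don't know",
}

def decode(value):
    """Split a raw item value into (answer, missing_reason).

    Exactly one of the two is None: coded values map to a reason,
    anything else is treated as a substantive answer.
    """
    if value in MISSING_CODES:
        return None, MISSING_CODES[value]
    return value, None

answer, reason = decode(-1)
assert answer is None and "skip" in reason
answer, reason = decode(42)
assert answer == 42 and reason is None
```

Keeping the reasons distinct matters in analysis: a legitimate skip is usually excluded from an item's denominator, while "don't know" responses may not be.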

[1] Donors were selected based on the type of data the donor would supply to the record undergoing imputation. Matching variables were selected based on their close relationship to the item requiring imputation, and a pool of donors was selected based on their answers to these matching variables.

[2] Goldring, R., Taie, S., Rizzo, L., Colby, D., and Fraser, A. (2013). User’s Manual for the 2011–12 Schools and Staffing Survey, Volume 1: Overview (NCES 2013-330). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

sass12teachpub_subject.pdf (1.15 MB) · sass12teachpub_varname.pdf (1.18 MB)
Dataset ID 63 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 42,000

sass12teachpriv_subject.pdf (977 KB) · sass12teachpriv_varname.pdf (1.05 MB)
Dataset ID 64 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 42,000

sass12teachcombined_subject.pdf (1.15 MB) · sass12teachcombined_varname.pdf (1.15 MB)
Schools and Staffing Survey, Principals
SASS
Public and private school principals
Experience, Training, Education, and Professional Development, Goals and Decision Making, Teacher and Aide Professional Development, School Climate and Safety, Instructional Time, Working Conditions and Principal Perceptions, Teacher and School Performance
https://nces.ed.gov/surveys/sass
Dataset ID 65 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

sass12prinpub_subject.pdf (1.99 MB) · sass12prinpub_varname.pdf (1.97 MB)
Dataset ID 66 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

sass12prinpriv_subject.pdf (1.98 MB) · sass12prinpriv_varname.pdf (1.90 MB)
Dataset ID 67 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

sass12princombined_subject.pdf (1.92 MB) · sass12princombined_varname.pdf (2.05 MB)
Schools and Staffing Survey, Schools
SASS
Public and private schools
Teacher demand, teacher and principal characteristics, general conditions in schools, principals' and teachers' perceptions of school climate and problems in their schools, teacher compensation, district hiring and retention practices, basic characteristics of the student population
https://nces.ed.gov/surveys/sass
Dataset ID 59 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

sass12schoolpub_subject.pdf (520 KB) · sass12schoolpub_varname.pdf (530 KB)
Dataset ID 60 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

sass12schoolpriv_subject.pdf (720 KB) · sass12schoolpriv_varname.pdf (675 KB)
Dataset ID 61 · Years: 2011-2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 9,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

Three types of edits were performed on the SASS data: blanking, consistency, and logic edits. Blanking edits delete extraneous entries that result from respondents failing to follow skip patterns correctly and assign “missing” codes to items that respondents should have answered and didn’t. Consistency edits ensured that responses to related items were consistent and did not contradict other survey data. Finally, logic edits were performed, using information collected from the same questionnaire, associated questionnaires in the same school or district, or information from the sampling frame to fill missing items, where possible.
After blanking, consistency, and logic edits were completed, any remaining missing items were filled using imputation. Data were imputed from items found on questionnaires of the same type that had certain characteristics in common or from the aggregated answers of similar questionnaires. These records are called “donor records,”1 and this method of imputing data from donor records is called “hot-deck” imputation.2
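The donor-record mechanism described above can be sketched as follows. This is a minimal illustration, not the production SASS procedure: the records, the item being imputed, and the matching variables are all invented for the example.

```python
import random

# Toy hot-deck imputation: fill a missing item ("enrollment") from a
# donor record that matches the recipient on variables closely related
# to the item (here "level" and "sector"). Field names are illustrative,
# not actual SASS variable names.
records = [
    {"level": "elementary", "sector": "private", "enrollment": 250},
    {"level": "elementary", "sector": "private", "enrollment": 310},
    {"level": "secondary",  "sector": "private", "enrollment": 600},
    {"level": "elementary", "sector": "private", "enrollment": None},  # needs imputation
]

def hot_deck_impute(records, item, matching_vars, rng=random):
    for rec in records:
        if rec[item] is None:
            # Donor pool: records with an observed value that agree with
            # the recipient on every matching variable.
            donors = [
                d for d in records
                if d[item] is not None
                and all(d[v] == rec[v] for v in matching_vars)
            ]
            if donors:
                rec[item] = rng.choice(donors)[item]
    return records

hot_deck_impute(records, "enrollment", ["level", "sector"])
print(records[-1]["enrollment"])  # one of the matching donors' values (250 or 310)
```

The key design point is that donors are drawn only from the pool defined by the matching variables, so the imputed value is plausible for that kind of respondent.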


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely. Please consult the survey methodology for more information.

1Donors were selected based on the type of data the donor would supply to the record undergoing imputation. Matching variables were selected based on their close relationship to the item requiring imputation, and a pool of donors was selected based on their answers to these matching variables.

2Goldring, R., Taie, S., Rizzo, L., Colby, D., and Fraser, A. (2013). User’s Manual for the 2011–12 Schools and Staffing Survey, Volume 1: Overview (NCES 2013-330). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

sass12schoolcombined_subject.pdf (1.60 MB) · sass12schoolcombined_varname.pdf (1.55 MB)
Schools and Staffing Survey, Districts
SASS
Public school districts
Recruitment and Hiring of Staff, Principal and Teacher Compensation, Student Assignment, Graduation Requirements, Migrant Education, District Performance
https://nces.ed.gov/surveys/sass
58 · 2011–2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 4,500

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

Three types of edits were performed on the SASS data: blanking, consistency, and logic edits. Blanking edits deleted extraneous entries that resulted from respondents failing to follow skip patterns correctly and assigned “missing” codes to items that respondents should have answered but did not. Consistency edits ensured that responses to related items were consistent and did not contradict other survey data. Finally, logic edits used information collected from the same questionnaire, from associated questionnaires in the same school or district, or from the sampling frame to fill missing items where possible.
After blanking, consistency, and logic edits were completed, any remaining missing items were filled using imputation. Data were imputed from items found on questionnaires of the same type that had certain characteristics in common or from the aggregated answers of similar questionnaires. These records are called “donor records,”1 and this method of imputing data from donor records is called “hot-deck” imputation.2


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely. Please consult the survey methodology for more information.

1Donors were selected based on the type of data the donor would supply to the record undergoing imputation. Matching variables were selected based on their close relationship to the item requiring imputation, and a pool of donors was selected based on their answers to these matching variables.

2Goldring, R., Taie, S., Rizzo, L., Colby, D., and Fraser, A. (2013). User’s Manual for the 2011–12 Schools and Staffing Survey, Volume 1: Overview (NCES 2013-330). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

sass12district_subject.pdf (1.15 MB) · sass12district_varname.pdf (1.10 MB)
Schools and Staffing Survey, Library Media Centers
SASS
Library media centers
School information, Facilities, services, and policies, Staffing information, Technology and information literacy, Collections and expenditures
https://nces.ed.gov/surveys/sass
57 · 2011–2012 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 7,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

Three types of edits were performed on the SASS data: blanking, consistency, and logic edits. Blanking edits deleted extraneous entries that resulted from respondents failing to follow skip patterns correctly and assigned “missing” codes to items that respondents should have answered but did not. Consistency edits ensured that responses to related items were consistent and did not contradict other survey data. Finally, logic edits used information collected from the same questionnaire, from associated questionnaires in the same school or district, or from the sampling frame to fill missing items where possible.
After blanking, consistency, and logic edits were completed, any remaining missing items were filled using imputation. Data were imputed from items found on questionnaires of the same type that had certain characteristics in common or from the aggregated answers of similar questionnaires. These records are called “donor records,”1 and this method of imputing data from donor records is called “hot-deck” imputation.2


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely. Please consult the survey methodology for more information.

1Donors were selected based on the type of data the donor would supply to the record undergoing imputation. Matching variables were selected based on their close relationship to the item requiring imputation, and a pool of donors was selected based on their answers to these matching variables.

2Goldring, R., Taie, S., Rizzo, L., Colby, D., and Fraser, A. (2013). User’s Manual for the 2011–12 Schools and Staffing Survey, Volume 1: Overview (NCES 2013-330). U.S. Department of Education. Washington, DC: National Center for Education Statistics.

sass12LMC_subject.pdf (675 KB) · sass12LMC_varname.pdf (695 KB)
School Survey on Crime and Safety
SSOCS
Elementary and secondary schools
School Practices and Programs, Parent and Community Involvement at School, School Security, Staff Training, Limitations on Crime Prevention, Frequency of Crime and Violence, Frequency of hate and gang-related crimes, Disciplinary problems and actions
https://nces.ed.gov/surveys/ssocs
70 · 2009–2010 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 2,600

Imputation

Completed SSOCS surveys contain some level of item nonresponse after the conclusion of the data collection phase. Imputation procedures were used to impute missing values of key items in SSOCS:2000 and missing values of all items in each subsequent SSOCS. All imputed values are flagged as such.
SSOCS:2004 and Beyond: In subsequent collections, imputation procedures were used to create values for all questionnaire items with missing data. This procedural change from SSOCS:2000 was implemented because the analysis of incomplete datasets may cause different users to arrive at different conclusions, depending on how the missing data are treated. The imputation methods used in SSOCS:2004 and later surveys were tailored to the nature of each survey item. Four methods were used: aggregate proportions, logical, best match, and clerical.


Weighting

Data are weighted to compensate for differential probabilities of selection and to adjust for the effects of nonresponse.
Sample weights allow inferences to be made about the population from which the sample units are drawn. Because of the complex nature of the SSOCS sample design, these weights are necessary to obtain population-based estimates, to minimize bias arising from differences between responding and nonresponding schools, and to calibrate the data to known population characteristics in a way that reduces sampling error.

An initial (base) weight was first determined within each stratum as the ratio of the number of schools available in the sampling frame to the number of schools selected. Because of nonresponse, the responding schools did not necessarily constitute a random sample of the schools in the stratum. To reduce potential nonresponse bias, weighting classes were formed using a statistical algorithm similar to CHAID (chi-square automatic interaction detector) to partition the sample so that schools within a weighting class were homogeneous with respect to their probability of responding. The same predictor variables from the SSOCS:2004 CHAID analysis were used for SSOCS:2006: instructional level, region, enrollment size, percent minority, student-to-FTE teaching staff ratio, percentage of students eligible for free or reduced-price lunch, and number of full-time equivalent (FTE) teachers. When the number of responding schools in a class was sufficiently small, the weighting class was combined with another to avoid the possibility of large weights. After the necessary classes were combined, the base weights were adjusted so that the weighted distribution of the responding schools resembled the initial distribution of the total sample.
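The base-weight and nonresponse-adjustment arithmetic described above can be sketched in a few lines. The stratum counts, weighting classes, and response indicators below are invented for illustration; the real SSOCS weighting classes come from a CHAID-style partition of the sample.

```python
# Base weight per stratum: schools on the frame / schools selected.
frame_count = {"A": 1200, "B": 800}
sample_count = {"A": 120, "B": 100}
base_weight = {s: frame_count[s] / sample_count[s] for s in frame_count}
# e.g. stratum A: 1200 / 120 = 10.0

# Nonresponse adjustment: within each weighting class, inflate the
# weights of responding schools so the class's total weight is preserved.
sample = [
    {"stratum": "A", "wclass": 1, "responded": True},
    {"stratum": "A", "wclass": 1, "responded": False},
    {"stratum": "B", "wclass": 1, "responded": True},
]
for rec in sample:
    rec["w0"] = base_weight[rec["stratum"]]

for c in {r["wclass"] for r in sample}:
    cls = [r for r in sample if r["wclass"] == c]
    total = sum(r["w0"] for r in cls)
    resp_total = sum(r["w0"] for r in cls if r["responded"])
    factor = total / resp_total  # > 1 when some class members did not respond
    for r in cls:
        r["w_nr"] = r["w0"] * factor if r["responded"] else 0.0

print([round(r["w_nr"], 2) for r in sample])
```

The adjustment factor is computed within each weighting class precisely because schools in a class are assumed to share a response propensity, which is why the class total is preserved after nonrespondents are zeroed out.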

The nonresponse-adjusted weights were then poststratified to calibrate the sample to known population totals. Two dimension margins were set up for the poststratification—(1) instructional level and school enrollment size; and (2) instructional level and locale—and an iterative process known as the raking ratio adjustment brought the weights into agreement with known control totals. Poststratification works well when the population not covered by the survey is similar to the covered population within each poststratum. Thus, to be effective, the variables that define the poststrata must be correlated with the variables of interest, they must be well measured in the survey, and control totals must be available for the population as a whole. All three requirements were satisfied by the aforementioned poststratification margins. Instructional level, school enrollment, and locale have been shown to be correlated with crime (Miller 2004).
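The raking ratio adjustment mentioned above is an iterative proportional fitting loop over the two margins. The cells and control totals below are invented; the real SSOCS margins were (instructional level × enrollment size) and (instructional level × locale).

```python
# Minimal raking (iterative proportional fitting) sketch on two margins.
cells = {  # (margin-1 category, margin-2 category): current weighted total
    ("elem", "city"): 40.0, ("elem", "rural"): 60.0,
    ("sec",  "city"): 30.0, ("sec",  "rural"): 20.0,
}
control1 = {"elem": 120.0, "sec": 40.0}   # known totals for margin 1
control2 = {"city": 70.0, "rural": 90.0}  # known totals for margin 2

for _ in range(50):  # alternate margins until both agree with the controls
    for cat, target in control1.items():
        cur = sum(v for (a, _), v in cells.items() if a == cat)
        for k in cells:
            if k[0] == cat:
                cells[k] *= target / cur
    for cat, target in control2.items():
        cur = sum(v for (_, b), v in cells.items() if b == cat)
        for k in cells:
            if k[1] == cat:
                cells[k] *= target / cur

print({k: round(v, 1) for k, v in cells.items()})
```

Because each pass rescales one margin and disturbs the other, the loop must iterate; with consistent control totals it converges quickly, which is the behavior the "raking ratio adjustment" relies on.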

Miller, A.K. (2004). Violence in U.S. Public Schools: 2000 School Survey on Crime and Safety (NCES 2004-314R). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.


ssocs2010_subject.pdf (565 KB) · ssocs2010_varname.pdf (365 KB)
74 · 2007–2008 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 2,560

Imputation

Completed SSOCS surveys contain some level of item nonresponse after the conclusion of the data collection phase. Imputation procedures were used to impute missing values of key items in SSOCS:2000 and missing values of all items in each subsequent SSOCS. All imputed values are flagged as such.
SSOCS:2004 and Beyond: In subsequent collections, imputation procedures were used to create values for all questionnaire items with missing data. This procedural change from SSOCS:2000 was implemented because the analysis of incomplete datasets may cause different users to arrive at different conclusions, depending on how the missing data are treated. The imputation methods used in SSOCS:2004 and later surveys were tailored to the nature of each survey item. Four methods were used: aggregate proportions, logical, best match, and clerical.


Weighting

Data are weighted to compensate for differential probabilities of selection and to adjust for the effects of nonresponse.
Sample weights allow inferences to be made about the population from which the sample units are drawn. Because of the complex nature of the SSOCS sample design, these weights are necessary to obtain population-based estimates, to minimize bias arising from differences between responding and nonresponding schools, and to calibrate the data to known population characteristics in a way that reduces sampling error.

An initial (base) weight was first determined within each stratum as the ratio of the number of schools available in the sampling frame to the number of schools selected. Because of nonresponse, the responding schools did not necessarily constitute a random sample of the schools in the stratum. To reduce potential nonresponse bias, weighting classes were formed using a statistical algorithm similar to CHAID (chi-square automatic interaction detector) to partition the sample so that schools within a weighting class were homogeneous with respect to their probability of responding. The same predictor variables from the SSOCS:2004 CHAID analysis were used for SSOCS:2006: instructional level, region, enrollment size, percent minority, student-to-FTE teaching staff ratio, percentage of students eligible for free or reduced-price lunch, and number of full-time equivalent (FTE) teachers. When the number of responding schools in a class was sufficiently small, the weighting class was combined with another to avoid the possibility of large weights. After the necessary classes were combined, the base weights were adjusted so that the weighted distribution of the responding schools resembled the initial distribution of the total sample.

The nonresponse-adjusted weights were then poststratified to calibrate the sample to known population totals. Two dimension margins were set up for the poststratification—(1) instructional level and school enrollment size; and (2) instructional level and locale—and an iterative process known as the raking ratio adjustment brought the weights into agreement with known control totals. Poststratification works well when the population not covered by the survey is similar to the covered population within each poststratum. Thus, to be effective, the variables that define the poststrata must be correlated with the variables of interest, they must be well measured in the survey, and control totals must be available for the population as a whole. All three requirements were satisfied by the aforementioned poststratification margins. Instructional level, school enrollment, and locale have been shown to be correlated with crime (Miller 2004).

Miller, A.K. (2004). Violence in U.S. Public Schools: 2000 School Survey on Crime and Safety (NCES 2004-314R). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.


ssocs2008_subject.pdf (1.96 MB) · ssocs2008_varname.pdf (912 KB)
73 · 2005–2006 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 2,720

Imputation

Completed SSOCS surveys contain some level of item nonresponse after the conclusion of the data collection phase. Imputation procedures were used to impute missing values of key items in SSOCS:2000 and missing values of all items in each subsequent SSOCS. All imputed values are flagged as such.
SSOCS:2004 and Beyond: In subsequent collections, imputation procedures were used to create values for all questionnaire items with missing data. This procedural change from SSOCS:2000 was implemented because the analysis of incomplete datasets may cause different users to arrive at different conclusions, depending on how the missing data are treated. The imputation methods used in SSOCS:2004 and later surveys were tailored to the nature of each survey item. Four methods were used: aggregate proportions, logical, best match, and clerical.


Weighting

Data are weighted to compensate for differential probabilities of selection and to adjust for the effects of nonresponse.
Sample weights allow inferences to be made about the population from which the sample units are drawn. Because of the complex nature of the SSOCS sample design, these weights are necessary to obtain population-based estimates, to minimize bias arising from differences between responding and nonresponding schools, and to calibrate the data to known population characteristics in a way that reduces sampling error.

An initial (base) weight was first determined within each stratum as the ratio of the number of schools available in the sampling frame to the number of schools selected. Because of nonresponse, the responding schools did not necessarily constitute a random sample of the schools in the stratum. To reduce potential nonresponse bias, weighting classes were formed using a statistical algorithm similar to CHAID (chi-square automatic interaction detector) to partition the sample so that schools within a weighting class were homogeneous with respect to their probability of responding. The same predictor variables from the SSOCS:2004 CHAID analysis were used for SSOCS:2006: instructional level, region, enrollment size, percent minority, student-to-FTE teaching staff ratio, percentage of students eligible for free or reduced-price lunch, and number of full-time equivalent (FTE) teachers. When the number of responding schools in a class was sufficiently small, the weighting class was combined with another to avoid the possibility of large weights. After the necessary classes were combined, the base weights were adjusted so that the weighted distribution of the responding schools resembled the initial distribution of the total sample.

The nonresponse-adjusted weights were then poststratified to calibrate the sample to known population totals. Two dimension margins were set up for the poststratification—(1) instructional level and school enrollment size; and (2) instructional level and locale—and an iterative process known as the raking ratio adjustment brought the weights into agreement with known control totals. Poststratification works well when the population not covered by the survey is similar to the covered population within each poststratum. Thus, to be effective, the variables that define the poststrata must be correlated with the variables of interest, they must be well measured in the survey, and control totals must be available for the population as a whole. All three requirements were satisfied by the aforementioned poststratification margins. Instructional level, school enrollment, and locale have been shown to be correlated with crime (Miller 2004).

Miller, A.K. (2004). Violence in U.S. Public Schools: 2000 School Survey on Crime and Safety (NCES 2004-314R). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.


ssocs2006_subject.pdf (8.82 MB) · ssocs2006_varname.pdf (3.58 MB)
Education Longitudinal Study
ELS
Students who were high school sophomores in 2001-02 or high school seniors in 2003-04
Student and Family Background, School and Classroom Characteristics, High School Completion and Dropout Status, Postsecondary Education Choice and Enrollment, Postsecondary Attainment, Employment, Transition to Adult Roles
https://nces.ed.gov/surveys/els2002
68 · 2002 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 14,000 to 16,000

Imputation

Stochastic methods were used to impute the missing values for the ELS:2002 third follow-up data. Specifically, a weighted sequential hot-deck (WSHD) statistical imputation procedure (Cox 1980; Iannacchione 1982) using the final analysis weight (F3QWT) was applied to the missing values for the variables in table 12 in the order in which they are listed. The WSHD procedure replaces missing data with valid data from a donor record within an imputation class. In general, variables with lower item nonresponse rates were imputed earlier in the process.
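The sequential part of the WSHD procedure can be illustrated with a simplified, unweighted pass: within each imputation class, sorted records take the most recent observed value as their donor. The weighted variant additionally uses the analysis weight (F3QWT here) to control how often each donor is used; that bookkeeping is omitted. Field names below are invented, not ELS:2002 variables.

```python
# Simplified sequential hot-deck pass within imputation classes.
records = [
    {"cls": "public-urban", "sort_key": 1, "hours": 20},
    {"cls": "public-urban", "sort_key": 2, "hours": None},
    {"cls": "public-urban", "sort_key": 3, "hours": 35},
    {"cls": "private",      "sort_key": 1, "hours": 10},
    {"cls": "private",      "sort_key": 2, "hours": None},
]

def sequential_hot_deck(records, item):
    last_donor = {}  # most recent observed value seen in each imputation class
    for rec in sorted(records, key=lambda r: (r["cls"], r["sort_key"])):
        if rec[item] is None:
            if rec["cls"] in last_donor:
                rec[item] = last_donor[rec["cls"]]  # borrow from the donor
        else:
            last_donor[rec["cls"]] = rec[item]  # this record becomes the donor
    return records

sequential_hot_deck(records, "hours")
print([r["hours"] for r in records])  # [20, 20, 35, 10, 10]
```

Sorting within a class places similar records next to each other, so the "most recent donor" is also a reasonably close match, which is what makes the single sequential pass effective.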


View methodology report · els2002sophomores_subject.pdf (7.58 MB) · els2002sophomores_varname.pdf (7.49 MB)
69 · 2002 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 14,000 to 16,000

Imputation

Stochastic methods were used to impute the missing values for the ELS:2002 third follow-up data. Specifically, a weighted sequential hot-deck (WSHD) statistical imputation procedure (Cox 1980; Iannacchione 1982) using the final analysis weight (F3QWT) was applied to the missing values for the variables in table 12 in the order in which they are listed. The WSHD procedure replaces missing data with valid data from a donor record within an imputation class. In general, variables with lower item nonresponse rates were imputed earlier in the process.


View methodology report · els2002seniors_subject.pdf (6.22 MB) · els2002seniors_varname.pdf (6.16 MB)
High School Longitudinal Study
HSLS
Students who were high school freshmen in the fall of 2009
Student Background, Math and Science Education, Classroom Characteristics, The Changing Environment of High School, Postsecondary Education Choice and Enrollment, Transition to Adult Roles
https://nces.ed.gov/surveys/hsls09
72 · 2009 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 23,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, HSLS:09 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. Data swapping and other forms of perturbation, implemented to protect respondent confidentiality, can also lead to inconsistencies.


Imputation

Stochastic methods were used to impute the missing values. Specifically, a weighted sequential hot-deck (WSHD) statistical imputation procedure (Cox 1980; Iannacchione 1982) using the final student analysis weight (W2STUDENT) was applied to the missing values. The WSHD procedure replaces missing data with valid data from a donor record (i.e., a first follow-up student [item] respondent) within an imputation class. In general, variables with lower item nonresponse rates were imputed earlier in the process.


Skips and Missing Values

The HSLS:09 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks confirmed that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


The table below shows codes for missing values used in HSLS:09. Please consult the methodology report (coming soon) for more information.


Description of missing data codes

Missing data code Description
-1 Don't know
-4 Item not administered: abbreviated interview
-5 Suppressed
-6 Component not applicable
-7 Item legitimate skip/NA
-8 Unit nonresponse
-9 Missing
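Before analysis, reserved codes like those in the table above are typically recoded to missing values so they cannot contaminate means or tabulations. A minimal sketch, using the HSLS:09 code list and a made-up column of responses:

```python
# Map the negative reserved codes from the HSLS:09 missing-data table to
# NaN, then compute a statistic over the valid responses only.
MISSING_CODES = {-1, -4, -5, -6, -7, -8, -9}

def recode(values):
    return [float("nan") if v in MISSING_CODES else v for v in values]

raw = [3, -7, 5, -9, 2]        # -7 = legitimate skip, -9 = missing
clean = recode(raw)
valid = [v for v in clean if v == v]  # NaN != NaN, so this drops missing
print(sum(valid) / len(valid))        # mean over valid responses only
```

Treating every reserved code as one kind of missingness is the simplest policy; an analyst may instead want to distinguish legitimate skips (-7) from true nonresponse before deciding what to drop.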
hsls2009_subject.pdf (5.34 MB) · hsls2009_varname.pdf (8.91 MB)
Baccalaureate and Beyond
B&B
Bachelor’s degree recipients who were surveyed at the time of graduation and one, four, and ten years after graduation
Outcomes for bachelor's degree recipients, Graduate and professional program access, Labor market experiences, Rates of return on investment in education
https://nces.ed.gov/surveys/b&b
54 · 2008/2012 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 15,500

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, B&B:08/12 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. B&B:08/12 has multiple sources of data for some variables (CPS, NLSDS, student interview, etc.), and reporting differences can occur in each. Data swapping and other forms of perturbation, implemented to protect respondent confidentiality, can also lead to inconsistencies.


Imputation

Variables with missing data were imputed for graduates who were respondents in a study wave.1 The imputation procedures employed a two-step process. The first step is a logical imputation.2 If the imputed value could be deduced from logical relationships with other variables, that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.3 This procedure involves identifying a relatively homogeneous group of observations and, from within the group, selecting a random donor’s value to impute a value for the recipient.
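The two-step process described above can be sketched as a logical deduction followed by a weighted hot-deck fallback. The deduction rule, field names, and donor pool below are invented for illustration and are not B&B variables.

```python
import random

def impute(rec, donors, rng=random):
    # Step 1: logical imputation - deduce the value from related items.
    if rec["employed"] is None and rec.get("hours_worked", 0) > 0:
        rec["employed"] = True  # positive working hours imply employment
        return rec
    # Step 2: weighted hot-deck - draw a donor from the same homogeneous
    # class, with probability proportional to the donor's analysis weight.
    if rec["employed"] is None:
        pool = [d for d in donors if d["cls"] == rec["cls"]]
        weights = [d["weight"] for d in pool]
        rec["employed"] = rng.choices(pool, weights=weights, k=1)[0]["employed"]
    return rec

donors = [
    {"cls": "stem", "weight": 2.0, "employed": True},
    {"cls": "stem", "weight": 1.0, "employed": False},
]
# Logical step fires here, so no donor is needed:
print(impute({"cls": "stem", "employed": None, "hours_worked": 30}, donors)["employed"])  # True
```

Running the logical step first matters: a value that can be deduced with certainty should never be replaced by a random draw, so the hot-deck step only handles what deduction cannot resolve.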


Skips and Missing Values

The B&B:08/12 data were edited using procedures developed and implemented for previous studies sponsored by NCES, including the base-year study, NPSAS:08. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks confirmed that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


The table below shows codes for missing values used in B&B:08/12. Please consult the First Look for more information.


Description of missing value codes

Missing data code Description
-1 Don’t know
-2 Independent student
-3 Skipped
-9 Missing

1In other words, if a graduate was a respondent in B&B:09, he or she will have no missing data for variables created as part of the B&B:09 wave. Similarly, if a graduate was a respondent in B&B:12, he or she will have no missing data for variables created as part of the B&B:12 wave, but may have missing data for variables created as part of the B&B:09 wave if he or she was not a respondent in B&B:09.

2Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

3Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially in a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using a Chi-squared Automatic Interaction Detection (CHAID) analysis.

View methodology report · bb12_subject.pdf (5.38 MB) · bb12_varname.pdf (4.14 MB)
20 · 2000/2001 · QuickStats: Off · PowerStats: On · TrendStats: Off · Sample size: 10,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, B&B:01 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from logical relationships with other variables, that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This procedure involves identifying a relatively homogeneous group of observations and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

Both during and upon completion of data collection, edit checks were performed on the B&B:00/01 data file to confirm that the intended skip patterns were implemented during the interview. Following data collection, the information collected in CATI was subjected to various checks and examinations. These checks were intended to confirm that the database reflected appropriate skip-pattern relationships and different types of missing data by inserting special codes.


The table below lists each missing value code and its associated meaning in the B&B:00/01 interview. For more information, see the Baccalaureate and Beyond Longitudinal Study (B&B:00/01) methodology report.


Description of missing data codes

Missing data code Description
-1 Don’t know (CATI variables), Data not available (CADE variables)
-2 Refused (CATI variables only)
-3 Not applicable (CADE and CATI variables only)
-4 B&B:97 nonrespondent not sampled
-6 Bad data, out of range
-7 Item was not reached (abbreviated and partial CATI interviews)
-8 Item was not reached due to a CATI error
-9 Data missing, reason unknown (CATI variables)
View methodology report · bb01_subject.pdf (3.44 MB) · bb01_varname.pdf (3.38 MB)
31 · 1993/2003 · QuickStats: On · PowerStats: On · TrendStats: Off · Sample size: 11,200

Imputation

Variables used in cross-sectional estimates in the Baccalaureate and Beyond descriptive reports were imputed. The variables identified for imputation were used in the two B&B:93/03 descriptive reports (Bradburn, Nevill, and Forrest Cataldi 2006; Alt and Henke 2007). The imputations were performed in three steps. First, the interview variables were imputed using the sequential hot-deck imputation method.1 This procedure involves identifying a relatively homogeneous group of observations and, within the group, selecting a random donor’s value to impute a value for the recipient. Second, using the interview variables, including the newly imputed variable values, derived variables were constructed.


Skips and Missing Values

Both during and upon completion of data collection, edit checks were performed on the B&B:93/03 data file to confirm that the intended skip patterns were implemented during the interview. At the conclusion of data collection, special codes were added as needed to indicate the reason for missing data. Missing data within individual data elements can occur for a variety of reasons.


The table below lists each missing value code and its associated meaning in the B&B:93/03 interview. For more information, see the Baccalaureate and Beyond Longitudinal Study (B&B:93/03) methodology report.


Description of missing data codes

Missing data code Description
-1 Missing
-2 Not applicable
-3 Skipped
-4 B&B:97 nonrespondent not sampled
-6 Uncodeable, out of range
-7 Not reached
-8 Item was not reached due to an error
-9 Missing, blank

1Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. Under this methodology, while each respondent record has a chance to be selected for use as a hot-deck donor, the number of times a respondent record can be used for imputation will be controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed were defined. Imputation classes were developed by using the chi-square automatic interaction detection (CHAID) algorithm.
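The single sequential pass described in this footnote can be sketched in a few lines of Python. The records, class variable, and item names below are invented, and sampling weights are omitted for brevity; the weighted variant additionally controls how often each donor record may be reused:

```python
def hot_deck_impute(records, class_key, item):
    """Sequential hot-deck imputation within imputation classes.

    One pass through the data: within each class, a missing value is
    replaced by the most recently seen donor value for that class.
    (The weighted procedure also caps donor reuse; not shown here.)
    """
    last_donor = {}                       # most recent donor value per class
    out = []
    for rec in records:
        rec = dict(rec)                   # leave the input records untouched
        cls = class_key(rec)
        if rec[item] is None and cls in last_donor:
            rec[item] = last_donor[cls]   # borrow from the last donor seen
        elif rec[item] is not None:
            last_donor[cls] = rec[item]   # this record can serve as a donor
        out.append(rec)
    return out

# Hypothetical file: "sector" defines the imputation class.
people = [
    {"sector": "public", "salary": 40000},
    {"sector": "public", "salary": None},
    {"sector": "private", "salary": 55000},
    {"sector": "private", "salary": None},
]
imputed = hot_deck_impute(people, lambda r: r["sector"], "salary")
print([r["salary"] for r in imputed])  # [40000, 40000, 55000, 55000]
```

Because the pass is sequential, the sort order of the file (the "sorting variables" in the footnote) determines which donor a recipient draws from.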

View methodology report | bb03_subject.pdf (4.56 MB) | bb03_varname.pdf (3.98 MB)
Baccalaureate and Beyond, Graduate Students
B&B:GR
Bachelor's degree recipients who were surveyed at the time of graduation, one year after graduation, four years after graduation, and ten years after graduation
Outcomes for bachelor's degree recipients, Graduate and professional program access, Labor market experiences, Rates of return on investment in education
https://nces.ed.gov/surveys/b&b
1993/2003 | QuickStats: off | PowerStats: on | TrendStats: off | Sample size: 4,000

Imputation

Variables used in cross-sectional estimates in the Baccalaureate and Beyond descriptive reports were imputed. The variables identified for imputation were used in the two B&B:93/03 descriptive reports (Bradburn, Nevill, and Forrest Cataldi 2006; Alt and Henke 2007). The imputations were performed in three steps. First, the interview variables were imputed using the sequential hot deck imputation method.1 This imputation procedure involves identifying a relatively homogeneous group of observations, and within the group selecting a random donor’s value to impute a value for the recipient. Second, using the interview variables, including the newly imputed variable values, derived variables were constructed.


Skips and Missing Values

Both during and upon completion of data collection, edit checks were performed on the B&B:93/03 data file to confirm that the intended skip patterns were implemented during the interview. At the conclusion of data collection, special codes were added as needed to indicate the reason for missing data. Missing data within individual data elements can occur for a variety of reasons.


The table below lists each missing value code and its associated meaning in the B&B:93/03 interview. For more information, see the Baccalaureate and Beyond Longitudinal Study (B&B:93/03) methodology report.


Description of missing data codes

Missing data code Description
-1 Missing
-2 Not applicable
-3 Skipped
-4 B&B:97 nonrespondent not sampled
-6 Uncodeable, out of range
-7 Not reached
-8 Item was not reached due to an error
-9 Missing, blank

1Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. Under this methodology, while each respondent record has a chance to be selected for use as a hot-deck donor, the number of times a respondent record can be used for imputation will be controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed were defined. Imputation classes were developed by using the chi-square automatic interaction detection (CHAID) algorithm.

View methodology report | bb03_subject_students.pdf (9.75 MB) | bb03_varname_students.pdf (8.81 MB)
Beginning Postsecondary Students
BPS
Beginning students who were surveyed at the end of their first year, and then three and six years after first starting in postsecondary education.
Students’ persistence, progress and attainment of a degree, Labor force experiences
https://nces.ed.gov/surveys/bps/
2012/2014 | QuickStats: on | PowerStats: on | TrendStats: off | Sample size: 25,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, BPS:12/14 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. BPS:12/14 has multiple sources of data for some variables (CPS, NLSDS, student interview, etc.), and reporting differences can occur in each. Data swapping and other forms of perturbation, implemented to protect respondent confidentiality, can also lead to inconsistencies.
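Data swapping, one of the perturbation techniques mentioned above, can be illustrated with a small sketch: within groups of records that agree on a matching key, a fraction of values is exchanged between random pairs. This preserves the overall distribution (and hence central tendency estimates) while weakening the link between any one record and its reported value. Everything here (field names, swap rate, grouping) is illustrative and is not the NCES procedure:

```python
import random

def swap_values(records, match_key, item, swap_rate=0.5, seed=0):
    """Illustrative data swapping: within groups of records that share
    match_key, exchange `item` between random pairs with probability
    swap_rate. The multiset of values is unchanged, so means and totals
    are preserved; only the record-to-value linkage is perturbed."""
    rng = random.Random(seed)
    groups = {}
    for i, rec in enumerate(records):
        groups.setdefault(match_key(rec), []).append(i)
    for idxs in groups.values():
        rng.shuffle(idxs)
        for a, b in zip(idxs[::2], idxs[1::2]):   # random disjoint pairs
            if rng.random() < swap_rate:
                records[a][item], records[b][item] = (
                    records[b][item], records[a][item])
    return records

# Hypothetical records matched on state of residence.
students = [{"state": "VA", "income": v} for v in (30, 45, 60, 75)]
swapped = swap_values(students, lambda r: r["state"], "income")
```

Note that because values only move between records, any statistic computed from the full column (a mean, a total, a quantile) is identical before and after swapping.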


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

The BPS:12/14 data were edited using procedures developed and implemented for previous studies sponsored by NCES, including the base-year study, NPSAS:04. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


The table below shows codes for missing values used in BPS:12/14. Please consult the methodology report (coming soon) for more information.


Description of missing data codes

Missing data code Description
-1 Not classified
-2 Not applicable
-3 Skipped
-9 Data missing

1Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

2Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection (CHAID) algorithm.
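To make the logical-imputation step (footnote 1) concrete, here is a sketch of deducing a missing value from the logical relationships among other answers. The field names and the deduction rule are hypothetical, not actual BPS items:

```python
def logical_impute(rec):
    """Step 1 (logical imputation): fill a missing item only when other
    answers imply its value. Hypothetical rule: a respondent reporting
    zero months of employment cannot have positive work earnings, so a
    missing earnings item can be deduced as 0."""
    rec = dict(rec)
    if rec.get("earnings") is None and rec.get("months_employed") == 0:
        rec["earnings"] = 0
    return rec

print(logical_impute({"months_employed": 0, "earnings": None}))
# {'months_employed': 0, 'earnings': 0}
```

Items whose values cannot be deduced this way are left missing and fall through to the second, hot-deck step.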

bps2014_subject.pdf (2.10 MB) | bps2014_varname.pdf (2.63 MB)
2004/2009 | QuickStats: on | PowerStats: on | TrendStats: off | Sample size: 16,500

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, BPS:04/09 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. BPS:04/09 has multiple sources of data for some variables (CPS, NLSDS, student interview, etc.), and reporting differences can occur in each. Data swapping and other forms of perturbation, implemented to protect respondent confidentiality, can also lead to inconsistencies.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

The BPS:04/09 data were edited using procedures developed and implemented for previous studies sponsored by NCES, including the base-year study, NPSAS:04. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


The table below shows codes for missing values used in BPS:04/09. Please consult the methodology report (coming soon) for more information.


Description of missing data codes

Missing data code Description
-2 Independent student
-3 Skipped
-9 Data missing

1Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

2Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection (CHAID) algorithm.

View methodology report | bps2009_subject.pdf (6.83 MB) | bps2009_varname.pdf (5.67 MB)
1996/2001 | QuickStats: on | PowerStats: on | TrendStats: off | Sample size: 12,000

Imputation

Logical imputations were performed where items were missing but their values could be implicitly determined.


Skips and Missing Values

During and following data collection, the CATI/CAPI data were reviewed to confirm that the data collected reflected the intended skip-pattern relationships. At the conclusion of data collection, special codes were inserted in the database to reflect the different types of missing data. There are a variety of explanations for missing data within individual data elements.


The table below shows codes for missing values used in BPS:01. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Don’t know
-2 Refused
-3 Legitimate skip (item was intentionally not collected because variable was not applicable to this student)
-6 Bad data, out of range, uncodeable userexit string
-7 Not reached
-8 Missing, CATI error
-9 Missing

View methodology report | bps2001_subject.pdf (9.05 MB) | bps2001_varname.pdf (6.95 MB)
1990/1994 | QuickStats: off | PowerStats: on | TrendStats: off | Sample size: 6,600

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, BPS:94 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. BPS:94 has multiple sources of data for some variables (CPS, NLSDS, student interview, etc.), and reporting differences can occur in each. Data swapping and other forms of perturbation, implemented to protect respondent confidentiality, can also lead to inconsistencies.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

The BPS:94 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.



The table below shows codes for missing values used in BPS:94. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-2 Independent student
-3 Skipped
-9 Data missing
View methodology report | bps1994_subject.pdf (4.34 MB) | bps1994_varname.pdf (4.17 MB)
National Postsecondary Student Aid Study, Undergraduate
NPSAS:UG
Students who were undergraduates at the time of interview
General demographics, Types of aid and amounts received, Cost of attending college, Combinations of work, study, and borrowing, Enrollment patterns
https://nces.ed.gov/surveys/npsas
2012 | QuickStats: on | PowerStats: on | TrendStats: on | Sample size: 95,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Missing Values and Imputation

Following data collection, the data are subjected to various consistency and quality control checks before release. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely.


Except for data that were missing for cases to which they did not apply (e.g., whether a spouse is enrolled in college for unmarried students) and in a small number of items describing institutional characteristics, missing data were imputed using a two-step process. The first step is a logical imputation.1 If a value could be calculated from the logical relationships with other variables, then that information was used to impute the value for the observation with a missing value. The second step is weighted hot deck imputation.2 This procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor's value to impute a value for the observation with a missing value.


The table below shows the codes used for missing values that were not imputed in NPSAS:12. More information is available from the NPSAS:12 Data File Documentation (http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2014182).


Description of missing value codes

Missing data code Description
-1 Not classified
-2 Not applicable
-3 Skipped
-9 Missing

1Logical imputation is a process that aims to infer or deduce the missing values from values for other items.

2Sequential hot deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection algorithm.

View methodology report | npsas2012ug_subject.pdf (6.90 MB) | npsas2012ug_varname.pdf (5.45 MB)
2008 | QuickStats: on | PowerStats: on | TrendStats: on | Sample size: 113,500

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely.


The table below shows the set of reserve codes for missing values used in NPSAS 2008. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Not classified
-2 Not applicable
-6 Out of range
-8 Item was not reached due to an error
-9 Missing

1Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

2Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection (CHAID) algorithm.

View methodology report | npsas2008ug_subject.pdf (8.10 MB) | npsas2008ug_varname.pdf (6.40 MB)
2004 | QuickStats: on | PowerStats: on | TrendStats: on | Sample size: 79,900

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

The imputation procedures employed a two-step process. In the first step, the matching criteria and imputation classes that were used to stratify the dataset were identified such that all imputation was processed independently within each class. In the second step, the weighted sequential hot deck process1 was implemented, whereby missing data were replaced with valid data from donor records that match the recipients with respect to the matching criteria. Variables requiring imputation were not imputed simultaneously. However, some variables that were related substantively were grouped together into blocks, and the variables within a block were imputed simultaneously. Basic demographic variables were imputed first using variables with full information to determine the matching criteria. The order in which variables were imputed was also determined to some extent by the substantive nature of the variables. For example, basic demographics (such as age) were imputed first and these were used to process education variables (such as student level and enrollment intensity) which in turn were used to impute the financial aid variables (such as aid receipt and loan amounts).
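The ordering described above (demographics first, then education, then financial aid, with each block's imputation drawing on previously completed variables as matching criteria) can be sketched as a loop over variable blocks. All block and variable names here are illustrative, and a simple first-donor lookup stands in for the weighted sequential hot deck:

```python
# Illustrative imputation blocks for an NPSAS-style file; variables within
# a block are imputed together and do not serve as predictors for each other.
BLOCKS = [
    ("demographics",  ["age", "gender"]),
    ("education",     ["student_level", "enrollment_intensity"]),
    ("financial_aid", ["aid_receipt", "loan_amount"]),
]

def impute_block(records, variables, predictors):
    """Placeholder for the hot-deck step: fill each missing variable with
    the first observed donor value in the class defined by the predictors."""
    for var in variables:
        donors = {}
        for rec in records:
            if rec.get(var) is not None:
                donors.setdefault(tuple(rec.get(p) for p in predictors), rec[var])
        for rec in records:
            if rec.get(var) is None:
                rec[var] = donors.get(tuple(rec.get(p) for p in predictors))

def impute_in_order(records):
    completed = []                 # variables available as matching criteria
    for _name, variables in BLOCKS:
        impute_block(records, variables, completed)
        completed.extend(variables)   # later blocks may match on these
    return records

students = [
    {"age": 20, "gender": "F", "student_level": "UG",
     "enrollment_intensity": "FT", "aid_receipt": 1, "loan_amount": 5000},
    {"age": 20, "gender": "F", "student_level": None,
     "enrollment_intensity": "FT", "aid_receipt": None, "loan_amount": None},
]
impute_in_order(students)
```

The key design point mirrored here is that already-imputed demographics are trusted as matching criteria when the education and financial-aid items are processed.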


Skips and Missing Values

Edit checks were performed on the NPSAS:04 student interview data and CADE data, both during and upon completion of data collection, to confirm that the intended skip patterns were implemented in both instruments. At the conclusion of data collection, special codes were added as needed to indicate the reason for missing data. Missing data within individual data elements can occur for a variety of reasons.


The table below shows the set of reserve codes for missing values used in NPSAS 2004. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Not classified
-3 Legitimate skip
-9 Missing

1Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent’s answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection (CHAID) algorithm.

View methodology report | npsas2004ug_subject.pdf (7.75 MB) | npsas2004ug_varname.pdf (6.00 MB)
2000 | QuickStats: off | PowerStats: on | TrendStats: on | Sample size: 50,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, NPSAS:00 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.


Skips and Missing Values

The NPSAS:00 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.



The table below shows codes for missing values used in NPSAS:00. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-2 Independent student
-3 Skipped
-9 Data missing
View methodology report | npsas2000ug_subject.pdf (8.68 MB) | npsas2000ug_varname.pdf (7.25 MB)
1996 | QuickStats: off | PowerStats: on | TrendStats: on | Sample size: 41,500

Imputation

Values for 22 analysis variables were imputed. The variables were imputed using a weighted hot deck procedure, with the exception of estimated family contribution (EFC), which was imputed through a multiple regression approach. The weighted hot deck imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor’s value to impute a value for the recipient.
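Regression-based imputation in the spirit of the EFC approach amounts to fitting a model on complete cases and predicting the missing values. The sketch below uses a single predictor and invented variable names for illustration; the actual EFC model was a multiple regression on several covariates:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def regression_impute(records, x, y):
    """Fit on records where y is observed (x must be fully observed),
    then predict y for the records where it is missing."""
    obs = [(r[x], r[y]) for r in records if r[y] is not None]
    intercept, slope = fit_line([p[0] for p in obs], [p[1] for p in obs])
    for r in records:
        if r[y] is None:
            r[y] = intercept + slope * r[x]
    return records

# Hypothetical data where EFC happens to track income exactly.
cases = [
    {"income": 10000, "efc": 2000},
    {"income": 20000, "efc": 4000},
    {"income": 30000, "efc": None},   # imputed as 6000.0
]
regression_impute(cases, "income", "efc")
```

Unlike hot-deck imputation, this produces model-predicted values that need not equal any observed donor value.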


Skips and Missing Values

The NPSAS:96 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.



The table below shows codes for missing values used in NPSAS:96. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Don't know
-2 Refused
-3 Skipped
-8 Data source not available
-9 Data missing
View methodology report | npsas1996ug_subject.pdf (3.47 MB) | npsas1996ug_varname.pdf (3.09 MB)
National Postsecondary Student Aid Study, Graduate
NPSAS:GR
Students who were graduate and first-professional students at the time of interview
General demographics, Types of aid and amounts received, Cost of attending college, Combinations of work, study, and borrowing, Enrollment patterns
https://nces.ed.gov/surveys/npsas
2012 | QuickStats: on | PowerStats: on | TrendStats: on | Sample size: 16,000

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Missing Values and Imputation

Following data collection, the data are subjected to various consistency and quality control checks before release. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely.


Except for data that were missing for cases to which they did not apply (e.g., whether a spouse is enrolled in college for unmarried students) and in a small number of items describing institutional characteristics, missing data were imputed using a two-step process. The first step is a logical imputation.1 If a value could be calculated from the logical relationships with other variables, then that information was used to impute the value for the observation with a missing value. The second step is weighted hot deck imputation.2 This procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor's value to impute a value for the observation with a missing value.


The table below shows the codes used for missing values that were not imputed in NPSAS:12. More information is available from the NPSAS:12 Data File Documentation (http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2014182).


Description of missing value codes

Missing data code Description
-1 Not classified
-2 Not applicable
-3 Skipped
-9 Missing

1Logical imputation is a process that aims to infer or deduce the missing values from values for other items.

2Sequential hot deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using the chi-square automatic interaction detection algorithm.
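The weighted hot-deck idea described above can be sketched in a few lines. This is an illustrative simplification with hypothetical data, not the NCES production algorithm (which selects donors sequentially within sorted classes and caps how often a donor is reused):

```python
# Simplified weighted hot-deck imputation within imputation classes
# (illustrative only; data and field names are hypothetical).
import random

def weighted_hot_deck(records, key, class_key, weight_key, rng=None):
    """Fill missing `key` values, class by class, from donors drawn with
    probability proportional to the donor's sampling weight."""
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    classes = {}
    for r in records:
        classes.setdefault(r[class_key], []).append(r)
    for members in classes.values():
        donors = [r for r in members if r[key] is not None]
        if not donors:
            continue  # no donor available in this class
        weights = [r[weight_key] for r in donors]
        for r in members:
            if r[key] is None:
                r[key] = rng.choices(donors, weights=weights, k=1)[0][key]
    return records

students = [
    {"sector": "public", "aid": 3000, "wt": 2.0},
    {"sector": "public", "aid": None, "wt": 1.0},
    {"sector": "private", "aid": 8000, "wt": 1.5},
    {"sector": "private", "aid": None, "wt": 1.0},
]
weighted_hot_deck(students, key="aid", class_key="sector", weight_key="wt")
```

Because donors are drawn in proportion to their weights, the weighted distribution of imputed values tracks the weighted distribution of observed values, which is the property the footnote describes.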

View methodology report: npsas2012gr_subject.pdf (1.47 MB); npsas2012gr_varname.pdf (4.20 MB)

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor's value to impute a value for the recipient.


Skips and Missing Values

Following data collection, the data are subjected to various consistency and quality control checks before release for use by analysts. One important check is examining all variables with missing data and substituting specific values to indicate the reason for the missing data. For example, an item may not have been applicable to some groups of respondents, a respondent may not have known the answer to a question, or a respondent may have skipped the item entirely.


The table below shows the set of reserve codes for missing values used in NPSAS 2008. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Not classified
-3 Not applicable
-6 Out of range
-8 Item was not reached due to an error
-9 Missing

1Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

2Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using a chi-squared automatic interaction detection (CHAID) algorithm.

View methodology report: npsas2008gr_subject.pdf (1.02 MB); npsas2008gr_varname.pdf (748 KB)
 

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in non-sampling errors. Data swapping and other forms of perturbation can lead to inconsistencies.


Imputation

The imputation procedures employed a two-step process. In the first step, the matching criteria and imputation classes that were used to stratify the dataset were identified such that all imputation was processed independently within each class. In the second step, the weighted sequential hot deck process1 was implemented, whereby missing data were replaced with valid data from donor records that match the recipients with respect to the matching criteria. Variables requiring imputation were not imputed simultaneously. However, some variables that were related substantively were grouped together into blocks, and the variables within a block were imputed simultaneously. Basic demographic variables were imputed first using variables with full information to determine the matching criteria. The order in which variables were imputed was also determined to some extent by the substantive nature of the variables. For example, basic demographics (such as age) were imputed first and these were used to process education variables (such as student level and enrollment intensity) which in turn were used to impute the financial aid variables (such as aid receipt and loan amounts).
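The dependency ordering described above (demographics first, then education variables, then financial aid) can be expressed as a simple schedule check. This is an illustrative sketch with hypothetical variable names, not the NPSAS:04 production code:

```python
# Illustrative block-wise imputation schedule: blocks are processed in
# dependency order, so earlier imputed variables can serve as matching
# criteria for later blocks. Variable names are hypothetical.
IMPUTATION_ORDER = [
    {"block": "demographics", "variables": ["age"], "match_on": []},
    {"block": "education", "variables": ["level", "intensity"],
     "match_on": ["age"]},
    {"block": "financial_aid", "variables": ["aid_receipt", "loan_amount"],
     "match_on": ["age", "level", "intensity"]},
]

def imputation_schedule(order):
    """Return (block, matching criteria) pairs in processing order, verifying
    that every matching variable was produced by an earlier block."""
    available = set()
    schedule = []
    for step in order:
        missing_deps = [v for v in step["match_on"] if v not in available]
        if missing_deps:
            raise ValueError(f"{step['block']} depends on unimputed {missing_deps}")
        schedule.append((step["block"], tuple(step["match_on"])))
        available.update(step["variables"])
    return schedule
```

The check makes the constraint in the text explicit: a block's matching criteria may only use variables that are already complete when the block is reached.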


Skips and Missing Values

Edit checks were performed on the NPSAS:04 student interview data and CADE data, both during and upon completion of data collection, to confirm that the intended skip patterns were implemented in both instruments. At the conclusion of data collection, special codes were added as needed to indicate the reason for missing data. Missing data within individual data elements can occur for a variety of reasons.


The table below shows the set of reserve codes for missing values used in NPSAS 2004. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Not classified
-3 Legitimate skip
-9 Missing

1Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. While each respondent record may be selected for use as a hot-deck donor, the number of times a respondent record is used for imputation is controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed are defined. Imputation classes are developed by using a chi-squared automatic interaction detection (CHAID) algorithm.

View methodology report: npsas2004gr_subject.pdf (1.06 MB); npsas2004gr_varname.pdf (787 KB)

Perturbation

To protect the confidentiality of NCES data that contain information about specific individuals, NPSAS:00 data were subject to perturbation procedures to minimize disclosure risk. Perturbation procedures, which have been approved by the NCES Disclosure Review Board, preserve the central tendency estimates but may result in slight increases in nonsampling errors.


Imputation

All variables with missing data were imputed. The imputation procedures employed a two-step process. The first step is a logical imputation.1 If the imputed value could be deduced from the logical relationships with other variables, then that information was used to impute the value for the recipient. The second step is weighted hot-deck imputation.2 This imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor's value to impute a value for the recipient.


Skips and Missing Values

The NPSAS:00 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


 

The table below shows codes for missing values used in NPSAS:00. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-2 Independent student
-3 Skipped
-9 Data missing
View methodology report: npsas2000gr_subject.pdf (1.71 MB); npsas2000gr_varname.pdf (1.43 MB)

Imputation

Values for 22 analysis variables were imputed. The variables were imputed using a weighted hot deck procedure, with the exception of estimated family contribution (EFC), which was imputed through a multiple regression approach. The weighted hot deck imputation procedure involves identifying a relatively homogeneous group of observations, and, from within the group, selecting a random donor's value to impute a value for the recipient.
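A regression approach of the kind described for EFC can be sketched as follows: fit a model on complete cases, then predict the missing values. This is an illustration with hypothetical data and a single predictor, not the NPSAS:96 production model (which used multiple regression):

```python
# Illustrative regression imputation for a continuous variable such as EFC.
# The data, variable names, and single-predictor model are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def regression_impute(records, y_key, x_key):
    """Fit on complete cases, then predict y for records where it is missing."""
    complete = [(r[x_key], r[y_key]) for r in records if r[y_key] is not None]
    a, b = fit_line([x for x, _ in complete], [y for _, y in complete])
    for r in records:
        if r[y_key] is None:
            r[y_key] = a + b * r[x_key]
    return records

rows = [
    {"income": 20, "efc": 1000}, {"income": 40, "efc": 2000},
    {"income": 60, "efc": 3000}, {"income": 50, "efc": None},
]
regression_impute(rows, y_key="efc", x_key="income")
```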


Skips and Missing Values

The NPSAS:96 data were edited using procedures developed and implemented for previous studies sponsored by NCES. Following data collection, the information collected in the student instrument was subjected to various quality control checks and examinations. These checks were to confirm that the collected data reflected appropriate skip patterns. Another evaluation examined all variables with missing data and substituted specific values to indicate the reason for the missing data. A variety of explanations are possible for missing data.


 

The table below shows codes for missing values used in NPSAS:96. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-1 Don't know
-2 Refused
-3 Skipped
-8 Data source not available
-9 Data missing
View methodology report: npsas1996gr_subject.pdf (2.53 MB); npsas1996gr_varname.pdf (2.13 MB)
National Study of Postsecondary Faculty
NSOPF
Postsecondary faculty
Workload, Equity issues, Involvement in undergraduate teaching, Relationship between teaching and research
https://nces.ed.gov/surveys/nsopf

Perturbation

A restricted faculty-level data file was created for release to individuals who apply for and meet standards for such data releases. While this file does not include personally identifying information (i.e., name and Social Security number), other data (i.e., institution, Integrated Postsecondary Education Data System [IPEDS] ID, demographic information, and salary data) could be combined in ways that appear to identify data records corresponding to a particular faculty member. To protect further against such situations, some of the variable values were swapped between faculty respondents. This procedure perturbed the data and added uncertainty to them. Thus, associations made among variable values to identify a faculty respondent may be based on the original or edited, imputed, and/or swapped data. For the same reasons, the data from the institution questionnaire were also swapped to avoid data disclosure.
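The swapping idea can be illustrated in a few lines: exchanging a sensitive variable's values between randomly paired records leaves the variable's overall distribution unchanged while making record-level linkage uncertain. This is a hypothetical sketch, not the NCES production procedure (which targets at-risk records rather than swapping uniformly at random):

```python
# Illustrative data swapping for disclosure limitation. The faculty records
# and salary values are hypothetical.
import random

def swap_values(records, key, n_pairs, rng=None):
    """Exchange `key` values between n_pairs randomly chosen record pairs."""
    rng = rng or random.Random(42)
    for _ in range(n_pairs):
        i, j = rng.sample(range(len(records)), 2)
        records[i][key], records[j][key] = records[j][key], records[i][key]
    return records

faculty = [{"id": k, "salary": 50000 + 1000 * k} for k in range(10)]
before = sorted(r["salary"] for r in faculty)
swap_values(faculty, "salary", n_pairs=3)
after = sorted(r["salary"] for r in faculty)
# The multiset of salaries is preserved; only their assignment to records changes.
```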


Imputation

Item imputation for the faculty questionnaire was performed in several steps. In the first step, the missing values of gender, race, and ethnicity were filled—using cold-deck imputation1— based on the sampling frame information or institution record data. These three key demographic variables were imputed prior to any other variables since they were used as key predictors for all other variables on the data file. After all logical2 and cold-deck imputation procedures were performed, the remaining variables were imputed using the weighted sequential hot-deck method.3 Initially, variables were separated into two groups: unconditional and conditional variables. The first group (unconditional) consisted of variables that applied to all respondents, while the second group (conditional) consisted of variables that applied to only a subset of the respondents. That is, conditional variables were subject to “gate” questions. After this initial grouping, these groups were divided into finer subgroups. After all variables were imputed, consistency checks were applied to the entire faculty data file to ensure that the imputed values did not conflict with other questionnaire items, observed or imputed. This process involved reviewing all of the logical imputation and editing rules as well.
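Cold-deck imputation as described above, filling a missing demographic value from an external source such as the sampling frame, can be sketched as follows (identifiers and data are hypothetical):

```python
# Illustrative cold-deck imputation: missing values are filled from an
# external source keyed on the record identifier, rather than from donors
# within the survey data. All names and values are hypothetical.

frame = {101: {"gender": "F"}, 102: {"gender": "M"}}  # sampling-frame data

def cold_deck(records, key, external, id_key="id"):
    """Fill missing `key` values from the external source where available."""
    for r in records:
        if r[key] is None:
            src = external.get(r[id_key], {})
            if src.get(key) is not None:
                r[key] = src[key]
    return records

respondents = [{"id": 101, "gender": None}, {"id": 103, "gender": "M"},
               {"id": 102, "gender": None}]
cold_deck(respondents, "gender", frame)
```

As the footnote notes, this often recovers the actual missing value, which is why the demographic variables were filled this way before serving as predictors for hot-deck imputation.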


Skips and Missing Values

During and following data collection, the data were reviewed to confirm that the data collected reflected the intended skip-pattern relationships. At the conclusion of data collection, special codes were inserted in the database to reflect the different types of missing data. There are a number of explanations for missing data; for example, the item may not have been applicable to certain respondents or a respondent may not have known the answer to the question. With the exception of the not applicable codes, missing data were stochastically imputed. Moreover, contextual weights were produced for the subset of responding faculty members whose sampled institutions provided faculty lists and responded to the institution survey; these weights support hierarchical analyses and survey estimates for that subset.


The table below shows codes for missing values used in NSOPF:04. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-3 Legitimate skip
-7 Not reached
-9 Missing

1Cold-deck imputation involves replacing the missing values with data from sources such as data used for sampling frame construction. While resource intensive, these methods often obtain the actual value that is missing. Stochastic imputation methods, such as sequential hot-deck imputation, rely on the observed data to provide replacing values (donors) for records with missing values.

2Logical imputation is a process that aims to infer or deduce the missing values from answers to other questions.

3Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. Under this methodology, while each respondent record has a chance to be selected for use as a hot-deck donor, the number of times a respondent record can be used for imputation will be controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed were defined. Imputation classes were developed by using a chi-squared automatic interaction detection (CHAID) algorithm.

View methodology report: nsopf04_subject.pdf (1.16 MB); nsopf04_varname.pdf (926 KB)
National Study of Postsecondary Faculty, Institutions
NSOPF
Postsecondary institutions
Faculty tenure policies, Union representation, and Faculty attrition
https://nces.ed.gov/surveys/nsopf
 

Imputation

The imputation process for the missing data from the institution questionnaire involved similar steps to those used for imputation of the faculty data. The missing data for variables were imputed using the weighted sequential hot-deck method.1 Analogous to the imputation process for the faculty data, the variables were partitioned into conditional and unconditional groups. The unconditional variables were sorted by percent missing and then imputed in the order from the lowest percent missing to the highest. The conditional group was partitioned into three subgroups based on the level of conditionality for each variable, and then imputed in that order. The imputation class for both unconditional and conditional variables consisted of the institution sampling stratum, and the sorting variables included the number of full-time and part-time faculty members.
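Ordering the unconditional variables by their share of missing values, lowest first, can be computed directly. An illustrative sketch with hypothetical institution records:

```python
# Illustrative ordering of variables for imputation by missing rate,
# lowest first. Institution records and variable names are hypothetical.

def impute_order(records, variables):
    """Return variables sorted ascending by their share of missing values."""
    def pct_missing(var):
        return sum(1 for r in records if r.get(var) is None) / len(records)
    return sorted(variables, key=pct_missing)

insts = [
    {"ft_faculty": 120, "pt_faculty": None, "union": None},
    {"ft_faculty": 80, "pt_faculty": 40, "union": None},
    {"ft_faculty": 95, "pt_faculty": 30, "union": None},
    {"ft_faculty": 60, "pt_faculty": 25, "union": "yes"},
]
order = impute_order(insts, ["union", "pt_faculty", "ft_faculty"])
```

Imputing the least-missing variables first means later, harder items can draw on a nearly complete set of predictors.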


Skips and Missing Values

During and following data collection, the data were reviewed to confirm that the data collected reflected the intended skip-pattern relationships. At the conclusion of data collection, special codes were inserted in the database to reflect the different types of missing data. There are a number of explanations for missing data; for example, the item may not have been applicable to certain respondents or a respondent may not have known the answer to the question. With the exception of the not applicable codes, missing data were stochastically imputed. Moreover, contextual weights were produced for the subset of responding faculty members whose sampled institutions provided faculty lists and responded to the institution survey; these weights support hierarchical analyses and survey estimates for that subset.


The table below shows codes for missing values used in NSOPF:04. Please consult the methodology report for more information.


Description of missing data codes

Missing data code Description
-3 Legitimate skip
-7 Not reached
-9 Missing

1Sequential hot-deck imputation involves defining imputation classes, which generally consist of a cross-classification of covariates, and then replacing missing values sequentially from a single pass through the survey data within the imputation classes. When this form of imputation is performed using the sampling weights, the procedure is called weighted sequential hot-deck imputation. This procedure takes into account the unequal probabilities of selection in the original sample to specify the expected number of times a particular respondent's answer will be used as a donor. These expected selection frequencies are specified so that, over repeated applications of the algorithm, the weighted distribution of all values for that variable—imputed and observed—will resemble that of the target universe. Under this methodology, while each respondent record has a chance to be selected for use as a hot-deck donor, the number of times a respondent record can be used for imputation will be controlled. To implement the weighted sequential hot-deck procedure, imputation classes and sorting variables that are relevant (strong predictors) for each item being imputed were defined. Imputation classes were developed by using a chi-squared automatic interaction detection (CHAID) algorithm.

View methodology report: nsopf04inst_subject.pdf (543 KB); nsopf04inst_varname.pdf (471 KB)
Dataset | Years | Level | QuickStats | PowerStats | TrendStats
Baccalaureate and Beyond: 1993/2003 | 1993/2003 | Postsecondary | On | On | Off
Baccalaureate and Beyond: 1993/2003 Graduate students | 1993/2003 | Postsecondary | Off | On | Off
Baccalaureate and Beyond: 2000/2001 | 2000/2001 | Postsecondary | Off | On | Off
Baccalaureate and Beyond: 2008/2012 | 2008/2012 | Postsecondary | On | On | Off
Beginning Postsecondary Students: 1990/1994 | 1990/1994 | Postsecondary | Off | On | Off
Beginning Postsecondary Students: 1996/2001 | 1996/2001 | Postsecondary | On | On | Off
Beginning Postsecondary Students: 2004/2009 | 2004/2009 | Postsecondary | On | On | Off
Beginning Postsecondary Students: 2012/2014 | 2012/2014 | Postsecondary | On | On | Off
Education Longitudinal Study of 2002 | 2002 | Postsecondary | On | On | Off
Education Longitudinal Study of 2002 | 2002 | K-12 | On | On | Off
High School Longitudinal Study of 2009 | 2009 | Postsecondary | On | On | Off
High School Longitudinal Study of 2009 | 2009 | K-12 | On | On | Off
National Postsecondary Student Aid Study: 1996 Graduate Students | 1996 | Postsecondary | Off | On | On (2)
National Postsecondary Student Aid Study: 1996 Undergraduates | 1996 | Postsecondary | Off | On | On (1)
National Postsecondary Student Aid Study: 2000 Graduate Students | 2000 | Postsecondary | Off | On | On (2)
National Postsecondary Student Aid Study: 2000 Undergraduates | 2000 | Postsecondary | Off | On | On (1)
National Postsecondary Student Aid Study: 2004 Graduate Students | 2004 | Postsecondary | On | On | On (2)
National Postsecondary Student Aid Study: 2004 Undergraduates | 2004 | Postsecondary | On | On | On (1)
National Postsecondary Student Aid Study: 2008 Graduate Students | 2008 | Postsecondary | On | On | On (2)
National Postsecondary Student Aid Study: 2008 Undergraduates | 2008 | Postsecondary | On | On | On (1)
National Postsecondary Student Aid Study: 2012 Graduate Students | 2012 | Postsecondary | On | On | On (2)
National Postsecondary Student Aid Study: 2012 Undergraduates | 2012 | Postsecondary | On | On | On (1)
National Study of Postsecondary Faculty: 2004 Faculty | 2004 | Postsecondary | On | On | Off
National Study of Postsecondary Faculty: 2004 Institution | 2004 | Postsecondary | On | On | Off
Pre-Elementary Education Longitudinal Study, Waves 1-5 | 2003/2008 | Pre-K | On | On | Off
School Survey on Crime and Safety: 2005-06 | 2005-2006 | K-12 | On | On | Off
School Survey on Crime and Safety: 2007-08 | 2007-2008 | K-12 | On | On | Off
School Survey on Crime and Safety: 2009-10 | 2009-2010 | K-12 | On | On | Off
Schools and Staffing Survey, Districts: 2011-12 | 2011-2012 | K-12 | Off | On | Off
Schools and Staffing Survey, Library Media Centers: 2011-12 | 2011-2012 | K-12 | Off | On | Off
Schools and Staffing Survey, Public and Private School Principals: 2011-12 | 2011-2012 | K-12 | Off | On | Off
Schools and Staffing Survey, Public and Private Schools: 2011-12 | 2011-2012 | K-12 | Off | On | Off
Schools and Staffing Survey, Public and Private Teachers: 2011-12 | 2011-2012 | K-12 | Off | On | Off
Baccalaureate and Beyond: 1993/20031993/2003PostsecondaryqsOnpsOntsOff
Table 1. Percentage distribution of 1995–96 beginning postsecondary students' highest degree attained by 2001, by work status

| Highest degree completed as of June 2001 | Certificate (%) | Associate (%) | Bachelor (%) | Never attained (%) | Total |
|---|---|---|---|---|---|
| Total | 11.7 | 9.8 | 29.8 | 48.6 | 100% |
| Job 1995–96: hours worked per week while enrolled | | | | | |
| Did not work while enrolled | 14.0 | 9.8 | 38.5 | 37.8 | 100% |
| Worked part time | 8.9 | 11.6 | 35.5 | 44.0 | 100% |
| Worked full time | 14.5 | 7.2 | 8.3 | 69.9 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1995–96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES QuickStats on 6/22/2009
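The rounding note attached to these tables reflects a simple mechanism: each cell is rounded to one decimal place independently, so a row of rounded percentages can sum to slightly more or less than 100%. A minimal sketch with hypothetical counts (not NCES data):

```python
# Hypothetical category counts; each percentage is rounded independently.
counts = [1, 1, 1]
total = sum(counts)
pcts = [round(100 * c / total, 1) for c in counts]
print(pcts)                 # [33.3, 33.3, 33.3]
print(round(sum(pcts), 1))  # 99.9 -- the row no longer sums to exactly 100%
```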
Table 2. Percentage distribution of 1995–96 beginning postsecondary students' highest degree attained by 2001, by number of Advanced Placement tests taken

| Persistence and completion at any institution as of 2000-01 | Never attained (%) | Certificate (%) | Associate (%) | Bachelor (%) | Total |
|---|---|---|---|---|---|
| Total | 48.6 | 11.7 | 9.9 | 29.8 | 100% |
| Number of Advanced Placement tests taken | | | | | |
| 0 | 51.1 | 7.7 | 12.1 | 29.1 | 100% |
| 1 | 38.1 | 2.6 | 6.0 | 53.4 | 100% |
| 2 | 33.6 | 0.4 | 3.4 | 62.6 | 100% |
| Three or more | 13.8 | 0.1 | 1.4 | 84.8 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1995–96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES QuickStats on 6/22/2009
Table 3. Percentage of beginning postsecondary students who received Pell grants, by race/ethnicity: 1995–96

| | Pell Grant amount 1995-96 (%>0) |
|---|---|
| Total | 26.4 |
| Race/ethnicity | |
| White, non-Hispanic | 19.0 |
| Black, non-Hispanic | 49.3 |
| Hispanic | 42.4 |
| Asian/Pacific Islander | 35.5 |
| American Indian/Alaska Native | 33.2 |
| Other | ‡ |

‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1995–96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES QuickStats on 3/10/2009
Table 4. Percentage distribution of 1995–96 beginning postsecondary students' grade point average (GPA) through 2001, by income percentile rank

| Cumulative Grade Point Average (GPA) as of 2001 | Mostly A's (%) | A's and B's (%) | Mostly B's (%) | B's and C's (%) | Mostly C's (%) | C's and D's (%) | Mostly D's or below (%) | Total |
|---|---|---|---|---|---|---|---|---|
| Total | 13.3 | 31.8 | 35.3 | 14.4 | 4.4 | 0.7 | 0.1 | 100% |
| Income percentile rank 1994 | | | | | | | | |
| 1-25 | 13.1 | 28.2 | 37.8 | 14.7 | 4.7 | 1.4 | 0.2 | 100% |
| 26-50 | 13.5 | 30.2 | 37.3 | 12.8 | 5.8 | 0.3 | 0.2 | 100% |
| 51-75 | 12.9 | 36.1 | 33.1 | 14.0 | 3.4 | 0.4 | 0.. | 100% |
| More than 75 | 13.7 | 32.7 | 33.0 | 16.3 | 3.7 | 0.7 | 0.0 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1995–96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES QuickStats on 6/22/2009
Table 5. Percentage distribution of 1995–96 beginning postsecondary students' persistence at any institution through 2001, by gender

| Persistence at any institution through 2001 | Attained, still enrolled (%) | Attained, not enrolled (%) | Never attained, still enrolled (%) | Never attained, not enrolled (%) | Total |
|---|---|---|---|---|---|
| Total | 5.9 | 45.5 | 14.9 | 33.7 | 100% |
| Gender | | | | | |
| Male | 5.9 | 41.8 | 15.8 | 36.5 | 100% |
| Female | 5.8 | 48.5 | 14.2 | 31.5 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1995–96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES QuickStats on 6/22/2009
Table 1. Percentage of graduate students who borrowed, by type of graduate program: 2003–04

| | Loans: total student loans all sources (%>0) |
|---|---|
| Total | 40.0 |
| Graduate study: program | |
| Business administration (MBA) | 39.1 |
| Education (any master's) | 34.8 |
| Other master of arts (MA) | 41.3 |
| Other master of science (MS) | 31.8 |
| Other master's degree | 49.3 |
| PhD except in education | 19.9 |
| Education (any doctorate) | 27.1 |
| Other doctoral degree | 49.5 |
| Medicine (MD) | 77.3 |
| Other health science degree | 81.7 |
| Law (LLB or JD) | 81.0 |
| Theology (MDiv, MHL, BD) | 30.0 |
| Post-baccalaureate certificate | 30.1 |
| Not in a degree program | 28.0 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 2. Percentage of graduate students with assistantships, by graduate field of study: 2003–04

| | Assistantships (%>0) |
|---|---|
| Total | 15.3 |
| Graduate study: major field | |
| Humanities | 20.8 |
| Social/behavioral sciences | 31.7 |
| Life sciences | 47.4 |
| Math/Engineering/Computer science | 37.9 |
| Education | 7.6 |
| Business/management | 7.9 |
| Health | 10.3 |
| Law | 5.8 |
| Others | 23.8 |
| Undeclared or not in a degree program | 5.4 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 3. Percentage distribution of graduate students' student/employee role, by graduate field of study: 2003–04

| Work: primarily student or employee | A student working to meet expenses (%) | An employee enrolled in school (%) | No job (%) | Total |
|---|---|---|---|---|
| Total | 35.8 | 45.1 | 19.1 | 100% |
| Graduate study: major field | | | | |
| Humanities | 44.9 | 35.9 | 19.2 | 100% |
| Social/behavioral sciences | 58.9 | 24.6 | 16.5 | 100% |
| Life sciences | 61.0 | 20.7 | 18.3 | 100% |
| Math/Engineering/Computer science | 47.4 | 38.3 | 14.3 | 100% |
| Education | 26.3 | 63.3 | 10.4 | 100% |
| Business/management | 24.8 | 61.8 | 13.3 | 100% |
| Health | 39.4 | 19.0 | 41.6 | 100% |
| Law | 39.6 | 11.6 | 48.8 | 100% |
| Others | 47.0 | 38.5 | 14.5 | 100% |
| Undeclared or not in a degree program | 20.5 | 67.3 | 12.2 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 4. Percentage of graduate students who have ever borrowed, by institution type: 2003–04

| | Total loan debt (cumulative) (%>0) |
|---|---|
| Total | 65.2 |
| Type of 4-year institution | |
| Public 4-year nondoctorate | 61.4 |
| Public 4-year doctorate | 60.6 |
| Private not-for-profit 4-yr nondoctorate | 61.6 |
| Private not-for-profit 4-year doctorate | 71.3 |
| Private for-profit 4-year | 85.9 |
| Attended more than one institution | 68.9 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 5. Average loan amount for graduate students, by parents' education: 2003–04

| | Loans: total student loans all sources (Mean[0]) |
|---|---|
| Total | 6,302.0 |
| Parent's highest education | |
| Do not know parent's education level | 7,677.5 |
| High school diploma or less | 5,878.7 |
| Some college | 6,016.3 |
| Bachelor's degree | 5,794.3 |
| Master's degree or higher | 7,185.9 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 1. Percentage of undergraduate students who applied for aid, by parents' income: 2003–04

| Aid: applied for federal aid | Yes (%) | No (%) | Total |
|---|---|---|---|
| Total | 58.3 | 41.7 | 100% |
| Income: dependent student household income | | | |
| Less than $32,000 | 78.7 | 21.3 | 100% |
| $32,000-59,999 | 66.6 | 33.4 | 100% |
| $60,000-91,999 | 56.9 | 43.1 | 100% |
| $92,000 or more | 47.1 | 52.9 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 2. Percentage distribution of undergraduates' cumulative grade point average (GPA) categories, by major field of study: 2003–04

| Cumulative Grade Point Average (GPA) as of 2003-04 | Less than 2.75 (%) | 2.75 - 3.74 (%) | 3.75 or higher (%) | Total |
|---|---|---|---|---|
| Total | 34.4 | 49.0 | 16.7 | 100% |
| College study: major | | | | |
| Humanities | 35.9 | 50.4 | 13.6 | 100% |
| Social/behavioral sciences | 35.0 | 52.1 | 12.8 | 100% |
| Life sciences | 34.9 | 52.7 | 12.4 | 100% |
| Physical sciences | 31.5 | 54.3 | 14.2 | 100% |
| Math | 29.1 | 55.3 | 15.6 | 100% |
| Computer/information science | 34.0 | 48.1 | 17.9 | 100% |
| Engineering | 37.4 | 48.1 | 14.5 | 100% |
| Education | 31.9 | 52.6 | 15.5 | 100% |
| Business/management | 35.6 | 49.3 | 15.1 | 100% |
| Health | 32.2 | 50.7 | 17.0 | 100% |
| Vocational/technical | 33.3 | 47.1 | 19.6 | 100% |
| Other technical/professional | 36.7 | 49.9 | 13.4 | 100% |
| Undeclared or not in a degree program | 33.2 | 44.1 | 22.8 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 3. Mean net price of attendance for undergraduate students, by type of institution: 2003–04

| | Net price after all aid (Mean[0]) |
|---|---|
| Total | 6,656.0 |
| Institution: type | |
| Public less-than-2-year | 5,616.5 |
| Public 2-year | 4,716.3 |
| Public 4-year nondoctorate | 6,253.5 |
| Public 4-year doctorate | 7,564.1 |
| Private not-for-profit less-than-4-year | 7,382.3 |
| Private not-for-profit 4-yr nondoctorate | 9,208.7 |
| Private not-for-profit 4-year doctorate | 14,812.2 |
| Private for-profit less-than-2-year | 7,842.9 |
| Private for-profit 2 years or more | 6,737.6 |
| Attended more than one institution | ‡ |

‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
Table 4. Percentage distribution of undergraduates' parents' highest level of education, by type of institution: 2003–04

| Parent's highest education | High school or less (%) | Some college (%) | Bachelor's degree or higher (%) | Total |
|---|---|---|---|---|
| Total | 37.1 | 21.6 | 41.3 | 100% |
| Institution: type | | | | |
| Public less-than-2-year | 54.2 | 17.4 | 28.4 | 100% |
| Public 2-year | 43.3 | 23.9 | 32.7 | 100% |
| Public 4-year nondoctorate | 28.7 | 20.5 | 50.8 | 100% |
| Public 4-year doctorate | 46.9 | 18.8 | 34.2 | 100% |
| Private not-for-profit less than 4-year | 29.6 | 18.1 | 52.3 | 100% |
| Private not-for-profit 4-year nondoctorate | 55.6 | 17.4 | 27.0 | 100% |
| Private not-for-profit 4-year doctorate | 53.8 | 20.2 | 25.9 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 10/14/2009
Table 5. Average amount of Pell grants received by undergraduates, by income and dependency status: 2003–04

| | Grants: Pell Grants (Avg>0) |
|---|---|
| Total | 2,449.7 |
| Income: categories by dependency status | |
| Dependent: Less than $10,000 | 3,242.2 |
| Dependent: $10,000-$19,999 | 3,176.1 |
| Dependent: $20,000-$29,999 | 2,715.0 |
| Dependent: $30,000-$39,999 | 1,958.3 |
| Dependent: $40,000-$49,999 | 1,508.6 |
| Dependent: $50,000-$59,999 | 1,309.0 |
| Dependent: $60,000-$69,999 | 1,241.7 |
| Dependent: $70,000-$79,999 | 1,404.4 |
| Dependent: $80,000-$99,999 | ‡ |
| Dependent: $100,000 or more | ‡ |
| Independent: Less than $5,000 | 2,860.3 |
| Independent: $5,000-$9,999 | 2,642.9 |
| Independent: $10,000-$19,999 | 2,291.7 |
| Independent: $20,000-$29,999 | 2,328.3 |
| Independent: $30,000-$49,999 | 1,561.9 |
| Independent: $50,000 or more | 1,124.3 |

‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES QuickStats on 8/25/2009
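The parenthetical subheads in these tables appear to encode the statistic type: (%>0) is the share of students with a positive value, (Avg>0) the average over only those with a positive value, and (Mean[0]) the mean over everyone including the zeros. A minimal unweighted sketch of the distinction, using toy amounts (hypothetical; actual NCES estimates are survey-weighted):

```python
# Toy grant amounts for four students; two received nothing.
amounts = [0, 0, 1500, 3000]

n = len(amounts)
n_pos = sum(1 for a in amounts if a > 0)

pct_gt0 = 100 * n_pos / n                           # (%>0): share with a positive amount
avg_gt0 = sum(a for a in amounts if a > 0) / n_pos  # (Avg>0): mean among recipients only
mean_incl_zero = sum(amounts) / n                   # (Mean[0]): mean including the zeros

print(pct_gt0, avg_gt0, mean_incl_zero)  # 50.0 2250.0 1125.0
```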
Table 1. Percentage distribution of instructional faculty and staff's employment status, by institution type: Fall 2003

| Employment status at this job | Full time (%) | Part time (%) | Total |
|---|---|---|---|
| Total | 56.3 | 43.7 | 100% |
| Institution: type and control | | | |
| Public doctoral | 77.8 | 22.2 | 100% |
| Private not-for-profit doctoral | 68.7 | 31.3 | 100% |
| Public master's | 63.3 | 36.7 | 100% |
| Private not-for-profit master's | 45.0 | 55.0 | 100% |
| Private not-for-profit baccalaureate | 63.2 | 36.8 | 100% |
| Public associate's | 33.3 | 66.7 | 100% |
| Other | 49.2 | 50.8 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 2. Percentage distribution of full-time instructional faculty and staff, by race/ethnicity and institution type: Fall 2003

| Race/ethnicity | White, non-Hispanic (%) | Black, non-Hispanic (%) | Asian/Pacific Islander (%) | Hispanic (%) | Other (%) |
|---|---|---|---|---|---|
| Total | 80.3 | 5.9 | 8.6 | 3.4 | 1.2 |
| Institution: type and control | | | | | |
| Public doctoral | 79.4 | 4.5 | 12.0 | 3.0 | 1.0 |
| Private not-for-profit doctoral | 79.1 | 5.3 | 11.9 | 2.9 | 0.8 |
| Public master's | 78.3 | 8.9 | 7.6 | 3.6 | 1.6 |
| Private not-for-profit master's | 85.4 | 5.1 | 5.7 | 2.5 | 1.3 |
| Private not-for-profit baccalaureate | 85.8 | 6.8 | 4.2 | 2.2 | 1.1 |
| Public associate's | 81.2 | 7.2 | 4.4 | 5.5 | 1.7 |
| Other | 86.9 | 4.6 | 5.8 | 1.7 | 1.0 |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 3. Percentage distribution of full-time instructional faculty and staff, by tenure status and institution type: Fall 2003

| Tenure status | Tenured (%) | On tenure track but not tenured (%) | Not on tenure track (%) | Not tenured-no tenure system (%) | Total |
|---|---|---|---|---|---|
| Total | 49.3 | 21.3 | 20.9 | 8.5 | 100% |
| Institution: type and control | | | | | |
| Public doctoral | 53.0 | 20.4 | 25.9 | 0.7 | 100% |
| Private not-for-profit doctoral | 47.1 | 19.6 | 28.8 | 4.5 | 100% |
| Public master's | 53.7 | 28.3 | 16.9 | 1.0 | 100% |
| Private not-for-profit master's | 41.9 | 28.1 | 21.5 | 8.6 | 100% |
| Private not-for-profit baccalaureate | 42.9 | 25.1 | 21.6 | 10.4 | 100% |
| Public associate's | 49.1 | 15.6 | 9.3 | 26.0 | 100% |
| Other | 39.4 | 17.3 | 18.7 | 24.6 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 4. Percentage distribution of part-time instructional faculty and staff, by academic rank and institution type: Fall 2003

| Academic rank | Professor (%) | Associate professor (%) | Assistant professor (%) | Instructor or lecturer (%) | Other ranks/not applicable (%) |
|---|---|---|---|---|---|
| Total | 4.6 | 2.9 | 3.4 | 42.2 | 46.9 |
| Institution: type and control | | | | | |
| Public doctoral | 6.3 | 4.5 | 8.1 | 45.0 | 36.0 |
| Private not-for-profit doctoral | 5.6 | 4.9 | 9.1 | 31.9 | 48.5 |
| Public master's | 6.4 | 2.3 | 2.0 | 40.7 | 48.7 |
| Private not-for-profit master's | 2.7 | 3.4 | 2.6 | 30.3 | 60.9 |
| Private not-for-profit baccalaureate | 4.6 | 4.2 | 5.4 | 32.5 | 53.3 |
| Public associate's | 3.4 | 1.5 | 1.0 | 49.5 | 44.6 |
| Other | 7.1 | 4.9 | 5.2 | 33.3 | 49.4 |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 5. Average hours worked per week among full-time instructional faculty and staff, by tenure status: Fall 2003

| | Hours worked per week (Mean>0) |
|---|---|
| Total | 47.4 |
| Tenure status | |
| Tenured | 53.3 |
| On tenure track but not tenured | 53.7 |
| Not on tenure track | 43.0 |
| Not tenured-no tenure system | 45.4 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 1. Percentage of institutions with full- or part-time faculty represented by a union, by institution type: Fall 2003

| Faculty represented by a union | Not represented by a union (%) | Represented by a union (%) | Total |
|---|---|---|---|
| Total | 68.1 | 31.9 | 100% |
| Institution: type and control | | | |
| Public doctoral | 69.1 | 30.9 | 100% |
| Private not-for-profit doctoral | 94.4 | 5.6 | 100% |
| Public master's | 58.1 | 41.9 | 100% |
| Private not-for-profit master's | 87.6 | 12.4 | 100% |
| Private not-for-profit baccalaureate | 86.7 | 13.3 | 100% |
| Public associate's | 42.4 | 57.6 | 100% |
| Other | 78.3 | 21.7 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 2. Among institutions with a tenure system, average percentage of undergraduate student credit hours assigned to full-time faculty and instructional staff, by institution type: Fall 2003

| | Undergraduate instruction: percent full-time faculty (Mean[0]) |
|---|---|
| Total | 70.8 |
| Institution: type and control | |
| Public doctoral | 68.6 |
| Private not-for-profit doctoral | 71.6 |
| Public master's | 75.7 |
| Private not-for-profit master's | 68.8 |
| Private not-for-profit baccalaureate | 76.1 |
| Public associate's | 58.7 |
| Other | 82.8 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 3. Percentage of institutions that have downsized tenured faculty, by institution type: Fall 2003

| Downsized tenured faculty | No (%) | Yes (%) | Total |
|---|---|---|---|
| Total | 85.7 | 14.3 | 100% |
| Institution: type and control | | | |
| Public doctoral | 83.4 | 16.6 | 100% |
| Private not-for-profit doctoral | 93.9 | 6.1 | 100% |
| Public master's | 90.7 | 9.3 | 100% |
| Private not-for-profit master's | 99.6 | 0.4 | 100% |
| Private not-for-profit baccalaureate | 88.1 | 11.9 | 100% |
| Public associate's | 87.7 | 12.3 | 100% |
| Other | 68.0 | 32.0 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 4. Percentage distribution of the maximum number of years full-time faculty and instructional staff can be on a tenure track without receiving tenure, by institution type: Fall 2003

| Maximum years on tenure track | No maximum (%) | Less than 5 years (%) | 5 years (%) | 6 years (%) | 7 years (%) | More than 7 years (%) | Total |
|---|---|---|---|---|---|---|---|
| Total | 17.5 | 17.4 | 8.5 | 27.0 | 26.0 | 3.6 | 100% |
| Institution: type and control | | | | | | | |
| Public doctoral | 7.5 | 0.0 | 1.1 | 37.3 | 45.9 | 8.2 | 100% |
| Private not-for-profit doctoral | 11.4 | 0.0 | 2.8 | 32.0 | 34.4 | 19.4 | 100% |
| Public master's | 1.5 | 0.0 | 22.0 | 37.1 | 38.9 | 0.6 | 100% |
| Private not-for-profit master's | 16.8 | 0.0 | 7.1 | 40.5 | 27.4 | 8.2 | 100% |
| Private not-for-profit baccalaureate | 9.9 | 0.7 | 0.0 | 53.5 | 32.2 | 3.7 | 100% |
| Public associate's | 15.6 | 44.6 | 16.9 | 8.2 | 13.7 | 1.1 | 100% |
| Other | 41.9 | 27.1 | 1.9 | 10.3 | 18.5 | 0.2 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 5. Percentage of institutions in which over half of student instruction hours are assigned to part-time faculty, by institution type: Fall 2003

| | Undergraduate instruction: percent part-time faculty (%>50) |
|---|---|
| Total | 17.9 |
| Institution: type and control | |
| Public doctoral | 0.6 |
| Private not-for-profit doctoral | 9.9 |
| Public master's | 1.6 |
| Private not-for-profit master's | 15.6 |
| Private not-for-profit baccalaureate | 11.1 |
| Public associate's | 23.9 |
| Other | 26.0 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES QuickStats on 6/19/2009
Table 1. Percentage distribution of 1992–93 bachelor's degree recipients' time-to-degree in years, by major field of study: 2003

| Number of months to bachelor's degree | Within 4 years (%) | 4–5 years (%) | 5–6 years (%) | 6–10 years (%) | More than 10 years (%) | Total |
|---|---|---|---|---|---|---|
| Total | 35.5 | 27.4 | 11.4 | 11.7 | 14.0 | 100% |
| Undergraduate major | | | | | | |
| Business and management | 32.6 | 26.9 | 8.7 | 13.3 | 18.6 | 100% |
| Education | 32.9 | 30.4 | 10.7 | 11.0 | 15.0 | 100% |
| Engineering | 25.3 | 37.4 | 15.9 | 11.4 | 10.0 | 100% |
| Health professions | 22.0 | 27.3 | 13.5 | 14.2 | 23.1 | 100% |
| Public affairs/social services | 28.3 | 29.7 | 11.9 | 13.2 | 17.0 | 100% |
| Biological sciences | 53.5 | 21.7 | 10.9 | 8.4 | 5.5 | 100% |
| Mathematics & science | 38.9 | 24.9 | 11.7 | 11.2 | 13.3 | 100% |
| Social science | 47.5 | 25.3 | 11.4 | 10.2 | 5.6 | 100% |
| History | 40.1 | 26.3 | 20.0 | 5.3 | 8.3 | 100% |
| Humanities | 39.8 | 21.4 | 12.8 | 12.1 | 13.8 | 100% |
| Psychology | 39.8 | 26.1 | 7.3 | 12.0 | 14.8 | 100% |
| Other | 35.4 | 28.7 | 12.4 | 11.3 | 12.2 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/24/2009
Table 1. Percentage distribution of 1992–93 bachelor's degree recipients' time-to-degree in years, by major field of study: 2003

| Number of months to bachelor's degree | Within 4 years (%) | 4–5 years (%) | 5–6 years (%) | 6–10 years (%) | More than 10 years (%) | Total |
|---|---|---|---|---|---|---|
| Total | 46.3 | 24.3 | 9.0 | 8.0 | 12.4 | 100% |
| Undergraduate major | | | | | | |
| Business and management | 43.1 | 22.1 | 9.3 | 8.0 | 17.6 | 100% |
| Education | 38.3 | 29.9 | 8.7 | 9.6 | 13.5 | 100% |
| Engineering | 38.7 | 35.9 | 12.3 | 7.2 | 5.9 | 100% |
| Health professions | 29.2 | 29.3 | 10.2 | 11.6 | 19.7 | 100% |
| Public affairs/social services | 32.8 | 26.2 | 8.6 | 11.3 | 21.2 | 100% |
| Biological sciences | 62.0 | 21.2 | 8.7 | 4.4 | 3.8 | 100% |
| Mathematics & science | 48.3 | 20.2 | 10.8 | 10.4 | 10.3 | 100% |
| Social science | 63.2 | 18.8 | 7.6 | 4.2 | 6.2 | 100% |
| History | 53.8 | 28.2 | 9.7 | 3.3 | 5.0 | 100% |
| Humanities | 51.1 | 18.4 | 6.5 | 7.0 | 17.0 | 100% |
| Psychology | 39.6 | 29.3 | 4.4 | 13.2 | 13.4 | 100% |
| Other | 47.1 | 21.5 | 11.2 | 7.5 | 12.7 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
Table 2. Percentage distribution of 1992–93 bachelor's degree recipients' highest graduate degree attainment, by age at which student received bachelor's degree: 2003

| Highest degree completed as of 2003 | Bachelor's degree (%) | Master's degree (%) | First-professional degree (%) | Doctoral degree (%) | Total |
|---|---|---|---|---|---|
| Total | 73.5 | 20.4 | 4.1 | 2.0 | 100% |
| Age when received bachelor's degree | | | | | |
| 22 or younger | 65.3 | 24.9 | 6.8 | 3.1 | 100% |
| 23–24 | 80.7 | 15.5 | 2.4 | 1.4 | 100% |
| 25–29 | 84.2 | 14.2 | 0.8 | 0.8 | 100% |
| 30 or older | 78.2 | 19.8 | 1.3 | 0.7 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
Table 2. Percentage distribution of 1992–93 bachelor's degree recipients' highest graduate degree attainment, by age at which student received bachelor's degree: 2003

| Highest degree completed as of 2003 | Bachelor's degree (%) | Master's degree (%) | First-professional degree (%) | Doctoral degree (%) | Total |
|---|---|---|---|---|---|
| Total | 73.8 | 20.2 | 4.0 | 2.0 | 100% |
| Age when received bachelor's degree | | | | | |
| 22 or younger | 65.5 | 24.6 | 6.7 | 3.1 | 100% |
| 23–24 | 80.9 | 15.4 | 2.3 | 1.3 | 100% |
| 25–29 | 85.0 | 13.7 | 0.6 | 0.7 | 100% |
| 30 or older | 78.5 | 19.4 | 1.3 | 0.8 | 100% |

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
Table 3. Average annual salary among 1992–93 bachelor's degree recipients, by highest degree attained: 2003

| | Job 2003: annual salary (Mean[0]) |
|---|---|
| Total | 55,407.6 |
| Highest degree attained by 2003 | |
| Bachelor's degree | 53,547.5 |
| Master's degree | 56,241.6 |
| First-professional degree | 83,798.6 |
| Doctoral degree | 63,214.4 |

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
4
Percentage of 1992–93 bachelor degree recipients who were still paying undergraduate education loans, by occupation: 2003
  Total loans: amount owed as of 2003
(%>0)
Estimates
Total 37.8
Job 2003: occupation
Educators 38.2
Business and management 30.9
Engineering/architecture 26.2
Computer science 28.7
Medical professionals 51.9
Editors/writers/performers 27.9
Human/protective service/legal profess 51.0
Research, scientists, technical 30.9
Administrative/clerical/legal support 55.0
Mechanics, laborers
Service industries 28.5
Other, military 17.7
‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
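Throughout these tables, "(%>0)" denotes the percentage of respondents whose value on the row variable is greater than zero (here, the share still owing on their loans). A minimal sketch of how such a cell could be computed from weighted survey microdata; the records and weights below are hypothetical, not NCES data:

```python
# Hypothetical microdata: each record is (amount_owed, survey_weight).
records = [(0.0, 1.2), (5400.0, 0.8), (0.0, 1.0), (12000.0, 1.5)]

# Weighted share of cases with a positive value, i.e. the "(%>0)" statistic.
total_w = sum(w for _, w in records)
positive_w = sum(w for amt, w in records if amt > 0)
pct_gt0 = 100 * positive_w / total_w
print(round(pct_gt0, 1))  # → 51.1
```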
Percentage of 1992–93 bachelor's degree recipients who were still paying undergraduate education loans, by occupation: 2003
  Undergraduate loans: total owed as of 2003
(%>1)
Estimates
Total 16.6
Job 2003: occupation
Educators 22.0
Business and management 12.7
Engineering/architecture 12.6
Computer science 10.8
Medical professionals 20.7
Editors/writers/performers 15.1
Human/protective service/legal professions 24.0
Research, scientists, technical 14.1
Administrative/clerical/legal support 24.7
Mechanics, laborers 18.1
Service industries 13.0
Other, military 13.6
SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
Percentage distribution of 1992–93 bachelor's degree recipients' teaching status, by highest degree attained: 2003
Teaching status in 2003 Currently teaching (%) Left teaching (%) Never taught
(%)
Total
Estimates
Total 10.6 9.3 80.2 100%
Highest degree completed as of 2003
Bachelor's degree 8.2 8.2 83.6 100%
Master's degree 20.1 13.3 66.6 100%
First-professional degree 0.6 4.9 94.5 100%
Doctoral degree 1.0 9.8 89.2 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03).

Computation by QuickStats on 6/16/2009
Disability as reported by teacher (parent if teacher data missing), by Child's race.
Disability as reported by teacher (parent if teacher data missing), Wave 1 Autism
(%)
Learning Disability
(%)
Mental Retardation
(%)
Speech Or Language Impairment
(%)
Other impairment
(%)
Estimates
Total 7.2 2.4 4.3 47.1 39.0
Child's race
Hispanic 10.3 3.5 7.1 41.7 37.5
Black Or African American/Non-Hispanic 9.7 4.7 6.6 35.1 43.9
White/Non-Hispanic 5.7 1.6 2.9 51.4 38.4
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: Pre-Elementary Education Longitudinal Study (PEELS), Waves 1-5

Computation by NCES QuickStats on 8/23/2011
Child's main education setting, by Household income
Child's main education setting, Wave 1 Regular Education Classroom
(%)
Special Education Setting
(%)
Home
(%)
Other Specify
(%)
Total
Estimates
Total 74.4 21.2 2.7 1.7 100%
Household income, Wave 1
$20,000 Or Less 72.2 18.3 8.2 1.3 100%
$20,001 - 40,000 68.6 27.0 0.0 4.4 100%
> $40,000 80.3 19.0 0.6 0.0 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: Pre-Elementary Education Longitudinal Study (PEELS), Waves 1-5

Computation by NCES QuickStats on 8/23/2011
Overall academic skills (kindergarten), by District poverty/wealth category.
Overall academic skills (kindergarten), Wave 1 Far Below Average
(%)
Below Average
(%)
Average
(%)
Above Average
(%)
Far Above Average
(%)
Total
Estimates
Total 17.9 32.7 35.8 12.7 0.9 100%
District poverty/wealth category
High Wealth 22.1 30.1 41.6 6.2 0.0 100%
Medium Wealth 9.0 27.7 53.1 8.6 1.6 100%
Low Wealth 22.8 32.7 23.8 20.7 0.0 100%
Very Low Wealth 18.9 41.2 24.3 13.5 2.1 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: Pre-Elementary Education Longitudinal Study (PEELS), Waves 1-5

Computation by NCES QuickStats on 8/23/2011
First professional license/certificate, by Years teacher working with children with disabilities
First professional license/certificate, Wave 1 Child Development
(%)
Early Childhood Education
(%)
Early Childhood Special Education
(%)
Special Education
(%)
Other
(%)
Estimates
Total 8.8 20.8 18.0 22.8 29.6
Years teacher working with children with disabilities, Wave 1
Less than 5 years 9.4 27.9 17.4 14.3 31.0
6 - 10 years 10.9 22.4 12.1 22.2 32.4
11 - 15 years 11.1 28.0 19.8 18.0 23.2
15 years or more 5.5 10.4 22.4 32.3 29.4
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: Pre-Elementary Education Longitudinal Study (PEELS), Waves 1-5

Computation by NCES QuickStats on 8/23/2011
Description of child's school, by Total hours/week child attends school.
Description of child's school (kindergarten or higher), Wave 4 Regular School - Serves All Students
(%)
School Serves Only Disabled Students
(%)
Magnet School
(%)
Other
(%)
Estimates
Total 94.0 2.9 1.1 2.0
Total hours/week child attends school (kindergarten), Wave 4
15 hours or less 96.1 3.3 0.0 0.6
16 to 25 hours 88.6 3.6 3.9 3.8
26 to 30 hours 83.0 10.5 3.3 3.2
More than 30 hours 96.9 0.0 1.9 1.2
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: Pre-Elementary Education Longitudinal Study (PEELS), Waves 1-5

Computation by NCES QuickStats on 8/23/2011
Percentage distribution of beginning postsecondary students who took distance education courses by student/employee role: 2003–04
Distance education courses in 2003-04 Yes
(%)
No
(%)
Total
Estimates
Total 9.3 90.7 100%
Job 2003-04: primarily student or employee
A student working to meet expenses 9.7 90.3 100%
An employee enrolled in school 11.4 88.6 100%
No job 7.6 92.7 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:04/06).

Computation by NCES QuickStats on 6/17/2009
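The recurring note that rows may not add up to 100% is a consequence of rounding each cell independently. A quick illustration of how one-decimal rounding can shift a row total:

```python
# Three exact shares that sum to exactly 100.
exact = [35.44, 35.44, 29.12]

# Round each cell independently to one decimal, as the published tables do.
rounded = [round(x, 1) for x in exact]

# The rounded row now totals 99.9, not 100.
print(rounded, round(sum(rounded), 1))  # → [35.4, 35.4, 29.1] 99.9
```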
Percentage distribution of 2003–04 beginning postsecondary students' persistence at any institution through 2006, by gender.
Persistence at any institution through 2006 Attained, still enrolled
(%)
Attained, not enrolled
(%)
No degree, still enrolled
(%)
No degree, not enrolled
(%)
Total
Estimates
Total 7.0 8.9 50.7 33.5 100%
Gender
Male 6.5 7.5 50.4 35.6 100%
Female 7.3 9.9 50.9 31.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:04/06).

Computation by NCES QuickStats on 6/22/2009
Percentage of 2003–04 beginning postsecondary students who received financial aid, by undergraduate degree attainment and enrollment status through 2006
  Aid: total student aid all sources in 2003-04
(%>0)
Estimates
Total 70.6
Persistence at any institution through 2006
Attained a degree or certificate 80.1
No degree, still enrolled 70.6
No degree, not enrolled 66.1
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:04/06).

Computation by NCES QuickStats on 6/17/2009
Percentage distribution of 2003–04 beginning postsecondary students' degree attainment and enrollment status through 2006, by grade point average (GPA)
Persistence at any institution through 2006 Attained, still enrolled
(%)
Attained, not enrolled
(%)
No degree, still enrolled
(%)
No degree, not enrolled
(%)
Total
Estimates
Total 7.0 8.9 50.7 33.5 100%
Cumulative Grade Point Average (GPA) as of 2003-04
Below 2.0 3.9 4.2 39.4 52.5 100%
2.1 to 2.50 5.1 5.4 50.7 38.8 100%
2.51 to 2.99 6.4 6.7 59.9 27.0 100%
3.0 and above 8.2 11.4 50.8 29.5 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:04/06).

Computation by NCES QuickStats on 6/17/2009
Percentage distribution of 2003–04 beginning postsecondary students’ degree attainment and enrollment status through 2006, by highest degree expectations
Persistence anywhere through 2006 Attained, still enrolled
(%)
Attained, not enrolled
(%)
No degree, still enrolled
(%)
No degree, not enrolled
(%)
Total
Estimates
Total 7.0 8.9 50.7 33.5 100%
Highest degree expected, 2003-04
No degree or certificate 3.8 17.1 16.3 62.8 100%
Certificate 6.9 41.5 10.3 41.3 100%
Associate’s degree 8.7 17.3 25.3 48.8 100%
Bachelor’s degree 6.9 7.9 45.2 40.0 100%
Post-BA or post-master certificate 5.1 13.4 42.9 38.6 100%
Master’s degree 7.1 4.8 60.6 27.4 100%
Doctoral degree 7.1 4.2 67.9 20.8 100%
First-professional degree 3.5 6.7 67.6 22.2 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:04/06).

Computation by NCES QuickStats on 6/22/2009
Percentage distribution of undergraduates' attendance intensity, by institution type: 2007–08
Attendance intensity Exclusively full-time
(%)
Exclusively part-time
(%)
Mixed full-time and part-time
(%)
Total
Estimates
Total 47.7 35.4 16.9 100%
Institution: type
Public less-than-2-year 64.5 31.5 4.0 100%
Public 2-year 26.3 58.8 14.9 100%
Public 4-year nondoctorate 54.5 27.9 17.6 100%
Public 4-year doctorate 65.0 15.4 19.6 100%
Private not-for-profit less than 4-year 55.2 28.9 15.8 100%
Private not-for-profit 4-yr nondoctorate 69.0 18.4 12.5 100%
Private not-for-profit 4-year doctorate 74.7 13.9 11.5 100%
Private for-profit less-than-2-year 75.0 15.8 9.1 100%
Private for-profit 2 years or more 67.0 18.7 14.4 100%
Attended more than one institution 40.8 26.2 33.0 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/22/2009
Percentage of undergraduates who received Pell Grants, by income and dependency status: 2007–08
  Grants: Pell Grants
(%>0)
Estimates
Total 27.3
Income: categories by dependency status
Dependent: Less than $10,000 63.2
Dependent: $10,000-$19,999 72.7
Dependent: $20,000-$29,999 64.9
Dependent: $30,000-$39,999 53.5
Dependent: $40,000-$49,999 32.0
Dependent: $50,000-$59,999 15.4
Dependent: $60,000-$69,999 2.3
Dependent: $70,000-$79,999 0.0
Dependent: $80,000-$99,999 0.0
Dependent: $100,000 or more 0.0
Independent: Less than $5,000 53.3
Independent: $5,000-$9,999 65.5
Independent: $10,000-$19,999 52.3
Independent: $20,000-$29,999 34.8
Independent: $30,000-$49,999 28.2
Independent: $50,000 or more 0.2
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/26/2009
Average net price of attendance after all financial aid for full-time undergraduate students, by type of institution: 2007–08
  Net price after all aid
(Avg>0)
Estimates
Total 11,658.9
Type of institution
Public less-than-2-year 9,667.4
Public 2-year 7,560.8
Public 4-year nondoctorate 8,922.5
Public 4-year doctorate 11,625.2
Private not-for-profit less-than-4-year 10,782.5
Private not-for-profit 4-year nondoctorate 14,462.2
Private not-for-profit 4-year doctorate 20,047.5
Private for-profit less-than-2-year 10,298.3
Private for-profit 2 years or more 14,406.9
Attended more than one institution ‡
‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/22/2009
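These tables use two different mean statistics: "(Avg>0)", an average taken only over cases with a positive value (as in the net price table above), and "(Mean[0])", a mean that includes zero values. A sketch of the difference, using hypothetical amounts:

```python
# Hypothetical dollar amounts; zeros are cases with no amount at all.
amounts = [0.0, 0.0, 3000.0, 9000.0]

# Mean[0]: average over all cases, zeros included.
mean_incl_zero = sum(amounts) / len(amounts)

# Avg>0: average over only the cases with a positive value.
positives = [a for a in amounts if a > 0]
avg_gt0 = sum(positives) / len(positives)

print(mean_incl_zero, avg_gt0)  # → 3000.0 6000.0
```

The same underlying data can therefore yield very different published figures depending on which statistic a table reports.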
Percentage distribution of dependent undergraduates’ parents’ income, by type of institution: 2007–08
Parents’ income Less than $36,000
(%)
$36,000-66,999
(%)
$67,000-104,999
(%)
$105,000 or more
(%)
Total
Estimates
Total 24.8 25.5 25.0 24.7 100%
Institution: sector
Public 4-year 20.6 22.7 27.4 29.2 100%
Private not-for-profit 4-year 17.5 20.9 25.3 36.4 100%
Public 2-year 30.6 31.4 23.2 14.8 100%
Private for-profit 50.1 25.1 15.9 8.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/22/2009
Mean estimated student need for undergraduate students, by type of degree program: 2007–08
  Aid: estimated student need
(Mean[0])
Estimates
Total 7,978.0
College study: degree program
Certificate 8,696.4
Associate’s degree 5,248.0
Bachelor’s degree 10,890.9
Not in a degree program or others 2,909.0
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/22/2009
Percentage of graduate students who borrowed, by type of graduate program: 2007–08
  Graduate loan debt (cumulative)
(%>0)
Estimates
Total 53.2
Graduate degree: type
Master's degree 52.8
Doctoral degree 46.5
First-professional degree 82.1
Post-BA or post-master's certificate 51.6
Not in a degree program 34.8
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/19/2009
Percentage of graduate students with assistantships, by attendance intensity: 2007–08
  Assistantships
(%>0)
Estimates
Total 15.2
Attendance intensity
Exclusively full-time 23.3
Exclusively part-time 6.5
Mixed full-time and part-time 19.8
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/19/2009
Average tuition waiver received by graduate students, by type of graduate degree program: 2007–08
  Tuition waivers
(Avg>0)
Estimates
Total 6,785.2
Graduate degree: type
Master's degree 6,387.0
Doctoral degree 7,826.7
First-professional degree 8,521.4
Post-BA or post-master's certificate ‡
Not in a degree program 2,206.9
‡ Reporting standards not met.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/19/2009
Percentage of graduate students who have ever borrowed loans, by income: 2007–08
  Loans: total student loans all sources
(%>0)
Estimates
Total 42.7
Income: total income
Less than $13,200 55.3
$13,200-37,399 50.4
$37,400-71,599 38.6
$71,600 or more 26.4
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/19/2009
Average loan amount for graduate students, by type of institution attended: 2007–08
  Loans: total student loans all sources
(Avg>0)
Estimates
Total 18,494.7
Type of 4-year institution
Public 4-year nondoctorate-granting 10,668.2
Public 4-year doctorate-granting 16,470.2
Private not-for-profit 4-yr nondoctorate-granting 14,748.3
Private not-for-profit 4-year doctorate-granting 23,496.8
Private for-profit 4-year 17,680.3
Attended more than one institution 17,270.5
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 National Postsecondary Student Aid Study (NPSAS:08)

Computation by NCES QuickStats on 6/19/2009
Highest degree attained anywhere through 2009 by Single parent status in 2003-04.
Highest degree attained anywhere through 2009 Certificate
(%)
Associate’s degree
(%)
Bachelor’s degree
(%)
No degree
(%)
Total
Estimates
Total 9.4 9.3 30.7 50.5 100%
Single parent status in 2003-04
Single parent 17.9 6.3 3.3 72.5 100%
Not a single parent 8.4 9.7 34.0 47.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, BPS:2009 Beginning Postsecondary Students

Computation by QuickStats on 12/1/2010
Degree program when last enrolled 2009 by Cumulative federal student loan amount owed as of 2009.
Degree program when last enrolled 2009 Associate's Degree
(%)
Bachelor's Program (4 year)
(%)
Not in a degree program
(%)
Estimates
Total 23.2 68.8 8.1
Cumulative federal student loan amount owed as of 2009
$0 26.3 62.7 11.0
$1-4,899 36.5 53.1 10.5
$4,900-10,299 30.0 61.5 8.5
$10,300-17,999 14.7 81.8 3.5
$18,000 or more 9.8 88.2 2.0
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, BPS:2009 Beginning Postsecondary Students

Computation by NCES QuickStats on 5/27/2011
Attainment or level of last institution enrolled through 2009 by Attendance intensity pattern through 2009.
Attainment or level of last institution enrolled through 2009 Attained a degree or certificate
(%)
No degree, enrolled at 4-year
(%)
No degree, enrolled at less-than-4-year
(%)
No degree, not enrolled
(%)
Total
Estimates
Total 49.5 7.1 7.9 35.5 100%
Attendance intensity pattern through 2009
Always full-time 62.6 4.8 2.8 29.7 100%
Always part-time 15.7 1.8 11.3 71.3 100%
Mixed 41.9 11.2 13.5 33.4 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, BPS:2009 Beginning Postsecondary Students

Computation by QuickStats on 12/1/2010
Transfer and degree plans first year by Job 2004: Hours worked per week (incl work study).
Transfer and degree plans first year Degree, no transfer
(%)
Degree and transfer
(%)
No degree, transfer
(%)
No degree, no transfer
(%)
Total
Estimates
Total 26.6 21.3 24.1 28.0 100%
Job 2004: Hours worked per week (incl work study)
Did not work 30.7 17.6 18.5 33.2 100%
1-19 22.4 23.7 31.3 22.6 100%
20-29 21.8 25.6 30.0 22.6 100%
30-39 23.8 22.0 27.6 26.6 100%
40 or more 30.1 20.1 19.0 30.8 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, BPS:2009 Beginning Postsecondary Students

Computation by QuickStats on 12/1/2010
Retention and attainment at first institution 6-year total 2009 by Highest level of high school mathematics.
Retention and attainment at first institution 6-year total 2009 Attained a degree or certificate
(%)
No degree, still enrolled
(%)
No degree, transferred
(%)
No degree, left without return
(%)
Total
Estimates
Total 38.8 6.1 26.8 28.4 100%
Highest level of high school mathematics
Algebra 2 31.9 7.3 32.6 28.2 100%
Trigonometry/Algebra II 43.2 4.9 31.9 20.0 100%
Pre-calculus 47.4 5.4 31.4 15.9 100%
Calculus 65.4 3.3 21.4 9.9 100%
None of these 25.4 7.1 28.1 39.4 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, BPS:2009 Beginning Postsecondary Students

Computation by QuickStats on 12/1/2010
Family status in 2012 by Repayment status for any loans in 2012 (federal and private).
Family status in 2012 Unmarried, no dependent children
(%)
Unmarried with dependent children
(%)
Married, no dependent children
(%)
Married with dependent children
(%)
Total
Estimates
Total 54.0 5.4 20.9 19.6 100%
Repayment status for any loans in 2012 (federal and private)
Deferring payments on loans 57.0 7.5 16.6 18.9 100%
Repaying loans 53.5 6.5 19.8 20.2 100%
Loans are paid off or forgiven 49.2 2.9 23.5 24.3 100%
Defaulted 53.7 13.8 12.4 20.2 100%
Did not borrow 55.2 2.9 24.5 17.3 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12).

Computation by NCES QuickStats on 6/27/2014
Gender by Highest degree attained since bachelor's as of 2012.
Gender Male
(%)
Female
(%)
Total
Estimates
Total 42.6 57.4 100%
Highest degree attained since bachelor's as of 2012
Undergraduate certificate or diploma 36.9 63.1 100%
Associate's degree 33.2 66.8 100%
Additional bachelor's degree 36.7 63.3 100%
Post-baccalaureate certificate 36.3 63.7 100%
Master's degree 37.7 62.3 100%
Post-master's certificate 24.1 75.9 100%
Doctoral degree - professional practice 50.9 49.1 100%
Doctoral degree - research/scholarship ‡ ‡ 100%
Doctoral degree - other ‡ ‡ 100%
{Skipped} 45.3 54.7 100%
{Missing} ‡ ‡ 100%
Did not earn degree 39.2 60.8 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12).

Computation by NCES QuickStats on 6/27/2014
Income (dependents' parents and independents) in 2006 by Cumulative amount borrowed for education through 2012.
Income (dependents' parents and independents) in 2006 0
(%)
$1-27,799
(%)
$27,800-62,099
(%)
$62,100-105,899
(%)
$105,900 or more
(%)
Total
Estimates
Total 2.3 24.7 24.3 24.8 23.8 100%
Cumulative amount borrowed for education through 2012
$1-7,999 1.9 31.7 26.6 19.8 19.9 100%
$8,000-16,999 1.3 25.1 25.2 25.0 23.5 100%
$17,000-29,999 2.1 25.0 26.6 26.4 19.9 100%
$30,000 or more 2.5 28.9 27.9 24.3 16.3 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12).

Computation by NCES QuickStats on 6/27/2014
Transcript: Remedial courses: # taken by Annualized salary for primary job in 2012.
Transcript: Remedial courses: # taken 0
(%)
1
(%)
2
(%)
3 or more
(%)
Total
Estimates
Total 74.1 15.5 5.6 4.9 100%
Annualized salary for primary job in 2012
$1-15,599 74.1 13.4 5.2 7.3 100%
$15,600-26,999 74.0 14.8 5.5 5.6 100%
$27,000-39,999 68.8 17.1 7.5 6.6 100%
$40,000-75,999 77.3 14.9 4.8 3.0 100%
$76,000 or more 77.2 16.4 4.4 2.0 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12).

Computation by NCES QuickStats on 6/27/2014
Undergraduate GPA as of 2007-08 by Annualized salary for primary job in 2012.
Undergraduate GPA as of 2007-08 Less than 2.00
(%)
2.00-2.49
(%)
2.50-2.99
(%)
3.00-3.49
(%)
3.50 or higher
(%)
Total
Estimates
Total 0.3 6.3 21.2 35.8 36.4 100%
Annualized salary for primary job in 2012
$1-15,599 0.3 6.6 16.8 33.6 42.7 100%
$15,600-26,999 0.4 7.7 20.3 34.3 37.3 100%
$27,000-39,999 0.2 6.2 24.4 36.5 32.6 100%
$40,000-75,999 0.3 5.7 20.1 36.6 37.3 100%
$76,000 or more 0.0 4.0 16.7 33.5 45.8 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12).

Computation by NCES QuickStats on 6/27/2014
Percentage distribution of highest level of education earned as of June 2013, by Sex.
High school credential or less
(%)
Some college
(%)
Bachelor's degree or post-baccalaureate certificate
(%)
Master's degree or higher
(%)
Total
Estimates
Total 15.7 51.1 26.6 6.7 100%
Sex
Male 19.7 50.3 25.1 4.9 100%
Female 11.8 51.8 28.1 8.3 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), High School Sophomores.

Computation by NCES QuickStats on 9/11/2015
Percentage distribution of highest known degree attainment as of June 2013, by parent's highest level of education
Some College
(%)
Bachelor's degree
(%)
Master's degree or higher
(%)
Total
Estimates
Total 27.7 57.1 15.2 100%
Parent's highest level of education
High school diploma or less 47.5 44.4 8.0 100%
Some college 36.4 53.6 10.0 100%
Bachelor's degree 19.4 63.1 17.5 100%
Master's degree or higher 10.7 60.3 29.1 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), High School Sophomores.

Computation by NCES QuickStats on 9/11/2015
Percentage distribution of Respondents' income from employment, by Employment status as of the third follow-up interview
No 2011 employment income
(%)
Less than $9,000
(%)
$9,000 - 21,999
(%)
$22,000 - 35,999
(%)
$36,000 or more
(%)
Total
Estimates
Total 11.5 12.4 24.8 26.3 25.0 100%
Employment status as of the F3 interview
Unemployed 38.1 23.1 24.1 10.2 4.5 100%
Out of the labor force 55.9 13.3 15.4 10.2 5.3 100%
Working 0-34 hours/week 8.3 28.4 42.5 15.1 5.7 100%
Working 35+ hours/week 3.4 7.6 22.6 32.6 33.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), High School Sophomores.

Computation by NCES QuickStats on 9/11/2015
Percentage distribution of postsecondary GPA at all known institutions attended, by highest level of education student expected
Lower than 2.00
(%)
2.00 - 2.74
(%)
2.75 - 3.24
(%)
3.25 or higher
(%)
Total
Estimates
Total 16.9 23.1 24.2 35.8 100%
Student's expected achievement in school: base year
Less than high school graduation 35.6 26.3 20.5 17.6 100%
High school graduation or GED only 28.4 25.0 20.9 25.7 100%
Attend or complete 2-year college/school 22.4 21.8 24.6 31.1 100%
Attend college, 4-year degree incomplete 26.0 31.6 17.5 24.9 100%
Graduate from college 16.8 23.8 26.0 33.4 100%
Obtain Master's degree or equivalent 12.9 20.9 24.4 41.8 100%
Obtain PhD, MD, or other advanced degree 12.4 21.5 23.4 42.7 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), High School Sophomores.

Computation by NCES QuickStats on 9/11/2015
Percentage distribution of number of months between High School completion and BA completion, by Student's race/ethnicity.
Less than 48
(%)
48 - 59
(%)
60 - 71
(%)
72 or more
(%)
Total
Estimates
Total 23.2 39.6 18.4 18.8 100%
Student's race/ethnicity
Amer. Indian/Alaska Native, non-Hispanic ‡ ‡ ‡ ‡ 100%
Asian, Hawaii/Pac. Islander, non-Hispanic 29.1 34.7 17.2 18.9 100%
Black or African American, non-Hispanic 16.3 33.1 22.3 28.3 100%
Hispanic, no race specified 27.0 27.1 15.1 30.7 100%
Hispanic, race specified 17.8 35.4 22.1 24.7 100%
More than one race, non-Hispanic 26.5 43.8 10.6 19.2 100%
White, non-Hispanic 23.7 41.5 18.3 16.5 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), High School Sophomores.

Computation by NCES QuickStats on 9/11/2015
Percent of schools with at least one violent incident recorded, by urbanicity
  Total number of violent incidents recorded
(%>0)
Estimates
Total 73.8
Urbanicity - Based on Urban-centric location of school
City 74.9
Suburb 73.5
Town 80.3
Rural 70.2
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS), 2010.

Computation by NCES QuickStats on 6/24/2015
Percentage distribution of student bullying by grades offered
Q20b. Disciplinary occurrences: Student bullying Happens daily
(%)
Happens at least once a week
(%)
Happens at least once a month
(%)
Happens on occasion
(%)
Never happens
(%)
Total
Estimates
Total 6.8 16.2 22.6 51.7 2.6 100%
School grades offered - based on 07-08 CCD frame variables (School)
Primary 5.7 13.9 20.4 56.8 3.2 100%
Middle 13.2 25.4 26.1 35.0 0.3 100%
High school 3.7 16.1 25.0 53.3 1.9 100%
Combined 6.7 12.0 26.4 50.2 4.8 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS), 2010.

Computation by NCES QuickStats on 6/24/2015
Percentage distribution of parent participation in parent-teacher conferences by school size
Q5b. Parent participates in parent-teacher conference 0 to 25%
(%)
26 to 50%
(%)
51 to 75%
(%)
76 to 100%
(%)
School does not offer
(%)
Total
Estimates
Total 6.4 17.0 23.1 50.9 2.7 100%
School size categories - based on 07-08 CCD frame variables (School)
Less than 300 6.6 14.6 22.0 54.2 2.7 100%
300 to 499 3.8 15.5 23.3 56.2 1.2 100%
500 to 999 6.0 16.6 22.6 52.5 2.3 100%
1,000 or more 14.0 27.4 26.3 23.4 8.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS), 2010.

Computation by NCES QuickStats on 6/24/2015
Percentage distribution of number of gang-related and hate crimes by urbanicity
Total number of gang-related and hate crimes None
(%)
1 to 25
(%)
26 to 50
(%)
More than 50
(%)
Total
Estimates
Total 93.0 6.6 0.2 0.1 100%
Urbanicity - Based on Urban-centric location of school
City 88.0 10.9 0.7 0.4 100%
Suburb 93.0 6.9 0.1 # 100%
Town 94.9 5.0 0.1 # 100%
Rural 96.5 3.5 # # 100%
# Rounds to zero.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS), 2010.

Computation by NCES QuickStats on 6/24/2015
Average number of incidents recorded, by urbanicity
  Total number of incidents recorded
(Mean[0])
Estimates
Total 22.7
Urbanicity - Based on Urban-centric location of school
City 29.8
Suburb 24.5
Town 21.1
Rural 15.6
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2009–10 School Survey on Crime and Safety (SSOCS), 2010.

Computation by NCES QuickStats on 6/24/2015
Percentage distribution of 2011–12 first-time postsecondary students’ 3-year attainment status at any institution, by level of institution.
Attainment or level of last institution enrolled through June 2014 Certificate
(%)
Associate’s degree
(%)
Bachelor’s degree
(%)
No Degree
(%)
Total
Estimates
Total 7.5 6.9 1.4 84.1 100%
First institution type (IPEDS sector) 2011-12
4-year 1.4 3.7 2.7 92.2 100%
2-year 8.3 11.6 # 80.0 100%
Less-than-2-year 70.5 0.1 # 29.4 100%
# Rounds to zero.
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14).

Computation by NCES QuickStats on 11/13/2015
Percentage distribution of 3-year attainment rates at any institution, by highest level of high school mathematics, for 2011–12 first-time postsecondary students beginning at a 2-year public college.
Attainment or level of last institution enrolled through June 2014 Certificate
(%)
Associate's Degree
(%)
Bachelor's Degree
(%)
No Degree
(%)
Total
Estimates
Total 4.5 11.0 # 84.5 100%
Highest level of high school mathematics
Less than Algebra 2 4.5 7.4 # 88.1 100%
Algebra 2 4.8 9.0 # 86.2 100%
Trigonometry 1.1 15.9 # 83.0 100%
Pre-calculus 3.6 16.6 0.2 79.7 100%
Calculus 3.9 14.9 # 81.1 100%
# Rounds to zero.
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14).

Computation by NCES QuickStats on 11/13/2015
Percentage distribution of 3-year retention rates at any institution attended, by Race/ethnicity, first-time postsecondary students beginning at a 4-year college.
Enrollment and student characteristics Enrolled at 4-year institution
(%)
Enrolled at less- than-4-year institution
(%)
Not enrolled
(%)
Attained a certificate, Associate's degree, or Bachelor's Degree
(%)
Total
Estimates
Total 67.2 5.3 19.8 7.8 100%
Race/ethnicity (with multiple)
White 70.2 4.8 17.6 7.5 100%
Black or African American 53.8 7.3 30.6 8.3 100%
Hispanic or Latino 63.8 5.8 21.4 8.9 100%
Asian 76.4 5.3 10.4 7.9 100%
American Indian or Alaska Native 51.9 6.3 36.2 5.6 100%
Native Hawaiian/other Pacific Islander 68.0 0.2 20.6 11.1 100%
More than one race 65.9 5.4 22.8 5.9 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14).

Computation by NCES QuickStats on 11/13/2015
bdnbfb5d
4
Percentage distribution of 3-year attainment and retention rates at first institution attended, by Bachelor's program intentions within 5 years, for 2011–12 first-time postsecondary students beginning at a 2-year public college.
Enrollment and student characteristics | Certificate (%) | Associate's degree (%) | Enrolled at first institution (%) | Left first institution, but enrolled at another institution (%) | Left first institution but never enrolled at another institution (%) | Total
Estimates
Total | 3.4 | 10.6 | 27.3 | 16.5 | 42.2 | 100%
Bachelor's program intentions within 5 years 2012
Yes | 0.9 | 11.9 | 29.3 | 19.1 | 38.9 | 100%
No | 2.1 | 9.0 | 25.7 | 10.6 | 52.5 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14).

Computation by NCES QuickStats on 11/13/2015
bdnbfa8f
5
Percentage distribution of 2011–12 first-time postsecondary students’ parents’ highest level of education, by level of first institution: 2012–14
Parents' highest education level | High school or less (%) | Some postsecondary (%) | Bachelor's degree or higher (%) | Don't know highest education of either parent (%) | Total
Estimates
Total | 31.2 | 26.5 | 38.4 | 4.0 | 100%
First institution type (IPEDS sector) 2011-12
First-time postsecondary students beginning at a 2-year public college | 37.2 | 31.0 | 26.7 | 5.1 | 100%
First-time postsecondary students beginning at a 4-year college | 22.7 | 23.4 | 51.4 | 2.5 | 100%
All others | 55.1 | 25.4 | 12.1 | 7.3 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 Beginning Postsecondary Students Longitudinal Study, First Follow-up (BPS:12/14).

Computation by NCES QuickStats on 11/13/2015
bdnbfc31
1
Percentage distribution of total Carnegie credits earned in High School courses for fall 2009 ninth-graders, by parents' highest level of education.
X3 Total credits earned | 0 - 22 (%) | 22.5 - 26 (%) | 26.5 or more (%) | Total
Estimates
Total | 21.0 | 29.3 | 49.7 | 100%
X2 Parents'/guardians' highest level of education
Less than high school | 39.1 | 27.6 | 33.3 | 100%
High school diploma or GED or alternative HS credential | 24.6 | 30.6 | 44.7 | 100%
Certificate/diploma from school providing occupational training | 25.0 | 28.6 | 46.3 | 100%
Associate's degree | 18.6 | 29.0 | 52.3 | 100%
Bachelor's degree | 16.4 | 28.7 | 54.9 | 100%
Master's degree | 12.3 | 27.9 | 59.8 | 100%
Ph.D./M.D./law/other high-level professional degree | 11.0 | 29.0 | 59.9 | 100%
No bio/adoptive/step-parent in household | ‡ | ‡ | ‡ | 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, High School Longitudinal Study of 2009 (HSLS:09).

Computation by NCES QuickStats on 2/17/2016
bhbbgd2f
2
Percentage distribution of fall 2009 ninth-graders considering a STEM major, by highest level of mathematics course taken in high school.
S3 C05C Major will be considering - STEM code | Yes (%) | No (%) | Total
Estimates
Total | 22.9 | 77.1 | 100%
X3 Highest level mathematics course taken - ninth grade
Basic math | 18.8 | 81.2 | 100%
Other math | 17.0 | 83.0 | 100%
Pre-algebra | 12.0 | 88.0 | 100%
Algebra I | 17.3 | 82.7 | 100%
Geometry | 29.8 | 70.2 | 100%
Algebra II | 36.0 | 64.0 | 100%
Trigonometry | 50.3 | 49.7 | 100%
Other advanced math | 26.5 | 73.5 | 100%
Probability and statistics | ‡ | ‡ | 100%
Other AP/IB math | ‡ | ‡ | 100%
Precalculus | 50.8 | 49.2 | 100%
Calculus | ‡ | ‡ | 100%
AP/IB Calculus | ‡ | ‡ | 100%
No Math | 10.1 | 89.9 | 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, High School Longitudinal Study of 2009 (HSLS:09).

Computation by NCES QuickStats on 2/18/2016
bkbbgkad
3
Percentage distribution of High School GPA in all academic courses for fall 2009 ninth-graders, by socioeconomic status.
X3 GPA for all academic courses | 0 - 1.5 (%) | 2 - 2.5 (%) | 3 (%) | 3.5 - 4 (%) | Total
Estimates
Total | 24.7 | 49.3 | 26.0 | # | 100%
Socioeconomic Status (Quintiles)
First quintile (lowest) | 36.2 | 46.8 | 17.0 | # | 100%
Second quintile | 29.7 | 48.8 | 21.4 | # | 100%
Third quintile | 23.4 | 53.4 | 23.2 | # | 100%
Fourth quintile | 16.8 | 52.7 | 30.4 | # | 100%
Fifth quintile (highest) | 10.0 | 43.2 | 46.8 | # | 100%
# Rounds to zero.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, High School Longitudinal Study of 2009 (HSLS:09).

Computation by NCES QuickStats on 2/18/2016
bkbbgm9a
4
Percentage distribution of total credits earned in High School mathematics courses for fall 2009 ninth-graders, by respondent's sex.
X3 Credits earned in: mathematics | 0 - 2.5 (%) | 3 - 3.5 (%) | 4 or more (%) | Total
Estimates
Total | 21.4 | 23.7 | 54.9 | 100%
X2 Student's sex
Male | 24.7 | 23.6 | 51.7 | 100%
Female | 18.1 | 23.8 | 58.1 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, High School Longitudinal Study of 2009 (HSLS:09).

Computation by NCES QuickStats on 2/18/2016
bkbbgmc89
5
Percentage distribution of total credits earned in High School STEM courses for fall 2009 ninth-graders, by mathematics quintile score.
X3 Credits earned in: STEM | 0 - 5.5 (%) | 6 - 7.5 (%) | 8 - 8.5 (%) | 9 or more (%) | Total
Estimates
Total | 17.3 | 28.9 | 22.3 | 31.5 | 100%
X2 Mathematics quintile score
First (lowest) quintile | 34.1 | 28.8 | 16.6 | 20.5 | 100%
Second quintile | 23.2 | 32.7 | 20.2 | 23.9 | 100%
Third (middle) quintile | 16.1 | 34.3 | 22.3 | 27.2 | 100%
Fourth quintile | 10.3 | 28.1 | 26.1 | 35.5 | 100%
Fifth (highest) quintile | 5.4 | 21.6 | 25.4 | 47.6 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, High School Longitudinal Study of 2009 (HSLS:09).

Computation by NCES QuickStats on 2/18/2016
bkbbgma4
1
Total number of violent incidents recorded by School size categories - based on 03-04 CCD frame variables (School).
Total number of violent incidents recorded (%>0)
Estimates
Total | 77.7
School size categories - based on 03-04 CCD frame variables (School)
< 300 | 63.7
300 - 499 | 77.3
500 - 999 | 82.1
1,000 + | 96.5
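The "(%>0)" statistic above is the share of schools recording at least one violent incident, not a mean count. A minimal sketch of that statistic, using hypothetical incident counts (not SSOCS data):

```python
def pct_positive(values):
    """Percent of cases whose value is greater than zero."""
    return 100 * sum(1 for v in values if v > 0) / len(values)

# Hypothetical incident counts for five schools: three of five recorded incidents.
print(pct_positive([0, 2, 1, 0, 3]))  # 60.0
```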
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2005–06 School Survey on Crime and Safety (SSOCS), 2006.

Computation by NCES QuickStats on 2/10/2016
babbgadbe
2
Q5b. Parent involvement: Parent participates in parent-teacher conference by Urbanicity - from 03-04 CCD (School).
Q5b. Parent involvement: Parent participates in parent-teacher conference | 0-25% (%) | 26-50% (%) | 51-75% (%) | 76-100% (%) | School does not offer (%) | Total
Estimates
Total | 6.7 | 14.5 | 23.9 | 52.6 | 2.3 | 100%
Urbanicity - from 03-04 CCD (School)
City | 7.1 | 15.6 | 25.8 | 49.1 | 2.4 | 100%
Urban Fringe | 5.7 | 10.9 | 21.1 | 59.7 | 2.7 | 100%
Town | 6.6 | 15.3 | 24.7 | 51.8 | 1.7 | 100%
Rural | 7.4 | 17.2 | 25.0 | 48.5 | 1.9 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2005–06 School Survey on Crime and Safety (SSOCS), 2006.

Computation by NCES QuickStats on 2/10/2016
babbgakfe0
3
Q7. Presence of security guard, security personnel, or sworn law enforcement officer by Q30. Level of crime where school is located.
Q7. Presence of security guard, security personnel, or sworn law enforcement officer | Yes (%) | No (%) | Total
Estimates
Total | 41.7 | 58.3 | 100%
Q30. Level of crime where school is located
High level of crime | 49.4 | 50.6 | 100%
Moderate level of crime | 49.6 | 50.4 | 100%
Low level of crime | 39.1 | 60.9 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2005–06 School Survey on Crime and Safety (SSOCS), 2006.

Computation by NCES QuickStats on 2/10/2016
babbgd0b
4
Total number of disruptions by Q1s. School practice: Security cameras monitor the school.
Total number of disruptions | None (%) | 1 to 5 (%) | 6 to 10 (%) | More than 10 (%) | Total
Estimates
Total | 70.9 | 27.0 | 1.5 | 0.6 | 100%
Q1s. School practice: Security cameras monitor the school
Yes | 69.9 | 27.6 | 1.6 | 0.8 | 100%
No | 71.6 | 26.5 | 1.4 | 0.5 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2005–06 School Survey on Crime and Safety (SSOCS), 2006.

Computation by NCES QuickStats on 2/10/2016
babbgd28
5
Q25b. Percent students limited English proficient by School size categories - based on 03-04 CCD frame variables (School).
Q25b. Percent students limited English proficient | 0% to 25% (%) | 26% to 50% (%) | 51% to 75% (%) | 76% to 100% (%) | Total
Estimates
Total | 89.2 | 6.5 | 2.9 | 1.3 | 100%
School size categories - based on 03-04 CCD frame variables (School)
< 300 | 93.2 | 3.4 | 2.1 | 1.3 | 100%
300 - 499 | 91.0 | 6.3 | 2.1 | 0.6 | 100%
500 - 999 | 85.5 | 8.3 | 4.1 | 2.0 | 100%
1,000 + | 87.6 | 8.4 | 3.1 | 0.9 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2005–06 School Survey on Crime and Safety (SSOCS), 2006.

Computation by NCES QuickStats on 2/11/2016
bbbbgk96
1
Total number of violent incidents recorded by School size categories - based on 05-06 CCD frame variables (School).
Total number of violent incidents recorded (%>0)
Estimates
Total | 75.5
School size categories - based on 05-06 CCD frame variables (School)
Less than 300 | 60.6
300 to 499 | 69.1
500 to 999 | 83.4
1,000 or more | 97.0
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS), 2008.

Computation by NCES QuickStats on 2/9/2016
mbbgbd4
2
Parent participates in parent-teacher conference by Urbanicity - Based on Urban-centric location of school.
Q5b. Parent participates in parent-teacher conference | 0 to 25% (%) | 26 to 50% (%) | 51 to 75% (%) | 76 to 100% (%) | School does not offer (%) | Total
Estimates
Total | 7.1 | 16.1 | 22.9 | 51.0 | 3.0 | 100%
Urbanicity - Based on Urban-centric location of school
City | 6.1 | 16.9 | 23.0 | 51.6 | 2.4 | 100%
Suburb | 4.4 | 12.7 | 22.3 | 56.4 | 4.2 | 100%
Town | 7.4 | 17.9 | 25.7 | 47.3 | 1.8 | 100%
Rural | 10.2 | 17.7 | 22.1 | 47.1 | 2.9 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS), 2008.

Computation by NCES QuickStats on 2/9/2016
mbbgb12
3
Total number of full-time security guards, SROs, or sworn law enforcement officers by Q30. Level of crime where school is located for School grades offered - based on 05-06 CCD frame variables (School) (High school).
Total number of full-time security guards, SROs, or sworn law enforcement officers (Avg>0)
Estimates
Total | 4.5
Q30. Level of crime where school is located
High level of crime | 5.8
Moderate level of crime | 6.5
Low level of crime | 3.8
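The "(Avg>0)" statistic above is an average taken only over schools reporting at least one officer; zero-officer schools are excluded from the denominator. A minimal sketch with hypothetical counts (not SSOCS data):

```python
def avg_positive(values):
    """Average of the values greater than zero; zeros are excluded."""
    positive = [v for v in values if v > 0]
    return sum(positive) / len(positive) if positive else 0.0

# Five hypothetical schools; two report no officers and are dropped from the mean.
print(avg_positive([0, 0, 3, 5, 7]))  # 5.0
```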
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS), 2008.

Computation by NCES QuickStats on 2/9/2016
mbbgc2c
4
Total number of disruptions by Q1t. School practice: Security cameras monitor the school.
Total number of disruptions | None (%) | 1 to 10 (%) | 11 to 20 (%) | 21 to 30 (%) | More than 30 (%) | Total
Estimates
Total | 70.3 | 29.4 | 0.3 | # | # | 100%
Q1t. School practice: Security cameras monitor the school
Yes | 66.6 | 33.0 | 0.4 | 0.1 | # | 100%
No | 74.9 | 25.0 | 0.2 | # | # | 100%
# Rounds to zero.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS), 2008.

Computation by NCES QuickStats on 2/9/2016
mbbgcb1
5
Percent students limited English proficient by School size categories - based on 05-06 CCD frame variables (School) for School grades offered - based on 05-06 CCD frame variables (School) (Primary, Middle).
Q25b. Percent students limited English proficient | 0% to 25% (%) | 26% to 50% (%) | 51% to 75% (%) | 76% to 100% (%) | Total
Estimates
Total | 84.9 | 9.7 | 3.8 | 1.6 | 100%
School size categories - based on 05-06 CCD frame variables (School)
Less than 300 | 89.4 | 6.5 | 1.5 | 2.6 | 100%
300 to 499 | 88.3 | 8.9 | 2.4 | 0.4 | 100%
500 to 999 | 80.7 | 12.0 | 6.2 | 1.1 | 100%
1,000 or more | 77.8 | 11.0 | 4.0 | 7.1 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2007–08 School Survey on Crime and Safety (SSOCS), 2008.

Computation by NCES QuickStats on 2/9/2016
mbbgce52
1
Census Region by School Typology.
Census Region | Northeast (%) | Midwest (%) | South (%) | West (%) | Total
Estimates
Total | 24.1 | 25.8 | 29.8 | 20.2 | 100%
School Typology
Catholic, parochial | 26.1 | 39.9 | 19.1 | 14.9 | 100%
Catholic, diocesan | 23.4 | 37.4 | 21.9 | 17.3 | 100%
Catholic, private | 34.6 | 25.7 | 21.1 | 18.6 | 100%
Other religious, conservative Christian | 11.3 | 20.4 | 45.8 | 22.4 | 100%
Other relig., affiliated w/ established denomination | 21.3 | 24.2 | 34.8 | 19.7 | 100%
Other relig., not affiliated w/ any denomination | 23.2 | 35.3 | 28.9 | 12.7 | 100%
Nonsectarian, regular school | 32.9 | 13.4 | 27.7 | 26.0 | 100%
Nonsectarian, special program | 20.7 | 18.2 | 29.0 | 32.0 | 100%
Nonsectarian, special education | 40.9 | 12.0 | 26.2 | 20.9 | 100%
Counts
Total | 6654 | 6218 | 8304 | 5807 | 26983
School Typology
Catholic, parochial | 722 | 1090 | 523 | 414 | 2749
Catholic, diocesan | 647 | 1015 | 606 | 483 | 2751
Catholic, private | 303 | 227 | 193 | 169 | 892
Other religious, conservative Christian | 465 | 705 | 1811 | 933 | 3914
Other relig., affiliated w/ established denomination | 560 | 684 | 973 | 555 | 2772
Other relig., not affiliated w/ any denomination | 1343 | 1313 | 1740 | 780 | 5176
Nonsectarian, regular school | 1393 | 488 | 1184 | 1176 | 4241
Nonsectarian, special program | 609 | 536 | 875 | 978 | 2998
Nonsectarian, special education | 612 | 160 | 399 | 319 | 1490
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Private School Universe Survey (PSS), 2011–12.

Computation by NCES QuickStats on 5/5/2016
febgd30
2
Student-Teacher Ratio by Percent to 4-Year College.
Student-Teacher Ratio | 0 to 1 (%) | 1 to 10 (%) | 10 to 20 (%) | Higher than 20 (%) | Total
Estimates
Total | 1.3 | 53.6 | 39.5 | 5.6 | 100%
Percent to 4-Year College
0 | 0.4 | 78.0 | 20.0 | 1.6 | 100%
1% to 25% | # | 53.8 | 40.4 | 5.8 | 100%
26% to 50% | 0.4 | 63.4 | 31.9 | 4.3 | 100%
51% to 75% | # | 47.7 | 50.2 | 2.1 | 100%
76% to 100% | 0.1 | 50.9 | 47.2 | 1.7 | 100%
Counts
Total | 377 | 14273 | 10832 | 1499 | 26981
Percent to 4-Year College
0 | 6 | 939 | 218 | 21 | 1184
1% to 25% | 0 | 228 | 119 | 18 | 365
26% to 50% | 3 | 533 | 259 | 35 | 830
51% to 75% | 0 | 428 | 451 | 19 | 898
76% to 100% | 4 | 1845 | 1759 | 57 | 3665
# Rounds to zero.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Private School Universe Survey (PSS), 2011–12.

Computation by NCES QuickStats on 5/5/2016
febgd6d
3
Days in School Year by Hours in School Day for Students.
Days in School Year | Lowest quartile (%) | Lower-middle quartile (%) | Upper-middle quartile (%) | Highest quartile (%) | Total
Estimates
Total | 25.9 | 12.4 | 41.6 | 20.1 | 100%
Hours in School Day for Students
1 to 3 | 41.4 | 3.5 | 29.1 | 25.9 | 100%
4 to 6 | 25.8 | 12.3 | 42.4 | 19.4 | 100%
7 to 9 | 25.9 | 13.2 | 41.7 | 19.1 | 100%
10 | 9.2 | 2.9 | 21.5 | 66.3 | 100%
Counts
Total | 6570 | 3426 | 11425 | 5562 | 26983
Hours in School Day for Students
1 to 3 | 180 | 16 | 133 | 115 | 444
4 to 6 | 3333 | 1890 | 6280 | 2947 | 14450
7 to 9 | 3017 | 1508 | 4920 | 2215 | 11660
10 | 40 | 12 | 92 | 285 | 429
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Private School Universe Survey (PSS), 2011–12.

Computation by NCES QuickStats on 5/5/2016
febgda8a
4
Urban-Centric Community Type by Number of Male Students (Coeducational).
Urban-Centric Community Type | City (ulocale=11, 12, 13) (%) | Suburb (ulocale=21, 22, 23) (%) | Town (ulocale=31, 32, 33) (%) | Rural (ulocale=41, 42, 43) (%) | Total
Estimates
Total | 32.4 | 35.4 | 9.4 | 22.8 | 100%
Number of Male Students (Coeducational)
Lowest quartile | 26.4 | 39.4 | 8.5 | 25.7 | 100%
Lower-middle quartile | 24.5 | 29.6 | 10.9 | 34.9 | 100%
Upper-middle quartile | 34.2 | 33.5 | 13.1 | 19.2 | 100%
Highest quartile | 42.2 | 39.8 | 6.2 | 11.8 | 100%
Counts
Total | 9143 | 9788 | 2584 | 5468 | 26983
Number of Male Students (Coeducational)
Lowest quartile | 1767 | 2556 | 506 | 1197 | 6026
Lower-middle quartile | 1790 | 2090 | 763 | 2206 | 6849
Upper-middle quartile | 2278 | 2160 | 886 | 1149 | 6473
Highest quartile | 2756 | 2622 | 404 | 728 | 6510
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Private School Universe Survey (PSS), 2011–12.

Computation by NCES QuickStats on 5/5/2016
febgdk8a
5
Size of School (K-12, UG) by School Typology.
Size of School (K-12, UG) | Less than 50 students (%) | 50-149 students (%) | 150-299 students (%) | 300-499 students (%) | 500-749 students (%) | 750 students or more (%) | Total
Estimates
Total | 43.6 | 24.8 | 17.8 | 7.9 | 3.6 | 2.3 | 100%
School Typology
Catholic, parochial | 3.6 | 23.6 | 45.0 | 20.2 | 6.1 | 1.5 | 100%
Catholic, diocesan | 3.6 | 23.4 | 40.0 | 19.1 | 9.7 | 4.1 | 100%
Catholic, private | 18.2 | 18.3 | 20.0 | 16.9 | 13.9 | 12.8 | 100%
Other religious, conservative Christian | 38.3 | 33.0 | 17.4 | 6.3 | 2.9 | 2.0 | 100%
Other relig., affiliated w/ established denomination | 35.1 | 31.8 | 19.4 | 7.4 | 3.5 | 2.9 | 100%
Other relig., not affiliated w/ any denomination | 63.1 | 22.3 | 9.0 | 3.2 | 1.3 | 1.0 | 100%
Nonsectarian, regular school | 60.5 | 15.9 | 10.7 | 6.7 | 3.2 | 3.1 | 100%
Nonsectarian, special program | 68.8 | 23.8 | 5.3 | 1.5 | 0.5 | 0.2 | 100%
Nonsectarian, special education | 54.1 | 37.0 | 7.5 | 1.2 | 0.1 | 0.1 | 100%
Counts
Total | 11480 | 6781 | 4903 | 2228 | 974 | 617 | 26983
School Typology
Catholic, parochial | 96 | 664 | 1238 | 552 | 161 | 38 | 2749
Catholic, diocesan | 102 | 650 | 1119 | 527 | 238 | 115 | 2751
Catholic, private | 130 | 171 | 190 | 159 | 126 | 116 | 892
Other religious, conservative Christian | 1498 | 1288 | 650 | 271 | 124 | 83 | 3914
Other relig., affiliated w/ established denomination | 1013 | 888 | 500 | 200 | 94 | 77 | 2772
Other relig., not affiliated w/ any denomination | 3109 | 1244 | 503 | 193 | 78 | 49 | 5176
Nonsectarian, regular school | 2631 | 639 | 434 | 266 | 138 | 133 | 4241
Nonsectarian, special program | 2088 | 692 | 156 | 43 | 14 | 5 | 2998
Nonsectarian, special education | 813 | 545 | 113 | 17 | 1 | 1 | 1490
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Private School Universe Survey (PSS), 2011–12.

Computation by NCES QuickStats on 5/5/2016
febgd7d
1
Undergraduate degree program by Total grants.
Undergraduate degree program | Certificate (%) | Associate's degree (%) | Bachelor's degree (%) | Not in a degree program (%) | Total
Estimates
Total | 8.0 | 42.3 | 46.4 | 3.3 | 100%
Total grants
$1-1,999 | 9.0 | 54.3 | 33.0 | 3.7 | 100%
$2,000-3,999 | 11.5 | 48.7 | 38.7 | 1.0 | 100%
$4,000-6,999 | 10.1 | 42.4 | 46.7 | 0.7 | 100%
$7,000 or more | 2.1 | 11.8 | 85.8 | 0.3 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd71
2
Federal Pell grant by NPSAS institution sector (4 with multiple).
Federal Pell grant | $100-1,999 (%) | $2,000-2,999 (%) | $3,000-4,499 (%) | $4,500-5,550 (%) | Total
Estimates
Total | 24.5 | 24.1 | 15.7 | 35.7 | 100%
NPSAS institution sector (4 with multiple)
Public 4-year | 18.7 | 20.0 | 15.3 | 46.0 | 100%
Private not-for-profit 4-year | 19.1 | 20.8 | 16.0 | 44.2 | 100%
Public 2-year | 30.9 | 27.6 | 16.5 | 25.0 | 100%
Private for-profit | 24.9 | 26.4 | 14.0 | 34.7 | 100%
More than one school | 22.1 | 21.2 | 17.2 | 39.5 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd22
3
Gender by Attendance intensity (all schools).
Gender | Male (%) | Female (%) | Total
Estimates
Total | 43.0 | 57.0 | 100%
Attendance intensity (all schools)
Exclusively full-time | 43.6 | 56.4 | 100%
Exclusively part-time | 42.2 | 57.8 | 100%
Mixed full-time and part-time | 42.9 | 57.1 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd1c
4
Total federal aid (excludes Veterans'/DOD) by Race/ethnicity (with multiple).
Total federal aid (excludes Veterans'/DOD) | $100-3,699 (%) | $3,700-6,499 (%) | $6,500-11,599 (%) | $11,600 or more (%) | Total
Estimates
Total | 24.5 | 23.8 | 26.6 | 25.1 | 100%
Race/ethnicity (with multiple)
White | 23.1 | 22.9 | 28.8 | 25.2 | 100%
Black or African American | 24.9 | 21.5 | 25.2 | 28.4 | 100%
Hispanic or Latino | 29.2 | 27.6 | 21.8 | 21.4 | 100%
Asian | 22.7 | 30.5 | 25.5 | 21.3 | 100%
American Indian or Alaska Native | 30.8 | 22.9 | 22.6 | 23.7 | 100%
Native Hawaiian / other Pacific Islander | 20.4 | 28.5 | 23.7 | 27.4 | 100%
Other | 22.0 | 23.2 | 26.2 | 28.6 | 100%
More than one race | ‡ | ‡ | ‡ | ‡ | 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd67
5
Citizenship by Total aid amount.
Citizenship | US citizen (%) | Resident alien (%) | Foreign or international student (%) | Total
Estimates
Total | 94.0 | 4.2 | 1.8 | 100%
Total aid amount
$100-3,499 | 94.3 | 4.9 | 0.8 | 100%
$3,500-7,699 | 95.0 | 4.4 | 0.6 | 100%
$7,700-14,699 | 95.8 | 3.7 | 0.5 | 100%
$14,700 or more | 96.3 | 2.7 | 1.0 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd30
1
Institutional tuition & fee waivers by NPSAS institution type: Graduate (with multiple).
Institutional tuition & fee waivers | $100-1,899 (%) | $1,900-5,599 (%) | $5,600-11,299 (%) | $11,300 or more (%) | Total
Estimates
Total | 24.4 | 26.4 | 24.9 | 24.3 | 100%
NPSAS institution type: Graduate (with multiple)
Public 4-year nondoctorate-granting | 30.6 | 45.3 | 20.6 | 3.5 | 100%
Public 4-year doctorate-granting | 21.6 | 21.2 | 30.0 | 27.1 | 100%
Private not-for-profit 4-yr nondoctorate-granting | 34.5 | 45.2 | 16.3 | 3.9 | 100%
Private not-for-profit 4-year doctorate-granting | 16.0 | 30.3 | 17.8 | 35.9 | 100%
Private for profit 4-year | 50.5 | 33.5 | 15.4 | 0.7 | 100%
Attended more than one institution | 19.6 | 33.3 | 23.3 | 23.8 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study Graduate Students (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd28
2
Age as of 12/31/2011 by Total income (continuous).
Age as of 12/31/2011 (Avg>0)
Estimates
Total | 32.3
Total income (continuous)
Less than $10,800 | 26.7
$10,800-32,699 | 29.8
$32,700-67,199 | 34.5
$67,200 or more | 38.3
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study Graduate Students (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd9c
3
Total loans by Graduate degree program.
Total loans (Mean[0])
Estimates
Total | 9,656.8
Graduate degree program
Master's degree | 8,001.3
Doctoral degree | 4,249.0
First-professional degree | 4,583.1
Post-BA or post-master's certificate | 30,743.7
Not in a degree program | 13,175.0
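Reading "(Mean[0])" as a mean taken over all students with zero-loan cases counted in the denominator (an interpretation, in contrast to the "(Avg>0)" tables elsewhere in this appendix, which exclude zeros), the statistic can be sketched with hypothetical loan amounts:

```python
def mean_including_zeros(values):
    """Mean over all cases; zeros stay in the denominator."""
    return sum(values) / len(values)

# Four hypothetical students, two with no loans: the zeros pull the mean down.
print(mean_including_zeros([0, 0, 10000, 20000]))  # 7500.0
```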
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study Graduate Students (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd9d
4
Attendance intensity (all schools) by Employer aid (includes college staff).
Attendance intensity (all schools) | Exclusively full-time (%) | Exclusively part-time (%) | Mixed full-time and part-time (%) | Total
Estimates
Total | 46.5 | 37.9 | 15.6 | 100%
Employer aid (includes college staff)
$1-1,999 | 22.1 | 62.6 | 15.2 | 100%
$2,000-4,999 | 29.6 | 56.4 | 14.0 | 100%
$5,000-10,099 | 36.3 | 48.5 | 15.3 | 100%
$10,100 or more | 55.3 | 28.9 | 15.8 | 100%
NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study Graduate Students (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd4f
5
State aid total by Graduate degree program.
State aid total | $100-1,399 (%) | $1,400-1,999 (%) | $2,000-3,999 (%) | $4,000 or more (%) | Total
Estimates
Total | 24.8 | 18.2 | 27.5 | 29.5 | 100%
Graduate degree program
Master's degree | 24.5 | 20.1 | 23.2 | 32.2 | 100%
Doctoral degree | ‡ | ‡ | ‡ | ‡ | 100%
First-professional degree | 27.0 | 17.9 | 16.1 | 39.0 | 100%
Post-BA or post-master's certificate | 25.9 | 14.5 | 33.8 | 25.8 | 100%
Not in a degree program | 37.0 | 11.7 | 36.5 | 14.8 | 100%
‡ Reporting standards not met.

NOTE: Rows may not add up to 100% due to rounding.

SOURCE: U.S. Department of Education, National Center for Education Statistics, 2011-12 National Postsecondary Student Aid Study Graduate Students (NPSAS:12).

Computation by NCES QuickStats on 6/23/2013
cahbd63
1
Highest degree attained as of 2001 by First year: Hours per week enrolled 1995-96.
Certificate (%) | Associate (%) | Bachelor (%) | Never attained (%) | Total
Estimates
Total | 11.7 | 9.8 | 29.8 | 48.6 | 100%
First year: Hours per week enrolled 1995-96
Did not work while enrolled | 14.0 | 9.8 | 38.5 | 37.8 | 100%
Worked part time | 9.1 | 11.5 | 33.5 | 45.8 | 100%
Worked full time | 15.3 | 6.4 | 7.6 | 70.7 | 100%
The names of the variables used in this table are: J1HOURY1 and DGREHI2B. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 1995-96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES PowerStats on 10/1/2010.
bhabhcfa
2
Cumulative persistence outcome 2000-01 by AP tests: Number taken (student).
Never attained (%) | Certificate (%) | Associate (%) | Bachelor (%) | Total
Estimates
Total | 48.6 | 11.7 | 9.9 | 29.8 | 100%
AP tests: Number taken (student)
0 | 51.1 | 7.7 | 12.1 | 29.1 | 100%
1 | 38.1 | 2.6 ! | 6.0 ! | 53.4 | 100%
2 | 33.6 | 0.4 !! | 3.4 ! | 62.6 | 100%
Three or more | 13.8 | 0.1 !! | 1.4 !! | 84.8 | 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.
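The "!"/"!!" flags follow the relative-standard-error thresholds stated in the footnotes (RSE = standard error divided by the estimate; "!" for 30-50 percent, "!!" above 50 percent). A minimal sketch of the rule, with hypothetical estimates and standard errors; the boundary handling at exactly 30 and 50 percent is an assumption:

```python
def rse_flag(estimate, standard_error):
    """Flag per the footnote rule: '!' for RSE of 30-50 percent, '!!' above 50.

    Inclusive/exclusive treatment at the 30 and 50 percent boundaries is an
    assumption; the footnotes do not specify it.
    """
    rse = 100 * standard_error / estimate
    if rse > 50:
        return "!!"
    if rse >= 30:
        return "!"
    return ""

print(rse_flag(2.6, 1.0))   # '!'  (RSE is about 38 percent)
print(rse_flag(1.0, 0.6))   # '!!' (RSE is 60 percent)
print(rse_flag(50.0, 2.0))  # ''   (RSE is 4 percent; no flag)
```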

The names of the variables used in this table are: TEAPNUMB and PROUTYX6. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 1995-96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES PowerStats on 10/1/2010.
bhabhc97
3
PELL grant received 1995-96 with (percent > 0) by Race/ethnicity and citizenship status.
PELL grant received 1995-96 (%>0)
Estimates
Total | 26.4
Race/ethnicity and citizenship status
White, non-Hispanic | 19.0
Black, non-Hispanic | 49.3
Hispanic | 42.4
Asian/Pacific Islander | 35.5
American Indian/Alaska Native | 33.2 !
Other | ‡
‡ Reporting standards not met.

! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: PELL96 and SBRACECI. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 1995-96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES PowerStats on 10/1/2010.
bhabhc66
4
Grade point average 2001 by Income percentile rank (dependent & independent) 1994.
Mostly A’s (%) | A’s and B’s (%) | Mostly B’s (%) | B’s and C’s (%) | Mostly C’s (%) | C’s and D’s (%) | Mostly D’s or below (%) | Total
Estimates
Total | 13.2 | 31.6 | 35.4 | 14.4 | 4.5 | 0.7 ! | 0.1 ! | 100%
Income percentile rank (dependent & independent) 1994
1-25 | 12.5 | 28.8 | 38.1 | 14.2 | 4.8 | 1.4 ! | 0.2 !! | 100%
26-50 | 12.9 | 30.7 | 37.0 | 12.9 | 5.9 | 0.4 ! | 0.2 ! | 100%
51-75 | 13.1 | 34.5 | 34.1 | 14.3 | 3.6 | 0.4 !! | 0.1 !! | 100%
More than 75 | 14.1 | 32.7 | 32.5 | 16.2 | 3.9 | 0.7 !! | 0.0 !! | 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: SEGPA2B and PCTALL2. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 1995-96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES PowerStats on 10/1/2010.
bhabhcpea
5
Persistence and attainment 6-year total by Gender.
Attained, still enrolled (%) | Attained, not enrolled (%) | Never attained, still enrolled (%) | Never attained, not enrolled (%) | Total
Estimates
Total | 5.9 | 45.5 | 14.9 | 33.7 | 100%
Gender
Male | 5.9 | 41.8 | 15.8 | 36.5 | 100%
Female | 5.8 | 48.5 | 14.2 | 31.5 | 100%
The names of the variables used in this table are: SBGENDER and PRAT2B. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 1995-96 Beginning Postsecondary Students Longitudinal Study, Second Follow-up (BPS:96/01).

Computation by NCES PowerStats on 10/1/2010.
bhabhd1f
1
Total loans with (percent > .5) by Graduate programs.
Total loans (%>0.5)
Estimates
Total | 40.0
Graduate programs
Not in a degree program | 28.0
Business administration (MBA) | 39.1
Education (any master's) | 34.8
Other master of arts (MA) | 41.3
Other master of science (MS) | 31.8
Other master's degree | 49.3
PhD except in education | 19.9
Education (any doctorate) | 27.1
Other doctoral degree | 49.5
Medicine (MD) | 77.3
Other health science degree | 81.7
Law (LLB or JD) | 81.0
Theology (MDiv, MHL, BD) | 30.0
Post-baccalaureate certificate | 30.1
The names of the variables used in this table are: GRADPGM and TOTLOAN. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES PowerStats on 10/1/2010.
bmabhec92
2
Total amount from assistantships with (percent > .5) by Graduate field of study.
Total amount from assistantships (%>0.5)
Estimates
Total | 15.3
Graduate field of study
Undeclared or not in a degree program | 5.4
Humanities | 20.8
Social/behavioral sciences | 31.7
Life sciences | 47.4
Math/Engineering/Computer science | 37.9
Education | 7.6
Business/management | 7.9
Health | 10.3
Law | 5.8
Others | 23.8
The names of the variables used in this table are: GRASTAMT and MAJORSGR. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

NOTICE OF REVISIONS: The NPSAS:04 weights were revised in June 2009. The revised weights will produce 2003-04 estimates that differ somewhat from those in any tables and publications produced before June 2009. See the description for the total Stafford loan variable (STAFFAMT) for details.

Computation by NCES PowerStats on 10/1/2010.
bmabhed8b
3
Primary role as student or employee (includes work-study or assistantship) by Graduate field of study.
A student working to meet expenses (%) | An employee enrolled in school (%) | No job (%) | Total
Estimates
Total | 35.8 | 45.1 | 19.1 | 100%
Graduate field of study
Undeclared or not in a degree program | 20.5 | 67.3 | 12.2 | 100%
Humanities | 44.9 | 35.9 | 19.2 | 100%
Social/behavioral sciences | 58.9 | 24.6 | 16.5 | 100%
Life sciences | 61.0 | 20.7 | 18.3 | 100%
Math/Engineering/Computer science | 47.4 | 38.3 | 14.3 | 100%
Education | 26.3 | 63.3 | 10.4 | 100%
Business/management | 24.8 | 61.8 | 13.3 | 100%
Health | 39.4 | 19.0 | 41.6 | 100%
Law | 39.6 | 11.6 | 48.8 | 100%
Others | 47.0 | 38.5 | 14.5 | 100%
The names of the variables used in this table are: JOBROLE2 and MAJORSGR. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhee01
4
Total loan debt (cumulative) with (percent > .5) by Type of 4-year graduate institution.
Total loan debt (cumulative)
(%>0.5)
Estimates
Total 65.2
Type of 4-year graduate institution
  Public 4-year nondoctorate 61.4
  Public 4-year doctorate 60.6
  Private not-for-profit 4-yr nondoctorate 61.6
  Private not-for-profit 4-year doctorate 71.3
  Private for-profit 4-year 85.9
  Attended more than one institution 68.9
The names of the variables used in this table are: AIDSECTG and BORAMT3. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabheff04
5
Average Total loans by Parent's highest education level.
Total loans
(Avg)
Estimates
Total 6,302.0
Parent's highest education level
  Do not know parent's education level 7,677.5 !
  High school diploma or less 5,878.7
  Some college 6,016.3
  Bachelor's degree 5,794.3
  Master's degree or higher 7,185.9
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: PAREDUC and TOTLOAN. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
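The "!" and "!!" flags in the table footnotes are driven by the relative standard error, RSE = 100 × (standard error / estimate). A small sketch of that rule, with invented standard errors and an assumed handling of boundary and zero estimates:

```python
# Sketch of the caution flags defined in the footnotes:
# RSE between 30 and 50 percent -> "!", RSE > 50 percent -> "!!".
# The standard errors used below are invented for illustration.
def rse_flag(estimate, std_error):
    """Return the caution flag an NCES table would attach to an estimate."""
    if estimate == 0:
        return "!!"  # RSE undefined; treated as unreliable (an assumption)
    rse = 100.0 * std_error / abs(estimate)
    if rse > 50:
        return "!!"
    if rse >= 30:
        return "!"
    return ""

print(rse_flag(7677.5, 2900.0))  # RSE ~ 37.8 percent -> "!"
print(rse_flag(7677.5, 500.0))   # RSE ~ 6.5 percent -> no flag
```

In the published tables these standard errors come from the survey's replicate weights; the threshold logic above is the only part the footnotes pin down.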
bmabheh25
1
Post-BA degree: Highest, collapsed by Undergrad major field of study 2.
  No post-baccalaureate enrollment
(%)
Certificate
(%)
Associates degree
(%)
Masters degree
(%)
Doctoral/professional degree
(%)
Total
Estimates
Total 74.4 3.5 0.4 16.5 5.3 100%
Undergrad major field of study 2
  Business and management 82.7 3.2 0.0 12.5 1.6 100%
  Education 77.9 2.4 ! 0.5 !! 19.2 0.0 !! 100%
  Engineering 76.0 2.0 ! 0.1 !! 20.0 1.9 100%
  Health professions 72.1 3.4 0.6 ! 19.6 4.3 100%
  Public affairs/social services 73.5 4.2 ! 0.3 !! 18.4 3.6 ! 100%
  Biological sciences 50.6 2.7 ! 0.5 !! 15.3 31.0 100%
  Mathematics and physical science 71.5 3.9 0.0 17.6 7.0 100%
  Social sciences 66.4 4.4 0.1 !! 15.9 13.1 100%
  History 65.6 5.0 ! 0.1 !! 20.7 8.5 100%
  Humanities 73.7 4.1 0.7 !! 17.5 4.0 100%
  Psychology 64.7 1.7 ! 0.5 ! 25.8 7.3 100%
  Other 78.9 4.7 0.7 ! 12.7 3.0 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.
The names of the variables used in this table are: HIDEGC and MAJORS4. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.
Source: U.S. Department of Education, National Center for Education Statistics, B&B: 00/01 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 2/8/2011.
gbbb93
2
Post-BA: Highest degree completed by Age received BA from NPSAS institution.
  Certificate
(%)
Associates degree
(%)
Masters degree
(%)
Doctoral/professional degree
(%)
Total
Estimates
Total 41.7 1.7 !! 44.5 2.3 ! 100%
Age received BA from NPSAS institution
  22 or younger 36.9 0.3 !! 55.8 1.5 !! 100%
  23-24 42.3 4.1 !! 44.9 1.2 !! 100%
  25-29 38.6 0.0 40.2 8.0 !! 100%
  30 or older 53.0 2.0 !! 25.3 0.7 !! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.
The names of the variables used in this table are: AGENBA and PBATT. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.
Source: U.S. Department of Education, National Center for Education Statistics, B&B: 00/01 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 2/9/2011.
bhabhd29
3
Average>0 Job income, annual amount, calculated by Highest degree plans.
  Job income, annual amount, calculated
(Avg>0)
Estimates
Total 33,129.6
Highest degree plans
  No plans beyond bachelors 34,078.4
  Post-baccalaureate certificate 31,428.6
  Masters degree 33,179.3
  Doctoral/professional degree 29,082.7
The names of the variables used in this table are: CEANNERN and EDEXP. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.
Source: U.S. Department of Education, National Center for Education Statistics, B&B: 00/01 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 2/14/2011.
bebbbf6
4
Total amount borrowed for undergraduate education and median Amount owed on all undergraduate loans by current occupation.
  Total amount borrowed for undergraduate education
(%>1000)
Amount owed all undergraduate loans 2000
(Median)
Estimates
Total 57.9 7,777.0
Current occupation code, collapsed
  Educators 64.9 10,322.0
  Business and management 61.0 5,383.0
  Engineering/software enginr/architecture 58.7 3,991.0
  Computer science 61.3 5,981.0
  Medical professionals 65.9 9,800.0
  Editors/writers/performers 57.1 10,000.0
  Human/protective service professionals 71.4 11,894.0
  Research, scientists, technical 61.1 7,500.0
  Administrative/clerical/legal 63.7 8,595.0
  Mechanics, laborers 55.1 1,200.0 !
  Service industries 58.6 8,500.0
  Other, uncodeable 52.9 5,256.0 !
Job related to undergraduate major, closely
  Not closely related 60.9 6,709.0
  Closely related 62.9 8,793.0
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
The names of the variables used in this table are: OWEAMT1, JBRELMJR, TOTDEBT and OCCD. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.
Source: U.S. Department of Education, National Center for Education Statistics, B&B: 00/01 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 2/14/2011.
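The (Median) column above is a weighted median. A hypothetical sketch under the simple "cumulative weight reaches half the total" definition (PowerStats' exact algorithm may interpolate between values); the dollar amounts are reused from the table, but the weights are invented:

```python
# Illustrative weighted median; weights are invented toy values.
def weighted_median(values, weights):
    """Smallest value at which cumulative weight reaches half the total."""
    pairs = sorted(zip(values, weights))
    half = sum(weights) / 2.0
    cum = 0.0
    for v, w in pairs:
        cum += w
        if cum >= half:
            return v
    return pairs[-1][0]

# Amounts owed (from the table) with made-up survey weights.
owed = [1200.0, 3991.0, 5383.0, 7500.0, 10322.0]
wts = [1.0, 1.0, 2.0, 1.0, 1.0]
print(weighted_median(owed, wts))
```

With equal weights this reduces to an ordinary sample median, which is why medians are less sensitive than means to the extreme debt values flagged elsewhere in these tables.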
bebbbec
5
Current, teaching position type by Attendance intensity 1999-2000.
  Elementary or secondary teacher
(%)
Substitute teacher
(%)
Teacher's aide
(%)
Itinerant teacher
(%)
Support teacher
(%)
Total
Estimates
Total 67.9 23.2 6.5 0.9 1.4 100%
Attendance intensity 1999-2000
  Exclusively full-time 68.2 22.8 6.3 0.8 ! 1.8 100%
  Half-time 67.7 21.9 7.7 ! 2.4 !! 0.3 !! 100%
  Less than half-time 70.5 20.6 ! 7.9 !! 0.6 !! 0.5 !! 100%
  Mixed 66.3 25.9 6.6 1.0 !! 0.3 !! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.
The names of the variables used in this table are: CGCURPOS and ATTNPTRN. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.
Source: U.S. Department of Education, National Center for Education Statistics, B&B: 00/01 Baccalaureate and Beyond Longitudinal Study.
Computation by NCES PowerStats on 2/14/2011.
bebbb4d
1
Aid: Applied for federal aid by Income: Dependent student household income.
No
(%)
Yes
(%)
Total
Estimates
Total 41.7 58.3 100%
Income: Dependent student household income
  Less than $32,000 21.3 78.7 100%
  $32,000-59,999 33.4 66.6 100%
  $60,000-91,999 43.1 56.9 100%
  $92,000 or more 52.9 47.1 100%
The names of the variables used in this table are: DEPINC and FEDAPP. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhd14
2
Cumulative Grade Point Average (GPA) as of 2003-2004 by College study: Major field.
Less than 2.75
(%)
2.75 to 3.74
(%)
More than 3.75
(%)
Total
Estimates
Total 34.4 49.0 16.7 100%
College study: Major field
  Humanities 35.9 50.4 13.6 100%
  Social/behavioral sciences 35.0 52.1 12.8 100%
  Life sciences 34.9 52.7 12.4 100%
  Physical sciences 31.5 54.3 14.2 100%
  Math 29.1 55.3 15.6 100%
  Computer/information science 34.0 48.1 17.9 100%
  Engineering 37.4 48.1 14.5 100%
  Education 31.9 52.6 15.5 100%
  Business/management 35.6 49.3 15.1 100%
  Health 32.2 50.7 17.0 100%
  Vocational/technical 33.3 47.1 19.6 100%
  Other technical/professional 36.7 49.9 13.4 100%
  Undeclared or not in a degree program 33.2 44.1 22.8 100%
The names of the variables used in this table are: MAJORS12 and GPA. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhd3a
3
Median Net price after all aid by NPSAS institution type.
Net price after all aid
(Median)
Estimates
Total 6,656.0
Institution sector (with multiple)
  Public less-than-2-year 5,616.5
  Public 2-year 4,716.3
  Public 4-year nondoctorate 6,253.5
  Public 4-year doctorate 7,564.1
  Private not-for-profit less than 4-year 7,382.3
  Private not-for-profit 4-yr nondoctorate 9,208.7
  Private not-for-profit 4-year doctorate 14,812.2
  Private for-profit less-than-2-year 7,842.9
  Private for-profit 2 years or more 6,737.6
  Attended more than one institution ‡
‡ Reporting standards not met.

The names of the variables used in this table are: NETCST1 and AIDSECT. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhd32
4
Parent's highest education by Institution sector (with multiple).
High school or less
(%)
Some college
(%)
Bachelor's or higher
(%)
Total
Estimates
Total 36.3 21.1 42.6 100%
Institution sector (with multiple)
  Public less-than-2-year 53.9 17.3 28.8 100%
  Public 2-year 42.7 23.5 33.8 100%
  Public 4-year nondoctorate 34.6 21.7 43.6 100%
  Public 4-year doctorate 24.2 18.9 56.9 100%
  Private not-for-profit less than 4-year 46.1 18.5 35.3 100%
  Private not-for-profit 4-yr nondoctorate 34.1 19.1 46.8 100%
  Private not-for-profit 4-year doctorate 19.2 14.6 66.2 100%
  Private for-profit less-than-2-year 54.8 17.1 28.1 100%
  Private for-profit 2 years or more 53.0 20.0 27.0 100%
  Attended more than one institution 31.1 21.8 47.1 100%
The names of the variables used in this table are: PAREDUC and AIDSECT. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhd93
5
Average>0 Grants: Pell Grants by Income: Categories by dependency status.
Grants: Pell Grants
(Avg>0)
Estimates
Total 2,449.7
Income: Categories by dependency status
  Dependent: Less than $10,000 3,242.2
  Dependent: $10,000-$19,999 3,176.1
  Dependent: $20,000-$29,999 2,715.0
  Dependent: $30,000-$39,999 1,958.3
  Dependent: $40,000-$49,999 1,508.6
  Dependent: $50,000-$59,999 1,309.0
  Dependent: $60,000-$69,999 1,241.7
  Dependent: $70,000-$79,999 1,404.4
  Dependent: $80,000-$99,999 ‡
  Dependent: $100,000 or more ‡
  Independent: Less than $5,000 2,860.3
  Independent: $5,000-$9,999 2,642.9
  Independent: $10,000-$19,999 2,291.7
  Independent: $20,000-$29,999 2,328.3
  Independent: $30,000-$49,999 1,561.9
  Independent: $50,000 or more 1,124.3
‡ Reporting standards not met.

The names of the variables used in this table are: INCOME and PELLAMT. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2003-04 National Postsecondary Student Aid Study (NPSAS:04).

Computation by NCES PowerStats on 10/1/2010.
bmabhe3a
1
Employed full or part time at this institution by 2000 Carnegie code by control.
Full time
(%)
Part time
(%)
Total
Estimates
Total 56.3 43.7 100%
2000 Carnegie code by control
  Public doctoral 77.8 22.2 100%
  Private not-for-profit doctoral 68.7 31.3 100%
  Public master's 63.3 36.7 100%
  Private not-for-profit master's 45.0 55.0 100%
  Private not-for-profit baccalaureate 63.2 36.8 100%
  Public associates 33.3 66.7 100%
  Other 49.2 50.8 100%
The names of the variables used in this table are: Q5 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhbfa
2
Race/ethnicity recoded by 2000 Carnegie code by control, for Employed full or part time at this institution (Full time).
White non-Hispanic
(%)
Black/African American non-Hispanic
(%)
Asian/Pacific Islander
(%)
Hispanic White or Hispanic Black
(%)
Other
(%)
Total
Estimates
Total 80.3 5.8 9.2 3.4 1.2 100%
2000 Carnegie code by control
  Public doctoral 78.9 4.2 12.9 3.0 1.1 100%
  Private not-for-profit doctoral 78.2 5.1 12.8 3.2 0.7 ! 100%
  Public master's 78.1 9.1 7.6 3.6 1.6 ! 100%
  Private not-for-profit master's 85.6 5.1 5.7 2.4 1.3 100%
  Private not-for-profit baccalaureate 85.7 6.9 4.1 2.2 1.2 100%
  Public associates 80.7 7.4 4.4 5.7 1.7 100%
  Other 86.7 5.0 5.6 1.8 0.9 ! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: X03Q74, Q5 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb68
3
Tenure status by 2000 Carnegie code by control, for Employed full or part time at this institution (Full time).
Tenured
(%)
On tenure track but not tenured
(%)
Not on tenure track
(%)
Not tenured-no tenure system
(%)
Total
Estimates
Total 47.5 20.6 23.7 8.3 100%
2000 Carnegie code by control
  Public doctoral 49.3 19.4 30.3 0.9 100%
  Private not-for-profit doctoral 43.4 19.3 32.7 4.7 100%
  Public master's 53.9 27.7 17.6 0.9 !! 100%
  Private not-for-profit master's 42.0 27.4 22.2 8.4 100%
  Private not-for-profit baccalaureate 42.7 24.5 22.7 10.1 100%
  Public associates 48.5 15.5 10.1 25.9 100%
  Other 39.8 16.8 19.4 24.1 100%
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: Q12, Q5 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb0a
4
Rank by 2000 Carnegie code by control, for Employed full or part time at this institution (Part time).
Professor
(%)
Associate professor
(%)
Assistant professor
(%)
Instructor or Lecturer
(%)
Other ranks/Not applicable
(%)
Total
Estimates
Total 4.4 2.8 3.7 43.1 46.0 100%
2000 Carnegie code by control
  Public doctoral 6.7 4.5 9.4 42.1 37.3 100%
  Private not-for-profit doctoral 5.9 4.7 11.7 32.1 45.5 100%
  Public master's 6.2 2.2 ! 2.2 40.4 49.0 100%
  Private not-for-profit master's 2.6 3.3 2.6 30.3 61.1 100%
  Private not-for-profit baccalaureate 4.5 4.5 5.9 ! 31.7 53.5 100%
  Public associates 3.1 1.4 0.9 51.5 43.2 100%
  Other 6.7 4.8 4.9 35.2 48.4 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: Q10, Q5 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb1b
5
Average>0 Average total hours per week worked by Tenure status.
Average total hours per week worked
(Avg>0)
Estimates
Total 47.4
Tenure status
  Tenured 53.3
  On tenure track but not tenured 53.7
  Not on tenure track 43.0
  Not tenured-no tenure system 45.4
The names of the variables used in this table are: Q12 and X01Q31. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb3a
1
Any faculty represented by a union by 2000 Carnegie code by control.
Not represented by a union
(%)
Represented by a union
(%)
Total
Estimates
Total 68.1 31.9 100%
2000 Carnegie code by control
  Public doctoral 69.1 30.9 100%
  Private not-for-profit doctoral 94.4 5.6 100%
  Public master's 58.1 41.9 100%
  Private not-for-profit master's 87.6 12.4 ! 100%
  Private not-for-profit baccalaureate 86.7 13.3 !! 100%
  Public associates 42.4 57.6 100%
  Other 78.3 21.7 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: X01I12 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb79
2
Median Undergraduate instruction: Percent full-time faculty by 2000 Carnegie code by control.
Undergraduate instruction: Percent full-time faculty
(Median)
Estimates
Total 67.1
2000 Carnegie code by control
  Public doctoral 68.6
  Private not-for-profit doctoral 70.7
  Public master's 75.6
  Private not-for-profit master's 68.3
  Private not-for-profit baccalaureate 74.5
  Public associates 59.9
  Other 67.0
The names of the variables used in this table are: I19A and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb52
3
Full time tenure: Downsized tenured faculty by 2000 Carnegie code by control.
No
(%)
Yes
(%)
Total
Estimates
Total 85.7 14.3 100%
2000 Carnegie code by control
  Public doctoral 83.4 16.6 100%
  Private not-for-profit doctoral 93.9 6.1 100%
  Public master's 90.7 9.3 ! 100%
  Private not-for-profit master's 99.6 0.4 100%
  Private not-for-profit baccalaureate 88.1 11.9 !! 100%
  Public associates 87.7 12.3 ! 100%
  Other 68.0 32.0 !! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: I7C and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb85
4
Full time tenure: Maximum years on tenure track by 2000 Carnegie code by control.
No maximum
(%)
Less than 5 years
(%)
5 years
(%)
6 years
(%)
7 years
(%)
More than 7 years
(%)
Total
Estimates
Total 17.5 ! 17.4 8.5 27.0 26.0 3.6 100%
2000 Carnegie code by control
  Public doctoral 7.5 0.0 1.1 37.3 45.9 8.2 100%
  Private not-for-profit doctoral 11.4 0.0 2.8 32.0 34.4 19.4 100%
  Public master's 1.5 0.0 22.0 ! 37.1 38.9 0.6 100%
  Private not-for-profit master's 16.8 !! 0.0 7.1 !! 40.5 ! 27.4 ! 8.2 !! 100%
  Private not-for-profit baccalaureate 9.9 ! 0.7 0.0 53.5 32.2 3.7 !! 100%
  Public associates 15.6 !! 44.6 16.9 ! 8.2 ! 13.7 !! 1.1 ! 100%
  Other 41.9 ! 27.1 ! 1.9 !! 10.3 !! 18.5 ! 0.2 !! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: I6 and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb73
5
Undergraduate instruction: Percent part time faculty with (percent > 50) by 2000 Carnegie code by control.
Undergraduate instruction: Percent part time faculty
(%>50)
Estimates
Total 17.9
2000 Carnegie code by control
  Public doctoral 0.6
  Private not-for-profit doctoral 9.9
  Public master's 1.6
  Private not-for-profit master's 15.6 !!
  Private not-for-profit baccalaureate 11.1 !
  Public associates 23.9
  Other 26.0 !
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.
!! Interpret data with caution. Relative standard error (RSE) > 50 percent.

The names of the variables used in this table are: I19B and X121Q0. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, 2004 National Study of Postsecondary Faculty (NSOPF:04).

Computation by NCES PowerStats on 10/1/2010.
bkabhb2c
1
Time between college entry and bachelor’s degree by Undergraduate major.
4 years or less
(%)
4 to 5 years
(%)
5 to 6 years
(%)
6 to 10 years
(%)
More than 10 years
(%)
Total
Estimates
Total 35.5 27.4 11.4 11.7 14.0 100%
Undergraduate major
  Business and management 32.6 26.9 8.7 13.3 18.6 100%
  Education 32.9 30.4 10.7 11.0 15.0 100%
  Engineering 25.3 37.4 15.9 11.4 10.0 100%
  Health professions 22.0 27.3 13.5 14.2 23.1 100%
  Public affairs/social services 28.3 29.7 11.9 ! 13.2 17.0 100%
  Biological sciences 53.5 21.7 10.9 8.4 5.5 100%
  Mathematics & science 38.9 24.9 11.7 11.2 13.3 100%
  Social science 47.5 25.3 11.4 10.2 5.6 100%
  History 40.1 26.3 20.0 ! 5.3 ! 8.3 100%
  Humanities 39.8 21.4 12.8 12.1 13.8 100%
  Psychology 39.8 26.1 7.3 12.0 14.8 100%
  Other 35.4 28.7 12.4 11.3 12.2 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: BAMAJOR and B2BATIM2. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, B&B: 93/03 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 10/1/2010.
bmabhea28
2
Highest degree completed as of 2003 by Age when received bachelor’s degree, for Student has a bachelor’s degree (Yes).
Bachelor's degree
(%)
Master's degree
(%)
First-professional degree
(%)
Doctoral degree
(%)
Total
Estimates
Total 73.8 20.2 4.0 2.0 100%
Age when received bachelor’s degree
  22 or younger 65.5 24.6 6.7 3.1 100%
  23-24 80.9 15.4 2.4 1.3 100%
  25-29 84.9 13.7 0.7 ! 0.7 100%
  30 or older 78.8 19.1 1.3 ! 0.8 ! 100%
! Interpret data with caution. Relative standard error (RSE) falls between 30 and 50 percent.

The names of the variables used in this table are: BACC, AGEATBA and B3HDG03. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, B&B: 93/03 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 10/1/2010.
bmabhee3
3
Median Job 2003: annual salary by Highest degree attained by 2003.
Job 2003: annual salary
(Median)
Estimates
Total 52,423.0
Highest degree attained by 2003
  Bachelor's degree 50,430.9
  Master's degree 52,943.2
  First-professional degree 82,217.0
  Doctoral degree 60,705.7
The names of the variables used in this table are: B3CURINC and B3HDG03. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, B&B: 93/03 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 10/1/2010.
bmabhef8e
4
Undergraduate loans: total owed as of 2003 with (percent > 1) by Occupational category 2003 (collapsed).
Undergraduate loans: total owed as of 2003
(%>1)
Estimates
Total 51.4
Occupational category 2003 (collapsed)
  Educators 54.3
  Business and management 49.4
  Engineering/architecture 54.8
  Computer science 56.2
  Medical professionals 52.9
  Editors/writers/performers 44.5
  Human/protective service/legal profess 53.4
  Research, scientists, technical 50.5
  Administrative/clerical/legal support 53.2
  Mechanics, laborers 50.6
  Service industries 48.7
  Other, military 51.1
The names of the variables used in this table are: B3UGLN and B3OCCAT. The variable names are unique identifiers. To locate these variables, enter the variable name in the search box.

Source: U.S. Department of Education, National Center for Education Statistics, B&B: 93/03 Baccalaureate and Beyond Longitudinal Study.

Computation by NCES PowerStats on 10/1/2010.
bmabhe4c
5

Teaching status as of 2003 interview by Highest degree attained by 2003.
Currently teaching
(%)
Left teaching
(%)
Never taught
(%)
Total
Estimates
Total 10.5 9.1 80.4 100%
Highest degree attained by 2003