NCES Blog

National Center for Education Statistics

Back to School by the Numbers

By Dana Tofig, Communications Director, Institute of Education Sciences

Across the country, hallways and classrooms are full of activity as students head back to school for the 2016–17 academic year. Each year, the National Center for Education Statistics (NCES) compiles some back-to-school facts and figures that give a snapshot of our schools and colleges for the coming year. You can see the full report on the NCES website, but here are a few “by-the-number” highlights. You can also click on the hyperlinks throughout the blog to see additional data on these topics.

The staff of NCES and the Institute of Education Sciences hopes our students, teachers, administrators, and families have an outstanding school year!

 

50.4 million

The number of students expected to attend public elementary and secondary schools this year—slightly more than in the 2015–16 school year. The racial and ethnic profile of these students will continue to shift, with 24.6 million White students, 7.8 million Black students, 13.3 million Hispanic students, 2.7 million Asian/Pacific Islander students, 0.5 million American Indian/Alaska Native students, and 1.5 million students of two or more races. About 5.2 million students are expected to attend private schools.

 

16.1

The expected number of public school students per teacher in fall 2016. This ratio hasn’t changed much since 2000, when it was 16.0. However, the pupil/teacher ratio is lower in private schools—12.1—and has fallen since 2000, when it was 14.5. 

 

$11,600

This is the projected per-student expenditure in public elementary and secondary schools in 2016–17. Adjusting for inflation, per-student expenditures are expected to rise about 1.5 percent over last school year.

 

3.5 million

The number of students expected to graduate from high school this academic year—nearly 3.2 million from public schools and more than 310,000 from private schools.

 

20.5 million

This is the number of students expected to attend American colleges and universities this fall—an increase of 5.2 million since fall 2000. About 11.7 million of these students will be female and 8.8 million will be male. About 13.3 million will attend four-year institutions and 7.2 million will attend two-year institutions.

 

14.5% and 16.5%

These percentages represent the share of college students who were Black and Hispanic, respectively, in 2014. From 2000 to 2014, the percentage of college students who were Black rose 2.8 percentage points (from 11.7 percent to 14.5 percent), and the percentage who were Hispanic rose 6.6 percentage points (from 9.9 percent to 16.5 percent).

 

$16,188 and $41,970

These are the average annual prices for undergraduate tuition, fees, room, and board at public and private non-profit institutions, respectively, for the 2014–15 academic year. The average annual price at private, for-profit institutions was $23,372. 

 

A New Guide to Education Data Privacy

By The National Forum on Education Statistics Education Data Privacy Working Group

The expanding use of data and new technologies for classroom instruction hold promise for facilitating learning and better personalizing education for students. However, these changes also heighten the responsibility of schools and education agencies to protect student privacy. The recently released Forum Guide to Education Data Privacy offers recommendations on how to do this.

Privacy is one of the most important issues in education data policy today. Many states have passed laws that require education agencies to implement strong privacy programs and procedures. State and local education agencies (SEAs and LEAs) are responding to the growing demands for privacy protection, as well as expectations for transparency in how student data are collected, used, and protected. Local and state members of the National Forum on Education Statistics (the Forum) identified a particular need for a resource that would assist SEAs and LEAs in working with school staff to ensure that student data are properly protected.

The Forum established an Education Data Privacy Working Group tasked with developing a resource to help education agencies support school staff in responsibly using and sharing student data for instructional and administrative purposes, as well as strengthen agency privacy programs and related professional development efforts. The Forum Guide to Education Data Privacy was released in early July.

Chapter 1 of the guide includes information on

  • federal and state privacy laws;
  • the interrelationships among data governance, data security, and data privacy;
  • roles and responsibilities for protecting privacy at various agency levels; and
  • effective professional development on data privacy and security.

Chapter 2 includes 11 case studies designed to highlight common privacy issues related to the use of student data and presents basic approaches to managing those issues. Topics include

  • using online apps in the classroom;
  • responding to parent and PTA requests for student contact information;
  • using and sharing student data within a school;
  • sharing data among community schools and community-based organizations;
  • using data in presentations and training materials; and
  • using social media.

Each case study includes a scenario that exemplifies the privacy risk, and offers various approaches and action steps that agencies can take to minimize the risk. The information presented in the case studies is based largely on the collective experience of members of the Forum.

The working group collaborated with the U.S. Department of Education’s Privacy Technical Assistance Center (PTAC) in the development of the guide. Links to free, helpful PTAC resources are highlighted throughout. 

It is important for education agencies to understand that there is no “one-size-fits-all” approach to protecting privacy. Each agency needs to consider relevant state and federal laws, state and local school board policies, parental expectations, student instructional needs, and the agency’s available resources when developing privacy guidelines and procedures. It is our hope that the Forum Guide to Education Data Privacy will help agencies develop privacy programs and procedures that fit their particular circumstances.    

 

About the National Forum on Education Statistics

The work of the National Forum on Education Statistics is a key aspect of the National Cooperative Education Statistics System. The Cooperative System was established to produce and maintain, with the cooperation of the states, comparable and uniform education information and data that are useful for policymaking at the federal, state, and local levels. To assist in meeting this goal, the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, established the Forum to improve the collection, reporting, and use of elementary and secondary education statistics. The Forum addresses issues in education data policy, sponsors innovations in data collection and reporting, and provides technical assistance to improve state and local data systems.

Members of the Forum establish working groups to develop best practice guides in data-related areas of interest to federal, state, and local education agencies. They are assisted in this work by NCES, but the content comes from the collective experience of working group members who review all products iteratively throughout the development process. After the working group completes the content and reviews a document a final time, publications are subject to examination by members of the Forum standing committee that sponsors the project. Finally, Forum members (approximately 120 people) review and formally vote to approve all documents prior to publication. NCES provides final review and approval prior to online publication.

The information and opinions published in Forum products do not necessarily represent the policies or views of the U.S. Department of Education, IES, or NCES. For more information about the Forum, please visit nces.ed.gov/forum or contact Ghedam Bairu at Ghedam.bairu@ed.gov.

High Job Satisfaction Among Teachers, but Leadership Matters

By Lauren Musu-Gillette

Are teachers satisfied with their jobs? Overall, the answer appears to be yes. However, a recent NCES report highlights that teacher job satisfaction differs by school characteristics.

Newly released data show that at least 9 out of 10 teachers reported that they were satisfied with their jobs in 2003–04, 2007–08, and 2011–12. A higher percentage of private school teachers than public school teachers reported that they were satisfied with their jobs in all of these years.


Percent of teachers reporting they were satisfied in their jobs: School years 2003–04, 2007–08, and 2011–12

NOTE: “Satisfied” teachers are those who responded “strongly agree” or “somewhat agree” to the statement: “I am generally satisfied with being a teacher at this school.”
SOURCE: U.S. Department of Education, National Center for Education Statistics, Schools and Staffing Survey (SASS).


Differences in teacher job satisfaction also emerged based on perceptions of administrative support.[i] In 2011–12, 95 percent of teachers who felt that the administration in their schools was supportive were satisfied with their jobs. This was 30 percentage points higher than the percentage among teachers who did not feel the administration was supportive. This pattern was seen in private schools as well and is consistent with previous research demonstrating the importance of school administrators to teachers’ working conditions.[ii]


Percent of satisfied teachers, by their perceptions of administrative support: School year 2011–12

NOTE: “Satisfied” teachers are those who responded “strongly agree” or “somewhat agree” to the statement: “I am generally satisfied with being a teacher at this school.”
SOURCE: U.S. Department of Education, National Center for Education Statistics, Schools and Staffing Survey (SASS).


[i] Support was measured by teachers’ agreement or disagreement with the statement “The school administration’s behavior toward the staff is supportive and encouraging.”
[ii] Ladd, H. F. (2011). Teachers’ Perceptions of Their Working Conditions: How Predictive of Planned and Actual Teacher Movement? Educational Evaluation and Policy Analysis, 33(2): 235-261.

Learning to Use the Data: Online Dataset Training Modules

UPDATED JANUARY 16, 2018: New and Updated Modules Added

By Andy White

NCES provides a wealth of data online for users to access. However, the breadth and depth of the data can be overwhelming to first-time users and, sometimes, even to more experienced ones. To help our users learn how to access, navigate, and use NCES datasets, we’ve developed a series of online training modules.

The Distance Learning Dataset Training (DLDT) resource is an online, interactive tool that allows users to learn about NCES data across the education spectrum and evaluate their suitability for specific research purposes. The DLDT program at NCES has developed a growing number of online training modules for several NCES complex sample survey and administrative datasets. The modules teach users about the intricacies of each dataset, including what the data represent, how the data are collected, the sample design, and considerations for analysis, all of which help users conduct successful analyses.

The DLDT is also a teaching tool that can be used by individuals both in and out of the classroom to learn about NCES complex sample survey and administrative data collections and appropriate analysis methods.

There are two types of NCES DLDT modules available: common modules and dataset-specific modules. The common modules help users broadly understand NCES data across the education spectrum, introduce complex survey methods, and explain how to acquire NCES micro-data. The dataset-specific modules introduce and educate users about particular datasets. The available modules are listed below, and more information can be found on the DLDT website.

 

AVAILABLE DLDT MODULES

Common Modules

  • Introduction to the NCES Distance Learning Dataset Training System
  • Introduction to the NCES Datasets
  • Introduction to NCES Web Gateways: Accessing and Exploring NCES Data
  • Analyzing NCES Complex Survey Data
  • Statistical Analysis of NCES Datasets Employing a Complex Sample Design
  • Acquiring Micro-level NCES Data
  • DataLab Tools: QuickStats, PowerStats, and TrendStats

Dataset-Specific Modules

  • Common Core of Data (CCD)
  • Introduction to MapED
  • Fast Response Survey System (FRSS)
  • Early Childhood Longitudinal Study Birth Cohort (ECLS-B)
  • Early Childhood Longitudinal Study Kindergarten Class of 1998-1999 (ECLS-K)
  • Early Secondary Longitudinal Studies (1972–2000)
    • National Longitudinal Study of 1972 (NLS-72)
    • High School and Beyond (HS&B)
    • National Education Longitudinal Study of 1988 (NELS:88)
  • Educational Longitudinal Study of 2002 (ELS:2002)
  • High School Longitudinal Study of 2009 (HSLS:09)
  • Introduction to High School Transcript Studies
  • Integrated Postsecondary Education Data System (IPEDS)
  • National Assessment of Educational Progress (NAEP)
    • Main, State, and Long-Term Trend NAEP
    • NAEP High School Transcript Study (HSTS)
    • National Indian Education Study (NIES)
  • National Household Education Survey Program (NHES)
  • Postsecondary Education Sample Survey Datasets
    • National Postsecondary Student Aid Study (NPSAS)
    • Beginning Postsecondary Student Longitudinal Study (BPS)
    • Baccalaureate and Beyond Longitudinal Study (B&B)
  • Postsecondary Education Quick Information System (PEQIS)
  • Private School Universe Survey (PSS)
  • Schools and Staffing Survey (SASS)
    • Teacher Follow-up Survey (TFS)
    • Principal Follow-up Survey (PFS)
    • Beginning Teacher Longitudinal Study (BTLS)
  • School Survey on Crime and Safety (SSOCS)
  • International Activities Program Studies Datasets
    • Progress in International Reading Literacy Study (PIRLS)
    • Trends in International Mathematics and Science Study (TIMSS)
    • Program for International Student Assessment (PISA)
    • Program for the International Assessment of Adult Competencies (PIAAC)

Statistical Concepts in Brief: Embracing the Errors

By Lauren Musu-Gillette

EDITOR’S NOTE: This is part of a series of blog posts about statistical concepts that NCES uses as a part of its work.

Many of the important findings in NCES reports are based on data gathered from samples of the U.S. population. These sample surveys provide an estimate of what the data would look like if the full population had participated, at a great savings in both time and cost. However, because the entire population is not included, there is always some degree of uncertainty associated with an estimate from a sample survey. For those using the data, knowing the size of this uncertainty is important both for evaluating the reliability of an estimate and for statistical testing to determine whether two estimates are significantly different from one another.

NCES reports standard errors for all data from sample surveys. In addition to providing these values to the public, NCES uses them for statistical testing purposes. Within annual reports such as the Condition of Education, Indicators of School Crime and Safety, and Trends in High School Dropout and Completion Rates in the United States, NCES uses statistical testing to determine whether estimates for certain groups are statistically significantly different from one another. Specific language is tied to the results of these tests. For example, in comparing male and female employment rates in the Condition of Education, the indicator states that the overall employment rate for young males 20 to 24 years old was higher than the rate for young females 20 to 24 years old (72 vs. 66 percent) in 2014. Use of the term “higher” indicates that statistical testing was performed to compare these two groups and the results were statistically significant.

If differences between groups are not statistically significant, NCES uses the phrases “no measurable differences” or “no statistically significant differences at the .05 level.” This is because we do not know for certain that differences do not exist at the population level, just that our statistical tests of the available data were unable to detect differences. This could be because there is in fact no difference, but it could also be due to other reasons, such as a small sample size or large standard errors for a particular group. Heterogeneity, or a large amount of variability, within a sample can also contribute to larger standard errors.

Some of the populations of interest to education stakeholders are quite small, for example, Pacific Islander or American Indian/Alaska Native students. As a consequence, these groups are typically represented by relatively small samples, and their estimates are often less precise than those of larger groups. This lower precision is often reflected in larger standard errors. For example, in the table above, the standard error for White students who reported having been in 0 physical fights anywhere is 0.70, whereas the standard error is 4.95 for Pacific Islander students and 7.39 for American Indian/Alaska Native students. This means that the uncertainty around the estimates for Pacific Islander and American Indian/Alaska Native students is much larger than it is for White students. Because of these larger standard errors, differences between these groups that may seem large may not be statistically significant, and NCES analysts will note when large apparent differences fail to reach significance. NCES data users can use standard errors to help make valid comparisons using the data that we release to the public.
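To see how larger standard errors can swallow apparently large gaps, consider a simple two-sided z-test for the difference between two independent estimates: divide the difference by the standard error of the difference (the two standard errors combined in quadrature) and compare the result with 1.96, the critical value at the .05 level. In the sketch below, the standard errors of 0.70 and 4.95 are the values quoted above; the point estimates and the 0.80 standard error are hypothetical, included only to illustrate the mechanics.

```python
import math

def z_statistic(est1, se1, est2, se2):
    """z statistic for the difference between two independent estimates.
    The standard error of the difference is sqrt(se1**2 + se2**2)."""
    return (est1 - est2) / math.sqrt(se1**2 + se2**2)

# Hypothetical point estimates; standard errors of 0.70 and 4.95 from the text.
# An 8-point gap involving a small group with a large standard error:
print(abs(z_statistic(80.0, 0.70, 72.0, 4.95)) > 1.96)  # False: not significant

# A smaller 4-point gap between two precisely estimated groups
# (the 0.80 standard error here is also hypothetical):
print(abs(z_statistic(80.0, 0.70, 76.0, 0.80)) > 1.96)  # True: significant
```

Note that the larger 8-point gap is not detectable, while the smaller 4-point gap is: with a standard error of 4.95, the z statistic for the first comparison is only about 1.6, below the 1.96 threshold.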

Another example of how standard errors can affect whether sample differences are statistically significant can be seen when comparing NAEP score changes by state. Between 2013 and 2015, mathematics scores for fourth-grade public school students changed by 3 points in both Mississippi and Louisiana. However, this change was only significant for Mississippi. This is because the standard error for the change in scale scores for Mississippi was 1.2, whereas the standard error for Louisiana was 1.6. The larger standard error, and therefore the larger degree of uncertainty around the estimate, factors into the statistical tests that determine whether a difference is statistically significant. This difference in standard errors could reflect the size of the samples in Mississippi and Louisiana, or other factors such as the degree to which the assessed students are representative of the population of their respective states.
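The arithmetic behind this comparison can be sketched with the numbers quoted above. Under a simple two-sided z-test at the .05 level (a simplification of the full procedure NAEP uses), a change is significant when the change divided by its standard error exceeds 1.96 in absolute value:

```python
def change_is_significant(change, se, crit=1.96):
    """Two-sided z-test at the .05 level: can a score change be
    distinguished from zero, given its standard error?"""
    return abs(change / se) > crit

# Numbers from the text: a 3-point change in each state
print(change_is_significant(3, 1.2))  # True:  z = 2.5   (Mississippi)
print(change_is_significant(3, 1.6))  # False: z = 1.875 (Louisiana)
```

With the same 3-point change, Mississippi’s smaller standard error yields z = 2.5, which clears the 1.96 threshold, while Louisiana’s z of about 1.9 falls just short.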

Researchers may also be interested in using standard errors to compute confidence intervals for an estimate. Stay tuned for a future blog where we’ll outline why researchers may want to do this and how it can be accomplished.