NCES Blog

National Center for Education Statistics

Celebrate National Library Month: The Future of Libraries

By Christopher Cody and Bao Le

April is National Library Month! Did you know that NCES collects data on libraries?

While libraries have traditionally provided the public with a physical space for learning and accessing resources and information, the role of the library has expanded with advances in technology. With the dawn of the digital age, libraries have been working to meet the challenges of expanding access, learning opportunities, and overall public connection.[i] Academic libraries in particular, which are libraries located within postsecondary institutions, have embraced technological improvements, as shown in data collected by the National Center for Education Statistics (NCES).

The Academic Libraries (AL) Survey has a rich history at NCES, dating back to 1966, when the survey was first conducted on a three-year cycle. The survey has moved around a bit over the decades but is now fully housed in the Integrated Postsecondary Education Data System (IPEDS) and is administered annually.

IPEDS’s AL Survey offers an abundance of data to track the advancement of libraries, including data on topics such as collections/circulations, expenses, and interlibrary services. These data show a clear progression of libraries into the digital age. Here are some highlights:

  • In 1996, “80 percent of institutions with an academic library had access from within the library to an electronic catalog of the library’s holdings, 81 percent had internet access within the library.”[ii]
  • In 1996, about 40 percent had library reference service by e-mail. Just 10 years later, 72 percent of academic libraries provided library reference service by e-mail or the internet.[iii]
  • In 2006, only 6 percent of all academic library collections were e-books. By 2014-15, about 23 percent of all collections were e-books and 31 percent of the total library collections were from electronic and digital sources (e-books, e-media, and databases) as shown in Enrollment and Employees in Postsecondary Institutions, Fall 2014; and Financial Statistics and Academic Libraries, Fiscal Year 2014: First Look (Provisional Data).
  • In 2014-15, postsecondary institutions housed approximately 1.1 billion items in physical library collections (books and media) and about 521 million items in electronic library collections (digital/electronic books, databases, and digital electronic media).

 

Over the past 20 years, libraries have evolved to ensure information is accessible to the public through the latest mediums of technology.


So in honor of National Library Month, take advantage of the abundant historical academic and school library data available on the NCES Library Statistics Program page. More recent academic library data can be accessed through the Use the Data portal on the IPEDS website.

 

[i] Clark, L., Levien, R. E., Garmer, A. K., and Figueroa, M. (2015). Re-Thinking the Roles of U.S. Libraries. In D. Bogart and A. Inouye (Eds.), Library and Book Trade Almanac: formerly The Bowker Annual 2015, 60th Edition (pg. 3-22). Medford, NJ: Information Today Inc.

[ii] Cahalan, M. W., Justh, N. M., and Williams, J. W. (1999). Academic Libraries: 1996 (NCES 2000-326). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.

[iii] Holton, B., Hardesty, L., and O’Shea, P. (2008). Academic Libraries: 2006 (NCES 2008-337). National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education. Washington, DC.

Statistical concepts in brief: How and why does NCES use sample surveys?

By Lauren Musu-Gillette

EDITOR’S NOTE: This is the first in a series of blog posts about statistical concepts that NCES uses as a part of its work. 

The National Center for Education Statistics (NCES) collects survey statistics in two main ways—universe surveys and sample surveys.

Some NCES statistics, such as the number of students enrolled in public schools or postsecondary institutions, come from administrative data collections. These data represent a nearly exact count of a population because information is collected from all potential respondents (e.g., all public schools in the U.S.). These types of data collections are also known as universe surveys because they involve the collection of data covering all known units in a population. The Common Core of Data (CCD), the Private School Universe Survey (PSS), and the Integrated Postsecondary Education Data System (IPEDS) are the key universe surveys collected by NCES.

While universe surveys provide a wealth of important data on education, data collections of this magnitude are not realistic for every potential variable or outcome of interest to education stakeholders. That is why, in some cases, we use sample surveys, which select smaller subgroups that are representative of a broader population of interest. Using sample surveys can reduce the time and expense that would be associated with collecting data from all members of a particular population of interest. 


Example of selecting a sample from a population of interest

The example above shows a simplified version of how a representative sample can be drawn from a population. The population shown here has 60 people: 2/3 males and 1/3 females. A smaller sample of 6 individuals is drawn from this larger population but remains representative, with 2/3 males and 1/3 females included in the sample.
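The proportions above can be reproduced with a tiny stratified-sampling sketch. The population of 60, the 2/3-to-1/3 split, and the sample size of 6 come from the illustration; the code itself is a hypothetical sketch, not an NCES procedure:

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# Illustrative population of 60 people: 40 males (2/3) and 20 females (1/3)
population = [(i, "M") for i in range(40)] + [(i, "F") for i in range(40, 60)]

# Stratified sample of 6: draw from each group in proportion to its
# population share, so the sample preserves the 2/3-to-1/3 split.
males = [p for p in population if p[1] == "M"]
females = [p for p in population if p[1] == "F"]
sample = random.sample(males, 4) + random.sample(females, 2)

male_share = sum(1 for _, sex in sample if sex == "M") / len(sample)
print(male_share)  # 2/3 of the sample is male, mirroring the population
```

Real NCES designs are far more elaborate (multistage selection, unequal selection probabilities), but the core idea is the same: fix each group's share of the sample in advance so that the sample mirrors the population.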


For instance, the National Postsecondary Student Aid Study (NPSAS), Baccalaureate and Beyond (B&B), and the Beginning Postsecondary Students Longitudinal Study (BPS) select institutions from the entire universe of institutions contained in the IPEDS database. Then, a sample of students within those institutions is selected for inclusion in the study.

Schools and students are selected so that they are representative of the entire population of postsecondary institutions and students. Some types of institutions or schools can be sampled at higher rates than their representation in the population to ensure additional precision for survey estimates of that population. Through scientific design of the sample of institutions and appropriate weighting of the sample respondents, data from these surveys are nationally representative without requiring that all schools or all students be included in the data collection.
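Oversampling plus weighting can be illustrated with a toy calculation (the two strata, their sizes, and the outcome means below are entirely hypothetical, not taken from any actual NCES design). Each respondent's weight is the number of population members that respondent represents, i.e., the inverse of the selection probability:

```python
# Hypothetical example: two strata of institutions. Stratum B is small but
# oversampled for precision; weighting restores population proportions.

# (population size, sample size, mean outcome observed in the sample)
strata = {
    "A": {"pop": 900, "n": 30, "mean": 50.0},
    "B": {"pop": 100, "n": 20, "mean": 80.0},  # oversampled: 20% vs 3.3% in A
}

# Weight per respondent = population size / sample size (inverse selection prob.)
for s in strata.values():
    s["weight"] = s["pop"] / s["n"]

# The unweighted mean over-represents the oversampled stratum B...
total_n = sum(s["n"] for s in strata.values())
unweighted = sum(s["n"] * s["mean"] for s in strata.values()) / total_n

# ...while the weighted mean recovers the population-level estimate.
total_pop = sum(s["pop"] for s in strata.values())
weighted = sum(s["n"] * s["weight"] * s["mean"] for s in strata.values()) / total_pop

print(round(unweighted, 1))  # 62.0 — biased toward the oversampled stratum
print(round(weighted, 1))    # 53.0 — matches the 90/10 population mix
```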

Many NCES surveys are sample surveys. For example, NCES longitudinal surveys include nationally representative data for cohorts of students in the elementary grades (Early Childhood Longitudinal Study), the middle grades (Middle Grades Longitudinal Study), high school (High School Longitudinal Study), and college (Beginning Postsecondary Students). The National Household Education Surveys Program gathers information on parental involvement in education, early childhood programs, and other topics, using household residences rather than schools as the population. The National Postsecondary Student Aid Study gathers descriptive information on college students and their participation in student aid programs. Additionally, characteristics of teachers and principals, and of the schools in which they teach, are obtained through the Schools and Staffing Survey and the National Teacher and Principal Survey.

By taking samples of the population of interest, NCES is able to study trends on a national level without needing to collect data from every student or every school. However, the structure and the size of the sample can affect the accuracy of the results for some population groups. This means that statistical testing is necessary to make inferences about differences between groups in the population. Stay tuned for future blogs about how this testing is done, and how NCES provides the data necessary for researchers or the public to do testing of their own.
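Why such testing matters can be seen in a minimal two-proportion comparison. The counts below are made up for illustration, and the textbook formula used here ignores the design effects that a proper complex-survey analysis would account for:

```python
import math

# Hypothetical example: compare two group proportions from a sample survey.
# Group 1: 120 of 400 sampled students report an outcome; Group 2: 90 of 250.
n1, x1 = 400, 120
n2, x2 = 250, 90
p1, p2 = x1 / n1, x2 / n2  # 0.30 vs. 0.36 — but is the gap real?

# Two-proportion z-test: pooled proportion and standard error
p_pool = (x1 + x2) / (n1 + n2)
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

# Two-sided p-value from the standard normal CDF
p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
print(round(z, 2), round(p_value, 3))  # z ≈ -1.59, p ≈ 0.11
```

Here the 6-percentage-point gap is not statistically significant at the 0.05 level: a difference this size could plausibly arise from sampling variability alone, which is exactly why estimates from sample surveys require formal testing before drawing conclusions.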

New Release: Forum Guide to Elementary/Secondary Virtual Education Data

By The National Forum on Education Statistics Virtual Education Working Group

Rapid advancements and innovations in virtual education are providing education agencies, educators, and students with new opportunities for teaching and learning. That growth increases the need for accurate, high-quality data about virtual education that provides a full picture of successes and challenges. A new resource released earlier this month can help with this important work.

In recent years, virtual education has become an integral part of K-12 education, and nearly every student is exposed to virtual learning in some context—whether as a single aspect of a traditional course or program, in an entirely virtual program, or in some combination of traditional and virtual learning.

Virtual education is often a core aspect of curricula and class instruction, and students and teachers are increasingly adept at integrating lectures, lessons, and group work delivered via computers, tablets, and other devices into day-to-day teaching and learning. Moreover, many students and teachers no longer distinguish between virtual and traditional learning—the technology and tools used in virtual education are familiar to them and are no more novel than a pencil.

Despite widespread interest in enhancing and expanding virtual teaching and learning, many state education agencies and school districts do not yet have the ability to collect accurate, high-quality virtual education data. Some organizations have not yet specified the data they want to collect, while others have not developed reliable processes for gathering and managing data. The prevalence of virtual education, the increasing diversity in virtual education opportunities, and the rapid pace of technological change require new ways of thinking about how to modify data elements and systems to effectively identify, collect, and use virtual education data to inform and improve education.

Local and state members of the National Forum on Education Statistics (the Forum) identified this problem and established a Virtual Education Working Group, tasked with developing a resource to assist education agencies as they: 1) consider the impact of virtual education on established data elements and methods of data collection, and 2) address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education. On February 4th, 2016, the Forum Guide to Elementary/Secondary Virtual Education Data was released.

In the document, the Forum Working Group members identify topics that support virtual education data, such as the organizational structure of virtual education, user experiences, challenges in collecting virtual education data, policy implications, and privacy and confidentiality protections. The document also includes common data elements for K-12 virtual and blended learning.

The working group also identifies elements that exist for traditional schools that are useful for virtual education. Finally, the Guide provides real-world examples and common practices implemented by state departments, local districts, and schools to modify their data systems and add elements that better reflect the needs unique to virtual education. 

As virtual education continues to expand in elementary/secondary school systems, education data collection and reporting systems need to evolve as well. It is important for all virtual education stakeholders – teachers, parents, education administrators, data systems administrators, and policymakers – to come together and creatively address the challenges of building a sound data infrastructure that considers the unique aspects of virtual education.

It is our hope that the Forum’s new guide can be a helpful tool in that process.

 

About the National Forum on Education Statistics

The work of the National Forum on Education Statistics is a key aspect of the National Cooperative Education Statistics System. The Cooperative System was established to produce and maintain, with the cooperation of the states, comparable and uniform education information and data that are useful for policymaking at the federal, state, and local levels. To assist in meeting this goal, the National Center for Education Statistics (NCES), within the Institute of Education Sciences (IES) of the U.S. Department of Education, established the Forum to improve the collection, reporting, and use of elementary and secondary education statistics. The Forum addresses issues in education data policy, sponsors innovations in data collection and reporting, and provides technical assistance to improve state and local data systems.

Members of the Forum establish working groups to develop best practice guides in data-related areas of interest to federal, state, and local education agencies. They are assisted in this work by NCES, but the content comes from the collective experience of working group members who review all products iteratively throughout the development process. After the working group completes the content and reviews a document a final time, publications are subject to examination by members of the Forum standing committee that sponsors the project. Finally, Forum members (approximately 120 people) review and formally vote to approve all documents prior to publication. NCES provides final review and approval prior to online publication.

The information and opinions published in Forum products do not necessarily represent the policies or views of the U.S. Department of Education, IES, or NCES. For more information about the Forum, please visit http://www.nces.ed.gov/forum or contact Ghedam Bairu at Ghedam.bairu@ed.gov.

 

Examining the workforce skills of U.S. unemployed, young, and older adults: Updated data from the PIAAC

By Stephen Provasnik and Holly Xie

Educational attainment is one of the most common measures of workforce preparation and is certainly an important indicator of whether someone is job-ready. But this one metric does not fully capture the variety of skills that can be important to potential employers. One way that NCES measures the basic workplace skills and abilities of U.S. adults is through the Program for the International Assessment of Adult Competencies (PIAAC).[1] 

PIAAC includes a number of assessments designed to evaluate real-world skills in three important areas:

  • Literacy: The literacy assessment measures the extent to which respondents can understand, evaluate, use, and engage with written text in different contexts, such as home, work, and community;
  • Numeracy: The numeracy assessment evaluates respondents’ ability to access, use, interpret, and communicate mathematical information deemed important in the workplace; and
  • Problem solving in technology-rich environments: This skill area assesses respondents’ use of digital technology, communication tools, and networks to gather and evaluate information, communicate with others, and perform practical tasks.

The newly released Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014 describes the workforce skill levels of unemployed adults (age 16-65), young adults (age 16-34), and older adults (age 66-74). The report, along with additional data on the NCES website, includes results from the assessments described above, as well as information about respondents’ educational background, work history, the skills they use on the job and at home, their civic engagement, and their health and well-being.

The PIAAC results show a connection between skills and employment. For instance, more than 75 percent of unemployed adults (age 16-65) had attained a high school credential or less. Roughly one-third of these adults (with a high school credential or less) scored at the lowest levels in literacy and about half scored at the lowest levels in numeracy. Overall, adults who were unemployed or out of the labor force performed worse than their employed peers in all areas of the PIAAC.


Percentage of adults age 16 to 65 at each level of proficiency on the PIAAC numeracy scale, by employment status: 2012 and 2014[1]

[1] United States data are the U.S. PIAAC 2012/2014 data. The PIAAC international average is calculated from the U.S. PIAAC 2012/2014 data and international data from 2012 for all other countries shown in this report. Country- and region-specific results are available at http://nces.ed.gov/surveys/piaac/results/makeselections.aspx.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Program for the International Assessment of Adult Competencies (PIAAC), Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014: First Look


Among young adults age 16-34, the higher the level of education completed, the larger the percentages of young adults at the highest proficiency levels in all three skill areas, and the smaller the percentages at the lowest levels. This pattern was not seen among older U.S. adults (age 66-74). Among older U.S. adults, there was no measurable difference in the percentage performing at the highest levels in literacy or numeracy between those who had a bachelor’s degree and those who had a graduate or professional degree.


Percentage of adults age 66 to 74 at each level of proficiency on the PIAAC literacy scale, by highest level of educational attainment: 2014

# Rounds to zero.
‡ Reporting standards not met. Sample size insufficient to permit a reliable estimate.
SOURCE: U.S. Department of Education, National Center for Education Statistics, Program for the International Assessment of Adult Competencies (PIAAC), Skills of U.S. Unemployed, Young, and Older Adults in Sharper Focus: Results from the Program for the International Assessment of Adult Competencies (PIAAC) 2012/2014: First Look.


Many more data are available in the full report. Additional PIAAC data will be released later this year, including information about adults who were incarcerated.

For more information, check out this video:

 


[1] The PIAAC survey is coordinated internationally by the OECD; NCES implements PIAAC in the United States. PIAAC is a household survey administered by trained data collectors to a nationally representative sample of adults ages 16 through 65 in each country, in the official language(s) and, in most cases, in respondents’ homes on a laptop computer. PIAAC was first conducted in 2011-2012, and results were released in October 2013 with data from 23 countries, including the United States.

The findings reported here are based on data from the first round of PIAAC and a second round conducted in 2013-2014 in the United States to collect additional data on key subgroups of the adult population. To learn more about the U.S. administration and reporting of PIAAC, as well as related data tools, see https://nces.ed.gov/surveys/piaac/.

 

Diversity in home languages: Examining English learners in U.S. public schools

By Joel McFarland

More than 4.9 million English learners (EL) were enrolled in U.S. public elementary and secondary schools during the 2013-14 school year, representing just over 10 percent of the total student population.

Recently published data from the U.S. Department of Education’s EDFacts data collection shed light on the linguistic diversity of EL students, as well as the distribution of EL students across grades. EDFacts data are drawn from administrative records maintained by state education agencies and provide a detailed picture of the total population of K-12 public school students in the United States. (Depending on the organization or publication, ELs can also be known as English language learners (ELL) or Limited English Proficient (LEP) students.)

States reported that Spanish was the home language of nearly 3.8 million EL students in 2013-14, which accounts for 76.5 percent of all EL students and nearly 8 percent of all public K-12 students. Arabic and Chinese were the next most commonly spoken home languages, reported for approximately 109,000 and 108,000 students, respectively.

It may surprise some to learn that English (91,700 students) was the fourth most commonly reported home language. This may reflect students who live in multilingual households, as well as students who were adopted from other countries and raised speaking another language but now live in English-speaking households. Overall, there were 38 different home languages reported for 5,000 or more students.


Ten most commonly reported home languages of English learner (EL) students

SOURCE: U.S. Department of Education, National Center for Education Statistics, EDFacts file 141, Data Group 678; Common Core of Data, "State Nonfiscal Survey of Public Elementary and Secondary Education." See Digest of Education Statistics 2015, table 204.27.


EDFacts data also allow us to examine the distribution of EL students across grade levels. Data from the 2013-14 school year show that a greater percentage of students in lower than in upper grades were identified as EL students. For example, 17.4 percent of kindergarteners were identified as EL students, compared to 8.0 percent of 6th graders and 6.4 percent of 8th graders. Among 12th graders, only 4.6 percent of students were identified as ELs. 


Percentage of public K-12 students identified as English learners (ELs), by grade level: 2013-14

SOURCE: U.S. Department of Education, National Center for Education Statistics, EDFacts file 141, Data Group 678; Common Core of Data, "State Nonfiscal Survey of Public Elementary and Secondary Education." See Digest of Education Statistics 2015, table 204.27.


More home language and grade-level data on English learners can be found in the Digest of Education Statistics. Additional data on EL students, including state-level data and data by locale (e.g., city, suburban, town, and rural), can be found in the Condition of Education.

Interested in recent research on how EL students are identified and served? Check out the Inside IES Research blog.