Digest of Education Statistics: 2017

NCES 2018-070
January 2018

Appendix A.5. Organization for Economic Cooperation and Development

The Organization for Economic Cooperation and Development (OECD) publishes analyses of national policies and survey data in education, training, and economics in OECD and partner countries. Newer studies include student survey data on financial literacy and on digital literacy.

Education at a Glance

To highlight current education issues and create a set of comparative education indicators that represent key features of education systems, OECD initiated the Indicators of Education Systems (INES) project and charged the Centre for Educational Research and Innovation (CERI) with developing the cross-national indicators for it. The development of these indicators involved representatives of the OECD countries and the OECD Secretariat. Improvements in data quality and comparability among OECD countries have resulted from the country-to-country interaction sponsored through the INES project. The most recent publication in this series is Education at a Glance 2017: OECD Indicators.

Education at a Glance 2017 features data on the 35 OECD countries (Australia, Austria, Belgium, Canada, Chile, the Czech Republic, Denmark, Estonia, Finland, France, Germany, Greece, Hungary, Iceland, Ireland, Israel, Italy, Japan, the Republic of Korea, Latvia, Luxembourg, Mexico, the Netherlands, New Zealand, Norway, Poland, Portugal, the Slovak Republic, Slovenia, Spain, Sweden, Switzerland, Turkey, the United Kingdom, and the United States) and a number of partner countries, including Argentina, Brazil, China, Colombia, Costa Rica, India, Indonesia, Lithuania, the Russian Federation, Saudi Arabia, and South Africa.

The OECD Handbook for Internationally Comparative Education Statistics: Concepts, Standards, Definitions, and Classifications provides countries with specific guidance on how to prepare information for OECD education surveys; facilitates countries’ understanding of OECD indicators and their use in policy analysis; and provides a reference for collecting and assimilating educational data. Chapter 6 of the OECD Handbook for Internationally Comparative Education Statistics contains a discussion of data quality issues. Users should examine footnotes carefully to recognize some of the data limitations.

Further information on international education statistics may be obtained from

Andreas Schleicher
Director for the Directorate of Education and Skills
and Special Advisor on Education Policy
to the OECD's Secretary General
OECD Directorate for Education and Skills
2, rue André Pascal
75775 Paris CEDEX 16
France
andreas.schleicher@oecd.org
https://www.oecd.org

Online Education Database (OECD.Stat)

The statistical online platform of the OECD, OECD.Stat, allows users to access OECD’s databases for OECD member countries and selected nonmember economies. A user can build tables using selected variables and customizable table layouts, extract and download data, and view metadata on methodology and sources.      

Data for educational attainment in this report are pulled directly from OECD.Stat. (Information on these data can be found in chapter A, indicator A1 of annex 3 in Education at a Glance 2017 and accessed at https://www.oecd.org/education/skills-beyond-school/EAG2017-Annex-3.pdf.) However, to support statistical testing, standard errors for some countries had to be estimated and therefore may differ from those published on OECD.Stat. NCES calculated standard errors for all data years for the United States. Standard errors for 2016 for Canada, the Republic of Korea, the Netherlands, Poland, Slovenia, and Turkey, as well as standard errors for the 2016 postsecondary educational attainment data for Japan, were estimated by NCES using a simple random sample assumption. These standard errors are likely to be lower than standard errors that take into account complex sample designs. Lastly, NCES estimated the standard errors for the OECD average using the sum of squares technique.
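The two estimation approaches described above can be sketched in a few lines. The attainment percentage, sample size, and country-level standard errors below are illustrative values, not actual OECD figures:

```python
import math

def srs_standard_error(p, n):
    """Standard error of a proportion p (0-1) under a simple random
    sample assumption with n respondents. This ignores design effects,
    so it tends to understate the error for complex survey designs."""
    return math.sqrt(p * (1 - p) / n)

def average_standard_error(standard_errors):
    """Standard error of an unweighted average of country estimates,
    combined via the sum-of-squares technique: the square root of the
    summed squared standard errors, divided by the number of countries."""
    k = len(standard_errors)
    return math.sqrt(sum(se ** 2 for se in standard_errors)) / k

# Hypothetical example: 45 percent attainment in a sample of 2,000 adults
se = srs_standard_error(0.45, 2000)   # roughly 0.011, i.e., 1.1 percentage points

# Combining hypothetical country-level standard errors into an average
avg_se = average_standard_error([0.011, 0.008, 0.015, 0.010])
```

Under a complex design, the true standard error would typically exceed the simple-random-sample estimate, which is why the text notes the estimated values are likely too low.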

OECD.Stat can be accessed at https://stats.oecd.org/. A user’s guide for OECD.Stat can be accessed at https://stats.oecd.org/Content/themes/OECD/static/help/WBOS%20User%20Guide%20(EN).pdf.

Program for International Student Assessment

The Program for International Student Assessment (PISA) is a system of international assessments organized by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of industrialized countries. PISA focuses on 15-year-olds’ capabilities in reading literacy, mathematics literacy, and science literacy, and also includes measures of general, or cross-curricular, competencies such as learning strategies. PISA emphasizes functional skills that students have acquired as they near the end of compulsory schooling.

PISA is a 2-hour exam. Assessment items include a combination of multiple-choice questions and open-ended questions that require students to develop their own response. PISA scores are reported on a scale that ranges from 0 to 1,000, with the OECD mean set at 500 and the standard deviation set at 100. In 2015, literacy was assessed in science, reading, and mathematics through a computer-based assessment in the majority of countries, including the United States. Education systems could also participate in optional pencil-and-paper financial literacy assessments and computer-based mathematics and reading assessments. In each education system, the assessment is translated into the primary language of instruction; in the United States, all materials are written in English.
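The reporting scale described above amounts to a linear rescaling of standardized proficiency. The sketch below is only an illustration of that arithmetic (actual PISA scores are produced through item response theory scaling, not a direct formula), and the proficiency values are hypothetical:

```python
def to_pisa_scale(z, mean=500.0, sd=100.0):
    """Map a standardized proficiency value z (mean 0, SD 1 across the
    OECD calibration population) onto the PISA reporting scale, which
    fixes the OECD mean at 500 and the standard deviation at 100."""
    return mean + sd * z

to_pisa_scale(0.0)    # 500.0: exactly at the OECD mean
to_pisa_scale(1.25)   # 625.0: 1.25 standard deviations above the mean
to_pisa_scale(-2.0)   # 300.0: 2 standard deviations below the mean
```

On this scale, a gap of 100 points between two groups corresponds to one full standard deviation of the OECD score distribution.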

Forty-three education systems participated in the 2000 PISA; 41 education systems participated in 2003; 57 (30 OECD member countries and 27 nonmember countries or education systems) participated in 2006; and 65 (34 OECD member countries and 31 nonmember countries or education systems) participated in 2009. (An additional nine education systems administered the 2009 PISA in 2010.) In PISA 2012, 65 education systems (34 OECD member countries and 31 nonmember countries or education systems), as well as the states of Connecticut, Florida, and Massachusetts, participated. In the 2015 PISA, 73 education systems (35 OECD member countries and 31 nonmember countries or education systems), as well as the states of Massachusetts and North Carolina and the territory of Puerto Rico, participated.

To implement PISA, each of the participating education systems scientifically draws a nationally representative sample of 15-year-olds, regardless of grade level. In the PISA 2015 national sample for the United States, about 5,700 students from 177 public and private schools were represented. Massachusetts, North Carolina, and Puerto Rico also participated in PISA 2015 as separate education systems. In Massachusetts, about 1,400 students from 48 public schools participated; in North Carolina, about 1,900 students from 54 public schools participated; and in Puerto Rico, about 1,400 students in 47 public and private schools participated.

The intent of PISA reporting is to provide an overall description of performance in reading literacy, mathematics literacy, and science literacy every 3 years, and to provide a more detailed look at each domain in the years when it is the major focus. These cycles will allow education systems to compare changes in trends for each of the three subject areas over time. In the first cycle, PISA 2000, reading literacy was the major focus, occupying roughly two-thirds of assessment time. For 2003, PISA focused on mathematics literacy as well as the ability of students to solve problems in real-life settings. In 2006, PISA focused on science literacy; in 2009, it focused on reading literacy again; and in 2012, it focused on mathematics literacy. PISA 2015 focused on science, as it did in 2006.

Further information on PISA may be obtained from

Patrick Gonzales
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
patrick.gonzales@ed.gov
https://nces.ed.gov/surveys/pisa

Program for the International Assessment of Adult Competencies

The Program for the International Assessment of Adult Competencies (PIAAC) is a cyclical, large-scale study that aims to assess and compare the broad range of basic skills and competencies of adults around the world. Developed under the auspices of the Organization for Economic Cooperation and Development (OECD), it is the most comprehensive international survey of adult skills ever undertaken. Adults were surveyed in 24 participating countries in 2012 and in 9 additional countries in 2014.

PIAAC focuses on what are deemed basic cognitive and workplace skills necessary to adults’ successful participation in 21st-century society and in the global economy. Skills assessed include literacy, numeracy, problem solving in technology-rich environments, and basic reading skills. PIAAC measures the relationships between these skills and other characteristics such as individuals’ educational background, workplace experiences, and occupational attainment. PIAAC was administered on laptop computers or in paper-and-pencil format. In the United States, the background questionnaire was administered in both English and Spanish, and the cognitive assessment was administered only in English.

The 2012 PIAAC assessment for the United States included a nationally representative probability sample of households. This household sample was selected on the basis of a four-stage, stratified area sample: (1) primary sampling units (PSUs) consisting of counties or groups of contiguous counties; (2) secondary sampling units (referred to as segments) consisting of area blocks; (3) housing units containing households; and (4) eligible persons within households. Person-level data were collected through a screener, a background questionnaire, and the assessment.
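The four nested stages above can be sketched as successive random draws from a hierarchy. The toy population below (counties, area blocks, households, persons) is entirely invented for illustration; a real implementation would use stratification and probability-proportional-to-size selection rather than simple draws:

```python
import random

# Invented toy population: county -> area block -> household -> eligible persons
population = {
    "County A": {"Block 1": {"HH 1": ["p1", "p2"], "HH 2": ["p3"]},
                 "Block 2": {"HH 3": ["p4"], "HH 4": ["p5", "p6"]}},
    "County B": {"Block 3": {"HH 5": ["p7"], "HH 6": ["p8"]},
                 "Block 4": {"HH 7": ["p9", "p10"], "HH 8": ["p11"]}},
}

def four_stage_sample(pop, rng, n_psu=1, n_seg=1, n_hh=1, n_person=1):
    """Draw PSUs (stage 1), area-block segments within each PSU (stage 2),
    households within each segment (stage 3), and eligible persons within
    each household (stage 4)."""
    people = []
    for psu in rng.sample(sorted(pop), n_psu):
        for seg in rng.sample(sorted(pop[psu]), n_seg):
            for hh in rng.sample(sorted(pop[psu][seg]), n_hh):
                persons = pop[psu][seg][hh]
                people += rng.sample(persons, min(n_person, len(persons)))
    return people

sample = four_stage_sample(population, random.Random(0), n_psu=2, n_seg=2, n_hh=2)
```

Drawing every unit at each stage, as in the call above, yields one person from each of the eight households; real designs instead subsample at every stage.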

Based on the screener data, 6,100 U.S. respondents ages 16 to 65 were selected to complete the 2012 background questionnaire and the assessment; 4,898 actually completed the background questionnaire. Of the 1,202 respondents who did not complete the background questionnaire, 112 were unable to do so because of a literacy-related barrier—either the inability to communicate in English or Spanish or a mental disability. Twenty others were unable to complete the questionnaire due to technical problems. The final response rate for the background questionnaire, which included respondents who completed it and respondents who were unable to complete it because of a language problem or mental disability, was 82.2 percent weighted. The overall weighted response rate for the household sample—the product of the component response rates—was 70.3 percent.
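The overall rate reported above is the product of the component response rates. In the sketch below, the 82.2 percent background questionnaire rate comes from the text, but the screener-stage rate is a hypothetical value chosen only to show how the product arithmetic works:

```python
def overall_response_rate(component_rates):
    """Overall weighted response rate computed as the product of the
    stage-level component response rates, each expressed on a 0-1 scale."""
    product = 1.0
    for rate in component_rates:
        product *= rate
    return product

# 0.822 is the background questionnaire rate from the text; 0.855 is a
# hypothetical earlier-stage rate, chosen so the product lands near the
# reported overall rate of 70.3 percent.
overall = overall_response_rate([0.855, 0.822])   # roughly 0.703
```

Because the overall rate multiplies the stage-level rates together, even modest nonresponse at each stage compounds into a noticeably lower overall rate.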

The 2014 PIAAC supplement repeated the 2012 administration of PIAAC to an additional sample of U.S. adults in order to enhance the 2012 sample. It included a sample of participants from different households in the PSUs from the 2012 sample.

Key to PIAAC’s value is its collaborative and international nature. In the United States, NCES has consulted extensively with the Department of Labor in the development of the survey, and staff from both agencies are co-representatives of the United States in PIAAC’s international governing body. Internationally, PIAAC has been developed through the collaboration of OECD staff and participating countries’ representatives from their ministries or departments of education and labor. Through this cooperative effort, all participating countries follow the quality assurance guidelines set by the OECD consortium and closely follow all agreed-upon standards set for survey design, assessment implementation, and reporting of results.

Further information on PIAAC may be obtained from

Holly Xie
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
holly.xie@ed.gov
https://nces.ed.gov/surveys/piaac/