Program for International Student Assessment (PISA)
Indicators 15 and 16 are based on data collected as part of the Program for International Student Assessment (PISA). PISA is sponsored by the Organization for Economic Co-operation and Development (OECD), an intergovernmental organization of 34 industrialized countries that serves as a forum for member countries to cooperate in research and policy development on social and economic topics of common interest.
PISA seeks to represent the overall yield of learning for 15-year-olds. PISA assumes that by age 15, young people have had a series of learning experiences, both in and out of school, that allow them to perform at particular levels in reading, mathematics, and science. Formal education plays a major role in student performance, but other factors, such as learning opportunities at home, also contribute. PISA's results provide an indicator of the overall performance of a country's educational system, as well as information about other factors that influence performance (e.g., hours of instructional time). By assessing key knowledge and skills near the end of compulsory schooling, as students approach an important transition point for education and work, PISA aims to show how well equipped 15-year-olds are for their future lives based on what they have learned up to that point.
PISA was first implemented in 2000 and is based on a 3-year cycle. PISA 2009 was the fourth cycle of the assessment. In each PISA cycle the capabilities of 15-year-olds in reading literacy, mathematics literacy, and science literacy are assessed. However, in each assessment year, PISA provides a detailed examination for one of the three subjects (referred to as a major domain) and a basic examination of the other two subjects (referred to as minor domains). The 2000 assessment focused on reading literacy; the 2003 assessment focused on mathematics literacy; the 2006 assessment focused on science literacy; and the 2009 assessment again focused on reading literacy.
In 2009, 65 countries and other education systems participated in PISA: the 34 OECD countries, 26 non-OECD countries, and 5 other education systems. "Other education systems" are non-national entities, such as Shanghai-China. To implement PISA, each participating country and education system selected a representative sample of 15-year-olds. The PISA 2009 guidelines specified that a minimum of 4,500 students from a minimum of 150 schools was required in each country and education system to meet the sample threshold for participation. The guidelines also specified that within schools, an equal-probability sample of 35 students was to be selected unless fewer than 35 students age 15 were available (in which case all students were selected). PISA 2009 standards required that students in the sample be 15 years and 3 months to 16 years and 2 months old at the beginning of the testing period.
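The within-school selection rules above can be sketched in code. The following is an illustrative sketch only: the testing-period start date, the inclusivity of the age bounds, and the roster format are assumptions, not part of the official sampling specification.

```python
import random
from datetime import date

# Illustrative sketch of PISA 2009's within-school student sampling:
# an equal-probability sample of 35 age-eligible students per school,
# or all of them if fewer than 35 are eligible. The start date and
# inclusive age bounds are assumptions for illustration.

TEST_START = date(2009, 9, 21)  # notional start of the testing period
SAMPLE_SIZE = 35                # students sampled per school

def is_age_eligible(birth_date: date) -> bool:
    """True if the student is 15 years 3 months to 16 years 2 months
    old at the start of the testing period (bounds assumed inclusive)."""
    months = (TEST_START.year - birth_date.year) * 12 \
             + (TEST_START.month - birth_date.month)
    if TEST_START.day < birth_date.day:  # birthday-in-month not yet reached
        months -= 1
    return 15 * 12 + 3 <= months <= 16 * 12 + 2

def sample_students(roster: list[dict]) -> list[dict]:
    """Equal-probability sample of eligible students from one school."""
    eligible = [s for s in roster if is_age_eligible(s["birth_date"])]
    if len(eligible) <= SAMPLE_SIZE:
        return eligible  # fewer than 35 eligible: take them all
    return random.sample(eligible, SAMPLE_SIZE)  # each equally likely
```

Because `random.sample` draws without replacement and uniformly, every eligible student in a school has the same probability of selection, which is the equal-probability property the guidelines require.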
In the United States, the PISA 2009 assessment was administered from September 21, 2009, to November 19, 2009. A total of 5,233 15-year-old students from 165 schools in the United States participated in the assessment.
PISA 2009 was developed by international experts and a consortium of test developers, with items submitted and reviewed by representatives of each country for possible bias and for relevance to PISA's goals. The final assessment consisted of 102 reading items, 36 mathematics items, and 52 science items allocated to 13 test booklets. Each booklet was made up of 4 test clusters, and the average number of items per cluster was 15 for reading, 12 for mathematics, and 17 for science. Each student completed a 2-hour paper-and-pencil assessment. During the assessment, all students answered reading items, but only some students, depending on the test booklet they received, answered mathematics and/or science items. In addition to the cognitive assessment, students received a 30-minute questionnaire designed to collect information about their backgrounds, attitudes, and experiences in school. Principals of schools where PISA was administered were also given a 30-minute questionnaire to provide information about their schools. For more detailed information on sampling, administration, response rates, and other technical issues related to PISA data, see http://nces.ed.gov/pubs2011/2011004.pdf.
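The rotated booklet design described above can be sketched as follows. The rotation rule and starting point below are illustrative assumptions; the actual PISA allocation of item clusters to the 13 booklets follows a balanced design documented in the PISA technical report.

```python
from itertools import cycle

# Illustrative sketch of "spiraled" booklet assignment: the 13 test
# booklets are handed out in rotation within a school so that each
# booklet is used roughly equally often. The numbering and rotation
# rule are assumptions for illustration only.

NUM_BOOKLETS = 13

def assign_booklets(num_students: int, start: int = 1) -> list[int]:
    """Assign booklet numbers 1..13 to students in rotation,
    beginning at a chosen starting booklet."""
    booklets = cycle(range(1, NUM_BOOKLETS + 1))
    for _ in range(start - 1):  # advance to the starting booklet
        next(booklets)
    return [next(booklets) for _ in range(num_students)]
```

With a sampled class of 35 students, each of the 13 booklets is assigned two or three times, which is why every student answers reading items while only subsets answer mathematics and/or science items.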
The PISA 2009 Assessment Framework: Key Competencies in Reading, Mathematics, and Science, developed by the OECD in a collaborative effort of the PISA Governing Board and an international consortium, guided the design of the PISA 2009 assessment. The framework acts as a blueprint for the assessment, outlining what should be assessed.
Reading literacy in PISA 2009 is defined as "understanding, using, reflecting on, and engaging with written texts in order to achieve one's goals, to develop one's knowledge and potential, and to participate in society." Reading literacy is built on three "task characteristics": (1) situation, which distinguishes the range of contexts or purposes for which reading takes place; (2) text, the range of materials that are read; and (3) aspect, which consists of the mental strategies, approaches, or purposes that readers use to negotiate their way into, around, and between texts.
The three reading literacy subscales (access and retrieve, integrate and interpret, and reflect and evaluate) were derived from three aspect categories: (1) access and retrieve, which includes navigating the information space provided to locate and retrieve one or more distinct pieces of information; (2) integrate and interpret, which includes developing an understanding of the coherence of the text and making meaning from something that is not stated; and (3) reflect and evaluate, which includes drawing upon knowledge, ideas, or attitudes beyond the text in order to relate the information provided within the text to one's own conceptual and experiential frame of reference.
Mathematics literacy in PISA 2009 is defined as "an individual's capacity to identify and understand the role that mathematics plays in the world, to make well-founded judgments and to use and engage with mathematics in ways that meet the needs of that individual's life as a constructive, concerned and reflective citizen."
Science literacy in PISA 2009 is defined as "scientific knowledge and use of that knowledge to identify questions, to acquire new knowledge, to explain scientific phenomena, and to draw evidence-based conclusions about science-related issues, understanding of the characteristic features of science as a form of human knowledge and inquiry, awareness of how science and technology shape our material, intellectual, and cultural environments, and willingness to engage in science-related issues, and with the ideas of science, as a reflective citizen." Details on the PISA 2009 framework and the reading, science, and mathematics literacy competencies can be found at http://www.oecd.org/dataoecd/11/40/44455820.pdf.
The PISA 2000 and 2009 OECD averages used in the analysis of trends in reading literacy scores over time are based on the averages of the 27 OECD countries with comparable data for 2000 and 2009. As a result, the reading literacy OECD average score for PISA 2000 differs from previously published reports, and the reading literacy OECD average score for PISA 2009 differs from the OECD average score used for analyses other than trend comparisons. The seven current OECD members not included in the OECD average for trend analysis are the Slovak Republic and Turkey, which joined PISA in 2003; Estonia and Slovenia, which joined PISA in 2006; Luxembourg, which experienced substantial changes in its assessment conditions between 2000 and 2003; and the Netherlands and the United Kingdom, which did not meet the PISA response rate standards in 2000. Though reading literacy scores can be compared across all PISA administration cycles (2000, 2003, 2006, and 2009), the U.S. averages in 2000 and 2009 are compared with OECD average scores in 2000 and 2009 because reading literacy was the major domain assessed in those years.
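As described above, a trend OECD average is built from the national average scores of only those countries with comparable data in both years. A minimal sketch, assuming (as the passage indicates) that the OECD average is an unweighted mean of country averages; the country names and scores are placeholders:

```python
# Minimal sketch of forming a trend OECD average: an unweighted mean
# of national average scores, restricted to countries with comparable
# data in both cycles. All values here are placeholders, not PISA data.

def oecd_average(country_means: dict[str, float]) -> float:
    """Unweighted mean of national average scores."""
    return sum(country_means.values()) / len(country_means)

def trend_average(means: dict[str, float], comparable: set[str]) -> float:
    """OECD average over only the countries comparable across cycles."""
    return oecd_average({c: m for c, m in means.items() if c in comparable})
```

Restricting both the 2000 and 2009 averages to the same 27 countries is what makes the two averages comparable, and also why they differ from the all-member averages published elsewhere.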
The PISA mathematics framework was revised in 2003. Because of changes in the framework, it is not possible to compare mathematics learning outcomes from PISA 2000 with those from PISA 2003, 2006, and 2009. The PISA science framework was revised in 2006. Because of changes in the framework, it is not possible to compare science learning outcomes from PISA 2000 and 2003 with those from PISA 2006 and 2009. Details on the changes to PISA since 2000 can be found at http://www.oecd.org/document/61/0,3746,en_32252351_32235731_46567613_1_1_1_1,00.html.
The PISA 2003 and 2009 OECD averages used in the analysis of trends in mathematics literacy scores over time are based on the 29 OECD countries with comparable data for 2003 and 2009. The five current members not included in the OECD average for trend analysis are Chile, Estonia, Israel, and Slovenia, which did not participate in 2003, and the United Kingdom, which did not meet PISA response rate standards for the 2003 assessment.
For science literacy trends, all 34 OECD countries are used.
The OECD excluded the data for Austria from the trend analysis in its report (PISA 2009 Results: Learning Trends: Changes in Student Performance Since 2000 (Volume V), available at http://www.pisa.oecd.org) because of a concern over a data collection issue in 2009; however, after consultation with Austrian officials, NCES kept the Austrian data in the U.S. trend reporting.
For more information on PISA, see http://nces.ed.gov/Surveys/PISA.