Commissioner of the National Center for Education Statistics
The Release of the Program for International Student Assessment (PISA)
December 6, 2004
Today the National Center for Education Statistics is releasing results on the performance of students in the United States on an international study, the Program for International Student Assessment (PISA). PISA, conducted every three years, is an assessment of 15-year-olds in reading literacy, mathematics literacy, and science literacy. While each area is studied every three years in each PISA data collection, one area is the major domain, involving more items and more detailed results. In 2000, reading literacy was the major domain; in 2003, it was mathematics literacy. In 2006, the focus will be on science literacy.
The results presented today include U.S. student performance on mathematics literacy and problem solving, and, in less detail, on reading literacy, science literacy, and differences by selected student characteristics.
Features of the Assessment
The use of the term “literacy” with each of the three areas is meant to reflect PISA’s emphasis on applied knowledge and skills. PISA strives to measure how well students are able to apply their knowledge and skills and use them to solve problems in a real-life context. PISA is not intended to report on how well students have mastered a particular curriculum or specific facts or formulas. Rather, it is meant to show what students’ cumulative knowledge and skill levels are at age 15. The age of 15 was chosen to give a measure of the “yield” of student learning at the latest possible age when students are still in school on a compulsory basis in all participating countries, and thereby provide an indicator of where countries are in terms of the outcomes of learning at the last mandatory stage of education. By focusing on “literacy,” PISA encompasses learning that may occur outside of school as well as in the classroom.
Comparative analyses of the content of PISA with that of the National Assessment of Educational Progress (NAEP) and the Trends in International Mathematics and Science Study (TIMSS) show that PISA demands more of students: PISA questions typically ask students to write out their responses, while NAEP and TIMSS questions more often use a multiple-choice format. In mathematics, for example, PISA also placed a much stronger content focus on the "data" area (which often involves using charts or graphs); this fits with PISA's emphasis on using materials in a real-world context.
In addition to assessing reading literacy, mathematics literacy, and science literacy, one of PISA’s other major goals is to begin to measure cross-curricular competencies, or skills, that are not tied to any one specific curricular area. In 2003, problem-solving skills were assessed as part of PISA, and those results are reported on here.
PISA is sponsored internationally by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of 30 highly industrialized countries. In 2003, 41 countries participated, including all 30 OECD countries.
The U.S. report on PISA shows comparisons for 39 of the 41 participating countries. One country (the United Kingdom) was excluded because of low response rates, and Brazil's data were excluded because they were received too late for inclusion in this report. The U.S. report focuses on results for 2003, but there is some discussion of changes in performance from 2000; those comparisons are based only on those OECD countries that participated in both 2000 and 2003.
How PISA Was Conducted
The samples in each PISA participating country were selected to be representative of all 15-year-olds in the country. The U.S. sample included both public and private schools. Students took a 2-hour assessment and filled out a questionnaire about their background, experiences, and attitudes. A total of 5,456 students and 262 schools participated in 2003. More information about how the assessment was developed, quality assurance, and methodology is included in the technical notes of the U.S. report on PISA. An additional source of information will be the PISA 2003 technical report that will be published by the OECD and should be available in early 2005.
U.S. Performance in Mathematics Literacy
Performance on the Combined Mathematics Literacy Scale
PISA’s major focus in 2003 was mathematics literacy. The mathematics literacy assessment was based on several sub-areas—space and shape, change and relationships, quantity, and uncertainty. Scores in PISA are devised so that the average is 500 and the standard deviation is 100, with about two-thirds of the students scoring between 400 and 600 on any given scale.
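The claim that about two-thirds of students fall between 400 and 600 follows from the scale's construction, assuming scores are approximately normally distributed. A minimal sketch (using only the standardized mean and standard deviation stated above):

```python
from statistics import NormalDist

# PISA scales are standardized to an OECD mean of 500 and a
# standard deviation of 100, per the description above.
pisa = NormalDist(mu=500, sigma=100)

# Expected share of students scoring between 400 and 600
# (within one standard deviation of the mean), assuming an
# approximately normal score distribution.
share = pisa.cdf(600) - pisa.cdf(400)
print(round(share, 3))  # ~0.683, i.e., about two-thirds
```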
The U.S. average overall score of 483 in mathematics literacy is below the OECD average score of 500. Twenty of the other 28 OECD countries and 3 non-OECD countries (Hong Kong-China, Liechtenstein, and Macao-China) outscored the United States. This is somewhat different from the reading literacy results for 2000, where the United States performed at the OECD average level.
Performance of U.S. Students on the Mathematics Literacy Subscales: Space and Shape, Change and Relationships, Quantity, and Uncertainty
Although not the terms commonly used in schools, each of the sub-areas used in PISA has a relationship to the typical courses taught in U.S. schools. For example, space and shape is most closely related to geometry, and change and relationships to algebra. The sample items in the report show examples of what each subscale measured. PISA uses these sub-areas to evaluate the application of knowledge and skills to problems in a real-life context. In each of the four sub-areas, the United States scored below the OECD average. Of the other 38 countries, 24 outperformed the United States on the space and shape subscale, 21 on the change and relationships subscale, 26 on the quantity subscale, and 19 on the uncertainty subscale.
Because mathematics literacy was a minor area or domain in 2000, only two of the sub-areas were measured then: change and relationships and space and shape. Differences between 2000 and 2003 can be measured only for these two subscales. In the case of the United States, scores did not change between 2000 and 2003 for either of these subscales. In both 2000 and 2003, about two-thirds of the OECD countries outperformed the United States on each scale.
PISA also uses proficiency levels to report on student performance. For mathematics literacy, there are 6 levels, with 6 being the highest. There is an additional level, below level 1, which is not described; it includes students who did not answer enough questions correctly for their skills to be described accurately.
The U.S. average score of 483 is at the bottom cutpoint for level 3, and the OECD average score of 500 is around the midpoint of level 3.
The percentages of U.S. students at levels 4, 5, and 6 are smaller than the OECD average percentages. At the other end of the scale (levels 2 and below), however, the United States has larger percentages of students. Compared to the OECD average percentages, the percentages of U.S. students at each level for each of the subscales follow a similar pattern.
U.S. Performance in Problem Solving
The other major area covered in PISA 2003, and a new one that year, was problem solving. Problem-solving items were separate from other items and were specifically designed to be independent of any one curricular area, such as mathematics or science.
They covered three main types of problems: system analysis and design, where students had to use information about a complex situation to analyze or design a system that met stated goals; troubleshooting, where students had to understand the reasons behind a malfunctioning device or system; and decision-making, where students had to make decisions based on a variety of alternatives and constraints. Sample items in the U.S. report show examples of what was measured in problem solving.
Problem solving as measured here was a separate content area for PISA. This is not to say that other areas did not involve problem solving; in fact, many of the mathematics and science items in PISA also included some aspect of problem solving. However, although problem solving was assessed separately from mathematics literacy, U.S. performance was similar: the United States performed below the OECD average in problem solving, and 22 OECD and 3 non-OECD countries outperformed the United States.
As with mathematics literacy, PISA also uses proficiency levels to report on student performance in problem solving. For problem solving, there are 3 levels, with 3 being the highest. Again, there is an additional level, below level 1, which includes students who did not answer enough questions correctly for their skills to be described.
The U.S. average score of 477 is a level 1 score, while the OECD average score of 500 is a level 2 score. The difference between a level 1 and level 2 score is that level 1 students have difficulty with multi-faceted problems with more than one data source, while level 2 students can use various types of reasoning with information from a variety of sources.
As you can see in the figure, the percentages of U.S. students at levels 2 and 3 are smaller than the OECD average percentages. At level 1 and below, the United States has larger percentages of students than OECD countries, on average.
U.S. Performance in Reading Literacy and Science Literacy
Although the major areas covered in 2003 were mathematics literacy and problem solving, reading and science literacy were also assessed to a more limited degree. In reading literacy, the U.S. score did not change from 2000 to 2003, and the United States performed at the OECD average in 2003, as it did in 2000. In science literacy, the U.S. score was not measurably different from the OECD average in 2000, but was below the OECD average in 2003.
Differences in Performance by Selected Student Characteristics
Performance by Sex
In mathematics literacy in the United States, males scored 486 on average and females scored 480. This pattern of males outperforming females held in most other OECD countries as well: 20 of the other 28 showed this difference.
However, this is not the case with problem solving, where there was no measurable difference in U.S. scores by sex. Only seven countries showed any differences by sex, and in six of those seven countries, females outperformed males.
Performance by Parental Occupational Status
One PISA measure of student socioeconomic background is based on parental occupational status. Students were asked about their parents' jobs, and this information was coded according to the International Standard Classification of Occupations. That code was in turn mapped to an internationally comparable index of occupational status, known as the International Socioeconomic Index, or ISEI. The index ranges from 16 to 90, with 16 being the lowest index score and 90 the highest. Someone with a score between 16 and 35, for instance, might be a taxi driver. Someone with a score between 71 and 90 might be a college professor.
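The occupation-to-index mapping above can be sketched as a simple lookup. This is an illustrative sketch only: the 16-35 and 71-90 bands and their example occupations come from the description above, while the label for the unstated middle range is an assumption.

```python
# Hypothetical sketch of bucketing an ISEI score for reporting.
# Only the two bands described in the text are labeled with
# example occupations; the middle band label is assumed.
def isei_band(score: int) -> str:
    if not 16 <= score <= 90:
        raise ValueError("ISEI scores range from 16 to 90")
    if score <= 35:
        return "lower status (e.g., taxi driver)"
    if score >= 71:
        return "higher status (e.g., college professor)"
    return "middle status"

print(isei_band(25))  # lower status (e.g., taxi driver)
print(isei_band(85))  # higher status (e.g., college professor)
```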
Other studies show that in the United States higher socioeconomic background is associated with higher achievement. What the PISA data allow us to do is see how the United States fares in this regard compared to other countries.
Looking at the relationship between ISEI and student performance shows that a few countries (Belgium, Germany, and Hungary) have a stronger link between ISEI and scores in both mathematics and problem solving than the United States. That is, in these countries, a higher parental occupational status is associated with a greater increase in student score than in the United States. Eleven countries show a weaker link between ISEI and student performance than the United States, but the remaining countries were not measurably different from the United States.
Performance by Race/Ethnicity
Data were also collected on race/ethnicity in the United States. In mathematics literacy, White students scored above the OECD average, while Black and Hispanic students scored below it. On average, White students, Asian students, and students of more than one race scored at level 3, while Hispanic students scored at level 2 and Black students at level 1. Within the United States, the patterns of performance among racial/ethnic groups are similar to those observed in other studies, such as NAEP: White students, Asian students, and students of more than one race outperform Black and Hispanic students, and Hispanic students, in turn, outperform Black students. Results for problem solving by race/ethnicity mirror those for mathematics literacy.
This PISA report is intended to be used by educators, policymakers, and interested members of the public. It is extremely important to have the kind of performance data that PISA provides as an external perspective on the performance of our nation’s students.
The project director for this report was Mariann Lemke of NCES. She was assisted by staff from the Education Statistics Services Institute (Anindita Sen, Erin Pahlke, Lisette Partelow, David Miller) and Westat (Trevor Williams, David Kastberg, Leslie Jocelyn). Recognition should also be given to Val Plisko, Associate Commissioner, who was responsible for overall direction of the project; Elois Scott, director of the international activities program; and Marilyn Seastrom, NCES's Chief Statistician.
NCES also wishes to thank the schools and the students who participated in this study. Their participation has allowed us to provide the nation with this important international perspective on student performance.
For More Information
This presentation covers some of the major findings from PISA 2003 from the U.S. perspective, but of course, it is not the whole story. Other findings are available in the OECD’s report on PISA 2003, and additional results will be published by OECD in a series of future thematic reports. The PISA 2003 data will also be publicly available after December 7 for independent analyses.
Download, view, and print the twenty slides used in the Commissioner's presentation as a PowerPoint file (372 KB).
For more information on PISA, please visit the PISA website at http://nces.ed.gov/surveys/pisa.
See the official U.S. Department of Education press release.