Two international assessments measure aspects of science skills. TIMSS focuses on students' knowledge of the science content that they are likely to have been taught in school by grades 4 and 8, while PISA focuses on 15-year-old students' ability to apply science knowledge and skills to a variety of materials with a real-life context. Whereas TIMSS is closely linked to the curricula of the participating countries, PISA assesses 15-year-olds' scientific literacy, which it defines as
An individual's scientific knowledge and use of that knowledge to identify questions, to acquire new knowledge, to explain scientific phenomena, and to draw evidence-based conclusions about science-related issues; understanding of the characteristic features of science as a form of human knowledge and enquiry; awareness of how science and technology shape our material, intellectual, and cultural environments; and willingness to engage in science-related issues, and with the ideas of science, as a reflective citizen (OECD, 2006, p. 5).
On account of these different aims, the two assessments ask students to perform different tasks. TIMSS asks 4th- and 8th-graders to complete a range of multiple-choice and constructed-response questions that test their knowledge of specific science topics or content domains: life science, physical science, and Earth science at grade 4, and biology, chemistry, physics, and Earth science at grade 8. In contrast, PISA poses multiple-choice and constructed-response questions that ask students to identify scientific issues (e.g., recognize issues that are possible to investigate scientifically), explain phenomena scientifically (e.g., describe or interpret phenomena scientifically and predict changes), and use scientific evidence (e.g., identify the assumptions, evidence, and reasoning behind conclusions). PISA presents students with a range of exercises based on materials that they are likely to encounter as young adults, such as a discussion of acid rain, a picture of erosion at the Grand Canyon, or the results of a controlled experiment.
It is important to note that PISA's science assessment was revised in 2006 to (1) more clearly distinguish knowledge about science as a form of human inquiry from knowledge of science, and (2) add components on the relationship between science and technology to the framework. In addition, to more clearly distinguish scientific literacy from reading literacy, the PISA 2006 science test items required less reading, on average, than did the science items used in earlier PISA surveys. Because of these changes, it is not possible to compare science learning outcomes from PISA 2006 with those of earlier PISA assessments, as is done for reading and mathematics. Any differences that readers observe between PISA 2006 science scores and those from earlier PISA assessments may be attributable as much to changes in the nature of the assessment as to changes in actual student performance.