In addition to scale scores, which are the basic form of measurement, PISA describes student proficiency in terms of levels. Higher levels represent the knowledge, skills, and capabilities needed to perform tasks of increasing complexity. PISA results are reported as the percentage of the student population at each of the predefined levels. Descriptions of each proficiency level are provided separately for mathematics literacy, science literacy, and reading literacy.
To determine the performance levels and cut scores on the literacy scales, item response theory (IRT) techniques were used. With IRT techniques, it is possible to estimate simultaneously the ability of all students taking the PISA assessment and the difficulty of all PISA items. Estimates of student ability and item difficulty can then be mapped on a single continuum: the relative ability of students taking a particular test can be estimated by considering the percentage of test items they answer correctly, and the relative difficulty of items in a test can be estimated by considering the percentage of students answering each item correctly.

In PISA, all students within a level are expected to answer at least half of the items from that level correctly. Students at the bottom of a level are able to provide correct answers to about 52 percent of all items from that level, have a 62 percent chance of success on the easiest items from that level, and have a 42 percent chance of success on the most difficult items from that level. Students in the middle of a level have a 62 percent chance of correctly answering items of average difficulty for that level (an overall response probability of 62 percent). Students at the top of a level are able to provide correct answers to about 70 percent of all items from that level, have a 78 percent chance of success on the easiest items from that level, and have a 62 percent chance of success on the most difficult items from that level. Students just below the top of a level would score less than 50 percent on an assessment at the next higher level. Students at a particular level demonstrate not only the knowledge and skills associated with that level but also the proficiencies defined by lower levels. Patterns of responses for students below level 1b for reading literacy and below level 1 for mathematics and science literacy suggest that these students are unable to answer at least half of the items from those levels correctly.

For details about the approach to defining and describing the PISA levels and establishing the cut scores, see the OECD's PISA 2012 Technical Report (forthcoming). Table AA2 shows the cut scores for each proficiency level for mathematics, science, and reading literacy.
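The response-probability convention described above can be illustrated with a simple one-parameter logistic (Rasch) item response function. The sketch below is only an illustration: it works on the logit scale rather than the reported PISA scale, uses a hypothetical item difficulty, and is not the operational PISA scaling model.

```python
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch (one-parameter logistic) probability that a student with
    ability theta answers an item of difficulty b correctly.
    Both theta and b are on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# RP62 convention: an item is "located" at the ability where a student
# has a 62 percent chance of answering it correctly. Solving
# 0.62 = 1 / (1 + exp(-(theta - b))) gives theta - b = ln(0.62 / 0.38).
rp = 0.62
offset = math.log(rp / (1.0 - rp))   # about 0.49 logits above the item difficulty

# Hypothetical item difficulty (in logits) -- illustrative only, not a real PISA item.
b = 0.0
theta_rp62 = b + offset
print(f"Ability needed for a {rp:.0%} chance on this item: {theta_rp62:.2f} logits")
print(f"Check: P(correct) = {p_correct(theta_rp62, b):.2f}")

# A student one logit below that point has a noticeably lower chance of success.
print(f"P(correct) one logit lower: {p_correct(theta_rp62 - 1.0, b):.2f}")
```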
Table AA2. Cut scores for proficiency levels for mathematics, science, and reading literacy: 2012

| Proficiency level | Mathematics | Science | Reading¹ |
|---|---|---|---|
| Below level 1 | 0-358 | 0-335 | 0-262 |
| Level 1 | greater than 358-420 | greater than 335-410 | greater than 262-335 (level 1b); greater than 335-407 (level 1a) |
| Level 2 | greater than 420-482 | greater than 410-484 | greater than 407-480 |
| Level 3 | greater than 482-545 | greater than 484-559 | greater than 480-553 |
| Level 4 | greater than 545-607 | greater than 559-633 | greater than 553-626 |
| Level 5 | greater than 607-669 | greater than 633-708 | greater than 626-698 |
| Level 6 | greater than 669-1000 | greater than 708-1000 | greater than 698-1000 |
¹The first reading literacy proficiency level is composed of levels 1a and 1b. The score range for below level 1 refers to scores below level 1b.

SOURCE: Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2012.
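As a practical illustration of how the cut scores in Table AA2 translate a scale score into a reported proficiency level, the sketch below assigns mathematics scores to levels. The function name, the example scores, and the treatment of boundary values (a score exactly at a cut point is placed in the lower level, following the "greater than" wording in the table) are assumptions made for this illustration, not part of the PISA documentation.

```python
# Mathematics cut scores from Table AA2 (upper bound of each level).
# A score exactly equal to a cut point stays in the lower level,
# matching the "greater than" wording in the table.
MATH_CUTS = [
    (358, "Below level 1"),
    (420, "Level 1"),
    (482, "Level 2"),
    (545, "Level 3"),
    (607, "Level 4"),
    (669, "Level 5"),
    (1000, "Level 6"),
]

def math_level(score: float) -> str:
    """Return the mathematics proficiency level for a PISA scale score (0-1000)."""
    for upper, label in MATH_CUTS:
        if score <= upper:
            return label
    raise ValueError("Score outside the 0-1000 PISA scale")

# Hypothetical example scores spanning several levels.
for s in (300, 358, 359, 500, 620, 700):
    print(s, "->", math_level(s))
```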