In addition to using a range of scale scores as the basic form of measurement, PISA describes student performance in terms of proficiency levels. Higher levels represent the knowledge, skills, and capabilities needed to perform tasks of increasing complexity. PISA results are reported as the percentages of the student population at each of the predefined levels.
To determine the performance levels and cut scores on the literacy scales, item response theory (IRT) techniques were used. IRT makes it possible to estimate simultaneously the ability of all students taking the PISA assessment and the difficulty of all PISA items, and to map both sets of estimates onto a single continuum. The relative ability of students taking a particular test can be estimated from the percentage of test items they answer correctly; the relative difficulty of items can be estimated from the percentage of students answering each item correctly. In PISA, all students within a level are expected to answer at least half of the items from that level correctly. Specifically:

- Students at the bottom of a level can correctly answer about 52 percent of all items from that level; they have a 62 percent chance of success on the easiest items from that level and a 42 percent chance of success on the most difficult items from that level.
- Students in the middle of a level have a 62 percent chance of correctly answering items of average difficulty for that level (an overall response probability of 62 percent).
- Students at the top of a level can correctly answer about 70 percent of all items from that level; they have a 78 percent chance of success on the easiest items from that level and a 62 percent chance of success on the most difficult items from that level.

Students just below the top of a level would score less than 50 percent on an assessment at the next higher level. Students at a particular level demonstrate not only the knowledge and skills associated with that level but also the proficiencies defined by lower levels.
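The response-probability convention above can be made concrete with a small sketch. This is only an illustration under a one-parameter (Rasch) logistic model, not PISA's actual scaling procedure; all variable names and the example ability value are hypothetical.

```python
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch model: probability that a student with ability theta
    answers an item of difficulty b correctly (both on the logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Under an RP62 convention, an item is "located" at the ability point where
# a student has a 62 percent chance of success. In the Rasch model that
# ability is b + log(0.62 / 0.38), about 0.49 logits above the difficulty.
rp62_offset = math.log(0.62 / 0.38)

theta = 1.0                # hypothetical student ability
b = theta - rp62_offset    # an item this student answers with P = 0.62
print(round(p_correct(theta, b), 2))  # 0.62
```

This shows why the same student can have a 62 percent chance on an item of matching difficulty but higher or lower chances on easier or harder items: the success probability slides along the logistic curve as the gap between ability and difficulty changes.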
Patterns of responses for students below level 1b (for science and reading literacy) and below level 1 (for mathematics literacy) suggest that these students are unable to answer at least half of the items from those levels correctly. For details about the approach to defining and describing the PISA proficiency levels and establishing the cut scores, see the OECD's PISA 2015 Technical Report. Table A-1 shows the cut scores for each proficiency level for science, reading, and mathematics literacy.
Table A-1. Cut scores for proficiency levels for science, reading, and mathematics literacy: 2015

|Proficiency level|Science|Reading|Mathematics|
|---|---|---|---|
|Below level 1|0 to less than 260.54|0 to less than 262.04|0 to less than 357.77|
|Level 1 (1b)|260.54 to less than 334.94|262.04 to less than 334.75|357.77 to less than 420.07|
|Level 1 (1a)|334.94 to less than 409.54|334.75 to less than 407.47|†|
|Level 2|409.54 to less than 484.14|407.47 to less than 480.18|420.07 to less than 482.38|
|Level 3|484.14 to less than 558.73|480.18 to less than 552.89|482.38 to less than 544.68|
|Level 4|558.73 to less than 633.33|552.89 to less than 625.61|544.68 to less than 606.99|
|Level 5|633.33 to less than 707.93|625.61 to less than 698.32|606.99 to less than 669.30|
|Level 6|707.93 and above|698.32 and above|669.30 and above|

† Not applicable.

NOTE: For science and reading literacy, proficiency level 1 is composed of two levels, 1a and 1b. The score range for below level 1 refers to scores below level 1b. For mathematics, there is a single proficiency category at level 1; its score range is shown in the level 1 (1b) row.
SOURCE: Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA), 2015.
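The cut scores in Table A-1 can be applied mechanically: a student's scale score falls into the level whose range contains it, with each cut score serving as an inclusive lower bound. A minimal sketch follows; the function and dictionary names are illustrative, not part of any official PISA toolkit, and the Level 6 lower bounds are the upper bounds of Level 5.

```python
from bisect import bisect_right

# Inclusive lower bounds of each proficiency level, from Table A-1.
# Scores below the first bound fall "below level 1".
CUT_SCORES = {
    "science":     [(260.54, "1b"), (334.94, "1a"), (409.54, "2"), (484.14, "3"),
                    (558.73, "4"), (633.33, "5"), (707.93, "6")],
    "reading":     [(262.04, "1b"), (334.75, "1a"), (407.47, "2"), (480.18, "3"),
                    (552.89, "4"), (625.61, "5"), (698.32, "6")],
    "mathematics": [(357.77, "1"), (420.07, "2"), (482.38, "3"),
                    (544.68, "4"), (606.99, "5"), (669.30, "6")],
}

def proficiency_level(domain: str, score: float) -> str:
    """Return the proficiency level label for a 2015 PISA scale score."""
    bounds = [cut for cut, _ in CUT_SCORES[domain]]
    labels = ["below level 1"] + [f"level {lab}" for _, lab in CUT_SCORES[domain]]
    # bisect_right finds how many lower bounds the score meets or exceeds.
    return labels[bisect_right(bounds, score)]

print(proficiency_level("science", 500.0))      # level 3
print(proficiency_level("mathematics", 300.0))  # below level 1
```

Because each range is "X to less than Y", a score exactly equal to a cut score belongs to the higher level, which `bisect_right` handles without special-casing.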