
In addition to reporting scale scores as the basic form of measurement, PISA describes student performance in terms of proficiency levels. Higher levels represent the knowledge, skills, and capabilities needed to perform tasks of increasing complexity. PISA results are reported as the percentage of the student population at each predefined level.

Item response theory (IRT) techniques were used to determine the performance levels and cut scores on the literacy scales. With IRT techniques, it is possible to simultaneously estimate the ability of all students taking the PISA assessment and the difficulty of all PISA items, and to map both sets of estimates onto a single continuum. The relative ability of students taking a particular test can be estimated from the percentage of test items they answer correctly; the relative difficulty of items in a test can be estimated from the percentage of students answering each item correctly.

In PISA, all students within a level are expected to answer at least half of the items from that level correctly:

- Students at the bottom of a level are able to provide the correct answers to about 52 percent of all items from that level; they have a 62 percent chance of success on the easiest items from that level and a 42 percent chance of success on the most difficult items from that level.
- Students in the middle of a level have a 62 percent chance of correctly answering items of average difficulty for that level (an overall response probability of 62 percent).
- Students at the top of a level are able to provide the correct answers to about 70 percent of all items from that level; they have a 78 percent chance of success on the easiest items from that level and a 62 percent chance of success on the most difficult items from that level.

Students just below the top of a level would score less than 50 percent on an assessment at the next higher level. Students at a particular level demonstrate not only the knowledge and skills associated with that level but also the proficiencies defined by lower levels.
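The relationship between student ability, item difficulty, and the 62 percent response probability can be sketched with a simple one-parameter logistic (Rasch) model. This is a minimal illustration only; PISA's operational scaling uses more elaborate IRT models, and the values below (an item difficulty of 0 on the logit scale) are hypothetical.

```python
import math

def p_correct(theta, b):
    """Rasch model: probability that a student with ability theta
    answers an item of difficulty b correctly (logit scale)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Offset at which the response probability equals 62 percent:
# solve 1 / (1 + e^(-d)) = 0.62  ->  d = ln(0.62 / 0.38)
rp62_offset = math.log(0.62 / 0.38)

difficulty = 0.0  # hypothetical item of average difficulty for a level
ability = difficulty + rp62_offset  # student in the middle of the level

print(round(p_correct(ability, difficulty), 2))  # -> 0.62
```

A student whose ability sits this fixed distance above an item's difficulty has exactly a 62 percent chance of success on it, which is how a response-probability criterion ties cut scores on the ability scale to item difficulties.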
Patterns of responses for students in the proficiency levels below level 1b for science and reading literacy and below level 1 for mathematics literacy suggest that these students are unable to answer at least half of the items from those levels correctly. For details about the approach to defining and describing the PISA proficiency levels and establishing the cut scores, see the OECD's *PISA 2015 Technical Report*. Table A-1 shows the cut scores for each proficiency level for science, reading, and mathematics literacy.