
Frequently Asked Questions

In addition to the following questions about PIRLS, more FAQs about international assessments are available at http://nces.ed.gov/surveys/international/faqs.asp.


The Progress in International Reading Literacy Study (PIRLS) is an international assessment and research project designed to measure trends in reading achievement at the fourth-grade level as well as school and teacher practices related to instruction. Since 2001, PIRLS has been administered every 5 years.

In 2016, for the first time, education systems participating in PIRLS could choose to administer an optional assessment: ePIRLS. In addition to paper-and-pencil items in PIRLS, students who participated in ePIRLS were asked to complete online informational reading tasks. Each task involved navigating to and obtaining information from two to three different simulated websites, totaling 5 to 10 web pages. Students were then asked to complete a series of comprehension questions based on these tasks.

Continuing the transition to a digital assessment, in 2021 education systems will have the option of administering an entirely digital assessment, referred to as digitalPIRLS, or a paper-based assessment, referred to as paperPIRLS. Education systems participating in paperPIRLS will administer the traditional PIRLS items, which are based on literary and informational passages. Education systems participating in digitalPIRLS will administer an integrated assessment that includes both the traditional PIRLS assessment in digital format and the interactive online ePIRLS items. More than 50 education systems are expected to participate in PIRLS 2021. For more information, refer to the PIRLS 2021 Frameworks.


PIRLS is a carefully constructed reading assessment consisting of a test of reading literacy and questionnaires that collect contextual information related to 4th-grade students' literacy performance.
PIRLS can help educators and policymakers by answering questions such as:

  • How well do 4th-grade students read?
  • How do students in one country compare with students in another country in reading literacy?
  • Do 4th-grade students value and enjoy reading?
  • Internationally, how do the reading habits and attitudes of students vary?
What aspects of reading literacy are assessed in PIRLS?

PIRLS focuses on three aspects of reading literacy:

  • purposes of reading;
  • processes of comprehension; and
  • reading behaviors and attitudes.

The first two aspects form the basis of the written test of reading comprehension. The student background questionnaire addresses the third aspect.

In PIRLS, purposes of reading refers to the two types of reading that account for most of the reading done by young students, both in and out of school: (1) reading for literary experience, and (2) reading to acquire and use information. In the assessment, narrative fiction is used to assess students' ability to read for literary experience, while a variety of informational texts are used to assess students' ability to acquire and use information while reading. The PIRLS assessment devotes roughly equal proportions of its passages and items to each of these two purposes.

Processes of comprehension refers to ways in which readers construct meaning from the text. Readers focus on and retrieve explicitly stated information; make straightforward inferences; interpret and integrate ideas and information; and evaluate and critique content, language, and textual elements.

For more information on the purposes for reading and processes of comprehension, see the PIRLS 2021 Assessment Framework.


Assessment 
The assessment instruments include 4th-grade-level stories and informational texts collected from several different countries. Students are asked to use a repertoire of reading skills and strategies, including retrieving and focusing on specific ideas, making simple and more complex inferences, and examining and evaluating text features. The passages are followed by open-ended and multiple-choice questions about the text.

The 2016 assessment consisted of 15 test booklets and 1 reader (presented in a magazine-type format with the questions in a separate booklet). The assessment was given in two 40-minute parts with a 5- to 10-minute break between them. Each booklet contained two parts, one block of literary experience items and one block of informational items, and each block occurred twice across the 15 booklets. Because the entire assessment consisted of 12 blocks of passages and items, using different booklets allowed PIRLS to report results from more assessment items than could fit in a single booklet without making the assessment longer. To provide good coverage of each skill domain, the full pool of test items would require about 8 hours of testing time. However, testing time was limited to 80 minutes per student by clustering items in blocks and randomly rotating the blocks throughout the student test booklets. As a consequence, no student received all items (there were a total of 175 items on the 2016 assessment), but each item was answered by a representative sample of students.
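
To illustrate how this rotated-booklet (matrix sampling) design keeps testing time short while still covering many items, the sketch below assigns hypothetical blocks to booklets and booklets to students. The block labels, pairing rule, and counts are simplified assumptions for illustration and do not reproduce the actual PIRLS booklet map.

```python
import random

# Hypothetical item blocks: 6 literary (L1-L6) and 6 informational (I1-I6),
# standing in for the 12 blocks of passages and items in the 2016 design.
literary = [f"L{i}" for i in range(1, 7)]
informational = [f"I{i}" for i in range(1, 7)]

# Simplified rotation: each booklet pairs one literary block with one
# informational block, and every block appears in two booklets.
# (Illustration only; not the actual PIRLS booklet map.)
booklets = [(literary[k % 6], informational[(k + 1) % 6]) for k in range(12)]

# Each sampled student is randomly assigned a single booklet, so no student
# answers all 12 blocks, but every block is answered by a random subset of
# the student sample.
students = [f"student_{n:02d}" for n in range(1, 21)]
assignments = {s: random.choice(booklets) for s in students}

for student, (lit_block, info_block) in sorted(assignments.items()):
    print(f"{student}: {lit_block} + {info_block}")
```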

A total of 12 reading passages—two from PIRLS 2001, 2006, and 2011; two from 2006 and 2011; two from PIRLS 2011 only; and six new passages—were included in the 2016 assessment booklets used in all participating education systems. The use of common passages from the 2001 through the 2016 assessments allows for the analysis of change in reading literacy over the 15-year period between administrations for countries that participated in these cycles. The passages, as well as all other study materials, were translated into the primary language or languages of instruction in each education system.

Questionnaires 
Background questionnaires are administered to collect information about students' experiences at home and at school in learning to read. A student questionnaire addresses students' attitudes toward reading and their reading habits. The student questionnaire is administered after the assessment portion, taking about 30 minutes to complete. In all, PIRLS takes 1½ to 2 hours of each student's time, including the assessment and background questionnaire.

In addition, questionnaires are given to students' teachers and school principals to gather information about students' school experiences in developing reading literacy. The teacher and school questionnaires are administered either online from a secure website or using a paper form. Teacher questionnaires take about 40 minutes to complete and ask teachers questions about their education and experience, available resources, and instructional practices. School questionnaires take about 40 minutes to complete and ask about school practices and resources.

In many countries (but not the United States), a parent questionnaire is also administered.


Similar to PIRLS, the 2016 ePIRLS assessment consisted of five tasks (each of which was 40 minutes long), but students were asked to do only two of them. Students completed the tasks on a computer. Each task involved reading and obtaining information from two to three different websites totaling 5 to 10 web pages; afterward, students were asked to complete a series of comprehension questions based on the task. Students also completed a brief survey about their online access, knowledge, and use. ePIRLS took about 2 hours for students to complete, including the assessment and questionnaire.


Each participating country agrees to select a sample that is representative of the target population as a whole. In 2001, the target population was the upper of the two adjacent grades with the most 9-year-olds. Beginning in 2006, the definition of the target population was refined to represent students in the grade that corresponds to the fourth year of schooling, counting from the first year of International Standard Classification of Education (ISCED) Level 1—4th grade in most countries, including the United States. This population represents an important stage in the development of reading. At this point, most children have learned to read and are using reading to learn. IEA's Trends in International Mathematics and Science Study (TIMSS) has also chosen to assess this target population of students.


In each administration of PIRLS, schools are randomly selected first (with a probability proportional to the estimated number of students enrolled in the target grade), and then one or two classrooms are randomly selected within each school. In the United States, schools of varying demographics and locations are randomly selected so that the overall U.S. sample is representative of the U.S. school population. The random selection process is important for ensuring that a country's sample accurately reflects its schools and, therefore, can be compared fairly with samples of schools from other countries.
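
The school-level draw described above (selection with probability proportional to estimated enrollment, followed by a random classroom pick) can be sketched roughly as follows. The school names, enrollment counts, and the simple sequential weighted draw are assumptions for illustration; the operational PIRLS sampling uses stratified, systematic PPS selection carried out by the study's sampling experts.

```python
import random

# Hypothetical sampling frame: school -> estimated enrollment in the target grade.
frame = {
    "school_A": 120,
    "school_B": 45,
    "school_C": 300,
    "school_D": 80,
    "school_E": 60,
}

def pps_sample(frame, n_schools, seed=42):
    """Draw schools with probability proportional to size (PPS),
    using a simple sequential weighted draw without replacement."""
    rng = random.Random(seed)
    remaining = dict(frame)
    selected = []
    while len(selected) < n_schools and remaining:
        schools = list(remaining)
        weights = [remaining[s] for s in schools]
        pick = rng.choices(schools, weights=weights, k=1)[0]
        selected.append(pick)
        del remaining[pick]
    return selected

# First stage: schools; second stage: one classroom chosen at random per school.
rng = random.Random(7)
for school in pps_sample(frame, n_schools=3):
    classroom = rng.choice(["classroom_1", "classroom_2", "classroom_3"])
    print(f"{school} -> {classroom}")
```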

Assessment year   Number of participating schools   Number of participating students   Overall weighted participation rate
2001              174                               3,763                              83%
2006              183                               5,190                              82%
2011¹             370                               12,726                             81%
2016² PIRLS       158                               4,425                              86%
2016² ePIRLS      153                               4,090                              80%

¹ The sample was larger in 2011 than in previous administrations of PIRLS because TIMSS and PIRLS coincided in that year. This led to the decision to draw a larger sample of schools and, where feasible, to administer both studies in the same schools, albeit to separate classrooms of students. Ultimately, TIMSS (grade 4) and PIRLS in the United States were administered in the same schools but to separately sampled classrooms of students.

² All students in the selected classrooms were invited to participate in the PIRLS main study. The same students participated in ePIRLS, typically on the day following the main study. Five schools that participated in PIRLS chose not to participate in ePIRLS.


The table below lists the total number of education systems that have participated in each of the four previous administrations of PIRLS at grade 4. The term “education system” refers to IEA member countries and benchmarking participants. IEA member “countries” may be complete, independent political entities or nonnational entities that represent a portion of a country (e.g., England, Hong Kong, the Flemish community of Belgium). Nonnational entities that are represented by their larger country in the main results (e.g., Abu Dhabi in the United Arab Emirates, Ontario in Canada), or whose countries are not IEA members (Buenos Aires), are designated as “benchmarking participants.” For a complete list of education systems participating in PIRLS, visit the PIRLS Participating Countries page.

Year Education systems participating in PIRLS at grade 4*
2001 36
2006 45
2011 53
2016 PIRLS: 58; ePIRLS: 16
*Education systems with off-grade participants not included.

PIRLS is a cooperative effort involving the participation of representatives from every education system in the study. Prior to each administration of PIRLS, the assessment framework is reviewed and updated to reflect changes in the curriculum and instruction of the participating education systems, while maintaining the ability to measure change over time. Extensive input is received from experts in reading education, assessment, and curriculum, as well as representatives from national education centers around the world.

To enable educators, policymakers, and other stakeholders to better understand the results from PIRLS, many assessment items are released for public use after each administration. To replace these items, countries submit items for review by subject-matter specialists; additional items are written by a committee, in consultation with item-writing specialists in various countries, to ensure that the content, as explicated in the frameworks, is covered adequately. These items are first reviewed by a committee and then field-tested in most of the participating education systems; the field-test results are used to evaluate item difficulty, how well items discriminate between high- and low-performing students, and whether there is evidence of bias toward or against individual countries or in favor of boys or girls. In 2016, 95 new items were selected for inclusion in the international assessment and added to 80 existing items.
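
To make the field-test criteria concrete, the sketch below computes two classical item statistics from a small made-up response matrix: item difficulty (the proportion of students answering correctly) and item discrimination (the correlation between an item score and the total score). The response data are hypothetical, and the sketch omits the bias checks mentioned above.

```python
import statistics  # statistics.correlation requires Python 3.10+

# Hypothetical field-test data: one row per student, one 0/1 score per item.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
]

n_items = len(responses[0])
total_scores = [sum(row) for row in responses]

for j in range(n_items):
    item_scores = [row[j] for row in responses]
    # Difficulty: proportion of students who answered the item correctly.
    difficulty = sum(item_scores) / len(item_scores)
    # Discrimination: correlation between item score and total score; higher
    # values mean the item separates high and low performers more sharply.
    discrimination = statistics.correlation(item_scores, total_scores)
    print(f"item {j + 1}: difficulty = {difficulty:.2f}, discrimination = {discrimination:.2f}")
```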

PIRLS items include both multiple-choice and constructed-response items; the items are based on a selection of literary passages drawn from children's storybooks and informational texts. Literary passages include realistic stories and traditional tales, while informational texts include chronological and nonchronological articles, biographical articles, and informational leaflets.

In 2016, ePIRLS was designed to be an extension of PIRLS. Because the 2016 data collection marked the first administration of ePIRLS, all of the items were newly developed. Two of the tasks were released for public use, and the rest were retained to measure trends across future PIRLS cycles. Students complete the ePIRLS tasks on a computer, using it to obtain information from two to three simulated websites and then answering a series of comprehension questions.


Three studies have compared PIRLS and NAEP in terms of their measurement frameworks and the reading passages and questions included in the assessments. The most recent study compared NAEP with PIRLS 2011 (see Highlights from PIRLS 2011, appendix C). Prior studies compared NAEP with PIRLS 2001 (A Comparison of the NAEP and PIRLS Fourth-Grade Reading Assessments) and compared NAEP with PIRLS 2006 (Comparing PIRLS and PISA with NAEP in Reading, Mathematics, and Science). The studies found the following similarities and differences:

Similarities

  • PIRLS and NAEP call for students to develop interpretations, make connections across text, and evaluate aspects of what they have read.
  • PIRLS and NAEP use literary passages drawn from children's storybooks and informational texts as the basis for the reading assessment.
  • PIRLS and NAEP use multiple-choice and constructed-response questions with similar distributions of these types of questions.

Differences

  • Results of readability analyses suggest that the PIRLS reading passages are easier than the NAEP passages (by about one grade level, on average); a readability-formula sketch follows this list.
  • PIRLS calls for more text-based interpretation than NAEP. NAEP places more emphasis on having students connect what they have read to other readings or knowledge and critically evaluate what they have read.
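
As a rough illustration of the kind of grade-level readability estimate mentioned above, the sketch below computes the Flesch-Kincaid grade level for a sample sentence. The sample text and the crude vowel-run syllable counter are assumptions for illustration; the published NAEP-PIRLS comparisons report their own readability analyses, which are not necessarily based on this particular formula.

```python
import re

def count_syllables(word):
    # Very rough syllable estimate: count runs of vowels in the word.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words per sentence) + 11.8 * (syllables per word) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

# Hypothetical passage text, used only to exercise the formula.
sample = "The small dog ran to the park. It found a red ball under the old tree."
print(f"Estimated Flesch-Kincaid grade level: {flesch_kincaid_grade(sample):.1f}")
```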

The most recent PIRLS administration was in 2016, with results released in December 2017. The next administration is scheduled for 2021, with results to be released in December 2022.


In schools with only one or two fourth-grade classrooms, all students are asked to participate. In schools with more than two fourth-grade classrooms, only students in two randomly selected classrooms are asked to participate. Some students with special needs or limited English proficiency may be excused from the assessment. In 2016, some classrooms selected to participate in PIRLS were also asked to take part in ePIRLS. In classrooms that were also asked to participate in ePIRLS, PIRLS was administered on day one and ePIRLS on day two.
