
What PIAAC Measures

PIAAC is designed to assess adult skills across a broad range of abilities, from simple reading and numerical calculations to complex digital problem solving. To do this, PIAAC assesses literacy, numeracy, digital problem solving, and, as a separate domain measuring the low end of literacy skills, reading components. These four domains are measured with either paper-and-pencil or computer-based modes of administration. Respondents who are not familiar with computers are given the paper-and-pencil version of the assessment.

PIAAC measures literacy and numeracy in both modes. However, the digital problem-solving items are all computer-administered, and the reading components tasks are only administered in the paper-and-pencil mode. All participating countries were required to assess the literacy and numeracy domains in Cycle I, but the digital problem-solving and reading component domains were optional. The United States assessed all four domains in Cycle I.

PIAAC tasks developed for all four domains are authentic, culturally appropriate, and drawn from real-life situations that are expected to be important or relevant in different contexts. Tasks’ contents and questions are intended to reflect the purposes of adults’ daily lives across different cultures, even if they are not necessarily familiar to all adults in all countries.

U.S. PIAAC Background Questionnaires: Cycle I (2011–2017)

Round 3 — 2017 U.S. PIAAC Household Study (English | Spanish)
Difference from the previous round: additional questions on
  • Certifications
  • Desirability of employment hours increase and contract permanency
  • Military service
  • Total household income
Questionnaire links: Household Study Overview | Household Study Structure

Round 2 — 2014 U.S. PIAAC Prison Study (English | Spanish)
Difference from the previous round: additional questions on
  • Educational and training activities in prison
  • Experiences with prison jobs
  • Involvement in vocational training and non-academic programs in prison
Questionnaire links: Prison Study Overview | Prison Study Structure

Round 2 — 2012/2014 U.S. PIAAC Household Study (National Supplement) (English | Spanish)
Difference from the previous round: no additional questions; adjusted age range and year changes throughout the questionnaire.
Questionnaire links: Household Study Overview | Household Study Structure

Round 1 — 2012 U.S. PIAAC Household Study (Main Study) (English | Spanish)
Questionnaire sections:
  • Section A: Basic Demographics
  • Section B: Past and Present Education
  • Section C: Work History
  • Section D: Present Work Experience
  • Section E: Past Work Experience
  • Section F: Work Responsibilities
  • Section G: Skills Used at Work
  • Section H: Skills Used Outside of Work
  • Section I: Personal Characteristics and Health
  • Section J: General Background Information
Questionnaire links: Household Study Overview | Household Study Structure

The online questionnaires are formatted into boxes containing each question. After some questions, there are green boxes that provide the questionnaire’s automated routing instructions (i.e., instructions for which question comes next, based on the last response). With these routing instructions, the questionnaire adapts to respondents’ answers and routes them past irrelevant questions. For example, if a respondent says that they have not attended college, the routing instructions will skip past questions about what they studied in college.

You may use the green boxes to see how the routing process works by clicking on the links within the IF THEN statements for each possible response. Some of the links will take you to a different part of the questionnaire, but you can return to the question by using the yellow toolbar on the left side of the web page.
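To make the routing logic concrete, here is a minimal Python sketch of an IF/THEN routing rule that skips past irrelevant questions, following the college example above. The question IDs, routing table, and helper function are illustrative assumptions, not the questionnaire's actual implementation.

```python
# Illustrative sketch of questionnaire routing (IF/THEN rules).
# Question IDs and the routing table are hypothetical examples,
# not the actual U.S. PIAAC background questionnaire logic.

QUESTIONS = {
    "B_Q01": "Have you ever attended college?",       # yes/no
    "B_Q02": "What field did you study in college?",  # asked only if B_Q01 == "yes"
    "C_Q01": "Have you ever worked for pay?",          # first question of the next section
}

# Each rule maps (question, response) -> the next question to ask.
# A response of None means "any response".
ROUTING = {
    ("B_Q01", "yes"): "B_Q02",   # college attendees get the follow-up question
    ("B_Q01", "no"):  "C_Q01",   # everyone else skips past the college questions
    ("B_Q02", None):  "C_Q01",   # any answer to B_Q02 continues to the next section
}

def next_question(current_id, response):
    """Return the next question ID, or None if no routing rule applies."""
    return ROUTING.get((current_id, response)) or ROUTING.get((current_id, None))

# A respondent who has not attended college is routed directly
# from B_Q01 to C_Q01, skipping B_Q02.
print(next_question("B_Q01", "no"))   # -> C_Q01
```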

Item names

U.S. Household Background Questionnaire

Each item name begins with a letter representing the section to which it belongs (e.g., item H_Q03c belongs to Section H).

Questions that were adapted for the United States from the international version of the questionnaire have "US" at the end of the variable name (e.g., B_Q01aUS). Country-specific questions that were only administered in the United States have "USX" at the end of the variable name (e.g., B_Q01bUSX).

U.S. Prison Background Questionnaire

In most cases, each item name begins with a letter representing the section to which it belongs (e.g., item H_Q03c belongs to Section H). Questions that were added or edited to refer to experiences in prison have “P” at the beginning of the variable name (e.g., P_Q170).

Several questions were adopted from the household background questionnaire and may use the same variable naming convention as the household items, even if they refer to experiences in prison.
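A small Python sketch of how these naming conventions can be decoded is shown below. The parsing rules mirror the conventions described above (a leading section letter, "P" in that position for prison-specific items, and an optional "US" or "USX" suffix), but the helper itself is illustrative, not an NCES or OECD tool, and it assumes "P" is never used as an ordinary section letter (the household sections listed above run A through J).

```python
import re

# Decode a PIAAC background questionnaire item name using the conventions
# described above. Illustrative helper only.
PATTERN = re.compile(r"^(?P<prefix>[A-Z])_Q(?P<number>\w+?)(?P<suffix>USX|US)?$")

def describe_item(name):
    """Return the section, prison flag, and U.S. adaptation flags for an item name."""
    m = PATTERN.match(name)
    if not m:
        raise ValueError(f"Unrecognized item name: {name}")
    prefix = m.group("prefix")
    return {
        "section": None if prefix == "P" else prefix,   # "P" marks prison-specific items
        "prison_specific": prefix == "P",
        "us_adapted": m.group("suffix") == "US",         # adapted from the international item
        "us_only": m.group("suffix") == "USX",           # administered only in the United States
    }

print(describe_item("H_Q03c"))     # Section H item
print(describe_item("B_Q01aUS"))   # adapted for the United States
print(describe_item("B_Q01bUSX"))  # administered only in the United States
print(describe_item("P_Q170"))     # prison-specific item
```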


The PIAAC definition of the literacy domain expands on the definition of literacy used in the International Adult Literacy Survey (IALS) and the Adult Literacy and Lifeskills Survey (ALL)1 and defines literacy broadly:

"Literacy is understanding, evaluating, using and engaging with written text to participate in society, to achieve one's goals, and to develop one's knowledge and potential."

This definition (a) highlights the ranges of cognitive processes involved in literacy; (b) focuses on a more active, participatory role of individuals in society; and (c) includes a range of text types, such as narrative and interactive texts, in both print and electronic formats.

While this is broader than the definition used in IALS and ALL, selected items from those assessments are used to provide a link to IALS and ALL. PIAAC items include continuous texts (e.g., sentences and paragraphs), noncontinuous texts (e.g., schedules, graphs, maps), and electronic texts (including hypertext, or text in interactive environments, such as forms and blogs). Task activities are presented in home, work, and community contexts, addressing the various purposes that adults pursue in their lives.

Based on the PIAAC framework, literacy tasks include items (in both computer-based and paper-and-pencil modes) that cover a range of difficulties—low, middle, and high—to present a comprehensive picture of the range of skills of adults in each country.


1 IALS and ALL definition: Literacy is using printed and written information to function in society to achieve one's goals and to develop one's knowledge and potential.

The definition of the literacy domain was prepared by an international team of literacy experts to reflect the current understanding of how best to assess adult literacy. The framework provides an overview of

  • how the domain is defined;
  • the range of skills to be measured;
  • the conceptual basis that guided the development of the PIAAC literacy tasks, including the types of items to be used; and
  • the content areas, contexts, and situations in adults’ lives that require literacy.

For PIAAC, literacy was defined as follows:

"Literacy is understanding, evaluating, using and engaging with written texts to participate in society, to achieve one’s goals, and to develop one’s knowledge and potential."

Some key terms within this definition are explained below.

"Understanding"

A basic task for the reader is constructing meaning, both large and small, literal and implicit, from text. This can be as basic as understanding the meaning of the words to as complex as comprehending the underlying theme of a lengthy argument or narrative. Certainly, evaluating or using a text implies some level of understanding and so provides an indirect measure of it, but it is the intent of the PIAAC assessment to have a more direct measure. The components framework provides the construct to support basic understanding, but the literacy assessment itself should also include tasks that explicitly tap more complex understanding, such as the relationships between different parts of the text, the gist of the text as a whole, and insight into the author’s intent. Readers also have to understand the social function of each text and the way this influences structure and content.

"Evaluating"

Readers continually make judgments about a text they are approaching. They need to assess whether the text is appropriate for the task at hand, determining whether it will provide the information they need. They have to make judgments about the truthfulness and reliability of the content. They need to account for any biases they find in the text. And, for some texts, they must make judgments about the quality of the text, both as a craft object and as a tool for acquiring information.

Such judgments are especially important for electronic texts. While published print information carries a sense of legitimacy, especially where the reader can assume there has been some review and editing process, sources for online information are more varied, ranging from authoritative sources to postings of unknown or uncertain authenticity. All information must be evaluated in terms of accuracy, reliability, and timeliness, but this is particularly important with online material.

"Using"

Much adult reading is directed toward applying the information and ideas in a text to an immediate task or to reinforce or change beliefs. Nearly all the tasks in previous international assessments have been of this kind. In some cases, using a text in this way requires just minimal understanding, getting the meaning of the words with some elementary recognition of structure (many menus, for example). In others, it requires using both syntactic and more complex structural understanding to extract the information. In all cases though, the reader approaches the text with a specific task in mind.

"Engaging with"

Many adults appear to read text only when some task requires them to do so. Others sometimes also read for the pleasure it brings them. That is, adults differ in how they engage with text and how much of a role reading plays in their lives. Studies have found that engagement with (that is, the attitude toward and practice of) reading is an important correlate with the direct cognitive measures. As such, it is necessary to understand these differences to get a full picture of adult literacy.

"Written text"

Previous literacy assessments have focused primarily on informative texts of both continuous and noncontinuous form. It is the intention of the new construct to expand the range of texts to include a greater variety of text types, such as narrative and interactive texts, and a greater variety of media. Until recently, most adult reading was of material printed on paper. Now, adults need to access and use text that is displayed on a screen of some kind, whether a computer, an ATM, or a mobile device such as a BlackBerry or iPhone. The PIAAC definition encompasses all of these.

It is worth noting that including electronic text opens the assessment to new types of text and content. While one can find examples of similar texts on paper, they are much less common in that form. Some of these novel form/content combinations include interactive texts, such as exchanges in comments sections of blogs or in e-mail response threads; multiple texts, whether displayed at the same time on a screen or linked through hypertext; and expandable texts, where a summary can be linked to more detailed information if the user chooses.

"Participate in society"

While earlier definitions referred to the role of literacy in “functioning” in society, the PIAAC use of “participating” is meant to focus on a more active role for the individual. Adults use text as a way to engage with their social surroundings, to learn about and to actively contribute to life in their community, close to home and more broadly. And for many adults, literacy is essential to their participation in the labor force. In this, we recognize the social aspect of literacy, seeing it as part of the interactions between and among individuals.

"Achieve one’s goals"

Adults have a range of needs they must address, from basic survival to personal satisfaction and to professional and career development. Literacy plays an increasingly central role in meeting those needs, whether that means simply finding one's way through shopping or negotiating complex bureaucracies whose rules are commonly available only in written texts. It is also important in meeting adults' needs for sociability, for entertainment and leisure, and for work.

"Develop one’s potential"

Surveys suggest that many adults engage in some kind of learning throughout their life, much of it self-directed and informal. Much of this learning requires some use of text, and, as individuals seek to improve their lives, whether at work or outside of it, they need to understand, use, and engage with printed and electronic materials.

For the complete PIAAC literacy framework, see

PIAAC Literacy: A Conceptual Framework

http://www.oecd-ilibrary.org/education/piaac-literacy-a-conceptual-framework_220348414075


In PIAAC, results are reported as averages on a 500-point scale or as proficiency levels. Proficiency refers to competence that involves “mastery” of a set of abilities along a continuum that ranges from simple to complex information-processing tasks.

This continuum has been divided into five levels of proficiency. Each level is defined by a particular score-point range associated with competence at specific information-processing tasks. Adults with literacy scores within the score-point range for a particular proficiency level are likely to successfully complete the tasks at that proficiency level as well as any lower proficiency levels. Adults with scores at a particular proficiency level might be able to complete a task at a higher proficiency level, but the probability is small and diminishes greatly the higher the level. The following descriptions summarize the types of tasks that adults at a particular proficiency level can reliably complete successfully.


See sample items for each level

Description of PIAAC literacy discrete proficiency levels

Proficiency level and score range Task descriptions
Below Level 1
0–175 points
The tasks at this level require the respondent to read brief texts on familiar topics to locate a single piece of specific information. There is seldom any competing information in the text, and the requested information is identical in form to information in the question or directive. The respondent may be required to locate information in short continuous texts; however, in this case, the information can be located as if the text were noncontinuous in format. Only basic vocabulary knowledge is required, and the reader is not required to understand the structure of sentences or paragraphs or make use of other text features. Tasks below Level 1 do not make use of any features specific to digital texts.
Level 1
176–225 points
Most of the tasks at this level require the respondent to read relatively short continuous, noncontinuous, or mixed texts in digital or print format to locate a single piece of information that is identical to or synonymous with the information given in the question or directive. Some tasks, such as those involving noncontinuous texts, may require the respondent to enter personal information into a document. Little, if any, competing information is present. Some tasks may require simply cycling through more than one piece of information. The respondent is expected to have knowledge and skill in recognizing basic vocabulary, determining the meaning of sentences, and reading paragraphs of text.
Level 2
226–275 points

At this level, texts may be presented in a digital or print medium and may comprise continuous, noncontinuous, or mixed types. Tasks at this level require respondents to make matches between the text and information and may require paraphrasing or low-level inferences. Some competing pieces of information may be present. Some tasks require the respondent to

  • cycle through or integrate two or more pieces of information based on criteria;
  • compare and contrast or reason about information requested in the question; or
  • navigate within digital texts to access and identify information from various parts of a document.
Level 3
276–325 points
Texts at this level are often dense or lengthy and include continuous, noncontinuous, mixed, or multiple pages of text. Understanding text and rhetorical structures becomes more central to successfully completing tasks, especially navigating complex digital texts. Tasks require the respondent to identify, interpret, or evaluate one or more pieces of information and often require varying levels of inference. Many tasks require the respondent to construct meaning across larger chunks of text or perform multi-step operations in order to identify and formulate responses. Often, tasks also demand that the respondent disregard irrelevant or inappropriate content to answer accurately. Competing information is often present, but it is not more prominent than the correct information.
Level 4
326–375 points
Tasks at this level often require respondents to perform multi-step operations to integrate, interpret, or synthesize information from complex or lengthy continuous, noncontinuous, mixed, or multiple-type texts. Complex inferences and application of background knowledge may be needed to perform the task successfully. Many tasks require identifying and understanding one or more specific, noncentral idea(s) in the text in order to interpret or evaluate subtle evidence-claim or persuasive discourse relationships. Conditional information is frequently present in tasks at this level and must be taken into consideration by the respondent. Competing information is present and sometimes seemingly as prominent as correct information.
Level 5
376–500 points
At this level, tasks may require the respondent to search for and integrate information across multiple, dense texts; construct syntheses of similar and contrasting ideas or points of view; or evaluate evidence-based arguments. Application and evaluation of logical and conceptual models of ideas may be required to accomplish tasks. Evaluating the reliability of evidentiary sources and selecting key information is frequently a requirement. Tasks often require respondents to be aware of subtle, rhetorical cues and to make high-level inferences or use specialized background knowledge.

NOTE: Every test item is located at a point on the proficiency scale based on its relative difficulty. The easiest items are those located at a point within the score range below level 1 (i.e., 175 or less); the most difficult items are those located at a point at or above the threshold for level 5 (i.e., 376 points). An individual with a proficiency score that matches a test item’s scale score value has a 67 percent chance of successfully completing that test item. This individual will also be able to complete more difficult items (those with higher values on the scale) with a lower probability of success and easier items (those with lower values on the scale) with a greater chance of success.

In general, this means that tasks located at a particular proficiency level can be successfully completed by the “average” person at that level approximately two-thirds of the time. However, individuals scoring at the bottom of the level would successfully complete tasks at that level only about half the time while individuals scoring at the top of the level would successfully complete tasks at the level about 80 percent of the time. Information about the procedures used to set the achievement levels is available in the OECD PIAAC technical report.

SOURCE: OECD Skills Outlook 2013
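The relationship between score ranges, proficiency levels, and success probabilities can be illustrated with a short Python sketch. The score ranges come from the table above; the logistic curve is an illustrative assumption whose slope and intercept are chosen only so that the probabilities quoted in the note (about 0.67 on an item matching one's proficiency, roughly 0.5 at the bottom of a level, and roughly 0.8 at the top) come out. It is not the operational PIAAC scaling model.

```python
import math

# Score-point ranges for the PIAAC literacy proficiency levels (from the table above).
LEVELS = [
    ("Below Level 1", 0, 175),
    ("Level 1", 176, 225),
    ("Level 2", 226, 275),
    ("Level 3", 276, 325),
    ("Level 4", 326, 375),
    ("Level 5", 376, 500),
]

def proficiency_level(score):
    """Map a 0-500 literacy score to its proficiency level."""
    for name, low, high in LEVELS:
        if low <= score <= high:
            return name
    raise ValueError("Score must be between 0 and 500.")

# Illustrative logistic success model (an assumption, not PIAAC's actual model):
# the intercept makes P = 0.67 when proficiency equals item difficulty, and the
# slope makes a 25-point gap correspond to roughly 50% / 80% success.
SLOPE = 0.0283                        # per score point
INTERCEPT = math.log(0.67 / 0.33)     # ensures P = 0.67 at a matching item

def success_probability(proficiency, item_difficulty):
    gap = proficiency - item_difficulty
    return 1.0 / (1.0 + math.exp(-(SLOPE * gap + INTERCEPT)))

print(proficiency_level(250))                    # -> Level 2
print(round(success_probability(250, 250), 2))   # -> 0.67 (matching item)
print(round(success_probability(226, 250), 2))   # -> ~0.51 (bottom of the level)
print(round(success_probability(275, 250), 2))   # -> ~0.80 (top of the level)
```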


PIAAC has two modes of assessment: computer-administered and paper-and-pencil. Respondents who are not familiar with computers are given the paper-and-pencil version of the assessment. PIAAC measures literacy and numeracy in both computer and paper modes.

The beginning of the computer-based assessment (CBA) includes a 5-minute literacy/numeracy core, while the paper-based assessment (PBA) begins with a 10-minute core of literacy/numeracy items in paper-and-pencil format. The literacy/numeracy core is a set of short, easy literacy and numeracy items that gather information about the basic literacy and numeracy skills of the participants and serve as a basis for routing.
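The sketch below is a minimal, hypothetical illustration of how performance on a short core could serve as a basis for routing. The pass threshold and branch labels are placeholders, not PIAAC's actual routing rules, which are documented in the OECD technical report.

```python
# Hypothetical sketch of core-based routing. The threshold and branch
# descriptions are illustrative assumptions, not PIAAC's operational rules.

def route_after_core(core_items_correct, core_items_total, pass_threshold=0.5):
    """Illustrative decision about where a respondent goes after the core."""
    if core_items_total <= 0:
        raise ValueError("core_items_total must be positive")
    share_correct = core_items_correct / core_items_total
    if share_correct >= pass_threshold:
        return "continue to the full literacy/numeracy assessment"
    return "route to a shorter set of easier tasks (illustrative branch)"

print(route_after_core(core_items_correct=5, core_items_total=6))
print(route_after_core(core_items_correct=2, core_items_total=8))
```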

Within the CBA, the literacy domain consists of 52 items based on the PIAAC definition of literacy, which are all scored automatically. Of these 52 computer-based items, 30 were adapted from paper-based items administered as part of IALS and/or ALL and were used to link PIAAC results with those from IALS and ALL. The remaining 22 computer-based literacy items were newly created for PIAAC.

Within the PBA, the literacy domain consists of 24 items based on the PIAAC definition of literacy, which are scored by expert scorers. Of these 24 items, 6 are paper-based only and 18 are administered in both the paper-based and computer-based assessments.

Literacy items (both CBA and PBA) ask participants to answer questions about texts that are drawn from a broad range of real-life settings, including occupational, personal (home and family, health and safety, consumer economics, leisure, and recreation), community and citizenship, and education and training contexts. The texts may be:

  • continuous texts (such as editorials, news stories, and brochures);
  • noncontinuous texts (such as job applications, payroll forms, tables, and drug labels);
  • texts containing images;
  • hypertext (computer-based assessment only); or
  • text in interactive environments (e.g., message boards and chat rooms; computer-based assessment only).

The questions or tasks using these texts are meant to assess three specific cognitive processes:

  • Access and identify;
  • Integrate and interpret; and
  • Evaluate and reflect.

Distribution of items by type of text

Type of text Number Percent
Print-based texts 36 62
Digital texts 22 38
Total 58 100

Note: Each category includes continuous, noncontinuous, and combined texts.

Source: Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills. Paris: OECD, 2012, p. 25

Distribution of items by context

Context Number Percent
Work 10 17
Personal 29 50
Community 13 23
Education 6 10
Total 58 100

Distribution of items by task aspect

Task aspect Number Percent
Access and identify 32 55
Integrate and interpret 17 29
Evaluate and reflect 9 16
Total 58 100

Distribution of items by mode of administration

Mode of administration Number Percent
Computer- and paper-based 18 31
Computer-based only 34 59
Paper-based only 6 10
Total 58 100

Source: Adapted from Table 2.4 (p. 69) in the Technical Report of the Survey of Adult Skills (PIAAC) (3rd Edition). Paris: OECD, 2019.
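The item counts quoted above are internally consistent, which can be verified with a few lines of Python. This is simply arithmetic on the figures in the text and tables, not new data.

```python
# Cross-check of the PIAAC literacy item counts quoted above.

cba_linking, cba_new = 30, 22        # computer-based items adapted from IALS/ALL vs. newly created
cba_total = cba_linking + cba_new    # 52 computer-based items
paper_only, both_modes = 6, 18       # breakdown of the 24 paper-based items
pba_total = paper_only + both_modes

# Unique items across modes: all CBA items plus the paper-only items.
unique_items = cba_total + paper_only
assert cba_total == 52 and pba_total == 24 and unique_items == 58

# Percentages in the "mode of administration" table (rounded to whole numbers).
for label, n in [("Computer- and paper-based", both_modes),
                 ("Computer-based only", cba_total - both_modes),
                 ("Paper-based only", paper_only)]:
    print(f"{label}: {n} items, {round(100 * n / unique_items)}%")
# -> 18 items (31%), 34 items (59%), 6 items (10%)
```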


In the PIAAC literacy domain, item difficulty is reported along the proficiency scale described above, ranging from Below Level 1 (the easiest items) to Level 5 (the most difficult items).

Literacy Below Level 1 Sample Item – Election results

The stimulus consists of a short report of the results of a union election containing several brief paragraphs and a simple table identifying the three candidates in the election and the number of votes they received. The test taker is asked to identify which candidate received the fewest votes. He or she needs to compare the number of votes that the three candidates received and identify the name of the candidate who received the fewest votes. The word “votes” appears in both the question and in the table and nowhere else in the text.

SOURCE: OECD Skills Outlook 2013: First Results From the Survey of Adult Skills. Paris: OECD, 2013. Page 65.

Literacy Level 1 Sample Item – Work Links

The stimulus consists of a job search results webpage containing a listing of job descriptions by company. The test taker is asked to identify which company is looking for an employee to work at night. He or she needs to review the job descriptions and identify the name of the company that meets the criteria.

The image displays a job search results webpage containing a listing of job descriptions by company. There is a filter box on the left of the image with options such as full-time/part-time work, temporary, casual/vacation, and salary. Each result includes the job title, the company, a brief description, and the sector in which the job falls.

SOURCE: Sample Items: Education and Skills Online http://www.oecd.org/skills/piaac/documentation.htm

Literacy Level 2 Sample Item – Lakeside fun run

The stimulus is a simulated website containing information about the annual fun run/walk organized by the Lakeside community club. The test taker is first directed to a page with several links, including “Contact Us” and “FAQs.” He or she is then asked to identify the link providing the phone number of the organizers of the event. In order to answer this item correctly, the test taker needs to click on the link “Contact Us.” This requires navigating through a digital text and some understanding of web conventions. While this task might be fairly simple for test takers familiar with web-based texts, some respondents less familiar with web-based texts would need to make some inferences to identify the correct link.

SOURCE: OECD Skills Outlook 2013: First Results From the Survey of Adult Skills. Paris: OECD, 2013, p. 65.

Literacy Level 2 Sample Item – Preschool rules

Respondents are asked to answer the question shown in the left pane by highlighting information in the list of rules for a preschool.

The image displays a bulleted list of rules for a preschool.

Correct response: 9:00 am.

SOURCE: Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills. Paris: OECD, 2012, p. 26.

Literacy Level 3 Sample Item – Library search

The stimulus displays results from a bibliographic search from a simulated library website. The test taker is asked to identify the name of the author of a book called Ecomyth. To complete the task, the test taker has to scroll through a list of bibliographic entries and find the name of the author specified under the book title. In addition to scrolling, the test taker must be able to access the second page where Ecomyth is located by either clicking the page number (2) or the word “next.” There is considerable irrelevant information in each entry in this particular task, which adds to its complexity.

SOURCE: OECD Skills Outlook 2013: First Results From the Survey of Adult Skills. Paris: OECD, 2013, p. 65.

Literacy Level 3 Sample Item – Exercise equipment

The stimulus displays an exercise equipment chart. The respondent has to use the chart to determine which equipment received the largest number of “ineffective” ratings.

The image displays a physical exercise equipment chart. There are two categories: cardio-training and muscle-building equipment.

Correct response: Dumbbells/weights.

SOURCE: Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills. Paris: OECD, 2012, p. 27.

Literacy Level 4 Sample Item – Cash Flow

The test taker is presented with a newspaper article and an e-mail and is asked to identify a sentence in each source that contains common criticisms made about the devices described in the sources.

The image displays a news website called ‘The Daily Press – Your source for South Africa’s News.’ There is an article entitled, ‘Q Drum Rolls Its Way to Success.’

SOURCE: Sample Items: Education and Skills Online http://www.oecd.org/skills/piaac/documentation.htm

Literacy Level 4 Sample Item – Exercise equipment

Another item based on the exercise equipment chart stimulus asks the test taker to use the chart to identify which muscles will benefit the most from the use of a particular piece of equipment.

The image displays a physical exercise equipment chart. There are two categories: cardio-training and muscle-building equipment.

Correct response: Abdominal muscles.

SOURCE: Literacy, Numeracy and Problem Solving in Technology-Rich Environments: Framework for the OECD Survey of Adult Skills. Paris: OECD, 2012, p. 26.

NOTE: Items on this page are not actual replicas of test items.
