
Assessment Questions

Of Special Note
NAAL test developers followed a multistep development and review process to ensure the quality of the assessment questions. They collaborated with literacy experts, content specialists, testing experts, survey methodologists, and other specialists. Review steps included:
  • reviewing questions for their ability to maintain trend lines as well as the efficiency, robustness, and interpretability of these comparisons;
  • conducting cognitive laboratory studies to help clarify the questions proposed for the assessment;
  • reviewing the literacy assessment tasks, scoring rubrics, and stimulus materials to ensure that they were appropriate for a range of literacy levels, interests, and socioeconomic backgrounds (this review process examined editorial correctness, accuracy, ease of scoring, and bias and sensitivity); and
  • analyzing field test data to ensure that items were functioning adequately.
1992: A Look Back
Hybrid tasks. In the 1992 NALS, there were no published criteria for determining the status of hybrid tasks. Because hybrid was not an available category, any stimulus material containing a document section was coded entirely as document, even if it also contained prose. The 1992 assessment included two such hybrid tasks.
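
For concreteness, the 1992 convention can be expressed as a small decision rule. The sketch below is purely illustrative: the segment-type representation and the function name are assumptions, not part of the actual NALS data model.

    def code_stimulus_1992(segment_types):
        """Code a stimulus material under the 1992 NALS rule: with no
        hybrid category available, any document segment makes the whole
        stimulus 'document', even if prose segments are also present."""
        if "document" in segment_types:
            return "document"
        return "prose"

    # Example: a mixed prose/document stimulus is coded entirely as document.
    assert code_stimulus_1992({"prose", "document"}) == "document"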

Real-World Context

Since most real-life situations do not supply multiple-choice questions, NAAL assessment questions are open-ended and require short-answer responses. The respondent reads the question; locates, infers, and/or calculates an answer; and then writes that answer in a designated location. This format dramatically reduces the possibility that respondents arrive at the right answer by accident or by guessing, without knowing how to obtain it.

NAAL Tasks and Stimulus Materials

In the main NAAL, each assessment question is composed of a prose, document, or quantitative literacy task (i.e., instructions) and the corresponding stimulus material. Stimulus materials are the written text or printed material from which the answer to the task may be found or derived.

Prose questions require respondents to perform a prose task (e.g., read an editorial) using one of five types of prose stimulus materials: expository, procedural, persuasive, narrative, and poetry. View a sample prose question.

Document questions require respondents to perform a document task (e.g., complete a tax form) using one of seven types of document stimulus materials: list, table, map/diagram, form, bill, graph, and other. View a sample document question.

Quantitative questions require respondents to identify, describe, or perform an arithmetic operation (addition, subtraction, multiplication, or division) embedded in either prose or document materials, since there are no texts that are unique to quantitative tasks. The majority of NAAL quantitative tasks (39 of 47) are embedded in document materials. View a sample quantitative question.

Hybrid questions require respondents to perform a task using a stimulus material that combines prose and document structures. If the respondent is required to process only the prose segment, the question is coded as a prose task; if the respondent is required to process both the prose and document segments, it is coded as a hybrid task. Because hybrid tasks account for only 4 of the 152 tasks in 2003, there are too few of them to support any separate analysis. View a sample hybrid question.
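
The 2003 coding decision described above can likewise be sketched as a small rule, assuming we record only which segments of a mixed stimulus the respondent must process. The representation and names below are illustrative, not NAAL's actual coding scheme.

    def code_task_2003(processes_prose, processes_document):
        """Code a task on a mixed prose/document stimulus under the 2003
        NAAL rule: processing only the prose segment yields a prose task;
        processing both the prose and document segments yields a hybrid
        task. A task touching only the document segment is treated here
        as a document task (an assumption consistent with the rule)."""
        if processes_prose and processes_document:
            return "hybrid"
        if processes_prose:
            return "prose"
        return "document"

    # Example: a task requiring both segments is coded as hybrid.
    assert code_task_2003(True, True) == "hybrid"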
