NAEP assessments explore students' performance in a range of subject areas through the use of cognitive items. Cognitive items are designed to assess what students know and can do, and are based on the framework and specifications documents for each assessment subject. These types of items have traditionally included multiple-choice items, constructed-response items scored dichotomously, and constructed-response items scored polytomously.
The NAEP transition from paper-based to digitally based assessments administered on tablets has made it possible to introduce new item types, which are classified as either
selected response or constructed response. In a selected-response item, students read a question and are presented with a finite set of response options from which they choose one or more correct answers. This format includes items such as single-selection multiple choice (SSMC), multiple-selection multiple choice (MSMC), matching, grid, zone, in-line choice, and drag-and-drop items. Constructed-response items require some type of written response. Students select answers for selected-response items with the provided stylus or a finger and type their responses to constructed-response items with the attached keyboard.
The item-development steps for each subject area are as follows:
The National Assessment Governing Board (the Governing Board) provides content frameworks and item specifications in each subject area.
The instrument development committee in each subject area provides guidance to NAEP staff about how the objectives described in the framework can be measured given the constraints of resources and the feasibility of measurement technology. The committee makes recommendations about priorities for the assessment (within the context of the assessment framework) and the types of items to be developed.
Concept sketches for
scenario-based tasks (SBTs) are developed and submitted for NCES and Governing Board approval prior to item development.
Specialists with subject-matter expertise and experience in creating items according to specifications develop and review the assessment questions.
For a new stimulus, the copyright permission process begins at the start of the item development phase. If the stimulus (text, artwork, graphics, charts, diagrams, music, video, etc.) is new, a permission form is completed by the appropriate subject assessment specialist. The NAEP copyright coordinator reviews the permission request form, along with the provided attachments, and submits it to the appropriate office at least 12 weeks before the date by which permission must be received.
For a stimulus whose permission needs to be renewed, the process also begins at the start of the item development phase. The assessment specialist first identifies the items and blocks to be used and informs the NAEP copyright coordinator. The copyright coordinator reviews the existing permission and confirms that the permitted number of uses has not been exhausted and that the permission will not expire before the upcoming assessment.
NAEP test development staff and external test specialists review and revise the items and accompanying scoring guides.
Editorial and fairness reviews are conducted as required by NCES.
Pilot test materials are prepared, and those that require clearance are sent to the federal Office of Management and Budget (OMB). Contextual items are submitted to the OMB for clearance, while cognitive items do not require OMB approval. In addition, materials such as recruitment and communication documents that would be sent to the field (e.g., Facts for Teachers and Facts for Districts) may also be included in clearance packages.
A pilot test is conducted in many of the states and jurisdictions slated to participate in the next operational assessment.
Based on the pilot test analyses, items are selected for inclusion in the operational assessment.
Each subject-area instrument development committee approves the selection of items to include in the next operational assessment.
Each subject-area instrument is submitted to the Governing Board for approval.
Operational materials, namely the contextual items and documentation showing which contextual items were removed, added, or revised, are sent to the OMB to secure clearance.
The NAEP translation process typically begins after the English language versions of items or tasks have been finalized. The translation team prepares the initial translations, which then undergo review by bilingual panels of experts, including content experts, teachers in the content area at grades 4 and 8, and linguistic experts. Historically, NAEP does not provide Spanish translations at grade 12. In addition, an external vendor conducts a complete independent Translation Verification Review of all translated assessment materials. Thereafter, NCES reviews and approves the translations prior to certification.
There are two types of NAEP digitally based assessment bilingual accommodations. For grades 4 and 8 mathematics and science operational assessments, and for grade 8 civics and U.S. history operational assessments, NAEP offers an English-Spanish full bilingual version as an accommodation for English learners (mainland U.S. administration). Selected cognitive blocks and all directions screens, tutorial screens, help screens, toolbar and rollover text, dialog boxes, and student questionnaires are presented in English and Spanish. Students can toggle back and forth between languages. In addition, for grades 4 and 8 mathematics, reading, and science operational assessments, and for grade 8 civics and U.S. history operational assessments, NAEP offers an English-Spanish partial bilingual version as an accommodation for English learners (mainland U.S. administration). All directions screens, tutorial screens, help screens, toolbar and rollover text, dialog boxes, and student questionnaires are presented in English and Spanish. Students can toggle back and forth between languages, but cognitive content remains in English. Students are assigned either the full bilingual version or the partial bilingual version based on their accommodation needs, and both versions are administered in the same way.
For NAEP paper-based assessments, NAEP has offered English-Spanish full bilingual versions as an accommodation for English learners (mainland U.S. administration). Selected forms with cognitive blocks, directions, and student questionnaires are presented in English and Spanish on facing pages.
In addition, for the NAEP mathematics assessment in Puerto Rico, which is administered in Spanish, accommodations are offered to students identified as Spanish language learners (SLL).
After a final review, the booklets are printed or packaged as digital test forms for digital delivery.
NAEP makes testing accommodations available to English learners and to students with disabilities in order to ensure a fully representative sample of students across the nation. This is accomplished by incorporating
universal design elements into NAEP digitally based assessments, in addition to accommodations that are provided by the test delivery system. To learn more about the accommodations allowed on NAEP assessments, see
NAEP Accommodations Increase Inclusiveness. Other accommodations, such as a Braille version of a test or the presentation of the test in sign language, are provided outside of the test delivery system. See
Digitally Based Assessments UDE and Accommodation Descriptions for more information.
Each administration of the NAEP assessment requires a new configuration of the booklets or test forms given to students and of how they are distributed to schools. To allow for wide content coverage within the limited testing time for each student, the instrument configuration entails a three-step design process for the subject areas to be assessed:
In the first step, NAEP uses a focused balanced incomplete block (BIB) or partially balanced incomplete block (pBIB) design to assign blocks or groups of cognitive items to student booklets or test forms. The "focused" aspect of NAEP's booklet or test form design requires that each student answer questions from only one subject area. In a BIB design, the cognitive blocks are balanced; each cognitive block appears an equal number of times in every possible position. Each cognitive block is also paired with every other cognitive block in a test booklet or test form exactly the same number of times. In a pBIB design, cognitive blocks may not appear an equal number of times in each position, or may not be paired with every other cognitive block an equal number of times.
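These two balance conditions can be stated concretely: every block appears equally often in each position, and every pair of blocks is assigned together to the same number of booklets. The sketch below, which uses hypothetical block labels and a classic seven-block layout rather than an actual NAEP design, checks both conditions for a given list of booklets.

```python
from collections import Counter
from itertools import combinations

def check_bib_balance(booklets):
    """Check the two BIB balance conditions for a set of booklets.

    Each booklet is a tuple of cognitive-block labels in presentation order.
    Returns (position_balanced, pair_balanced).
    """
    blocks = {block for booklet in booklets for block in booklet}
    n_positions = len(booklets[0])

    # Condition 1: every block appears the same number of times in each position.
    position_counts = Counter(
        (block, pos) for booklet in booklets for pos, block in enumerate(booklet)
    )
    position_balanced = len({
        position_counts[(block, pos)]
        for block in blocks
        for pos in range(n_positions)
    }) == 1

    # Condition 2: every unordered pair of blocks appears together in the
    # same number of booklets.
    pair_counts = Counter(
        frozenset(pair) for booklet in booklets for pair in combinations(booklet, 2)
    )
    all_pairs = {frozenset(pair) for pair in combinations(blocks, 2)}
    pair_balanced = (
        set(pair_counts) == all_pairs and len(set(pair_counts.values())) == 1
    )

    return position_balanced, pair_balanced

# Hypothetical seven-block layout (blocks B1-B7, three blocks per booklet),
# built from the cyclic pattern (i, i+1, i+3) mod 7; not an actual NAEP design.
booklets = [
    ("B1", "B2", "B4"), ("B2", "B3", "B5"), ("B3", "B4", "B6"),
    ("B4", "B5", "B7"), ("B5", "B6", "B1"), ("B6", "B7", "B2"),
    ("B7", "B1", "B3"),
]
print(check_bib_balance(booklets))  # (True, True)
```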
Second, the
spiraling scheme is designed. Spiraling refers to interleaving booklets or test forms systematically so that when they are handed out in the specified order, any group of students will receive approximately the target proportions of different types of booklets or test forms.
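A minimal sketch of a spiraling scheme follows, assuming equal target proportions for each booklet type; for unequal targets, the repeating cycle would simply list some types more often. The booklet labels and session size are hypothetical.

```python
from itertools import cycle, islice

def spiral_sequence(booklet_types, n_students):
    """Interleave booklet types in a fixed repeating cycle.

    When booklets are handed out in this order, any contiguous group of
    students receives each type in approximately equal proportion.
    """
    return list(islice(cycle(booklet_types), n_students))

# Hypothetical booklet types for a focused, single-subject design.
types = ["Booklet-01", "Booklet-02", "Booklet-03", "Booklet-04"]

# A session of 30 students: each type is assigned 7 or 8 times,
# within one booklet of the 25 percent target proportion.
session = spiral_sequence(types, 30)
print({t: session.count(t) for t in types})
```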
The third step is the bundling design. In 2003, NAEP test developers introduced an enhanced bundling design, referred to as vertical bundling. Vertical bundling allows flexibility in bundle length and reduces the required number of different bundles, decreasing booklet wastage and improving the balance of within-session booklet or test form pairings.
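As a rough illustration of how spiraling and bundling interact, the sketch below assumes, as a simplification, that bundles are formed by cutting an already-spiraled sequence into fixed-length bundles, so that each bundle carries a cross-section of booklet types. The bundle size and labels are hypothetical and do not reproduce the actual NAEP vertical bundling specification.

```python
from itertools import cycle, islice

def bundle_spiral(spiraled_booklets, bundle_size):
    """Cut an already-spiraled booklet sequence into fixed-length bundles.

    Because the input sequence is spiraled, each bundle contains a mix of
    booklet types rather than many copies of a single type.
    """
    return [
        spiraled_booklets[i:i + bundle_size]
        for i in range(0, len(spiraled_booklets), bundle_size)
    ]

# Hypothetical example: four booklet types spiraled over 30 students,
# packed ten booklets per bundle.
types = ["Booklet-01", "Booklet-02", "Booklet-03", "Booklet-04"]
spiraled = list(islice(cycle(types), 30))
for i, bundle in enumerate(bundle_spiral(spiraled, bundle_size=10), start=1):
    print(f"Bundle {i}: {bundle}")
```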
Note: Until the 1984 assessment, NAEP was administered using matrix sampling and tape recorders; that is, by administering booklets of exercises using paced audio tapes that walked groups of students through the individual assessment exercises in a common booklet. In the 1984 assessment, a balanced incomplete block booklet design, which does not include audio tape pacing, was introduced, allowing for the administration of different booklets to students in the same session. The NAEP long-term trend assessments in mathematics/science continued to use the taped matrix sampling until the 2004 redesign of the long-term trend assessments.