Within each sampled school, a sample of students was selected from a list of students enrolled in the targeted grade, such that every student had an equal chance of selection. The student lists were submitted either electronically, using a system known as E-filing, or on hardcopy. In E-filing, student lists are submitted in Excel files by school coordinators, NAEP State Coordinators, or NAEP Trial Urban District Assessment (TUDA) Coordinators. The files can be submitted for one school at a time (known as single school E-file submission) or for more than one school at once (known as multiple school E-file submission). E-filing allows schools to easily submit student demographic data electronically with the student lists, easing the burden on NAEP field supervisors and school coordinators. Schools that are unable to submit their student lists using the E-filing system provide hardcopy lists to NAEP field supervisors. In 2017, for the state assessment samples and national assessment samples combined, 18,100 schools E-filed their student lists, while 900 lists were submitted on hardcopy.
In year-round, multi-track schools, students who were not scheduled to be in school on the assessment day were removed from the student lists prior to sampling. Student base weights were adjusted to account for these students.
The sampling process was the same, regardless of list submission type. The sampling process was systematic (e.g., if the sampling rate was one-half, a random starting point of one or two was chosen, and every other student on the list was selected). For E-filed schools only, where demographic data were submitted for every student on the frame, students were sorted by gender and race/ethnicity before the sample was selected to implicitly stratify the sample.
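The systematic selection described above can be sketched as follows. This is an illustrative implementation, not NAEP's production code; the record fields and function name are assumptions, and the frame is sorted by gender and race/ethnicity first to mimic the implicit stratification used for E-filed schools.

```python
import math
import random

def systematic_sample(frame, rate, seed=None):
    """Systematic selection: a random start, then a fixed skip interval.

    `frame` is the ordered student list; `rate` is the sampling rate
    (e.g., 0.5 keeps roughly every other student on the list).
    """
    rng = random.Random(seed)
    interval = 1.0 / rate
    position = rng.uniform(0, interval)   # random starting point
    sample = []
    while position < len(frame):
        sample.append(frame[math.floor(position)])
        position += interval
    return sample

# For E-filed schools, sorting the frame by gender and race/ethnicity
# before selection implicitly stratifies the sample.
frame = [{"id": i, "gender": ("F", "M")[i % 2], "race": "ABC"[i % 3]}
         for i in range(40)]
frame.sort(key=lambda s: (s["gender"], s["race"]))
sample = systematic_sample(frame, rate=0.5, seed=1)
```

Because the selection walks the sorted list at a fixed interval, the sample's composition automatically tracks the gender and race/ethnicity mix of the frame.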
In the certainty jurisdictions, all students were sampled in all schools. Otherwise, the sample size depended on whether the school was in a TUDA district, not involved in TUDA, or in Puerto Rico. For schools not involved in TUDA and not in Puerto Rico, the sample size was 62 students. In some cases, a larger school may have been selected with certainty during the school sample selection process, and thus may have a larger sample size. For schools in Puerto Rico, the sample size was 50 students in all schools.
In the six largest TUDA districts (New York, Los Angeles, Chicago, Miami-Dade, Clark County, and Houston), the student sample size was 66 students and for the remaining TUDA districts (except for the District of Columbia), the student sample size was 74. In the District of Columbia, the target sample size was the same for the TUDA schools as for the schools not involved in TUDA. If a TUDA school was selected with certainty during the school sample selection process, a larger student sample may have been selected.
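The target sample sizes above amount to a small lookup by school type. A sketch, with hypothetical labels (the category names are not NAEP field names, and certainty schools may receive larger samples than these targets):

```python
def target_student_sample_size(school_type):
    """Target student sample size by school type, per the rules above."""
    sizes = {
        "non_tuda": 62,          # not in TUDA, not in Puerto Rico
        "puerto_rico": 50,
        "tuda_largest_six": 66,  # New York, Los Angeles, Chicago,
                                 # Miami-Dade, Clark County, Houston
        "tuda_other": 74,        # remaining TUDA districts, except DC
        "tuda_dc": 62,           # DC TUDA: same target as non-TUDA schools
    }
    return sizes[school_type]
```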
Some students enrolled in the school after the sample was selected. In such cases, new enrollees were sampled at the same rate as the students on the original list.
After selection, the sampled students were first assigned to either a paper-based assessment (PBA) or digitally based assessment (DBA), depending on whether the school would be using one or both modes. Schools were randomly assigned to a single mode of assessment (i.e., all students taking PBA or all students taking DBA) when the number of students in the sampled grade was fewer than 21, except in Puerto Rico, where a single mode was assigned in schools with fewer than 20 students in the targeted grade. The single mode a school would use was determined during the school sample selection process: each school was initially assigned to PBA-only or DBA-only according to the designed sampling rates for the PBA and DBA assessments. If, at the time of student list submission, the number of students in the sampled grade was fewer than the threshold, all students were assigned to the mode designated for that school during this initialization.
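The single-mode decision reduces to a threshold check. A minimal sketch, assuming the rules above (the function and argument names are hypothetical; `initialized_mode` stands for the mode the school was pre-assigned during school sampling):

```python
def assign_school_mode(students_in_grade, initialized_mode, in_puerto_rico=False):
    """Decide whether a school runs a single assessment mode or both.

    `initialized_mode` is "PBA" or "DBA", set during school sampling.
    """
    threshold = 20 if in_puerto_rico else 21
    if students_in_grade < threshold:
        return initialized_mode   # too small to split: single mode
    return "BOTH"                 # split students across DBA and PBA

mode = assign_school_mode(18, "DBA")   # a small school keeps its pre-assigned mode
```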
In schools where the number of students exceeded the single-mode threshold, some sampled students were assigned to a DBA session and others were assigned to a PBA session. This assignment was complicated by the need to consider operational efficiency, while also meeting sampling requirements. The operational management of the DBA was based on administering the assessment in sessions containing 25 students apiece, in order to efficiently use the available equipment. To support this effort, the assignment of sampled students to DBA sessions was established first and all remaining sampled students were assigned to a PBA session.
In general, 50 sampled students in every non-certainty school (except in Puerto Rico) were to be assigned to the DBA (i.e., two sessions). However, in schools with fewer students than the target sample size, applying the designed rate of DBA assignment would, in some cases, result in DBA sessions with slightly more or slightly fewer than 25 students. Because these situations were deemed operationally inefficient, adjustments were made to the DBA assignment process to prevent them. In schools where slightly fewer than 25 students would have been assigned to a DBA session, that number was increased to 25, and the remaining students were assigned to PBA. Similarly, to maintain balance in the overall proportions of DBA and PBA students, in schools where slightly more than 25 students would have been assigned to a DBA session, that number was decreased to 25, and the remaining students were assigned to PBA. In certainty schools with larger student sample sizes, the same adjustments were made based on multiples of 25 rather than exactly 25. In Puerto Rico schools with both modes of assessment, if the number of students in the sampled grade was fewer than 50, half of the students were assigned to DBA and the other half to PBA; no adjustments were made to force the DBA assignment to exactly 25 students.
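The session-size adjustment amounts to snapping the designed DBA count to a multiple of 25. A simplified sketch: the text does not define exactly which counts qualify as "slightly" more or fewer, so plain rounding to the nearest multiple is an assumption here.

```python
def adjust_dba_assignment(designed_dba, session_size=25):
    """Snap a designed DBA count to a whole number of full sessions.

    Counts just below a multiple of 25 are raised, counts just above
    are lowered; students not assigned to DBA go to PBA instead.
    """
    sessions = max(round(designed_dba / session_size), 1)
    return sessions * session_size

# e.g., a designed count of 23 becomes one full session of 25,
# while 27 is trimmed to 25, with the remainder assigned to PBA.
```

In certainty schools with larger samples, the same rule yields multiples of 25 (e.g., a designed count of 48 becomes two full sessions of 50).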
The ratio of PBA to DBA assignment differed across jurisdictions in order to meet the analytic requirements of the two components of the assessment.
In all schools (except in Puerto Rico), sampled students were randomly assigned to mathematics, reading, and the special mathematics assessment. This process was implemented by spiraling: the booklets (or test forms, for DBA) assigned to sampled students were provided from booklet packets (sets, for DBA) that had, on average, the correct ratio of each of the relevant assessments in a randomized order. The percentage of booklets by subject within the spiral for each grade is given in the table below. For schools in Puerto Rico, only the special mathematics assessment was conducted.
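Spiraling can be sketched as building a packet from per-subject booklet counts and randomizing its order. The packet size and counts below are hypothetical, chosen only to approximate the grade 4 PBA ratios in the table below; they are not actual NAEP packet contents.

```python
import random

def build_spiral(booklet_counts, seed=None):
    """Assemble a randomized booklet packet (spiral) from subject counts.

    `booklet_counts` maps subject -> number of booklets per packet,
    chosen so the packet reflects the designed subject ratios.
    """
    rng = random.Random(seed)
    spiral = [subject
              for subject, n in booklet_counts.items()
              for _ in range(n)]
    rng.shuffle(spiral)   # randomize booklet order within the packet
    return spiral

# A hypothetical 52-booklet PBA packet approximating the grade 4 ratios
# (25/52 = 48.08% math, 25/52 = 48.08% reading, 2/52 = 3.85% special math):
packet = build_spiral({"math": 25, "reading": 25, "special_math": 2}, seed=7)
```

Assigning booklets to sampled students in packet order then yields, on average, the designed subject proportions in each school.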
Some sampled students who were English learners (EL) or students with disabilities (SD) were excluded from the assessment because they could not be assessed with the accommodations NAEP provides.
| Grade | DBA Math | DBA Reading | PBA Math | PBA Reading | PBA Special Mathematics Assessment |
|---|---|---|---|---|---|
| Grade 4 | 50.00 | 50.00 | 48.08 | 48.08 | 3.84 |
| Grade 8 | 50.63 | 49.37 | 46.89 | 49.18 | 3.93 |

SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2017 Assessment.