The National Center for Education Statistics (NCES) developed a methodology to show where states' Adequate Yearly Progress (AYP) standards fit on the NAEP scale. The methodology, described in Mapping 2005 State Proficiency Standards onto NAEP Scales, is based on mathematics and reading assessment data. It offers an approximate but credible indication of the relative stringency of the states' AYP standards.
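The core of such a mapping is equipercentile in spirit: a state's AYP standard is placed at the NAEP score that the same percentage of students reach as the state reports meeting its own standard. The sketch below illustrates that idea with simulated data; it is not NCES code, and the function and variable names are illustrative assumptions.

```python
import numpy as np

def naep_equivalent(naep_scores, pct_meeting_state_standard):
    """Return the NAEP scale score at which the share of the NAEP score
    distribution at or above that score equals the state's reported
    percentage meeting its AYP standard (equipercentile-style mapping)."""
    # If p percent meet the state standard, the standard maps to the
    # (100 - p)-th percentile of the NAEP score distribution.
    return float(np.percentile(naep_scores, 100 - pct_meeting_state_standard))

# Illustrative data: simulated NAEP-like scores for one state's grade 4 sample.
rng = np.random.default_rng(0)
scores = rng.normal(loc=238, scale=28, size=10_000)

# If the state reports 70 percent proficient, its standard maps to the
# 30th percentile of the NAEP distribution -- a relatively low cut score.
print(round(naep_equivalent(scores, 70.0), 1))
```

A higher reported percent proficient therefore maps to a lower NAEP equivalent score, which is what makes the mapped scores usable as an index of the relative stringency of state standards.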
While the mapped NAEP equivalent scores are useful in determining the relative rigor of state proficiency standards, the results of the study should be interpreted with caution. Variations among states can be due to many factors, including differences in assessment frameworks, test specifications, the psychometric properties of the tests, the definition of AYP standards, and the standard-setting process. At the request of the Education Information Management Advisory Consortium (EIMAC) of the Council of Chief State School Officers, NCES developed this profile with contextual factors to help readers interpret the mapping results. Each state's assessment and standards profile is based on information verified by the state's NAEP representative as accurate for the 2004-2005 school year.
Each profile describes the skills that students are required to perform at the AYP standard in each state's reading and mathematics testing program at grades 4 and 8. The description helps the reader understand how the skills required by states' AYP standards differ among the states and how they compare to those specified for NAEP proficiency. In addition, the profile includes data related to the NAEP equivalent score of each state's AYP percentage, the percentages of excluded students, and the types of accommodations allowed. The diagram below describes the information included in the profile.
Block 1: Appears as the first set of information in each state's profile for reading and mathematics, respectively. Describes the NAEP equivalent grades and subjects tested, performance standard development, substantive changes made to the test since 2003, and the skills assessed for the AYP standard.
Block 2: Appears on the following page or below Block 1 under the headings "2005 NAEP scale equivalent" and "Standard error." Includes the NAEP equivalent score. Some states may not have such data. Data may not have been available for the 2004-2005 school year for a number of reasons, including (1) the NAEP parallel grade was not tested by the state during the 2004-2005 academic year; (2) the NAEP parallel grade was tested, but data were not made public for those grades and subjects; (3) the NAEP parallel grade was tested, but the outcomes correspond to skills assessed in prior years (e.g., a fall grade 4 assessment that measures grade 3 proficiency); and (4) the NAEP parallel grade was tested, but the data were not used in the mapping study for methodological reasons. The criterion for including a state in the study was the validity of the placement of the state standard on the NAEP scale. Depending on the grade and subject, 32 to 36 states were included.
Block 3: Appears adjacent to Block 2 in each state's profile under the headings "2005 NAEP scale equivalent" and "Relative error." The mapping method can be applied to any set of numbers, regardless of whether they are meaningfully related. To ensure that scores are comparable, after determining the NAEP scale equivalent of each state standard, one computes the discrepancy between (a) the percentage meeting the standard reported by the state for each NAEP-participating school and (b) the percentage of students meeting the state standard estimated from NAEP data for that school. If the mapping were error-free, these would be in complete agreement; in practice, some discrepancies will arise from random variation. The discrepancy should not be noticeably larger than would be accounted for by simple random sampling variation. If it is noticeably larger than would be expected if NAEP and the state assessment were parallel tests, then the validity of the mapping is questionable; that is, the mapping appears to apply differently in some schools than in others. As a criterion for questioning the validity of the placement of the state standard on the NAEP scale, an index was developed to determine whether the discrepancies are large enough to indicate that the NAEP and state achievement scales have less than 50 percent of their variance in common. Values of 1.5 or higher on this relative error index indicate poor mapping of school-level results, and comparisons between NAEP and state assessments should be made with caution.
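One way to read the relative error is as a ratio: the observed variance of the school-level discrepancies divided by the variance that simple random sampling alone would produce. The sketch below illustrates that reading under a simple binomial-sampling assumption; it is not the exact NCES estimator, and all names are illustrative.

```python
import numpy as np

def relative_error(pct_reported, pct_estimated, n_tested):
    """Compare observed school-level discrepancies with what simple
    random sampling alone would produce (illustrative index).

    pct_reported  -- percent meeting the state standard, reported by the
                     state for each NAEP-participating school
    pct_estimated -- percent meeting the state standard, estimated from
                     NAEP data for the same school
    n_tested      -- number of students tested in each school
    """
    p_r = np.asarray(pct_reported) / 100.0
    p_e = np.asarray(pct_estimated) / 100.0
    n = np.asarray(n_tested, dtype=float)

    # Mean squared discrepancy actually observed across schools.
    observed = np.mean((p_r - p_e) ** 2)
    # Binomial sampling variance of a school-level proportion.
    expected = np.mean(p_e * (1 - p_e) / n)
    return observed / expected

# Discrepancies comparable to sampling noise give a ratio near 1;
# a ratio of 1.5 or higher would call the mapping into question.
print(relative_error([60, 55], [50, 50], [25, 25]))
```

Under this reading, a relative error near 1 means the school-level disagreement is about what sampling alone would produce, while values of 1.5 or more signal disagreement beyond sampling variation.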
Block 4: Appears adjacent to Block 3 in each state's profile under the headings "2005 NAEP scale equivalent," "Correlation between NAEP and state results," and "Unadjusted" or "Adjusted." The unadjusted correlation measures the degree of association between the percent of students scoring at the proficient level on the state assessment and on NAEP for each school in the sample. Several factors could influence the strength of this relationship: differences between the samples taking the assessments, the times at which the assessments were administered, and the definitions of the target skill could all affect the degree of association. The correlation between the percent of students meeting a high standard on one test and a low standard on the other is bound to be lower than the correlation between percents of students meeting standards of equal difficulty on the two tests. Correlations are also biased downward by schools with small enrollments, by the use of scores for an adjacent grade rather than the same grade, and by standards set near the extremes of a state's achievement distribution, among other reasons. The adjusted correlation is an estimate of what the correlations would have been had they all been based on scores on nonextreme standards in the same grade in schools with 30 or more students per grade.
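The unadjusted correlation is an ordinary Pearson correlation computed across schools. A brief sketch with made-up school-level percents follows; the adjustment for small schools, adjacent grades, and extreme standards is not reproduced here, and the data are purely illustrative.

```python
import numpy as np

def unadjusted_correlation(state_pct, naep_pct):
    """Pearson correlation between the percent of students proficient on
    the state assessment and on NAEP, computed across sampled schools."""
    return float(np.corrcoef(state_pct, naep_pct)[0, 1])

# Hypothetical school-level percents proficient (not real state data).
# The state percents are higher overall -- a less stringent standard --
# but the school-to-school pattern tracks NAEP closely.
state = [82, 74, 65, 90, 58, 71]
naep = [40, 35, 28, 47, 22, 33]
print(round(unadjusted_correlation(state, naep), 2))
```

Note that the correlation is unaffected by the overall difference in levels between the two tests; it captures only whether schools rank similarly on both.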
Block 5: Appears adjacent to Block 4 in each state's profile under the heading "2005 NAEP exclusion rates." NAEP has always endeavored to assess all students selected as part of its sampling process, including students classified by their schools as students with disabilities (SD) and/or as English-language learners (ELL, also referred to as limited English proficient, or LEP). School personnel decide whether to exclude any of these students. Some students may participate with testing accommodations.
Block 6: Appears as the final block of information in each state's profile for reading and mathematics, respectively, with the row heading "State accommodations not allowed on NAEP." The information pertaining to state accommodations not allowed on NAEP was compiled from separate tables listing state accommodations in Lazarus, Thurlow, Lail, Eisenbraun, and Kato (2006). The state accommodations (e.g., tape recorder, Braille) included in this profile are mostly self-explanatory; however, the definitions of some accommodations may not be intuitive for readers unfamiliar with testing procedures. For example, many states allow students to complete an assessment in a study carrel, a small three-sided cubicle or stall that allows students to take the exam in relative privacy. Additionally, some accommodations have specific definitions within a state or have definitions that allow for multiple interpretations. For example, according to Lazarus et al., a communication device is a piece of equipment that certain states allow a student to use when responding to assessment questions. Although the authors list a symbol board as an example, a communication device is an inclusive term that could refer to any type of equipment used to facilitate student responses. Finally, some accommodations listed in the following profile are allowed on NAEP under certain circumstances. For example, NAEP allows a calculator to be used for a subset of the tasks only. In the current profile, a calculator was included as an accommodation allowed by the state if it was nonstandard, allowed under certain circumstances, and/or allowed with implications for aggregation and scoring. Profile users can refer to Lazarus et al. for more information about the definitions of individual accommodations and the circumstances under which accommodations are allowed in each state.