The NAEP sampling and weighting staff establishes a quality control plan to meet the statistical standards of the National Center for Education Statistics (NCES) during the weighting process for each assessment. The simplified procedures introduced in 2003 reduced the chance that problems would occur and increased the opportunities for verifying that none had.
1. Comparison of the original school sample with the frame was favorable. Only one statistically significant difference was noted at the national level: the percentage of Black students enrolled in grade 8 public schools (the frame estimate was 16.51 percent; the school sample-based estimate was 17.05 percent). Further investigation of this statistic by state showed that only the difference for Idaho, 0.07 percent, was statistically significant.
2. Comparison of characteristics from the original public school sample and the participating public school sample showed no differences, a finding ascribed to the high response rates in the participating school sample. The same comparison for private schools showed that the responding sample reported more Black students at both the fourth and eighth grades.
In connection with the school nonresponse (NR) adjustment, a problem was detected in the imputation of achievement and income data for 11 schools. Further examination determined that the problem had no effect on the final NR adjustments.
3. The comparison of the participating school sample to the student sample is difficult to evaluate, because real differences exist in the data, particularly over time and in the percentage of Hispanic students enrolled. Investigations of these findings were conducted, including examination of student groups to determine whether the differences were due to new enrollees.
4. Comparison of the participating student sample to the full student sample found very small differences, attributable, for the most part, to sampling error. Because of the design of the weighting process, no differences were found in the percent of students excluded.
5. Comparison of the mathematics and reading samples found some differences, most of which were attributable to sampling error. In order to reduce clustering in future NAEP efforts, a revision in the booklet spiraling procedure was suggested.
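The frame-versus-sample percentage comparisons described in item 1 can be illustrated with a simple significance test. The sketch below uses a normal-approximation z-statistic with an assumed design effect; NAEP's production tests rely on design-based (jackknife) variance estimates, and the sample size and design effect shown here are hypothetical.

```python
from math import sqrt

def frame_vs_sample_z(p_frame, p_sample, n, deff=1.0):
    """z-statistic for a sample percentage against a fixed frame percentage.
    The variance is inflated by an assumed design effect (deff); NAEP's
    production tests use design-based (jackknife) variance estimates."""
    p0, p1 = p_frame / 100.0, p_sample / 100.0
    se = sqrt(p0 * (1.0 - p0) * deff / n)
    return (p1 - p0) / se

# Percentages from the report (Black grade 8 public school students);
# n and deff are hypothetical, for illustration only.
z = frame_vs_sample_z(16.51, 17.05, n=100_000, deff=2.0)
print(round(z, 2))  # 3.25 -> exceeds 1.96, significant at the 5 percent level
```

Under these assumed inputs the national difference is flagged as significant, consistent with the finding in item 1.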
Final participation, exclusion, and accommodation rates for grade 4 mathematics, grade 4 reading, grade 8 mathematics, and grade 8 reading were presented in quality control tables for each grade, subject, and jurisdiction. School rates were calculated as in previous assessments and according to NCES standards.
School rates were below 85 percent for certain kinds of private schools at both grades, and student rates were below 85 percent at grade 8 in the Cleveland and New York City TUDA jurisdictions. As required by NCES standards in such cases, a nonresponse bias analysis was completed.
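The school response rates referred to above are weighted rates: the base weights of participating eligible schools over the base weights of all eligible sampled schools. A minimal sketch of that calculation, assuming hypothetical field names and data:

```python
def weighted_response_rate(schools):
    """Weighted school response rate: base-weighted participating eligible
    schools over base-weighted eligible sampled schools. A simplified
    sketch of the NCES-style calculation; field names are hypothetical."""
    eligible = [s for s in schools if s["eligible"]]
    total = sum(s["base_weight"] for s in eligible)
    responded = sum(s["base_weight"] for s in eligible if s["participated"])
    return 100.0 * responded / total

# Illustrative data (hypothetical).
sample = [
    {"base_weight": 40.0, "eligible": True,  "participated": True},
    {"base_weight": 25.0, "eligible": True,  "participated": False},
    {"base_weight": 35.0, "eligible": True,  "participated": True},
    {"base_weight": 10.0, "eligible": False, "participated": False},
]
rate = weighted_response_rate(sample)
print(round(rate, 1))  # 75.0 -> below the 85 percent NCES threshold
```

A rate below 85 percent, as in this toy sample, is what triggers the nonresponse bias analysis described above.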
In 2002, all missing Title I data were imputed as "no," but in 2003 they were treated as "missing." Inconsistencies in the percentages between 2002 and 2003 were noted but remain unexplained: the true extent of missing data in 2002 is unknown, and it is unclear whether "missing" was more likely to mean "yes" or "no." The percentages also varied widely from state to state. At best, extreme caution is advised in using Title I as a trend variable from 2002 to 2003.
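The effect of the two conventions on the reported percentages can be sketched as follows. The function and data are hypothetical, intended only to show why the 2002 and 2003 Title I figures are not directly comparable.

```python
def pct_title_i(records, missing_as_no):
    """Percentage of schools flagged Title I under the two conventions:
    the 2002 convention imputed missing values as 'no'; the 2003
    convention left them missing (excluded from the denominator here).
    Records and values are hypothetical."""
    if missing_as_no:
        vals = ["no" if v is None else v for v in records]
    else:
        vals = [v for v in records if v is not None]
    return 100.0 * sum(v == "yes" for v in vals) / len(vals)

data = ["yes", "no", None, "yes", None, "no"]  # None = not reported
print(round(pct_title_i(data, missing_as_no=True), 1))   # 33.3
print(round(pct_title_i(data, missing_as_no=False), 1))  # 50.0
```

With identical underlying reports, the two conventions yield different percentages whenever any data are missing, which is why the 2002-to-2003 trend is unreliable.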
Some inconsistencies in the National School Lunch Program (NSLP) data between 2002 and 2003 have been noted. These inconsistencies appear to reflect a high degree of "status unascertained." Accordingly, it is suggested that the use of NSLP as a trend variable be limited to cases in which the percentage "not ascertained" does not exceed 10 percent in either year.
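The suggested screening rule can be stated as a simple predicate. The sketch below is hypothetical in its inputs, which would be jurisdiction-level "not ascertained" percentages for each year.

```python
def usable_for_nslp_trend(pct_na_2002, pct_na_2003, threshold=10.0):
    """Apply the suggested rule: use NSLP as a trend variable only where
    the 'not ascertained' percentage does not exceed the threshold in
    either year. Input percentages are hypothetical examples."""
    return pct_na_2002 <= threshold and pct_na_2003 <= threshold

print(usable_for_nslp_trend(4.2, 7.8))   # True  -> trend use acceptable
print(usable_for_nslp_trend(3.1, 12.5))  # False -> exceeds 10 percent in 2003
```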
Within states, many changes over time were found to be attributable to sampling error. A few differences appeared to be due to school NR bias in 2002. Otherwise, no problems were detected at the state level. However, at grade 8, a two percent increase at the national level was noted in the percentage of Black students. This appeared to trace back to the original school sample in both years.
Strong evidence that race/ethnicity codes had been confused led to the suggestion to the NAEP data analysis staff that school race/ethnicity data not be used for 14 schools, about 0.1 percent of the sample. No single state contained more than two of these schools.
A few "unusual" changes in the data between 2002 and 2003 were noted but were not attributable to inconsistencies in the codes on the Common Core of Data (CCD) for the two years; such inconsistencies were found to be quite rare. Some of these changes may be related to the large year-to-year changes in the school frames, with many schools added and many dropped. Some may be due to nonresponse bias in 2002. This issue is under further examination.
Public school response rates for 2003 held at a very high level; private school rates improved somewhat over the previous year but continued to lag outside of the Catholic and Lutheran schools. Overall, student response rates remained similar to those recorded in 2002. For a number of public schools, response rate differences between 2002 and 2003 were found to have been caused by school nonresponse bias in 2002, and the affected states were notified of this in 2002.
A nonresponse bias analysis has been undertaken for private schools at both grades, as well as for grade 8 students in two TUDA jurisdictions.
Reading exclusion rates were found to be much higher than those for mathematics (balanced by higher accommodation rates in mathematics). Nonetheless, with a few exceptions, reading exclusions were generally lower than in 2002. Some exclusion outliers were noted among TUDA jurisdictions.