
Quality Control in the Weighting Process


Internal Quality Control Checks

Additional Quality Control Checks

Use of Normit Data to Check Standard Errors and Design Effects

When several stages of adjustment are used to produce final weights, it is essential to check the weighting procedures carefully so that any errors in the preparation of weighting specifications or computer programs can be found and corrected. A series of procedures is used for quality control (QC) of the weighting process.

For the 2000 state assessments, the first stage of validity and consistency checking compared the materials contractor's student- and session-level data files with the data collection contractor's sample and field-processing files. The materials contractor's files are outcomes of the administration of the NAEP assessment; the data collection contractor's files were constructed before and during that administration. The edits resulted in various student-level and session-level updates. All of the checks were rerun until a final clean printout was obtained. Once the changes were final, new versions of the student and session files were created. The steps involved in this process were as follows.

  1. Read in the NAEP materials-processing staff's files.

  2. Note the output files and variables.

  3. Provide frequencies on the student-level records.

  4. Perform student-level consistency checks and corrections (e.g., checking the consistency of administration codes for SD/LEP students and the consistency of booklet numbers, especially for accommodated students).

  5. Perform subject-level checks and corrections.

  6. Check for consistency between subject-level counts and the actual number of student records.

  7. Impute student variables. The variables SEX, RACE, and month and year of birth were imputed using a random, within-cell hot-deck approach.

  8. Verify that all school-level disposition codes (response status codes) are final.

  9. Check for correct matching between the school and student files.

  10. Check for any schools whose students are all excluded.

  11. Create the subject-level disposition codes for mathematics and science.

  12. Verify that the student records received were for the assigned subject.

  13. Check for violations of the within-school sampling rules.

  14. Check for sampling weights less than 1.

  15. For schools that were assigned both subjects, verify that the numbers of students sampled per subject are approximately equal.

  16. Compare the sampled student count on the school file to the number of sampled students received from the materials-processing staff's files.

  17. Check year-round schools.

  18. On original school records, create the substitute's disposition code and the substitute's subject-level disposition codes.

  19. Verify that accommodated students appear only in schools assigned accommodations (sample type S3).

  20. Check that schools correctly dropped a subject.
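The random, within-cell hot-deck imputation in step 7 can be sketched as follows. This is a minimal illustration, not NAEP's production code: the record layout, field names, and the single imputation-cell key are assumptions, and the fixed seed is only for reproducibility. The idea is that each missing value is replaced by a value drawn at random from the observed (donor) values in the same imputation cell.

```python
import random
from collections import defaultdict

def hotdeck_impute(records, cell_keys, target):
    """Randomly impute missing values of `target` within imputation cells.

    records:   list of dicts (one per student); a missing value is None.
    cell_keys: field names whose values jointly define an imputation cell.
    target:    the field to impute (e.g., "SEX").
    """
    rng = random.Random(2000)  # fixed seed so the sketch is reproducible

    # Pool the observed (donor) values by imputation cell.
    donors = defaultdict(list)
    for r in records:
        if r[target] is not None:
            donors[tuple(r[k] for k in cell_keys)].append(r[target])

    # Replace each missing value with a random donor from the same cell.
    for r in records:
        if r[target] is None:
            pool = donors[tuple(r[k] for k in cell_keys)]
            if pool:  # leave the value missing if the cell has no donors
                r[target] = rng.choice(pool)
    return records
```

In practice the cells would be defined by several variables at once (for example, school and grade), so that donors resemble the records they fill in.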
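Two of the mechanical checks above (step 14, weights below 1, and step 16, comparing sampled-student counts against the school file) lend themselves to a simple validation pass. The sketch below assumes hypothetical field names (`ID`, `SCHOOL`, `SUBJECT`, `WT`) and is only an illustration of the logic, not the contractors' actual edit programs.

```python
def check_weights_and_counts(students, school_counts):
    """Flag base weights below 1 and per-school subject-count mismatches.

    students:      list of student dicts with illustrative fields
                   ID, SCHOOL, SUBJECT, and base weight WT.
    school_counts: expected sampled-student count keyed by
                   (school, subject), as recorded on the school file.
    Returns a list of (problem_type, detail) tuples for review.
    """
    problems = []

    # Step 14: a base weight below 1 implies a selection probability
    # greater than 1, which signals an error upstream.
    for s in students:
        if s["WT"] < 1:
            problems.append(("weight_lt_1", s["ID"]))

    # Step 16: tally students received per (school, subject) and
    # compare against the counts on the school file.
    observed = {}
    for s in students:
        key = (s["SCHOOL"], s["SUBJECT"])
        observed[key] = observed.get(key, 0) + 1
    for key, expected in school_counts.items():
        if observed.get(key, 0) != expected:
            problems.append(("count_mismatch", key))

    return problems
```

A run that returns an empty list corresponds to the "final clean printout" described above; any flagged tuples would prompt edits and a rerun of the checks.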

Last updated 13 August 2008 (KL)
