| Block | Item | Range of response codes | Sample size | Percent exact agreement | Cohen's Kappa | Intraclass correlation |
|---|---|---|---|---|---|---|
† The intraclass correlation is not reported for dichotomously scored items; Cohen's Kappa is not reported for polytomously scored items.
NOTE: Cohen's Kappa is a measure of reliability that is appropriate for items that are dichotomously scored. The intraclass correlation coefficient is most appropriate for items with more than two categories.
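The agreement statistics named in the NOTE can be illustrated with a minimal sketch; this is not NAEP's implementation, and the function name and sample scores are hypothetical. For dichotomously scored items, Cohen's Kappa is the observed agreement corrected for chance agreement: κ = (p_o − p_e) / (1 − p_e), where p_o is the proportion of exact agreement (the "Percent exact agreement" column) and p_e is the agreement expected from each rater's marginal score distribution.

```python
def cohens_kappa(rater1, rater2):
    # Cohen's Kappa for two raters scoring the same n responses.
    n = len(rater1)
    # p_o: observed proportion of exact agreement between the raters
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # p_e: chance agreement from each rater's marginal score proportions
    categories = set(rater1) | set(rater2)
    p_e = sum((rater1.count(c) / n) * (rater2.count(c) / n)
              for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: 10 dichotomous (0/1) scores from two raters.
# Exact agreement is 8/10 = 80%; kappa discounts the chance component.
r1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
r2 = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
print(round(cohens_kappa(r1, r2), 3))  # prints 0.583
```

Note that two raters can show high percent exact agreement yet a modest kappa when one score category dominates, which is why the table reports both statistics.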
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2004 Reading Long-Term Trend Assessment.