
Arts Interrater Agreement

A random subsample of the arts responses for each constructed-response item is scored by a second rater to obtain statistics on interrater agreement. For arts items, 25 percent of the total sample is scored a second time. The scoring supervisor uses this interrater agreement information to monitor the performance of all raters and to maintain uniform scoring across raters.
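The 25-percent random subsampling step described above can be sketched as follows. This is a minimal illustration only; the function name and parameters are hypothetical and do not reflect NAEP's actual scoring software.

```python
import random

def select_for_second_scoring(responses, fraction=0.25, seed=None):
    """Randomly select a fraction of responses for second scoring.

    responses: list of response identifiers for one item
    fraction:  share of the total sample to second-score (0.25 for arts)
    seed:      optional seed for reproducible selection
    """
    rng = random.Random(seed)
    k = round(len(responses) * fraction)
    return rng.sample(responses, k)

# Example: 100 responses to an item; 25 are drawn for second scoring.
subsample = select_for_second_scoring(list(range(100)), seed=1)
```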

Agreement reports are generated on demand by the scoring supervisor, trainer, scoring director, or item development subject-area coordinator. Printed copies are reviewed daily by the lead scoring staff. In addition to the immediate feedback provided by online agreement reports, each scoring supervisor can review the actual responses scored by a rater with the backreading tool. In this way, the scoring supervisor can monitor each rater closely and correct scoring difficulties almost immediately.

During the scoring of an item, scoring supervisors monitor progress using an interrater agreement tool. This display tool functions in either of two modes:

  • displaying information about all first scores versus all second scores, or

  • displaying all scores given by an individual rater versus the scores assigned by the other raters.

The information is displayed as a matrix with first scores in rows and second scores in columns (mode one), or with an individual rater's scores in rows and all other raters' scores in columns (mode two). Results may be reviewed for either individual raters or the team as a whole. In this format, instances of exact agreement fall along the diagonal of the matrix. For completeness, each cell of the matrix contains the number and percentage of cases of agreement (or disagreement). The display also shows the total number of second readings and the overall percentage of agreement on the item.

Since the interrater agreement reports are cumulative, a printed copy of the agreement report for each item is made periodically and compared to previously generated reports. Scoring staff members save printed copies of all final agreement reports and archive them with the training sets.
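The cross-tabulation described above (mode one: first scores in rows, second scores in columns, exact agreement on the diagonal) can be sketched as follows. This is a simplified illustration under assumed inputs; the function name and data layout are hypothetical, not NAEP's actual tool.

```python
from collections import Counter

def agreement_matrix(first_scores, second_scores, levels):
    """Cross-tabulate first vs. second scores for one item.

    Returns the cell counts (first score in rows, second score in
    columns), the total number of second readings, and the overall
    percentage of exact agreement (the diagonal of the matrix).
    """
    counts = Counter(zip(first_scores, second_scores))
    matrix = {(r, c): counts.get((r, c), 0) for r in levels for c in levels}
    total = len(first_scores)
    exact = sum(counts.get((s, s), 0) for s in levels)
    pct_exact = 100.0 * exact / total if total else 0.0
    return matrix, total, pct_exact

# Example with a 3-level rubric: 4 of 6 paired readings agree exactly.
first = [1, 2, 2, 3, 1, 2]
second = [1, 2, 3, 3, 1, 1]
matrix, total, pct = agreement_matrix(first, second, levels=[1, 2, 3])
# matrix[(2, 3)] == 1 (one first score of 2 rescored as 3); pct ≈ 66.7
```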

Links to scoring statistics for constructed-response items, arts assessment: 2008

Year and subject | Item-by-item rater agreement | Interrater agreement ranges | Number of constructed-response items
2008 Arts        | X                            | X                           | X
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2008 Arts Assessment.

Last updated 27 January 2010 (JL)
National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education