Scoring and Scoring Reliability

The TIMSS and TIMSS Advanced 2015 assessments included both multiple-choice and constructed-response items. A scoring rubric (guide) was created for every constructed-response item included in the TIMSS and TIMSS Advanced assessments. The rubrics were carefully written and reviewed by the national research coordinators of all participating education systems and by other experts as part of the field test of items, and were revised accordingly.

The national research coordinator in each participating education system was responsible for the scoring of data for that education system, following established guidelines. The United States national research coordinator is Dr. Stephen Provasnik, National Center for Education Statistics. The national research coordinator and additional staff from each education system attended scoring training sessions held by the International Study Center. These sessions focused on the scoring rubrics employed in TIMSS and TIMSS Advanced 2015 and gave participants extensive practice in scoring example items over several days. Information on within-country and cross-country agreement among scorers was collected and documented by the International Study Center. Information on scoring reliability for constructed-response scoring in TIMSS and TIMSS Advanced 2015 is provided in Chapter 11 of Methods and Procedures in TIMSS 2015 at http://timss.bc.edu/publications/timss/2015-methods/chapter-11.html and in Chapter 11 of Methods and Procedures in TIMSS Advanced 2015 at http://timssandpirls.bc.edu/publications/timss/2015-a-methods/chapter-11.html.
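Agreement statistics of this kind are typically based on having a portion of responses scored independently by two scorers and comparing the score codes they assign. The sketch below illustrates only the general calculation of an exact-agreement rate; the function, score codes, and data are hypothetical and are not part of the official TIMSS scoring-reliability procedures.

```python
# Illustrative sketch: percent exact agreement between two independent scorers
# on double-scored constructed-response items. Hypothetical data and function
# names; not the official TIMSS scoring-reliability computation.

def exact_agreement(scores_a, scores_b):
    """Return the percentage of responses given identical score codes."""
    if len(scores_a) != len(scores_b):
        raise ValueError("Both scorers must score the same set of responses.")
    matches = sum(1 for a, b in zip(scores_a, scores_b) if a == b)
    return 100.0 * matches / len(scores_a)

# Example: score codes assigned by two scorers to the same ten responses.
scorer_1 = [2, 1, 0, 2, 1, 1, 0, 2, 2, 1]
scorer_2 = [2, 1, 0, 1, 1, 1, 0, 2, 2, 0]

print(f"Exact agreement: {exact_agreement(scorer_1, scorer_2):.1f}%")  # 80.0%
```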

Data Entry and Cleaning

The national research coordinator from each participating education system was responsible for that education system's data entry. In the United States, Westat was contracted to collect the data for TIMSS 2015 and imported the data into data files in a common international format. This format was specified in the IEA Data Management Expert (DME) Manual (IEA Data Processing and Research Center 2014), which accompanied the IEA-supplied data management software (DME) provided to all participating education systems for creating data files. The software facilitated the checking and correction of data by providing various data consistency checks.
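Consistency checks of this kind are, in essence, rules that flag values falling outside the codes defined for each variable. The sketch below illustrates the general idea only; the variable names, valid codes, and records are hypothetical, and the actual checks are those built into the DME software and its codebooks.

```python
# Illustrative sketch of a range-style data consistency check of the kind a
# data entry system might apply. Variable names, valid codes, and the sample
# records are hypothetical; the actual checks are defined in the DME software.

VALID_CODES = {
    "student_sex": {1, 2, 9},            # e.g., 1 = female, 2 = male, 9 = omitted
    "books_in_home": {1, 2, 3, 4, 5, 9},
}

def check_record(record):
    """Return (variable, value) pairs whose values fall outside the valid codes."""
    problems = []
    for variable, valid_codes in VALID_CODES.items():
        value = record.get(variable)
        if value not in valid_codes:
            problems.append((variable, value))
    return problems

records = [
    {"student_sex": 1, "books_in_home": 3},
    {"student_sex": 7, "books_in_home": 6},   # both values out of range
]

for i, rec in enumerate(records):
    for variable, value in check_record(rec):
        print(f"Record {i}: {variable} has invalid value {value}")
```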

The data were then sent to the IEA Data Processing Center (DPC) in Hamburg, Germany, for further review and cleaning. The main purpose of this cleaning was to ensure that all information in the database conformed to the internationally defined data structure. It also ensured that the national adaptations to questionnaires were reflected appropriately in codebooks and documentation, and that all variables selected for international comparisons were comparable across education systems.

The DPC was responsible for checking the data files from each education system, applying standard cleaning rules to verify the accuracy and consistency of the data, and documenting electronically any deviations from the international file structure. Queries arising during this process were addressed to national research coordinators. In the United States, the national research coordinator, along with Westat, reviewed the cleaning reports and data almanacs and provided the DPC with assistance on data cleaning.

Using the assessment data, the DPC then compiled univariate background statistics and preliminary test scores based on classical item analysis and item response theory (IRT). Each education system was provided with its univariate and reliability statistics, along with data almanacs containing international univariate and item statistics, so that it could review them and confirm the validity of its data. Once any problems arising from this review were resolved, sampling weights produced by Statistics Canada and IRT-scaled student proficiency scores in mathematics and science were added to the data files.
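Classical item analysis summarizes each item with statistics such as its difficulty (the proportion of students answering correctly) and its discrimination (the correlation between the item score and the total score). The sketch below illustrates these two statistics on a small hypothetical response matrix; it is not the DPC's item-analysis software, and the numbers are invented for illustration.

```python
# Illustrative sketch of two classical item-analysis statistics: item
# difficulty (proportion correct) and item discrimination (point-biserial
# correlation between the item score and the total test score).
# Hypothetical response data; requires Python 3.10+ for statistics.correlation.
import statistics

# Rows = students, columns = items; 1 = correct, 0 = incorrect.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 1, 0],
]

totals = [sum(row) for row in responses]

for item in range(len(responses[0])):
    item_scores = [row[item] for row in responses]
    difficulty = sum(item_scores) / len(item_scores)
    # Point-biserial: Pearson correlation of the 0/1 item score with the total.
    discrimination = statistics.correlation(item_scores, totals)
    print(f"Item {item + 1}: difficulty = {difficulty:.2f}, "
          f"discrimination = {discrimination:.2f}")
```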

Detailed information on the entire data entry and cleaning process can be found in Chapter 10 of Methods and Procedures in TIMSS 2015 at http://timss.bc.edu/publications/timss/2015-methods/chapter-10.html and in Chapter 10 of Methods and Procedures in TIMSS Advanced 2015 at http://timssandpirls.bc.edu/publications/timss/2015-a-methods/chapter-10.html.
