Digest of Education Statistics: 2018

NCES 2020-009
December 2019

Appendix A.5. International Association for the Evaluation of Educational Achievement

The International Association for the Evaluation of Educational Achievement (IEA) is composed of governmental research centers and national research institutions around the world whose aim is to investigate education problems common among countries. Since its inception in 1958, the IEA has conducted more than 30 research studies of cross-national achievement. The regular cycle of studies encompasses learning in basic school subjects. Examples are the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS). IEA projects also include studies of particular interest to IEA members, such as the TIMSS 1999 Video Study of Mathematics and Science Teaching, the Civic Education Study, and studies on information technology in education.

The international bodies that coordinate international assessments vary in the labels they apply to participating education systems, most of which are countries. IEA differentiates between IEA members, which IEA refers to as “countries” in all cases, and “benchmarking participants.” IEA members include countries such as the United States and Ireland, as well as subnational entities such as England and Scotland (which are both part of the United Kingdom), the Flemish community of Belgium, and Hong Kong (a Special Administrative Region of China). IEA benchmarking participants are all subnational entities and include Canadian provinces, U.S. states, and Dubai in the United Arab Emirates (among others). Benchmarking participants, like the participating countries, are given the opportunity to assess the comparative international standing of their students’ achievement and to view their curriculum and instruction in an international context.

Some IEA studies, such as TIMSS and PIRLS, include an assessment portion, as well as contextual questionnaires for collecting information about students’ home and school experiences. The TIMSS and PIRLS scales, including the scale averages and standard deviations, are designed to remain constant from assessment to assessment so that education systems (including countries and subnational education systems) can compare their scores over time as well as compare their scores directly with the scores of other education systems. Although each scale was created to have a mean of 500 and a standard deviation of 100, the subject matter and the level of difficulty of items necessarily differ by grade, subject, and domain/dimension. Therefore, direct comparisons between scores across grades, subjects, and different domain/dimension types should not be made.
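The scale construction described above can be illustrated with a small sketch. The following Python snippet is a hypothetical, simplified illustration of standardizing a set of proficiency estimates to a reporting scale with mean 500 and standard deviation 100; the function name and input values are invented and do not reflect the operational IEA scaling procedure, which uses item response theory.

```python
# Illustrative only: linearly rescale raw proficiency estimates to a
# reporting metric with mean 500 and standard deviation 100, the metric on
# which the TIMSS and PIRLS scales were originally set. The raw values and
# function name are hypothetical, not the IEA's operational IRT scaling.
def to_reporting_scale(raw_scores):
    n = len(raw_scores)
    mean = sum(raw_scores) / n
    variance = sum((x - mean) ** 2 for x in raw_scores) / n
    sd = variance ** 0.5
    # Center at 500 and stretch so one raw SD equals 100 scale points.
    return [500 + 100 * (x - mean) / sd for x in raw_scores]

scaled = to_reporting_scale([-1.2, -0.3, 0.0, 0.4, 1.1])
```

Because the transformation is linear, the rescaled scores have exactly the target mean and standard deviation, while the rank ordering of students is unchanged.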

Further information on the International Association for the Evaluation of Educational Achievement may be obtained from https://www.iea.nl/.


Trends in International Mathematics and Science Study

The Trends in International Mathematics and Science Study (TIMSS, formerly known as the Third International Mathematics and Science Study) provides data on the mathematics and science achievement of U.S. 4th- and 8th-graders compared with that of their peers in other countries. TIMSS collects information through mathematics and science assessments and questionnaires. The questionnaires request information to help provide a context for student performance. They focus on such topics as students’ attitudes and beliefs about learning mathematics and science, what students do as part of their mathematics and science lessons, students’ completion of homework, and their lives both in and outside of school; teachers’ perceptions of their preparedness for teaching mathematics and science, teaching assignments, class size and organization, instructional content and practices, collaboration with other teachers, and participation in professional development activities; and principals’ viewpoints on policy and budget responsibilities, curriculum and instruction issues, and student behavior. The questionnaires also elicit information on the organization of schools and courses. The assessments and questionnaires are designed to specifications in a guiding framework. The TIMSS framework describes the mathematics and science content to be assessed and provides grade-specific objectives, an overview of the assessment design, and guidelines for item development.

TIMSS is on a 4-year cycle. Data collections occurred in 1995, 1999 (8th grade only), 2003, 2007, 2011, and 2015. TIMSS 2015 consisted of assessments in 4th-grade mathematics; numeracy (a less difficult version of 4th-grade mathematics, newly developed for 2015); 8th-grade mathematics; 4th-grade science; and 8th-grade science. Students in Bahrain, Indonesia, Iran, Kuwait, Jordan, Morocco, and South Africa, as well as in Buenos Aires, participated in the 4th-grade mathematics assessment through the numeracy assessment. In addition, TIMSS 2015 included the third administration of TIMSS Advanced since 1995. TIMSS Advanced is an international comparative study that measures the advanced mathematics and physics achievement of students in their final year of secondary school (the equivalent of 12th grade in the United States) who are taking or have taken advanced courses. The TIMSS 2015 survey also collected policy-relevant information about students, curriculum emphasis, technology use, and teacher preparation and training.


Progress in International Reading Literacy Study

The Progress in International Reading Literacy Study (PIRLS) provides data on the reading literacy of U.S. 4th-graders compared with that of their peers in other countries. PIRLS is on a 5-year cycle: PIRLS data collections have been conducted in 2001, 2006, 2011, and 2016. In 2016, a total of 58 education systems, including both IEA members and IEA benchmarking participants, participated in the survey. Sixteen of the education systems participating in PIRLS also participated in ePIRLS, an innovative, computer-based assessment of online reading designed to measure students’ approaches to informational reading in an online environment.

PIRLS collects information through a reading literacy assessment and questionnaires that help to provide a context for student performance. Questionnaires are administered to collect information about students’ home and school experiences in learning to read. A student questionnaire addresses students’ attitudes toward reading and their reading habits. In addition, questionnaires are given to students’ teachers and school principals in order to gather information about students’ school experiences in developing reading literacy. In countries other than the United States, a parent questionnaire is also administered. The assessments and questionnaires are designed to specifications in a guiding framework. The PIRLS framework describes the reading content to be assessed and provides objectives specific to 4th grade, an overview of the assessment design, and guidelines for item development.


TIMSS and PIRLS Sampling and Response Rates

2016 PIRLS

As in all participating countries and other education systems, a representative sample of students was selected in the United States. The sample design employed by PIRLS in 2016 is generally referred to as a two-stage stratified cluster sample. In the first stage of sampling, individual schools were selected with a probability proportionate to size (PPS) approach, meaning that each school's selection probability was proportional to its estimated number of students enrolled in the target grade. In the second stage of sampling, intact classrooms were selected within the sampled schools.
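The first-stage PPS selection can be sketched in a few lines of code. The snippet below is a simplified illustration of systematic PPS sampling from a school frame; the school identifiers and enrollment counts are invented, and the operational PIRLS design also stratifies the school frame before selection, which is omitted here for brevity.

```python
import random

# Illustrative sketch of first-stage probability-proportionate-to-size (PPS)
# school selection via systematic sampling. Schools and enrollments are
# hypothetical; the operational PIRLS design also stratifies the frame.
def pps_systematic_sample(schools, n_sample, seed=0):
    """schools: list of (school_id, estimated_target_grade_enrollment)."""
    total = sum(size for _, size in schools)
    interval = total / n_sample              # sampling interval in students
    rng = random.Random(seed)
    start = rng.uniform(0, interval)         # random start within interval
    points = [start + i * interval for i in range(n_sample)]
    selected, cum, idx = [], 0.0, 0
    for school_id, size in schools:
        cum += size                          # running cumulative enrollment
        # A school is hit once for each selection point falling in its range,
        # so larger schools have proportionally higher selection probability.
        while idx < n_sample and points[idx] < cum:
            selected.append(school_id)
            idx += 1
    return selected

# Hypothetical frame of 400 schools with varying grade enrollments.
schools = [(f"school_{i}", 20 + (i * 7) % 80) for i in range(400)]
sample = pps_systematic_sample(schools, n_sample=150)
```

Under this design, a school whose target-grade enrollment is twice as large as another's is twice as likely to be selected, which compensates for the second stage selecting a single intact classroom per school.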

PIRLS guidelines call for a minimum of 150 schools to be sampled, with a minimum of 4,000 students assessed. The basic sample design of one classroom per school was designed to yield a total sample of approximately 4,500 students per population. About 4,400 U.S. students participated in PIRLS in 2016, joining 319,000 other student participants around the world. Accommodations were not provided for students with disabilities or for students who were unable to read or speak the language of the test; these students were excluded from the sample. The IEA requirement is that the overall exclusion rate, which includes both school-level and student-level exclusions, should not exceed 5 percent of the national desired target population.

To minimize the potential for response bias, the IEA developed participation or response rate standards that apply to all participating education systems and govern whether an education system’s data are included in the TIMSS or PIRLS international datasets and the way in which its statistics are presented in the international reports. These standards were set using composites of response rates at the school, classroom, and student and teacher levels. Response rates were calculated with and without the inclusion of substitute schools that were selected to replace schools refusing to participate. In the 2016 PIRLS administered in the United States, the unweighted school response rate was 76 percent, and the weighted school response rate was 75 percent. All schools selected for PIRLS were also asked to participate in ePIRLS. The unweighted school response rate for ePIRLS in the final sample with replacement schools was 89.0 percent and the weighted response rate was 89.1 percent. The weighted and unweighted student response rates for PIRLS were both 94 percent. The weighted and unweighted student response rates for ePIRLS were both 90 percent.
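The distinction between weighted and unweighted response rates can be made concrete with a small arithmetic example. In the sketch below, each sampled school carries a base weight (its inverse selection probability); the weights and participation outcomes are invented for illustration and are not PIRLS data.

```python
# Hypothetical illustration of unweighted versus weighted school response
# rates. Base weights (inverse selection probabilities) and participation
# outcomes are invented, not actual PIRLS data.
def response_rates(schools):
    """schools: list of (base_weight, participated) tuples."""
    n = len(schools)
    # Unweighted rate: share of sampled schools that participated.
    unweighted = sum(1 for _, p in schools if p) / n
    # Weighted rate: participating schools' share of the total base weight,
    # so schools representing more of the population count for more.
    total_weight = sum(w for w, _ in schools)
    weighted = sum(w for w, p in schools if p) / total_weight
    return unweighted, weighted

demo = [(1200.0, True), (800.0, True), (500.0, False), (1500.0, True)]
unw, w = response_rates(demo)
# unweighted = 3/4 = 0.75; weighted = 3500/4000 = 0.875
```

The two rates diverge when nonresponding schools carry weights that differ from the average, which is why both are reported for PIRLS and TIMSS.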

2015 TIMSS and TIMSS Advanced

TIMSS 2015 was administered between March and May of 2015 in the United States. The U.S. sample was randomly selected and weighted to be representative of the nation. In order to reliably and accurately represent the performance of each country, international guidelines required that countries sample at least 150 schools and at least 4,000 students per grade (countries with small class sizes of fewer than 30 students per school were directed to consider sampling more schools, more classrooms per school, or both, to meet the minimum target of 4,000 tested students). In the United States, a total of 250 schools and 10,029 students participated in the grade 4 TIMSS survey, and 246 schools and 10,221 students participated in the grade 8 TIMSS (these figures do not include the participation of the state of Florida as a subnational education system, which was separate from and additional to its participation in the U.S. national sample).

TIMSS Advanced, also administered between March and May of 2015 in the United States, required participating countries and other education systems to draw probability samples of students in their final year of secondary school—ISCED Level 3—who were taking or had taken courses in advanced mathematics or who were taking or had taken courses in physics. International guidelines for TIMSS Advanced called for a minimum of 120 schools to be sampled, with a minimum of 3,600 students assessed per subject. In the United States, a total of 241 schools and 2,954 students participated in advanced mathematics, and 165 schools and 2,932 students participated in physics.

In TIMSS 2015, the weighted school response rate for the United States was 77 percent for grade 4 before the use of substitute schools (schools substituted for originally sampled schools that refused to participate) and 85 percent with the inclusion of substitute schools. For grade 8, the weighted school response rate before the use of substitute schools was 78 percent, and it was 84 percent with the inclusion of substitute schools. The weighted student response rate was 96 percent for grade 4 and 94 percent for grade 8.

In TIMSS Advanced 2015, the weighted school response rate for the United States for advanced mathematics was 72 percent before the use of substitute schools and 76 percent with the inclusion of substitute schools. The weighted school response rate for the United States for physics was 65 percent before the use of substitute schools and 68 percent with the inclusion of substitute schools. The weighted student response rate was 87 percent for advanced mathematics and 85 percent for physics. Student response rates are based on a combined total of students from both sampled and substitute schools.

Further information on the TIMSS study may be obtained from

Stephen Provasnik
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
(202) 245-6442
stephen.provasnik@ed.gov
https://nces.ed.gov/timss/
https://www.iea.nl/timss

Further information on the PIRLS study may be obtained from

Sheila Thompson
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
(202) 245-8330
sheila.thompson@ed.gov
https://nces.ed.gov/surveys/pirls/
https://www.iea.nl/pirls