The International Association for the Evaluation of Educational Achievement (IEA) is composed of governmental research centers and national research institutions around the world whose aim is to investigate education problems common among countries. Since its inception in 1958, the IEA has conducted more than 30 research studies of cross-national achievement. The regular cycle of studies encompasses learning in basic school subjects. Examples are the Trends in International Mathematics and Science Study (TIMSS) and the Progress in International Reading Literacy Study (PIRLS). IEA projects also include studies of particular interest to IEA members, such as the TIMSS 1999 Video Study of Mathematics and Science Teaching, the Civic Education Study, and studies on information technology in education.
The international bodies that coordinate international assessments vary in the labels they apply to participating education systems, most of which are countries. The IEA differentiates between its members, which it refers to as "countries" in all cases, and "benchmarking participants." IEA members include countries such as the United States and Ireland, as well as subnational entities such as England and Scotland (both part of the United Kingdom), the Flemish community of Belgium, and Hong Kong (a Special Administrative Region of China). IEA benchmarking participants are all subnational entities and include Canadian provinces, U.S. states, and Dubai in the United Arab Emirates, among others. Benchmarking participants, like the participating countries, are given the opportunity to assess the comparative international standing of their students' achievement and to view their curriculum and instruction in an international context.
Some IEA studies, such as TIMSS and PIRLS, include an assessment portion, as well as contextual questionnaires for collecting information about students' home and school experiences. The TIMSS and PIRLS scales, including the scale averages and standard deviations, are designed to remain constant from assessment to assessment so that education systems (including countries and subnational education systems) can compare their scores over time as well as compare their scores directly with the scores of other education systems. Although each scale was created to have a mean of 500 and a standard deviation of 100, the subject matter and the level of difficulty of items necessarily differ by grade, subject, and domain/dimension. Therefore, direct comparisons between scores across grades, subjects, and different domain/dimension types should not be made.
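Because the reporting metric is a fixed linear scale, the scale-setting step can be illustrated with a short sketch. This shows only the final linear transformation onto a mean of 500 and a standard deviation of 100; the operational TIMSS and PIRLS scaling uses item response theory models, and the score values below are invented.

```python
# Illustration only: placing proficiency estimates onto a reporting
# metric with mean 500 and standard deviation 100. The operational
# TIMSS/PIRLS scaling is IRT-based; this shows just the final linear
# transformation, with invented input values.
import statistics

def to_reporting_metric(thetas, target_mean=500.0, target_sd=100.0):
    """Linearly transform scores to the target mean and SD."""
    mean = statistics.mean(thetas)
    sd = statistics.pstdev(thetas)
    return [target_mean + target_sd * (t - mean) / sd for t in thetas]

scores = to_reporting_metric([-1.2, -0.3, 0.0, 0.4, 1.1])
print(round(statistics.mean(scores)))    # 500
print(round(statistics.pstdev(scores)))  # 100
```

Because the transformation is linear, relative standings are preserved; only the origin and unit of the metric change, which is why cross-grade or cross-subject scores built on different item pools remain non-comparable.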
Further information on the International Association for the Evaluation of Educational Achievement may be obtained from https://www.iea.nl/.
Trends in International Mathematics and Science Study
The Trends in International Mathematics and Science Study (TIMSS, formerly known as the Third International Mathematics and Science Study) provides data on the mathematics and science achievement of U.S. 4th- and 8th-graders compared with that of their peers in other countries. TIMSS collects information through mathematics and science assessments and questionnaires. The questionnaires request information to help provide a context for student performance. They focus on such topics as students' attitudes and beliefs about learning mathematics and science, what students do as part of their mathematics and science lessons, students' completion of homework, and their lives both in and outside of school; teachers' perceptions of their preparedness for teaching mathematics and science, teaching assignments, class size and organization, instructional content and practices, collaboration with other teachers, and participation in professional development activities; and principals' viewpoints on policy and budget responsibilities, curriculum and instruction issues, and student behavior. The questionnaires also elicit information on the organization of schools and courses. The assessments and questionnaires are designed to specifications in a guiding framework. The TIMSS framework describes the mathematics and science content to be assessed and provides grade-specific objectives, an overview of the assessment design, and guidelines for item development.
TIMSS is on a 4-year cycle. Data collections occurred in 1995, 1999 (8th grade only), 2003, 2007, 2011, 2015, and 2019. TIMSS 2015 consisted of assessments in 4th-grade mathematics; numeracy (a less difficult version of 4th-grade mathematics, newly developed for 2015); 8th-grade mathematics; 4th-grade science; and 8th-grade science. Students in Bahrain, Indonesia, Iran, Jordan, Kuwait, Morocco, and South Africa, as well as students in Buenos Aires, participated in the 4th-grade mathematics assessment through the numeracy assessment. In addition, TIMSS 2015 included the third administration of TIMSS Advanced since 1995. TIMSS Advanced is an international comparative study that measures the advanced mathematics and physics achievement of students in their final year of secondary school (the equivalent of 12th grade in the United States) who are taking or have taken advanced courses. The TIMSS 2015 survey also collected policy-relevant information about students, curriculum emphasis, technology use, and teacher preparation and training.
In TIMSS 2019, mathematics and science assessments and related questionnaires were administered in 64 education systems at the 4th-grade level and 46 education systems at the 8th-grade level. The 2019 assessment introduced eTIMSS, a digital version of TIMSS designed for computer- and tablet-based administration. Approximately half of the participating education systems—including the United States—elected to administer eTIMSS, and the remainder administered the assessment in the traditional paper-and-pencil format (paperTIMSS).
Countries participating in eTIMSS also administered paperTIMSS to a smaller "bridge" sample of students to evaluate mode effects and to link the two versions of the TIMSS assessment. Each eTIMSS country was required to administer paperTIMSS booklets containing the TIMSS 2015 trend assessment blocks to an additional sample of 1,500 tested students. This bridge sample could be obtained by selecting one additional class from a subset of the sampled schools, by selecting a distinct sample of schools, or by a combination of both strategies; the United States used a combination of both. The bridge study enabled the eTIMSS and paperTIMSS achievement results to be reported on the same achievement scale in each grade and subject.
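The bridge design amounts to a common-population linking: equivalent samples provide score distributions in both modes, which can then be aligned. As an illustration, the sketch below uses simple mean-sigma linear linking; the operational TIMSS linking is IRT-based, and all score values here are invented.

```python
# A minimal sketch of mean-sigma linear linking, one standard way to
# place two test modes on a common scale when equivalent bridge samples
# take both. The operational TIMSS linking uses IRT models; the score
# values below are invented for illustration.
import statistics

def mean_sigma_link(bridge_paper, bridge_digital):
    """Return (a, b) so that a * digital_score + b is on the paper metric."""
    a = statistics.pstdev(bridge_paper) / statistics.pstdev(bridge_digital)
    b = statistics.mean(bridge_paper) - a * statistics.mean(bridge_digital)
    return a, b

paper = [480, 510, 530, 560, 500]
digital = [470, 495, 520, 545, 490]
a, b = mean_sigma_link(paper, digital)
linked = [a * x + b for x in digital]
# After linking, the digital scores match the paper mean and SD.
print(round(statistics.mean(linked)), round(statistics.pstdev(linked)))  # 516 27
```

By construction, the linked digital distribution reproduces the mean and standard deviation of the paper bridge distribution, so results from both modes can be reported on one scale.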
TIMSS 2019 was administered between April and May of 2019 in the United States. The U.S. sample was randomly selected and weighted to be representative of the nation. In order to reliably and accurately represent the performance of each country, international guidelines required that countries sample at least 150 schools and at least 4,000 students per grade (countries with small class sizes of fewer than 30 students per school were directed to consider sampling more schools, more classrooms per school, or both, to meet the minimum target of 4,000 tested students). In the United States, 287 schools and 8,776 students participated at the 4th-grade level, and 273 schools and 8,698 students at the 8th-grade level. The weighted school participation rate for the United States was 76 percent for grade 4 before the inclusion of replacement schools and 88 percent after the inclusion of replacement schools. For grade 8, the weighted school participation rate for the United States was 72 percent before the inclusion of replacement schools and 85 percent after the inclusion of replacement schools. The weighted student participation rate was 96 percent for grade 4 and 94 percent for grade 8.
Progress in International Reading Literacy Study
The Progress in International Reading Literacy Study (PIRLS) provides data on the reading literacy of U.S. 4th-graders compared with that of their peers in other countries. PIRLS is on a 5-year cycle: PIRLS data collections have been conducted in 2001, 2006, 2011, and 2016. In 2016, a total of 58 education systems, including both IEA members and IEA benchmarking participants, participated in the survey. Sixteen of the education systems participating in PIRLS also participated in ePIRLS, an innovative, computer-based assessment of online reading designed to measure students' approaches to informational reading in an online environment.
PIRLS collects information through a reading literacy assessment and questionnaires that help to provide a context for student performance. Questionnaires are administered to collect information about students' home and school experiences in learning to read. A student questionnaire addresses students' attitudes toward reading and their reading habits. In addition, questionnaires are given to students' teachers and school principals in order to gather information about students' school experiences in developing reading literacy. In countries other than the United States, a parent questionnaire is also administered. The assessments and questionnaires are designed to specifications in a guiding framework. The PIRLS framework describes the reading content to be assessed and provides objectives specific to 4th grade, an overview of the assessment design, and guidelines for item development.
In PIRLS 2016, representative samples of students in the United States were selected in the manner used in all participating countries and other education systems. The sample design is a two-stage stratified cluster sample. In the first stage, individual schools were selected with probability proportional to size (PPS), meaning that a school's probability of selection is proportional to its estimated enrollment in the target grade. In the second stage, intact classrooms were selected within the sampled schools.
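The first-stage PPS selection described above is commonly implemented as systematic sampling from a cumulative enrollment list. The sketch below is a minimal, generic implementation of that technique; the school names and enrollment figures are invented, and the operational PIRLS design adds stratification not shown here.

```python
# A minimal sketch of first-stage PPS (probability proportional to
# size) school selection via systematic sampling from a cumulative
# enrollment list. School names and enrollments are invented; the
# operational PIRLS design also stratifies the school frame.
import random

def pps_systematic_sample(schools, n):
    """Select n schools with probability proportional to enrollment.

    schools: list of (name, enrollment) pairs. A school larger than
    the sampling interval can be selected more than once.
    """
    total = sum(size for _, size in schools)
    interval = total / n
    start = random.uniform(0, interval)
    selection_points = [start + k * interval for k in range(n)]
    chosen, cumulative, i = [], 0.0, 0
    for name, size in schools:
        cumulative += size
        while i < n and selection_points[i] <= cumulative:
            chosen.append(name)
            i += 1
    return chosen

schools = [("School A", 120), ("School B", 45), ("School C", 300),
           ("School D", 80), ("School E", 200)]
print(pps_systematic_sample(schools, 2))
```

Larger schools occupy longer segments of the cumulative list, so they are proportionally more likely to contain a selection point, which is exactly the PPS property described in the text.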
PIRLS guidelines call for a minimum of 150 schools to be sampled, with a minimum of 4,000 students assessed. The basic sample design of one classroom per school was designed to yield a total sample of approximately 4,500 students per population. About 4,400 U.S. students participated in PIRLS in 2016, joining 319,000 other student participants around the world. Accommodations were not provided for students with disabilities or for students who were unable to read or speak the language of the test; these students were excluded from the sample. The IEA requires that the overall exclusion rate, which includes both school-level and student-level exclusions, not exceed 5 percent of the national desired target population.
In order to minimize the potential for response biases, the IEA has developed participation or response rate standards that apply to all participating education systems. The standards govern whether an education system's data are included in the PIRLS international datasets and how they are presented in the international reports, if they are included. These standards were set using composites of response rates at the school, classroom, and student and teacher levels. Response rates were calculated with and without the inclusion of substitute schools that were selected to replace schools refusing to participate. In the 2016 PIRLS administered in the United States, the unweighted school response rate was 76 percent, and the weighted school response rate was 75 percent. All schools selected for PIRLS were also asked to participate in ePIRLS. The unweighted school response rate for ePIRLS in the final sample with replacement schools was 89.0 percent and the weighted response rate was 89.1 percent. The weighted and unweighted student response rates for PIRLS were both 94 percent. The weighted and unweighted student response rates for ePIRLS were both 90 percent.
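The distinction between weighted and unweighted response rates can be made concrete with a small sketch: in a weighted rate, each school counts in proportion to the student population it represents (its sampling weight) rather than as a single unit. The weights and response statuses below are invented for illustration.

```python
# Illustration of weighted vs. unweighted school response rates.
# Each school carries a base weight (the population it represents);
# the weights and response statuses here are invented.
def response_rates(schools):
    """schools: list of (base_weight, responded) pairs."""
    unweighted = sum(1 for _, r in schools if r) / len(schools)
    weighted = (sum(w for w, r in schools if r)
                / sum(w for w, _ in schools))
    return unweighted, weighted

sample = [(400, True), (250, True), (100, False), (250, True)]
unw, wtd = response_rates(sample)
print(f"unweighted: {unw:.0%}, weighted: {wtd:.0%}")  # unweighted: 75%, weighted: 90%
```

Here the nonresponding school has a small weight, so the weighted rate exceeds the unweighted one; when larger-weight schools refuse, the weighted rate falls below it, which is why both figures are reported.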
Further information on the TIMSS study may be obtained from
Lydia Malley
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
(202) 245-7266
lydia.malley@ed.gov
https://nces.ed.gov/timss/
https://www.iea.nl/studies/iea/timss
Further information on the PIRLS study may be obtained from
Cristobal de Brey
International Assessment Branch
Assessments Division
National Center for Education Statistics
550 12th Street SW
Washington, DC 20202
(202) 245-8330
cristobal.debrey@ed.gov
https://nces.ed.gov/surveys/pirls/
https://www.iea.nl/studies/iea/pirls