
Frequently Asked Questions

In addition to the following questions about TIMSS, more FAQs about international assessments are available at: http://nces.ed.gov/surveys/international/faqs.asp.

The National Center for Education Statistics (NCES), part of the U.S. Department of Education, is responsible for conducting TIMSS and TIMSS Advanced in the United States and for representing the United States in international collaboration on these assessments.

The International Association for the Evaluation of Educational Achievement (IEA) coordinates TIMSS and TIMSS Advanced internationally. The IEA is an independent international cooperative of national research institutions and government agencies with nearly 70 member countries worldwide. The IEA has a permanent secretariat based in Amsterdam and another office in Hamburg.

TIMSS is directed by the IEA's TIMSS & PIRLS International Study Center at Boston College, in close cooperation with IEA Amsterdam, IEA Hamburg, and Statistics Canada. The TIMSS & PIRLS International Study Center works with country representatives, called National Research Coordinators (NRCs), to design and implement TIMSS and TIMSS Advanced, assure quality control and international comparability, and report results. The U.S. NRC is Lydia Malley of NCES.

In TIMSS 2019, staff from Statistics Canada and IEA Hamburg worked with NRCs on all phases of sampling activities to ensure compliance with sampling and participation requirements. IEA Amsterdam worked with the TIMSS & PIRLS International Study Center to ensure the comparability of translations of the assessment items and questionnaires and to conduct an international quality assurance program of school visits to monitor and report on the administration of the assessment. IEA Hamburg staff worked closely with NRCs during the project to organize data collection operations and to check all data for accuracy and consistency within and across countries.

Data collection for TIMSS 2019 within the United States was conducted under contract with Westat, Inc.

Schools cannot sign up to participate in these assessments as part of the national U.S. sample. It is important for fair comparisons across education systems that each country only includes in its national sample those schools and students that are scientifically sampled by the international contractor to fairly represent the country. Moreover, given the design of TIMSS, no individual school scores can be calculated.

TIMSS requires participating education systems to draw probability samples of students who are nearing the end of their 4th or 8th year of formal schooling, counting from their 1st year of primary schooling. For TIMSS in the United States, one sample is drawn to represent the nation at grade 4 and another at grade 8. The U.S. national sample includes both public and private schools, randomly selected and weighted to be representative of the nation at grade 4 and at grade 8. Specifically, the study utilizes a two-stage stratified cluster sampling design. The first stage makes use of a systematic probability-proportionate-to-size (PPS) technique to select schools. The second stage of sampling consists of selecting classrooms within sampled schools. At the classroom level, TIMSS samples intact classes that are available to students in the target grades. In the United States, two classrooms were selected per school, where feasible; in U.S. schools containing only one class, this class was selected.
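
The following minimal sketch illustrates the general idea of first-stage systematic PPS selection under simplified, hypothetical assumptions (a made-up school frame with enrollment counts as the size measure); it is not the actual TIMSS sampling program, which also stratifies the school frame and then samples classrooms within selected schools.

```python
import random

def systematic_pps_sample(schools, n_sample):
    """Select schools with probability proportional to size (PPS) by
    walking a fixed interval along the cumulative size measure.
    `schools` is a list of (school_id, size) pairs; a school whose size
    exceeds the interval can be selected more than once."""
    total_size = sum(size for _, size in schools)
    interval = total_size / n_sample           # sampling interval
    start = random.uniform(0, interval)        # random starting point
    selection_points = [start + k * interval for k in range(n_sample)]

    selected, cumulative = [], 0
    points = iter(selection_points)
    point = next(points, None)
    for school_id, size in schools:
        cumulative += size
        while point is not None and point <= cumulative:
            selected.append(school_id)
            point = next(points, None)
    return selected

# Hypothetical frame: four schools with grade-4 enrollment counts
frame = [("A", 120), ("B", 300), ("C", 60), ("D", 220)]
print(systematic_pps_sample(frame, 2))
```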

TIMSS Advanced requires participating education systems to draw probability samples of students in their final year of secondary school who are taking or have taken courses in advanced mathematics or physics. For TIMSS Advanced 2015 (the most recent cycle of TIMSS Advanced), two samples of U.S. 12th-graders were drawn to represent the nation: one for advanced mathematics and one for physics. The U.S. national samples included both public and private schools, randomly selected and weighted to be representative of the nation's advanced mathematics and physics students at the end of high school. Specifically, the study utilized a two-stage stratified cluster sampling design. The first stage made use of a systematic PPS technique to select schools. The second stage of sampling consisted of selecting students, rather than classrooms, within sampled schools.

At grade 4

Assessment year    Number of participating schools    Number of participating students    Overall weighted participation rate (percent)
1995               182                                7,296                               80
2003               248                                9,829                               78
2007               257                                7,896                               84
2011               369                                12,569                              80
2015               250                                10,029                              81
2019               287                                8,776                               84

At grade 8

Assessment year    Number of participating schools    Number of participating students    Overall weighted participation rate (percent)
1995               183                                7,087                               78
1999               221                                9,072                               85
2003               232                                8,912                               73
2007               239                                7,377                               77
2011               501                                10,477                              81
2015               246                                10,221                              78
2019               273                                8,698                               79

At grade 12

Assessment year            Number of participating schools    Number of participating students    Overall weighted participation rate (percent)
1995
   Advanced mathematics    199                                2,349                               67
   Physics                 203                                2,678                               68
2015
   Advanced mathematics    241                                2,954                               66
   Physics                 165                                2,932                               58

NOTE: The overall weighted participation rate is the product of the school participation rate, after replacement, and the student participation rate, after replacement. There was no grade 4 assessment in 1999.
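
As an illustration of the note's formula (using made-up rates rather than actual TIMSS figures), a weighted school participation rate of 93 percent combined with a weighted student participation rate of 90 percent would yield an overall weighted participation rate of roughly 0.93 × 0.90 ≈ 0.84, or 84 percent.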

There have been changes in the participation of education systems across TIMSS cycles. To make these differences easy to compare, the NCES TIMSS website provides a table of all TIMSS participating countries and subnational education systems for each of the TIMSS years of administration. Please follow this link to view the table.

For a table of all participating countries and subnational education systems for each of the TIMSS Advanced years of administration, please follow this link.

Because the makeup of the participating countries changes with every administration, TIMSS uses the TIMSS scale centerpoint, rather than an international average, for consistent comparisons over time.

The TIMSS achievement scales were established in TIMSS 1995 based on the achievement distribution across all participating countries, treating each country equally. At each grade level, the scale centerpoint of 500 was set to correspond to the 1995 mean of the overall achievement distribution, and 100 points on the scale was set to correspond to the standard deviation. Achievement data from subsequent TIMSS assessment cycles were linked to these scales so that increases or decreases in average achievement may be monitored across assessments. TIMSS uses the scale centerpoint as a point of reference that remains constant from assessment to assessment.
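
As a rough illustration only (the operational scaling relies on item response theory and plausible values rather than a simple linear rescaling), anchoring a reporting scale to the 1995 distribution amounts to a transformation of the following form, where the mean and standard deviation are those of the combined 1995 achievement distribution:

```python
def to_timss_scale(theta, mean_1995, sd_1995):
    """Map a proficiency estimate onto a reporting scale whose centerpoint
    (500) and spread (100 points per standard deviation) are anchored to
    the 1995 distribution. Simplified sketch, not the operational method."""
    return 500 + 100 * (theta - mean_1995) / sd_1995

# Hypothetical value: a proficiency 0.25 standard deviations above the 1995 mean
print(to_timss_scale(0.25, mean_1995=0.0, sd_1995=1.0))  # 525.0
```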

At grade 4, TIMSS focuses on three domains of mathematics:
   Number (manipulating whole numbers and place values; performing addition, subtraction, multiplication, and division; and using fractions and decimals),
   Geometric shapes and measures, and
   Data display.

At grade 4, TIMSS focuses on three domains of science:
   Life science,
   Physical science, and
   Earth science.

At grade 8, TIMSS focuses on four domains of mathematics:
   Number,
   Algebra,
   Geometry, and
   Data and chance.

At grade 8, TIMSS focuses on four domains of science:
   Biology,
   Chemistry,
   Physics, and
   Earth science.

At grade 12, TIMSS Advanced focuses on three domains of advanced mathematics:
   Algebra,
   Calculus, and
   Geometry.

At grade 12, TIMSS Advanced focuses on three domains of physics:
   Mechanics and thermodynamics,
   Electricity and magnetism, and
   Wave phenomena and atomic/nuclear physics.

The results from TIMSS and PISA are difficult to compare because the assessments are different in at least three key ways that could influence the results. First, TIMSS assesses 4th- and 8th-graders, while PISA is an assessment of 15-year-old students, regardless of grade level. (In the United States, PISA data collection occurs in the autumn, when most 15-year-olds are in 10th grade.) Thus, the grade levels of students in PISA and TIMSS differ. Second, the knowledge and skills measured in the two assessments differ. TIMSS is intended to measure how well students have learned the mathematics and science curricula in participating countries, whereas PISA is focused on the application of knowledge to “real-world” situations. Third, the participating countries in the two assessments differ. Both assessments cover much of the world, but they do not overlap neatly. Both assessments include key economic competitors and partners, but the overall makeups of the countries participating in the two assessments differ markedly. Thus, the “averages” used by the two assessments are in no way comparable, and the “rankings” often reported in media coverage of these two assessments are based on completely different sets of countries.

To learn more about how the TIMSS assessment differs from PISA as well as NAEP, see the following paper: Comparing TIMSS with NAEP and PISA in Mathematics and Science (2007) (281 KB).

The results from TIMSS Advanced and PISA are difficult to compare because the assessments are different in ways similar to the differences between TIMSS and PISA. First, TIMSS Advanced and PISA assess two different student populations. TIMSS Advanced assesses students in their final year of secondary school (grade 12 in the United States) who were taking or had taken courses in advanced mathematics or physics. PISA is an assessment of 15-year-old students, regardless of grade level. (In the United States, PISA data collection occurs in the autumn, when most 15-year-olds are in 10th grade.) Second, the knowledge and skills measured in the two assessments differ. TIMSS Advanced is intended to measure how well students have learned the advanced mathematics and physics curricula in participating countries, whereas PISA is focused on the application of knowledge to “real-world” situations. Third, the participating countries in the two assessments differ markedly. TIMSS Advanced covers less than a dozen countries, while PISA includes about 70 education systems.

Both TIMSS and NAEP provide a measure of 4th- and 8th-grade mathematics and science learning. Both assessments measure students' mastery of mathematics and science knowledge, skills, and concepts that are closely linked to the curricula of the participating countries (in the case of TIMSS) and of the United States (in the case of NAEP). To learn more about how these two assessments compare in terms of their key features (e.g., purpose, partners, population, precision of estimates, and content), frameworks, and items, see the following paper: Comparing TIMSS with NAEP and PISA in Mathematics and Science (2007) (281 KB)

To learn more about the comparison of items and frameworks of TIMSS and NAEP, see the following paper: Comparison of TIMSS 2011 Items and the NAEP 2011 Framework (2011) (756 KB)

In addition, please see the tables below for a summary of the achievement score changes between the two assessments over the last 24 years.

[Table: Comparison of Mathematics Results. Rows: NAEP and TIMSS at grade 4 and grade 8. Columns: NAEP 1990–2019, TIMSS 1995–2019, NAEP 2007–2019, TIMSS 2007–2019, NAEP 2011–2019, TIMSS 2011–2019, NAEP 2015–2019, TIMSS 2015–2019, and NAEP 2017–2019. Each cell indicates whether achievement scores have decreased, have not measurably changed, or have increased over that period.]

[Table: Comparison of Science Results. Rows: NAEP and TIMSS at grade 4 and grade 8. Columns: TIMSS 1995–2019, TIMSS 2007–2019, NAEP 2009–2015 (grade 4) or NAEP 2011–2015 (grade 8), TIMSS 2011–2015, and TIMSS 2015–2019. Each cell indicates whether achievement scores have decreased, have not measurably changed, or have increased over that period.]

The scaling of TIMSS and TIMSS Advanced data is conducted separately for each grade and each content domain. Although each scale was created with a centerpoint of 500 and a standard deviation of 100, the scales are not directly comparable across grades: a given score at grade 4 does not represent the same amount of achievement or learning as the same score at grade 8. Comparisons can only be made in terms of relative performance.

TIMSS operates on a 4-year cycle, with 1995 being the first year it was administered. Countries in the Northern Hemisphere conduct the assessment between April and June of the assessment year, while countries in the Southern Hemisphere conduct the assessment in October and November of the assessment year. In both hemispheres, the assessment is conducted near the end of the school year.

TIMSS Advanced data were collected in 1995, 2008, and 2015. Since the United States did not participate in the 2008 administration of TIMSS Advanced, the administration in 2015 marked the first time that TIMSS Advanced data had been collected in the United States since 1995. There is no regular periodicity for the administration of TIMSS Advanced.

U.S. TIMSS results are available on the TIMSS Results web page and also on the TIMSS Publications & Products page. The latest TIMSS report shows results from TIMSS 2019 and includes grades 4 and 8. The latest TIMSS Advanced report presents results from 2015, when TIMSS Advanced was last administered.

TIMSS is scheduled to be administered next in 2023, with results to be reported at the end of 2024. No date has yet been set for the next administration of TIMSS Advanced.

Yes, states and large enough school districts can sign up to obtain their own TIMSS and TIMSS Advanced results at their own cost. Sample size restrictions apply. However, as is the case with the national sample, no school-level results are possible for these assessments because TIMSS is not designed to produce school-level estimates. Please contact Lydia Malley, the U.S. TIMSS National Research Coordinator, for more information.

The TIMSS mathematics and science assessment items are created based on the TIMSS Assessment Frameworks and are developed through an international consensus-building process involving input from experts in education, mathematics, science, and measurement. The development of the TIMSS items and scoring guides is the result of a widespread and intensive process of collaboration, piloting, and review among the participating countries.

TIMSS also administers background questionnaires to students, their teachers, and their school principals to better understand the contextual factors that affect students’ learning. In 2015, for the first time, the fourth-grade TIMSS assessment included a home questionnaire for students' parents and caregivers that collected information about students' home backgrounds and early learning experiences. The United States does not administer the home questionnaire. TIMSS also administers curriculum questionnaires to specialists to collect information about educational policies and the national contexts that shape the content and implementation of mathematics and science curricula across countries.

Although the majority of the assessment items, passages, and questionnaires are carried forward from the previous assessment cycle to measure trends, the task of updating the instruments for each new cycle—every 4 years for TIMSS since 1995—is a substantial undertaking. The Science and Mathematics Item Review Committee (SMIRC), composed of internationally recognized mathematics and science experts, reviews and recommends updates to the mathematics and science frameworks developed for each TIMSS administration. SMIRC also reviewed the TIMSS 2019 items at key points in their development process.

More information on the assessment design and general scoring method used for TIMSS can be found in the Assessment Framework section on the Methods and Procedures in TIMSS 2019 web page.

The central goal of TIMSS and TIMSS Advanced is to provide information for improving educational outcomes from both an individual education system and an international perspective. Each of the TIMSS assessments also collects considerable amounts of descriptive data about the contexts for teaching and learning in participating countries from student, school, and teacher background questionnaires. Countries provide information about their education systems and curricula in mathematics, science, and reading by completing a country-level curriculum questionnaire and writing a descriptive chapter based on a common outline created as part of the development process. This information is compiled in the TIMSS Encyclopedia, which describes the major aspects of teaching and learning in mathematics and science in each participating country.

A significant portion of the development and review effort by National Research Coordinators is dedicated to ensuring that the passages, test items, and questionnaires can be translated accurately and that the assessment items can reasonably measure the mathematics and science literacy skills of their education system's 4th-grade, 8th-grade, and advanced student sample populations. The TIMSS & PIRLS International Study Center then prepares an international version in English of all the assessment instruments for TIMSS. Subsequently, the test and questionnaire instruments are translated by participating countries into their languages of instruction, with the goal of creating high-quality translations that are appropriately adapted to the national context and are, at the same time, internationally comparable.

TIMSS 2019 was the seventh cycle for the TIMSS trend lines monitoring changes in educational achievement since 1995. Moreover, TIMSS 2019 marked the beginning of the transition from paperTIMSS (the paper-and-pencil test format used in previous assessment cycles) to eTIMSS (a digital version of TIMSS designed for computer- and tablet-based administration). eTIMSS offers an engaging, interactive, and visually attractive assessment to better assess complex areas of the mathematics and science frameworks and increase operational efficiency in translation, assessment delivery, data entry, and scoring.

Because not all TIMSS countries were prepared to conduct digital assessments, IEA decided to implement the transition over two assessment cycles—TIMSS 2019 and TIMSS 2023. About half of the countries participating in TIMSS 2019, including the United States, elected to administer eTIMSS, while the rest of the countries administered paperTIMSS.

The TIMSS 2019 assessment was carefully designed and analyzed so that the TIMSS 2019 mathematics and science achievement results for all participating education systems are reported on the same TIMSS trend scales (mathematics and science scales at grades 4 and 8). To ensure that the eTIMSS and paperTIMSS results could be reported on the same achievement scale, eTIMSS 2019 countries that had participated in TIMSS 2015 also re-administered the trend items in paper booklets to a separate nationally representative sample of students during data collection to provide a “bridge” between paperTIMSS and eTIMSS (see IEA’s Scaling the Achievement Data for TIMSS 2019 chapter for additional details).
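
For intuition only, the sketch below shows one simple way a bridge sample could be used to map scores from one mode onto the scale of the other (a mean-sigma linear linking with hypothetical data); the actual TIMSS 2019 linking relied on item response theory calibration of the trend items, as documented in the IEA chapter cited above.

```python
import statistics

def mean_sigma_link(bridge_paper_scores, bridge_digital_scores):
    """Estimate a linear adjustment that maps digital-mode scores onto the
    paper-mode scale, using two comparable samples that took the same trend
    items in different modes. Illustration only, not IEA's procedure."""
    a = statistics.stdev(bridge_paper_scores) / statistics.stdev(bridge_digital_scores)
    b = statistics.mean(bridge_paper_scores) - a * statistics.mean(bridge_digital_scores)
    return lambda digital_score: a * digital_score + b

# Hypothetical bridge-sample scores (not real TIMSS data)
paper = [512, 498, 530, 505, 489]
digital = [508, 492, 527, 499, 484]
adjust = mean_sigma_link(paper, digital)
print(round(adjust(500), 1))
```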

As a part of the transition to a digital assessment, eTIMSS 2019 included a series of extended Problem Solving and Inquiry (PSI) tasks in mathematics and science at both the 4th and 8th grades. The eTIMSS PSIs were designed to simulate real-world or laboratory situations in which students could integrate and apply process skills and content knowledge to solve mathematics problems or conduct virtual scientific experiments and investigations. The PSI tasks were not included in the results reported in the December 8, 2020, international and U.S.-specific releases and reports. The IEA plans to provide information from the administration of the PSI tasks later in 2021.

The digital mode of administration allowed eTIMSS to collect additional information about how students work through the items, such as screen-by-screen timing data and additional process variables that can be analyzed to study students’ interactions with the achievement items.



