TIMSS and TIMSS Advanced achievement results are reported on a scale from 0 to 1,000, with a TIMSS scale average of 500 and standard deviation of 100. TIMSS provides an overall mathematics scale score and an overall science scale score, as well as subscale scores for the content and cognitive domains in each subject at each grade level. TIMSS Advanced provides an overall advanced mathematics scale score and an overall physics scale score, as well as subscale scores for the content and cognitive domains in each subject.
The scaling of data is conducted separately for each subject and grade, and separately for each of the content and cognitive domains. Because the difficulty of the items necessarily differs across subjects, grades, and domains, direct comparisons of scores across subjects, grades, and domain types should not be made. Scores within a given subject, grade, and domain, however, are comparable over time.
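As a minimal sketch of what reporting on such a scale implies, the transformation below maps a standardized proficiency estimate onto a scale with average 500 and standard deviation 100. The function name and the assumption of an already-standardized estimate are illustrative only; the operational TIMSS scaling uses item response theory and plausible values, as described in the Methodology and Technical Notes.

```python
def to_reporting_scale(theta: float) -> float:
    """Illustrative sketch: map a proficiency estimate standardized to
    mean 0 and standard deviation 1 onto a reporting scale with
    average 500 and standard deviation 100."""
    return 500 + 100 * theta
```

For example, an estimate of 0 corresponds to the scale average of 500, and an estimate one standard deviation above the mean corresponds to 600.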
In addition to scale scores, TIMSS and TIMSS Advanced have developed international benchmarks for each subject and grade. The international benchmarks provide a way to interpret the scale scores and to understand how students' proficiency in mathematics and science varies along the TIMSS scale. The TIMSS benchmarks describe four levels of student achievement (Advanced, High, Intermediate, and Low) for each subject and grade, based on the kinds of skills and knowledge that students at each score cutpoint would need to successfully answer the mathematics and science items. The TIMSS Advanced benchmarks similarly describe three levels of student achievement (Advanced, High, and Intermediate).
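The benchmark scheme amounts to classifying each scale score by the highest cutpoint it reaches. The sketch below assumes the cutpoints published for recent TIMSS cycles (625, 550, 475, and 400 for Advanced, High, Intermediate, and Low); the function name and the "Below Low" label for scores under the lowest cutpoint are illustrative.

```python
# Assumed TIMSS international benchmark cutpoints on the 0-1000 scale.
CUTPOINTS = [
    (625, "Advanced"),
    (550, "High"),
    (475, "Intermediate"),
    (400, "Low"),
]

def benchmark(score: float) -> str:
    """Return the highest international benchmark a scale score reaches."""
    for cutpoint, label in CUTPOINTS:
        if score >= cutpoint:
            return label
    return "Below Low"  # score does not reach the lowest cutpoint
```

Under these assumed cutpoints, a score of 630 is classified as Advanced and a score of 500 as Intermediate. The TIMSS Advanced benchmarks work the same way but with only three levels.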
The TIMSS 2015 and TIMSS Advanced 2015 Results present the performance of U.S. students relative to their peers in other countries and education systems and describe changes in mathematics and science achievement since 1995. Most of the TIMSS 2015 findings are based on the results presented in two international reports published by the IEA (forthcoming):
Most of the TIMSS Advanced 2015 findings are based on the results presented in the international report published by the IEA (forthcoming):
Besides findings based on the international reports, the TIMSS 2015 and TIMSS Advanced 2015 Results provide details about the achievement of subgroups of U.S. students that are not available in the international reports (e.g., scores of students of different racial/ethnic and socioeconomic backgrounds).
It is important to note that comparisons presented here treat all participating education systems equally, as is done in the international reports. Thus, the United States is compared with some education systems that participated in TIMSS without a complete national sample (e.g., Northern Ireland-GBR participated but there was no national United Kingdom sample) as well as with some education systems that participated as part of a complete national sample (e.g., Florida-USA participated as a separate state sample of public schools and as part of the U.S. national sample of all schools).
In addition to describing performance in 2015, the results also document changes in mathematics and science achievement over time. TIMSS has been administered six times (every 4 years) since the first assessment in 1995. In each administration, the framework is reviewed and updated to reflect developments in the field and in curricula, while comparability in sampling procedures and assessment items is maintained. Additionally, each successive administration of TIMSS has been linked to the scale established in 1995, when the scale average was set at 500, so that results are comparable across years. This report focuses on comparing the 2015 results with those from the prior TIMSS assessment in 2011 and, for a long-term perspective, with those from the first TIMSS assessment in 1995.
Changes in advanced mathematics and physics achievement between 1995 and 2015 are reported to the extent possible. Six countries, including the United States, participated in TIMSS Advanced in both years. However, because of changes in the framework and sampling procedures between the 1995 and 2008/2015 administrations, results should be interpreted with caution, as described in the Methodology and Technical Notes. Performance changes between 2008 and 2015 (which are not subject to such concerns about comparability) are not presented because the United States did not participate in TIMSS Advanced in 2008.
All results are presented in tables, figures, and brief text summaries of key findings. In the interest of brevity, in most cases only the names of education systems (including benchmarking participants) scoring higher than or not measurably different from the United States are reported, not those scoring lower. Results also include data on the one U.S. state, Florida, that participated as a benchmarking education system.
All statistically significant differences described in this report are at the .05 level. Differences that are statistically significant are discussed using comparative terms such as "higher" and "lower." Differences that are not statistically significant are either not discussed or referred to as "not measurably different." In almost all instances, the tests for significance used were standard t tests. No adjustments were made for multiple comparisons.
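As a hedged illustration of the comparison procedure described above (not the operational NCES computation, which derives standard errors from complex sampling and plausible values), a two-sided test of the difference between two independent means can be sketched as follows; the function name and the example means and standard errors are hypothetical.

```python
from math import erf, sqrt

def mean_difference_test(m1: float, se1: float, m2: float, se2: float,
                         alpha: float = 0.05) -> tuple[float, float, bool]:
    """Two-sided test of the difference between two independent means,
    given their standard errors, using a normal approximation to the
    t distribution (reasonable for the large samples typical of TIMSS)."""
    t = (m1 - m2) / sqrt(se1 ** 2 + se2 ** 2)
    # Two-sided p-value: 2 * (1 - Phi(|t|)), with Phi the standard
    # normal CDF expressed via the error function.
    p = 2 * (1 - 0.5 * (1 + erf(abs(t) / sqrt(2))))
    return t, p, p < alpha
```

For hypothetical averages of 539 (s.e. 3.1) and 522 (s.e. 2.9), the difference is statistically significant at the .05 level and would be described with a comparative term such as "higher"; a 3-point difference with similar standard errors would not be, and would be reported as "not measurably different." Note that this sketch makes no adjustment for multiple comparisons, consistent with the procedure described above, and that dependent comparisons (e.g., trend estimates sharing linking error) would require adjusted standard errors.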
For additional information on scaling, reporting, and statistical procedures, see the Methodology and Technical Notes.