
NAEP 2004 Trends in Academic Progress:
Three Decades of Student Performance in Reading and Mathematics

July 2005

Authors: Marianne Perie and Rebecca Moran



Executive Summary

National Results 
   Average Scores 
   Percentiles 
   Performance Levels 
Student Group Results 
   Gender 
   Race/Ethnicity 
   Parents' Education 
Contextual Variables 
2004 Bridge Study 


The citizens and leaders of the United States have long valued education as a foundation for democracy, a resource for economic prosperity, and a means of realizing personal goals and individual potential. Throughout the nation’s history, the commitment to educate children has grown stronger and more inclusive, and in recent decades, so has the expectation that our nation’s schools and teachers be accountable (Ravitch 2002). In 2002, the reauthorization of the Elementary and Secondary Education Act—also known as the No Child Left Behind (NCLB) Act—further strengthened that commitment and expectation.

Since its inception in 1969, the National Assessment of Educational Progress (NAEP) has served the important function of measuring our nation’s educational progress by regularly administering various subject-area assessments to nationally representative samples of students. One of the primary objectives of NAEP is to track trends in student performance over time. This report presents the results of NAEP long-term trend assessments in reading and mathematics, which were most recently administered in 2004 to students ages 9, 13, and 17. Because the assessments have been administered at different times in the 35-year history of NAEP, they make it possible to chart educational progress since 1971 in reading and 1973 in mathematics. Prior to 2004, the most recent long-term trend assessment was given in 1999, when results were reported for reading, mathematics, and science.

It should be noted that these long-term trend assessments are different from more recently developed assessments in the same subjects that make up the “main NAEP” assessment program. Because the instruments and methodologies of the two assessment programs are different, comparisons between the long-term trend results presented in this report and the main assessment results presented in other NAEP reports are not possible.

Approximately 38,000 students participated in the reading assessment, and 37,000 participated in the mathematics assessment. Appendix A provides technical information on this study, including sample sizes and a description of the significance tests done on each set of results. Only differences that have been determined to be statistically significant at the 0.05 level after controlling for multiple comparisons are included in this report.
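
To make the significance criterion concrete, the sketch below tests whether two years' average scores differ, treating the two estimates as independent and applying a simple Bonferroni-style adjustment across a family of comparisons. The numbers and the mean_difference_test helper are illustrative assumptions, not values or procedures from the report; NAEP's actual standard errors and adjustments are documented in appendix A.

```python
from math import sqrt
from scipy.stats import norm

def mean_difference_test(mean_a, se_a, mean_b, se_b):
    """Two-sided test of whether two independent average scores differ."""
    diff = mean_a - mean_b
    se_diff = sqrt(se_a**2 + se_b**2)   # standard errors combine in quadrature
    z = diff / se_diff
    p_value = 2 * norm.sf(abs(z))       # two-sided p-value
    return diff, p_value

# Illustrative numbers only (not estimates from the report):
diff, p = mean_difference_test(219.0, 1.1, 212.0, 1.3)

# With k comparisons in a family, a Bonferroni-style rule requires
# p < 0.05 / k before a difference is reported as significant.
k = 10
print(f"difference = {diff:.1f} points, p = {p:.2g}, "
      f"significant after adjustment: {p < 0.05 / k}")
```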

National Results

National results, provided in chapter 2, are described in three ways: average score, score at selected percentiles, and percentage of students performing at or above each performance level. Student performance in each subject area is summarized as an average score on a 0–500 scale. The five long-term trend performance levels presented in this report were set at 50-point intervals on the two subject-area scales to provide a verbal description of student performance at different points on the scale. All national findings are reported from 1971 to 2004 for reading and from 1973 to 2004 for mathematics.
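
As a rough illustration of how these three summaries relate, the sketch below computes an average score, scores at selected percentiles, and the percentage of students at or above each 50-point performance level from a set of scale scores. The simulated scores are purely illustrative, and the sketch ignores the sampling weights and other machinery a real NAEP analysis would require.

```python
import numpy as np

# Simulated scale scores on the 0-500 scale (illustrative only; a real
# analysis would rest on NAEP's sampling design and weights).
rng = np.random.default_rng(0)
scores = np.clip(rng.normal(250, 40, size=10_000), 0, 500)

# 1. Average score
print(f"average score: {scores.mean():.1f}")

# 2. Scores at the selected percentiles
for pct in (10, 25, 50, 75, 90):
    print(f"{pct}th percentile: {np.percentile(scores, pct):.1f}")

# 3. Percentage of students at or above each 50-point performance level
for level in (150, 200, 250, 300, 350):
    share = (scores >= level).mean() * 100
    print(f"at or above level {level}: {share:.1f}%")
```

The primary findings include the following: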

Average Scores

  • Between 1999 and 2004, average reading scores increased at age 9 and average mathematics scores increased at ages 9 and 13. No measurable changes in average scores were found at age 17 in either subject between 1999 and 2004.
  • In reading, 9-year-olds scored higher in 2004 than in any previous assessment year, with an increase of 7 points between 1999 and 2004. Average scores for 13-year-olds showed no measurable difference between 1999 and 2004 but were still higher in 2004 than in 1971 and 1975. For age 17, the average score in 2004 was not measurably different from the average score in the first assessment year, 1971.
  • The average score in mathematics at age 9 was higher in 2004 than in any previous assessment year and was 9 points higher than in 1999. The average score for 13-year-olds increased by 5 points between 1999 and 2004. The average score at age 17 in 2004 was not measurably different from the scores in 1973 or 1999.

Percentiles

  • The reading score of 9-year-olds at the median (50th percentile) was higher in 2004 than the median score in every other year.
  • Overall gains in reading scores for 13-year-olds were evident among higher performing students—those scoring at the 75th and 90th percentiles—between 1971 and 2004.
  • Seventeen-year-olds showed no measurable improvements in reading scores at any of the selected percentiles between 1999 and 2004 or between 1971 and 2004.
  • Mathematics scores for 9-year-olds at each of the selected percentiles showed gains between 1978 and 2004, increasing 26 points at the 10th percentile, 23 points at the 50th percentile, and 18 points at the 90th percentile.
  • At each of the five selected percentiles except the 10th, mathematics scores for 13-year-olds were higher in 2004 than in every previous assessment year.
  • Mathematics scores for 17-year-olds in 2004 showed no measurable change since 1992 at any of the five percentiles.


Performance Levels

  • The partially developed skills and understanding associated with reading at level 200 were demonstrated by 70 percent of 9-year-olds in 2004, more than in any other assessment year except 1980; by 94 percent of 13-year-olds; and by almost all 17-year-olds.
  • The percentages of 13-year-olds and 17-year-olds who demonstrated the ability to interrelate ideas and make generalizations in reading (level 250) were 61 percent and 80 percent, respectively, in 2004, not measurably different from those in 1971 and 1999.
  • Reading performance at or above level 300—understanding complicated information—was demonstrated by 38 percent of 17-year-olds in 2004, down from 41 percent a decade earlier in 1994.
  • The beginning skills and understandings characteristic of level 200 in mathematics were demonstrated by 89 percent of 9-year-olds in 2004, more than in any other assessment year. Approximately 99 percent of 13-year-olds also demonstrated at least this level of performance in 2004.
  • At age 13, the percentages of students at level 300 in mathematics increased from 17 percent in 1990 to 23 percent in 1999 and then to 29 percent in 2004. Students at this level could perform moderately complex procedures and use logical reasoning to solve problems. In 2004, 59 percent of 17-year-olds were at or above level 300 in mathematics, an increase of 7 percentage points from 1978.
  • Across the assessment years in mathematics, between 5 and 8 percent of 17-year-olds performed at level 350, the highest performance level, in which students applied a range of reasoning skills to solve multistep problems.


Student Group Results

Chapter 3 describes the average scores for various groups of students, including male and female students; White, Black, and Hispanic students; and students grouped by the highest level of parents' education they reported: less than high school, graduated from high school, some education after high school, and graduated from college. Some of the results were as follows:

Gender

  • At all three ages in 2004, female students had higher average reading scores than their male counterparts.
  • In 2004, there was no measurable difference between the average mathematics scores of male and female students at age 9, but at ages 13 and 17, male students scored higher on average than female students.
  • The gender gap in reading scores for 9-year-olds in 2004 was smaller than the gaps in the first three assessment years and in 1996. For 13-year-olds, the gap in 2004 did not differ measurably from the gap in any previous assessment year. For 17-year-olds, the gap in 2004 was not measurably different from the gaps in 1999 and 1971.


Race/Ethnicity

  • White students had higher average reading scores in 2004 than in 1971 at ages 9 and 13.
  • For Black students at all three ages, average reading scores in 2004 were higher than in 1971.
  • Although White students continue to outscore Black students, the White-Black score gap in reading narrowed from 1971 to 2004 at all three ages. The White-Black reading score gap for 9-year-olds decreased from 35 points in 1999 to 26 points in 2004.
  • For Hispanic students, the average reading score at age 9 was higher in 2004 than in any other assessment year. Their average score at age 13 was higher in 2004 than in 1975, but not measurably different from that in 1999. No measurable difference was found between the average score for Hispanic students at age 17 in 2004 and that in 1999.
  • Although White students continue to outscore Hispanic students, the White-Hispanic reading score gap for students at age 9 in 2004 was smaller than it was in 1994, 1984, 1980, and 1975. The White-Hispanic reading score gap for 13-year-olds showed no measurable difference between 2004 and 1999 or 1975. The score gap between White and Hispanic students at age 17 was measurably smaller in 2004 than in 1975.
  • White students at all three ages scored higher, on average, in 2004 than in 1973 in mathematics.
  • The average mathematics scores for Black students were higher in 2004 than in 1973 at all three ages. Average scores for Black students at ages 9 and 13 were higher in 2004 than in any previous assessment year.
  • The differences in average scores for White and Black students at all ages decreased between the first (1973) and the most recent (2004) assessment in mathematics, although White students continued to outscore Black students in 2004. During this same period, the White-Black score gaps in mathematics narrowed by 12, 19, and 12 points for ages 9, 13, and 17, respectively.
  • Hispanic students’ performance in mathematics was higher at all three ages in 2004 than in any assessment year from 1973 through 1982. Average scores for Hispanic students at ages 9 and 13 were higher in 2004 than in any previous assessment year.
  • White students scored higher on average than Hispanic students at all three age levels in 2004. For ages 13 and 17, the White-Hispanic score gap was smaller in 2004 than in 1973, but for age 9 there was no measurable difference in the size of the score gap between the first (1973) and most recent (2004) assessment year.


Parents' Education

  • The percentage of students reporting that at least one parent graduated from college increased between 1980 and 2004 for reading and between 1978 and 2004 for mathematics, while the percentage of students reporting that the highest level of education for their parents was a high school diploma or less decreased over the same periods.
  • At age 13, there were no measurable changes in average reading scores between 2004 and any previous assessment year, regardless of the level of parents' education reported by the student.
  • The average reading score for 17-year-olds who indicated that at least one parent had some education after high school was lower in 2004 than in any previous assessment year. For 17-year-olds who indicated that at least one parent graduated from college, the average score in 2004 (298) was lower than the average scores in 1990 (302) and 1984 (302).
  • Students who reported that their parents had less than a high school education showed no measurable change in average mathematics score between 1999 and 2004 at either age 13 or 17, but their 2004 scores were higher than those in 1978.
  • For students whose parents’ highest education level was high school graduation or some education after high school, the average mathematics score at age 13 was higher in 2004 than in any other assessment year, while at age 17 there were no measurable changes between 1978 and 2004.
  • For students with at least one parent who graduated from college, the average mathematics score in 2004 was higher than in any other assessment year at age 13; no measurable difference was seen at age 17 between 1978 and 2004.


Contextual Variables

As described in chapter 4, examining student scores in the context of their learning and home environments provides useful information. Learning and home factors for which trends are reported include students’ reports of how often they read for fun, completed homework, used computers, and watched television, and the advanced mathematics courses they had taken. Some of the findings include the following:

Homework. Students who took the reading assessment were asked how many hours they had spent on homework the previous day.

  • The percentage of students at age 9 indicating that no homework was assigned or that they did not do any homework decreased between 1984 and 2004. In 2004, a greater percentage of 9-year-olds indicated that they spent less than 1 hour on homework than in any other year in which the question was asked.
  • In 2004, the average reading score of 9-year-olds who spent less than 1 hour on homework was higher than the average reading scores of students who did not do the homework that was assigned or who spent more than 2 hours on homework.
  • At age 13, the percentage of students spending less than 1 hour on homework increased from 36 percent in 1984 to 40 percent in 2004. At the same time, the percentage of students spending 1 to 2 hours on homework decreased from 29 percent in 1984 to 26 percent in 2004.
  • At age 13, students who spent 1 to 2 hours or 2 or more hours on homework had higher average reading scores than their peers who spent less than 1 hour on homework, did not do their homework, or did not have any homework to do.
  • At age 17, the percentage of students reporting that they were not assigned homework increased from 22 percent in 1984 to 26 percent in 2004. Over the same period, the percentage of 17-year-olds indicating they had spent 1 to 2 hours on homework the previous day decreased from 27 to 22 percent.
  • At age 17, students who spent 2 or more hours on homework had higher average reading scores in 2004 than those who spent 1 to 2 hours, whose scores were higher than those who spent less than 1 hour, whose scores in turn were higher than those who did not do any homework.

Reading for Fun. Students who took the reading assessment were asked to estimate how often they read for fun.

  • There were no measurable changes between 1984 and 2004 in the percentage of 9-year-olds indicating that they read for fun almost every day. At ages 13 and 17, the percentage saying they read for fun almost every day was lower in 2004 than in 1984. This trend was accompanied by an increase over the same 20-year time period in the percentage indicating that they never or hardly ever read for fun.
  • At all three ages, students who indicated that they read for fun almost every day had higher average reading scores in 2004 than those who said that they never or hardly ever read for fun. Students at all three age levels who said that they read for fun once or twice a week had higher average scores than those who never or hardly ever read for fun.

Computer Access and Usage. Students at ages 13 and 17 who took the mathematics assessment were asked three questions about their access to computers and how they used them.

  • The percentage of 13-year-olds with access to computers in schools increased from 12 percent in 1978 to 57 percent in 2004. The percentage of students receiving instruction in computers at age 13 also increased, from 14 percent in 1978 to 48 percent in 2004. In the 2004 assessment, 69 percent of 13-year-olds said that they had used a computer to solve a mathematical problem.
  • Similar increases were also seen among 17-year-olds, where the percentage of students with access to a computer in school increased by 33 percentage points between 1978 and 2004. The percentage of 17-year-olds using a computer to solve mathematics problems increased from 46 percent in 1978 to 66 percent in 1999, then to 70 percent in 2004. In that year, 36 percent reported that they had studied mathematics using computers.
  • There were no measurable differences in mathematics scores between 13-year-olds who responded positively and those who responded negatively to any of the computer access and usage questions in 2004. At age 17, students who indicated that they had access to a computer at school scored 5 points higher in 2004 than students who did not have such access.
  • In 2004, students at age 17 who reported that they had used a computer to solve a mathematical problem scored 6 points higher on average than students who had not used a computer for that purpose. There was no measurable difference in average mathematics scores for 17-year-olds based on whether or not they had studied mathematics using computers.

Course-Taking Patterns in Mathematics. Students at age 17 who took the mathematics assessment were asked to check all the mathematics courses they had taken or were currently taking. The highest course checked was used for the analyses (see the sketch following this list).

  • A greater percentage of 17-year-olds indicated they were taking or had taken calculus in 2004 than in any previous assessment year. The percentage taking second-year algebra increased from 37 percent in 1978 to 53 percent in 2004, while the percentage of students who indicated that the highest level of mathematics they had taken by age 17 was pre-algebra or algebra was lower in 2004 than in 1978.
  • The trend toward higher-level course-taking was seen across all three racial/ethnic groups shown. The percentage of White, Black, and Hispanic students who indicated that their highest course was second-year algebra was higher in 2004 than in 1978. In 2004, a higher percentage of White students (19 percent) than Black students (8 percent) took calculus. At 14 percent, the percentage of Hispanic students taking calculus was not measurably different from the percentage of either White or Black students.
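
As noted above, each student is classified by the highest mathematics course checked. The sketch below shows one way to implement that coding; the course list and its ordering are assumptions for illustration, not the instrument's actual wording.

```python
# Hypothetical ordering of courses from least to most advanced; the
# actual questionnaire's course list may differ.
COURSE_ORDER = [
    "general mathematics",
    "pre-algebra",
    "algebra",
    "geometry",
    "second-year algebra",
    "pre-calculus",
    "calculus",
]
RANK = {course: i for i, course in enumerate(COURSE_ORDER)}

def highest_course(checked: list[str]) -> str:
    """Return the most advanced course among those a student checked."""
    return max(checked, key=lambda c: RANK[c])

# A student who checked three courses is classified by the highest one.
print(highest_course(["algebra", "geometry", "second-year algebra"]))
# -> second-year algebra
```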


2004 Bridge Study

Several changes were made to the long-term trend assessment in 2004 to align it with current assessment practices and policies applicable to the NAEP main assessments. These changes, discussed in detail in chapter 5, included replacing items that had outdated material, eliminating blocks of items for subjects no longer reported, replacing background questions, and changing some administration procedures. In addition, the 2004 modified assessment allowed the inclusion of students with disabilities and English language learners and provided accommodations for them.

A bridge study was conducted to ensure that the interpretation of the assessment results remains constant over time. A bridge study involves administering two assessments: one that replicates the assessment given in the previous assessment year (a bridge assessment), and one that represents the new design (a modified assessment). In 2003–2004, students were randomly assigned to take either the bridge assessment or the modified assessment. The bridge assessment replicated the instrument given in 1999 and used the same administration techniques. The modified assessment included the new items and features discussed above. This modified assessment will provide the basis of comparison for all future assessments, and the bridge study will link its results back to the results of the past 33 years. The results from the bridge study are presented in chapters 2 and 4, and comparisons between the two assessments are provided in chapter 5.

  • Comparisons between the modified and bridge assessments demonstrate that the link between the two 2004 assessments was successful.
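
To illustrate the linking idea behind the bridge study, the sketch below applies a simple mean/sigma transformation that places modified-assessment scores on the bridge assessment's scale so that trend comparisons remain meaningful. This is a conceptual sketch with simulated data, not the study's actual procedure, which rests on NAEP's scaling methods documented in appendix A.

```python
import numpy as np

def mean_sigma_link(modified, bridge_mean, bridge_sd):
    """Linearly rescale modified-assessment scores so their mean and
    standard deviation match those of the bridge assessment."""
    return bridge_mean + bridge_sd * (modified - modified.mean()) / modified.std()

# Simulated data: randomly equivalent groups take each form, as in 2004.
rng = np.random.default_rng(1)
bridge = rng.normal(219, 36, size=5_000)    # replicates the 1999 design
modified = rng.normal(221, 34, size=5_000)  # new items and procedures

linked = mean_sigma_link(modified, bridge.mean(), bridge.std())
print(f"bridge mean: {bridge.mean():.1f}, "
      f"linked modified mean: {linked.mean():.1f}")
```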






Suggested Citation
Perie, M., and Moran, R. (2005). NAEP 2004 Trends in Academic Progress: Three Decades of Student Performance in Reading and Mathematics (NCES 2005-464). U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics. Washington, DC: Government Printing Office.

