Stuart Kerachsky

Acting Commissioner, National Center for Education Statistics

National Assessment of Educational Progress -- The Nation's Report Card: 12th Grade Reading and Mathematics 2009

November 18, 2010

Acting Commissioner Stuart Kerachsky's Briefing Slides (2.66 MB)

Today I am releasing the results of the 2009 grade 12 reading and mathematics assessments from the National Assessment of Educational Progress—the Nation’s Report Card. NAEP administered reading and mathematics assessments in 2009 to students across the country at grades 4, 8, and 12. Results for students in grades 4 and 8 were released earlier this year.

The grade 12 results released today contain our first-ever grade 12 results at the state level. Eleven states volunteered to participate in NAEP’s grade 12 pilot, which will examine the feasibility of using NAEP as a common yardstick for comparing the performance of grade 12 students across states.

The assessments were administered early in 2009. A nationally representative sample of over 100,000 twelfth-graders participated in the reading and mathematics assessments. The national-level results show the combined performance of public and private school students. Low participation rates for private schools, however, prevent reporting private school results separately. We do have separate state results for the 11 states participating in the grade 12 pilot, which assessed public school students only. The 11 states that agreed to participate were Idaho, South Dakota, Iowa, Illinois, Arkansas, Florida, West Virginia, New Jersey, Connecticut, Massachusetts, and New Hampshire.

We report student performance on NAEP in two ways: average scale scores and the percent of students at various NAEP achievement levels. The achievement levels were developed by the National Assessment Governing Board. They set standards for what students should know and be able to do. For each subject and for each grade, the Governing Board has established standards for *Basic*, *Proficient*, and *Advanced* performance levels. Ultimately, the goal is to have all students performing at or above the *Proficient* level. When comparing scores and other NAEP results we only discuss differences that are statistically significant.

**Grade 12 Participation and Motivation Rates**

NAEP has collected several indicators of grade 12 participation rates over time. For results to be reportable, the school participation rate must be at least 70 percent, a threshold NAEP has met. This rate was 70 percent in 1998 and rose to 82 percent in 2005 and 83 percent in 2009. The student participation rate was 67 percent in 2005, increasing to 81 percent in 2009. The overall participation rate, obtained by multiplying the school and student rates for a given assessment, was in the mid-50-percent range for several previous assessments before reaching 67 percent in 2009. In recent years, NCES has made a special effort to improve participation at grade 12. For example, NCES developed a best practices guide, distributed to each of the high schools participating in the assessment, to help encourage the high participation rates that were in fact achieved in 2009.
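The overall participation rate described above is simply the product of the school rate and the student rate. A minimal sketch using the 2009 figures quoted in the text (the percent formatting is just for display):

```python
# Overall NAEP participation rate = school rate x student rate.
# 2009 figures from the text: 83% of schools, 81% of students.
school_rate = 0.83
student_rate = 0.81

overall_rate = school_rate * student_rate
print(f"Overall participation: {overall_rate:.0%}")  # -> 67%, matching the reported 2009 rate
```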

People have wondered how engaged grade 12 students are when they take the NAEP assessment. Student engagement has been relatively high. Twelfth-graders taking NAEP have answered at least 91 percent of the reading questions in every assessment since 1992. In 2009, students answered 95 percent of the questions in the reading assessment and 94 percent in mathematics.

**Participating States in Grade 12 Pilot Study**

This is the first grade 12 assessment to provide results for individual states. There was wide variation in the demographic makeup of the 11 states participating in the grade 12 pilot state assessment, both among the states themselves and relative to the nation as a whole. For example, the percentages of Black grade 12 students in the 11 states ranged from 1 percent in New Hampshire and Idaho to 22 percent in Arkansas, compared to the national percentage of 16 percent. There was a similar range for Hispanic students, from 1 percent in West Virginia to 24 percent in Florida (the national percentage was 18 percent). For the percentages attending suburban schools, there was a particularly wide range, from zero in South Dakota to 78 percent in New Jersey (nationally the percentage was 36 percent). For the percentages of students reporting that at least one parent graduated from college, the range was from 38 percent in Arkansas and West Virginia to 59 percent in Massachusetts and New Hampshire (the national percentage was 47 percent).

**Grade 12 Reading Results**

All NAEP assessments are based on frameworks, which describe the specific knowledge and skills that should be assessed. In 2009, the grade 12 reading assessment was based on a new framework. After conducting a special study, we determined that 2009 results could be compared meaningfully with results from prior assessments based on a previous framework. Under the new framework in 2009, students read both literary and informational texts. They answered questions aligned to three reading processes: locate and recall, integrate and interpret, and critique and evaluate.

Average scores for grade 12 reading are reported on a 0–500 scale, and results for 2009 are compared to five previous assessments, going back to 1992. Today we will be discussing the significant differences between 2009 results and those for 1992, as well as those for the most recent prior grade 12 assessment, which took place in 2005.

The average scale score for 2009—288 points—was higher than in 2005 but lower than in 1992. In 2009, 74 percent of students were at or above *Basic*, compared to 80 percent in 1992. Thirty-eight percent of students were at or above *Proficient*, an increase since 2005 and not significantly different from 1992. In 2009, 5 percent of students were at *Advanced*, which was higher than the 4 percent at this level in 1992.

Seven states out of the 11 that volunteered to participate had higher average reading scores for twelfth-graders than the nation as a whole. They were Idaho, South Dakota, Iowa, Illinois, Connecticut, Massachusetts, and New Hampshire. In Arkansas, West Virginia, and Florida, the average reading score was lower than the national average. In New Jersey, the average reading score was not significantly different from the national average. Percentages of students at the various achievement levels also varied among the participating states.

Turning back to the nation as a whole, we have results for students overall and for each of the five major racial/ethnic groups. Overall, the 2009 average score was lower by 4 points than in 1992. Performance did not change significantly for four of the five racial/ethnic groups. For American Indian/Alaska Native students, our sample in 1992 was not large enough to provide reliable results. Next, comparing 2009 to 2005, we see a 2-point increase overall. In addition, scores for White students rose by 3 points, and scores for Asian/Pacific Islander students rose by 11 points.

Now let’s look at the score differences or gaps between various groups of students in 2009. The White-Black score gap was 27 points, the White-Hispanic gap was 22 points, and the White-American Indian/Alaska Native gap was 13 points. The 2-point score difference between Asian/Pacific Islander and White students was not statistically significant.

Compared to previous assessments, there was no change in the White-Black or White-Hispanic gaps from either 1992 or 2005, and no change in the White-American Indian/Alaska Native gap from 2005. The gap between White and Asian/Pacific Islander students, however, did change. In fact, it disappeared. In 1992 and 2005, White students had higher average reading scores than Asian/Pacific Islander students, but in 2009 the increase for Asian/Pacific Islander students was large enough to eliminate the gap.

Female students scored 12 points higher than male students in 2009. This gap was not significantly different from either 1992 or 2005.

We asked grade 12 students a number of questions about their reading instruction, including a question about the frequency with which their teachers asked them to write long answers to questions involving reading. Seventy-two percent reported having to write such answers at least once a month, while about 28 percent reported doing so less frequently. Students who reported writing long answers at least once a month had higher scores, on average, than students who were asked to do this less frequently. This does not necessarily mean that asking students to write long answers more frequently will cause students’ scores to rise. There are many possible reasons for the association of writing long answers frequently with high scores. However, identifying associations like this one provides a basis for further research into ways to improve student performance.

The percentage of students reporting that they wrote long answers to questions at least once a month varied across the 11 states, ranging from 64 percent in Iowa to 81 percent in Connecticut. The national percentage for public school students was 71 percent.

In addition to questions about their classroom instruction, we asked grade 12 students about their future educational plans. Sixty percent said they expected to finish their education by graduating from college, and another 26 percent said they expected to go on to graduate school. When we bring in the average scores, we see a clear association of higher educational goals with higher scores. Of course, there are many factors that contribute to higher student performance in addition to the goals that students set for themselves.

In the reading assessment, students were asked questions after reading authentic passages. One example from the grade 12 reading assessment, shown in the report card, is a rental agreement. Students were asked questions about specific sections of the agreement, including one which described the right of the landlord to enter the property subject to the rental agreement. One question asked students to critique and evaluate this section of the agreement, and explain why the language favored the landlord. Seven percent of student answers were scored as “full comprehension,” 59 percent showed “partial comprehension,” and 24 percent were rated as “little or no comprehension.” Nine percent of students omitted the question. In the 11 states, the percentages of students scored as showing “full or partial comprehension” ranged from 60 percent to 73 percent.

**Grade 12 Mathematics Results**

Now we’ll look at the results for grade 12 mathematics. As in reading, a new grade 12 mathematics framework was developed for the 2009 assessment. In mathematics, we can compare results in 2009 with those from 2005 only. Under the new assessment, grade 12 students were assessed in four mathematical content areas: number properties and operations; measurement and geometry; data analysis, statistics, and probability; and algebra. Average scores for mathematics are reported on a 0–300 scale.

The average scale score at grade 12 for 2009 was 153. This was 3 points higher than in 2005. When we look at achievement-level results for grade 12 mathematics, we see that the percentages at or above *Basic* and at or above *Proficient* both increased from 2005 to 2009. In both cases, the increase was about 3 percentage points.

Six of the 11 participating states had higher average mathematics scores than the nation as a whole. These states are South Dakota, Iowa, New Jersey, Connecticut, Massachusetts, and New Hampshire. In Arkansas, West Virginia, and Florida, the average mathematics score was lower than the national average. In Idaho and Illinois, the average score was not significantly different from the national average. All of the states had more than 50 percent of students who were at or above *Basic* and all had at least 13 percent of students at or above *Proficient*.

Scores increased overall and for all five major racial/ethnic groups from 2005 to 2009. The increases ranged from 4 score points for White and Black students to 13 points for Asian/Pacific Islander students.

In 2009, the White-Black score gap was 30 points, while the White-Hispanic gap was 23 points, and the gap between White and American Indian/Alaska Native students was 17 points. None of these gaps changed significantly from 2005. However, the gap between Asian/Pacific Islander and White students did change significantly, widening from 5 points in 2005 to 14 points in 2009 (161 for White students vs. 175 for Asian/Pacific Islander students). Scores for both groups increased since the last assessment, but the increase for Asian/Pacific Islander students was larger.

The gender gap did not change from 2005 to 2009. Male students scored 3 points higher than female students in both years. In contrast, the 2009 reading gap favored female students by 12 points.

We asked students to tell us the highest level of mathematics course they had completed. About 42 percent of students said they had completed a course in algebra II or trigonometry. In addition, about 42 percent said they had taken a course in either pre-calculus or calculus. On average, students who completed more advanced courses had higher scores than students who completed less advanced courses. However, even where the relationships are statistically significant, it does not necessarily follow that requiring students to take more advanced courses would ensure higher scores. You can obtain more information on statistically significant relationships between student performance and student course taking patterns through the NAEP Data Explorer (http://nces.ed.gov/nationsreportcard/naepdata/).

At the state level, the percentage of students who reported they had completed a course in pre-calculus or calculus ranged from 27 percent in West Virginia to 54 percent in Massachusetts. We have not evaluated the actual content of these courses, and it is possible that content varies among the states.

We also asked students what their plans were immediately after leaving high school. A majority—62 percent—said they planned to attend a four-year college. The average score for these students was 166, higher than that of students who indicated that they planned to pursue other options, such as full-time work, attending a two-year college, or serving in the military.

Among the questions assessing students’ knowledge of measurement and geometry was one requiring them to use a trigonometric relationship, the tangent. Students were shown a right triangle in which the length of the base and the measure of the angle between the base and the hypotenuse were given. They had to find the height of the triangle, which equals the tangent of the angle multiplied by the length of the base. Students were given a calculator for this question, allowing them to find the value of the tangent and carry out the multiplication. Thirty percent of students chose the correct answer, and 4 percent omitted the question.
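The computation the question required can be sketched as follows. The actual base length and angle from the assessment item are not given in the text, so the values below (a 10-unit base and a 40-degree angle) are hypothetical:

```python
import math

# Height of a right triangle from its base and the angle between
# the base and the hypotenuse: height = base * tan(angle).
# The base length and angle here are hypothetical illustrations,
# not the values from the actual NAEP question.
base = 10.0
angle_degrees = 40.0

height = base * math.tan(math.radians(angle_degrees))
print(round(height, 2))  # tan(40 deg) is about 0.839, so the height is about 8.39
```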

The report contains profiles of the performance of students in each of the 11 states participating in the pilot program. The state profiles include data on scale scores overall, by race/ethnicity, and for students at the 10th, 25th, 50th, 75th, and 90th percentiles, as well as achievement-level results, for both reading and mathematics. Information is given for selected background variables as well.

There is much more information on student performance, both nationally and at the state level, in *Grade 12 Reading and Mathematics: 2009 National and Pilot State Results*. In addition, the NAEP website, http://nationsreportcard.gov, has extensive information on the performance of students in each of the 11 participating states, access to released questions through NAEP’s Questions Center, and access to the NAEP Data Explorer, our online data-analysis tool.

In closing, I would like to thank all the students and schools who participated in these assessments.