
Mark Schneider
Commissioner, National Center for Education Statistics

National Assessment of Educational Progress
The Nation's Report Card: Writing 2007

April 3, 2008

Commissioner Mark Schneider's PowerPoint Presentation (MS PowerPoint, 1.4 MB)

Introduction

Today I am releasing the results of the 2007 writing assessment from the National Assessment of Educational Progress. This assessment was given January through March 2007 to 8th- and 12th-grade students across the country.

Good writing means you can tell a story, provide information, and persuade people with your words. While we still have a ways to go, America’s students are getting better at writing, with higher scores for both 8th-graders and high school seniors.

Overview of the 2007 Writing Assessment

About 140,000 8th-grade students and about 28,000 12th-graders participated in the assessment. The 8th-grade sample is much larger because it provides results for the nation, for most states, and for 10 urban school districts. At the 12th grade, we have national results only. There was no 4th-grade assessment.

We assessed student writing for three different purposes—narrative, informative, and persuasive. In the scoring of student answers, we recognized that these were essentially first drafts, not polished pieces of writing. Student responses could receive any of six ratings—from "Excellent" to "Unsatisfactory." But even an "Excellent" response could contain errors in grammar, spelling, and punctuation, as long as they were few in number and did not interfere with a reader’s ability to understand the response. More errors were acceptable for ratings of "Skillful" and "Sufficient," again as long as they did not interfere with understanding. Examples of NAEP writing questions and student responses, along with scoring guides and performance data, are available on the NAEP web site (http://nces.ed.gov/nationsreportcard/itmrls).

I will present national results for both grades in 2007, along with results for previous assessments in 2002 and 1998. At grade 8, there are results for 45 states and the Department of Defense schools, as well as for 10 large urban school districts. States and districts participate in the writing assessment on a voluntary basis.

NAEP reports student performance in two ways: scale scores and achievement levels. NAEP scale scores indicate what students know and what they can do. The writing assessment uses separate 0–300 scales for the eighth and twelfth grades.

Achievement levels were developed by the National Assessment Governing Board. They set standards for what students should know and be able to do. For each subject and for each grade, the Governing Board has established standards for Basic, Proficient, and Advanced performance. Today I will be reporting on the percentages of students who performed at or above Basic, those who performed at or above Proficient, and those who performed at the Advanced level. Ultimately, the goal is to have all students performing at or above the Proficient level.

2007 Writing Results

First, I will present results for grade 8 at the national, state, and urban district levels. Then I will present the 12th-grade national results.

Grade 8

National Results

At grade 8, the average score in 2007 was higher than in either 1998 or 2002. In the first writing assessment, in 1998, the average score was set at 150, and since then performance has increased by 6 points.

When comparing NAEP scale scores or achievement-level percentages, we must remember that NAEP results are based on samples, and there is a margin of error associated with each score. When comparing scores and other NAEP results, we discuss only differences that are larger than the margin of error—those that are statistically significant. In the data figures and tables, we place an asterisk on a score from a previous assessment when the difference from 2007 is statistically significant. Both the 6-point difference since 1998 and the 3-point difference since 2002 are significant.
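
To make the idea concrete, here is a minimal Python sketch of how a difference between two average scale scores can be checked against its margin of error. The standard-error values in the example are hypothetical placeholders, not published NAEP estimates, and NAEP's actual procedures (which rely on jackknife variance estimation and related adjustments) are more involved than this simple two-sided z-test.

    from math import sqrt, erf

    def is_significant(mean_a, se_a, mean_b, se_b, alpha=0.05):
        """Two-sided z-test: is the gap between two average scores
        larger than its margin of error at the given alpha level?"""
        diff = mean_a - mean_b
        se_diff = sqrt(se_a ** 2 + se_b ** 2)   # standard error of the difference
        z = diff / se_diff
        p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail probability
        return p_value < alpha

    # Grade 8 averages: 156 in 2007 vs. 150 in 1998; the standard errors
    # here (0.6 and 0.7) are made-up placeholders for illustration.
    print(is_significant(156, 0.6, 150, 0.7))   # True under these assumed values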

Turning to achievement-level results, 88 percent of students performed at or above Basic in 2007. This was higher than both the 85 percent who achieved at this level in the previous assessment in 2002 and the 84 percent who did so in 1998. The percentage at or above Proficient in 2007, 33 percent, was higher than the 27 percent who achieved this level in 1998. The percentage at Advanced, 2 percent, was also higher than in 1998. But neither percentage was significantly different from 2002.

National Results by Race/Ethnicity

Writing scores were higher in 2007 for White, Black, and Hispanic students than in either previous assessment. Both Black and Hispanic students increased their scores by 10 points compared to 1998.

Asian/Pacific Islander students’ performance increased compared with 2002, but the difference from 1998 was not statistically significant.

For American Indian/Alaska Native students, there was no significant change compared to either year.

I noted that both White and Black students had higher average scores in 2007 than in prior assessments. Black students made greater gains. As a result, the 23-point White-Black performance gap in 2007 was smaller than the gap in either prior assessment (25 points in 2002 and 26 points in 1998).

The gap between White and Hispanic students did not change. Scores for Hispanic students were higher in 2007 than in the two prior assessments, but scores for White students increased as well. The gap between these two groups was 22 points in 2007.

National Results by Gender

Average scores increased in 2007 for both male and female students, and female students continued to have higher scores than males. The average score for female 8th-graders was 166, compared with 146 for males. Both groups’ average scores were 6 points higher than in 1998. The 20-point gap in 2007 between female and male performance in writing was about the same as in both previous assessments.

National Results by Family Income

NAEP reports student results according to their eligibility for the National School Lunch Program. We report on three groups, ranked by family income level. Students from families near or below the poverty line are eligible for free lunches. If the family’s income is a little higher, the student is eligible for reduced-price lunches. Students from families further above the poverty line are not eligible for the program. Because of changes in the availability of data, we cannot make comparisons to previous writing assessments.
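
As a rough illustration of how these three reporting groups are defined, the Python sketch below maps a family’s income, relative to the federal poverty guideline, onto the categories. The 130 percent and 185 percent cutoffs are the standard federal eligibility thresholds for free and reduced-price meals; the function name and dollar figures are hypothetical and are not part of NAEP’s reporting.

    def lunch_program_category(family_income, poverty_guideline):
        """Classify a student by National School Lunch Program eligibility."""
        ratio = family_income / poverty_guideline
        if ratio <= 1.30:       # at or below 130 percent of the poverty guideline
            return "eligible for free lunch"
        if ratio <= 1.85:       # between 130 and 185 percent
            return "eligible for reduced-price lunch"
        return "not eligible for the program"

    # Hypothetical example: a family earning $30,000 against a $20,000 guideline.
    print(lunch_program_category(30000, 20000))   # eligible for reduced-price lunch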

The average score for students eligible for free lunch was 139; those eligible for reduced-price lunch averaged 150; and students not eligible for the program averaged 164.

State Results

Trends in Average Scores of States

State participation in the NAEP writing assessment was voluntary. In 2007, 45 states participated, plus the Department of Defense Schools, for a total of 46 jurisdictions. For most of these jurisdictions, we can make comparisons with previous assessments.

At the state level, NAEP assesses the performance of public school students only.

Compared with 2002, scores increased in 20 out of the 39 jurisdictions for which a comparison can be made. Average scores decreased in one state, North Carolina. Comparing 2007 with 1998, there were increases in 29 out of 34 jurisdictions that participated in both years, and no declines.

Sixteen jurisdictions had higher scores in 2007 than in either previous assessment.

Trial Urban District Assessment

Since 2002, NAEP has conducted assessments in an increasing number of large urban school districts. The Trial Urban District Assessment, or TUDA, is a collaboration involving the National Center for Education Statistics, the National Assessment Governing Board, and the Council of the Great City Schools.

TUDA was designed to assess the performance of public school students at the district level. Because the assessments are the same for the nation, the states, and the urban districts, NAEP serves as a common yardstick for comparison.

Participation in TUDA is voluntary. In 2007, 10 districts were invited to participate in the writing assessment at grade 8, and all 10 agreed to do so. The District of Columbia is usually included in the TUDA results, but in 2007 we administered three assessments—reading, mathematics, and writing—and the District of Columbia only had enough students to participate in reading and mathematics.

The samples for the districts ranged from about 900 to 2,000 students. We can make comparisons with 2002 results for the four districts that also participated that year. In three of these districts—Atlanta, Chicago, and Los Angeles—scores were higher in 2007. There was no significant change in Houston.

In addition to comparing the urban districts to the nation, we compare their performance to that of students in large central cities nationally. Large central cities are an appropriate comparison group for the TUDAs because their demographic makeup is closer to that of the districts. The large central city average is based on the performance of students in central cities with a population of at least 250,000. The average score for students in large central cities was higher in 2007 than in 2002.

In 2007, compared to both the national and large central city averages:

  • Charlotte-Mecklenburg in North Carolina was the only district whose average score was above the large central city average.
  • Seven districts—Atlanta, Austin, Boston, Chicago, Houston, New York City, and San Diego—had average scores that were not significantly different from the large central city average.
  • The remaining two—Cleveland and Los Angeles—were below the large central city average.
  • All the districts except Charlotte were below the national average.

We can compare score gains from 2002 to 2007 for Atlanta, Los Angeles, and Houston with the gains posted by the districts’ home states over the same period.

  • In Georgia, Atlanta’s 15-point increase in 2007 was larger than the state’s 6-point gain.
  • In California, the average score for Los Angeles in 2007 was 9 points higher than in 2002, while the statewide average did not change significantly.
  • In Texas, scores did not change significantly for either Houston or the state.

Turning to achievement levels, three districts had a percentage at or above Proficient that was higher than the large central city percentage—Charlotte, San Diego, and Austin. Five districts—New York City, Boston, Chicago, Atlanta, and Houston—were comparable to the large central city percentage. Two districts—Los Angeles and Cleveland—were below the large central city percentage. The percentages at or above Proficient ranged from 31 percent in Charlotte to 9 percent in Cleveland.

Grade 12

Now I will describe the writing results for twelfth-graders nationally.

Overall Results

The average score for twelfth-graders was 5 points higher than in 2002, rising from 148 to 153, and 3 points higher than in 1998, when the average was 150.

It should be kept in mind that eighth-grade and twelfth-grade performance are presented on separate 0–300 scales. Each scale is constructed to reflect the difficulty of the work expected at the given grade. Thus, the twelfth-grade average of 153 in 2007 should not be interpreted as a "lower" score than the eighth-graders' average of 156.

For the most part, twelfth-grade achievement-level results improved in 2007. The percentage at or above Basic, 82 percent in 2007, was higher than in 2002, 74 percent, and in 1998, 78 percent. The percentage at or above Proficient, 24 percent, was higher than in 1998, when it was 22 percent. The percentage at Advanced declined since 2002 from 2 percent to 1 percent.

Results by Race/Ethnicity

Average scores for White students increased in 2007, compared to both prior assessments. Black and Asian/Pacific Islander students’ scores increased since 2002 only. For Hispanic and American Indian/Alaska Native students, scores did not change significantly compared to previous assessments.

At the twelfth grade, score gaps did not change significantly either between White and Black students (23 points in 2007) or between White and Hispanic students (20 points in 2007).

Results by Gender

In 2007, scores were higher for male students than in either 1998 or 2002, while scores for female students were higher than in 1998 only.

Female students continued to outperform male students. The gap in 2007, 18 points, was smaller than in 2002, but not significantly different from the gap in 1998.

For More Information

There is much more information in the Writing Report Card, and additional materials are available on the initial release website.

In closing, I would like to thank all the students and schools that participated in these assessments.
