
Jack Buckley
Commissioner, National Center for Education Statistics

National Assessment of Educational Progress
NAEP 2011 Writing Assessment

September 14, 2012

Today I am releasing the 2011 Writing Report Card, which presents the results of our first computer-based writing assessment.

The assessment was administered in early 2011, at grades 8 and 12. Our samples included 24,100 eighth-graders and 28,100 twelfth-graders. The 2011 assessment was conducted at the national level only. Both public and private school students were assessed.

Results for the assessment are reported in two ways: as average scale scores on a 0–300 scale, with a separate scale for each grade, and as percentages of students at the three achievement levels: Basic, Proficient, and Advanced.

The achievement levels were developed by the National Assessment Governing Board. They set standards for what students should know and be able to do. For each subject and for each grade, the Governing Board has established standards for Basic, Proficient, and Advanced performance. Ultimately, the goal is to have all students performing at or above the Proficient level.

Because the 2011 assessment uses new technologies that differ significantly from those used in previous assessments, we can't make comparisons to past results. As this is the first assessment in a new series, the average score for both grade 8 and grade 12 students has been set at 150. In the assessment, students are presented with tasks that reflect grade-appropriate, real-world issues and are designed to measure one of three communicative purposes: to persuade, to explain, or to convey experience, real or imagined.

The percentage of students' time devoted to each of the three purposes differed between the two grades, with greater emphasis on writing to convey experience at grade 8 than at grade 12. Each writing task fell into one of the three categories. In addition, each task specified or implied a particular audience appropriate to the task. To be effective, writers must be aware of their intended readers' needs and their level of knowledge about the topic.

When scoring student responses, we take into account that students are essentially providing us with first drafts.

Scorers received special training to ensure consistency in scoring for all responses. A certain percentage of student responses were scored twice, by different scorers, to ensure that scorers were indeed producing consistent results. Each student response was given one of six possible skill ratings, running from "Effective" down to "Little or no skill."

Students were evaluated using holistic scoring rubrics whose criteria were based on three broad features of writing: development of ideas, organization of ideas, and language facility and conventions.
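Returning to the double scoring mentioned above: one simple way to quantify scorer consistency is the exact-agreement rate between two independent scorers on the same responses. The sketch below is illustrative only, with made-up ratings, and is not NAEP's actual reliability measure.

```python
# Illustrative only: exact-agreement rate between two independent scorers
# on the same set of responses. NAEP's actual reliability analysis is more
# involved; the ratings below are made up.

def exact_agreement(scores_a: list[int], scores_b: list[int]) -> float:
    """Fraction of responses for which the two scorers gave the same rating (1-6)."""
    assert len(scores_a) == len(scores_b)
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Hypothetical double-scored responses on the six-level rating scale.
first_scorer = [4, 3, 5, 2, 6, 3, 4, 1]
second_scorer = [4, 3, 4, 2, 6, 3, 5, 1]
print(f"Exact agreement: {exact_agreement(first_scorer, second_scorer):.0%}")  # 75%
```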

There were many innovations in the 2011 writing assessment.

Because the 2011 writing assessment was computer-based, we were able to take advantage of what is called "universal design": the software used by all students includes a variety of features that accommodate special needs students, both students with disabilities and English language learners. All students can take advantage of these features if they wish. In addition to the accommodations incorporated into the universal design, some accommodations were available only to special needs students.

One of the grade 8 writing tasks offers an example of a multimedia format featuring an audio prompt. It asked students to immerse themselves in an imaginary situation and to write about it as if from personal experience. Students listened to an audio recording of atmospheric sounds while reading a few sentences from an imaginary journal. The first sentence of the text that students read was as follows: "When we first arrived on the island, we saw mountains and fields with lots of colorful flowers and large, strange-looking trees."

While students read the passage, the audio played the sound of waves lapping on the shore, the squawking of birds, and the sound of footsteps in the sand, creating a sense of the island world that the students were to imagine exploring.

Students then typed their responses in a response window that allowed them to refer to the prompt at all times. Each student completed two writing tasks, or prompts, during the assessment.

Grade 8 Results

Before discussing the grade 8 results, I must remind you that all NAEP results are based on samples, which means that there is a margin of error associated with each score or percentage. Therefore, we only identify those differences in scores or percentages that meet our standard for statistical significance.
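To give a sense of what that standard means, here is a minimal sketch, not NAEP's actual estimation procedure: a difference between two group averages is treated as statistically significant only when it is large relative to the standard error of that difference. All of the numbers in the example are hypothetical.

```python
# Minimal illustration of a significance check for a difference between two
# group averages. This is not NAEP's actual procedure; the means and standard
# errors below are hypothetical.
import math

def significant_difference(mean_a, se_a, mean_b, se_b, critical_value=1.96):
    """Return the score difference and whether it exceeds its margin of error.

    critical_value=1.96 corresponds roughly to a two-sided test at the
    .05 level under a normal approximation.
    """
    diff = mean_a - mean_b
    se_diff = math.sqrt(se_a ** 2 + se_b ** 2)  # standard error of the difference
    return diff, abs(diff) > critical_value * se_diff

# Hypothetical group averages on the 0-300 writing scale.
diff, is_significant = significant_difference(160.0, 0.8, 141.0, 0.9)
print(f"Difference: {diff:.1f} points; statistically significant: {is_significant}")
```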

Since many of our results refer to the NAEP achievement levels, I'll define each of them briefly. Basic denotes partial mastery of the knowledge and skills that are fundamental for proficient work at each grade. Proficient represents solid academic performance; students reaching this level have demonstrated competency over challenging subject matter. Advanced signifies superior performance.

Twenty-seven percent of students performed at or above the Proficient level (24 percent at Proficient and 3 percent at Advanced). Fifty-four percent of students were at Basic, while 20 percent were below Basic.

We report scores in 2011 for the seven racial/ethnic groups for which NAEP now collects separate data. In 2011, 58 percent of eighth-graders were White, 14 percent were Black, and 20 percent were Hispanic. Scores for each group can be compared to the national average of 150 as well as to one another. Asian students, with an average score of 165, scored higher than all the other groups.

The average score for female eighth-graders in 2011 was 160, which was 19 points higher than the average score for male eighth-graders (the difference is calculated using unrounded scores). While we can't compare 2011 scores with previous assessments, we can say that female students had consistently higher scores than male students, at all grades, on previous writing assessments.

NAEP uses student eligibility for the National School Lunch Program as a measure of family income. Students whose families have an income of less than 185 percent of the federal poverty level are eligible for the school lunch program, while those whose families are above that threshold are not eligible.
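The eligibility rule just described is a simple income threshold. The sketch below expresses it in code for illustration only; the poverty-guideline figure is a placeholder parameter, not an actual published value.

```python
# Sketch of the eligibility rule described above: a family is treated as
# eligible when its income is less than 185 percent of the federal poverty
# level. The poverty guideline passed in is a placeholder, not a real figure.

def nslp_eligible(family_income: float, poverty_guideline: float) -> bool:
    """Return True if family income is below 185% of the poverty guideline."""
    return family_income < 1.85 * poverty_guideline

# Hypothetical usage with placeholder numbers.
print(nslp_eligible(family_income=30000, poverty_guideline=22000))  # True: 30000 < 40700
```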

In 2011 at grade 8, students who were eligible scored 27 points lower than students who were not eligible. Eligible students constituted 42 percent of all eighth-graders in 2011.

In addition to assessing students' writing ability, the 2011 writing assessment included a questionnaire to be filled out by the teachers of participating students, with questions about their classroom practices and other topics. Among other things, teachers were asked how often they asked students to use computers to draft and revise their writing.

Students whose teachers more frequently asked them to use the computer to draft and revise their writing scored higher than those whose teachers did so less frequently. For example, students whose teachers said they never or hardly ever asked students to make such use of their computers had an average score of 141, at least 5 points below any of the other categories.

About 44 percent of students had teachers who said they asked students to use computers to draft and revise their writing either very often or always or almost always. It may be that asking students to use computers for their writing improves their writing skills, but many factors affect student performance, and NAEP is not designed to identify the causes of that performance. It may also be that computer-based instruction is more prevalent in higher-income areas, or that teachers with higher-performing students are more likely to ask their students to use computers.

Regardless of income, students' performance tended to increase with the frequency with which their teachers asked them to use computers to draft and revise their writing. The average score of lower-income students whose teachers reported never or hardly ever asking them to use computers for their writing was 130, while the average for lower-income students whose teachers reported always or almost always doing so was 141. There were similar differences for students from higher-income families, with scores running from 155 for those in the never or hardly ever category to 167 for those whose teachers reported always or almost always asking them to use computers for their writing.

One of the universal design features of the writing assessment allowed students to listen to a writing prompt instead of reading it, via a text-to-speech tool. Seventy-one percent of eighth-graders used the text-to-speech function at least once.

We report the average scores for students who did not use this tool at all, those who used it once, those who used it twice, and those who used it three or more times. The results show that increased use of the text-to-speech tool correlated with lower scores, on average.

Grade 12 Results

At grade 12, as at grade 8, 27 percent of students were at or above Proficient (24 percent at Proficient and 3 percent at Advanced). Twenty-one percent were below Basic, while 52 percent were at Basic.

White, Asian, and multiracial students had average scores that were comparable with one another and higher than those of Black, Hispanic, and American Indian/Alaska Native students. The average score for female twelfth-graders in 2011 was 157, 14 points higher than the average score for male twelfth-graders.

NAEP assessments ask students to report the highest educational level completed by each of their parents, using the five categories supplied by NAEP. In our analysis we group students according to the highest educational level attained by either parent. Higher educational attainment by a parent is associated with higher scores on NAEP. Nine percent of students said neither parent graduated from high school. The average score for these students was 129. In comparison, 49 percent said that at least one parent graduated from college. The average for these students was 160.

NAEP asked grade 12 students how many pages they wrote in a typical week for homework in their English/language arts class. Those who said they wrote four to five pages had an average score that was higher than the average scores of students who said they wrote fewer than four pages. Thirty-nine percent of students said they wrote none or up to one page in a typical week. NAEP results can't tell us if requiring students to write four or five pages a week for homework will improve their scores. It's possible that higher-performing students are likely to write more pages on their own.

Students were asked in the assessment how often during the school year they used a computer to make changes to a paper or report. In 2011, twelfth-graders who reported more frequent use of a computer to edit their writing had higher average writing scores than those who reported less frequent use. For example, students who said they always or almost always used a computer to edit their writing had a higher average score than students who said they did so less frequently. Fifty-six percent of students were in this top category.

The grade 12 writing assessment allowed students to access editing tools, among them a thesaurus. We report the average scores for students who did not access the thesaurus tool, those who accessed it once, and those who accessed it more than once. Sixty-nine percent of students never used the tool. Those who used it scored higher, on average, than those who did not, and those who used it at least twice had higher scores than those who used it only once.

The 2011 Writing Report Card provides all of this information and much more. In addition, the initial release website gives extensive information on the performance of students, and access to released assessment questions and student responses at all six rating levels through NAEP's Questions Center. The NAEP Data Explorer, our online data analysis tool, allows extensive further analysis of student performance as well.

In conclusion, I would like to offer my sincere thanks to all the students, teachers, and schools who participated in the 2011 writing assessment.

Visit the Nation's Report Card website.
