
The Nation's Report Card - Results from the 2007 NAEP writing assessment

Dr. Peggy G. Carr Hello, and welcome to today's StatChat on the 2007 NAEP Writing Assessment. I hope you've had time to look at the results on the website. I'm sure that you have many questions regarding today's release, so let's get right to them.

Sharlene from Salt Lake City, UT asked:
Can you provide more information about the following groups of 8th and 12th graders in each area:
1. Students with disabilities and students without disabilities
2. Males and females overall
3. Ethnic groups: African Americans, Asians, Hispanics, and other groups?

Dr. Peggy G. Carr 's response:
Information on all of these student groups can be found on the Nation's Report Card website, which reports results by disability status, by gender, and by race/ethnicity.

Maisie from New York, New York asked:
Was the national average scale score at 8th grade 156 or 154? You have reported it at 156 in Dr. Schneider's letter and 154 on the charts.

Dr. Peggy G. Carr 's response:
Maisie, I can understand your confusion, because both numbers are correct. The 156 score is the combined score for public and private school students. The 154 score is the average score for public school students only. We use the public school average when comparing the average score for public school students across states.
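The two averages can be reconciled arithmetically: the combined figure is an enrollment-weighted mean of the public-school and private-school averages. A minimal sketch, where the roughly 90/10 public/private weighting and the private-school average of 174 are illustrative assumptions (only 154 and 156 come from the report):

```python
def combined_average(scores, weights):
    """Weighted mean of group averages: sum(w_i * s_i) / sum(w_i)."""
    total_weight = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total_weight

# Assumed illustrative weights: ~90% public, ~10% private enrollment.
# With a private-school average near 174 (hypothetical), the combined
# national mean lands near the reported 156.
public_avg, private_avg = 154, 174
combined = combined_average([public_avg, private_avg], [0.90, 0.10])
print(round(combined))  # 156
```

This is why state-to-state comparisons use the public-only average: the private-school share of enrollment varies by state, so the combined figure would mix two different populations.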

Stan from Washington, DC asked:
Is there any correlation between the ranking of the amount of money spent per pupil for cities or states and the ranking against the test scores of pupils, particularly in the areas of math and science? In other words, does it make sense to increase budgets in already well-funded, but failing systems, or should there first be a demonstration that the existing budgets are being used wisely and efficiently. For some very well funded, but failing school systems, there appears to be little or no accountability. To throw more money at a poorly administered system with marginal, but well-paid administrators and teachers, seems to be counterproductive. Are existing teacher performance standards adequate?

Dr. Peggy G. Carr 's response:
Stan, NAEP collects a wealth of background information but, unfortunately not the information that might provide an answer to your specific questions. For example, the amount of funding per student is not collected and, therefore, NAEP cannot be used to determine whether there is a correlation between the level of funding and test scores. In addition, NAEP is not designed to determine proficiency at the student or school level and would not be the appropriate tool to evaluate specific school systems in terms of test scores.

John from Bridgeport, Connecticut asked:
Two part question: First, many international studies are being conducted to compare US academic skills with other countries such as PISA. How does the US rank with other countries? And second, what instructional strategies and or programs seem to make the most difference in improving student writing skills?

Dr. Peggy G. Carr 's response:
Although international studies such as PISA compare US academic skills with those of other countries, none of them assesses writing. An international study of writing was attempted in the 1970s, but did not succeed: the countries could not agree on a common standard for evaluating writing samples. Apparently, the conventions for what constitutes good writing differ from one language to another. Consequently, there is no international study that can answer your first question.

To answer your second question: while NAEP does not, and is not intended to, measure a causal relationship between curriculum and performance, there is some information about instructional practices on the Nation's Report Card website. For instance, higher percentages of females used time during the assessment to formulate their writing (i.e., their test booklets had evidence of pre-writing). You can look at the percentage of students whose booklets showed evidence of pre-writing and analyze their scores further.

Adrienne Greene from Newburgh, NY asked:
As a high school teacher of social studies, I have found my students lacking the tools to adequately complete document-based questions and the ability to synthesize information into a comprehensive essay that completes the task at hand.

Do the findings of the writing assessment necessitate the treatment of this expression as a separate ELA component at the middle school and secondary school levels?

Dr. Peggy G. Carr 's response:
There are data on the NAEP website that may be of interest to you about relationships between student performance and instruction, but it is important to note that these relationships are correlations and do not establish causality.

Mike Chapman from Helena, MT asked:
Do you have any comment concerning how much emphasis writing is given during teacher training at the university level?

Dr. Peggy G. Carr 's response:
Hi Mike--

See my response to Adrienne's question. Hope this helps.

Barbara from Old Bridge, NJ asked:
How do schools nationally compare to NJ schools?

Dr. Peggy G. Carr 's response:
Hi Barbara, If you are asking how 8th-grade students in New Jersey compared to the rest of the nation in writing: their average score was 175, the highest of any state in the assessment, and significantly greater than the national average for public school students (154). Results for individual schools cannot be reported. For the 12th grade, average scores are reported only at the national and regional levels.

Amy from Rochester, NY asked:
What would you say is the most striking change in results from the 2002 assessment to the 2007 assessment?

Dr. Peggy G. Carr 's response:
Amy, some of the more notable findings of the 2007 assessment included the following:
1. The improvement at grade 12 in writing, which we have not seen recently in some of the other subjects;
2. The improvement among male students at both grades 8 and 12;
3. The narrowing of the performance gap between Black and White students; and
4. Improvements in performance at the 8th grade in some of the large urban school districts since 2002.

Mike Bowler from Catonsville, MD asked:
I'm interested in what's happened to the "gender gap," which was substantial in '02. Was that true again in the '06 assessment, or has there been a closing of the gap or a change in the scoring pattern? And though I know you can't answer "why," maybe you could discuss what experts say about why one gender apparently writes better than the other.

Dr. Peggy G. Carr 's response:
Hello Mike, According to our 2007 results, the gap between male and female 12th-grade students is getting smaller: twelfth-grade boys are improving their scores at a faster rate than twelfth-grade girls. However, the gap has not changed for 8th-grade students; eighth-grade boys and girls are improving at about the same rate. In her statement, Amanda Avallone, the vice chair of the National Assessment Governing Board, addressed the issue of the ongoing gender gap in writing proficiency, drawing on her own experience as an eighth-grade teacher. Her statement can be obtained from the National Assessment Governing Board at (202) 357-6938.

Scott from Norfolk, VA asked:
What changes will result from the NAEP Writing assessment moving to computer-based testing under the new framework?

Dr. Peggy G. Carr 's response:
Scott, that is an excellent question. We are currently developing the computer-based NAEP Writing assessment and planning its implementation. Among the many changes we are considering is the use of standard editing tools (e.g., a spelling checker). The impact of changes such as these on differences in writing proficiency among student groups is difficult to predict.

Bertha from Raleigh, NC asked:
Are children attending charter schools included in the NAEP sample and if so, can results be analyzed separately for those students?

Dr. Peggy G. Carr 's response:
Bertha, Charter school results are, indeed, included in the public school results. Results can be disaggregated for charter schools and other public schools using the NAEP Data Explorer. Type-of-school variables are listed under "Major Reporting Variables"; choose the variable labeled "School identified as charter (National Public)".
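This is not the Data Explorer itself, just a sketch of the disaggregation it performs: group student records by a school-type indicator and average the scores within each group. The records and the "charter" field here are hypothetical:

```python
from collections import defaultdict

def average_by_group(records, group_key, value_key):
    """Mean of value_key within each level of group_key."""
    sums = defaultdict(lambda: [0.0, 0])  # group -> [running total, count]
    for rec in records:
        acc = sums[rec[group_key]]
        acc[0] += rec[value_key]
        acc[1] += 1
    return {group: total / n for group, (total, n) in sums.items()}

# Hypothetical student records with an assumed charter-school indicator:
students = [
    {"charter": "yes", "score": 150},
    {"charter": "yes", "score": 158},
    {"charter": "no",  "score": 152},
    {"charter": "no",  "score": 156},
]
print(average_by_group(students, "charter", "score"))
# {'yes': 154.0, 'no': 154.0}
```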

Paul from Los Altos, CA asked:
Since the number of children excluded from participating in NAEP changes from test year to test year, do the results you are publishing now correct for that source of bias?

Dr. Peggy G. Carr 's response:
Paul, We have looked into this issue extensively, and are fairly certain that little or no bias is introduced into the results because of differential exclusion rates. With regard to state trends, a non-significant correlation (.16) was found between changes in the rate of exclusion and changes in average scores. Therefore, exclusion rate changes explain little to none of the score changes.
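The check described above can be sketched as a simple correlation: pair each state's change in exclusion rate with its change in average score and compute the Pearson coefficient. The per-state figures below are made up for illustration; only the method corresponds to the description:

```python
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical per-state changes between assessment years
# (exclusion rate in percentage points, score in scale-score points):
exclusion_change = [0.5, -1.2, 0.3, 2.0, -0.4, 1.1, -0.8, 0.0]
score_change     = [3.0,  1.5, 2.0, 4.0,  2.5, 1.0,  3.5, 2.0]
print(round(pearson_r(exclusion_change, score_change), 2))
```

A coefficient near zero, as NAEP reports, indicates that states whose exclusion rates changed more did not systematically show larger score changes.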

Robert from Sterling, VA asked:
Do you think that the popularity of text messaging amongst kids and the shorthand-like manner in which they text is posing a negative impact on writing skills?

Dr. Peggy G. Carr 's response:
Hi Robert, There is no way to answer your question directly using the 2007 NAEP writing results. However, the assessment included a few background questions on the use of technology that can be correlated with student scores using our NAEP Data Explorer.

Bob from Brookeville, MD asked:
Are there any plans to test 4th graders in Writing the next time the assessment occurs?

Dr. Peggy G. Carr 's response:
Yes. The schedule of assessments set by the National Assessment Governing Board calls for a fourth-grade writing assessment in 2011. The planned schedule can be found on the Governing Board's website.

Matt from Topeka, KS asked:
Based upon the "evidence of pre-writing" information above, does NAEP have any intention of considering a shift away from the on-demand assessment of writing to a more process-based assessment?

Dr. Peggy G. Carr 's response:
The 2007 assessment does give students the opportunity to engage in a form of process writing. Students are given space in their test booklets to pre-write, and are encouraged in the directions to do so. They are also given brochures that present some strategies for planning, revising, and editing. Further, students' writing for the NAEP assessment is scored as first-draft writing; a certain level of error is expected and acceptable. The framework upon which the new writing assessment (to be administered in 2011) will be based recognizes the value of planning, drafting, and revising. In fact, because students at grades 8 and 12 will take the 2011 assessment on computer, they will have access to a menu of editing tools typically available to those writing on a computer. However, the new framework also emphasizes timed, on-demand writing, given how much of this kind of writing people are expected to do in educational and workplace settings. Hence, students will still be expected to produce on-demand writing within the time allotted for the 2011 assessment.

Mike Chapman from Helena, MT asked:
Mr. Casserly made reference to an "alignment" problem in NC as a possible reason for low scores. Does this refer to alignment of state curriculum vs NAEP frameworks, or something else?

Dr. Peggy G. Carr 's response:
Hi Mike, I can't speak for Mr. Casserly; it could be one or all of those things, so you might contact him directly for clarification. I might add that Charlotte has consistently scored comparably to or higher than North Carolina in all subjects tested. So the real answer is likely to be more complex than what we are seeing in this one assessment.

Dr. Peggy G. Carr :
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this chat helpful and the report interesting.

Back to StatChat Home