Results from the 2002 National Assessment of Educational Progress (NAEP) Reading Assessment

Dr. Peggy G. Carr Hello, and welcome to today's StatChat on the NAEP 2002 Reading results for the nation and the states. I hope you've had time to look at the results on the website. I'm sure that you have many questions regarding today's release, so let's get right to them...

jay samuels from St. Paul, Minn., The University of Minnesota asked:
As a member of the National Reading Panel I am aware that there is, for want of a better term, a "racial achievement gap" in reading. In general, Blacks, Native Americans, and Hispanics do less well in comparison to Caucasians. I think that this gap is cast in the wrong light. It is not a racial gap but a socio-economic gap; in this case race and SES are confounded. A random selection of these underperforming groups and a random selection of better-performing groups results in comparison groups that are significantly different in wealth and education. What do you think of my suggestion of recasting the problem as a gap based on SES rather than race?
Dr. Peggy G. Carr: I totally agree, but as it turns out we do have SES-surrogate variables in the assessment, such as Title I and eligibility for free or reduced-price lunch. You can use the NAEP data tool to create a cross-tabulation of race/ethnicity with these variables as an approach to investigating these relationships. Peggy
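The kind of cross-tabulation Dr. Carr suggests can be sketched in a few lines of pandas. The student records, variable names, and scores below are invented for illustration only; the real NAEP data tool uses its own variables and produces design-weighted estimates:

```python
import pandas as pd

# Hypothetical student-level extract -- NOT real NAEP data.
students = pd.DataFrame({
    "race_ethnicity": ["White", "Black", "Hispanic", "White", "Black", "Hispanic"],
    "free_reduced_lunch": ["No", "Yes", "Yes", "Yes", "No", "Yes"],
    "scale_score": [231, 202, 208, 215, 220, 199],
})

# Mean scale score by race/ethnicity crossed with the SES-surrogate variable,
# which is the shape of the analysis described in the answer above.
table = students.pivot_table(index="race_ethnicity",
                             columns="free_reduced_lunch",
                             values="scale_score",
                             aggfunc="mean")
print(table)
```

Cells with no students in the (invented) sample come out as NaN; a real analysis would also carry the sampling weights and standard errors that NAEP reports.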

Kristi Garrett from Sacramento, California asked:
Some researchers hypothesize that older students score lower on some national and international comparisons -- like TIMSS and NAEP -- because of the low-stakes nature of the tests. In other words, they have learned these tests have no consequences for their future, so they don't give them their best effort. Could this motivational phenomenon be a partial explanation for the poorer performance of secondary students?
Dr. Peggy G. Carr: The National Assessment Governing Board (NAGB) is investigating the effects of motivation in the grade 12 assessment, and determining whether or not the program should be adjusted to make sure that we get optimal performance from 12th graders. However, it is also important to remember that if motivation were to explain declines in scores, we would need to believe that the average motivation of older children had declined, and we have no direct evidence of this, one way or the other. Peggy

sherri patterson from louisville kentucky asked:
I was wondering if I could take my classes on my computer instead of being in a classroom?
Dr. Peggy G. Carr: This is a question best addressed to your local school district.

hermie from reno, nevada asked:
Will reading a local newspaper, or learning to use comics as a reading tool for values, etc., increase reading skills and a craving to read?
Dr. Peggy G. Carr: As a long-time comic reader myself, I believe that comics propelled my reading skills to a higher level. However, there is no evidence from NAEP on whether comic reading is beneficial, because we don't ask whether children read comics regularly. What we do ask is whether students read for fun! What we have found is that there is a positive relationship between reading for fun and NAEP scores -- not causal -- but interesting. Peggy

Josh from Dallas, TX asked:
Forgive me if this is obvious, but are scale scores broken down by race/ethnicity and by state easily available on the website? I'm looking for an easy comparison of the Hispanic and black scale scores of Texas vs. other states. Many thanks!
Dr. Peggy G. Carr: Yes, at grades four and eight race/ethnicity data are available state-by-state in our online data tool. First click the "NAEP data" link at the top of the home page. Then select "reading," "grade 4" or "grade 8," "National Public," and "Major Reporting Groups." This will give you a link for "race/ethnicity from school record." Selecting that will give you national data by race/ethnicity. Then click on user options (top right-hand corner), choose "add/delete states," and select the states of interest. Peggy

Mark from Fort Wayne, Indiana asked:
Could we get some data on how Reading Recovery programs affect student achievement? Thank you.
Dr. Peggy G. Carr: Mark, since NAEP does not collect data about Reading Recovery programs for the assessed students, NAEP results do not provide specific information about this topic. If there are states with full Reading Recovery implementation during the period since the last assessments, it is possible that Reading Recovery is one factor in reading improvement, but NAEP cannot provide specific information on this. Peggy

Betsy Hammond from Portland, Oregon asked:
Why do you think California students read so poorly, according to NAEP results?
Dr. Peggy G. Carr: While it is true that California students scored lower than students in some other states, many groups within the state have shown gains in recent years. For example, 4th-grade Black students have gained 15 points since 1992, Hispanic students have gained about 12 points, and Asian/Pacific Islander students have gained about 13 points. Overall state scores have gone up since their low point in 1994. These gains have come even as the state has included greater percentages of special-needs students in the assessment. As for why students in any state perform at different levels than those in other states, NAEP is not designed to answer this sort of causal question.

Miriam from Boston, MA asked:
For students with disabilities who participated in NAEP testing, were they provided with accommodations (which did not fundamentally alter the test) or modifications (which did fundamentally alter the test)?
Dr. Peggy G. Carr: NCES permits students accommodations as needed in order to participate appropriately in NAEP, as long as the accommodation does not alter the measurement of reading. Certain changes to assessment conditions are not permitted by NAEP on the reading assessment, such as reading of the passage aloud or translation, as we believe they may alter the construct being assessed. You can find an explanation about NAEP's inclusion policy on our website. Peggy

Todd from Raleigh, NC asked:
If a state's exclusion rate has increased from 7 percent to 12 percent from 1998 to 2002, can valid comparisons be made?
Dr. Peggy G. Carr: Yes, we believe the comparisons are valid. It is possible that an increase in exclusion over time might affect score trends, and readers need to consider such increases when interpreting results. However, our studies of this issue indicate that exclusion alone does not account for very much of the score gains in states like North Carolina. Take a look at Appendix A of The Nation's Report Card; it contains information about the special analyses we conducted (pp. 155-163). Peggy

Stephen from st. petersburg, florida asked:
What are the major caveats for using the new NAEP data to make comparisons among states? We can use our own state tests to see whether we are making progress over time, comparing this year's Florida students against last year's, etc. NAEP is helpful to place our progress in context. Yet direct state-by-state comparisons and rankings are discouraged. Would you advise against statements such as, "Florida improved its scale score, yet still ranks below Alabama in 4th grade reading?"
Dr. Peggy G. Carr: You can make state-to-state comparisons using NAEP data. The major caveat is that not every apparent difference between states is statistically reliable; some differences simply reflect random fluctuation. Figure 2.6 (p. 32) of the Reading Report Card shows the results of statistical tests comparing every participating state with every other one. Results shown in that figure indicate that Florida fourth graders had an average scale score that was higher than Alabama's fourth graders. Peggy
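The statistical tests behind a figure like 2.6 amount, in essence, to comparing the difference in two state means to the standard error of that difference. A minimal sketch, using invented means and standard errors (the real NAEP procedure also adjusts for the many comparisons being made at once):

```python
import math

# Invented numbers for illustration -- NAEP publishes the actual means
# and standard errors for each participating state.
fl_mean, fl_se = 218.0, 1.2   # hypothetical Florida mean and standard error
al_mean, al_se = 207.0, 1.4   # hypothetical Alabama mean and standard error

diff = fl_mean - al_mean
se_diff = math.sqrt(fl_se**2 + al_se**2)   # SE of a difference of independent means
z = diff / se_diff
significant = abs(z) > 1.96                 # roughly the 5 percent level
print(round(z, 2), significant)
```

When `significant` is False, the two states' scores should be described as not measurably different, even if one mean is numerically higher.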

Clare from Shirley, MA asked:
I am very happy to see that students in Massachusetts performed very well on NAEP! How can these results be compared to those of a given state's own assessment program?
Dr. Peggy G. Carr: You are certainly right: Massachusetts fourth graders outperformed students in other states, and the eighth grade students also did well. Unfortunately, we have no information at this time that allows us to make direct comparisons of NAEP and the Massachusetts assessment. Peggy

Kristi Garrett from Sacramento asked:
California had significantly more (27%) English learners take the test than the national average (7%), which probably helps explain our lower scores on a reading test. We've been cautioned against using these scores to make state-by-state comparisons, but isn't that what typically happens?
Dr. Peggy G. Carr: This is an excellent and somewhat complex question. Comparisons of average scores in states do give an accurate picture of differences in the performance of students. However, you are also correct in noting that states have varied demographic compositions. People interpreting data may want to keep these differences in mind. You can gain a quick picture of a state's demographic makeup by going to the state profiles on our website and clicking on any state. Peggy

Nancy from Arlington, VA asked:
What role does NAEP play in the No Child Left Behind Act? Is it used to confirm states' results and whether they are making adequate yearly progress?
Dr. Peggy G. Carr: Nancy, the only specific role defined for NAEP in No Child Left Behind is that states that receive Title I funds must participate in state NAEP in reading and math at grades 4 and 8 every other year. It is anticipated that policy makers will use NAEP results as a "serious discussion tool" in evaluating state assessment results.

Frank from Madison, WI asked:
If schools are chosen randomly for each assessment, why are certain schools selected several years in a row?
Dr. Peggy G. Carr: NAEP uses a stratified random sample. That is, it randomly selects schools within strata defined by different criteria, such as serving low-income or Hispanic students. This helps the sample reflect the abilities of all segments of a state. By the way, some small states have all of their schools selected for every assessment.

Amy Miller from Asheville, North Carolina asked:
I'm a reporter wondering how local school districts can use these data. Administrators in North Carolina have told me the data aren't useful for their specific school district. How do you think local school districts could use the information?
Dr. Peggy G. Carr: It's true that NAEP does not provide average scores for schools within North Carolina. Still, school districts can benefit by knowing how well their states as a whole did in comparison with the nation and other states. In addition, school districts may find it useful to see what the National Assessment Governing Board thinks is important to measure about reading skills, as reflected in the reading questions. Sample questions are always available on the NAEP website.

Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact the NAEP staff if you need further assistance. I hope that you found this session to be helpful and the reports to be interesting. Later this summer, we will release the NAEP 2002 Writing results for the nation and the states. Please be sure to visit our website for more information.
