
TRANSCRIPT: The Nation's Report Card - Mathematics 2009

Dr. Peggy G. Carr

Hello, this is Dr. Peggy Carr, Associate Commissioner of the National Center for Education Statistics (NCES). Welcome to *Ask NAEP*, an online Q&A session about the findings for the NAEP 2009 Mathematics report. I hope you've had time to look at the results on the Nation's Report Card website, and that I can answer any questions you may have about them. I have a team of NAEP specialists here with me to make sure your questions are addressed thoroughly, so there may be a slight delay between the time you submit your question and the time the answer appears.

We are usually able to post answers for all questions, even after the *Ask NAEP* session ends at 4:00. If you cannot stay online for the full time, a complete *Ask NAEP* transcript will be available online after our discussion. I'm interested to hear which results you want to talk about.

Before we begin answering your questions, I would like to acknowledge what for many people may be one of the most interesting findings from the 2009 mathematics assessment — the fact that for the first time since the early 1990s, we saw no increase in the national average mathematics score among fourth-graders. Although NAEP is not designed to address why this leveling off may have occurred, considering additional data from the report helps us gain a more complete understanding of mathematics at grade four.

For example, it's important to go beyond the overall national scores, and look at what is happening across the states. In several states, fourth-grade students continue to make gains. Specifically, we found that scores increased in Nevada, Colorado, Kentucky, Maryland, District of Columbia, Rhode Island, New Hampshire, and Vermont. Of course, there were also some states in which scores declined -- Delaware, Indiana, West Virginia, and Wyoming.

But I encourage you to dig even deeper into the results released today, because there are many interesting findings to be examined. For instance, if you look at the percentile scores in states with overall gains, you'll find that in some cases, the gains were made mostly by higher performing students.

As you can see, there is a great deal to explore within this report. So—let's get started with your questions!

Jeff from Washington, DC asked:
What are the results for charter schools vs. non-charter schools for DC? I noticed that several states have that information, and historically DC has as well (NAEP 2007, 2005).

Dr. Peggy G. Carr's response:
Yes, you are correct that other states have separate data for their charter schools and non-charter schools. The results for DC that were released today show the combined data for both charter and non-charter schools. A separate press release is planned for later this fall for 18 urban districts, including DC. The data that will be available at that time will delineate the performance of students in schools that the D.C. school district is responsible for -- non-charter schools. At that time, the data for charter and non-charter schools in D.C. will be available on the web.

Richard from Saint Paul, Minnesota asked:
Is it true that school staff, rather than the full IEP team, decide which students will be tested as part of the National Assessment of Educational Progress (NAEP) sample?

Dr. Peggy G. Carr's response:
It is true that school staff decide whether a particular student identified as having a disability will participate in the assessment. However, the IEP is a prime consideration in this decision. School staff, and specifically the person who knows the student best, are asked to fill out a questionnaire for each student with a disability that asks whether that student takes the regular state test or an alternative test, and what testing accommodations the student requires. This information comes from the student's IEP. Then, in a pre-assessment visit, a NAEP field administrator discusses with the school staff whether that student will participate in NAEP. Usually, if the student takes the regular state assessment and needs testing accommodations that NAEP allows, that student will participate in NAEP. If the IEP calls for accommodations not allowed in NAEP (such as using a calculator for every math problem), the school staff is asked whether that student could participate in NAEP without those accommodations. The school makes the decision as to whether the student will participate.

Rick from Blaine, MN asked:
NAEP uses a selection process that can pick your child. In this process, how and by whom are special education parents notified, and on what basis? Is it the school staff who decide whether the child will take the test, or is some other process required?

Dr. Peggy G. Carr's response:
As I indicated in reply to Richard's question, school staff, guided by the student's IEP, determine whether the student is able to take the NAEP assessment. If a student is able to take the assessment, either the student or his or her parent can still decline to participate. All parents, not just those of students with an IEP, are notified of their child's selection and can opt out without needing to provide any reason.

Kim from Arlington, VA asked:
Could you please discuss the results of students with disabilities and the exclusion rates for such students?

Dr. Peggy G. Carr's response:
NAEP has made great efforts to increase the participation of students with disabilities in the assessments. For example, in the grade 4 math assessment, 15 percent of students with disabilities were excluded in 2009, compared with over half in 1990. This has taken place despite a large increase in the number of students identified with disabilities during this period. As a percentage of all students, most states excluded only 1 or 2 percent of students because of their disabilities. Students with disabilities do, on average, perform below the level of all students. For example, in 2009 the nation's public-school 4th graders with disabilities scored 249, compared with an average score of 285 for their nondisabled peers, a gap of 36 points. The NAEP program is continuing its efforts to increase the participation of students with disabilities and to reduce the differences in state exclusion rates even further. The National Assessment Governing Board has an ad hoc committee on the testing of students with disabilities and English language learners that is expected to make recommendations to the full Board in March on ways to increase participation.

Rob from Washington, DC asked:
How certain can we be that the standard setting levels used by NAEP (e.g., basic, proficient, advanced) mean precisely the same thing in mathematics in 1990 that they do in 2009?

Dr. Peggy G. Carr's response:
The standard setting levels themselves do not change; they are points on a scale that are deemed significant in differentiating between levels of performance, and they are kept constant throughout the trend line. A related question might then be: did the scale change between 1990 and 2009, and, as a result, do the standard setting levels mean something different? NAEP is built around reporting trends and therefore employs linking methodologies based on 70 percent of the item pool, which is well above the industry standard. As a result, we can be confident that the scale did not change and that the standard setting levels mean the same thing.

John Stallcup from Napa asked:
At the National Math Advisory Panel meeting held at Stanford University in 2006, Jim Milgram presented compelling evidence that approximately 20% of the NAEP math questions had either incorrect answers or were not really math questions. Has this situation been addressed and/or corrected?

Dr. Peggy G. Carr's response:
The study cited in the National Math Advisory Panel report is the "Validity study of the NAEP mathematics assessment: Grades 4 and 8" conducted by the American Institutes for Research. That report was extremely helpful to the program by illuminating some areas of the NAEP mathematics assessment that could be made more accessible and could be improved through greater precision in what is expected of students. Many of the recommendations resulting from this study are now a part of the standard test development process. For example, NCES has begun to regularly convene a panel of five mathematicians to confirm the mathematical accuracy of new operational items. In addition, all new items are subject to an independent language review, in conjunction with the content reviews, to improve accessibility for non-native speakers. Both NCES and the item development contractor have added review steps to the development process. Staff and consultants have been made aware of the issues through ongoing training. While there may always be some who have differing opinions about the way a particular item has been written, quality control steps have been enhanced to assure the mathematical accuracy of NAEP items.

Gwendolyn from Tallahassee, Florida asked:
The NAEP dataset is great and has been sustained over the years. Have you considered how to make it more meaningful to classroom teachers, teacher leaders and school administrators?

Dr. Peggy G. Carr's response:
Gwendolyn, the NAEP website provides a variety of data tools to help school administrators and teachers. These are available from the NCES/NAEP home page (just under the top banner). They include:

(1) The NAEP Data Explorer (NDE): Provides state-by-state and subgroup results for school administrator, teacher, and student surveys. Results include response percentages, average scale scores, and percentages at or above achievement levels.

(2) The NAEP Questions Tool (NQT): Provides results for individual assessment items. NAEP releases a portion of the item pool with each assessment. This tool enables examination of those items, percent-correct statistics, scoring guides, and exemplar responses. Further, you can download items of interest to "create" your own test. Finally, there is a "test yourself" feature that lets you respond to 10 questions and see your results.

(3) The State Profiles and State Comparisons Tool: Provides easy comparison of state-by-state results and subgroup differences with effective graphical and visual displays.

John from Elkins, WV asked:
To what can we attribute the pause in growth of 4th grade mathematics performance from 2007 to 2009?

Dr. Peggy G. Carr's response:
It is not possible to attribute this pause to particular factors using NAEP because of its cross-sectional design: the same students are not measured over time. However, these data are valuable as part of an ongoing dialogue about the academic progress of students in this country. Educators and researchers at the state and local levels need to examine what is going on in their jurisdictions to find plausible answers to what is and is not working.

Justin Stone from Washington, DC asked:
What do the results say about the minority of states where 4th grade math scores improved? Is the improvement due to the fact that many of these states are small?

Dr. Peggy G. Carr's response:
It is difficult to identify common factors in those states that improved. Each state controls its own curriculum, sets requirements for teachers, and so on. One group of these states, including New Hampshire, Rhode Island, and Vermont, has formed a consortium to develop a whole new assessment system, including teacher development, curriculum, and many other elements. However, one can only hypothesize that such initiatives were factors in their improvement on NAEP. Other states with improvements have made similar efforts, which would have to be researched separately. In addition, many factors in all the improving states, such as the social and economic status of the students, the level of educational financing, and the priority given to education, must be considered.

Mary from Lewisburg, WV asked:
Although the data are complex, has an effort been made to look at growth in the levels of complexity? What accounts for no growth at Grade 4---is there a mathematical strand that has decreased while another increased?

Dr. Peggy G. Carr's response:
You are correct that the NAEP data are very complex. Although we do not analyze the data by level of mathematical complexity, results by content strands are available at our website through the NAEP Data Explorer.

Leonie from New York, NY asked:
In general, when there are no significant gains for NY State on the NAEPs, there are none for NYC as well when the TUDA results are released, since NYC students are such a large part of the students of NY State. Would you expect this pattern to hold true again, and when are the TUDA results due to be released? thanks.

Dr. Peggy G. Carr's response:
Hi Leonie, it is certainly the case that the largest subset of a reporting unit will have a major impact on the results of the entire unit. In the case of New York State, approximately 35 percent of public-school students are from New York City, so New York City's results will substantially affect the state results. However, since the other 65 percent of students are from upstate New York, the City's results will not necessarily parallel the state results. The Math TUDA results are scheduled for release later this fall; at that time, results for New York City will be available.

Thank you for all of these excellent questions. We are hard at work getting your answers out. While we will stop accepting questions at 4:00, we will stay live online until all questions received before and during the chat have been answered.

Peter from Westminster, Maryland asked:
Do the NAEP mathematics results provide any evidence that the accountability policies associated with the No Child Left Behind Act of 2001 have led to higher student achievement?

Dr. Peggy G. Carr's response:
It's difficult to say what role a particular program or piece of legislation plays in student performance as measured by NAEP. Although NAEP is not designed to identify the influence of any particular policy on student achievement, it does serve an important role as an external indicator of how states are doing in meeting their educational goals. NAEP results can, and do, figure in to discussions of accountability, especially when comparing across states that have different educational standards and different assessments.

Lori from Orlando, FL asked:
The report mentions The Mathematics Framework which describes the skills to be assessed. Are the NCTM standards by grade-level part of the framework? If so, why aren't they mentioned? Many schools use the NCTM standards as a basis for instructional delivery and assessment.

Dr. Peggy G. Carr's response:
The NAEP Mathematics framework and the NCTM standards both have the same five major content areas, as do many state frameworks and standards. However, the NAEP mathematics framework strives for a balance across a variety of philosophies of mathematics curriculum and instruction. The NCTM standards informed the early development of the mathematics framework, along with other sources.

Don from Norwalk, CT asked:
The consistency of the small gap over the years between boys and girls is surprising. Does this gap hold true across racial and economic lines?

Dr. Peggy G. Carr's response:
This is the kind of interesting question that NAEP data can help answer. The overall gender gap that you mention is, in fact, not persistent across all racial/ethnic groups or levels of socioeconomic status. I encourage you to investigate further with our online data tools.

Lori from Orlando, FL asked:
Will the data be extrapolated to compare to international students of the same age/education levels?

Dr. Peggy G. Carr's response:
The international study that assesses grade 4 and grade 8 students in mathematics is the Trends in International Mathematics and Science Study (TIMSS). TIMSS was conducted in 2007, and will be conducted again in 2011, but not in 2009. So it is not possible to directly compare these most recent results from NAEP with results from other countries, based on TIMSS. NAEP data are made available to the academic and research communities, and it is possible that someone will compare 2009 performance on NAEP with performance of similar students in other nations in other years. NCES has not conducted such a study in recent years. In 2011, the design of the NAEP assessment may be tweaked to collect data that will facilitate stronger comparisons of this type.

Sharon from Falls Church, VA asked:
Considering states that made improvements in 4th grade over the last two years, have you been able to identify any specific policy or system wide changes in these states?

Dr. Peggy G. Carr's response:
Sharon, at grade 4, four states showed improvement both from 2005 to 2007 and from 2007 to 2009: District of Columbia, Kentucky, New Hampshire, and Vermont. While NAEP is not designed to identify the influence of any particular policy or educational practice on student achievement, NAEP results can be used to inform policy decisions at the state level. It may be helpful to look further into policies and practices in these four states to more fully understand factors that may be contributing to student gains. For instance, Vermont and New Hampshire are participants, along with Rhode Island, in the New England Common Assessment Program (NECAP), which measures student knowledge relative to expectations developed by teachers in those states. Nonetheless, we cannot say that the policies or practices taking place in any state caused NAEP scores to increase.

Heather from Baltimore, MD asked:
I'm curious to know what you would attribute the rise in scores for Maryland and the District of Columbia to, given that one has state-level leadership that has been in place for many years while the other's leadership is relatively new.

Dr. Peggy G. Carr's response:
Although NAEP results show which states are making gains, NAEP is not designed to identify the reasons for academic performance. Learning results from a complex of factors, including leadership, support for education, resources, and the social and economic status of the students. While NAEP teacher and school questionnaires collect some information related to these factors, we encourage users of NAEP data to consider a variety of sources of information in making interpretations about student performance.

Lori from Orlando, FL asked:
Have there been any efforts to get test data from home schooled 4th and 8th graders as a point of comparison?

Dr. Peggy G. Carr's response:
Lori, that is an interesting question. The number of home-schooled students has increased in recent years. We know that many spelling and geography bee finalists are home-schooled, but we don't know how they perform on the NAEP assessment. The federal government has not yet developed a sampling frame that includes these students.

Christopher Connell from Alexandria, VA asked:
Massachusetts is an outlier: fewest students at or below basic, most proficient and advanced in mathematics. This may not be new, but is there any explanation for why its students are achieving at higher levels?

Dr. Peggy G. Carr's response:
At the press release this morning, Dave Driscoll, chair of the Governing Board and former chief of the Massachusetts education department, commented that Massachusetts put a lot of effort into upgrading the skills of teachers and raising its academic standards in mathematics. You can view his remarks in the archived webcast ("Watch the press conference of the 2009 Mathematics Report Card").

Kate from Roseville, MN asked:
Can you comment on the positives and negatives coming out of the report since its release this morning?

Dr. Peggy G. Carr's response:
There were some encouraging results in this report. For example, grade 8 students continued to improve. The District of Columbia, Nevada, New Hampshire, Rhode Island, and Vermont made gains at both grades; no state declined at both grades. The fourth-grade results have stalled, but that in itself is informative. I recommend that you download the Report Card and read the executive summary (pages 1-3) to get a quick look at the most essential findings.

Lori from Orlando, FL asked:
Does the current school group participating in the study include any virtual or fully online programs? If not, are there plans to include them and to disaggregate their data like you do for Catholic schools?

Dr. Peggy G. Carr's response:
It is important to distinguish virtual schools from fully online programs. As with home-schooled students, NAEP is not able to assess students who are enrolled only in fully online programs that they study in their homes. However, NAEP can assess students who attend central locations where they take virtual classes, and we include eligible students in these programs whenever possible. That said, the sample sizes of students from these programs are not large enough to allow NAEP to report results for them.

Brian from Clinton, MA asked:
I have two children, and neither of them is yet in school. What can I do at home now to help them prepare? Also, what can I be doing as a parent to prepare myself to help them with math? I know a lot has probably changed since I was in school, and my math skills are probably in need of a tune-up.

Dr. Peggy G. Carr's response:
It's wonderful that you are thinking so far ahead! NAEP is concerned with the mathematics skills and knowledge of fourth, eighth, and twelfth grade students, and research conducted with NAEP does not address the mathematical development of younger children. Another National Center for Education Statistics program, the Early Childhood Longitudinal Study - Kindergarten Cohort (ECLS-K), would be a good resource to consult. The longitudinal nature of the ECLS-K data enables researchers to study how a wide range of family, school, community, and individual factors are associated with school performance. For children younger than 5 years, the Early Childhood Longitudinal Study - Birth Cohort should be consulted.

Rickie from Scandia MN asked:
Does the NAEP assessment need to be listed on a child's IEP, with a check box for the parent to indicate whether their child is allowed to participate?

Dr. Peggy G. Carr's response:
Setting guidelines for the content of IEPs is a state-level responsibility. I am aware that some states are considering and beginning to implement what you describe. Regardless of whether or not NAEP is listed on the IEP, school districts are requested to notify parents about the NAEP assessment and offer them the opportunity to not have their child assessed.

Young from Washington DC asked:
Dr. Carr, thank you for releasing these invaluable math performance data. It seems NCES has invested effort in improving the quality of data on student eligibility for the NSLP and reports comparable data back to 2003. Would you please elaborate on NCES's refined QC and QA efforts for this particular proxy measure of SES at the student's household level? Also, could you give an update on recent efforts to develop more reliable measures of SES using Census geocoded data?

Dr. Peggy G. Carr's response:
In collaboration with the Bureau of the Census, NCES is exploring a new measure of SES using American Community Survey (ACS) data. This new SES measure explores the possibility of using ACS data to validate and complement NAEP's current measures of socioeconomic status. The 2009 pilot data are currently undergoing thorough analysis and review. The next phase of our validation study will be conducted in 2010 during the NAEP data collection window.

Leanna from Naples, New York asked:
What do these data mean in terms of approaches to teaching math? Parents are confused about which approaches are best and how to be most effective in supporting their child's learning in math. What guidance can you give parents who are worried about their child not mastering the "basics" in math?

Dr. Peggy G. Carr's response:
While NAEP cannot answer your question directly in terms of recommending approaches to teaching math, we do collect data on several factors that have been associated with higher performance. For example, we have noted that students who take higher-level courses, such as algebra in the 8th grade, tend to outperform others. Data on student, teacher, and home background factors can be found on the NAEP Data Explorer. You may also find useful the information we have made available on the NAEP Questions Tool. There you will find hundreds of sample questions along with scoring guides and data showing how well students across the country perform on those questions. This might give you some indication of the skills and knowledge that are expected of students at different grades.

Rudy from Alexandria, VA asked:
The data seem to show that while all racial groups in both 4th and 8th grade are improving, the achievement gap hasn't really shrunk over the past couple of years. Historically, what kinds of activities or actions close this gap? Do you expect a point at which the scores of White students will top out and the scores of Black and Latino students will begin to close in?

Dr. Peggy G. Carr's response:
Over the past two decades, NAEP results have shown greater increases for Black and Latino students than for White students, so the gaps have begun to shrink (although not from 2007 to 2009). It is quite possible that such differential score improvement will continue in the future. It is not necessary for White students' scores to top out; the score gap could close simply through a continuation of the greater gains among Black and Latino students.

Mary King from Downers Grove, IL asked:
Why aren't all schools taking the same test? I don't know how schools can be compared when so many take different tests. In Illinois, the ISAT is now being written by a book company and it is a much easier test than what was previously used.

Dr. Peggy G. Carr's response:
Mary, you're right that not all states administer the same test. States are permitted to choose the assessments that they administer under No Child Left Behind, and therefore the tests taken and the definitions of "proficiency" vary across states. However, because states administer different tests, NAEP plays an important role as an independent measure of states' performance. NAEP is not aligned to any particular state curriculum and enables all participating states to be compared with respect to a common standard for Proficient performance.

Dr. Peggy G. Carr:
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this chat helpful and the report interesting.

National Center for Education Statistics -
U.S. Department of Education