
TRANSCRIPT: The Nation's Report Card: 2011 NAEP Mathematics and Reading Assessments

Dr. Peggy G. Carr

Hello, this is Dr. Peggy Carr, Associate Commissioner of the National Center for Education Statistics (NCES). Welcome to Ask NAEP, an online Q&A session about the findings for the 2011 NAEP Mathematics and Reading reports. I hope you've had time to look at the results on the Nation's Report Card website, and that I can answer any questions you may have about them. I have a team of NAEP specialists here with me to make sure your questions are addressed thoroughly, so there may be a slight delay between the time you submit your question and the time the answer appears.

We are usually able to post answers for all questions, even after the Ask NAEP session ends at 3:00. If you cannot stay online for the full time, a complete Ask NAEP transcript will be available online after our discussion. I'm interested to hear which results you want to talk about.

Rivki Locker asked:
Will a copy of this Power Point be shared on the NCES website? If so, when and where will it be available?

Dr. Peggy G. Carr's response:
We are glad you enjoyed the presentation at the press release. The PowerPoint from today's live press release is now available at http://nces.ed.gov/whatsnew/commissioner/remarks2011/11_01_2011.asp.


Dan Hardy asked:
When will the 2011 TUDA district results be released?

Dr. Peggy G. Carr's response:
The 2011 TUDA mathematics and reading results will be released in December 2011.


Kwee Lan Teo Yam asked:
When will the Trial Urban District Assessment results be released?

Dr. Peggy G. Carr's response:
The 2011 TUDA mathematics and reading results will be released in December 2011.


Thomas D. Wolsey asked:
While NAEP guides and informs policy decisions on a large scale, what might teachers and professors do with the Nation's Report Card on a practical or classroom level?

Dr. Peggy G. Carr's response:
The Nation's Report Card and the materials available on www.nationsreportcard.gov provide a variety of resources that can be used in the classroom, as well as information about what students in the nation and the reporting jurisdictions know and can do in NAEP subjects. The "Test Yourself" tool has a short interactive quiz with questions from the most recent NAEP assessments that can be used as a warm-up activity for students. The "Submit" button gives the student immediate feedback with the correct answer, an explanation of how that answer can be obtained, and information about national student performance on each question. For teachers, the NAEP Questions Tool provides released questions from more than twenty years of NAEP assessments. Using this tool, teachers can search for exemplar questions by content, item type (i.e., multiple choice or constructed response), or item difficulty. A collection of selected questions can be printed out for students, and the answers, sample student responses, and national summary performance data can be printed for the teacher. In addition, the "Question Detail" view available in this tool provides detailed information about student performance on each question for NAEP jurisdictions in more recent years.


Jeanette Rundquist from New Jersey Star Ledger asked:
How great a concern is it that proficiency is so low? Only 40 percent of 4th graders are proficient in math, and the other percentages are lower?

Dr. Peggy G. Carr's response:
I should point out that 40 percent of fourth graders are performing at or above NAEP Proficient in math, which means that we do have some fourth graders (7%) whose performance exceeds NAEP Proficient. Of the fourth graders who did not reach Proficient, 42% did perform at NAEP Basic, which means that these students have partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade. Indeed, it is the goal that students in this country reach Proficient, which in NAEP means that they are able to demonstrate competency over challenging subject matter. We are encouraged by the significant and steady improvement in 4th grade achievement-level performance on NAEP mathematics. For example, the percentage of students performing at or above Proficient more than tripled, from 13% in 1990 to 40% in 2011.


Carolyn Triano asked:
What should be the role of the community college in the remediation of students who have poor reading comprehension but desire to attend college, where textbooks are written at the 13th-grade level?

Dr. Peggy G. Carr's response:
There is certainly a lot of interest in this, Carolyn, and we are making progress toward being able to provide more data that will inform the question of students' readiness for college-level work. Although we may not be able to provide a direct answer to your question about the role community colleges should play in the remediation of students, we are working with the National Assessment Governing Board to report 12th-grade reading and mathematics results in terms of our high school seniors' preparedness for the workplace or for college-level coursework without remediation. You may be interested in reading about some of the studies that are currently underway to inform our approach to reporting on 12th-grade preparedness. By reporting twelfth graders' performance in this manner, we hope to provide valuable information that both secondary and postsecondary educators can use to help students make successful transitions to careers or college.


Candace Cortiella asked:
Was the percentage of participation for students with disabilities high enough to make the results generalizable?

Dr. Peggy G. Carr's response:
NAEP carefully considers the participation of various groups when assessing the credibility of results. When taken as a whole, the participation rate for students with disabilities is sufficient to provide statistically credible estimates. However, the degree to which results are generalizable to specific students with disabilities is limited, as NAEP does not specifically target students with disabilities when sampling.

There are some limitations when comparing across states, as there is variability in the composition of students with disabilities across states. Although there are several different categories of students with disabilities that participate in NAEP, we can only reliably report students with disabilities as a whole.


Albert Mitchell II asked:
What are the charter school results? And how big is the gap between charter schools and regular public schools?

Dr. Peggy G. Carr's response:
Charter school results can be obtained through the NAEP Data Explorer. Specifically, first choose "National Public" under the "Select Jurisdiction" tab. Then, in the "Select Variables" tab, choose "Major Reporting Variables" and then "School Factors"; charter schools will appear fourth in the list. The results will show that charter schools scored slightly lower than non-charter public schools, by 2 to 3 points in each subject/grade combination; however, only one of these differences (grade 4 mathematics) was statistically significant.


Emily Richmond from EWA asked:
Can you explain the sampling process, and what safeguards there are to prevent schools from pre-selecting students to participate in NAEP?

Dr. Peggy G. Carr's response:
Emily, NAEP uses a two-step process to select the students for the assessment. First, we use a complete list of schools in each state, for both grades 4 and 8. We select a sample of schools at random from each state, in a way that ensures that the different types of schools from across the state are represented in the sample.

Then each school in the sample submits a list of all students in the selected grade (4 or 8). NAEP contractors check the list for completeness, using historical data about the school, and also checking the internal consistency of the list to see that it seems complete. Only after the safeguards have been reviewed and approved by NAEP staff do we select a random sample of students from among all those in the grade. These are the students who take the assessment.

Finally, just before the assessment, the NAEP staff who administer the assessment in the school review the original list of students with school staff to ensure that no students are missing (in particular, students who may have enrolled since the student list was originally submitted). As needed, a supplemental sample of students is added to represent these newly listed students.

In this way we can ensure that the samples of students assessed in NAEP are representative of the student population within each state, and the nation.
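
To make the two-step process concrete, here is a minimal sketch in Python. Every name, stratum, and sample size below is a hypothetical illustration, not NAEP's operational selection code, which also involves sampling weights and more elaborate stratification.

```python
import random

# Stage 1: sample schools from a complete state list, stratified so that
# different types of schools are represented (illustrative strata only).
def sample_schools(schools, n_schools, rng):
    by_stratum = {}
    for school in schools:
        by_stratum.setdefault(school["stratum"], []).append(school)
    selected = []
    for members in by_stratum.values():
        # Allocate the school sample proportionally to each stratum's size.
        k = max(1, round(n_schools * len(members) / len(schools)))
        selected.extend(rng.sample(members, min(k, len(members))))
    return selected

# Stage 2: sample students at random from the school's full grade roster,
# after the roster has been checked for completeness.
def sample_students(roster, n_students, rng):
    return rng.sample(roster, min(n_students, len(roster)))

rng = random.Random(2011)  # fixed seed so the illustration is reproducible
schools = [
    {"name": f"School {i}",
     "stratum": "urban" if i % 2 == 0 else "rural",
     "roster": [f"student_{i}_{j}" for j in range(60)]}
    for i in range(100)
]

for school in sample_schools(schools, n_schools=10, rng=rng):
    assessed = sample_students(school["roster"], n_students=25, rng=rng)
    print(f'{school["name"]}: {len(assessed)} students selected')
```

A supplemental draw for newly enrolled students, as described above, would simply repeat stage 2 over the names added to the roster at assessment time.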


Liz Bowie from Baltimore Sun asked:
What factors do you believe may have resulted in the increases over time in Maryland, Massachusetts and D.C.?

Dr. Peggy G. Carr's response:
Thanks for your question, Liz. The increases in Maryland, Massachusetts, and DC are impressive. There are, however, many factors that could contribute to those gains (e.g., changes in curriculum focus, staff development, or instruction). Additionally, states have different demographics and policies that can play important roles in state progress. The NAEP Data Explorer provides a wealth of information, including questions asked of students, teachers, and school administrators and the performance of NAEP students for each of those questions.

One interesting finding that you may have noticed, Liz, is that there is a common component to the gains made by these three jurisdictions: those gains occurred throughout the scale score distribution. In other words, the gains in average scale scores were also observed at percentiles throughout the distribution (the 10th, 25th, 50th, 75th, and 90th percentiles). For example, in grade 4 mathematics in Maryland, the average scale score gain was 14 points, and the gains across the five percentile estimates were between 13 and 16 points.


Julia Janz asked:
Is there any significant change shown in the private school sector?

Dr. Peggy G. Carr's response:
Thanks for your question, Julia.

Private school performance in mathematics has increased at both grades 4 and 8 since 1990 (the first year the assessment was given), but not since 2009. At grade 4, the average score for private school students in 1990 was 224, compared to 247 in 2011. At grade 8, the scores were 271 in 1990 and 276 in 2011.

In reading, average scores for private schools did not change significantly from 1992, the earliest year the assessment was given, or from 2009, at either grade.


Laura Westberg asked:
Were differences in scores examined for types of text or processes?

Dr. Peggy G. Carr's response:
Thank you for your question. NAEP uses literary and informational texts to assess comprehension, and it provides an overall scale score across these two text types. In addition to reporting this overall score, NAEP breaks out results by literary and informational text on separate 0-500 scales. Although not presented in the report card, these subscale results are available through the NAEP Data Explorer. Because the subscales are developed independently, the literary and informational scores cannot be directly compared. However, trends in performance and group differences can be examined for each text type.

The framework for the NAEP reading assessment specifies the use of literary and informational texts in the assessment. The framework further specifies that the assessment questions measure three cognitive targets for these two types of texts: locate and recall, integrate and interpret, and critique and evaluate. However, the assessment was not designed to report subscale scores for these cognitive targets. There are, though, numerous sample questions available for your review and use on the NAEP Questions Tool. These released sample questions demonstrate how we have measured each of the cognitive targets and how students performed on the individual items.


Gina from Columbia, SC asked:
Should states use NAEP to compare themselves? Is it a good apples-to-apples way for states to look at their education systems?

Dr. Peggy G. Carr's response:
Yes, NAEP can be used as a common yardstick for making viable state-to-state comparisons. The NAEP assessments are the same in content and administration procedures across states, so NAEP plays a valuable role in enabling an "apples to apples" comparison for the country. I encourage you to use the NAEP State Comparison tool to examine the state results at http://nces.ed.gov/nationsreportcard/statecomparisons/. The tool allows you to compare and sort state results for students overall, as well as by gender, race/ethnicity, and eligibility for the National School Lunch Program.


Denise Amos asked:
How important is it, in terms of their chances of going to college, that half of Black 8th graders and 40 percent of all 8th graders score below Basic in math? Are their schools not directing these students to be ready for algebra by 8th grade?

Dr. Peggy G. Carr's response:
Good afternoon, Denise, and thanks for your question. NAEP is very interested in this topic. NAGB has commissioned a series of special studies that explore NAEP as an indicator of 12th-grade academic preparedness in reading and mathematics. For information on this commission and associated activities, please see http://www.nagb.org/commission/. But more to your point, NCES is also considering how we could extend these preparedness interpretations down to the 8th grade so that we will be able to make statements about whether 8th graders appear to be on target for reaching college and career preparedness criteria. In response to your second question, Algebra I, traditionally a high school class, is increasingly being taken in middle school. You might be interested in the results of the High School Transcript Study, which further explore the relationship between taking Algebra I before high school and subsequent coursetaking in high school.


Jamaal Shaheed asked:
Can scores reported be interpolated for grades in between 4th and 8th (5th, 6th and 7th)?

Dr. Peggy G. Carr's response:
We're often asked about this, since many people would like to know how many NAEP scale score points can be expected to be gained in each year of schooling. However, we don't advise interpolating scores for grades not assessed by NAEP. Although reading and mathematics are developmental subjects, we know that learning increases at a decreasing rate as students progress from grade to grade, so there might be a larger number of NAEP points representing learning from grade 4 to 5 than from grade 7 to 8.
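
A tiny numerical illustration, with entirely made-up scores (NAEP reports nothing for grades 5-7), shows why a straight-line interpolation can mislead when yearly gains shrink:

```python
# Hypothetical concave growth: each year's gain is smaller than the last.
# These numbers are invented for illustration; they are not NAEP results.
assumed_actual = {4: 240, 5: 254, 6: 264, 7: 270, 8: 273}

g4, g8 = assumed_actual[4], assumed_actual[8]
for grade in (5, 6, 7):
    linear = g4 + (g8 - g4) * (grade - 4) / 4  # straight line from grade 4 to 8
    print(f"grade {grade}: linear estimate {linear:.1f} "
          f"vs assumed actual {assumed_actual[grade]}")
# Because growth is concave, the straight line understates every
# intermediate grade (e.g., 248.2 vs 254 at grade 5).
```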


Jamaal Shaheed asked:
Why aren't state tests more aligned with the standards used by NAEP?

Dr. Peggy G. Carr's response:
Jamaal, thank you for your interest in NAEP. As you probably know, "standards" can refer to either content standards or performance standards; I will try to address both. However, I really can't answer the "why" part of your question, since there are many curricular, policy, and technical reasons why a state might decide upon one set of standards rather than another. What I can say is that only a few states have aligned their content standards or performance standards with NAEP.

Regarding content standards, you are correct that most state tests are not aligned with NAEP. The development of state test standards is a function of a number of factors, such as the current state curriculum, state educational reform initiatives, the composition of the standards committee, other national standards efforts (national teacher organizations, Common Core Standards), and yes, the political environment. NAEP does not strive to influence any state curriculum, but the NAEP frameworks are available if states choose to use them in the development of state standards.

Regarding performance standards, the NAEP Proficient achievement level "represents solid academic performance. Students reaching this level have demonstrated competency over challenging subject matter." This definition of Proficient is aspirational, higher than simply grade-level performance. The definitions of Proficient used by the states are different, and therefore they are not aligned with NAEP's.

More information about how state standards map onto the NAEP scale can be found in a recent NAEP report at http://nces.ed.gov/nationsreportcard/studies/statemapping/.


Jamaal Abdul-Alim from Washington, D.C. asked:
When we talk about America's 4th and 8th graders from 2011, unless I'm mistaken, we're basically talking about the Class of 2019 and the Class of 2015, respectively. With a growing emphasis being placed on college and career readiness, what do the 2011 results portend about college and career readiness for America's students? In other words, can we make any predictions about what college enrollment will look like in those years, or the degree to which students will require remedial education, based on their performance as 4th and 8th graders?

Dr. Peggy G. Carr's response:
Thanks for your question, Jamaal. Presently, NAEP results for 4th and 8th graders in 2011 are just that: a snapshot of what these students know and can do with the respective grade-level materials. However, as I mentioned in my response to Denise, we are working with the Governing Board to be able to report 12th graders' performance in terms of their preparedness for post-secondary pursuits. We are also exploring the possibility of extending these interpretations down to the 8th grade. Until we do so, our ability to make these types of predictions based on NAEP data will be somewhat limited.


John from NYC asked:
What accounts for the widening gap between white and black in DC?

Dr. Peggy G. Carr's response:
Thanks for your question, John.

The gap between White and Black students in DC has actually not been widening, at least on NAEP. For example, in math at grade 4, the difference in scores was 62 points in 1992 and 57 points in 2011. The same pattern is seen for reading as well, and at grade 8.

The gaps, though, are still wide in 2011. Many factors probably contribute to these differences. I note that the percentage of Black students eligible for the free and reduced-price lunch program (an indicator of low income) is much higher than for White students, and income is highly correlated with performance.


Jamaal Shaheed asked:
How can this information be best used at the state level (DOE) as well as at the local community (school/family) level?

Dr. Peggy G. Carr's response:
Great question, Jamaal!

One of the principal uses of NAEP is by states to measure their progress over time and to compare themselves to other states. The state-level NAEP assessments have been conducted since the early 1990s, so a state DOE can see its progress on NAEP over about the last two decades. Further, it is virtually impossible to compare states to each other using the states' own tests, which differ in many respects. But NAEP administers the same assessments in all states, allowing comparisons to be made.

At the local school level the uses of NAEP are a little less direct, because we don't produce school (or individual student) results. But there is still a rich set of resources to be tapped. For example, we place all our released test questions on our website, so teachers can give these questions to their students. Teachers can then see how their students perform on these questions compared to students across the state or in the nation as a whole. Parents, too, can look at the types of questions asked of students at the same grade level as their children.

Educators, parents, and interested citizens at the state or local level might also examine the NAEP assessment frameworks, which outline what our Governing Board thinks students should know and be able to do at the grades we assess.


Jim Kohlmoos asked:
With the dramatic shift in demographics since 1990 (particularly the increase in poverty), aren't this year's modest gains perhaps not quite so modest?

Dr. Peggy G. Carr's response:
You are correct that there have been dramatic shifts in demographics since 1990. To keep my answer brief, I'm going to answer with respect to grade 4 mathematics, using the NAEP Data Explorer for unpublished results and the Report Card (page 12, figures 4 & 5), but similar results can be found for reading at grades 4 and 8 and for mathematics at grade 8.

For demographic shifts to have influenced recent gains on NAEP, you would have to look not at the long-term shifts in demographics but at recent shifts (over the same years for both NAEP and the demographics). The shift in demographics since 2009 has not been large: the White population decreased by 2 percentage points, the Black population decreased by 1 point, and the Hispanic population increased by 1 point. These shifts are much smaller than the long-term shifts, and consequently there is less possibility of a large impact on score gains. Using data on the percentage distribution of the three groups, I computed the weighted average of the three groups' 2011 scores under the 2009 distribution of demographic groups and compared it to the weighted average under the actual 2011 distribution. There was no difference, which means that the modest gains since 2009 were unaffected by the shift in demographics since 2009.
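
For readers who want to replicate the logic of that check, here is a minimal sketch in Python. The group scores and population shares are placeholders, not the actual NAEP estimates; only the 2-, 1-, and 1-percentage-point shifts mirror the ones described above.

```python
# Counterfactual reweighting: hold each group's 2011 score fixed and ask what
# the combined average would be under the 2009 vs. the 2011 demographic mix.
# All scores and shares below are placeholders, not actual NAEP estimates.
scores_2011 = {"White": 249, "Black": 224, "Hispanic": 229}
mix_2009 = {"White": 0.54, "Black": 0.16, "Hispanic": 0.22}
mix_2011 = {"White": 0.52, "Black": 0.15, "Hispanic": 0.23}

def weighted_average(scores, mix):
    # Normalize in case the listed groups do not sum to 100 percent.
    total = sum(mix.values())
    return sum(scores[group] * share / total for group, share in mix.items())

print("2011 scores under 2009 mix:", round(weighted_average(scores_2011, mix_2009), 1))
print("2011 scores under 2011 mix:", round(weighted_average(scores_2011, mix_2011), 1))
# If the two averages are essentially equal, the 2009-2011 demographic shift
# did not move the overall average; any observed gain came from score changes
# within the groups.
```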

Nevertheless, you are also correct that there have been dramatic shifts in demographics over the long term, and these have probably had the effect of reducing the apparent gains of the overall population. Over the two decades, the scores of White, Black, and Hispanic students have each grown more than the average score of the three groups combined, because the lower-scoring Hispanic group has grown while the higher-scoring White group has shrunk as a proportion of the total.


Cathy Clark asked:
Pennsylvania shows no significant change in overall scores. What message should this send to our schools?

Dr. Peggy G. Carr's response:
Indeed, Pennsylvania shows no significant change in its overall average score in either subject at either grade. However, I would not necessarily consider Pennsylvania's flat performance between 2009 and 2011 to be discouraging news. When you look beyond trends in average scores, NAEP results show that fourth and eighth graders in Pennsylvania are outperforming their peers nationally in both mathematics and reading. Furthermore, achievement-level results show an increase in the percentage of fourth-grade students in Pennsylvania reaching NAEP Proficient in reading (demonstrating competency over challenging subject matter). In comparison, the nation as a whole did not see the same type of improvement.


Kristi asked:
What should school board members and governance teams be looking for in these results, and how should NAEP be viewed as states adopt the common core standards and their assessments?

Dr. Peggy G. Carr's response:
Hi Kristi,

Thank you for your question. I view NAEP as an indicator of what students know and can do around the country and as a way to look at students within your state. If school board members and governance teams would like to know how the students in their state compare to students around the country, or to examine how students have improved over the years, that is a powerful use of NAEP. Another excellent use of NAEP results is to dig through the background questions, which provide further information on student and teacher characteristics that might not make it into the report. These questions include items about teacher certification, student extracurricular activities, and other topics that allow you to paint a clearer picture of a particular jurisdiction's students. I am not sure what city/state you are from, but we also assess 21 large cities with the Trial Urban District Assessment (TUDA). This will allow you an even more focused look at how your district compares to other districts around the country. That report comes out later this year.

As for the Common Core standards, we anticipate that their development will not change how NAEP is used. NAEP will continue to be used as an external indicator for the states belonging to the two consortia as well as for those states that chose not to join either. NAEP will closely monitor the development of the Common Core standards and look for other ways it might better serve in a monitoring role for the nation.


Jamaal Shaheed asked:
How is this assessment different from the PISA exam?

Dr. Peggy G. Carr's response:
There are actually many important differences between PISA and NAEP. For example, although both measure mathematics, science, and reading, PISA is designed to measure "literacy" broadly, while NAEP has a stronger, although not direct, link to curriculum frameworks and seeks to measure students' mastery of specific knowledge, skills, and concepts. It is also important to note that PISA assesses 15-year-olds in as many as 60 countries, while NAEP is an assessment of fourth-, eighth-, and twelfth-graders in the US only. There are other differences between the two assessments, reflecting in part their different purposes.

To learn more about the differences in the respective approaches to the assessment of mathematics, science and reading among PISA, TIMSS, and NAEP, you may be interested in the following papers:




Sheila Piippo from Minnesota asked:
How does the nation's performance on the NAEP assessments compare with performance on the international TIMSS assessment?

Dr. Peggy G. Carr's response:
You've raised an important question, Sheila, that has been addressed several times in the research literature. It requires a thoughtful answer, because TIMSS and NAEP are often assessed in different years, with tests that cover similar, but not identical subject matter, and with similar, but not identical performance standards (i.e., "achievement levels" or other reporting categories). To make the results comparable, special studies have been conducted. The most comprehensive and rigorous study to date is currently under way, and is scheduled to be released around the time of the next TIMSS release in December 2012.

The best source of information for now would be to consult publications by Gary Phillips. Dr. Phillips has recently published several reports that involve statistically linking NAEP to TIMSS. The first report compared U.S. state NAEP results to TIMSS in math and science (Chance Favors the Prepared Mind, 2007), the second compared U.S. school district results to TIMSS (Counting on the Future, 2008), and the newest compared states and school districts to TIMSS using letter grades (The Second Derivative: International Benchmarks in Mathematics for U.S. States and School Districts, 2009).

The links to these publications are below:

http://www.air.org/publications/documents/phillips.chance.favors.the.prepared.mind.pdf
http://www.air.org/files/AIR_Counting_on_the_Future.pdf
http://www.air.org/files/International_Benchmarks1.pdf



Jaime Sarrio from Atlanta Journal Constitution asked:
Nationally, there is a push to improve math and science scores. Have we lost ground in Reading/LA as a result? Do those scores suggest there needs to be an added emphasis on Reading/LA?

Dr. Peggy G. Carr's response:
Thank you for your question. You have correctly observed that math scores have increased fairly steadily, while reading performance has been fairly static. The scores are reported on separate scales, so direct comparisons cannot be made between the two. However, from a qualitative perspective, we can say that mathematics performance does appear to be improving at a somewhat more encouraging rate than reading performance.

There are potentially many contextual factors to consider, but one interesting factor relates to the changing demographics in our country. For example, the percentage of Hispanic students has increased markedly over the two decades of NAEP: in 1990, 6 percent of 4th graders were Hispanic, compared with 22 percent in 2011. Many of these students are likely to be English language learners, and the NAEP reading assessment is strictly an assessment of reading in English. Given this, it might be reasonable to expect a decrease in reading scores, but they have instead remained stable. This is, of course, just one of many factors to consider, and it can be examined using the data available in our NAEP Data Explorer.


Dr. Peggy G. Carr:
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this chat helpful and the report interesting.