

TRANSCRIPT: The Nation's Report Card: Science 2009

Dr. Peggy G. Carr

Hello, this is Dr. Peggy Carr, Associate Commissioner of the National Center for Education Statistics (NCES). Welcome to Ask NAEP, an online Q&A session about the findings for the NAEP 2009 Science report. I hope you've had time to look at the results on the Nation's Report Card website, and that I can answer any questions you may have about them. I have a team of NAEP specialists here with me to make sure your questions are addressed thoroughly, so there may be a slight delay between the time you submit your question and the time the answer appears.

We are usually able to post answers for all questions, even after the Ask NAEP session ends at 3:00. If you cannot stay online for the full time, a complete Ask NAEP transcript will be available online after our discussion. I'm interested to hear which results you want to talk about.

Winifred from Atlanta, Georgia asked:
How is grade 8 performance this time? Do middle grades still look like a weak link in the K-12 educational system, at least in some subjects?

Dr. Peggy G. Carr's response:
Hi, Winifred, and thanks for your interest in grade 8. The NAEP science assessment results are scaled separately at each grade, so direct comparisons among grades 4, 8, and 12 are somewhat limited. And since this is the first administration of a new science assessment, we are not able to determine whether one grade is making progress in science while another is not.

Christopher asked:
What were the average scores for Black students in 4th grade? Thank you

Dr. Peggy G. Carr's response:
The national average science score for Black students in 4th grade is 127.

Stefan asked:
How can I get a printed or electronic copy of the National Report Card?

Dr. Peggy G. Carr's response:
Please go to Thank you.

Sally Quintana from Riverside CA asked:
I have been a teacher in upper elementary and middle school for 17 years. What I would like to know is how policy makers are chosen to enforce rules and regulations for us? Have these individuals ever been in the classroom? If not, how could they possibly know the real educational issues? I don't understand why people involved in the decision making aren't looking at the REAL problem. Students who do not have parental support (due to language barriers, socio-economic status, or other serious issues) are not successful in school. That number of students with SERIOUS family issues is growing at an alarming rate! We cannot compare our educational system with that of other countries because of these factors. Our government should hold parents accountable, not just teachers! How about presenting a passing report card to receive governmental financial support?

Dr. Peggy G. Carr's response:
Thanks for your questions, Sally; unfortunately, these are beyond the scope of the Science Report Card. Congratulations on 17 years of teaching; we appreciate all your hard work and effort as an educator.

Isaac from East Lansing, Michigan asked:
How do the results from 2009 compare to those from previous assessment years?

Dr. Peggy G. Carr's response:
The 2009 assessment is based on a completely new framework, which differs from the one used in the previous assessment years of 1996, 2000, and 2005. For example, the new framework was informed by advances in cognitive research, including how students learn increasingly complex material over time, and its use of technological design allowed students to apply their science knowledge and skills to solve problems in a real-world context. Because of such changes, the results from 2009 cannot be compared to those of previous assessment years.

Jaclyn from Dry Ridge, Kentucky asked:
Why was I selected?

Dr. Peggy G. Carr's response:
Hi Jaclyn,

Thank you for participating, as it takes the cooperation of everyone who has been selected to maximize the quality of the sample. You should know that your answers represent thousands of students across America who are just like you.

In order to provide an accurate picture of 4th, 8th, and 12th graders throughout the United States, the sample of students is very carefully chosen to match the characteristics of the population as a whole.

To achieve a representative sample of students, the process begins by identifying a sample of schools with student populations that reflect the varying demographics of a specific jurisdiction, be it the nation, a state, or a district. Within each selected school, students are chosen completely at random to participate.

Jim from Nebraska asked:
Are the scores between male and female significantly different for 4th, 8th, and 12th grade?

Dr. Peggy G. Carr's response:
Jim, thanks for your interest in the NAEP science assessment results. At the eighth and twelfth grades, males scored higher than females. Males also scored higher than females in overall science at 4th grade, although females scored higher on the life science subscale.

Elisabeth from Rhode Island asked:
Can you compare Rhode Island students' scores to students in other states and in other New England states?

Dr. Peggy G. Carr's response:
There are several ways to do this. One of the easiest is to look at Figure 17 on page 17 of the 2009 Science Report Card, where the average science scores for 4th graders are listed for the nation and each state. For example, Rhode Island's average score was 150, Massachusetts's was 160, and New Hampshire's was 163. Similar state results for grade 8 can be found on page 34. We do not have state results at grade 12.

You can also use the NAEP State Comparisons Tool on our web site for these comparisons.

Make the selections that appear on the screen (grade, subject, etc.), then click "Next Steps." A table of states and scores will appear. Click on the label of the column you are interested in to order the scores from top to bottom (e.g., 2009 Scale Score). Next click on Rhode Island. This will generate symbols for the states that scored above Rhode Island (>), not statistically different from R.I. (=), and below R.I. (<).

Elliott from Iowa asked:
Is there any data breakdown between those students who are learning science w/ a teacher actually in the classroom w/ them vs. those who are learning remotely/virtually?

Dr. Peggy G. Carr's response:
Hello Elliott,

Unfortunately, there are no data on students' participation in online or virtual schools compared with regular classroom instruction. However, there is some information on online coursetaking in high school in the upcoming High School Transcript Study. That report is scheduled to be released later this spring.

James from Woodland asked:
Are the scores between male and female significantly different for 4th, 8th, and 12th grade?

Dr. Peggy G. Carr's response:
Please see answer to Jim's question above.

John asked:
As was mentioned, due to test changes we cannot compare raw scores of the 2009 results to any previous year. Is there any validity in comparing percentages of students at the various achievement levels from 2005/2009?

Dr. Peggy G. Carr's response:
We discourage making these comparisons. A new panel of science experts applied the standard policy definitions of the achievement levels, along with the new preliminary achievement-level descriptions included in the new science framework, to the entirely new test questions based on that framework. The result was a new set of science performance standards, embodied in different cut points on the new scale and different descriptions of the science knowledge and practices expected at each achievement level.

While the policy definition of "proficient" remains "solid academic performance," reflecting "competency over challenging subject matter," this stable element alone does not provide much in the way of comparability. It's no more valid to compare the percentage of students at or above the proficient level in science in 2005 with that in 2009 than it is to compare the percentage at or above proficient in 2009 science with those in reading, mathematics, or some other subject.

Anita asked:
Do you consider the data on extracurricular science activities solid? Does the NAEP data address if the extracurricular science exposure was particularly effective for one demographic over another or true for all students?

Dr. Peggy G. Carr's response:
This is a great question, but NAEP does not collect data on students' participation in extracurricular science activities.

Tom from NAS asked:
Where do I find the actual numbers that Alan referred to for Below Basic?

Dr. Peggy G. Carr's response:
Tom, the percentages of students at each of the achievement levels, as well as Below Basic, are displayed in the Achievement Level charts on our website at the following link.

Grade 4:

Grade 8:

Grade 12:

You can also explore these data in more depth through our online data analysis tool, the NAEP Data Explorer.

Jeanne asked:
How many other countries will be participating in the test that is scheduled toward the end of this school year?

Dr. Peggy G. Carr's response:

In 2011, 63 countries will participate in the Trends in International Mathematics and Science Study (TIMSS); 50 countries (not including the United States) will participate in the TIMSS assessments at grade 8 only.

Morgan asked:
Is the achievement gap on NAEP 2009 Science greater than in other subject areas (specifically reading, mathematics, and writing)?

Dr. Peggy G. Carr's response:
Morgan, that is an interesting question. Since each subject is developed independently of the other subjects, it is not technically possible to ascertain whether an achievement gap in one subject is greater than the gap in another subject. For example, the content in mathematics may be more or less difficult than the content in reading.

Gina asked:
Can you tell us a bit more about any attitudinal or interest in science data that was collected in this assessment, and in particular, how attitudes and levels of interest change throughout the grade levels? Thank you!

Dr. Peggy G. Carr's response:
Hi Gina, thank you for your question. There are several questions on the NAEP background survey regarding students' affective reactions to science. Percentages and NAEP results are available for these types of variables for grades 4, 8, and 12 in the NAEP Data Explorer (NDE). Links to the NDE and support resources are given in an earlier answer.

Bill from Pennsylvania asked:
Was any data collected showing if the students that scored above basic had taken any technology/engineering classes?

Dr. Peggy G. Carr's response:
Hi Bill,

Thanks for your wonderful question. Unfortunately, we did not ask students whether they had taken any classes that covered technology and engineering. We did, however, ask teachers how much time they spent teaching technology/engineering in the classroom; you can find more information on this in the NAEP Data Explorer. One of the reasons NCES commissioned the High School Transcript Study (HSTS) was to dig into more specific details about students' curriculum and the influence it could have on NAEP performance. That study will be coming out this spring and will contain information addressing your question.

Eric from Boston, MA asked:
What role should the private industry play in helping to stress the importance of science education for the nation's youth?

Dr. Peggy G. Carr's response:
Eric, thanks for your interest in the NAEP science results. The NAEP data are valuable as part of an ongoing dialogue about student achievement, but we unfortunately cannot answer your question directly in terms of recommending specific actions on the part of private industry for improving science achievement among the nation's students.

The National Assessment Governing Board recognizes the important contributions that private industry can make toward improving student achievement, particularly in conjunction with efforts to ensure that the 12th-grade NAEP measures student preparedness for education and training after high school. You may be interested in learning more about the Business Policy Task Force that the Governing Board has established; information is available on the Governing Board's website.

Sarah from Olympia, WA asked:
Can we draw any conclusions about the amount of instruction students received? More broadly, what do states with great success do that other states could emulate?

Dr. Peggy G. Carr's response:

The NAEP Data Explorer (NDE) provides a wealth of information on the relationship between student performance and contextual questions, such as the amount of instruction students receive in science. Further, the NDE enables examination by state. Most relevant to amount of instruction are questions asked of students' science teachers at both 4th and 8th grades, which ask them to indicate the amount of time spent on science instruction in a typical week.

In the NDE, first select "Main NAEP," then select the subject (science) and grade (4th or 8th). Select a sample (e.g., all states) and move to the "Select Variables" tab. The easiest way to find the data of interest is via a keyword search (right side of the screen). Type in "science instruction" and select the question, and results by state are presented.

Note that you may want to look at different statistics (e.g., percentages and average scale scores). You can choose the statistics of interest under the "Edit Reports" tab.

Bob Hillier from Honolulu, HI asked:
What were the factors that delayed the release until 22 months after the assessment? Does this delay impact use of data to improve classroom instruction? Note: if others have asked similar questions, please ignore my questions.

Dr. Peggy G. Carr's response:
Bob, one reason for the delay in reporting the 2009 science results is that higher priority was given to reporting the 2009 reading and mathematics scores for the nation and the states. Those reports were due for release six months after the completion of data collection, and they were followed a month later by results for the Trial Urban Districts. Only then could our staff begin work on the 2009 science report card. In addition, because a new science framework was implemented this year, the cut scores and descriptions for the National Assessment Governing Board (NAGB) achievement levels had to be developed, and this work could not begin until the new science scale was estimated from the data.

NAEP is prohibited by law from influencing classroom instruction. However, NCES and NAGB hope that the 2009 science results will help local and state education leaders to make informed policy choices.

Emma Cody-Mitchell from Knoxville Tennessee asked:
You stated that the 2009 assessment is based on a completely new framework that reflects advances in the use of technology. Could you elaborate on the role of computer technology in the acquisition of science knowledge and skills for the grades assessed?

Dr. Peggy G. Carr's response:
The 2009 assessment is based on a completely new framework. Among the ways it differs from the framework used in previous assessments: it reflects advances in cognitive research on how students learn increasingly complex material over time, and its use of technological design allowed students to apply their science knowledge and skills to solve problems in a real-world context.

The 2009 assessment contained Interactive Computer Task (ICT) and Hands-On Task (HOT) components at grades 4, 8, and 12. The ICT component used advances in computer technology to test how students perform science experiments. The HOT component presented students with manipulatives that they used to perform given science experiments and answer questions. The results for these two components will be released at a later date.

Bob from Brookeville, MD asked:
Nice job NCES on the report. I may not be remembering this correctly, but wasn't the student performance on the hands-on-tasks included in the 2005 Science results; that is, part of the Science scale?.. and if so, are we getting as complete a picture of what students know and can do in science if the 2009 results are only based on paper and pencil questions--and don't include the interactive computer tasks or the other science experiments that were part of the 2009 Science? thanks. Bob

Dr. Peggy G. Carr's response:

First, thank you for the kind comments on the report. Your recollection is correct: the hands-on tasks were included as part of the science composite score in 2005. However, the 2009 science assessment is based on a new framework and is not directly comparable to 2005. According to the 2009 science framework, the NAEP science assessment should include various item types, including hands-on performance tasks and interactive computer tasks. The hands-on tasks and interactive computer tasks were administered to two separate samples of students and cannot be part of the composite science score.

The hands-on tasks and interactive computer tasks were designed to measure students' abilities to combine their understanding of science content with the investigative skills reflected in the science practices specified in the 2009 science framework. Those results will be released later in 2011.

Of course, the future may bode well. One could imagine future NAEP assessments with computer-based delivery systems whose highly interactive tasks enable a richer evaluation of students' science performance.

Brian from Fredericksburg, VA asked:
Page 5 states:

The hands-on and interactive computer tasks in the 2009 science assessment were administered as part of a NAEP research study. Results for these tasks did not contribute to the results in this report and will be reported separately.

When are these results expected? Thank you!

Dr. Peggy G. Carr's response:
Hi Brian, please see the response to Bob's question above.

Richard from Villa Hills, KY asked:
How much external validity analysis was done with the new science grading scales? I am looking at Kentucky's eighth grade proficiency rate on NAEP and the percentage of the same eighth grade cohort that scored at or above the ACT's EXPLORE Benchmark Scores for science. NAEP shows 34% proficient, but the EXPLORE shows only 10.48% proficient. This is quite different from the comparisons for NAEP and EXPLORE math and reading, where agreement has been within a few percentage points. See those earlier results here:

Dr. Peggy G. Carr's response:
A good deal of attention was given to the validity of our NAEP test questions. NAEP used item tryouts, cognitive labs, and a good-sized national field test to examine how well our test questions measure science according to the new framework.

The percentages of students who score at or above the cut points on different tests need not agree. NCES has studied the relationship between NAEP's standards for proficiency in reading and mathematics and those of most states, reported in publication NCES 2010-456, "Mapping State Proficiency Standards onto NAEP Scales: 2005-2007." The conclusion of that study was that NAEP generally sets its standards for reading and mathematics higher than those of nearly all states.

While we have not done a comparable study of science assessments, my understanding of the new science achievement level standards is that they are quite stringent.

Since NAEP and the states disagree on standards for performance in reading and mathematics, I see no reason to expect agreement on such standards in science.

Julie from Chicago, IL asked:
During this morning's webinar, Dr. Friedman shared two findings: 4th graders who did hands-on science weekly scored 7 points higher than their peers, and 8th and 12th graders who participated in extracurricular science activities generally outperformed their peers. I don't see this information in the report ... can you direct me to a source for this?

Dr. Peggy G. Carr's response:
Thank you so much for attending the webcast! This information is available on the NAEP Data Explorer (NDE), which can be found here:

In this data tool, you will be able to see how NAEP scores relate to many variables that are not included in the report. On the left side of the screen, there are links to two helpful resources for using the NDE: a video tutorial and a Quick Reference Guide. Among these variables is a question regarding hands-on science activities.

However, there are no questions in the NDE regarding extracurricular science activities, so there is no way to explore the relationship between outside science activities and NAEP performance. In his comments, Dr. Friedman was discussing how learning opportunities via extracurricular activities correlate with NAEP performance in other subjects.

Anahit from Harvard University asked:
Are there any results for the student science background questions? Is any further analysis done to compare background questions with content questions?

Dr. Peggy G. Carr's response:

We have results for a large number of background questions asked of students, teachers, and school administrators.

These data are available in the NAEP Data Explorer for 1996, 2000, 2005, and 2009. The 1996, 2000, and 2005 assessments differed in content from the 2009 assessment, so content questions and performance results are not comparable.

However, frequencies for background variables are comparable to those for identical questions given in previous assessments (such as course taking). Those data are directly available through the NDE; they appear under the respective framework used in the assessment in which they were collected.

Tom from NAS asked:
Why was the decision made to not test enough kids to get 12th-grade state scores?

Dr. Peggy G. Carr's response:
Hi, Tom. You may be aware that since 1990 the NAEP program has reported state-level scores at grades 4 and 8 only. For the first time, however, in 2009 we did report state-level results at grade 12 on a trial basis for a small group of states in reading and mathematics only. Moving forward, the National Assessment Governing Board has determined that we should also report science at the state-level for 12th-graders. Therefore, depending upon the availability of funds, we should be able to report future 12th-grade state results in science as well.

Stephanie from Texas asked:
Can you put these data in context? How do students perform in science compared to other academic areas? Are the (seemingly low) levels of proficiency alarming?

Dr. Peggy G. Carr's response:
Hi Stephanie,

Thank you for your question; it is a very interesting one. Since each subject is developed independently of the others, it is not appropriate to compare scores or percentages across different subjects. One comparison that could be made is to examine a state's performance relative to other states, and then observe that state's relative standing in different subject areas.

Regarding your questions about performance, a number of policymaking and science experts have commented on today's release, including the press release panelists. The U.S. Department of Education also published a statement that speaks to this issue; it is available on our website.

Dr. Peggy G. Carr:
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this chat helpful and the report interesting.

National Center for Education Statistics, U.S. Department of Education