StatChat Live with Dr. Gary W. Phillips

NCES Commissioner: Hi, and welcome to today's StatChat about the NAEP "1999 Trends in Academic Progress."

Let's get right to your questions...

John from Reading, MA asked:
As more states are moving to their own "state education frameworks" and are testing proficiency against those frameworks (which are different like snowflakes), how would you suggest that we could "roll up" this state data for a more detailed skills picture, nationally? Follow-up: Do you see less participation in the NAEP as schools spend more of their testing time on their respective state testing programs?
Dr. Gary Phillips: Thanks for raising this issue. I think there is a common misperception that somehow you can get a national average from combining data from State testing programs. You cannot aggregate state testing results to see a national picture of student achievement. NAEP's role has been to provide a common yardstick for the nation overall to see how well students are doing in various subjects. By having this common national measure, states are free to fashion testing programs that fit their particular needs.

Carol from Nacogdoches, TX asked:
What are our goals in education? That everyone maximize potential, everyone reach a certain level of competence (what is this level and who decides / assesses)... Who is responsible for this learning? Teachers, Parents, Students, Community... Which students are/are not making progress? Why?
Dr. Gary Phillips: This is a good question but does not apply to NCES. NCES does not set education goals. By conducting NAEP, NCES simply measures student achievement. NAEP has played a prominent role in documenting our nation's progress toward the national education goals that were formalized by the National Goals Panel in the 1990s. NAGB, the NAEP policy board, approaches the expectations we have about student performance in terms of Basic, Proficient, and Advanced performance standards.

Dr. Georgia Williams-Scaife from Arlington, Virginia asked:
As a result of our students' poor showing on the TIMSS, e.g., blame was placed on the fact that the curriculum in U.S. schools is "an inch deep and a mile long". Do the NAEP results imply the same kind of problem? If not, what appears to be the difference? If so what new practices, i.e. testing and/or curriculum, have surfaced that appear to be turning this around?
Dr. Gary Phillips: Thanks. This was a great question! NAEP has not undertaken a major study of curriculum materials used in U.S. classrooms, which was the case in TIMSS 1995. In fact, TIMSS included a number of components: assessments, questionnaires, a video study of 8th grade mathematics teaching, case studies of educational policy issues, and a curriculum study. The statement about "an inch deep and a mile wide" was not a statement from the TIMSS study, but was made by an independent researcher in his study. The comment about the depth and breadth of the curriculum in U.S. schools was made in regard to that specific curriculum study, and did not include any reference to the depth and breadth of curricula in other nations. It is not clear whether, across nations, there is a relationship between the dimensions of curriculum materials utilized in a nation and the achievement of students in mathematics and science. Any observed relationship between curriculum and achievement within a country may differ when looking across countries. More study of all available data on curriculum in the participating TIMSS nations is necessary before any definitive conclusions can be offered.

John from Cincinnati, Ohio asked:
Have any correlation studies been done between school attendance (or absence) and results on NAEP? What do those results show? Have any correlation studies been done which show the relationship between report card marks and NAEP results? Have any correlation studies been done which show the relationship between high-stakes tests and NAEP results?
Dr. Gary Phillips: Thanks for writing in. I really like technical questions. Some preliminary research, still at the basic research and development stage, has been conducted looking at the correlation between school attendance and performance on NAEP; a negative correlation has been found between them. Studies have not been done looking at the relationship between report card marks and NAEP performance. However, data collected in recent NCES high school transcript studies of high school graduates enable researchers to look at relationships among factors like NAEP performance on various subjects, school attendance, course-taking, and grade-point averages. Where state test frameworks are similar to NAEP frameworks, some consistency of test results has been found.

Ethan from Monterey, CA asked:
What impact is NAEP expected to have on the development of statewide testing systems?
Dr. Gary Phillips: NAEP has established its place at the national level; its role is to complement and supplement state testing programs. NAEP's outstanding reputation for reliability and validity stems from its leadership in developing large-scale assessment methodology and its psychometric sophistication. NAEP strives to continue fulfilling a leadership role in this area....Thanks for the question....Gary

Nadia from Arlington, VA asked:
How do these long-term trend findings in math and science relate to what is currently being taught and tested in our schools? Because the several national standards proposed in the past decade (e.g., by NCTM and by AAAS) apparently have affected the way math and science are taught in this country, are the NAEP assessments in math from 1978 (linked back to 1973) and in science from 1977 (linked back to 1969) still relevant?
Dr. Gary Phillips: This is a good question and it was brought up at the press conference. I see the long-term trend tests as measuring the core, basic knowledge and skills we expect students to know in math and science. These core objectives have really not changed over the past 30 years. While new national standards in math and science have been articulated since NAEP originated its trend studies in 1969, the same basic content in these core subjects is covered in classroom instruction and assessed in NAEP. Earliest NAEP results are still relevant but do not provide the same comprehensive coverage to key subjects that main NAEP does....Thanks....Gary

Sandra Palmer from Gwinnett County, Georgia asked:
What about the Fine Arts?
Dr. Gary Phillips: The NAEP Arts Assessment is scheduled again in 2007. A process report analyzing lessons learned in the 1997 NAEP arts field test is in preparation. It will be a web-based report and will include released items from the 1997 assessment.

Gaylynn from Bismarck, North Dakota asked:
Is there a specific date set for when the 2000 State NAEP Mathematics and Science results will be released? If so, what is the date?
Dr. Gary Phillips: No specific date has been set for release of the 2000 State NAEP math and science results. But, math results are expected to be released around April-May 2001 and science results are expected around August-September 2001.

Marjolaine from Sierra Vista, Arizona asked:
The international assessments of math and science (e.g., the Third International Mathematics and Science Study), have been administered over several years. Does the performance of US students in the international assessments mirror their performance on the NAEP long-term-trend math and science tests? If not, what might explain the differences?
Dr. Gary Phillips: Currently, there is no clear indication of whether U.S. students in TIMSS mirror students on the NAEP long-term trend math and science tests. However, in December, NCES will release the next TIMSS report which may shed some light on the comparative performance of US students over time in the international assessments and the performance of students in math and science in the NAEP long-term trend assessments. Be cautioned that, even then, significant differences between the two assessment programs hamper precise comparisons.

John from Newark, Delaware asked:
What implications do the trends in reading results have in terms of current needs in reading instruction?
Dr. Gary Phillips: NCES simply measures student achievement in the NAEP reading assessment. Analyzing and interpreting the implications of reading results in terms of current needs in reading instruction is more properly the job of experts in the field of reading instruction. NAEP is a valuable source of information for that analysis....Thanks....Gary

linda from oklahoma city, ok asked:
How might I receive information concerning the topic if I am not able to participate in the live StatChat?
Dr. Gary Phillips: Call Peggy Carr at (202) 502 7321 or visit NAEP's website at

Marion from Melbourne, Australia asked:
Have there been any changes in age-grade relationships between 1969 and 1999 that could be related to the relative performance of black and white students of the same age over time?
Dr. Gary Phillips: Thanks for the question...Yes, there have been a few changes in the modal grade. We are looking into this to see what effect it has....Gary

Dean from Greenbelt, Maryland asked:
Hello Dr. Phillips: Can you describe to me the application of Metadata standards to NCES statistical analyses?
Dr. Gary Phillips: What do you mean by metadata? Are you referring to meta-analysis?...Gary

Haiyan from Cambridge, MA asked:
Can you tell us more about test instruments in these many years and how reliable they are in measuring the same level of skill and competency? Thanks.
Dr. Gary Phillips: I am not sure what you are getting at in this question....Can you clarify?....Gary

Lyle from Chapel Hill, NC asked:
Will you remind us about how similar the assessment questions are from one assessment to the next, and whether questions posed for long-term trends represent early or current educational standards?
Dr. Gary Phillips: The questions are identical from year to year. We have been asking the same questions for the past 30 years.

Jill from Oakland asked:
Can you please translate the score gaps between racial subgroups into academic years? For example, do the score gaps between blacks and whites indicate that blacks are two years behind whites in reading/math/science, etc.?
Dr. Gary Phillips: We do not have definitive data on this but as a rule of thumb a 10-12 point difference on the NAEP long term trend scale is equal to "about" one year...Gary

Cynthia from Austin Texas asked:
Are we going to continue to raise the standards of performance of students, in particular in mathematics and science? If we are, how will that be assessed?
Dr. Gary Phillips: The standards on NAEP are set by the National Assessment Governing Board. There are no immediate plans to revise these standards.

Kristi from West Sacramento, CA asked:
California's student population is far more diverse than anywhere else in the nation. What impact might that have on how we interpret the data in your report?
Dr. Gary Phillips: NCES provides the data, the software and the training to analyze the NAEP data from California.

Pat from Hopewell, NJ asked:
I could use some clarification about the correlations (the response to John's earlier question). Are those correlations done at the student, school, or state level?
Dr. Gary Phillips: Correlations are done at the student level....Gary

Ray Fenton from Anchorage, AK asked:
Gary, when we take students' time to take any test, we like to give parents, teachers, and students the results. Do you see a time when NAEP results will be available for individuals, to be used by parents, teachers, and schools?
Dr. Gary Phillips: Currently the legislation prohibits the release of such data. The National Assessment Governing Board is exploring this and may be making recommendations for reauthorization in the future...Thanks for your questions. I can tell that you know the NAEP project very well...Gary

Paul from Crofton Maryland asked:
In your opinion how well does NAEP control for external factors such as SES and dissimilar school environments?
Dr. Gary Phillips: NAEP does not control for these factors in our official statistical reports. However, we do provide the data and the software for secondary researchers to do these analyses. Call Alex Sedlacek at (202) 505 7446 for more details...Thanks

Nada from Arlington, VA asked:
An earlier writer asked if this assessment represented current standards or those of decades ago -- are the measures reported here in line with today's standards?
Dr. Gary Phillips: They are still current because they are core competencies we expect students to know....Thanks...Gary

Stephen from Austin, TX asked:
What is the most surprising trend you have gleaned from your NAEP analysis?
Dr. Gary Phillips: The fact that the majority/minority gap was substantially reduced during the 1980's but now appears to be widening. This is a hard trend to explain....Gary

Kristin from Pittsburgh, PA asked:
If educational standards have changed in the past 30 years, but the long-term Trend NAEP questions have remained the same, how informative are the test results for improving education?
Dr. Gary Phillips: Yes, educational standards have changed over 30 years, but the core knowledge expected of students has not. That is why we have two types of NAEP assessment: one for the long haul and one related to current standards....Gary

Gary from Cambridge, MA asked:
Does NAEP have any explanation of why the racial gap closed until the mid-80s or early 90s and then either froze or actually began to widen again in several of the measures? Has any focused research been launched to explain this reversal?
Dr. Gary Phillips: I do not really have a good explanation. We are looking into possible changes in demographics such as the increase in LEP students and other factors.....Gary

Richard from Portland, OR asked:
Do the newest trend data allow any conclusions about differences between students in Title I schools and others? Do you see any implications in the data for the debate on Title I accountability systems?
Dr. Gary Phillips: We do not have Title I data in the long term trend assessments. However, we do have those data in our newer surveys which will be released next year...Thanks...Gary

Catherine from Atlanta, GA asked:
Would you clarify the differences between the NAEP assessment used to gather trend data (in which the questions remain constant) and the assessment from which state results are made available in the different subject areas (in which questions are modified from year to year)?
Dr. Gary Phillips: Great question...The newer NAEP assessments were developed by the National Assessment Governing Board in the 1990's and are more focused on current reform efforts. It is on the newer assessments that we have the State-by-State data and the achievement levels.....Thanks for the question....Gary

Steve from Bangor, ME asked:
How are schools selected to participate in NAEP? If local districts have the choice to be included or not included, how does this influence representativeness?
Dr. Gary Phillips: Thanks for the question...Schools are selected randomly for the assessment (and so are students). All participation in NAEP is voluntary....Gary

Kristi from West Sacramento, CA asked:
Did the racial gap appear to be stronger in public or non-public schools?
Dr. Gary Phillips: Thanks for the question...We have not done the analyses for private schools yet. An analysis is planned to be made available on the web at a later date....Gary

Mike from Quantico, Virginia asked:
Dr. Phillips, what do you see as the greatest misinterpretation or misuse of the NAEP results?
Dr. Gary Phillips: Thanks for this question....People often do not realize that NAEP is a sample survey with a margin of error. So they interpret increases in scores without taking into account the standard error...Gary

George from Homestead, PA asked:
How many states participate in NAEP?
Dr. Gary Phillips: Each year between 39 and 41 States tend to participate...Thanks for the question....Gary

Bill from Harrisburg PA asked:
The long-term trend data is derived from questions which have remained the same for 30 years. The Report Card data is based on questions which change from year to year. When these reports show different trends, how do you reconcile the difference?
Dr. Gary Phillips: Thanks for the question.... Actually we find that they tend to show similar patterns...Gary

Lyle from Chapel Hill, NC asked:
The questions have remained the same for 30 years, but the curricula have been updated to meet new standards. By asking questions that may have become "obsolete," might NAEP trend results underestimate achievement gains?
Dr. Gary Phillips: Hello Lyle...I do not agree. I think the objectives on the long-term trend represent core knowledge that students should have and are still relevant today...See you later....Gary

Leslie from Rockville, MD asked:
Are there plans to discontinue the long-term trend assessment in the future? Can the same information be obtained from short-term trend calculations using the cross-sectional samples?
Dr. Gary Phillips: Thanks for the question. We plan to continue doing the long-term trend assessment. There really is no good technical way to obtain this type of information from the newer assessment. However, we are exploring all of the alternatives....Thanks...Gary

Paul from Crofton, Maryland asked:
Gary, are there any states that are consistently underrepresented? If so, do you anticipate this having any significant effect on overall NAEP results?
Dr. Gary Phillips: Hello Paul...Our national sample represents all States. Some States (8-10) do not participate in our State assessments....Gary

Karrie from Illinois asked:
Can you highlight some of the trends that have resulted from 30 years of testing?
Dr. Gary Phillips: Math performance is up over the past 30 years at all ages (9, 13 and 17). ...That's all the time we have today. Thank you for all of your excellent questions. I'm sorry, I couldn't get to all of them. A transcript of this chat will be available tomorrow morning (8/25). For further information please visit the NAEP website at ...Gary
