The Nation's Report Card — Results from the 2005 NAEP Trial Urban District Assessments in Reading and Mathematics
Hello, and welcome to today's StatChat on the NAEP 2005 Trial Urban District Assessment (TUDA) reading and mathematics reports. I hope you've had time to examine the results. The TUDA provides a unique look at the performance of large urban districts across the country. I am interested in hearing what you want to talk about, so let's get right to your questions...
Ann Sebald from Greeley, Colorado asked:
In reviewing your website, I did not see that NCES disaggregates math and reading results by disability area. Does NCES track this information? I am specifically interested in how students with low-incidence disabilities (i.e., students who are blind/visually impaired, deaf/hard of hearing, deafblind, or have severe disabilities) performed on these assessments.
Ann M. Sebald, Ed.D.
Dr. Peggy G. Carr:
Ann, NCES/NAEP does indeed have results for students with disabilities overall. These data are in the Report Cards (p. 36, Table A-9).
Results by type or severity of disability are not available in the reports or on the public website through our NAEP Data Explorer. It may be possible for a secondary researcher to examine these data using the raw data files. However, the sample sizes are probably so small when broken down that way, especially by district, that it is unlikely the results would be reliable.
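To illustrate why small subgroup samples give unreliable estimates, here is a minimal sketch using assumed values (the score standard deviation and sample sizes below are illustrative, not actual NAEP figures, and NAEP's published standard errors come from more elaborate variance-estimation procedures). The standard error of a mean grows as the sample shrinks, so disaggregating a single district by low-incidence disability quickly exhausts statistical precision.

```python
import math

def standard_error(sd, n):
    """Standard error of a sample mean: sd / sqrt(n)."""
    return sd / math.sqrt(n)

sd = 35.0  # assumed score standard deviation
for n in (2000, 100, 10):
    # The estimate's uncertainty roughly quadruples each time
    # the sample shrinks by a factor of ~16.
    print(n, round(standard_error(sd, n), 1))
# → 2000 0.8
# → 100 3.5
# → 10 11.1
```

With only a handful of students in a cell, the standard error rivals the size of the score differences being studied, which is why such breakdowns are unlikely to be reliable.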
Kathy from Andover, MA asked:
I have never heard people address the fact that each time 8th graders are tested in a particular school or district, it's a different group of students. How can you draw conclusions about progress when the students being tested are different students each time?
Dr. Peggy G. Carr:
Kathy, conclusions about educational progress can indeed be drawn from NAEP data. The assessment results are based on statistically equivalent cohort samples in different years. NAEP measures progress by testing this year's eighth graders and comparing their performance with that of the eighth graders assessed previously. So it is a comparison of cohort groups, not a longitudinal measure of a single group.
Tom from New Haven, Conn. asked:
Why aren't more districts included in the sample?
Dr. Peggy G. Carr:
Tom, Congress appropriated funding for the current set of 10 districts that participated in the TUDA. The number of districts could increase in the future, depending upon the availability of funds and at the discretion of Congress.
Stephen from Campbell, California asked:
Although the districts participating in the trial are characterized as urban districts, each district has its own unique context. The districts vary on large-scale factors like state curriculum and assessment policy and small-scale factors like the percentage of students who are English language learners. How is NCES accounting for these variations to maximize the benefit of the trial?
Dr. Peggy G. Carr:
Stephen, you are correct in noting that urban districts vary in both large- and small-scale factors that would affect assessment results. The same, of course, can be said of the states in our state-by-state assessments. To address these contextual issues, the NAEP assessment includes student, teacher, and school questionnaires that provide considerable data on background variables. NCES makes these data available to external researchers to investigate how these background variables are related to NAEP scores. The new NAEP Data Explorer (NDE), a data tool available on the NAEP website, makes exploring these data more accessible.
Vicky from Providence, RI asked:
It appears to me that if these trials are to be expanded, we should first see some real significance in doing them. What is the future plan?
Dr. Peggy G. Carr:
Vicky, the significance of conducting TUDA can be found in the findings themselves. In addition, the trial demonstrated that such assessments can be operationalized and integrated seamlessly into the NAEP state assessment program. Being able to disaggregate student group data so that the progress of all students can be tracked is critically important to the nation. The National Assessment Governing Board (NAGB) is currently considering dropping the "T" from "TUDA", meaning it would no longer be a trial program.
Marcia from Washington, DC asked:
What do these results mean for NCLB?
Dr. Peggy G. Carr:
Marcia, policymakers are in a better position to answer that question. But, from an analytic point of view, these data provide a wealth of information about how the different student groups identified in NCLB are doing in our most challenging urban contexts. The data show, for example, that in many instances these urban districts are making more progress than their states or public schools nationally.
Kimberly from Austin, TX asked:
Talking to my ed types in town... We're wondering what to take from this, when it comes to Austin and Houston. What conclusions do you take from this data?
Dr. Peggy G. Carr:
Stephen from Santa Cruz, CA asked:
The initial findings focus on common results across the districts. However, the districts involved in the study vary by size, percentage of English language learners, as well as state policy contexts. To what extent is NCES taking into account cross-site differences when interpreting the results?
Dr. Peggy G. Carr:
We only have time for a few more questions. Stephen, good question. See my response above to the similar question from Stephen in Campbell. In short, the TUDA report is descriptive in nature. Our online data tool, however, gives you the power to dig deeper into the data.
Kimberly from Austin, TX asked:
Was there any surprising data in these results?
Dr. Peggy G. Carr:
There are many surprising findings in these results. Perhaps one of the most interesting is the progress being made by some districts that represent larger gains than those seen for their states or for the nation, with some student groups surpassing the performance of their peers nationally. For example, White students in Atlanta, DC, Houston, Charlotte, and Austin performed better than White students in the nation as a whole and in large central cities. Similarly, Black and Hispanic students in New York City and Charlotte performed better than Black and Hispanic students in the nation and in large central cities.
Ellen from Cleveland, Ohio asked:
Can you explain the drop in scores of White students?
Dr. Peggy G. Carr:
Ellen, if you are referring to the 4-point drop in the reading score of Cleveland's eighth-grade White students, that apparent decline was within the margin of error and is not statistically significant. For the nation, however, eighth-grade White students' reading scores declined significantly from both 2002 and 2003. Although NAEP is not designed to make causal statements about score changes, contextual information from the background questionnaires may shed light on them. This information is available in the NAEP Data Explorer (NDE) at http://nces.ed.gov/nationsreportcard/nde.
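The margin-of-error reasoning here can be sketched as a simple two-sample z-test on the difference between cohort means. The scores and standard errors below are illustrative assumptions, not actual NAEP values, and NAEP's published significance tests rest on more elaborate variance estimation; the sketch only shows why a 4-point change can fall inside the margin of error.

```python
import math

def is_significant(score_old, score_new, se_old, se_new, z_crit=1.96):
    """Two-sample z-test on the difference between two cohort means.

    Returns True when the observed difference exceeds the margin of
    error at roughly the 95 percent confidence level.
    """
    diff = score_new - score_old
    # Standard errors of independent estimates combine in quadrature.
    se_diff = math.sqrt(se_old**2 + se_new**2)
    return abs(diff) > z_crit * se_diff

# A 4-point drop with standard errors of about 2 points each:
# the margin of error on the difference is ~5.5 points, so the
# apparent decline is not statistically significant.
print(is_significant(250, 246, 2.0, 2.0))  # → False
```

With the same standard errors, a 10-point change would clear the roughly 5.5-point margin of error and register as significant, which is the distinction drawn between the Cleveland and national results above.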
D. Gale from Kansas asked:
Are there any special characteristics in Charlotte that other districts can learn from to help them in the future, especially in reading?
Dr. Peggy G. Carr:
When comparing NAEP results across urban districts, it is important to keep in mind how the districts vary in demographic factors. Charlotte has the lowest percentage of students eligible for free and reduced-price lunch among the TUDA participants. Given that socioeconomic status is related to academic performance, we might expect Charlotte students to perform relatively well on NAEP. However, Charlotte and other urban districts that are performing well do merit special investigation. NCES does not conduct this research, but it makes data available to external researchers to support such studies.
Thanks for all the excellent questions. Unfortunately, I could not get to all of them, but please feel free to contact NAEP staff or me for more information. I hope that you found this session helpful and the reports interesting. Please explore all of the information available in the NAEP Data Explorer and on the release site at www.nationsreportcard.gov.