This article was excerpted from The Nation's Report Card: Trial Urban District Assessment, Reading Highlights 2003, a tabloid-style publication. The sample survey data are from the National Assessment of Educational Progress (NAEP) 2002 and 2003 Trial Urban District Reading Assessments.
Introduction
Since 1969, the National Assessment of Educational Progress (NAEP) has been an ongoing nationally representative indicator of what American students know and can do in major academic subjects. Over the years, NAEP has measured students' achievement in many subjects, including reading, mathematics, science, writing, U.S. history, geography, civics, and the arts. In 2003, NAEP conducted a national and state assessment in reading at grades 4 and 8. NAEP is a project of the National Center for Education Statistics (NCES) within the Institute of Education Sciences (IES) of the U.S. Department of Education, and is overseen by the National Assessment Governing Board (NAGB).

In 2001, after discussion among NCES, NAGB, and the Council of the Great City Schools, Congress appropriated funds for a district-level assessment on a trial basis, similar to the trial for state assessments that began in 1990, and NAGB passed a resolution approving the selection of urban districts for participation in the Trial Urban District Assessment (TUDA), a special project within NAEP. Representatives of the Council of the Great City Schools worked with the staff of NAGB to identify districts for the trial assessment. Districts were selected to permit testing the feasibility of conducting NAEP across a range of characteristics, such as district size, minority concentrations, federal program participation, socioeconomic conditions, and percentages of students with disabilities (SD) and limited-English-proficient (LEP) students.

By undertaking the TUDA, NAEP continues a tradition of extending its service to education, while preserving the rigorous sampling, scoring, and reporting procedures that have characterized prior NAEP assessments at both the national and state levels. In 2002, five urban school districts participated in NAEP's first TUDA in reading and writing.
In 2003, nine urban districts (including the original five) participated in the TUDA in reading and mathematics at grades 4 and 8: Atlanta City, Boston School District, Charlotte-Mecklenburg Schools, City of Chicago School District 299, Cleveland Municipal School District, Houston Independent School District, Los Angeles Unified, New York City Public Schools, and San Diego City Unified. Only public school students were sampled in the TUDA. Results for the District of Columbia public schools, which normally participate in NAEP's state assessments, are also reported.

Average reading scores are reported on a 0–500 scale. Figure A shows the average scores at both grades for the participating districts. The average scores for public school students in the nation and for public school students attending schools located in large central cities are also shown for comparison. "Urban districts" refers to the 10 districts reported in this trial study. Eight of the 10 urban districts consist entirely of schools in cities with a population of 250,000 or more (i.e., large central cities as defined by NCES); two of them (Charlotte and Los Angeles) consist primarily of schools in large central cities, but also have from one-quarter to one-third of their fourth- and eighth-grade students enrolled in surrounding urban fringe or rural areas. All of the data for both districts were used to compare with data from large central cities and the nation.

Average reading scores for fourth-graders in Chicago and for eighth-graders in Atlanta increased between the 2002 and 2003 assessments. Among public school students in the nation, the average reading score at grade 4 did not change significantly from 2002 to 2003, and at grade 8 the average score decreased. In public schools in large central cities, the average score at grade 4 increased from 2002 to 2003.
At both grades 4 and 8, the average score for each participating district was lower than that for the nation, except in Charlotte, where the average scores at grades 4 and 8 were not found to differ significantly from those of the nation.
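Phrases like "not found to differ significantly" reflect statistical tests on the score estimates, not raw comparisons of rounded averages. As an illustration only (NAEP's actual procedure uses jackknife standard errors and unrounded estimates; the means and standard errors below are hypothetical, not NAEP data), a two-sample z-test on two independent estimates works like this:

```python
import math

def significantly_different(mean_a, se_a, mean_b, se_b, critical=1.96):
    """Two-sample z-test on two independent score estimates.

    Illustrative sketch only: NAEP's published tests use jackknife
    standard errors and unrounded estimates. A difference counts as
    significant when |z| meets or exceeds the critical value
    (1.96 for a two-sided test at the .05 level).
    """
    z = (mean_a - mean_b) / math.sqrt(se_a**2 + se_b**2)
    return abs(z) >= critical

# Hypothetical district-vs.-nation comparisons (not actual NAEP data)
print(significantly_different(212.0, 1.4, 216.0, 0.3))  # -> True
print(significantly_different(214.5, 1.5, 216.0, 0.3))  # -> False
```

The second call shows why an apparent 1.5-point gap can still be "not found to differ significantly": the standard errors are too large relative to the gap. This is also why the larger-than-usual TUDA samples matter; bigger samples shrink standard errors and make smaller differences detectable.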
Figure A. Average NAEP reading scores, grade 4 and grade 8: By urban district, 2002 and 2003
* Significantly different from 2003.
† Not applicable. Did not participate in 2002.
1 Data for grade 8 for New York City were not published in 2002 because the district did not meet the required 70 percent school participation rate.
NOTE: NAEP sample sizes have increased since 2002, resulting in smaller detectable differences than in previous assessments. Significance tests were performed using unrounded numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2002 and 2003 Trial Urban District Reading Assessments. (Originally published as the figure on p. 1 of the publication from which this article is excerpted.)
Achievement Levels Provide Standards for Student Performance
Achievement levels are performance standards set by NAGB to provide a context for interpreting student performance on NAEP. These performance standards, based on recommendations from broadly representative panels of educators and members of the public, are used to report what students should know and be able to do at the Basic, Proficient, and Advanced levels of performance in each subject area and at each grade assessed.1 The minimum scale scores for achievement levels are as follows:
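The cut-score table referenced above did not survive this excerpt. As a sketch of how a 0–500 scale score maps to an achievement level, the cut points hard-coded below are the published NAEP reading cut scores, supplied here from the full report rather than from this excerpt:

```python
# Cut scores taken from the published NAEP reading achievement levels
# (from the full report, not this excerpt):
#   grade 4: Basic 208, Proficient 238, Advanced 268
#   grade 8: Basic 243, Proficient 281, Advanced 323
CUT_SCORES = {
    4: [("Advanced", 268), ("Proficient", 238), ("Basic", 208)],
    8: [("Advanced", 323), ("Proficient", 281), ("Basic", 243)],
}

def achievement_level(grade: int, scale_score: float) -> str:
    """Map a 0-500 reading scale score to its achievement level.

    Checks the highest level first; a score below the Basic cut
    point falls into the "Below Basic" reporting category.
    """
    for level, minimum in CUT_SCORES[grade]:
        if scale_score >= minimum:
            return level
    return "Below Basic"

print(achievement_level(4, 245))  # -> Proficient
print(achievement_level(8, 245))  # -> Basic
```

Note that the same scale score can fall into different achievement levels at different grades, since each grade has its own cut points.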
As provided by law, NCES, upon review of a congressionally mandated evaluation of NAEP, has determined that achievement levels are to be used on a trial basis and should be interpreted and used with caution. However, both NCES and NAGB believe that these performance standards are useful for understanding trends in student achievement. NAEP achievement levels have been widely used by national and state officials.
NAEP 2003 Reading Assessment Design
Assessment framework
The NAEP reading framework, which defines the content for the 2003 assessment, was developed through a comprehensive national consultative process and adopted by NAGB. The reading framework is organized along two dimensions, the context for reading and the aspect of reading. The context for reading dimension is divided into three areas that characterize the purposes for reading: reading for literary experience, reading for information, and reading to perform a task. Reading to perform a task is not assessed at grade 4, but all three contexts are assessed at grade 8. The aspects of reading, which define the types of comprehension questions used in the assessments, include forming a general understanding, developing an interpretation, making reader/text connections, and examining content and structure. Each student read one or two passages and responded to approximately 13–20 questions in 50 minutes. The complete framework is available on the NAGB web site (http://www.nagb.org/pubs/pubs.html).
Student samples
Results from the 2002 and 2003 TUDA are reported for the participating districts for public school students at grades 4 and 8. The TUDA employed larger-than-usual samples within the districts, making reliable district-level data possible. The samples were also large enough to provide reliable estimates for subgroups within the districts, such as female students or Hispanic students. Data for grade 8 in New York City were not published for 2002 because the district did not meet the required 70 percent school participation rate.
Accommodations
It is NAEP's intent to assess all selected students from the target population. Beginning in 2002, SD students and LEP students who require accommodations have been permitted to use them in NAEP, unless a particular accommodation would alter the skills and knowledge being tested. For example, in a reading assessment, NAEP does not permit the reading passages to be read aloud. Because the representativeness of samples is ultimately a validity issue, NCES has commissioned studies of the impact of assessment accommodations on overall scores. One paper that explores the impact of two possible scenarios on NAEP is available on the NAEP web site (http://nces.ed.gov/nationsreportcard/pdf/main2002/statmeth.pdf).
Achievement-Level Results for Urban Districts
Among the districts that participated in both 2002 and 2003, the percentages of students at or above Proficient were found to be significantly higher in 2003 for students in Chicago at grade 4, and for students in Atlanta at grade 8. In all other participating districts, the percentages at or above Proficient were not found to differ from 2002 to 2003. The percentages at or above Proficient for public school students nationally were not found to differ significantly in 2002 from the corresponding percentages in 2003 at either grade 4 or grade 8. At grade 4, the percentage of students at or above Proficient in large central city public schools was higher in 2003 than in 2002. At grades 4 and 8, the percentage of students at or above Proficient in all urban districts was lower than that for the nation, except for Charlotte, where the percentage of students at or above Proficient was not significantly different from that of the nation.2
Percentile Results From 2002 to 2003
Looking at changes in scores (for districts with 2 years of participation) for students at higher, middle, and lower performance levels gives a more complete picture of student progress. An examination of scores at different percentiles on the 0–500 reading scale at each grade indicates whether changes in average score results are reflected in the performance of lower-, middle-, and higher-performing students. Comparing scores at percentiles also shows differences in performance across levels within 1 year. The percentile indicates the percentage of students whose scores fell below a particular score. For example, in 2003, a fourth-grade public school student would have had to score at least 193 to score above the 25th percentile in the nation, but would have had to score only 179 or better to score above the 25th percentile compared with students in large central cities.

At grade 4, the national and large central city public school scores at the 25th, 50th, and 75th percentiles were not found to differ significantly from 2002 to 2003; the scores for the 50th and 75th percentiles for students in Chicago were higher in 2003 than in 2002. The score for students in the District of Columbia at the 25th percentile was lower in 2003 than in 2002. At grade 8, scores for public school students in the nation were lower at the 25th and the 50th percentiles in 2003 than in 2002; the score for students in Houston at the 75th percentile was also lower in 2003 than in 2002. Scores at the 25th, 50th, and 75th percentiles for students in large central cities were not found to differ significantly between 2002 and 2003 at grade 8.
How Various Groups of Students Performed in Reading
In addition to reporting the overall performance of assessed students, NAEP also reports on the performance of various subgroups of students. Five of the nine districts, as well as the District of Columbia, were assessed in both 2002 and 2003, so comparisons over time can indicate whether a subgroup has progressed. Additionally, subgroups can be compared to each other within an assessment year. When reading these subgroup results, it is important to keep in mind that there is no simple, cause-and-effect relationship between membership in a subgroup and achievement in NAEP. A complex mix of educational and socioeconomic factors may interact to affect student performance.
Gender
Average reading scores by gender. Table A presents the percentages of assessed male and female students and average reading scores in the 2 assessment years, where applicable. In 2003, at grade 4, female students scored higher, on average, than male students in every district (except Atlanta and Houston), in the nation, and in large central cities. Where data were available in both assessment years, there were no significant differences detected in any district for male students or female students between their respective average score in 2002 and their average score in 2003. At grade 8, while the average score for male students in public schools in the nation declined, the average scores for both male and female students in each of the districts and in large central cities in 2003 were not found to differ significantly from those in 2002 (table A). Female eighth-graders scored higher, on average, than male eighth-graders in the 10 urban districts, in large central cities, and in the nation.

Average reading score gaps between female and male students. In 2003, female public school students in the nation scored higher, on average, than male students by 8 points at grade 4 and by 11 points at grade 8. At grade 4, the score gaps between female and male students in Charlotte and the District of Columbia were wider than the gaps in the nation and large central cities. At grade 8, the score gap was wider in the District of Columbia than in public schools in large central cities and narrower in Chicago than in the nation.

Achievement-level results by gender. In 2003 at grade 4, Charlotte had a higher percentage of female students performing at or above Proficient than the nation, but no statistically significant difference was found between the percentage of male students at or above Proficient in Charlotte and those at or above Proficient in the nation.
Compared to the nation, 9 of the 10 urban districts had lower percentages of both female and male fourth-grade students who performed at or above Proficient. Compared to public schools in large central cities, Charlotte had higher percentages of both male and female fourth-grade students who performed at or above Proficient. In New York City, the percentage of female fourth-grade students performing at or above Proficient was also higher than that recorded in large central cities. At grade 8, greater percentages of both male and female students in Charlotte performed at or above Proficient than their peers in public schools in large central cities. The percentages of eighth-grade male students at or above Proficient in Boston, Chicago, New York City, and San Diego and of female eighth-graders in Boston and San Diego were not found to differ significantly from the percentages of their counterparts at or above Proficient in large central cities. At both grades 4 and 8, the percentages of male and female students performing at or above Proficient were not found to differ statistically in 2003 from the percentages in 2002 in the nation, in large central cities, or in any of the districts that participated in both assessments.
Table A. Average reading scale score results, by gender, grades 4 and 8 public schools: By urban district, 2002 and 2003
— Not available.
* Significantly different from large central city public schools.
** Significantly different from nation (public schools).
*** Significantly different from 2003.
NOTE: NAEP sample sizes have increased since 2002, resulting in smaller detectable differences than in previous assessments. Detail may not sum to totals because of rounding. Significance tests were performed using unrounded numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2002 and 2003 Trial Urban District Reading Assessments. (Originally published as the first table on p. 5 of the publication from which this article is excerpted.)
Race/ethnicity
Average reading scores by race/ethnicity. In each of the urban districts participating in the 2003 TUDA, Black students and/or Hispanic students constituted the majority or the largest racial/ethnic subgroup in both grades 4 and 8. This distribution differed from that for the 2003 national assessment, in which White students constituted a majority—59 percent of the fourth-grade sample and 61 percent of the eighth-grade sample (table B). At grade 4, Black students in Chicago scored higher on average in 2003 than in 2002, and Black students in the District of Columbia scored lower in 2003 than their counterparts in 2002 (table B). No significant difference was found between the national or large central city overall scores in 2003 and those for 2002 for any racial/ethnic subgroup. At grade 8, there was also no average score difference detected between 2002 and 2003 for any subgroup in the nation, in large central cities, or in the participating urban districts, except that Black eighth-graders in Atlanta scored higher on average in 2003 than in 2002 (table B). Statistically significant differences between racial/ethnic subgroups in the districts and their counterparts in the nation and in large central cities within the 2003 assessments are marked with asterisks in table B, as are statistically significant differences between 2002 and 2003.

Average reading score gaps between selected racial/ethnic subgroups. At grade 4, the gaps between the average scores of White and Black students in Cleveland and Boston were narrower than the corresponding gap in large central cities. The gap between average scores of White and Hispanic students in Cleveland was also narrower than that in large central cities. The gaps between the average scores for White and Black students in Atlanta and the District of Columbia were wider than the corresponding gaps in large central cities and the nation.
Similarly, the District of Columbia and San Diego had wider gaps between White students' and Hispanic students' average scores than the gap found in the nation. At grade 8, there was a narrower gap in Cleveland between White and Black students' scores and a narrower gap in Chicago between White and Hispanic students' scores than the corresponding gaps in large central cities and the nation. Los Angeles had a wider gap between White students' and Hispanic students' average scores than the corresponding gaps found in large central cities and the nation.

Achievement-level results by race/ethnicity. At grade 4, no significant differences were detected between 2002 and 2003 in the percentages of subgroups of students at or above Proficient in public schools in the nation, in large central cities, or in any of the participating urban districts. At grade 8, there were also no significant differences detected between 2002 and 2003 in the percentages of subgroups of students performing at or above Proficient, except that Black eighth-grade students in Atlanta had a higher percentage at or above Proficient in 2003 than did their counterparts in 2002.
Table B. Average reading scale score results, by selected race/ethnicity, grades 4 and 8 public schools: By urban district, 2002 and 2003
— Not available.
‡ Reporting standards not met. Sample size is insufficient to permit a reliable estimate.
* Significantly different from large central city public schools.
** Significantly different from nation (public schools).
*** Significantly different from 2003.
NOTE: NAEP sample sizes have increased since 2002, resulting in smaller detectable differences than in previous assessments. Significance tests were performed using unrounded numbers.
SOURCE: U.S. Department of Education, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 2002 and 2003 Trial Urban District Reading Assessments. (Adapted from the table on p. 7 of the publication from which this article is excerpted.)
Eligibility for free/reduced-price lunch
Reading performance by students' eligibility for free/reduced-price lunch. NAEP collects data on students' eligibility for free/reduced-price lunch as an indicator of economic status. In 2003, approximately 7 percent of fourth-graders and 6 percent of eighth-graders nationally attended schools that did not participate in the National School Lunch Program. Information regarding students' eligibility in 2003 was not available for 2 percent or less of fourth- and eighth-graders. For information on the National School Lunch Program, see http://www.fns.usda.gov/cnd/lunch/default.htm.

At grade 4, no statistically significant differences from 2002 to 2003 were detected between the average scores or the percentages of students at or above Proficient in the nation or large central cities for students who were eligible for free/reduced-price lunch or for those who were not eligible. Among the participating urban districts, there were also no significant differences for these measures in 2002 and 2003, except in New York City, where students who were not eligible for free/reduced-price lunch had a higher average scale score in 2003 than in 2002. At grade 8, students in public schools in the nation who were eligible for free/reduced-price lunch scored lower, on average, in 2003 than did their counterparts in 2002. For the participating districts, there were no significant differences detected in the average scores between 2002 and 2003, except that eighth-graders in Atlanta who were not eligible for free/reduced-price lunch scored higher in 2003 than did their counterparts in 2002. Similarly, at grade 8, students in Atlanta who were not eligible for free/reduced-price lunch were the only group whose percentage of students at or above Proficient was significantly higher in 2003 than in 2002.

Average reading score gaps between students who were eligible and those who were not eligible for free/reduced-price lunch.
In 2003, public school students who were not eligible for free/reduced-price lunch scored higher, on average, than eligible students, by 28 points at grade 4 and 25 points at grade 8. At grade 4, the gap in Houston was narrower than the gaps in large central cities and the nation, while the gap in Charlotte was wider than those in both large central cities and the nation. At grade 8, the District of Columbia and Houston had narrower score gaps than those in large central cities and the nation, while Charlotte and New York City had wider gaps in average scores than the gap found in large central cities.
Reading performance by student-reported highest level of parents' education, grade 8
Eighth-grade students who participated in the NAEP 2002 and 2003 reading assessments, including those in the TUDA, were asked to indicate, from among five options, the highest level of education completed by each parent. The question was not posed to fourth-graders. In 2003, the average scores for students who indicated that a parent graduated from college were lower in Atlanta, Chicago, Cleveland, the District of Columbia, and Los Angeles than the average score for students in the same parental education category in public schools in large central cities. Average scores for students who reported that a parent graduated from college were higher in Charlotte than average scores for comparable students in large central cities. Among eighth-graders in public schools nationally, average scores were lower in 2003 than in 2002 for students who indicated that a parent's highest level of education was not finishing high school, graduating from high school, or graduating from college, and for students who indicated that they did not know their parents' highest level of education. Among the participating urban districts, no statistically significant differences in average scores were detected between 2003 and 2002 at any level of parental education.
Testing Status of Special-Needs Students Selected in NAEP Samples
NAEP endeavors to assess all students selected in the randomized sampling process, including SD students and students who are classified by their schools as LEP students. Some students who are sampled for participation, however, can be excluded from the sample according to carefully defined criteria. School personnel, guided by the student's Individualized Education Program (IEP), as well as by eligibility for Section 504 services, make decisions regarding inclusion in the assessment of SD students. Based on NAEP's guidelines, they also make the decision regarding inclusion of LEP students. The process includes evaluating the student's capability to participate in the assessment in English, as well as taking into consideration the number of years the student has been receiving instruction in English. The percentage of students excluded from NAEP may vary considerably across states or districts. Comparisons of achievement results across districts should be interpreted with caution if the exclusion rates vary widely.
Footnotes
1 The NAEP achievement levels are as follows. Basic denotes partial mastery of prerequisite knowledge and skills that are fundamental for proficient work at each grade. Proficient represents solid academic performance for each grade assessed. Students reaching this level have demonstrated competency over challenging subject matter, including subject-matter knowledge, application of such knowledge to real-world situations, and analytical skills appropriate to the subject matter. Advanced signifies superior performance. Detailed descriptions of the NAEP reading achievement levels can be found on the NAGB web site (http://www.nagb.org/pubs/pubs.html).
2 For Charlotte and Los Angeles, statistical comparisons restricted to just the schools in large central cities, as distinct from the whole-district comparisons used here, are available from the online Data Tool on the NAEP web site (http://nces.ed.gov/nationsreportcard/nde). The results of significance tests in this report for these two districts may differ slightly from those found by type of location in the online Data Tool.