
Jack Buckley
Commissioner, National Center for Education Statistics

National Assessment of Educational Progress
Vocabulary Results From the 2009 and 2011 NAEP Reading Assessments

December 6, 2012

Commissioner Jack Buckley's Briefing Slides MS PowerPoint (9 MB)

Good morning. I am here today to share with you results from our new report, Vocabulary Results From the 2009 and 2011 NAEP Reading Assessments. This report is the first NAEP report to present vocabulary results.

With the new NAEP vocabulary assessment, NCES explored the relationship between students' knowledge of what words mean and reading comprehension. As we know, reading requires a fundamental knowledge of the meaning of words. So, it is important to recognize how well students' vocabulary contributes to their understanding of what they read. The NAEP vocabulary assessment does just that: it looks at how well students understand how a word contributes meaning to all or part of a passage, rather than asking students to define an isolated word.

When students encounter a word, they might think of several different definitions, depending on their prior knowledge and experience. For instance, in the NAEP vocabulary assessment, when shown the passage "Ducklings Come Home to Boston," students read the word "puzzled." Some students might think of a jigsaw puzzle piece or crossword puzzle. Other students might think of being confused or perplexed. But when placed into the context of the passage, only one of these word meanings fits.

In this particular passage, the word "puzzled" was used to convey confusion about why there were no ducks at the Public Garden in Boston. "Puzzled" is just one example of many words that students read as part of the NAEP vocabulary assessment. From this assessment we'll learn whether students at grades 4, 8, and 12 were able to understand a variety of words in context.

We assessed vocabulary as part of the 2009 and 2011 reading assessments. In 2009, we assessed reading at all three grade levels, while in 2011 we assessed grades 4 and 8. For both assessment years, the fourth- and eighth-grade national samples were composed of representative samples of all 50 states, plus the District of Columbia and the Department of Defense school system. These samples were combined to create the national samples. The grade 12 sample was a national sample only, but we also have separate state results for 11 states at grade 12 (these states participated at grade 12 on a trial basis in 2009).

The results are presented in two ways: as the percentages of students who correctly answered the vocabulary questions, and as scale scores on a 0-500 vocabulary scale. We do not have achievement levels (Basic, Proficient, Advanced) specific to vocabulary.

Vocabulary questions appeared in two different types of sections of the NAEP reading assessments. Comprehension sections included questions covering all aspects of the reading assessment. Students were asked to read full-length passages, which ranged from 800 to 1,200 words in length, depending on grade. Each of these sections contained about 10 questions, which could be either multiple-choice or constructed-response (questions that require a written response from the student). Two of the 10 were vocabulary questions, which were multiple-choice only.

The vocabulary-only sections had shorter reading passages—about half the length of the passages in the comprehension sections. There were about five vocabulary questions in these sections, which again were all multiple-choice.

Vocabulary words had to meet the following criteria:

  • They had to be characteristic of written language rather than conversational language.
  • They had to be words that could be used across several content areas, rather than technical terms whose use is confined to one content area.
  • They had to present familiar concepts, even if the word itself was not known.
  • In addition, the words had to be necessary for understanding all or part of the passage in which they occurred.

National Results

Now we'll look at student vocabulary performance using scale scores, starting with national results. Because all these score results are based on samples, when we compare scores, we only discuss differences in scores that are statistically significant.

We can identify lower-, middle-, and higher-performing students on the 0-500 vocabulary scale using percentiles. Scores for lower-performing students are represented by the 10th and 25th percentiles, the 50th percentile (the median) represents middle-performing students, and the 75th and 90th percentiles represent higher-performing students.
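As a rough illustration of how these five reporting percentiles summarize a score distribution, the sketch below computes them for a simulated sample on a 0-500 scale. The scores are randomly generated (assuming a roughly normal distribution), not actual NAEP data, and the mean and spread chosen are purely illustrative.

```python
# Illustrative only: simulated scores on a 0-500 scale, NOT NAEP data.
import numpy as np

rng = np.random.default_rng(0)
# Assume a roughly normal distribution of scale scores (hypothetical
# center and spread), clipped to the 0-500 reporting scale.
scores = np.clip(rng.normal(loc=218, scale=35, size=10_000), 0, 500)

# The five percentiles used in the report's percentile-based comparisons.
percentiles = [10, 25, 50, 75, 90]
cuts = np.percentile(scores, percentiles)

for p, c in zip(percentiles, cuts):
    print(f"{p}th percentile: {c:.0f}")
```

Comparing the same percentile across assessment years (for example, the 90th percentile in 2009 versus 2011) is how changes for higher- and lower-performing students are reported below.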

At grade 4, the score for students at the 90th percentile decreased from 269 in 2009 to 266 in 2011, while the score for the 75th percentile decreased from 247 to 245. The remaining percentiles did not show statistically significant changes. We make similar comparisons for grade 8 students in 2009 and 2011. At grade 8, scores also decreased for higher-performing students, but increased for students at the 10th percentile.

We examined student vocabulary performance in connection with reading comprehension, as measured by performance on the overall NAEP reading assessment. On average, students who scored higher in reading comprehension also scored higher on vocabulary questions. This association was observed in grades 4 and 8 in 2011 and in grade 12 in 2009.

Examination of the White-Black score gap at grades 4 and 8 for vocabulary shows no significant change from 2009 to 2011. For grade 4, the 27-point gap in 2009 was not significantly different from the 29-point gap in 2011. For grade 8, the 30-point gap in 2009 was not significantly different from the 29-point gap in 2011.

There was no significant change in the White-Hispanic gap from 2009 to 2011 at grade 4, but at grade 8 there was a decrease, from 30 to 28 points. Neither the White nor the Hispanic score changed significantly over the two assessments, but the gap between them did.

At grade 4, the score differences between White and Asian/Pacific Islander students were not statistically significant for either year. At grade 8, the 5-point differences were significant in both years, with no significant change in the gap from 2009 to 2011.

For grade 12, we only have results for 2009. Grade 12 White students had a score of 307, higher than the scores for either Black or Hispanic students, but not significantly different from the score for Asian/Pacific Islander students.

Vocabulary scores for male and female students are compared at all three grades, using 2011 scores for grades 4 and 8 and 2009 scores for grade 12. At grades 4 and 8, female students had a higher average score, by a 2- or 3-point margin. At grade 12, the difference in scores was not statistically significant.

State Results

We have complete state results at grades 4 and 8 for both 2009 and 2011. For grade 12, we have limited results—for 11 states, in 2009 only.

We compare the average vocabulary score for each state with the national average at grade 4 in 2011. Twenty states had a score that was higher than the nation, while 12 had an average that was lower. In the remaining states, the vocabulary score was not significantly different from the national score.

At grade 8, 23 states had a score that was higher than the nation, while 13 had an average that was lower. For both grades, the states with higher scores are concentrated in the north-central and northeastern areas of the country.

Grade 12 comparisons for 2009 show that 3 states out of 11 participating had a higher vocabulary score than the nation. The three, all located in New England, were New Hampshire, Massachusetts, and Connecticut. Two states—Arkansas and Florida—had an average score lower than the nation.

The report, Vocabulary Results From the 2009 and 2011 NAEP Reading Assessments, provides all of this information and much more. In addition, the initial release website gives extensive information on the performance of students and access to released assessment questions through NAEP's Questions Center. The NAEP Data Explorer, our online data-analysis tool, allows extensive further analysis of student performance as well.

In conclusion, I would like to offer my sincere thanks to all the students, teachers, and schools who participated in the 2009 and 2011 reading assessments.


Visit the Nation's Report Card website.


National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education