Acting Commissioner, National Center for Education Statistics
National Assessment of Educational Progress
Arts 2008: Music and Visual Arts
June 15, 2009
Good morning. I'm here today to release the results of the 2008 NAEP Arts Assessment.
This is our first arts assessment since 1997. Like the 1997 assessment, the 2008 assessment covered eighth-grade students only. We assessed students in two arts disciplines, music and the visual arts. Our representative samples included over 3,900 students in each discipline. We conducted the assessment early last year.
The 1997 framework developed by the Governing Board envisioned an assessment that would include music, visual arts, theatre, and dance, but in 1997 we discovered that few schools offered systematic instruction in either theatre or dance. For reasons of both cost and sampling difficulty, we decided not to assess theatre or dance in 2008.
We assessed student abilities in two different arts processes. The first process is "responding," which generally involves presenting students with a work of art and asking them to answer either multiple-choice or constructed-response questions about the artwork. For visual arts only, we also assessed student abilities in the "creating" process. That is, we asked them to create artworks, which we then scored. Again because of cost considerations, we did not give students creating tasks in music.
In both music and visual arts, the tasks, whether responding or creating, had significant intellectual content and often required academic knowledge. We weren't asking students for a simple emotional reaction; rather, we asked them to demonstrate their academic knowledge and exercise critical and analytical skills in their responses and in their artistic creations. For example, in music, students were asked to name a piano dynamic marking and explain its meaning. In visual arts, they were asked to identify an example of Renaissance art from among several choices.
For the responding process in both music and visual arts, we were able to create 0-300 scales on which to present student performance. For the creating tasks, however, we did not have enough questions to allow the development of a 0-300 scale. Instead, we used what we call the "average percentage of the maximum possible score." The scoring for each creating question allowed students full or partial credit, depending on the quality of their work. For example, students were asked to create a self-portrait, and their work was rated as "Sufficient," "Uneven," "Minimal," or "Insufficient." I will show some examples of students' self-portraits later in this presentation.
The individual question scores were averaged together to create a single summary score for all the creating tasks. We also used summary scores for student groups—race/ethnicity or gender, for example—to make comparisons between those groups.
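As a purely hypothetical illustration of that averaging (the point values and scores below are invented for the example, not actual NAEP data), the percent-of-maximum summary works like this:

```python
# Hypothetical sketch of the "average percentage of the maximum possible
# score" used to summarize the creating tasks. The tasks, point values,
# and scores below are invented for illustration only.

# (points earned, maximum possible points) for each creating question
creating_scores = [
    (2, 3),  # one task scored 2 of a possible 3 points
    (3, 3),  # another task earned full credit
    (1, 3),  # a third task earned minimal credit
]

# Express each question as a percent of its maximum, then average.
percents = [earned / maximum * 100 for earned, maximum in creating_scores]
average_percent_of_max = sum(percents) / len(percents)

print(round(average_percent_of_max, 1))
```

The same averaging can be run separately over each student group's responses to compare, say, male and female students on a common percent-of-maximum footing.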
Comparisons between the 2008 and 1997 arts assessments were limited for several reasons. Since 1997, some of the methods we use for training scorers have changed, some of the student artwork used for training scorers in 1997 has cracked or faded over time, and the art supplies and tools available today are not exactly the same as those available in 1997.
Although we cannot compare overall performance on the assessments over time, we were able to compare performance over time on individual multiple-choice questions. But these do not reflect overall achievement in music or visual arts. We were also able to make comparisons between 1997 and 2008 regarding student participation in arts activities, which I'll describe in a few moments.
First I will describe the results for Music. These results, like the results for every NAEP assessment, are based on samples, so there is a margin of error associated with every score. When comparing NAEP scores, we only cite differences that are larger than the margin of error—those that are statistically significant.
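As a rough sketch of that rule, with invented numbers rather than actual NAEP standard errors, a difference between two group averages is reported only when it exceeds its margin of error:

```python
import math

# Sketch of the significance rule described above: a difference between
# two independent group averages is cited only if it is larger than its
# margin of error at the 95 percent confidence level. All numbers below
# are hypothetical, not actual NAEP estimates.

def significantly_different(mean_a, se_a, mean_b, se_b, z=1.96):
    """True if the gap between two independent group means exceeds
    its 95 percent margin of error."""
    diff = abs(mean_a - mean_b)
    margin = z * math.sqrt(se_a ** 2 + se_b ** 2)
    return diff > margin

# Hypothetical group averages on the 0-300 responding scale.
print(significantly_different(158, 1.5, 150, 1.8))  # large gap: True
print(significantly_different(152, 1.5, 150, 1.8))  # small gap: False
```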
For Music we have responding score results only, using the 0-300 scale, with the average score for all students set at 150. Looking at scores for students by race/ethnicity, we see that White and Asian/Pacific Islander students scored higher, on average, than Black and Hispanic students. (We do not have average scores for American Indian/Alaska Native students in the arts assessment because our samples were not large enough to produce reportable results for those students.)
The average score for female students in music was higher than that for male students.
Students attending private schools had higher scores, on average, than those attending public schools. Viewed by school location, students attending city schools scored lower, on average, than those attending suburban, town, or rural schools.
We asked school administrators about the frequency of music instruction in their schools. According to their responses, in 2008 about 57 percent of eighth-graders attended schools offering music instruction at least three or four times a week. There was no significant difference between this 2008 percentage and the corresponding percentage obtained in the 1997 assessment.
We asked students a number of questions about what sorts of music activities their teachers asked them to do in class, such as listening to music, singing, or playing an instrument. The percentage who said they were asked to write down music increased from 26 percent in 1997 to 33 percent in 2008. This was the only activity for which the percentage changed significantly over time.
As an example of a music responding question, students listened to the opening measures of George Gershwin's Rhapsody in Blue and were asked to identify the solo instrument. Fifty percent of the students correctly identified the instrument as a clarinet.
In another question, students listened to the opening measures of a fugue, in which instruments playing a similar melody enter one at a time, and were asked to choose the one of four diagrams that seemed most similar to the texture of the music. Fifty-two percent of eighth-graders correctly identified the drawing showing a set of four similar wavy lines of differing lengths as suggesting the pattern of the fugue.
As I noted earlier, we cannot compare student music responding scale scores for 2008 with those for 1997. However, we can compare the percentage correct for 21 multiple-choice questions from both assessments, and we can sum the percentage correct for these 21 questions to create an overall percentage correct for these items.
As we see, this overall percentage correct fell from 53 percent in 1997 to 51 percent in 2008. This overall percentage correct is not a measure of what students know and can do in responding to music because it omits all the information from the constructed-response questions in the assessment.
When we look at the questions one by one, we see that percentages correct decreased for five questions and increased for one.
Now I will turn to the visual arts portion of the assessment, which includes results for both responding and creating.
The overall average responding score for visual arts was set at 150, as it was in music. Again, White and Asian/Pacific Islander students had higher scores than Black and Hispanic students. In addition, female students outscored male students.
In visual arts, the 10-point difference in scores for private versus public school students was not statistically significant. Students attending suburban schools had a higher average score than students attending city schools.
As I explained earlier, score results for creating in visual arts are stated as a percent of the maximum possible score. The overall average was 52 percent of the maximum. White and Asian/Pacific Islander students had higher scores than Black and Hispanic students, and female students scored higher than male students.
We asked school administrators how often their school offered instruction in visual arts. According to their responses, in 2008 about 47 percent of eighth-grade students attended schools offering visual arts instruction at least three or four times a week. There were no significant changes since 1997 in the reported frequency of visual arts instruction.
We also asked students about the kinds of visual arts activities their teachers asked them to do in class. The percentage who said their teachers allowed them to choose their own art project fell from 47 percent in 1997 to 39 percent in 2008. In contrast, the percentage who said they were asked to write about their own artwork rose from 21 percent to 27 percent. These were the only in-class activities in which we saw a significant change. However, the percentage of eighth-graders who reported that they visited an art museum or gallery with their class dropped from 22 percent in 1997 to 16 percent in 2008.
As an example of how students were assessed on the responding aspect of visual arts, students were asked a number of questions about a self-portrait by the German artist Käthe Kollwitz, done with charcoal. They were also asked about a second self-portrait by the Austrian painter Egon Schiele, done with crayon and watercolor. Among other things, we asked students to identify a technical similarity between the two self-portraits, choosing from among four alternatives. Thirty-seven percent chose the correct option, "Both works combine loose gestural lines with careful drawing."
There were 12 common multiple-choice questions on the 1997 and 2008 visual arts assessments. The percent correct for these 12 questions overall was 42 percent in both 1997 and 2008. The percent correct for one question was lower in 2008 than in 1997. For the other 11 questions, the differences between 1997 and 2008 were not statistically significant.
In the creating part of the visual arts assessment, we asked students to create a self-portrait. They were given white drawing paper, colored oil pastels, a mirror, and a charcoal pencil. Their self-portraits were rated as "Sufficient," "Uneven," "Minimal," or "Insufficient."
Four percent of self-portraits received the highest rating, "Sufficient."
One self-portrait rated "Sufficient" showed clear and specific observations that communicated something important about the artist. The work incorporated identifying details, for example, expressive facial features. It also showed purposeful use of compositional elements and sophisticated use of materials, such as the loose, skillful lines used to draw a jacket, which added definition to the body. The student's choice to color only his face and T-shirt focused the viewer on his face and expression and created contrast between his figure and the surrounding space. His choice to stylize his features and to use color on his face as he did also suggests that he spent time observing the Schiele self-portrait in particular.
Another student's self-portrait, also rated "Sufficient," showed clear and specific observations that communicated something important about herself as a subject. The work was fully developed and realized, and showed very good use of proportion, color, and line. In particular, the student skillfully used color to emphasize and create contrast between specific parts of her self-portrait.
Twenty-five percent of self-portraits were rated as "Uneven." In one self-portrait rated "Uneven," the student gave her work individuality by vivid use of color, facial expression, and the symbols incorporated in her jewelry and the background. However, elements of her work seem inconsistent and lacking in deliberation, such as the placement and rendering of the symbols and colors in the background.
Fifty-seven percent of self-portraits were rated as "Minimal." In one self-portrait rated "Minimal," the student made efforts at specific observation, but those efforts were slight, for example, red lines in the eyes. Overall, the use of materials in this self-portrait was unskilled. For example, while the student may have been attempting to convey some sense of an individual person by emphasizing only his eyes and mouth with color, he lacked the skill to make this choice distinctive enough to convey his message.
Fourteen percent of students' self-portraits were rated "Insufficient." One example of a self-portrait rated "Insufficient" shows nonspecific observation, little awareness of composition, and highly unskilled use of materials. In contrast to the "Minimal" response, there were no features in this self-portrait that conveyed anything specific about a person, and it remained at a general level.
For More Information
This concludes my overview of the 2008 Arts Report Card. The patterns of differences in scores, for example, by race/ethnicity, gender, and type of school, were similar though not identical for music and the visual arts. The score differences by race/ethnicity were similar to those we see in other NAEP Report Cards. While we could not make a full comparison with the 1997 assessment, the items we could compare showed generally comparable results.
You'll find much more information in the Arts Report Card itself, along with additional information available from the NAEP website. In particular, the NAEP Questions Tool allows access to questions from the assessment, student answers and artwork, and scoring guides to show you how we scored the students' work.
In closing, I would like to thank the teachers, schools, and students who participated in this assessment. Without their cooperation and hard work, we wouldn't have this report for you today.