The teaching and learning of mathematics in our nation's schools continue to generate tremendous attention, both among those who support recent innovations and, more recently, among those who question the wisdom of the promulgated reforms.  In order to bring an empirical basis to this debate, it is important to gather information on the policies and practices that are actually being implemented, and this report provides one source for such information. Written for policy makers and school administrators, this report is the second of a series that discusses results from the National Assessment of Educational Progress (NAEP) 1996 mathematics assessment.
General information about student performance is presented in the first report from the assessment: NAEP 1996 Mathematics Report Card for the Nation and the States.  A third report, tentatively titled Student Work and Classroom Practices in Mathematics, includes information about student performance within the various mathematics content strands; provides numerous examples of the assessment questions and student responses to those questions; and describes teachers' instructional practices and students' course-taking and attitudes toward mathematics. Finally, the fourth report in the series, tentatively titled Focused Studies in NAEP's 1996 Mathematics Assessment, describes special studies conducted as part of NAEP in 1996. These studies assessed student achievement in three areas of mathematics: estimation, problem solving within a real-life context, and challenging, higher-level-content problems.
In the NAEP 1996 Mathematics Report Card for the Nation and the States, national gains in students' mathematics scores were reported at all three grade levels: 4, 8, and 12.  Notably, average national scores, which had already increased between 1990 and 1992, increased again in 1996. Gains were also reported in many jurisdictions that took part in the NAEP state assessments. At grade 4, 15 of the 39 states and jurisdictions that participated in both the 1992 and 1996 assessments recorded an increase in their average mathematics scores. Similar gains were evident for 13 of the 37 states and jurisdictions that participated in both the 1992 and 1996 assessments at grade 8. No state NAEP assessments were conducted at grade 12.
This report describes the educational policies and practices that prevailed during this period of sustained increases in mathematics achievement, with particular attention to the relationship between these policies and practices and student performance on the NAEP mathematics assessment. More specifically, it provides information on the status of mathematics education in 1996 and chronicles the changes that had taken place since earlier NAEP assessments.
The report is based on information provided by students, their teachers, and school administrators through background questionnaires that NAEP administers concurrently with its assessments.  Students at grades 4, 8, and 12 answered questions about their home backgrounds, the instruction they received, and their course-taking. Fourth- and eighth-grade teachers who participated in the assessment provided information about their education, professional careers, curricular practices, and instructional approaches, as well as the resources available to them for teaching mathematics.  School administrators answered questions about school policies and practices.
The report is organized around three central questions:
The major findings reported below include information about the status of teachers and mathematics instruction in our nation's schools, as well as the relationships between student achievement in mathematics and teacher characteristics, school policies, and practices in mathematics education. In general, we have highlighted positive relationships. However, the reader should keep in mind the limitations of survey data of the kind collected by NAEP. Statistically significant associations between particular policies or practices and achievement can provide an interesting starting point for analysis or deliberation, but they cannot demonstrate a causal relationship. Additionally, the absence of significant changes, or of significant relationships to achievement, for the variables reported from the NAEP survey is not necessarily evidence that our nation has made no progress in reforming the policies and practices that improve mathematics education. Perhaps an appropriate conclusion from this report is that the factors that affect the teaching and learning of mathematics in our nation's classrooms rarely, if ever, work in isolation. The following are major findings from this report.
Interpreting NAEP Results
The central mandate of NAEP is to provide information on what the nation's students know and can do in a variety of content areas. In addition, for over 25 years NAEP has regularly provided the nation's only comprehensive, recurrent data about the processes of education in the nation's schools. The latter information is intended to serve a number of important purposes. Specifically, it provides an educational context for understanding data on student achievement, it identifies differences in access to instruction and distribution of services among various types of students, and it tracks changes in policy-relevant variables across time. The findings reported above and the details in the chapters that follow are illustrations of how NAEP data serve these purposes.
However, there are some cautions that users of the information presented in this report should keep in mind. Much of the data were collected by self-report, with participants responding to a brief, written questionnaire. Although the questions were written as clearly and unambiguously as possible, respondents working in different contexts or educated from different perspectives may have interpreted some of the questions differently. The reader should also use caution in interpreting tables that portray the association between NAEP background factors and mathematics achievement. In general, one contextual variable is presented at a time. Because of the complexity of the context in which learning takes place, examining a single variable and its relationship to student achievement may not reveal the true underlying relationships between background factors and students' cognitive performance. For example, some instructional strategies may be used only, or most often, with high-achieving students, while other strategies may be used more frequently with lower-achieving students. Furthermore, the data reported here are cross-sectional, whereas learning is cumulative: the instructional resources examined by this report reflect a single year, not those the student has experienced over 3, 7, or 11 years of schooling. In addition, the reader should remember that differences that are statistically significant may not be educationally significant.
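To illustrate the last point, the sketch below shows, with entirely hypothetical numbers, how a small difference in average scale scores can reach statistical significance when standard errors are small; whether such a gap matters educationally is a separate judgment. The group averages, standard errors, and the simple two-sample z comparison are assumptions made for illustration, not NAEP results or NAEP's exact procedure.

    from math import sqrt

    # Hypothetical group averages on the 0-500 NAEP scale and their standard errors.
    mean_a, se_a = 272.0, 0.9
    mean_b, se_b = 275.0, 1.0

    # Two-sample z statistic computed from the reported standard errors.
    z = (mean_b - mean_a) / sqrt(se_a ** 2 + se_b ** 2)
    print(f"difference = {mean_b - mean_a:.1f} points, z = {z:.2f}")

    # |z| > 1.96 indicates significance at the .05 level, yet a 3-point gap
    # on a 0-500 scale may or may not be educationally meaningful.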
Nevertheless, NAEP data are valuable, particularly when they are considered in light of other knowledge about the education system, such as trends in instructional reform, changes in the school-age population, and societal demands and expectations. Notably, they provide policy makers and administrators with a national benchmark against which to compare their own local policies and practices. Because of their basis in research, NAEP data also often help to inform our understanding of how school and instructional factors relate to achievement. Consequently, NAEP results can help practitioners to check the reasonableness of local findings in these areas. In addition, NAEP data can provide a detailed and research-based source of questions and approaches for examining local policy issues, conducting local studies, and creating local initiatives to change practice.
Overview of the Remainder of the Report
This report includes four chapters and two appendices. Chapter 2 considers the academic preparation, teaching certification, years of teaching experience, and continuing professional development of teachers who provide mathematics instruction to the nation's students. The third chapter describes the emphasis that mathematics instruction receives in our schools. In particular, it examines school policies regarding curriculum, graduation requirements, mathematics courses offered, and time allotted for mathematics instruction. The fourth chapter reports on resources, including the availability of calculators, that support mathematics learning. Finally, Appendix A includes more detailed procedural information on the NAEP 1996 mathematics assessment, while Appendix B includes standard error tables for the data presented in the body of the report.
In each of Chapters 2, 3, and 4, student performance data are often presented alongside data on background variables. The average mathematics composite scale score is the indicator of student achievement used in this report. 
Mathematics reform efforts since the publication of A Nation at Risk have championed the notion that more students should be ready to take algebra by the eighth grade.  In this report, therefore, the eighth-grade results are disaggregated by course enrollment (algebra, pre-algebra, and eighth-grade mathematics). This allows the reader to investigate how school policies and practices differ, if they do at all, depending on the type of mathematics course in which students are enrolled. 
Information on many of the variables is also provided for public school students by state or jurisdiction, using data from the 1996 mathematics NAEP state assessment.  The NAEP 1996 mathematics assessment was conducted nationally at grades 4, 8, and 12, and state-by-state at grades 4 and 8, with 44 states, the District of Columbia, Guam, the Department of Defense Domestic Dependent Elementary and Secondary Schools (DDESS), and the overseas Department of Defense Dependents Schools (DoDDS) participating. To ensure comparability across jurisdictions, NCES has established reporting guidelines related to school and student participation rates.  Results for jurisdictions failing to meet the required initial school participation rate of 70 percent are not reported, and jurisdictions failing to meet other participation guidelines are noted in the figures presenting state-by-state results.
In presenting the state-by-state data, jurisdictions are grouped into the following categories: "Percent Above the National Average," "Percent Does Not Differ from the National Average," and "Percent Below the National Average." Because all results are described in terms of the percentages of students affected (e.g., the percentage of students whose teachers have an undergraduate or graduate major in mathematics), jurisdictions "above the national average" are those in which the percentage of students affected was significantly higher than the national percentage. Similarly, jurisdictions that "do not differ from the national average" are those in which the percentage of students affected was not significantly different from the national percentage, and jurisdictions "below the national average" are those in which the percentage was significantly lower than the national percentage. Also included in the cross-jurisdiction figures are 1996 average mathematics composite scale scores for all students within a given jurisdiction. That is, the score data are for all students, not only for the particular category of students (e.g., students whose teachers majored in mathematics) being reported.
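As an illustration of this grouping rule, the following sketch classifies a single hypothetical jurisdiction by comparing its percentage to the national percentage with a simple z test. The function name, the percentages, the standard errors, and the 1.96 critical value are assumptions chosen for illustration; NAEP's actual significance procedures (including any adjustments for multiple comparisons) may differ.

    from math import sqrt

    def classify_jurisdiction(pct, se, national_pct, national_se, critical=1.96):
        """Group a jurisdiction's percentage relative to the national percentage."""
        # Simple z statistic based on the two reported standard errors
        # (an illustrative assumption, not NAEP's exact procedure).
        z = (pct - national_pct) / sqrt(se ** 2 + national_se ** 2)
        if z > critical:
            return "Percent Above the National Average"
        if z < -critical:
            return "Percent Below the National Average"
        return "Percent Does Not Differ from the National Average"

    # Hypothetical example: 48 percent of students (SE 2.1) versus a national
    # figure of 41 percent (SE 1.0).
    print(classify_jurisdiction(48.0, 2.1, 41.0, 1.0))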