(NCES 98-521)
The National Assessment of Educational Progress (NAEP) remains the only accurate and credible indicator of educational performance capable of informing the public about both national trends and state differences in student achievement. NAEP, also known as The Nation's Report Card, tests fourth-, eighth-, and twelfth-grade students in reading, writing, mathematics, science, history, geography, civics, the arts, foreign language, and economics. Over the years, however, NAEP has become less efficient and more complex and costly to administer. As a result, too few subjects may be tested, and results may not reach the public soon enough after the assessments. In addition, an assessment designed to provide only national or state-level measures may not be meeting the needs of states or their school districts. Many people have suggested that the time has come for NAEP to be redesigned, so that it can monitor the educational achievement of the nation's students in a more efficient and comprehensive manner and better serve the informational needs of local districts, states, and others -- without sacrificing its quality, accuracy, or reliability.[1]
NAEP serves many different constituencies whose opinions must figure heavily in determining the future directions NAEP should take. However, what these constituent groups want is usually not measured systematically; it is gleaned anecdotally or through processes of consultation or consensus building that are not systematic enough to give accurate indications of the group members' true views. Furthermore, the views of different groups are not usually compared and weighed in relation to one another. This study measured constituency opinions directly on key features and directions of NAEP.
Method
To measure, analyze, and compare the opinions of key NAEP constituencies directly and precisely, a multi-stage process was undertaken:
(1) Key constituencies and issues were identified through consultation with NCES and NAGB. Groups (constituencies) included Chief State School Officers, State Assessment Directors, State Curriculum Directors, Governors' Education Aides, State Board Chairs, State Legislature Education staff, Superintendents of large urban and suburban districts, and senior staff of national education organizations.
(2) Draft survey instruments to measure opinions about the issues were developed and pilot tested by a team of researchers from the American Institutes for Research (AIR) and the Education Statistics Services Institute (ESSI). After the instruments were reviewed and approved by the Office of Management and Budget (OMB), surveys were administered to representatives of the eight key constituencies (groups).
(3) Focus groups were conducted to assess the opinions of representatives of other constituencies on these issues. Although focus groups provide less precise indicators of how constituencies feel about NAEP than surveys do, logistical considerations required their use to give these other constituencies opportunities for input into the redesign process.
Only issues in the NAEP redesign that were still open -- those for which policy decisions had not been made -- were investigated in this project. The survey topics and constituents' responses are highlighted in the Results section of this summary.
Surveys
A draft NAEP Constituents Survey was developed and reviewed by NCES and NAGB. Three basic principles were followed during the development of the survey.
After review, pilot testing of the survey was conducted and minor modifications were made based on feedback from participants. Three versions of the survey were prepared to reflect the differences in knowledge and interests of different groups.
After OMB approval, the NAEP Constituents Survey was mailed to 424 constituents in late February 1997. All members of each group were surveyed; within each group, there was typically one representative from each state. Table A identifies the groups surveyed, the number of constituents surveyed, the number of surveys returned by each group, and the final response rates. Individuals in different roles within the states were strongly encouraged to respond from the perspective of their own roles rather than deferring to other respondents. Further, states were urged not to attempt to craft a unified response, so that the distinct opinions of each group across states could be measured.
Intensive follow-up procedures were used to increase response rates to the mail survey. Telephone calls to nonrespondents were made by trained AIR/ESSI staff in March and early April. With telephone follow-up, the overall response rate was 83 percent (352 completed surveys).
Survey analyses included summaries of response frequencies for all items and cross tabulations by constituent group. Statistical tests were conducted to determine whether differences in responses across the constituent groups were larger than would be expected by chance. The results of the analyses are summarized in the Results section of this summary.
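The summary does not reproduce the test computations, but for cross tabulations like these, a chi-square test of independence is a standard choice. Below is a minimal sketch in Python of how such a test could be run on one item; the item wording, response options, and cell counts are invented for illustration (only the row totals of 51, 48, and 43 returned surveys mirror Table A):

    # Sketch: chi-square test of independence on one hypothetical survey item.
    import pandas as pd
    from scipy.stats import chi2_contingency

    # Hypothetical response counts by constituent group.
    table = pd.DataFrame(
        {"Release results sooner": [38, 30, 21],
         "Keep full background detail": [13, 18, 22]},
        index=["Assessment Directors", "Curriculum Directors",
               "Chief State School Officers"],
    )

    # A small p-value indicates that the differences among the groups'
    # response distributions are unlikely to be due to chance alone.
    chi2, p_value, dof, expected = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")

The actual analyses would apply such comparisons across every item and all nine respondent groups.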
Table A. Overall Response Rate, by Respondent Group
---------------------------------------------------------------------------------------
                                                     Number of     Number of    Final
                                                     potential     surveys      response
Respondent Group                                     respondents   received     rate
---------------------------------------------------------------------------------------
State Education Agency Assessment Directors              52            51        98%
State Education Agency Curriculum Directors              51            48        94%
Large Suburban School District Superintendents           49            45        92%
Large Urban School District Superintendents              48            41        85%
Chief State School Officers                              51            43        84%
Staff of Education Associations                          19            16        84%
State Board of Education Chairpersons                    52            37        71%
State Legislature Education Committee Staff              51            36        71%
Governors' Education Policy Aides                        51            35        69%
---------------------------------------------------------------------------------------
Total                                                   424           352        83%
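Each final response rate in Table A is simply the number of surveys received divided by the number of potential respondents, rounded to the nearest whole percent. A short Python check, with counts transcribed from the table:

    # Recompute the response rates in Table A.
    counts = {
        "State Education Agency Assessment Directors": (52, 51),
        "State Education Agency Curriculum Directors": (51, 48),
        "Large Suburban School District Superintendents": (49, 45),
        "Large Urban School District Superintendents": (48, 41),
        "Chief State School Officers": (51, 43),
        "Staff of Education Associations": (19, 16),
        "State Board of Education Chairpersons": (52, 37),
        "State Legislature Education Committee Staff": (51, 36),
        "Governors' Education Policy Aides": (51, 35),
    }

    for group, (potential, received) in counts.items():
        print(f"{group}: {received / potential:.0%}")

    total_potential = sum(p for p, _ in counts.values())
    total_received = sum(r for _, r in counts.values())
    print(f"Total: {total_received}/{total_potential} = "
          f"{total_received / total_potential:.0%}")

Running this reproduces every rate in the table, including the 83 percent overall rate (352 of 424).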
Focus Groups
For some groups of constituents, administering the NAEP Constituents Survey was not feasible. Protocols for focus groups were therefore developed to parallel the structure of the basic survey. Seven focus groups were held to gather the input of:
(1) public high school principals,
(2) private high school principals,
(3) elementary school principals,
(4) members of the general and education press,
(5) the general public,
(6) members of national business organizations that conduct efforts to support and improve schools, and
(7) teachers.
The focus groups were not mixed across constituencies (e.g., elementary school principals were interviewed with other elementary school principals, not with teachers). In total, 46 individuals from 19 different states and the District of Columbia participated in the focus groups.
To solicit input from the research community, the NAEP Constituents Survey was sent to all current American Education Research Association (AERA) and National Council on Measurement in Education (NCME) officers and their immediate predecessors. Since their concerns were expected to be very different from those of the constituencies involved with the implementation of NAEP, these results were analyzed separately. They are reported in the Focus Group section.
Results
Several topics were covered in the NAEP Constituents Survey and the focus groups. The following sections highlight constituents' responses to each issue.
Emphasis of Background Information
Constituents were asked how important they felt each of four types of background characteristics was for NAEP to study, and their responses were used to set priorities among (1) school characteristics, (2) student background factors, (3) instructional practices, and (4) topics of current educational relevance.
Impact of Background Questions on Release of Results
Respondents were asked to consider the value of background items in light of their impact on when results could be released. Opinions were similar for all constituents.
Impact of Technical Documentation on Release of Results
Constituents were also asked to consider the necessity of technical documentation -- that is, the extensive documentation of the psychometric and other technical characteristics of the assessment -- in NAEP reports in light of its impact on the report release date.
Inclusion of a Parent Survey
Respondents were asked to consider the tradeoffs -- that is, the value of the information versus its costs -- of including a parent survey as part of NAEP. Opinions differed on this topic; however, most groups noted common benefits and potential concerns in collecting data from parents.
Schedule for Release of Results
Respondents were asked whether they prefer to have reports released as they become available (the current procedure) or as part of a predetermined schedule. Once again, responses differed somewhat between the survey and focus group participants.
State Mandates for NAEP Participation
State representatives were asked to indicate whether their state mandates NAEP participation.
Subjects Assessed at the State Level
State representatives were also asked to indicate how likely it would be that their state would participate in various years of NAEP state assessments.
Assessing Subject Areas in Combinations or Individually
Respondents were asked about their preferences for measuring and reporting results for social sciences and history, natural sciences, and reading/language arts; that is, whether they would prefer to have scores in each of the areas reported for each individual (component) subject or reported as a cluster.
Desire for Information on Skill Areas within Subjects
Constituents were asked to indicate how important subscale (skill area and discipline) scores, in addition to overall scores, were for the subjects of mathematics, reading, writing, science, history, and geography at each of the NAEP grade levels (4, 8, and 12). Similar preferences were noted by the survey and focus group respondents.
Frequency of Data Collections
In the past, NAEP tested students every other year. Congress recently authorized NAEP to collect data every year. Respondents were asked to indicate whether they would prefer a yearly administration, and if so, why they preferred it to a biennial schedule (i.e., administration of NAEP every other year).
Linking NAEP to International Assessments
Constituents were asked about the value of linking NAEP scores to international assessments, to allow individual state performance to be compared with the performance of other countries. Similar opinions were expressed by all constituent groups.
Obtaining State-Level Results
State education assessment and curriculum directors were presented with various approaches to obtaining state-level information on NAEP, including the option of a market basket approach, which would provide representative sets of assessment exercises, and the current, full state assessment. In the market basket approach, short modules representing the assessment would be offered to states. These modules could be used to obtain state-representative results, to calibrate state assessments to the NAEP scale, or to obtain state-level information more frequently than the NAEP-funded schedule would provide. Respondents were asked about their interest in using the current state NAEP assessment, the market basket assessment, or a combination of both measures.
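The summary does not specify how a market basket module would be used to calibrate a state assessment to the NAEP scale. One common illustration of the general idea is linear (mean-sigma) linking, sketched below with invented scores for a hypothetical sample of students who took both instruments:

    # Hypothetical sketch of linear (mean-sigma) linking: rescale state
    # assessment scores onto the NAEP scale using a sample of students
    # who took both the state test and a NAEP market basket module.
    # The linking method and all scores are illustrative assumptions;
    # the report does not specify a calibration procedure.
    from statistics import mean, stdev

    def linear_link(state_scores, naep_scores):
        """Return a function mapping state-test scores onto the NAEP scale."""
        slope = stdev(naep_scores) / stdev(state_scores)
        intercept = mean(naep_scores) - slope * mean(state_scores)
        return lambda x: slope * x + intercept

    # Invented paired scores for a linking sample of students.
    state = [410, 455, 470, 520, 535, 580, 600]
    naep = [205, 228, 231, 262, 270, 291, 302]

    to_naep = linear_link(state, naep)
    print(to_naep(500))  # a state score of 500 expressed on the NAEP scale

In practice, NAEP linking studies rely on far more elaborate item response theory methods; the sketch only illustrates mapping one score scale onto another.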
Constituents were also asked how important seven factors (i.e., state costs, school burden, psychometric test properties, ability to compare results with other states, ability to use results for within state comparisons, ability to obtain information on instructional practices and their relationship to student achievement, ability to obtain student-level results) were in evaluating alternative approaches for obtaining state NAEP results.
States Paying for Some Services
Respondents were asked to assess their state's willingness to pay for three different services: a state-level assessment using the current approach; linking NAEP results with the state's regular assessment; and the provision of extra market basket assessments for states to use as they wish.
Conclusions
Although the constituent groups did not have identical opinions on the issues discussed in the NAEP Constituents Survey and focus groups, there were many cases in which similar views were expressed, and a number of attitudes were common to most or all of the respondents.
FOOTNOTES:
[1] National Assessment Governing Board, Policy Statement on Redesigning the National Assessment of Educational Progress, August 2, 1996.
[2] Education researchers preferred to have NAEP results released as soon as they were available.
For more information about the content of this report, contact Arnold Goldstein at Arnold_Goldstein@ed.gov.