
Search Results: (1-15 of 29 records)

NCES 200501 2000 NAEP -- 1999 TIMSS Linking Report
The report describes the methodology used in an attempt to link the 2000 NAEP assessments in mathematics and science to the 1999 TIMSS assessments in those subjects, and explains why the linking effort met with only limited success.
8/8/2005
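For readers unfamiliar with score linking, the general idea can be illustrated with a minimal mean-sigma sketch in Python. All means and standard deviations below are invented for illustration; this is not the methodology the report describes, and the figures are not actual NAEP or TIMSS statistics.

# Hedged sketch: a linear (mean-sigma) link projecting scores from one
# hypothetical scale onto another. All numbers are invented.
naep_mean, naep_sd = 275.0, 35.0     # hypothetical NAEP-like scale
timss_mean, timss_sd = 500.0, 100.0  # hypothetical TIMSS-like scale

A = naep_sd / timss_sd          # match the spread of the two scales
B = naep_mean - A * timss_mean  # match the centers

def timss_to_naep(y):
    """Project a hypothetical TIMSS-like score onto the NAEP-like scale."""
    return A * y + B

print(timss_to_naep(600))  # a score one SD above the TIMSS-like mean -> 310.0

A link of this kind assumes the two assessments measure the same construct in comparable populations; the report's finding of limited success reflects how demanding such assumptions are in practice.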
NCES 200319 NAEP Quality Assurance Checks of the 2002 Reading Assessment Results for Delaware
In March 2003, NCES asked the Human Resources Research Organization (HumRRO) to participate in a special study of the 2002 reading assessment results for Delaware. The study was based on seven technical factors aimed at exploring the data for potential problems: the sampling of Delaware students; the case weights of the Delaware data; the design for assigning test booklets; the scoring of Delaware data; scaling and equating for Delaware; coding of data in Delaware; and a breach in test security in Delaware.
8/25/2003
NCES 200314 Feasibility Studies of Two-Stage Testing in Large-Scale Educational Assessment: Implications for NAEP
This report discusses the rationale for enhancing the current NAEP design by adding a capacity for adaptive testing, in which items are tailored to the achievement level of the student. The authors conclude that implementation of adaptive testing procedures, two-stage testing in particular, has the potential to increase the usability and validity of NAEP results. Adaptive testing would permit adequately reliable scores to be reported to individual students and their parents, increasing their personal stake in performing well. Improvements in data quality would also speed data processing and permit delivery of assessment results in a timely manner.
5/21/2003
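As a rough illustration of the two-stage design discussed in the report, the sketch below routes an examinee to an easier or harder second-stage block based on a number-correct routing score. The block names, item counts, and cutoff are hypothetical, not an actual NAEP design.

# Hedged sketch of two-stage testing: a routing stage followed by a
# second stage targeted at the examinee's apparent achievement level.
# All thresholds and block contents are invented for illustration.

def score_block(responses):
    """Number-correct score for a block of dichotomously scored items."""
    return sum(responses)

def choose_second_stage(routing_responses, cutoff=5):
    """Route to the harder block when the routing score meets the cutoff."""
    if score_block(routing_responses) >= cutoff:
        return "hard_block"  # items targeted at higher achievement
    return "easy_block"      # items targeted at lower achievement

# Example: an 8-item routing block; 6 correct routes to the harder block.
print(choose_second_stage([1, 1, 1, 0, 1, 1, 1, 0]))  # -> hard_block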
NCES 200315 Computer Use and Its Relation to Academic Achievement in Mathematics, Reading, and Writing
In this study, the authors use evidence from the 1996 NAEP assessment in mathematics and the 1998 NAEP main assessments in reading and writing to examine patterns of computer use and achievement in each of these three academic domains. The authors conclude that the design of the NAEP data collection precludes using such data to draw even tentative conclusions about the relationship between achievement and computer use. They recommend further study, including a multi-site experiment to determine how teachers and students are using computers and the impact of computers on achievement.
5/21/2003
NCES 200316 Implications of Electronic Technology for the NAEP Assessment
This report emphasizes the need for NAEP to integrate the use of technology into its assessment procedures, reviews major options, and suggests priorities to guide the integration. The author identifies three short-term goals for this development: implementing a linear computer-administered assessment in a target subject area such as mathematics or science; developing and implementing a computer-administered writing assessment; and continuing the introduction and evaluation of technology-based test accommodations for handicapped students and English-language learners. The author suggests that NAEP consider redesign as an integrated electronic information system that would involve all aspects of the assessment process, including assessment delivery, scoring and interpretation, development of assessment frameworks, specification of populations and samples, collection of data, and preparation and dissemination of results.
5/21/2003
NCES 200317 The Effects of Finite Sampling on State Assessment Sample Requirements
This study addresses statistical techniques that might ameliorate some of the sampling problems currently facing states with small populations participating in State NAEP. The author explores how the application of finite population correction factors to the between-school component of variance could be used to modify the sample sizes required of states that currently qualify for exemptions from State NAEP's minimum sample requirements. He also examines how to preserve the infinite population assumptions for hypothesis testing related to comparisons between domain means. Results lend support to alternate sample size specifications both in states with few schools and in states with many small schools. The author notes that permitting states to use design options other than the current State NAEP requirement could reduce costs related to test administration, scoring, and data processing.
5/21/2003
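The finite population correction at the heart of the study is standard survey-sampling arithmetic: when n of a state's N schools are sampled without replacement, the between-school contribution to sampling variance shrinks by roughly (N - n) / (N - 1). A minimal sketch with invented numbers:

# Hedged sketch: finite population correction (FPC) applied to the
# between-school variance component. All quantities are hypothetical.

def fpc(N, n):
    """FPC for sampling n units from a population of N without replacement."""
    return (N - n) / (N - 1)

N_schools = 40              # hypothetical: a small state with 40 schools
n_sampled = 30              # hypothetical school sample
students_per_school = 25    # hypothetical within-school sample
between_school_var = 120.0  # hypothetical between-school variance component
within_school_var = 300.0   # hypothetical within-school variance component

# Approximate variance of the state mean: the FPC shrinks the
# between-school term because most of the population is in the sample.
var_mean = (fpc(N_schools, n_sampled) * between_school_var / n_sampled
            + within_school_var / (n_sampled * students_per_school))
print(var_mean)  # smaller than without the correction

Because the correction buys precision, a state with few schools can meet a precision target with a smaller sample, which is the basis for the alternate sample size specifications the results support.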
NCES 200311 Reporting the Results of the National Assessment of Educational Progress
This paper explores ways the results of NAEP data collections might be communicated to a variety of audiences, each with differing needs for information, interests in its findings, and sophistication in interpreting its results. The author describes “market-basket” reporting as a feasible alternative to traditional NAEP reporting. These reports would include samples of the items and exercises used in an assessment together with their scoring rubrics, giving a clearer picture of the kinds of skills assessed by NAEP, as well as an indication of skills not assessed. In the second section of the paper, the author cautions that, in order to uphold strict standards of data quality, NAEP reports must format and display results to make them more accessible while also discouraging readers from drawing overly broad interpretations of the data. A final section describes a detailed program of research on the reporting and dissemination of NAEP findings organized along three dimensions: the research questions to be asked, the audiences to whom the questions should be addressed, and the strategies through which the questions should be pursued, as well as the intersections of these dimensions. The author suggests that the highest priority be given to research on reporting through the public media, followed by making NAEP reporting more understandable and useful to school curriculum and instruction personnel, reporting to the public, and further research with state education personnel.
5/20/2003
NCES 200312 An Investigation of Why Students Do Not Respond to Questions
Developers of NAEP have substantially changed the mix of item types on assessments, decreasing the number of multiple-choice questions and increasing the number of short and extended constructed-response questions. At the same time, researchers have noted unacceptably high student nonresponse rates. These rates seem to vary with student characteristics such as gender and race, and they potentially confound NAEP reports, analyses, and subsequent conclusions. This small-scale, qualitative, exploratory study offers insights for designing future studies. Those studies might include quantitative analysis of existing NAEP data sets to determine whether observed patterns of association between omissions and student or item characteristics hold up over larger numbers of students and items than were included in this study.
5/20/2003
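The kind of quantitative follow-up the authors envision could start with a simple test of whether omission is independent of a student characteristic. A minimal sketch using scipy, with invented counts (these are not NAEP data):

# Hedged sketch: chi-square test of independence between item omission
# and a student characteristic. The counts are invented for illustration.
from scipy.stats import chi2_contingency

#          omitted  answered
table = [[   40,      960],   # hypothetical group A
         [   85,      915]]   # hypothetical group B

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")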
NCES 200313 A Study of Equating in NAEP
The authors detail a computer-simulation study conducted to investigate the amount of uncertainty added to NAEP estimates by equating error under three different equating methods, while varying a number of factors that might affect the accuracy of equating. Data from past NAEP administrations were used to guide the simulations, and error due to equating was estimated empirically. The authors conclude that the merits of less biased measurements may outweigh the problems caused by slight adjustments to previously reported scores. They recommend that long-term trend lines be periodically reanalyzed using methods, such as multiple-group IRT, that can minimize such biases.
5/20/2003
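Although the authors' simulation design is not reproduced here, the general shape of an equating-error study can be sketched: re-estimate a simple linear equating over many sampled replications and treat the spread of the converted scores as equating error. Everything below, including the choice of mean-sigma equating, is a hypothetical stand-in for the methods the study actually compared.

# Hedged sketch: empirical equating error by simulation. A mean-sigma
# linear equating is re-estimated per replication; the variability of
# the converted score is the equating error. All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)
reps, n = 1000, 500
converted = []
for _ in range(reps):
    x = rng.normal(150.0, 35.0, n)  # hypothetical scores, new form
    y = rng.normal(152.0, 34.0, n)  # hypothetical scores, reference form
    A = y.std(ddof=1) / x.std(ddof=1)
    B = y.mean() - A * x.mean()
    converted.append(A * 160.0 + B)  # equated value of a raw score of 160

print("equating SE at a raw score of 160:", np.std(converted, ddof=1))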
NCES 200306 The Validity of Oral Accommodation in Testing
This study examines the impact of oral presentation of a mathematics test on the performance of disabled and non-disabled students. It is an example of empirical research providing evidence for evaluating the validity and fairness of accommodation use. Both learning-disabled and non-disabled students improved their performance under the accommodated conditions, although learning-disabled students showed greater gains. The presence of an effect for the regular classroom students suggests that the accommodation overcomes irrelevant variance in the non-accommodated scores for both groups of students.
5/19/2003
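The reported pattern (gains for both groups, larger gains for learning-disabled students) is an interaction effect, which a difference-in-differences summary makes concrete. The group means below are invented, not results from the study:

# Hedged sketch: difference-in-differences summary of accommodation
# effects. All mean scores are hypothetical.
means = {
    ("LD", "standard"): 18.0,      ("LD", "oral"): 24.0,
    ("non-LD", "standard"): 30.0,  ("non-LD", "oral"): 32.5,
}

ld_gain = means[("LD", "oral")] - means[("LD", "standard")]
nonld_gain = means[("non-LD", "oral")] - means[("non-LD", "standard")]

print("LD gain:", ld_gain)          # both groups improve under oral
print("non-LD gain:", nonld_gain)   # presentation, but the LD gain
print("interaction:", ld_gain - nonld_gain)  # is larger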
NCES 200307 An Agenda for NAEP Validity Research
This report resulted from a systematic analysis undertaken by the NAEP Validity Studies Panel to consider the domain of validity threats to NAEP and to identify the most urgent validity research priorities. A framework of six broad categories was devised: 1) the constructs measured within each of NAEP’s subject domains; 2) the manner in which these constructs are measured; 3) the representation of the population; 4) the analysis of data; 5) the reporting and use of NAEP results; and 6) the assessment of trends. Panel subcommittees prepared papers laying out the critical validity issues in each area, and these papers are presented in chapters 2 through 7 of this report. The panel reviewed the papers and set priorities for each area of validity research by a consensus process. Sixteen suggested studies or areas of study were rated by the full panel. Four studies stood out as essential, and two others were rated between highly important and essential. The panel indicated unanimously that studies are “Essential” to evaluate the validity aspects of NAEP’s new role under the No Child Left Behind legislation, however that role is operationalized.
5/19/2003
NCES 200308 Improving the Information Value of Performance Items in Large Scale Assessments
The authors first provide a summary overview of what is already known about item types for future NAEP assessments and what still needs to be learned. Essentially, the question addressed here is: do constructed-response items provide more information about what students are capable of doing than multiple-choice items alone provide, and if so, what types of skills are tapped by the constructed-response items that are not measured by multiple-choice items? A fresh examination of the relationships between multiple-choice and constructed-response items is needed, according to the authors. They propose a set of studies that would provide the needed information about the added value of performance items in mixed-format assessments such as NAEP.
5/19/2003
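One way to frame the information question is through IRT item information functions: a multiple-choice item with guessing (3PL) against a 2PL item used here as a crude stand-in for a dichotomously scored constructed-response item. The parameters are invented for illustration:

# Hedged sketch: Fisher information for a 3PL item (multiple-choice,
# with guessing) versus a 2PL item (c = 0), a rough stand-in for a
# dichotomously scored constructed-response item. Parameters invented.
import math

def p_3pl(theta, a, b, c):
    """3PL probability of a correct response."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

def info_3pl(theta, a, b, c):
    """Standard 3PL item information; with c = 0 it reduces to the 2PL."""
    p = p_3pl(theta, a, b, c)
    return a**2 * ((1.0 - p) / p) * ((p - c) / (1.0 - c))**2

for theta in (-1.0, 0.0, 1.0):
    mc = info_3pl(theta, a=1.0, b=0.0, c=0.25)  # multiple-choice item
    cr = info_3pl(theta, a=1.0, b=0.0, c=0.0)   # constructed-response stand-in
    print(f"theta={theta:+.1f}: MC info={mc:.3f}, CR info={cr:.3f}")

Guessing flattens the multiple-choice information curve, one mechanism by which constructed-response items can add information; whether they do so in practice, and for which skills, is what the proposed studies would examine.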
NCES 200309 Optimizing State NAEP: Issues and Possible Improvements
The paper addresses three key topics related to making State NAEP more efficient: reducing the burden for the states, stabilizing the assessment schedule, and facilitating and promoting the use of State NAEP data. The author recommends promoting the use of State NAEP data for the continued success of the NAEP program. She suggests that this could involve devoting greater attention to how best to link state assessment and NAEP results, developing more timely and user-friendly reports, and working with states and other organizations to more effectively address the data needs of different NAEP audiences. She also proposes expending proportionately less of the State NAEP resources on data collection and more on disseminating information about the many uses of the program.
5/19/2003
NCES 200310 A Comparison of the NAEP and PIRLS Fourth-Grade Reading Assessments
In anticipation of questions about how the 2001 PIRLS fourth-grade assessment and the 2002 NAEP fourth-grade reading assessment compare, NCES convened an expert panel to compare the content of the PIRLS and NAEP assessments and determine whether they measure the same construct. This involved a close examination of how PIRLS and NAEP define reading, the texts used as the basis for the assessments, and the reading processes required of students in each.
4/7/2003
NCES 200206 The Measurement of Instructional Background Indicators: Cognitive Laboratory Investigations of the Responses of Fourth and Eighth Grade Students and Teachers to Questionnaire Items
To improve the National Assessment of Educational Progress (NAEP), cognitive lab interviews were conducted with 4th- and 8th-grade students and their teachers to evaluate the validity of responses to background and instructional questions. Frequent discrepancies were found between students' and teachers' responses to the same questions. Causes are analyzed, and the paper contains recommendations for collecting more valid information.
9/5/2002