Search Results: (1-15 of 39 records)
Pub Number | Title | Date

WWC 2023008 | Using Bayesian Meta-Analysis to Explore the Components of Early Literacy Interventions | 9/12/2023
The What Works Clearinghouse (WWC) released a report that applies two methodological approaches new to the WWC that together aim to improve researchers' understanding of how early literacy interventions may work to improve outcomes for students in grades K-3. First, this report pilots a new taxonomy developed by early literacy experts and intervention developers as part of a larger effort to develop standard nomenclature for the components of literacy interventions. Then, the WWC uses Bayesian meta-analysis—a statistical method to systematically summarize evidence across multiple studies—to estimate the associations between intervention components and intervention impacts. Twenty-nine studies of 25 early literacy interventions that were previously reviewed by the WWC and met the WWC's rigorous research standards were included in the analysis. The analysis found that the components examined in this synthesis appear to have a limited role in explaining variation in intervention impacts on alphabetics outcomes, including phonics, phonemic awareness, phonological awareness, and letter identification. The analysis also identified positive associations between intervention impacts on alphabetics outcomes and components related to using student assessment data to drive decisions, including decisions about how to group students for instruction, and components related to non-academic student supports, including efforts to teach social-emotional learning strategies and outreach to parents and families. This report is exploratory because this synthesis cannot conclude that specific components caused improved alphabetics outcomes.
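
Since the abstract only names the method, here is a minimal sketch of a Bayesian random-effects meta-regression of the general kind described, written in Python with PyMC; the effect sizes, standard errors, and component indicators are hypothetical and are not drawn from the WWC report.

```python
import numpy as np
import pymc as pm

# Illustrative inputs only -- none of these values come from the WWC synthesis.
# effect: estimated impact (effect size) per study; se: its standard error;
# X: 0/1 indicators for two example intervention components.
effect = np.array([0.25, 0.10, 0.40, 0.05, 0.30, 0.15])
se = np.array([0.10, 0.08, 0.12, 0.09, 0.11, 0.07])
X = np.array([
    [1, 0],  # component 1: uses student assessment data to drive decisions
    [0, 1],  # component 2: provides non-academic student supports
    [1, 1],
    [0, 0],
    [1, 1],
    [0, 0],
])

with pm.Model():
    intercept = pm.Normal("intercept", mu=0.0, sigma=1.0)
    beta = pm.Normal("beta", mu=0.0, sigma=0.5, shape=X.shape[1])  # component associations
    tau = pm.HalfNormal("tau", sigma=0.5)                          # between-study heterogeneity
    theta = pm.Normal("theta",                                     # true study-level effects
                      mu=intercept + pm.math.dot(X, beta),
                      sigma=tau, shape=len(effect))
    pm.Normal("obs", mu=theta, sigma=se, observed=effect)          # observed effect sizes
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

# Posterior means of beta summarize how each component is associated with impacts.
print(idata.posterior["beta"].mean(dim=("chain", "draw")).values)
```

The coefficients on the component indicators play the role of the component-impact associations the report estimates; the actual WWC model, priors, and data are documented in the report itself.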

NCES 2023015 | Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Assessment Item Level File (ILF), Read Me | 8/16/2023
This ReadMe provides guidance and documentation for users of the Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) Assessment Item Level File (ILF) (NCES 2023-014), made available to researchers under a restricted-use license. Other supporting documentation includes MGLS_Math_and_Reading_Items_User_Guide.xlsx, MGLS_MS1_Math_Item_Images.pdf, MGLS_MS2_Math_Item_Images.pdf, MGLS_MS1_MS2_Reading_Sample_Item_Type_Images.pdf, MGLS_MS1_MS2_EF_HeartsFlowers_Instructions.pptx, and MGLS_MS2_EF_Spatial_2-back_Instructions.pptx.

NCES 2023014 | MGLS 2017 Assessment Item Level Files (ILF) | 8/16/2023
The Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) measured student achievement in mathematics and reading along with executive function. The MGLS:2017 ILF contains the item-level data from these direct measures, which can be used in psychometric research to replicate or enhance the scoring found in the MGLS:2017 restricted-use file (RUF) or to create new scores. The ILF contains two .csv files representing the two rounds of data collection: the MGLS:2017 Main Study (MS) Base Year (MS1) and the Main Study Follow-up (MS2) files.
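
As a rough illustration of how the two rounds might be read and combined for psychometric work, here is a hedged pandas sketch; the file names and the item-column naming convention are placeholders, not the actual names documented in the ReadMe (NCES 2023-015).

```python
import pandas as pd

# Placeholder file and column names -- the actual ILF .csv names and item
# variable names are documented in the restricted-use ReadMe and user guide.
ms1 = pd.read_csv("mgls2017_ms1_item_level.csv")   # Main Study Base Year (MS1)
ms2 = pd.read_csv("mgls2017_ms2_item_level.csv")   # Main Study Follow-up (MS2)

# Stack both rounds with a round indicator for cross-round analyses.
ilf = pd.concat([ms1.assign(round="MS1"), ms2.assign(round="MS2")], ignore_index=True)

# Example descriptive check: proportion correct per item by round, assuming
# scored item columns are prefixed with "item_" and coded 0/1.
item_cols = [c for c in ilf.columns if c.startswith("item_")]
print(ilf.groupby("round")[item_cols].mean())
```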

NCES 2023013 | User’s Manual for the MGLS:2017 Data File, Restricted-Use Version | 8/16/2023
This manual provides guidance and documentation for users of the Middle Grades Longitudinal Study of 2017–18 (MGLS:2017) restricted-use school and student data files (NCES 2023-131). An overview of MGLS:2017 is followed by chapters on the study data collection instruments and methods; direct and indirect student assessment data; sample design and weights; response rates; data preparation; data file content, including the composite variables; and the structure of the data file. Appendices include a psychometric report, a guide to scales, field test reports, and school and student file variable listings.

NCES 2023055 | Overview of the Middle Grades Longitudinal Study of 2017–18 (MGLS:2017): Technical Report | 3/16/2023
This technical report provides general information about the study and the data files and technical documentation that are available. Information was collected from students, their parents or guardians, their teachers, and their school administrators. The data collection included direct and indirect assessments of middle grades students’ mathematics, reading, and executive function, as well as indirect assessments of socioemotional development in 2018 and again in 2020. MGLS:2017 field staff provided additional information about the school environment through an observational checklist.

NCES 2022049 | U.S. Technical Report and User Guide for the 2019 Trends in International Mathematics and Science Study (TIMSS) | 10/17/2022
The U.S. TIMSS 2019 Technical Report and User’s Guide provides an overview of the design and implementation of TIMSS 2019 in the United States and includes guidance for researchers using the U.S. datasets. This information is meant to supplement the IEA’s TIMSS 2019 Technical Report and TIMSS 2019 User Guide by describing those aspects of TIMSS 2019 that are unique to the United States, including information on merging the U.S. public- and restricted-use student, teacher, and school data files with the U.S. data files in the international database.
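
As a hedged sketch of the merging step mentioned above, the following pandas code joins a U.S.-specific student file to an international student file on the usual TIMSS identifiers; the file names are placeholders, and the exact keys and formats should be taken from the U.S. and IEA user guides.

```python
import pandas as pd

# Placeholder file names; pandas.read_spss requires the pyreadstat package.
intl_students = pd.read_spss("intl_student_file.sav")    # international database student file
us_students = pd.read_spss("us_student_supplement.sav")  # U.S.-specific student variables

# IDCNTRY/IDSCHOOL/IDSTUD are the usual TIMSS linking identifiers; confirm the
# exact keys in the user guide before merging.
merged = intl_students.merge(
    us_students,
    on=["IDCNTRY", "IDSCHOOL", "IDSTUD"],
    how="left",
    validate="one_to_one",  # each student should appear at most once per file
)
print(merged.shape)
```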

NCES 2022033 | Split-Sample Administration of the 2019 School Crime Supplement to the National Crime Victimization Survey | 9/14/2022
The 2019 School Crime Supplement (SCS) included a randomized split-half experiment designed to compare two versions of an updated series of questions on bullying and to test changes in wording for several additional items. One of the principal comparisons in the experiment focused on the removal of the term “bullying” from the bullying series of items on the questionnaire. This report outlines the development, methodology, and results of the split-sample administration of the 2019 School Crime Supplement to the National Crime Victimization Survey.
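
To illustrate the logic of such a split-half comparison, here is a minimal sketch that compares the share of respondents reporting bullying under the two questionnaire versions; the counts are hypothetical, and a real SCS analysis would use the survey weights and the NCVS variance estimation procedures rather than this unweighted test.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts, not 2019 SCS results: respondents reporting bullying
# under version A (term "bullying" retained) and version B (term removed).
reported = [210, 250]    # respondents reporting victimization in each half-sample
assigned = [1500, 1500]  # respondents randomly assigned to each version

stat, pvalue = proportions_ztest(count=reported, nobs=assigned)
print(f"z = {stat:.2f}, p = {pvalue:.3f}")
```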

NCES 2022001 | National Household Education Surveys Program of 2019: Methodological Experiments Report | 6/22/2022
This report presents the methods and findings of the methodological experiments that were included in the 2019 administration of the National Household Education Surveys Program. These experiments were organized around three key themes: (1) better understanding how the offered response modes affect response rates; (2) increasing response by web; and (3) increasing response from specific demographic subgroups.

NCES 2021011 | Technical Report and User Guide for the 2018 Program for International Student Assessment (PISA): Data Files and Database with U.S.-Specific Variables | 7/8/2021
This technical report and user guide is designed to provide researchers with an overview of the design and implementation of PISA 2018, as well as with information on how to access the PISA 2018 data. This information is meant to supplement OECD publications by describing those aspects of PISA 2018 that are unique to the United States.

NCES 2021029 | 2012–2016 Program for International Student Assessment Young Adult Follow-up Study (PISA YAFS): How reading and mathematics performance at age 15 relate to literacy and numeracy skills and education, workforce, and life outcomes at age 19 | 6/15/2021
This Research and Development report provides data on the literacy and numeracy performance of U.S. young adults at age 19 and examines the relationship between that performance and their earlier reading and mathematics proficiency in PISA 2012 at age 15. It also explores how other aspects of their lives at age 19—such as their engagement in postsecondary education, participation in the workforce, attitudes, and vocational interests—are related to their proficiency at age 15.

REL 2021075 | Evaluating the Implementation of Networked Improvement Communities in Education: An Applied Research Methods Report | 3/8/2021
The purpose of this study was to develop a framework that can be used to evaluate the implementation of networked improvement communities (NICs) in public prekindergarten (PK)–12 education and to apply this framework to the formative evaluation of the Minnesota Alternative Learning Center Networked Improvement Community (Minnesota ALC NIC), a partnership between Regional Educational Laboratory Midwest, the Minnesota Department of Education, and five alternative learning centers (ALCs) in Minnesota. The partnership formed with the goal of improving high school graduation rates among students in ALCs.

The evaluation team developed and used research tools aligned with the evaluation framework to gather data from 37 school staff in the five ALCs participating in the Minnesota ALC NIC. Data sources included attendance logs, postmeeting surveys (administered following three NIC sessions), a post–Plan-Do-Study-Act survey, continuous improvement artifacts, and event summaries. The evaluation team used descriptive analyses for quantitative and qualitative data, including frequency tables to summarize survey data and coding of artifacts to indicate completion of continuous improvement milestones.

Engagement in the Minnesota ALC NIC was strong, as measured by attendance data and post–Plan-Do-Study-Act surveys, but the level of engagement varied by continuous improvement milestone. Based on postmeeting surveys, NIC members typically viewed the NIC as relevant and useful, particularly because of the opportunities to work within teams and develop relationships with staff from other schools. The percentage of meeting attendees agreeing that the NIC increased their knowledge and skills increased over time. Using artifacts from the NIC, the evaluation team determined that most of the teams completed most continuous improvement milestones. Whereas the post–Plan-Do-Study-Act survey completed by NIC members indicated that sharing among different NIC teams was relatively infrequent, contemporaneous meeting notes recorded specific instances of networking among teams.

This report illustrates how the evaluation framework and its aligned set of research tools were applied to evaluate the Minnesota ALC NIC. With slight adaptations, these tools can be used to evaluate the implementation of a range of NICs in public PK–12 education settings. The study has several limitations, including low response rates to postmeeting surveys, reliance on retrospective measures of participation in continuous improvement activities, and the availability of extant data for only a single Plan-Do-Study-Act cycle. The report includes suggestions for overcoming these limitations when applying the NIC evaluation framework to other NICs in public PK–12 education settings.
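
As a small illustration of the kind of descriptive summary described above, the sketch below builds a frequency table of postmeeting survey ratings by session with pandas; the responses shown are hypothetical, not the Minnesota ALC NIC data.

```python
import pandas as pd

# Hypothetical postmeeting survey responses: one row per respondent per session.
responses = pd.DataFrame({
    "session": [1, 1, 1, 2, 2, 2, 3, 3],
    "nic_was_useful": ["Agree", "Strongly agree", "Neutral",
                       "Agree", "Agree", "Strongly agree",
                       "Strongly agree", "Agree"],
})

# Frequency table of agreement ratings by NIC session.
print(pd.crosstab(responses["session"], responses["nic_was_useful"]))
```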

REL 2021057 | Tool for Assessing the Health of Research-Practice Partnerships | 2/2/2021
Education research-practice partnerships (RPPs) offer structures and processes for bridging research and practice and, ultimately, driving improvements in K-12 outcomes. To date, there is limited literature on how to assess the effectiveness of RPPs. Aligned to the most commonly cited framework for assessing RPPs, Assessing Research-Practice Partnerships: Five Dimensions of Effectiveness, this two-part tool offers guidance on how researchers and practitioners may prioritize the five dimensions of RPP effectiveness and their related indicators. The tool also provides an interview protocol that RPP evaluators can use to assess the extent to which the RPP demonstrates evidence of the prioritized dimensions and their indicators of effectiveness.

IES 2020001REV | Cost Analysis: A Starter Kit | 6/1/2020
This starter kit is designed for grant applicants who are new to cost analysis. The kit will help applicants plan a cost analysis, setting the foundation for more complex economic analyses.
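
As a rough sketch of the kind of calculation such a cost analysis builds up to, the code below totals illustrative ingredient costs and divides by the number of students served; every value is hypothetical and the categories are examples, not the kit's own worksheet.

```python
# All values are illustrative. A cost analysis itemizes the resources
# ("ingredients") a program uses, prices them, and relates total cost to
# the number of students served.
ingredients = {
    "teacher time (400 hours x $45 loaded wage)": 400 * 45.0,
    "instructional coach (0.5 FTE)": 35_000.0,
    "materials and software licenses": 6_000.0,
    "facilities and overhead": 4_500.0,
}

total_cost = sum(ingredients.values())
students_served = 250
cost_per_student = total_cost / students_served

for item, cost in ingredients.items():
    print(f"{item}: ${cost:,.0f}")
print(f"Total: ${total_cost:,.0f}; cost per student: ${cost_per_student:,.0f}")
```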

NCSER 2020001 | An Introduction to Adaptive Interventions and SMART Designs in Education | 11/25/2019
Educators must often adapt interventions over time because what works for one student may not work for another, and what works now for one student may not work in the future for the same student. Adaptive interventions provide education practitioners with a prespecified, systematic, and replicable way of doing this through a sequence of decision rules for whether, how, and when to modify interventions. The sequential, multiple assignment, randomized trial (SMART) is one type of multistage experimental design that can help education researchers build high-quality adaptive interventions. Despite the critical role adaptive interventions can play in various domains of education, research about adaptive interventions and the use of SMART designs to develop effective adaptive interventions in education is in its infancy. To help the field move forward in this area, the National Center for Special Education Research (NCSER) and the National Center for Education Evaluation and Regional Assistance (NCEE) commissioned a paper by leading experts in adaptive interventions and SMART designs. This paper aims to provide information on building and evaluating high-quality adaptive interventions, review the components of SMART designs, discuss the key features of the SMART, and introduce common research questions for which SMARTs may be appropriate.
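
To make the idea of a prespecified decision rule concrete, here is a hedged sketch of a two-stage adaptive intervention in Python; the intervention names, the response measure, and the cutoff are hypothetical, not drawn from the paper.

```python
# Hypothetical two-stage adaptive intervention of the kind a SMART is designed
# to build: a prespecified rule maps first-stage assignment and observed early
# response to a second-stage tactic.

def second_stage_tactic(first_stage: str, response_score: float, cutoff: float = 0.7) -> str:
    """Return the prespecified second-stage tactic for a student."""
    if response_score >= cutoff:
        # Responders continue the current support with progress monitoring.
        return f"continue {first_stage} with monitoring"
    # Non-responders receive an intensified or switched support.
    if first_stage == "small-group tutoring":
        return "add one-on-one tutoring sessions"
    return "switch to small-group tutoring plus family outreach"

# Example: a student assigned to small-group tutoring who is not yet responding.
print(second_stage_tactic("small-group tutoring", response_score=0.55))
```

In a SMART, students would be randomized at each stage so that alternative rules like this one can be compared empirically.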

NCES 2019113 | U.S. PIRLS and ePIRLS 2016 Technical Report and User's Guide | 8/27/2019
The U.S. PIRLS and ePIRLS 2016 Technical Report and User's Guide provides an overview of the design and implementation in the United States of the Progress in International Reading Literacy Study (PIRLS) and ePIRLS 2016, along with information designed to facilitate access to the U.S. PIRLS and ePIRLS 2016 data.