
Search Results: (1-15 of 30 records)

 Pub Number  Title  Date
NFES 2019160 Forum Guide to Personalized Learning Data
The Forum Guide to Personalized Learning Data is designed to assist education agencies as they consider whether and how to use personalized learning. It provides an overview of personalized learning and describes best practices used by education agencies to collect data for personalized learning; to use those data to meet goals; and to support the relationships, resources, and systems needed for the effective use of data in personalized learning. Personalized learning is still developing in many locations; therefore, the concepts and examples provided are intended to facilitate idea sharing and discussion.
9/6/2019
NFES 2019035 Forum Guide to Early Warning Systems
The Forum Guide to Early Warning Systems provides information and best practices to help education agencies plan, develop, implement, and use an early warning system in their agency to inform interventions that improve student outcomes. The document includes a review of early warning systems and their use in education agencies and explains the role of early warning indicators, quality data, and analytical models in early warning systems. It also describes how to adopt an effective system planning process and recommends best practices for early warning system development, implementation, and use. The document highlights seven case studies from state and local education agencies that have implemented, or are in the process of implementing, an early warning system.
11/16/2018
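As an illustration of the kind of indicator logic described in the entry above, the sketch below flags students using the common attendance, behavior, and course performance ("ABC") indicators. The thresholds and field names are illustrative assumptions, not recommendations taken from the guide.

```python
# Minimal sketch of an early warning indicator check, assuming the common
# "ABC" indicators (attendance, behavior, course performance). The thresholds
# below are illustrative assumptions, not values from the Forum Guide.

from dataclasses import dataclass


@dataclass
class StudentRecord:
    student_id: str
    attendance_rate: float      # proportion of days attended, 0.0-1.0
    behavior_incidents: int     # discipline referrals this term
    courses_failed: int         # core courses failed this term


def flag_students(records, attendance_min=0.90, incidents_max=2, failures_max=0):
    """Return student IDs that trip any illustrative early warning threshold."""
    flagged = []
    for r in records:
        reasons = []
        if r.attendance_rate < attendance_min:
            reasons.append("attendance")
        if r.behavior_incidents > incidents_max:
            reasons.append("behavior")
        if r.courses_failed > failures_max:
            reasons.append("course performance")
        if reasons:
            flagged.append((r.student_id, reasons))
    return flagged


if __name__ == "__main__":
    roster = [
        StudentRecord("S001", 0.95, 0, 0),
        StudentRecord("S002", 0.82, 1, 1),
        StudentRecord("S003", 0.97, 4, 0),
    ]
    for student_id, reasons in flag_students(roster):
        print(student_id, "flagged for:", ", ".join(reasons))
```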
NFES 2018156 Forum Guide to Facility Information Management: A Resource for State and Local Education Agencies
The Forum Guide to Facility Information Management: A Resource for State and Local Education Agencies helps state and local education agencies plan, design, build, use, and improve their facility information systems. It includes a review of why school facilities data matter and recommends a five-step process that an education agency can undertake to develop a robust facility information system around goals, objectives, and indicators. The document also includes selected measures of school facilities quality and offers a logical approach to organizing facility and site data elements associated with facility identification, condition, design, utilization, management, and budget and finance.
7/19/2018
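To make the element categories named in the entry above concrete, the sketch below groups a handful of assumed facility data fields under identification, condition, design, utilization, management, and budget and finance. The field names are hypothetical and do not reproduce the guide's element list.

```python
# Minimal sketch of how facility and site data elements might be grouped into
# the categories named in the guide. Field names are illustrative assumptions.

from dataclasses import dataclass
from typing import Optional


@dataclass
class FacilityRecord:
    # Identification
    facility_id: str
    facility_name: str
    site_address: str
    # Condition
    condition_rating: Optional[str] = None      # e.g., "good", "fair", "poor"
    last_inspection_year: Optional[int] = None
    # Design
    gross_square_feet: Optional[int] = None
    design_capacity: Optional[int] = None
    # Utilization
    current_enrollment: Optional[int] = None
    # Management
    ownership_status: Optional[str] = None      # e.g., "owned", "leased"
    # Budget and finance
    annual_maintenance_cost: Optional[float] = None


def utilization_rate(record: FacilityRecord) -> Optional[float]:
    """Enrollment as a share of design capacity, if both are reported."""
    if record.design_capacity and record.current_enrollment is not None:
        return record.current_enrollment / record.design_capacity
    return None
```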
NCEE 20174027 Multi-armed RCTs: A design-based framework
Design-based methods have recently been developed as a way to analyze data for impact evaluations of interventions, programs, and policies. The estimators are derived using the building blocks of experimental designs with minimal assumptions, and have important advantages over traditional model-based impact methods. This report extends the design-based theory for the single treatment-control group design to designs with multiple research groups. It discusses how design-based estimators found in the literature need to be modified for multi-armed designs when comparing pairs of research groups to each other. It also discusses multiple comparison adjustments when conducting hypothesis tests across pairwise contrasts to identify the most effective interventions. Finally, it discusses the complex assumptions required to identify and estimate the complier average causal effect (CACE) parameter for multi-armed designs.
8/1/2017
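A minimal sketch of the kind of pairwise contrast analysis the report above addresses: a simple difference in means for each pair of arms with a Bonferroni adjustment across contrasts. The simulated data, the Welch t-test, and the choice of adjustment are illustrative assumptions, not the report's estimators.

```python
# Pairwise treatment-control and treatment-treatment contrasts in a
# multi-armed RCT, with a Bonferroni multiple-comparison adjustment.
# Illustrative sketch only; the report covers a range of estimators
# and adjustment procedures.

from itertools import combinations

import numpy as np
from scipy import stats


def pairwise_contrasts(outcomes_by_arm, alpha=0.05):
    """Difference-in-means for every pair of arms, Bonferroni-adjusted."""
    pairs = list(combinations(outcomes_by_arm.keys(), 2))
    adjusted_alpha = alpha / len(pairs)          # Bonferroni adjustment
    results = []
    for a, b in pairs:
        ya, yb = np.asarray(outcomes_by_arm[a]), np.asarray(outcomes_by_arm[b])
        impact = ya.mean() - yb.mean()
        # Unequal-variance (Welch) t-test as a simple illustrative test
        t_stat, p_value = stats.ttest_ind(ya, yb, equal_var=False)
        results.append((a, b, impact, p_value, p_value < adjusted_alpha))
    return results


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    arms = {
        "control": rng.normal(0.0, 1.0, 200),
        "treatment_1": rng.normal(0.2, 1.0, 200),
        "treatment_2": rng.normal(0.3, 1.0, 200),
    }
    for a, b, impact, p, significant in pairwise_contrasts(arms):
        print(f"{a} vs {b}: impact={impact:+.3f}, p={p:.3f}, significant={significant}")
```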
NCEE 20174025 What is Design-Based Causal Inference for RCTs and Why Should I Use It?
Design-based methods have recently been developed as a way to analyze data from impact evaluations of interventions, programs, and policies. The approach uses the building blocks of experimental designs to develop impact estimators with minimal assumptions. The methods apply to randomized controlled trials and quasi-experimental designs with treatment and comparison groups. Although the fundamental concepts that underlie design-based methods are straightforward, the literature on these methods is technical, with detailed mathematical proofs required to formalize the theory. This brief aims to broaden knowledge of design-based methods by describing their key concepts and how they compare to traditional model-based methods, such as hierarchical linear modeling (HLM). Using simple mathematical notation, the brief is geared toward researchers with a good knowledge of evaluation designs and HLM.
7/25/2017
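For readers unfamiliar with the approach described above, the notation below sketches the simplest design-based estimator for a single treatment-control design. It is written here as an illustration, not as the brief's own formulas.

```latex
% Illustrative notation for the simple (non-clustered) design-based estimator.
\[
\hat{\beta} \;=\; \bar{y}_T - \bar{y}_C
   \;=\; \frac{1}{n_T}\sum_{i \in T} y_i \;-\; \frac{1}{n_C}\sum_{i \in C} y_i ,
\qquad
\widehat{\mathrm{Var}}(\hat{\beta}) \;=\; \frac{s_T^2}{n_T} + \frac{s_C^2}{n_C},
\]
where $s_T^2$ and $s_C^2$ are the sample variances of the outcome in the
treatment and control groups. Under the Neyman--Rubin--Holland model this
variance estimator is conservative for the average treatment effect.
```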
NFES 2016095 Forum Guide to Elementary/Secondary Virtual Education Data
The Forum Guide to Elementary/Secondary Virtual Education Data was developed to assist education agencies as they: 1) consider the impact of virtual education on established data elements and methods of data collection, and 2) address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education.
2/5/2016
NCEE 20154013 A Guide to Using State Longitudinal Data for Applied Research
State longitudinal data systems (SLDSs) promise a rich source of data for education research. SLDSs contain statewide student data that can be linked over time and to additional data sources for education management, reporting, improvement, and research, and ultimately for informing education policy and practice.

Authored by Karen Levesque, Robert Fitzgerald, and Joy Pfeiffer of RTI International, this guide is intended for researchers who are familiar with research methods but who are new to using SLDS data, are considering conducting SLDS research in a new state environment, or are expanding into new topic areas that can be explored using SLDS data. The guide also may be useful for state staff as background for interacting with researchers and may help state staff and researchers communicate across their two cultures. It highlights the opportunities and constraints that researchers may encounter in using state longitudinal data systems and offers approaches to addressing some common problems.
6/30/2015
NFES 2015158 Forum Guide to Alternative Measures of Socioeconomic Status in Education Data Systems
The Forum Guide to Alternative Measures of Socioeconomic Status in Education Data Systems provides “encyclopedia-type” entries for eight plausible alternative measures of socioeconomic status (SES) to help readers better understand the implications of collecting and interpreting a range of SES-related data in education agencies. Chapter 1 reviews recent changes in how SES data are collected in many education agencies and presents a call to action to the education community. Chapter 2 reviews practical steps an agency can take to adopt new measures. Chapter 3 describes each of the eight alternative measures, including potential benefits, challenges, and limitations of each option.
6/22/2015
NFES 2015157 Forum Guide to College and Career Ready Data
The Forum Guide to College and Career Ready Data examines how data are being used to support CCR initiatives. Chapter 1 presents an overview of college and career readiness. Chapter 2 focuses on five specific uses for data to support CCR programs: fostering individualized learning for students; supporting educators in addressing student-specific needs; guiding CCR programmatic decisions through the use of postsecondary feedback loops; measuring agency progress in meeting CCR accountability and continuous improvement goals; and maximizing career opportunities for all students. Each of the use cases includes policy and program questions to consider, a list of key data needs, useful analytics, suggested feedback to request from data users, and emerging needs related to the data use. Chapter 3 outlines a number of overarching issues for the use of CCR data, and Chapter 4 summarizes key points and emerging needs identified throughout the Guide.
6/5/2015
NCEE 20154011 Statistical Theory for the RCT-YES Software: Design-Based Causal Inference for RCTs
This Second Edition report updates the First Edition, published in June 2015, which presents the statistical theory underlying the RCT-YES software; the software estimates and reports impacts for RCTs across a wide range of designs used in social policy research. The preface to the new report summarizes the updates from the previous version. The report discusses a unified, non-parametric design-based approach to impact estimation using the building blocks of the Neyman-Rubin-Holland causal inference model that underlies experimental designs. This approach differs from the model-based impact estimation methods that are typically used in education research. The report discusses impact and variance estimation, asymptotic distributions of the estimators, hypothesis testing, the inclusion of baseline covariates to improve precision, the use of weights, subgroup analyses, baseline equivalency analyses, and estimation of the complier average causal effect parameter.
6/2/2015
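The sketch below illustrates two ideas mentioned in the entry above: the simple difference-in-means impact and the inclusion of a baseline covariate to improve precision. It is a generic illustration on simulated data, not the RCT-YES estimators or variance formulas.

```python
# Design-based impact estimate for a simple two-group RCT, with and without a
# baseline covariate included to improve precision. Illustrative sketch only.

import numpy as np


def impact_difference_in_means(y, t):
    """Simple difference in mean outcomes between treatment (t==1) and control."""
    y, t = np.asarray(y, float), np.asarray(t, int)
    return y[t == 1].mean() - y[t == 0].mean()


def impact_covariate_adjusted(y, t, x):
    """OLS of y on treatment and a centered baseline covariate; the treatment
    coefficient is the covariate-adjusted impact estimate."""
    y, t, x = np.asarray(y, float), np.asarray(t, float), np.asarray(x, float)
    design = np.column_stack([np.ones_like(y), t, x - x.mean()])
    coefs, *_ = np.linalg.lstsq(design, y, rcond=None)
    return coefs[1]


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n = 500
    baseline = rng.normal(0, 1, n)                  # pre-test score
    treat = rng.integers(0, 2, n)                   # random assignment
    outcome = 0.25 * treat + 0.8 * baseline + rng.normal(0, 1, n)
    print("unadjusted impact:", round(impact_difference_in_means(outcome, treat), 3))
    print("adjusted impact:  ", round(impact_covariate_adjusted(outcome, treat, baseline), 3))
```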
NFES 2014802 Forum Guide to School Courses for the Exchange of Data (SCED) Classification System
This guide was developed by the National Forum on Education Statistics (Forum) to accompany the release of SCED Version 2.0 Course Codes at http://nces.ed.gov/forum/SCED.asp. It includes an overview of the SCED structure and descriptions of the SCED Framework elements, recommended attributes, and information for new and existing users on best practices for implementing and expanding their use of SCED.
6/27/2014
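As a small illustration of the code structure described above, the sketch below splits a five-digit SCED course code into its two-digit Course Subject Area and three-digit Course Identifier. The subject-area labels shown are an assumed, partial subset; the documentation at the URL above remains the authoritative source.

```python
# Minimal sketch of splitting a five-digit SCED course code into its
# Course Subject Area and Course Identifier components. The subject-area
# labels are an illustrative subset only.

SUBJECT_AREAS = {
    "01": "English Language and Literature",
    "02": "Mathematics",
    "03": "Life and Physical Sciences",
}


def parse_sced_code(code: str) -> dict:
    """Split a SCED course code such as '02052' into its component parts."""
    code = code.strip()
    if len(code) != 5 or not code.isdigit():
        raise ValueError(f"Expected a five-digit SCED course code, got {code!r}")
    subject_area = code[:2]
    return {
        "course_code": code,
        "subject_area": subject_area,
        "subject_area_label": SUBJECT_AREAS.get(subject_area, "unknown"),
        "course_identifier": code[2:],
    }


if __name__ == "__main__":
    print(parse_sced_code("02052"))
```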
NCEE 20144017 Understanding Variation in Treatment Effects in Education Impact Evaluations: An Overview of Quantitative Methods
This report summarizes the complex research literature on quantitative methods for assessing how impacts of educational interventions on instructional practices and student learning differ across students, educators, and schools. It also provides technical guidance about the use and interpretation of these methods. The research topics addressed include: subgroup (moderator) analyses based on study participants’ characteristics measured before the intervention is implemented; subgroup analyses based on study participants’ experiences, mediators, and outcomes measured after program implementation; and impact estimation when treatment effects vary. The focus is on randomized controlled trials, but the methods are also applicable to quasi-experimental designs.
5/13/2014
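The sketch below illustrates the first topic listed in the entry above, a subgroup (moderator) analysis that interacts the treatment indicator with a baseline subgroup indicator. The simulated data and OLS specification are illustrative assumptions, not the report's recommended procedure.

```python
# Subgroup (moderator) analysis sketch: test whether the treatment impact
# differs across two baseline-defined subgroups via an interaction term.

import numpy as np
from scipy import stats


def moderator_analysis(y, treat, subgroup):
    """OLS of y on treatment, subgroup, and their interaction.
    Returns the interaction coefficient (difference in impacts) and its p-value."""
    y, treat, subgroup = (np.asarray(a, float) for a in (y, treat, subgroup))
    X = np.column_stack([np.ones_like(y), treat, subgroup, treat * subgroup])
    coefs, _, _, _ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coefs
    n, k = X.shape
    sigma2 = resid @ resid / (n - k)
    cov = sigma2 * np.linalg.inv(X.T @ X)
    interaction, se = coefs[3], np.sqrt(cov[3, 3])
    p_value = 2 * stats.t.sf(abs(interaction / se), df=n - k)
    return interaction, p_value


if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n = 1000
    treat = rng.integers(0, 2, n)
    subgroup = rng.integers(0, 2, n)          # indicator measured at baseline
    # True impact: 0.10 for subgroup == 0, 0.30 for subgroup == 1
    y = 0.10 * treat + 0.20 * treat * subgroup + rng.normal(0, 1, n)
    diff, p = moderator_analysis(y, treat, subgroup)
    print(f"difference in impacts across subgroups: {diff:.3f} (p = {p:.3f})")
```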
NFES 2014801 Forum Guide to Supporting Data Access for Researchers: A Local Education Agency Perspective
This publication recommends a set of core practices, operations, and templates that can be adopted and adapted by LEAs as they consider how to respond to requests for both new and existing data about the education enterprise.
12/4/2013
NCSER 20133000 Translating the Statistical Representation of the Effects of Education Interventions Into More Readily Interpretable Forms
This new Institute of Education Sciences (IES) report assists with the translation of effect size statistics into more readily interpretable forms for practitioners, policymakers, and researchers. This paper is directed to researchers who conduct and report education intervention studies. Its purpose is to stimulate and guide researchers to go a step beyond reporting the statistics that represent group differences. With what is often very minimal additional effort, those statistical representations can be translated into forms that allow their magnitude and practical significance to be more readily understood by those who are interested in the intervention that was evaluated.
11/28/2012
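One common translation of this kind, sketched below, converts a standardized mean difference into the percentile standing of the average treated student in the control distribution (and the corresponding improvement index), assuming approximately normal outcomes. This is a standard translation from the literature, not necessarily the specific forms presented in the report.

```python
# Translate a standardized effect size into percentile terms under a
# normality assumption. Illustrative sketch only.

from scipy.stats import norm


def translate_effect_size(d: float) -> dict:
    """Translate a standardized mean difference (e.g., Hedges' g) into
    percentile terms, assuming approximately normal outcomes."""
    percentile = norm.cdf(d) * 100          # average treated student's percentile
    return {
        "effect_size": d,
        "percentile_of_average_treated_student": round(percentile, 1),
        "improvement_index_percentile_points": round(percentile - 50, 1),
    }


if __name__ == "__main__":
    # An effect size of 0.25 moves the average student from the 50th to
    # roughly the 60th percentile of the control distribution.
    print(translate_effect_size(0.25))
```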
NCEE 20124019 Using an Experimental Evaluation of Charter Schools to Test Whether Nonexperimental Comparison Group Methods Can Replicate Experimental Impact Estimates

This NCEE Technical Methods Paper compares the estimated impacts of the offer of charter school enrollment using an experimental design and a non-experimental comparison group design. The study examined four different approaches to creating non-experimental comparison groups: ordinary least squares regression modeling, exact matching, propensity score matching, and fixed effects modeling. The data for the study are from students in the districts and grades that were represented in an experimental design evaluation of charter schools conducted by the U.S. Department of Education in 2010 (for more information, see http://ies.ed.gov/ncee/pubs/20104029/index.asp).

The study found that none of the comparison group designs reliably replicated the impact estimates from the experimental design study. However, the use of pre-intervention baseline data that are strongly predictive of the key outcome measures considerably reduced, but did not eliminate, the estimated bias in the non-experimental impact estimates. Estimated impacts based on matched comparison groups were more similar to the experimental estimates than were those based on regression adjustment alone; the differences were moderate in size, although not statistically significant.

4/26/2012
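To make the comparison-group approaches named above concrete, the sketch below contrasts an OLS regression-adjusted estimate with a one-to-one nearest-neighbor propensity score match on a baseline score, using simulated data. The specifications are illustrative assumptions and are not the study's data or models.

```python
# Contrast a regression-adjusted (OLS) impact estimate with a nearest-neighbor
# propensity score match on a baseline score. Illustrative sketch only.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors


def ols_adjusted_impact(y, treat, baseline):
    """Treatment coefficient from OLS of the outcome on treatment and baseline score."""
    X = np.column_stack([np.ones_like(y), treat, baseline])
    coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coefs[1]


def propensity_matched_impact(y, treat, baseline):
    """Match each treated unit to its nearest comparison unit on the
    estimated propensity score, then take the mean matched difference."""
    X = baseline.reshape(-1, 1)
    scores = LogisticRegression().fit(X, treat).predict_proba(X)[:, 1]
    treated, comparison = np.where(treat == 1)[0], np.where(treat == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(scores[comparison].reshape(-1, 1))
    _, idx = nn.kneighbors(scores[treated].reshape(-1, 1))
    matched = comparison[idx.ravel()]
    return (y[treated] - y[matched]).mean()


if __name__ == "__main__":
    rng = np.random.default_rng(3)
    n = 2000
    baseline = rng.normal(0, 1, n)                       # pre-intervention test score
    # Selection into the "treatment" school depends on the baseline score
    treat = (rng.uniform(size=n) < 1 / (1 + np.exp(-baseline))).astype(int)
    outcome = 0.15 * treat + 0.7 * baseline + rng.normal(0, 1, n)
    print("OLS-adjusted impact:      ", round(ols_adjusted_impact(outcome, treat, baseline), 3))
    print("Propensity-matched impact:", round(propensity_matched_impact(outcome, treat, baseline), 3))
```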