Search Results: (1-15 of 180 records)
|NCES 2017095||Technical Report and User Guide for the 2015 Program for International Student Assessment (PISA)
This technical report and user guide is designed to provide researchers with an overview of the design and implementation of PISA 2015 in the United States, as well as information on how to access the PISA 2015 data. The report covers sampling requirements and sampling in the United States; participation rates at the school and student levels; how schools and students were recruited; instrument development; field operations used for collecting data; and various aspects of data management, including data processing, scaling, and weighting. In addition, the report describes the data available from both international and U.S. sources, special issues in analyzing the PISA 2015 data, and how to merge data files.
|NCES 2017078||2016-17 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2016-17 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2017147||Best Practices for Determining Subgroup Size in Accountability Systems While Protecting Personally Identifiable Student Information
The Every Student Succeeds Act (ESSA) of 2015 (Public Law 114-95) requires each state to create a plan for its statewide accountability system. In particular, ESSA calls for state plans that include strategies for reporting education outcomes by grade for all students and for economically disadvantaged students, students from major racial and ethnic groups, students with disabilities, and English learners. In their plans, states must specify a single value for the minimum number of students needed to provide statistically sound data for all students and for each subgroup, while protecting personally identifiable information (PII) of individual students. This value is often referred to as the "minimum n-size."
Choosing a minimum n-size is complex and involves important and difficult trade-offs. For example, the selection of smaller minimum n-sizes will ensure that more students' outcomes are included in a state's accountability system, but smaller n-sizes can also increase the likelihood of the inadvertent disclosure of PII. Similarly, smaller minimum n-sizes enable more complete data to be reported, but they may also affect the reliability and statistical validity of the data.
To inform this complex decision, Congress required the Institute of Education Sciences (IES) of the U.S. Department of Education to produce and widely disseminate a report on "best practices for determining valid, reliable, and statistically significant minimum numbers of students for each of the subgroups of students" (Every Student Succeeds Act of 2015 (ESSA 2015), Public Law 114-95). Congress also directed that the report describe how such a minimum number "will not reveal personally identifiable information about students." ESSA prohibits IES from recommending any specific minimum number of students in a subgroup (Section 9209).
IES produced this report to assist states as they develop accountability systems that (1) comply with ESSA; (2) incorporate sound statistical practices and protections; and (3) meet the information needs of state accountability reporting, while still protecting the privacy of individual students.
As presented in this report, the minimum n-size is the lowest statistically defensible subgroup size that can be reported in a state accountability system. The minimum n-size a state establishes, together with the privacy protections it implements, directly determines how much data will be publicly reported in the system.
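The trade-off described above can be made concrete with a small suppression sketch: results for a subgroup are reported only when its size meets the minimum n. The subgroup names and counts below are hypothetical, chosen only to illustrate how raising the minimum n-size shrinks the share of students whose outcomes appear in public reports; this is not an example from the report itself.

```python
# Illustrative sketch (hypothetical data): how the choice of minimum
# n-size affects how many subgroup results can be publicly reported.

def reportable(subgroup_sizes, min_n):
    """Return the subgroups whose results can be reported (size >= min_n)."""
    return {name: n for name, n in subgroup_sizes.items() if n >= min_n}

# Hypothetical subgroup counts for one school.
school_subgroups = {
    "all students": 412,
    "economically disadvantaged": 38,
    "students with disabilities": 22,
    "English learners": 9,
}

for min_n in (10, 30):
    shown = reportable(school_subgroups, min_n)
    covered = sum(shown.values()) / sum(school_subgroups.values())
    print(f"min n = {min_n}: {len(shown)} of {len(school_subgroups)} "
          f"subgroups reported, covering {covered:.0%} of enrolled students")
```

With a minimum n of 10, only the smallest subgroup is suppressed; raising it to 30 suppresses two subgroups, trading inclusiveness for stronger PII protection and more stable estimates.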
|NCES 2017012||NATES 2013: Nonresponse Bias Analysis Report
The 2013 National Adult Training and Education Survey (NATES) was a pilot study that tested the feasibility of using address-based sampling and a mailed questionnaire to collect data on the education, training, and credentials of U.S. adults. This report presents study findings related to nonresponse bias. Nonresponse adjustments corrected for bias on key outcome measures, but not for many background variables. Auxiliary data were found to be of potential use in correcting this bias.
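The nonresponse adjustment mentioned above is typically a weighting-class adjustment: within each adjustment cell, respondents' base weights are scaled up so they carry the weight of the cell's nonrespondents. The sketch below shows the general technique only; the cells and weights are hypothetical, not NATES's actual adjustment.

```python
# Minimal sketch of a weighting-class nonresponse adjustment
# (hypothetical cells and weights; not the NATES procedure itself).

from collections import defaultdict

def adjust_weights(cases):
    """cases: list of (cell, base_weight, responded).
    Returns (cell, adjusted_weight) for respondents, scaled so that
    each cell's total weight is preserved."""
    total = defaultdict(float)       # all sampled cases per cell
    resp_total = defaultdict(float)  # respondents per cell
    for cell, w, responded in cases:
        total[cell] += w
        if responded:
            resp_total[cell] += w
    return [
        (cell, w * total[cell] / resp_total[cell])
        for cell, w, responded in cases if responded
    ]

sample = [
    ("urban", 100.0, True), ("urban", 100.0, False),
    ("rural", 150.0, True), ("rural", 150.0, True), ("rural", 150.0, False),
]
print(adjust_weights(sample))
```

The adjustment removes bias on variables that behave like the cell definitions; as the abstract notes, bias on other background variables can remain, which is where auxiliary data can help.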
|NCES 2017004||Split-Half Administration of the 2015 School Crime Supplement to the National Crime Victimization Survey
The 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS) administration contained an embedded, randomized split-half experiment to compare two versions of an updated series of questions on bullying. This report outlines the development, methodology, and results of the split-half experiment.
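A randomized split-half design of this kind supports a direct comparison of estimates from the two question versions, for example with a two-proportion z test. The counts below are hypothetical, purely to show the shape of the comparison; they are not SCS results, and the actual analysis would use survey weights and design-based variance estimates.

```python
# Hypothetical sketch of comparing a prevalence estimate between two
# randomly assigned question versions (not actual SCS data or methods).

from math import sqrt

def two_prop_z(x1, n1, x2, n2):
    """Two-proportion z statistic for H0: p1 == p2 (pooled SE)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                 # pooled proportion
    se = sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Version A: 210 of 1,000 report bullying; version B: 248 of 1,000.
z = two_prop_z(x1=210, n1=1000, x2=248, n2=1000)
print(round(z, 2))
```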
|NCES 2016111||2015-16 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2015-16 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2015098||2014-15 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2014-15 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2015141||2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12): Data File Documentation
This publication describes the methods and procedures used in the 2008/12 Baccalaureate and Beyond Longitudinal Study (B&B:08/12). The cohort members, who completed the requirements for a bachelor's degree during the 2007–08 academic year, were first interviewed as part of the 2008 National Postsecondary Student Aid Study (NPSAS:08) and then followed up in 2009 as part of B&B:08/09. B&B:08/12 is the second follow-up interview of this cohort. This report details the methodology and outcomes of the B&B:08/12 student interview data collection and administrative records matching.
|NCES 2015010||Teaching and Learning International Survey (TALIS) 2013: U.S. Technical Report
This technical report is designed to provide researchers with an overview of the design and implementation of the Teaching and Learning International Survey (TALIS) 2013. This information is meant to supplement that presented in OECD publications by describing those aspects of TALIS 2013 that are unique to the United States.
Chapter 2 provides information about sampling requirements and sampling in the United States. Chapter 3 provides information on instrument development. Chapter 4 describes how schools and teachers were recruited, and Chapter 5 describes field operations used for collecting data. Chapter 6 describes participation rates at the school and teacher levels and includes nonresponse bias analysis (NRBA) results for unit-level and item-level response rates (details of the NRBA are provided in appendix E). Chapter 7 describes international activities related to data processing and weighting. Chapter 8 describes the data available from both international and U.S. sources. Chapter 9 discusses some special issues involved in analyzing the TALIS 2013 U.S. data because of response rates below the international TALIS standards (as described in chapter 6) and also includes selected data tables from the international TALIS report. In addition, the technical report includes all recruitment materials used during the study, the U.S. versions of the TALIS questionnaires, and a complete list of all adaptations made to the questionnaires.
|NCES 2014067||2013-14 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2013-14 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2014025||Technical Report and User Guide for the Program for International Student Assessment (PISA) 2012
The Technical Report and User Guide for the PISA 2012 is a technical manual that describes how the U.S. data were collected and processed as well as how to use the data files to conduct statistical analyses. Information is presented on sampling, response rates, school and student recruitment, instrument development and distribution, and data management. The appendices of the Technical Report and User Guide include school recruitment materials, student and parent materials, student and school questionnaires, and a nonresponse bias analysis of PISA 2012 U.S. data.
|NCES 2013046||U.S. TIMSS and PIRLS 2011 Technical Report and User's Guide
The U.S. TIMSS and PIRLS 2011 Technical Report and User's Guide provides an overview of the design and implementation in the United States of the Trends in International Mathematics and Science Study (TIMSS) 2011 and the Progress in International Reading Literacy Study (PIRLS) 2011, along with information designed to facilitate access to the U.S. TIMSS and PIRLS 2011 data.
|NCES 2014041||2008/09 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) - Full-scale Methodology Report
The 2008/09 Baccalaureate and Beyond Longitudinal Study (B&B:08/09), conducted for the U.S. Department of Education's National Center for Education Statistics (NCES), collected information primarily about students' education and employment in the first year following receipt of their bachelor's degree.
This report describes the methodology and findings of the B&B:08/09 data collection, which included a student interview, a transcript data collection, and an administrative data records match.
|NCES 2013469||2011 NAEP-TIMSS Linking Study: Linking Methodologies and Their Evaluations
The 2011 NAEP-TIMSS linking study conducted by the National Center for Education Statistics (NCES) was designed to predict Trends in International Mathematics and Science Study (TIMSS) scores for the U.S. states that participated in the 2011 National Assessment of Educational Progress (NAEP) mathematics and science assessments of eighth-grade students. The purpose of the study was twofold: first, to see whether it is possible to predict TIMSS scores for states that did not participate in the TIMSS assessment; and second, to identify, from among the various methodologies suggested in the literature, a method for linking two somewhat different assessments.
This 2011 NAEP-TIMSS linking methodology paper was prepared to accompany U.S. States in a Global Context: Results From the 2011 NAEP-TIMSS Linking Study (NCES 2013-460).
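One of the standard families of methods evaluated in the linking literature is mean/sigma (linear) linking, which maps scores from one scale onto another by matching means and standard deviations. The sketch below shows that general technique only; the moments used are made up for illustration and are not the study's actual NAEP or TIMSS parameters.

```python
# Hedged sketch of mean/sigma (linear) linking between two score scales.
# All numeric values are hypothetical, not the 2011 study's estimates.

def linear_link(score, from_mean, from_sd, to_mean, to_sd):
    """Map a score onto the target scale by matching means and SDs."""
    return to_mean + to_sd * (score - from_mean) / from_sd

# Hypothetical moments: a NAEP score of 300 mapped onto a TIMSS-like scale.
predicted = linear_link(score=300, from_mean=283, from_sd=36,
                        to_mean=509, to_sd=78)
print(round(predicted, 1))
```

Other linking approaches in the literature (e.g., equipercentile or projection methods) relax the linear assumption; comparing such alternatives is the kind of evaluation the study describes.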
|NCES 2013293||2012-13 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2012-13 Integrated Postsecondary Education Data System (IPEDS) data collection.