Search Results: (1-15 of 191 records)
|NCES 2019417||Considerations for Using the School Courses for the Exchange of Data (SCED) Classification System in High School Transcript Studies: Applications for Converting Course Codes from the Classification of Secondary School Courses (CSSC)
The National Center for Education Statistics (NCES) began collecting high school transcripts with the High School and Beyond (HS&B) longitudinal study of students who were in the 10th grade in 1980. NCES has continued to collect transcripts for a secondary-level cohort in each subsequent decade. This report describes the two high school course coding systems used by NCES and the development of a crosswalk that allows data coded with the first system to be translated into the second system. It then provides tables with estimates generated using the two different systems.
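A crosswalk of this kind is essentially a lookup table from legacy codes to their replacements. The sketch below illustrates the idea in Python; the course codes, titles, and mappings are hypothetical placeholders, not actual CSSC or SCED values from the report.

```python
# Hypothetical sketch of a CSSC-to-SCED crosswalk lookup. The codes and
# titles below are illustrative only, not actual NCES values.

# Crosswalk table: legacy CSSC code -> (SCED code, SCED course title)
CSSC_TO_SCED = {
    "270404": ("02052", "Algebra I"),   # hypothetical mapping
    "270414": ("02056", "Algebra II"),  # hypothetical mapping
    "400524": ("03101", "Physics"),     # hypothetical mapping
}

def convert_course_code(cssc_code):
    """Translate a CSSC course code to its SCED equivalent.

    Returns None when the legacy code has no crosswalk entry, so
    unmapped courses can be flagged for manual review.
    """
    return CSSC_TO_SCED.get(cssc_code)

# Converting one transcript's course list; unmapped codes surface as
# None and can be reviewed separately.
transcript = ["270404", "400524", "999999"]
converted = [convert_course_code(c) for c in transcript]
unmapped = [c for c, r in zip(transcript, converted) if r is None]
```

In practice a real crosswalk also has to handle one-to-many and many-to-one matches between the two systems, which is why the report compares estimates generated under each coding scheme.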
|NCES 2019113||U.S. PIRLS and ePIRLS 2016 Technical Report and User's Guide
The U.S. PIRLS and ePIRLS 2016 Technical Report and User's Guide provides an overview of the design and implementation in the United States of the Progress in International Reading Literacy Study (PIRLS) and ePIRLS 2016, along with information designed to facilitate access to the U.S. PIRLS and ePIRLS 2016 data.
|NCES 2019046||Development of the 2018 Secondary School Course Taxonomy
This report describes the development of the Secondary School Course Taxonomy (SSCT), to be used with high school transcript coursetaking data that have been coded using the School Courses for the Exchange of Data (SCED). The SSCT aggregates the SCED-coded courses into 20 subject fields that align with how NCES has traditionally reported on career and technical education.
|NCES 2019031||Findings and Recommendations from the National Assessment of Educational Progress (NAEP) 2017 Pilot Study of the Middle School Transcript Study (MSTS): Methodological Report
This report summarizes the methodological findings of a pilot study that was designed to test the feasibility of collecting eighth-grade student transcript and course catalog data via electronic submissions. Transcript data were collected for eighth-grade students from Trial Urban District Assessment (TUDA) schools that participated in the NAEP 2017 eighth-grade mathematics and reading assessments.
|NCES 2018020||U.S. TIMSS 2015 and TIMSS Advanced 1995 & 2015 Technical Report and User's Guide
The U.S. TIMSS 2015 and TIMSS Advanced 1995 & 2015 Technical Report and User's Guide provides an overview of the design and implementation in the United States of the Trends in International Mathematics and Science Study (TIMSS) 2015 and TIMSS Advanced 1995 & 2015, along with information designed to facilitate access to the U.S. TIMSS 2015 and TIMSS Advanced 1995 & 2015 data.
|NCES 2018098||Measuring School Climate Using the 2015 School Crime Supplement: Technical Report
This report uses data from the 2015 School Crime Supplement (SCS) to the National Crime Victimization Survey (NCVS) to develop school climate measures and identify differences in scores for various student demographics including students experiencing or not experiencing criminal victimization and bullying.
|NCES 2018067||2015–16 Integrated Postsecondary Education Data System (IPEDS): Data Inconsistencies Between the Outcome Measures (OM) and Graduation Rates (GR) Survey Components
This data file documentation provides guidance and documentation to users of the Integrated Postsecondary Education Data System (IPEDS) data collected in the Graduation Rates (GR), 200 Percent Graduation Rates (GR200), and Outcome Measures (OM) survey components for the 2015–16 collection year. The purpose of the report is to document the data inconsistencies between the OM, GR, and GR200 survey components and describe the efforts made by the U.S. Department of Education’s National Center for Education Statistics (NCES) to improve data quality.
|NCES 2018195||2017-18 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2017-18 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2018183||Early Childhood Longitudinal Study, Kindergarten Class of 2010-11, First-Grade and Second-Grade Psychometric Report
This report describes the design, development, administration, quality control procedures, and psychometric characteristics of the direct and indirect child assessment instruments used to measure the knowledge, skills, and development of young children participating in the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) in the first- and second-grade data collections. The focus of this volume is the third through sixth rounds of data collection: the fall 2011 and spring 2012 first-grade and the fall 2012 and spring 2013 second-grade rounds. Changes to previously released kindergarten scores are also described.
|NCES 2018182||Early Childhood Longitudinal Study, Kindergarten Class of 2010-11, Kindergarten Psychometric Report
This report describes the design, development, administration, quality control procedures, and psychometric characteristics of the direct and indirect child assessment instruments used to measure the knowledge, skills, and development of young children participating in the Early Childhood Longitudinal Study, Kindergarten Class of 2010–11 (ECLS-K:2011) in the kindergarten year. The focus of this volume is the first two waves of data collection: the fall 2010 kindergarten wave and the spring 2011 kindergarten wave.
|NCES 2018121||Administering a Single-Phase, All-Adults Mail Survey: A Methodological Evaluation of the 2013 NATES Pilot Study
This report describes the methodological outcomes from an address-based sampling (ABS) mail survey, the 2013 pilot test of the National Adult Training and Education Survey. The study tested the feasibility of (1) using single-stage sampling, rather than two-stage sampling (with a screener to identify adults within households), and (2) mailing out three individual survey instruments per household versus a composite booklet with three combined instruments.
|NCES 2017095||Technical Report and User Guide for the 2015 Program for International Student Assessment (PISA)
This technical report and user guide is designed to provide researchers with an overview of the design and implementation of PISA 2015 in the United States, as well as information on how to access the PISA 2015 data. The report includes information about sampling requirements and sampling in the United States; participation rates at the school and student levels; how schools and students were recruited; instrument development; field operations used for collecting data; and details concerning various aspects of data management, including data processing, scaling, and weighting. In addition, the report describes the data available from both international and U.S. sources, special issues in analyzing the PISA 2015 data, and the procedure for merging data files.
|NCES 2017078||2016-17 Integrated Postsecondary Education Data System (IPEDS) Methodology Report
This report describes the universe, methods, and editing procedures used in the 2016-17 Integrated Postsecondary Education Data System (IPEDS) data collection.
|NCES 2017147||Best Practices for Determining Subgroup Size in Accountability Systems While Protecting Personally Identifiable Student Information
The Every Student Succeeds Act (ESSA) of 2015 (Public Law 114-95) requires each state to create a plan for its statewide accountability system. In particular, ESSA calls for state plans that include strategies for reporting education outcomes by grade for all students and for economically disadvantaged students, students from major racial and ethnic groups, students with disabilities, and English learners. In their plans, states must specify a single value for the minimum number of students needed to provide statistically sound data for all students and for each subgroup, while protecting personally identifiable information (PII) of individual students. This value is often referred to as the "minimum n-size."
Choosing a minimum n-size is complex and involves important and difficult trade-offs. For example, the selection of smaller minimum n-sizes will ensure that more students' outcomes are included in a state's accountability system, but smaller n-sizes can also increase the likelihood of the inadvertent disclosure of PII. Similarly, smaller minimum n-sizes enable more complete data to be reported, but they may also affect the reliability and statistical validity of the data.
To inform this complex decision, Congress required the Institute of Education Sciences (IES) of the U.S. Department of Education to produce and widely disseminate a report on "best practices for determining valid, reliable, and statistically significant minimum numbers of students for each of the subgroups of students" (Every Student Succeeds Act of 2015 (ESSA 2015), Public Law 114-95). Congress also directed that the report describe how such a minimum number "will not reveal personally identifiable information about students." ESSA prohibits IES from recommending any specific minimum number of students in a subgroup (Section 9209).
IES produced this report to assist states as they develop accountability systems that (1) comply with ESSA; (2) incorporate sound statistical practices and protections; and (3) meet the information needs of state accountability reporting, while still protecting the privacy of individual students.
As presented in this report, the minimum n-size refers to the lowest statistically defensible subgroup size that can be reported in a state accountability system. The minimum n-size a state establishes, together with the privacy protections it implements, directly determines how much data will be publicly reported in the system.
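The reporting rule described above can be sketched in a few lines of code. This is an illustrative simplification, not the report's methodology; the threshold value and subgroup names are hypothetical, and real systems layer additional protections (such as complementary suppression) on top of a simple minimum-n rule.

```python
# Illustrative sketch of a minimum n-size rule: subgroup results below
# the threshold are suppressed rather than published, to protect
# personally identifiable information. The threshold is hypothetical.

MIN_N = 30  # hypothetical minimum n-size chosen by a state

def report_subgroups(counts):
    """Return publishable subgroup counts, replacing any subgroup
    smaller than MIN_N with a suppression marker."""
    return {
        group: (n if n >= MIN_N else "suppressed")
        for group, n in counts.items()
    }

results = report_subgroups({
    "all_students": 412,
    "english_learners": 18,             # below threshold -> suppressed
    "students_with_disabilities": 45,
})
```

The trade-off the report analyzes is visible even in this toy version: lowering MIN_N would publish the English-learner count and make the system more inclusive, but small published counts are easier to link back to individual students.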
|NCES 2017012||NATES 2013: Nonresponse Bias Analysis Report
The 2013 National Adult Training and Education Survey (NATES) was a pilot study that tested the feasibility of using address-based sampling and a mailed questionnaire to collect data on the education, training, and credentials of U.S. adults. This report presents study findings related to nonresponse bias. Nonresponse adjustments corrected for bias on key outcome measures, but not for many background variables. Auxiliary data were found to be of potential use in correcting this bias.