Search Results: (1-15 of 48 records)
|NCES 2023055||Overview of the Middle Grades Longitudinal Study of 2017–18 (MGLS:2017): Technical Report
This technical report provides general information about the study and the data files and technical documentation that are available. Information was collected from students, their parents or guardians, their teachers, and their school administrators. The data collection included direct and indirect assessments of middle grades students’ mathematics, reading, and executive function, as well as indirect assessments of socioemotional development in 2018 and again in 2020. MGLS:2017 field staff provided additional information about the school environment through an observational checklist.
|NCES 2021029||2012–2016 Program for International Student Assessment Young Adult Follow-up Study (PISA YAFS): How reading and mathematics performance at age 15 relate to literacy and numeracy skills and education, workforce, and life outcomes at age 19
This Research and Development report provides data on the literacy and numeracy performance of U.S. young adults at age 19, as well as examines the relationship between that performance and their earlier reading and mathematics proficiency in PISA 2012 at age 15. It also explores how other aspects of their lives at age 19—such as their engagement in postsecondary education, participation in the workforce, attitudes, and vocational interests—are related to their proficiency at age 15.
|NFES 2020132||Forum Guide to Exit Codes
The Forum Guide to Exit Codes provides best practice information for tracking data about when students transferred, completed high school, dropped out, or otherwise exited an education agency. This resource defines exit codes and reviews their use in an education agency; provides an updated, voluntary, common taxonomy for exit codes; discusses best practices and methods for addressing specific challenges in exit codes data collection; and features case studies that highlight different education agencies’ approaches to and experiences with exit coding.
|REL 2020026||Relationships between Schoolwide Instructional Observation Scores and Student Academic Achievement and Growth in Low‑Performing Schools in Massachusetts
The Massachusetts Department of Elementary and Secondary Education (DESE), like other state education agencies and districts, recognizes that a key lever to turning around low-performing schools is the quality of instruction (Hill & Harvey, 2004; Hopkins, Harris, Watling, & Beresford, 1999). As part of the annual monitoring of state-designated low-performing schools, DESE’s external low-performing school monitors use Teachstone’s Classroom Assessment Scoring System (CLASS) tool to conduct observations. DESE’s external monitors rated low-performing schools on three domains of instruction—Emotional Support, Classroom Organization, and Instructional Support. This paper examines the relationships between these observation scores and academic growth and achievement within a school, after adjusting for the percentage of students with low incomes and the grade levels in these low-performing schools. Results show statistically significant positive relationships between schoolwide average observation scores for each instructional domain and school-level academic growth in both English language arts (ELA) and mathematics. On a 7-point scale, a 1-point increase in a school’s overall observation rating was associated with an increase in student growth of 4.4 percentile points in ELA and 5.1 percentile points in mathematics. For schoolwide achievement, which is measured by the percentage of students who met or exceeded expectations on the state assessment, results show a significant positive relationship between the Classroom Organization domain and ELA schoolwide achievement. There was no significant relationship between observation scores and ELA schoolwide achievement for any other domain, or between observation scores and mathematics schoolwide achievement.
The relationship between observation scores and current achievement levels may be weak because achievement levels may be influenced by many other factors including students’ prior achievement and the economic and social challenges their families face.
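The associations reported above can be read as simple linear slopes relating a school's average observation rating to its growth percentile. A minimal sketch, treating the quoted coefficients as plain slopes (the report's actual model also adjusts for income and grade-level composition, so this is illustrative only):

```python
# Slopes quoted in the abstract: growth percentile points associated with a
# 1-point change in the schoolwide CLASS observation rating (7-point scale).
SLOPES = {"ELA": 4.4, "math": 5.1}

def growth_change(subject, rating_change):
    """Associated change in school-level student growth percentile points
    for a given change in the school's overall observation rating.
    Illustrative linear reading of the abstract, not the report's full model."""
    return SLOPES[subject] * rating_change

print(growth_change("ELA", 1.0))   # a 1-point rating increase -> 4.4 points in ELA
```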
|NCES 2019084||Technology and K-12 Education: The NCES Ed Tech Equity Initiative
This interactive brochure provides an overview of the Initiative—including its purpose, goal, and target outcomes.
|NCES 2019086||Technology and K-12 Education: The NCES Ed Tech Equity Initiative: Framework
This factsheet describes the factors most critical to informing ed tech equity in the context of K-12 education.
|NCES 2019087||Technology and K-12 Education: The NCES Ed Tech Equity Initiative: Data Collection Priorities
This factsheet outlines the key subtopics NCES will prioritize in its ed tech equity data collections.
|NCES 2019031||Findings and Recommendations from the National Assessment of Educational Progress (NAEP) 2017 Pilot Study of the Middle School Transcript Study (MSTS): Methodological Report, NCES 2019-031
This report summarizes the methodological findings of a pilot study that was designed to test the feasibility of collecting eighth-grade student transcript and course catalog data via electronic submissions.
The transcript data of eighth-grade students from Trial Urban District Assessments (TUDA) schools that participated in the NAEP 2017 eighth-grade mathematics and reading assessments were collected.
|NCES 2018158||NAEP 2015 NIES Restricted-Use Data Files (Y46NIES)
This CD-ROM contains data and documentation files for the NAEP 2015 National Indian Education Study (NIES) for use in the analysis of NAEP data by secondary researchers. NIES, which was administered as part of the National Assessment of Educational Progress (NAEP), is a two-part study designed to describe the condition of education for American Indian and Alaska Native (AI/AN) students in the United States. The data files include NAEP mathematics and reading assessment data from the samples of AI/AN students at grades 4 and 8 who participated in the NAEP 2015 mathematics or reading assessments, as well as NIES survey response data from sampled AI/AN students in grades 4 and 8, their teachers, and their school principals. A Data Companion is provided in electronic portable document format (PDF). This document contains information on the contents and use of the data files as well as the assessment design and its implications for analysis. NAEP datasets from 2002 onward require a Tool Kit with the updated NAEPEX. Your organization must apply for and be granted a restricted-use data license in order to obtain these data.
|NCES 2017249||Collaborative Problem Solving Skills of 15-Year-Olds: Results From PISA 2015
The focus of this Data Point is on the performance of students in the United States relative to their peers in 50 other education systems that participated in the PISA collaborative problem solving assessment in 2015. The PISA assessment of collaborative problem solving measured students’ ability to solve a problem by sharing the understanding and effort required to come to a solution, and pooling their knowledge, skills, and effort to reach that solution. Readers interested in more detailed data related to collaborative problem solving should also visit the NCES PISA website for data tables and figures. Please visit https://nces.ed.gov/surveys/pisa/pisa2015/index.asp to learn more.
|WWC SSR82160||The impact of computer usage on academic performance: Evidence from a randomized trial at the United States Military Academy
The 2016 study, "The Impact of Computer Usage on Academic Performance: Evidence from a Randomized Trial at the United States Military Academy," examined the impacts of computer usage on the academic performance of college students. The study found that students who were permitted to use Internet-enabled devices in class scored lower on final exams than those in classes that prohibited the use of such devices. The impact estimate for the combined multiple choice and short answer portion of the final exam meets WWC group design standards without reservations. The impact estimate for the essay question portion of the final exam does not meet WWC group design standards because essays were graded only once and, therefore, the authors were unable to report a measure of reliability for these scores.
|REL 2017212||How are middle school climate and academic performance related across schools and over time?
The purpose of this study was to examine the relationship between school climate and academic performance in two different ways: (1) by comparing the academic performance of different schools with different levels of school climate and (2) by examining how changes in a school's climate were associated with changes in its students' academic achievement. To examine how school climate and academic performance are related, this study analyzed grade 7 student data from 2004/05 to 2010/11 from the California Healthy Kids Survey, the California Standardized Testing and Reporting program, and the California Basic Educational Data System for 978 middle schools in California. School climate was measured by a set of student survey questions that assessed students' perceptions about six domains of school climate. Schools with positive school climates were those in which students reported high levels of safety/connectedness, caring relationships with adults, and meaningful student participation, as well as low levels of substance use at school, bullying/discrimination, and student delinquency. Regression models were used to estimate the relationship between student-reported school climate and students' average academic performance across schools. Regression models were also used to estimate how, for a given school, academic performance changes as school climate changes. All models included controls for racial/ethnic composition, percentage of English learners, and percentage of students eligible for free/reduced-price meals. 
The study found that (1) middle schools with higher levels of positive student-reported school climate exhibited higher levels of academic performance; (2) increases in a school's level of positive student-reported school climate were associated with simultaneous increases in that school's academic achievement; and (3) within-school increases in academic achievement associated with school climate increases were substantially smaller than the academic performance differences across schools with different school climate levels. As positive school climate is continuing to gain more attention as a lever to improve student learning, there is increasing interest in how improvements in school climate are related to improvements in academic performance. Most studies examining the school climate-academic performance relationship compare the academic achievement across schools with different levels of school climate. Although the results of this study found that schools with high levels of positive school climate exhibited substantially higher levels of academic performance than their counterparts with low levels of positive school climate, such differences across schools were not an accurate guide for predicting the magnitude of school-specific gains in academic performance associated with increases in school climate.
|REL 2017172||English learner students' readiness for academic success: The predictive potential of English language proficiency assessment scores in Arizona and Nevada
The purpose of this study was to examine the relationship between the English language proficiency (ELP) assessment scores of English learner students in Arizona and Nevada and the students' subsequent performance on academic content tests in two key subject areas: English language arts (ELA) and mathematics. This study provided analyses focused on two samples of students: a cohort of students who were in grade 3 (elementary) in 2009/10 and a cohort of students who were in grade 6 (middle school) in 2009/10. In general, the study found that the higher the English learner students' English language proficiency level, the higher were their passing rates on the academic content assessments. In order to have at least a 50-percent probability of passing the academic content assessments in the two years following the ELP assessment, grade 6 English learner students in both Arizona and Nevada needed an ELP scale score that exceeded the threshold for English language proficiency, the minimum level for reclassification as fluent English proficient students and full-time placement in mainstream, English-only classes. To have a 50-percent probability of passing the academic content assessments, grade 6 English learner students had to score between 1 and 46 scale score points above the reclassification minimum, depending on the state and academic content test. On the other hand, grade 3 English learner students in Arizona and Nevada did not have to reach the proficiency threshold on the initial ELP assessment in order to have a 50-percent or higher probability of passing the ELA and mathematics content tests in the two years following the assessment. To have a 50-percent probability of passing the academic content assessments, grade 3 English learner students could score between 15 and 46 scale score points below the reclassification minimum, depending on the state and academic content test.
The probability of passing the ELA and math content tests when scoring at the minimum level for reclassification was always higher for grade 3 English learner students than for those in grade 6: at least 39 percentage points higher in math and at least 55 percentage points higher in ELA.
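The 50-percent thresholds described above correspond to the point where a model of passing probability as a function of ELP scale score crosses even odds. A minimal sketch using a logistic model with made-up coefficients (the study's actual model and estimates are not reproduced here):

```python
import math

def pass_probability(elp_score, intercept, slope):
    """Logistic model: probability of passing a content test at a given ELP
    scale score. intercept and slope are hypothetical illustration values,
    not the study's estimates."""
    return 1 / (1 + math.exp(-(intercept + slope * elp_score)))

def threshold_score(intercept, slope):
    """ELP score at which the predicted passing probability is exactly 50
    percent, i.e. where the linear predictor equals zero."""
    return -intercept / slope

# Hypothetical values: with these, a student would need an ELP score of 520,
# 20 points above an assumed reclassification minimum of 500.
intercept, slope = -10.4, 0.02
print(threshold_score(intercept, slope))  # ELP score needed for a 50% chance
```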
|REL 2016160||Survey methods for educators: Selecting samples and administering surveys (part 2 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data is often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about sampling or analysis approaches. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides that provide an overview of the survey research process designed for educators. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part two of this series, "Sampling Respondents and Survey Administration," outlines the following steps, drawn from the research literature:
1. Define the population
2. Specify the sampling procedure
3. Determine the sample size
4. Select the sample
5. Administer the survey
The guide provides detailed, real-world examples of how these steps have been used in a REL research alliance project. With this guide, educators will be able to develop their own sampling and survey administration plans.
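Step 3 above, determining the sample size, is commonly done with Cochran's formula plus a finite population correction. A minimal sketch for a proportion estimate at 95 percent confidence (a standard textbook approach, not necessarily the exact method in the guide):

```python
import math

def sample_size(population, margin=0.05, confidence_z=1.96, p=0.5):
    """Cochran's sample-size formula with finite population correction.

    population   -- size of the defined population (step 1)
    margin       -- acceptable margin of error for a proportion estimate
    confidence_z -- z-score for the desired confidence level (1.96 ~ 95%)
    p            -- assumed proportion; 0.5 is the most conservative choice
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                  # finite population correction
    return math.ceil(n)

# e.g. surveying teachers in a hypothetical district of 1,200
print(sample_size(1200))  # -> 292
```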
|REL 2016163||Survey methods for educators: Collaborative survey development (part 1 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data is often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about the survey development process. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides that provide an overview of the survey research process designed for educators. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part one of this series, "Collaborative Survey Development," outlines the following steps, drawn from the research literature:
1. Identify topics of interest
2. Identify relevant, existing survey items
3. Draft new survey items and adapt existing survey items
4. Review draft survey items with stakeholders and content experts
5. Refine the draft survey using cognitive interviewing
The guide provides detailed, real-world examples of how these steps have been used in REL research alliance projects. With this guide, educators will be able to develop their own surveys in collaboration with other practitioners, researchers, and content experts.