Search Results: (1-15 of 45 records)
|REL 2020026||Relationships between Schoolwide Instructional Observation Scores and Student Academic Achievement and Growth in Low-Performing Schools in Massachusetts
The Massachusetts Department of Elementary and Secondary Education (DESE), like other state education agencies and districts, recognizes that a key lever to turning around low-performing schools is the quality of instruction (Hill & Harvey, 2004; Hopkins, Harris, Watling, & Beresford, 1999). As part of the annual monitoring of state-designated low-performing schools, DESE's external low-performing school monitors use Teachstone's Classroom Assessment Scoring System (CLASS) tool to conduct observations. DESE's external monitors rated low-performing schools on three domains of instruction: Emotional Support, Classroom Organization, and Instructional Support. This paper examines the relationships between these observation scores and academic growth and achievement within a school, after adjusting for the percentage of students with low incomes and the grade levels in these low-performing schools. Results show statistically significant positive relationships between schoolwide average observation scores for each instructional domain and school-level academic growth in both English language arts (ELA) and mathematics. On a 7-point scale, a 1-point increase in a school's overall observation rating was associated with an increase in student growth of 4.4 percentile points in ELA and 5.1 percentile points in mathematics. For schoolwide achievement, measured as the percentage of students who met or exceeded expectations on the state assessment, results show a significant positive relationship between the Classroom Organization domain and ELA schoolwide achievement. There was no significant relationship between observation scores and ELA schoolwide achievement for any other domain, or between observation scores and mathematics schoolwide achievement for any domain. The relationship between observation scores and current achievement levels may be weak because achievement levels may be influenced by many other factors, including students' prior achievement and the economic and social challenges their families face.
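The school-level adjustment described above can be illustrated with a short regression sketch. This is not the authors' code; the data, column names, and values below are hypothetical, and the model simply mirrors the stated specification (growth regressed on the schoolwide CLASS score, adjusting for the percentage of low-income students and grade levels).

    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical school-level records; all values are illustrative only.
    schools = pd.DataFrame({
        "growth_ela":     [45.0, 52.0, 38.0, 61.0, 47.0, 56.0],  # median student growth percentile
        "class_score":    [4.1, 5.0, 3.2, 5.8, 4.4, 5.2],        # schoolwide mean CLASS rating (1-7)
        "pct_low_income": [78.0, 65.0, 90.0, 55.0, 82.0, 60.0],  # percent low-income students
        "grade_span":     ["K-5", "6-8", "K-5", "6-8", "K-5", "6-8"],
    })

    # Growth regressed on the observation score, adjusting for income and grade levels.
    model = smf.ols("growth_ela ~ class_score + pct_low_income + C(grade_span)",
                    data=schools).fit()

    # The class_score coefficient plays the role of the reported estimate
    # (e.g., 4.4 growth percentile points per 1-point rating increase in ELA).
    print(model.params["class_score"])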
|NCES 2019084||Technology and K-12 Education: The NCES Ed Tech Equity Initiative
This interactive brochure provides an overview of the Initiative—including its purpose, goal, and target outcomes.
|NCES 2019086||Technology and K-12 Education: The NCES Ed Tech Equity Initiative: Framework
This factsheet describes the factors most critical to informing ed tech equity in the context of K-12 education.
|NCES 2019087||Technology and K-12 Education: The NCES Ed Tech Equity Initiative: Data Collection Priorities
This factsheet outlines the key subtopics NCES will prioritize in its ed tech equity data collections.
|NCES 2019031||Findings and Recommendations from the National Assessment of Educational Progress (NAEP) 2017 Pilot Study of the Middle School Transcript Study (MSTS): Methodological Report, NCES 2019-031
This report summarizes the methodological findings of a pilot study that was designed to test the feasibility of collecting eighth-grade student transcript and course catalog data via electronic submissions.
Transcript data were collected for eighth-grade students in Trial Urban District Assessment (TUDA) schools that participated in the NAEP 2017 eighth-grade mathematics and reading assessments.
|NCES 2018158||NAEP 2015 NIES Restricted-Use Data Files (Y46NIES)
This CD-ROM contains data and documentation files for the NAEP 2015 National Indian Education Study (NIES) for use by secondary researchers in the analysis of NAEP data. NIES, which was administered as part of NAEP, is a two-part study designed to describe the condition of education for American Indian and Alaska Native (AI/AN) students in the United States. The data files include NAEP mathematics and reading assessment data from the samples of AI/AN students at grades 4 and 8 who participated in the NAEP 2015 mathematics or reading assessments, as well as NIES survey response data from the sampled AI/AN students in grades 4 and 8, their teachers, and their school principals. A Data Companion is provided in electronic portable document format (PDF); it contains information on the contents and use of the data files as well as the assessment design and its implications for analysis. NAEP datasets from 2002 onward require a Tool Kit with the updated NAEPEX. Your organization must apply for and be granted a restricted-use data license in order to obtain these data.
|NCES 2017249||Collaborative Problem Solving Skills of 15-Year-Olds: Results From PISA 2015
This Data Point focuses on the performance of students in the United States relative to their peers in 50 other education systems that participated in the PISA collaborative problem solving assessment in 2015. The assessment measured students' ability to solve a problem by sharing the understanding and pooling the knowledge, skills, and effort required to reach a solution. Readers interested in more detailed data on collaborative problem solving should also visit the NCES PISA website for data tables and figures. Please visit https://nces.ed.gov/surveys/pisa/pisa2015/index.asp to learn more.
|WWC SSR82160||The impact of computer usage on academic performance: Evidence from a randomized trial at the United States Military Academy
The 2016 study, "The Impact of Computer Usage on Academic Performance: Evidence from a Randomized Trial at the United States Military Academy," examined the impacts of computer usage on the academic performance of college students. The study found that students who were permitted to use Internet-enabled devices in class scored lower on final exams than students in classes that prohibited such devices. The impact estimate for the combined multiple choice and short answer portion of the final exam meets WWC group design standards without reservations. The impact estimate for the essay question portion of the final exam does not meet WWC group design standards because essays were graded only once, so the authors were unable to report a measure of reliability for these scores.
|REL 2017212||How are middle school climate and academic performance related across schools and over time?
The purpose of this study was to examine the relationship between school climate and academic performance in two ways: (1) by comparing the academic performance of schools with different levels of school climate and (2) by examining how changes in a school's climate were associated with changes in its students' academic achievement. To examine how school climate and academic performance are related, this study analyzed grade 7 student data from 2004/05 to 2010/11 from the California Healthy Kids Survey, the California Standardized Testing and Reporting program, and the California Basic Educational Data System for 978 middle schools in California. School climate was measured by a set of student survey questions that assessed students' perceptions about six domains of school climate. Schools with positive school climates were those in which students reported high levels of safety/connectedness, caring relationships with adults, and meaningful student participation, as well as low levels of substance use at school, bullying/discrimination, and student delinquency. Regression models were used to estimate the relationship between student-reported school climate and students' average academic performance across schools. Regression models were also used to estimate how, for a given school, academic performance changes as school climate changes. All models included controls for racial/ethnic composition, percentage of English learners, and percentage of students eligible for free/reduced-price meals. The study found that (1) middle schools with higher levels of positive student-reported school climate exhibited higher levels of academic performance; (2) increases in a school's level of positive student-reported school climate were associated with simultaneous increases in that school's academic achievement; and (3) within-school increases in academic achievement associated with school climate increases were substantially smaller than the academic performance differences across schools with different school climate levels. As positive school climate continues to gain attention as a lever to improve student learning, there is increasing interest in how improvements in school climate are related to improvements in academic performance. Most studies of the school climate-academic performance relationship compare academic achievement across schools with different levels of school climate. Although this study found that schools with high levels of positive school climate exhibited substantially higher academic performance than counterparts with low levels, such cross-school differences were not an accurate guide for predicting the magnitude of school-specific gains in academic performance associated with increases in school climate.
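The between-school versus within-school contrast in this study can be sketched as follows. This is a minimal illustration with made-up panel data, not the study's actual models (which also included the controls listed above); adding school indicators restricts the climate coefficient to within-school changes.

    import pandas as pd
    import statsmodels.formula.api as smf

    # Made-up school-year panel; values are illustrative only.
    panel = pd.DataFrame({
        "school":  ["A", "A", "B", "B", "C", "C"],
        "year":    [2005, 2006, 2005, 2006, 2005, 2006],
        "climate": [3.0, 3.4, 2.1, 2.2, 4.0, 4.5],
        "achieve": [650, 658, 600, 603, 700, 709],
    })

    # Between-school association: compares schools with different climate levels.
    between = smf.ols("achieve ~ climate", data=panel).fit()

    # Within-school association: school fixed effects absorb stable differences
    # across schools, so the coefficient reflects changes within each school.
    within = smf.ols("achieve ~ climate + C(school)", data=panel).fit()

    # Consistent with finding (3), the within-school estimate is smaller here.
    print(between.params["climate"], within.params["climate"])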
|REL 2017172||English learner students' readiness for academic success: The predictive potential of English language proficiency assessment scores in Arizona and Nevada
The purpose of this study was to examine the relationship between the English language proficiency (ELP) assessment scores of English learner students in Arizona and Nevada and the students' subsequent performance on academic content tests in two key subject areas: English language arts (ELA) and mathematics. The study analyzed two samples of students: a cohort of students who were in grade 3 (elementary) in 2009/10 and a cohort of students who were in grade 6 (middle school) in 2009/10. In general, the study found that the higher the English learner students' English language proficiency level, the higher their passing rates on the academic content assessments. To have at least a 50-percent probability of passing the academic content assessments in the two years following the ELP assessment, grade 6 English learner students in both Arizona and Nevada needed an ELP scale score that exceeded the threshold for English language proficiency, the minimum level for reclassification as fluent English proficient students and placement full time in mainstream, English-only classes. Specifically, grade 6 English learner students had to score between 1 and 46 scale score points above the reclassification minimum, depending on the state and academic content test. In contrast, grade 3 English learner students in Arizona and Nevada did not have to reach the proficiency threshold on the initial ELP assessment to have a 50-percent or higher probability of passing the ELA and mathematics content tests in the two years following the assessment; they could score between 15 and 46 scale score points below the reclassification minimum, depending on the state and academic content test. The probability of passing the ELA and math content tests when scoring at the minimum level for reclassification was always higher for grade 3 English learner students than for those in grade 6: at least 39 percentage points higher in math and at least 55 percentage points higher in ELA.
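The 50-percent thresholds reported above come from models of the probability of passing as a function of the ELP scale score. A minimal sketch of that calculation for a logistic model is shown below; the coefficients are hypothetical, not the study's estimates.

    import math

    # Hypothetical logistic model: P(pass) = 1 / (1 + exp(-(b0 + b1 * score))).
    b0, b1 = -7.2, 0.02   # illustrative intercept and slope on the ELP scale score

    def p_pass(score: float) -> float:
        """Predicted probability of passing the content test at a given ELP score."""
        return 1 / (1 + math.exp(-(b0 + b1 * score)))

    # The probability equals 0.5 exactly where b0 + b1 * score = 0.
    score_at_50 = -b0 / b1
    print(score_at_50, p_pass(score_at_50))   # 360.0 0.5

Comparing this threshold with the reclassification minimum indicates whether students must score above that minimum (as for grade 6) or may score below it (as for grade 3).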
|REL 2016160||Survey methods for educators: Selecting samples and administering surveys (part 2 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data are often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about sampling or analysis approaches. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides, designed for educators, that provide an overview of the survey research process. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part two of this series, "Sampling Respondents and Survey Administration," outlines the following steps, drawn from the research literature:
1. Define the population
2. Specify the sampling procedure
3. Determine the sample size
4. Select the sample
5. Administer the survey
The guide provides detailed, real-world examples of how these steps have been used in a REL research alliance project; a brief code sketch of steps 3 and 4 follows below. With this guide, educators will be able to develop their own sampling and survey administration plans.
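As an illustration of steps 3 and 4, the sketch below computes a sample size for estimating a proportion and draws a simple random sample. It is not from the guide; the confidence level, margin of error, and teacher roster are assumptions.

    import math
    import random

    def sample_size(population: int, margin: float = 0.05,
                    z: float = 1.96, p: float = 0.5) -> int:
        """Sample size for estimating a proportion, with a finite population
        correction; z = 1.96 corresponds to a 95 percent confidence level."""
        n0 = (z ** 2) * p * (1 - p) / margin ** 2   # size for an infinite population
        n = n0 / (1 + (n0 - 1) / population)        # finite population correction
        return math.ceil(n)

    # Hypothetical population: a roster of 2,400 teachers in a district.
    roster = [f"teacher_{i}" for i in range(2400)]

    n = sample_size(len(roster))        # step 3: about 332 respondents needed
    random.seed(1)                      # make the selection reproducible
    sample = random.sample(roster, n)   # step 4: simple random sample
    print(n, sample[:3])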
|REL 2016163||Survey methods for educators: Collaborative survey development (part 1 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data are often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about the survey development process. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides, designed for educators, that provide an overview of the survey research process. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part one of this series, "Collaborative Survey Development," outlines the following steps, drawn from the research literature:
1. Identify topics of interest
2. Identify relevant, existing survey items
3. Draft new survey items and adapt existing survey items
4. Review draft survey items with stakeholders and content experts
5. Refine the draft survey using cognitive interviewing
The guide provides detailed, real-world examples of how these steps have been used in REL research alliance projects. With this guide, educators will be able to develop their own surveys in collaboration with other practitioners, researchers, and content experts.
|NCES 2016144||The Condition of Education 2016
NCES has a mandate to report to Congress on the condition of education by June 1 of each year. The Condition of Education 2016 summarizes important developments and trends in education using the latest available data. The 2016 report presents 43 key indicators on the status and condition of education, grouped under four main areas: (1) population characteristics, (2) participation in education, (3) elementary and secondary education, and (4) postsecondary education. Also included in the report are 3 Spotlight indicators that provide a more in-depth look at some of the data.
|REL 2016119||Stated Briefly: How methodology decisions affect the variability of schools identified as beating the odds
This "Stated Briefly" report is a companion piece that summarizes the results of another report of the same name. Schools that show better academic performance than would be expected given characteristics of the school and student populations are often described as "beating the odds" (BTO). State and local education agencies often attempt to identify such schools as a means of identifying strategies or practices that might be contributing to the schools' relative success. Key decisions on how to identify BTO schools may affect whether schools make the BTO list and thereby the identification of practices used to beat the odds. The purpose of this study was to examine how a list of BTO schools might change depending on the methodological choices and selection of indicators used in the BTO identification process. This study considered whether choices of methodologies and type of indicators affect the schools that are identified as BTO. The three indicators were (1) type of performance measure used to compare schools, (2) the types of school characteristics used as controls in selecting BTO schools, and (3) the school sample configuration used to pool schools across grade levels. The study applied statistical models involving the different methodologies and indicators and documented how the lists schools identified as BTO changed based on the models. Public school and student data from one midwest state from 2007-08 through 2010-11 academic years were used to generate BTO school lists. By performing pairwise comparisons among BTO school lists and computing agreement rates among models, the project team was able to gauge the variation in BTO identification results. Results indicate that even when similar specifications were applied across statistical methods, different sets of BTO schools were identified. In addition, for each statistical method used, the lists of BTO schools identified varied with the choice of indicators. Fewer than half of the schools were identified as BTO in more than one year. The results demonstrate that different technical decisions can lead to different identification results.
|WWC SSR232||WWC Review of the Report "The Short-Term Effects of the Kalamazoo Promise Scholarship on Student Outcomes"
Researchers examined the impacts of the Kalamazoo Promise Scholarship program on academic and behavioral outcomes of students in grades 9–12 in Kalamazoo Public Schools (KPS). The Kalamazoo Promise Scholarship program offers college scholarships to graduating high school students in the KPS district. The percentage of tuition and fees covered depends on how long a student has attended school in the district: students attending since kindergarten receive 100% of tuition and fees, students attending since ninth grade receive a scholarship covering 65%, and students who enter KPS in tenth grade or later are not eligible. To assess the program's impacts, researchers compared the academic and behavioral outcomes of high school students before and after the Kalamazoo Promise Scholarship program was introduced. Student outcomes included in the study were grade point averages, whether students earned course credits, the number of course credits earned, the incidence of and number of days spent in suspension, and the incidence of and number of days spent in in-school detention. The study uses a quasi-experimental design in which baseline equivalence of the groups cannot be demonstrated; therefore, the research does not meet WWC group design standards.