Search Results: (1-15 of 627 records)
Pub Number | Title | Date
REL 2025008 | Teacher Certification, Retention, and Recruitment in Palau: Understanding Graduation Patterns of Teacher Education Students at Palau Community College | 12/2/2024
To strengthen teacher preparation in the Republic of Palau, Public Law 10-32 (enacted in 2018) requires all teachers in the country to hold an associate’s degree in education or in the subject area they will teach by the end of 2023. This policy change required many current teachers and those interested in the teaching profession to enroll in an associate’s degree program at Palau Community College (PCC), the country’s only postsecondary institution. To support policymakers’ understanding of how long it takes teachers and teacher candidates to meet the requirements of Public Law 10-32, this study examined the graduation patterns of teacher education students enrolled in associate’s degree programs at PCC. The results of this study will support PCC and the Palau Ministry of Education’s efforts to successfully train, retain, and recruit qualified teachers.
REL 2025007 | Evidence and Gap Map of Tier 2 Literacy Interventions for Grades K–3 in the Commonwealth of the Northern Mariana Islands | 11/25/2024
The Commonwealth of the Northern Mariana Islands Public School System requested a systematic review of Tier 2 literacy interventions for students in grades K–3. This review defines a Tier 2 literacy intervention as a supplemental instructional program for students who require support in addition to the Tier 1 core reading program. Of the 267 studies on Tier 2 literacy interventions identified, 20 met What Works Clearinghouse 5.0 standards with or without reservations. Two interventions—Reading Recovery and Literacy First—had strong evidence of positive effects (as defined by the Every Student Succeeds Act) on students’ literacy skills. One additional intervention—Achieve3000—had moderate evidence of positive effects. This report includes an evidence and gap map and a supplemental matrix that highlights implementation strategies used in each intervention.
REL 2025009 | Stabilizing School Performance Indicators in New Jersey to Reduce the Effect of Random Error | 10/21/2024
The Every Student Succeeds Act of 2015 requires states to use a variety of indicators, including standardized tests and attendance records, to designate schools for support and improvement based on schoolwide performance and the performance of groups of students within schools. Schoolwide and group-level performance indicators are also diagnostically relevant for district-level and school-level decisionmaking outside the formal accountability context. Like all measurements, performance indicators are subject to measurement error, with some having more random error than others. Measurement error can have an outsized effect for smaller groups of students, rendering their measured performance unreliable, which can lead to misidentification of groups with the greatest needs. Many states address the reliability problem by excluding from accountability student groups smaller than an established threshold, but this approach sacrifices equity, which requires counting students in all relevant groups. With the aim of improving reliability, particularly for small groups of students, this study applied a stabilization model called Bayesian hierarchical modeling to group-level data (with groups assigned according to demographic designations) within schools in New Jersey. Stabilization substantially improved the reliability of test-based indicators, including proficiency rates and median student growth percentiles. The stabilization model used in this study was less effective for non-test-based indicators, such as chronic absenteeism and graduation rate, for several reasons related to their statistical properties. When stabilization is applied to the indicators best suited for it (such as proficiency and growth), it leads to substantial changes in the lists of schools designated for support and improvement. These results indicate that, applied correctly, stabilization can increase the reliability of performance indicators for processes using these indicators, simultaneously improving accuracy and equity.
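The stabilization idea described above can be illustrated with a minimal empirical-Bayes shrinkage sketch: noisy group-level proficiency rates are pulled toward the overall mean in proportion to their sampling error, so small groups shrink more than large ones. The data, function name, and moment-based variance estimate below are illustrative assumptions; the study's actual Bayesian hierarchical model is more elaborate.

```python
# Minimal empirical-Bayes "stabilization" sketch: shrink each group's
# observed proficiency rate toward the grand mean, weighted by reliability.
# Hypothetical data; the study's hierarchical model is richer than this.

def stabilize(rates, sizes):
    """rates: observed proportion proficient per student group
    sizes: number of tested students per group"""
    total = sum(sizes)
    grand_mean = sum(r * n for r, n in zip(rates, sizes)) / total
    # Between-group variance via a simple method-of-moments estimate,
    # floored at a small positive value.
    between = max(
        sum(n * (r - grand_mean) ** 2 for r, n in zip(rates, sizes)) / total
        - grand_mean * (1 - grand_mean) / (total / len(sizes)),
        1e-6,
    )
    stabilized = []
    for r, n in zip(rates, sizes):
        sampling_var = grand_mean * (1 - grand_mean) / n  # binomial error
        weight = between / (between + sampling_var)       # reliability weight
        stabilized.append(weight * r + (1 - weight) * grand_mean)
    return stabilized

# A group of 10 students moves much closer to the mean than a group of 400.
print(stabilize([0.20, 0.55, 0.60], [10, 400, 350]))
```

The reliability weight is the classic shrinkage factor: when a group's sampling variance dwarfs the between-group variance, its observed rate is mostly noise and the stabilized estimate leans on the grand mean.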
REL 2024006 | Strengthening the Pennsylvania School Climate Survey to Inform School Decisionmaking | 8/29/2024
This study analyzed Pennsylvania School Climate Survey data from students and staff in the 2021/22 school year to assess the validity and reliability of the elementary school student version of the survey; approaches to scoring the survey in individual schools at all grade levels; and perceptions of school climate across student, staff, and school groups. The survey encourages data-informed efforts in participating Pennsylvania schools to foster supportive learning environments that promote social and emotional wellness for students and staff. The study validated the elementary school student survey but found that one domain—safe and respectful school climate—did not meet the reliability threshold and thus suggests that revisions are needed. At all grade levels noninstructional staff had the most positive perceptions of school climate, followed by classroom teachers and then students. The study found that different approaches to combining the school climate scores of students, teachers, and noninstructional staff within schools yielded slightly different distributions of school climate summary index scores. It also found that different performance category thresholds resulted in similar distributions of schools across categories. Scores calculated using simple averages were strongly and positively correlated with scores calculated using a more complex approach (Rasch models), suggesting that both approaches deliver similar information. School climate scores varied across student groups (defined by race/ethnicity, gender, and grade level) within schools and across school groups. Larger schools and schools with higher percentages of Black students tended to have lower school climate scores than other schools. The findings can inform the Pennsylvania Department of Education’s decisionmaking on revisions to the elementary school student survey, approaches to scoring and reporting survey results, and efforts to increase participation in future survey administrations.
REL 2024005 | Examining Implementation and Outcomes of the Project On-Track High-Dosage Literacy Tutoring Program | 8/26/2024
School districts in northeastern Tennessee have had persistently low proficiency rates in grade 3 English language arts, which were exacerbated by disruptions in schooling due to the COVID-19 pandemic. In response, the Niswonger Foundation, a technical assistance provider that supports these districts, developed Project On-Track, a high-dosage, small-group literacy tutoring program for students in grades 1–3. Its online adaptive program, Amplify Reading, groups students by skill level and generates mini-lessons aligned to the science of reading that are delivered by tutors. Although the content of the tutoring sessions is highly structured, Project On-Track offers schools flexibility in how they implement the program, including when they provide tutoring, who provides tutoring, in which grade levels they offer tutoring, and how they identify students within a grade level for tutoring. This flexibility can make it easier for schools to adopt the program, particularly rural schools, which may face greater challenges in hiring tutors or delivering tutoring outside of school hours. However, variation in implementation may also affect program effectiveness. To inform future implementation of the program, this study describes the characteristics of students who participated in a full year of Project On-Track and how schools implemented the program, with a focus on three implementation features: when and how frequently tutoring is offered and who provides it. By reporting on the association between variations in implementation and student literacy scores, the study offers important insights to inform future program implementation. The study found no differences in student literacy scores based on timing or frequency of tutoring. Most schools (66 percent) offered tutoring during school and more than twice a week (64 percent). Rural schools were more likely to offer tutoring during school (92 percent) than were nonrural schools (47 percent). Most tutors were current teachers (55 percent) or retired teachers (12 percent). This study does not provide evidence of differences in student literacy scores based on tutor qualifications. More than half the students who participated in a full year of Project On-Track tutoring started the year with literacy assessment scores identifying them as most at risk for reading difficulties, and 42 percent of them improved to a lower risk category after one year of tutoring. Although this study uses descriptive methods and cannot assess effectiveness, the findings suggest that schools and districts using a highly structured tutoring program like Project On-Track might be able to exercise flexibility in when and how often tutoring is offered and by whom without compromising program quality and benefits to students.
REL 2024004 | Assessing the Validity and Reliability of the Pennsylvania School Climate Survey for Elementary School Students | 4/15/2024
The Pennsylvania Department of Education’s (PDE’s) Office for Safe Schools partnered with REL Mid-Atlantic to conduct a study analyzing the validity and reliability of PDE’s school climate survey for elementary school students. This survey, which is available on a voluntary basis to any school in the state, provides a way for schools to track their school climate and identify aspects of school climate that need additional support. The analysis examined the three domains of the PDE school climate survey: (1) social-emotional learning, (2) safe and respectful school climate, and (3) student support and academic engagement. The study found that the items in each of the three domains measured the constructs that they intended to measure and that the three domains were distinct from one another. However, one domain—safe and respectful school climate—fell short of the established threshold for reliability based on the correlations among the items within the domain. As a result, the study team recommended revisions to the safe and respectful school climate domain of the elementary school student survey to improve its internal consistency reliability.
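The internal consistency reliability referenced above is commonly summarized with Cronbach's alpha, computed from the variances of the items within a domain. The sketch below uses hypothetical Likert-type responses; the study's actual items, thresholds, and estimation details are not given in this listing.

```python
# Sketch of an internal-consistency check: Cronbach's alpha from item
# variances within a survey domain. Item responses here are hypothetical.

def cronbach_alpha(items):
    """items: one inner list of respondent answers per survey item."""
    k = len(items)            # number of items in the domain
    n = len(items[0])         # number of respondents

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three Likert-type items answered by five respondents; the items move
# together, so alpha comes out high.
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
alpha = cronbach_alpha(items)
```

A domain "falls short of the established threshold" when alpha lands below whatever cutoff the analysts adopt (0.7 is a common rule of thumb, though the study's own threshold is not stated here).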
REL 2023003 | Changes in school climate during COVID-19 in a sample of Pennsylvania schools | 8/10/2023
To assess how school climate changed during the pandemic, the Pennsylvania Department of Education's (PDE's) Office for Safe Schools partnered with REL Mid-Atlantic to conduct a study using data from PDE's school climate survey. This survey, which is available on a voluntary basis to any school in the state, provides a way to track school climate and identify schools that need additional support to improve school climate. The REL study analyzed changes in scores from a pre-pandemic year (2018/19) to the 2020/21 and 2021/22 school years. In a sample of Pennsylvania public schools that took the survey in all three years, students and teachers reported more positive perceptions of school climate in the 2020/21 school year, during hybrid and remote learning, compared to 2018/19 (before the pandemic) and 2021/22 (when schools had returned to fully in-person operation). This was an unexpected positive bump in the year in which schools experienced the most pandemic-related disruption. In contrast, school climate scores were steady across the years before COVID-19. The study also found no evidence of a significant decline in school climate scores between 2018/19 and 2021/22, suggesting the pandemic did not have a lasting negative effect on school climate in this sample of schools. One important caveat of this study is that the sample of schools was small and not representative of the rest of the state of Pennsylvania. In the future, increasing the number of schools completing the school climate survey over multiple years will allow PDE to conduct more informative analyses of the relationship between school climate and other factors, such as interventions to improve school climate.
REL 2023002 | Supporting the California Department of Education in Examining Data to Inform the Setting of Thresholds on the California Alternate English Language Proficiency Assessments for California | 5/18/2023
Staff from the California Department of Education (CDE) will present findings to the State Board of Education (SBE) from a project CDE conducted with analytic technical assistance from the Regional Educational Laboratory (REL) West. The SBE meeting will take place on May 18 and 19, 2023, at the California State Board of Education, 1430 N Street, Room 1101, Sacramento, California. This item is currently placed as the third item the SBE will take up, making it likely to be presented around midday on May 18. At the meeting, CDE plans to present the findings and implications from analyses it conducted of student achievement on the state’s alternate English language proficiency and English language arts assessments. REL West staff will attend the presentation in order to briefly describe REL West’s technical assistance role and support the CDE in addressing any questions posed by Board members about technical aspects of the data analysis that cannot be answered by CDE staff. The technical memo and slide deck will be made available on the REL website soon after the presentation to the Board.
REL 2023001 | Stabilizing subgroup proficiency results to improve identification of low-performing schools | 2/27/2023
The Every Student Succeeds Act (ESSA) requires states to identify schools with low-performing student subgroups for Targeted Support and Improvement (TSI) or Additional Targeted Support and Improvement (ATSI). Random differences between students’ true abilities and their test scores, also called measurement error, reduce the statistical reliability of the performance measures used to identify schools for these categorizations. Measurement error introduces a risk that the identified schools are unlucky rather than truly low performing. Using data provided by the Pennsylvania Department of Education (PDE), the study team used Bayesian hierarchical modeling to improve the reliability of subgroup proficiency measures, allowing PDE to target the schools and students that most need additional support. PDE plans to incorporate stabilization as a “safe harbor” alternative in its 2022 accountability calculations. The study also shows that Bayesian stabilization produces reliable results for subgroups as small as 10 students—suggesting that states could choose to reduce minimum counts used in subgroup calculations (typically now around 20 students), promoting accountability for all subgroups without increasing random error. Findings could be relevant to states across the country, all of which face the same need to identify schools for TSI and ATSI, and the same tension between accountability and reliability, which Bayesian stabilization could help to resolve.
REL 2023148 | Interpreting Findings from an Early Learning Inventory Pilot Study | 1/23/2023
This project was part of a larger REL Southwest coaching series to support the Oklahoma State Department of Education (OSDE) in using an early learning inventory (ELI) to assess children’s knowledge and skills at kindergarten entry and to improve state-funded early learning programs. The goals of the project were to assist OSDE in (1) preparing to implement an ELI pilot study, (2) preparing for sampling and recruitment for the ELI pilot study, (3) developing data collection measures to collect information during the pilot study about how the ELI is implemented and teacher outcomes, and (4) analyzing and interpreting data from the ELI pilot study. The coaching was delivered over the course of five sessions from fall 2020 to fall 2022. OSDE staff were the primary participants. The final two sessions of this coaching project included a review of the pilot study findings and methodology. This project equipped OSDE staff with information to make evidence-based decisions about the ELI and to conduct a more rigorous future study with the ELI.
REL 2023147 | The Louisiana Believe and Prepare Educator Preparation Reform: Findings from the Pilot and Early Implementation Years | 12/12/2022
Believe and Prepare is a teacher preparation reform implemented by the Louisiana Department of Education in collaboration with school systems and teacher preparation programs across the state. It was piloted in the 2014/15 school year and became mandatory in July 2018 for incoming teacher candidates in all 18 institutions of higher education that offer traditional teacher preparation programs. The reform focused on competency-based curricula, extended clinical experiences, and rigorous mentor teacher training. A central requirement of the reform is that teacher candidates must participate in a yearlong residency with a mentor teacher. This replaced the prior shorter-term student teaching requirement, typically six weeks. To explore the extent to which the reform is contributing to expected improvement in outcomes for early career teachers, this study examined the association between the reform and in-service teacher performance ratings, teacher retention, student test scores, teacher competency, and the likelihood of three placement outcomes (being placed in the school where the teacher completed a residency, filling a teaching position in a shortage area, and being placed in a rural school). Teachers who completed a program that had implemented Believe and Prepare were 2 percentage points more likely than teachers who completed a program that had not implemented it to stay in Louisiana for at least one year and 7 percentage points more likely to stay in the same school district for at least three years. Grade 4–8 students whose teachers completed a preparation program that had implemented Believe and Prepare during the pilot years scored 0.04 standard deviation lower on English language arts tests than students whose teachers completed a program that had not implemented it. Other teacher outcomes such as in-service performance ratings, competency as measured by Praxis II scores, school placement, and job assignment were not statistically different between teachers who completed a program that had implemented Believe and Prepare and teachers who completed other programs.
REL 2023143 | Encouraging Families to Visit a Literacy Website: A Randomized Study of the Impact of Email and Text Message Communications | 12/7/2022
The Arkansas Department of Education partnered with the Regional Educational Laboratory Southwest to study the feasibility and effectiveness of using brief email and text message communications to increase the number of parent and guardian visits to the Reading Initiative for Student Excellence (R.I.S.E.) state literacy website. In November 2021, the department sent test messages to families to determine the percentage of households with children in kindergarten–grade 6 in Arkansas public schools that had a working email address or cell phone number and whether the percentage differed by school locale (rural or nonrural) or demographic composition (percentage of economically disadvantaged students, Black students and Hispanic students, or English learner students). Subsequently, the study team randomly assigned 700 Arkansas public elementary schools to one of eight conditions, which varied the mode of communication (email only or email and text message), the presentation of information (no graphic or with a graphic), and the type of sender (generic sender or known sender). In January 2022 households with children in these schools were sent three rounds of communications with information about literacy and a link to the R.I.S.E. website. The study examined the impact of these communications on whether parents and guardians clicked the link to visit the website (click rate) and conducted an exploratory analysis of differences in how long they spent on the website (time on page).
REL 2023146 | Indicators of School Performance in Texas | 12/5/2022
The School Improvement Division of the Texas Education Agency (TEA) identifies, monitors, and supports low-performing schools. To identify low-performing schools, TEA assigns annual academic accountability ratings to its districts and schools, but these ratings are only provided once per year and are vulnerable to disruptions in the assessment system. Schools that receive low accountability ratings do not meet accountability expectations and are considered low-performing.
REL 2023140 | Biliteracy Seals in a Large Urban District in New Mexico: Who Earns Them and How Do They Impact College Outcomes? | 12/1/2022
New Mexico is one of 48 states that offer a biliteracy seal to high school graduates to recognize their proficiency in a non-English language. The Regional Educational Laboratory Southwest English Learners Research Partnership collaborated with a large urban district in New Mexico to study the characteristics and college readiness of students who earn different types of biliteracy seals (state, district, and global seals) and whether earning a seal improves college outcomes. The study used data from three cohorts of students who graduated from high school in the district from 2017/18 to 2019/20. The study examined the characteristics and college readiness of students who earned different types of seals, the number of students who met some requirements for a seal but did not earn one, and the effect of earning a seal on college outcomes.
REL 2023145 | Examining student group differences in Arkansas’ indicators of postsecondary readiness and success | 11/21/2022
Regional Educational Laboratory Southwest partnered with the Arkansas Department of Education (ADE) to examine Arkansas’s middle and high school indicators of postsecondary readiness and success, building on an earlier study of these indicators (Hester et al., 2021). Academic indicators include attaining proficiency on state achievement tests, grade point average, enrollment in advanced courses, and community service learning. Behavioral indicators include attendance, suspension, and expulsion. Using data on statewide grade 6 cohorts from 2008/09 and 2009/10, the study examined the percentages of students who attained the readiness and success indicators and the percentages of students who attained postsecondary readiness and success outcomes by gender, race/ethnicity, eligibility for the National School Lunch Program, English learner student status, disability status, age, and district locale. The study also examined whether the predictive accuracy, specificity, and strength of the indicators varied by these student groups. Three key findings emerged. First, the attainment of indicators of postsecondary readiness and success differed substantially for nearly all student groups, with the number of substantial differences on academic indicators exceeding those on behavioral indicators. The largest number of substantial differences in the attainment of academic indicators was between Black and White students, between students eligible and ineligible for the National School Lunch Program (an indicator of economic disadvantage), and between students who entered grade 6 before and after age 13. Second, attainment of postsecondary readiness and success outcomes varied substantially across student groups, with the largest differences between students with and without a disability. Third, predictive accuracy (the percentage of students with the same predicted and actual outcomes) and strength (the relative importance of a single indicator) were similar across student groups in most cases. Leaders at ADE and in Arkansas districts can use these findings to identify appropriate indicators of postsecondary readiness and success and to target supports toward student groups who most need them. These findings can help leaders identify and address disparities such as inequitable access to resources and supportive learning environments.
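The abstract above defines predictive accuracy as the percentage of students whose predicted and actual outcomes match, and also names specificity. A minimal sketch of both metrics on hypothetical 0/1 readiness flags (where 1 means "predicted/attained readiness") could look like this; the study's own coding of indicators and outcomes may differ.

```python
# Sketch of the indicator-quality metrics named above, on hypothetical
# 0/1 flags: 1 = predicted (or attained) postsecondary readiness.

def accuracy(predicted, actual):
    """Share of students whose prediction matches the actual outcome."""
    return sum(p == a for p, a in zip(predicted, actual)) / len(actual)

def specificity(predicted, actual):
    """Share of students who did NOT attain the outcome and were
    correctly predicted not to (true-negative rate)."""
    negatives = [p for p, a in zip(predicted, actual) if a == 0]
    return sum(p == 0 for p in negatives) / len(negatives)

predicted = [1, 1, 0, 0, 1, 0]
actual    = [1, 0, 0, 0, 1, 1]
print(accuracy(predicted, actual))     # 4 of the 6 pairs match
print(specificity(predicted, actual))  # 2 of the 3 non-attainers flagged
```

Comparing these quantities group by group, as the study does, shows whether an indicator predicts equally well for all student groups or systematically misses some of them.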