Search Results: (1-15 of 624 records)

 Pub Number  Title  Date
REL 2024006 Strengthening the Pennsylvania School Climate Survey to Inform School Decisionmaking
This study analyzed Pennsylvania School Climate Survey data from students and staff in the 2021/22 school year to assess the validity and reliability of the elementary school student version of the survey; approaches to scoring the survey in individual schools at all grade levels; and perceptions of school climate across student, staff, and school groups. The survey encourages data-informed efforts in participating Pennsylvania schools to foster supportive learning environments that promote social and emotional wellness for students and staff. The study validated the elementary school student survey but found that one domain—safe and respectful school climate—did not meet the reliability threshold, suggesting that revisions are needed. At all grade levels, noninstructional staff had the most positive perceptions of school climate, followed by classroom teachers and then students. The study found that different approaches to combining the school climate scores of students, teachers, and noninstructional staff within schools yielded slightly different distributions of school climate summary index scores. It also found that different performance category thresholds resulted in similar distributions of schools across categories. Scores calculated using simple averages were strongly and positively correlated with scores calculated using a more complex approach (Rasch models), suggesting that both approaches deliver similar information. School climate scores varied across student groups (defined by race/ethnicity, gender, and grade level) within schools and across school groups. Larger schools and schools with higher percentages of Black students tended to have lower school climate scores than other schools. The findings can inform the Pennsylvania Department of Education’s decisionmaking on revisions to the elementary school student survey, approaches to scoring and reporting survey results, and efforts to increase participation in future survey administrations.
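
The correlation noted above between simple-average scores and Rasch-model scores can be illustrated with a small simulation. The sketch below is not the study's code: it assumes dichotomous item responses, made-up ability and difficulty values, and known item difficulties (the actual survey items are likely polytomous and would call for a rating-scale Rasch model).

import numpy as np

rng = np.random.default_rng(0)
n_students, n_items = 500, 10
theta = rng.normal(0, 1, n_students)      # hypothetical student "climate perception" levels
beta = np.linspace(-1.5, 1.5, n_items)    # hypothetical item difficulties (logit scale)
p = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
responses = rng.binomial(1, p)            # simulated 0/1 item responses

# Approach 1: simple average of item responses per student.
simple_scores = responses.mean(axis=1)

# Approach 2: Rasch ability estimate per student (Newton-Raphson),
# treating item difficulties as known, which is a simplification.
def rasch_ability(x, difficulties, iters=25):
    est = 0.0
    for _ in range(iters):
        prob = 1 / (1 + np.exp(-(est - difficulties)))
        grad = np.sum(x - prob)            # first derivative of the log-likelihood
        hess = -np.sum(prob * (1 - prob))  # second derivative
        est -= grad / hess
    return est

# Perfect scores (all 0s or all 1s) have no finite estimate; skip them here.
mask = (responses.sum(axis=1) > 0) & (responses.sum(axis=1) < n_items)
rasch_scores = np.array([rasch_ability(row, beta) for row in responses[mask]])

print("Correlation:", np.corrcoef(simple_scores[mask], rasch_scores)[0, 1])

In simulations like this, the two sets of scores are typically very highly correlated, which is consistent with the finding that both scoring approaches deliver similar information.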
8/29/2024
REL 2024005 Examining Implementation and Outcomes of the Project On-Track High-Dosage Literacy Tutoring Program
School districts in northeastern Tennessee have had persistently low proficiency rates in grade 3 English language arts, which were exacerbated by disruptions in schooling due to the COVID-19 pandemic. In response, the Niswonger Foundation, a technical assistance provider that supports these districts, developed Project On-Track, a high-dosage, small-group literacy tutoring program for students in grades 1–3. Its online adaptive program, Amplify Reading, groups students by skill level and generates mini-lessons aligned to the science of reading that are delivered by tutors. Although the content of the tutoring sessions is highly structured, Project On-Track offers schools flexibility in how they implement the program, including when they provide tutoring, who provides tutoring, in which grade levels they offer tutoring, and how they identify students within a grade level for tutoring. This flexibility can make it easier for schools to adopt the program, particularly rural schools, which may face greater challenges in hiring tutors or delivering tutoring outside of school hours. However, variation in implementation may also affect program effectiveness. To inform future implementation of the program, this study describes the characteristics of students who participated in a full year of Project On-Track and how schools implemented the program, with a focus on three implementation features: when and how frequently tutoring is offered and who provides it. By reporting on the association between variations in implementation and student literacy scores, the study offers important insights to inform future program implementation.

The study found no differences in student literacy scores based on timing or frequency of tutoring. Most schools (66 percent) offered tutoring during school and more than twice a week (64 percent). Rural schools were more likely to offer tutoring during school (92 percent) than were nonrural schools (47 percent). Most tutors were current teachers (55 percent) or retired teachers (12 percent). This study does not provide evidence of differences in student literacy scores based on tutor qualifications. More than half the students who participated in a full year of Project On-Track tutoring started the year with literacy assessment scores identifying them as most at risk for reading difficulties, and 42 percent of them improved to a lower risk category after one year of tutoring. Although this study uses descriptive methods and cannot assess effectiveness, the findings suggest that schools and districts using a highly structured tutoring program like Project On-Track might be able to exercise flexibility in when and how often tutoring is offered and by whom without compromising program quality and benefits to students.
8/26/2024
REL 2024004 Assessing the Validity and Reliability of the Pennsylvania School Climate Survey for Elementary School Students
The Pennsylvania Department of Education’s (PDE’s) Office for Safe Schools partnered with REL Mid-Atlantic to conduct a study analyzing the validity and reliability of PDE’s school climate survey for elementary school students. This survey, which is available on a voluntary basis to any school in the state, provides a way for schools to track their school climate and identify aspects of school climate that need additional support. The analysis examined the three domains of the PDE school climate survey: (1) social-emotional learning, (2) safe and respectful school climate, and (3) student support and academic engagement. The study found that the items in each of the three domains measured the constructs they were intended to measure and that the three domains were distinct from one another. However, one domain—safe and respectful school climate—fell short of the established threshold for reliability based on the correlations among the items within the domain. As a result, the study team recommended revisions to the safe and respectful school climate domain of the elementary school student survey to improve its internal consistency reliability.
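
Internal consistency reliability for a survey domain is often summarized with Cronbach's alpha, computed from the item variances and the variance of the total score. The sketch below is a generic illustration rather than the study's analysis; the response scale, number of items, and the 0.7 rule-of-thumb threshold are assumptions.

import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of numeric survey responses."""
    k = items.shape[1]
    sum_item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Example with made-up responses on a 4-point scale for a 5-item domain.
# (Purely random responses like these will yield a low alpha; a common
# rule of thumb treats roughly 0.7 or higher as acceptable.)
rng = np.random.default_rng(1)
responses = rng.integers(1, 5, size=(200, 5))
print(cronbach_alpha(responses))

Alpha rises as the items in a domain become more strongly intercorrelated, which is why revising or replacing weakly correlated items can improve a domain's reliability.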
4/15/2024
REL 2023003 Changes in school climate during COVID-19 in a sample of Pennsylvania schools
To assess how school climate changed during the pandemic, the Pennsylvania Department of Education's (PDE's) Office for Safe Schools partnered with REL Mid-Atlantic to conduct a study using data from PDE's school climate survey. This survey, which is available on a voluntary basis to any school in the state, provides a way to track school climate and identify schools that need additional support to improve school climate. The REL study analyzed changes in scores from a pre-pandemic year (2018/19) to the 2020/21 and 2021/22 school years. In a sample of Pennsylvania public schools that took the survey in all three years, students and teachers reported more positive perceptions of school climate in the 2020/21 school year, during hybrid and remote learning, compared to 2018/19 (before the pandemic) and 2021/22 (when schools had returned to fully in-person operation). This was an unexpected positive bump in the year in which schools experienced the most pandemic-related disruption. In contrast, school climate scores were steady across the years before COVID-19. The study also found no evidence of a significant decline in school climate scores between 2018/19 and 2021/22, suggesting the pandemic did not have a lasting negative effect on school climate in this sample of schools. One important caveat of this study is that the sample of schools was small and not representative of the rest of the state of Pennsylvania. In the future, increasing the number of schools completing the school climate survey over multiple years will allow PDE to conduct more informative analyses of the relationship between school climate and other factors, such as interventions to improve school climate.
8/10/2023
REL 2023002 Supporting the California Department of Education in Examining Data to Inform the Setting of Thresholds on the California Alternate English Language Proficiency Assessments for California
Staff from the California Department of Education (CDE) will present findings to the State Board of Education (SBE) from a project CDE conducted with analytic technical assistance from the Regional Educational Laboratory (REL) West. The SBE meeting will take place on May 18 and 19, 2023, at the California State Board of Education, 1430 N Street, Room 1101, Sacramento, California. This item is currently third on the SBE’s agenda, making it likely to be presented around midday on May 18.

At the meeting, CDE plans to present the findings and implications from analyses it conducted of student achievement on the state’s alternate English language proficiency and English language arts assessments. REL West staff will attend the presentation in order to briefly describe REL West’s technical assistance role and support the CDE in addressing any questions posed by Board members about technical aspects of the data analysis that cannot be answered by CDE staff. The technical memo and slide deck will be made available on the REL website soon after the presentation to the Board.
5/18/2023
REL 2023001 Stabilizing subgroup proficiency results to improve identification of low-performing schools
The Every Student Succeeds Act (ESSA) requires states to identify schools with low-performing student subgroups for Targeted Support and Improvement (TSI) or Additional Targeted Support and Improvement (ATSI). Random differences between students’ true abilities and their test scores, also called measurement error, reduce the statistical reliability of the performance measures used to identify schools for these categorizations. Measurement error introduces a risk that the identified schools are unlucky rather than truly low performing. Using data provided by the Pennsylvania Department of Education (PDE), the study team used Bayesian hierarchical modeling to improve the reliability of subgroup proficiency measures, allowing PDE to target the schools and students that most need additional support. PDE plans to incorporate stabilization as a “safe harbor” alternative in its 2022 accountability calculations. The study also shows that Bayesian stabilization produces reliable results for subgroups as small as 10 students—suggesting that states could choose to reduce minimum counts used in subgroup calculations (typically now around 20 students), promoting accountability for all subgroups without increasing random error. Findings could be relevant to states across the country, all of which face the same need to identify schools for TSI and ATSI, and the same tension between accountability and reliability, which Bayesian stabilization could help to resolve.
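
The stabilization idea, pulling a noisy subgroup proficiency rate toward a broader reference rate in proportion to how little data the subgroup contributes, can be illustrated with a simple beta-binomial (empirical Bayes) calculation. This is a simplified stand-in for the Bayesian hierarchical model the study used, not a reproduction of it; the statewide rate and prior strength below are made-up values.

def stabilized_rate(n_proficient: int, n_tested: int,
                    state_rate: float, prior_strength: float = 20.0) -> float:
    """Shrink a subgroup's observed proficiency rate toward the statewide rate.
    prior_strength acts like a number of pseudo-students centered on state_rate."""
    alpha = state_rate * prior_strength
    beta = (1 - state_rate) * prior_strength
    return (n_proficient + alpha) / (n_tested + alpha + beta)

# A subgroup of 10 students with 3 proficient, in a state where 45 percent are proficient:
print(stabilized_rate(3, 10, 0.45))    # 0.40: pulled from the raw 0.30 toward 0.45
# A subgroup of 100 students with 30 proficient shrinks much less:
print(stabilized_rate(30, 100, 0.45))  # about 0.33

Because small subgroups are shrunk more, a single unlucky year is less likely to push a school into the low-performing category on the basis of a handful of test scores.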
2/27/2023
REL 2023148 Interpreting Findings from an Early Learning Inventory Pilot Study
This project was part of a larger REL Southwest coaching series to support the Oklahoma State Department of Education (OSDE) in using an early learning inventory (ELI) to assess children’s knowledge and skills at kindergarten entry and to improve state-funded early learning programs. The goals of the project were to assist OSDE in (1) preparing to implement an ELI pilot study, (2) preparing for sampling and recruitment for the ELI pilot study, (3) developing measures to collect information during the pilot study about how the ELI is implemented and about teacher outcomes, and (4) analyzing and interpreting data from the ELI pilot study. The coaching was delivered over the course of five sessions from fall 2020 to fall 2022. OSDE staff were the primary participants.

The final two sessions of this coaching project included a review of the pilot study findings and methodology. This project equipped OSDE staff with information to make evidence-based decisions about the ELI and to conduct a more rigorous future study with the ELI.
1/23/2023
REL 2023147 The Louisiana Believe and Prepare Educator Preparation Reform: Findings from the Pilot and Early Implementation Years
Believe and Prepare is a teacher preparation reform implemented by the Louisiana Department of Education in collaboration with school systems and teacher preparation programs across the state. It was piloted in the 2014/15 school year and became mandatory in July 2018 for incoming teacher candidates in all 18 institutions of higher education that offer traditional teacher preparation programs. The reform focused on competency-based curricula, extended clinical experiences, and rigorous mentor teacher training. A central requirement of the reform is that teacher candidates must participate in a yearlong residency with a mentor teacher. This replaced the prior shorter-term student teaching requirement, typically six weeks.

To explore the extent to which the reform is contributing to expected improvement in outcomes for early career teachers, this study examined the association between the reform and in-service teacher performance ratings, teacher retention, student test scores, teacher competency, and the likelihood of three placement outcomes (being placed in the school where the teacher completed a residency, filling a teaching position in a shortage area, and being placed in a rural school). Teachers who completed a program that had implemented Believe and Prepare were 2 percentage points more likely than teachers who completed a program that had not implemented it to stay in Louisiana for at least one year and 7 percentage points more likely to stay in the same school district for at least three years. Grade 4–8 students whose teachers completed a preparation program that had implemented Believe and Prepare during the pilot years scored 0.04 standard deviation lower on English language arts tests than students whose teachers completed a program that had not implemented it. Other teacher outcomes such as in-service performance ratings, competency as measured by Praxis II scores, school placement, and job assignment were not statistically different between teachers who completed a program that had implemented Believe and Prepare and teachers who completed other programs.
12/12/2022
REL 2023143 Encouraging Families to Visit a Literacy Website: A Randomized Study of the Impact of Email and Text Message Communications
The Arkansas Department of Education partnered with the Regional Educational Laboratory Southwest to study the feasibility and effectiveness of using brief email and text message communications to increase the number of parent and guardian visits to the Reading Initiative for Student Excellence (R.I.S.E.) state literacy website.

In November 2021, the department sent test messages to families to determine the percentage of households with children in kindergarten–grade 6 in Arkansas public schools that had a working email address or cell phone number and whether the percentage differed by school locale (rural or nonrural) or demographic composition (percentage of economically disadvantaged students, Black students and Hispanic students, or English learner students). Subsequently, the study team randomly assigned 700 Arkansas public elementary schools to one of eight conditions, which varied the mode of communication (email only or email and text message), the presentation of information (no graphic or with a graphic), and the type of sender (generic sender or known sender). In January 2022 households with children in these schools were sent three rounds of communications with information about literacy and a link to the R.I.S.E. website. The study examined the impact of these communications on whether parents and guardians clicked the link to visit the website (click rate) and conducted an exploratory analysis of differences in how long they spent on the website (time on page).
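
The eight experimental conditions come from crossing the three two-level factors described above: mode of communication, presentation of information, and type of sender. The sketch below illustrates that crossing and a simple balanced random assignment of 700 schools to the resulting conditions; the school identifiers and random seed are placeholders, and the study's actual randomization procedure may have differed (for example, by blocking on school characteristics).

import itertools
import random

factors = {
    "mode": ["email only", "email and text message"],
    "presentation": ["no graphic", "with a graphic"],
    "sender": ["generic sender", "known sender"],
}
conditions = list(itertools.product(*factors.values()))  # 2 x 2 x 2 = 8 conditions

random.seed(2022)  # placeholder seed
schools = [f"school_{i:03d}" for i in range(700)]  # placeholder identifiers
random.shuffle(schools)
assignment = {school: conditions[i % len(conditions)] for i, school in enumerate(schools)}

# The click rate for each arm would then be compared across conditions, for example:
# click_rate[condition] = households_that_clicked[condition] / households_contacted[condition]

Because each factor level appears in half of the conditions, comparing click rates across arms isolates the contribution of each factor as well as the combined effect of any pair of factors.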
12/7/2022
REL 2023146 Indicators of School Performance in Texas
The School Improvement Division of the Texas Education Agency (TEA) identifies, monitors, and supports low-performing schools. To identify low-performing schools, TEA assigns annual academic accountability ratings to its districts and schools, but these ratings are only provided once per year and are vulnerable to disruptions in the assessment system. Schools that receive low accountability ratings do not meet accountability expectations and are considered low-performing.
12/5/2022
REL 2023140 Biliteracy Seals in a Large Urban District in New Mexico: Who Earns Them and How Do They Impact College Outcomes?
New Mexico is one of 48 states that offer a biliteracy seal to high school graduates to recognize their proficiency in a non-English language. The Regional Educational Laboratory Southwest English Learners Research Partnership collaborated with a large urban district in New Mexico to study the characteristics and college readiness of students who earn different types of biliteracy seals (state, district, and global seals) and whether earning a seal improves college outcomes. The study used data from three cohorts of students who graduated from high school in the district from 2017/18 to 2019/20. The study examined the characteristics and college readiness of students who earned different types of seals, the number of students who met some requirements for a seal but did not earn one, and the effect of earning a seal on college outcomes.
12/1/2022
REL 2023145 Examining student group differences in Arkansas’ indicators of postsecondary readiness and success
Regional Educational Laboratory Southwest partnered with the Arkansas Department of Education (ADE) to examine Arkansas’s middle and high school indicators of postsecondary readiness and success, building on an earlier study of these indicators (Hester et al., 2021). Academic indicators include attaining proficiency on state achievement tests, grade point average, enrollment in advanced courses, and community service learning. Behavioral indicators include attendance, suspension, and expulsion. Using data on statewide grade 6 cohorts from 2008/09 and 2009/10, the study examined the percentages of students who attained the readiness and success indicators and the percentages of students who attained postsecondary readiness and success outcomes by gender, race/ethnicity, eligibility for the National School Lunch Program, English learner student status, disability status, age, and district locale. The study also examined whether the predictive accuracy, specificity, and strength of the indicators varied by these student groups.

Three key findings emerged. First, the attainment of indicators of postsecondary readiness and success differed substantially for nearly all student groups, with the number of substantial differences on academic indicators exceeding the number on behavioral indicators. The largest number of substantial differences in the attainment of academic indicators was between Black and White students, between students eligible and ineligible for the National School Lunch Program (an indicator of economic disadvantage), and between students who entered grade 6 before and after age 13. Second, attainment of postsecondary readiness and success outcomes varied substantially across student groups, with the largest differences between students with and without a disability. Third, predictive accuracy (the percentage of students with the same predicted and actual outcomes) and strength (the relative importance of a single indicator) were similar across student groups in most cases.
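
Predictive accuracy as defined above (the percentage of students whose predicted outcome matches their actual outcome) can be computed separately for each student group with a few lines of code. The sketch below is illustrative only; the group labels, record structure, and example values are made up.

from typing import Iterable

def predictive_accuracy(predicted: Iterable[bool], actual: Iterable[bool]) -> float:
    pairs = list(zip(predicted, actual))
    return sum(p == a for p, a in pairs) / len(pairs)

def accuracy_by_group(records):
    """records: iterable of (group, predicted_outcome, actual_outcome) tuples."""
    by_group = {}
    for group, predicted, actual in records:
        by_group.setdefault(group, []).append((predicted, actual))
    return {group: predictive_accuracy(*zip(*pairs)) for group, pairs in by_group.items()}

example = [("group A", True, True), ("group A", False, True),
           ("group B", True, True), ("group B", False, False)]
print(accuracy_by_group(example))  # {'group A': 0.5, 'group B': 1.0}

Specificity can be computed the same way by restricting each group to students whose actual outcome is negative and counting how many of them were correctly predicted not to attain it.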

Leaders at ADE and in Arkansas districts can use these findings to identify appropriate indicators of postsecondary readiness and success and to target supports toward student groups who most need them. These findings can help leaders identify and address disparities such as inequitable access to resources and supportive learning environments.
11/21/2022
REL 2023144 English learner proficiency in Texas before and during the COVID-19 pandemic
This study examined levels of English proficiency before and during the COVID-19 pandemic among English learner students in grades 3–12 in Texas. In 2020/21, nearly 750,000 students in grades 3–12—approximately one in five Texas students—were English learner students. In accordance with Texas state law and the Every Student Succeeds Act, English proficiency is measured annually using a statewide assessment, the Texas English Language Proficiency Assessment System (TELPAS), which assesses English learner students’ listening, speaking, reading, and writing skills in English. This study focused on TELPAS scores among students who took the test in 2020/21 and compared those scores with a matched cohort of similar students from 2018/19. The study found that, despite missing data because of pandemic-related disruptions to testing, students who took the TELPAS were representative of the overall Texas English learner student population in the years prior to and during the pandemic. The study also found that rates of reclassification from an English learner student to an English proficient student declined between 2017/18 and 2020/21, and trends in the characteristics of reclassified students changed: lower percentages of students in major urban areas, students eligible for the National School Lunch Program, students who spoke Spanish at home, and students who identified as Hispanic were reclassified in 2020/21 than in 2017/18. On average, during the pandemic, English learner students in elementary grades earned meaningfully lower scores on the listening, speaking, and reading domains of the TELPAS than similar students earned before the pandemic, particularly in speaking. The findings for secondary grades were mixed; middle school students earned lower scores in listening, and high school students earned higher scores in speaking. Finally, the study did not find evidence that English learner student program models, such as dual-language immersion or content-based English as a second language, were meaningfully associated with English proficiency in 2020/21. Leaders at the Texas Education Agency and Texas school districts could consider focusing recovery resources on elementary schools and, to some degree, on middle schools, and identifying and supporting evidence-based strategies to cultivate proficiency. The Texas Education Agency may consider studying the effect of program models on language proficiency and the relationship between reclassification, shifting English proficiency levels, and changing reclassification standards.
11/3/2022
REL 2023142 An Examination of the Costs of Texas Community Colleges
Policymakers in Texas want to understand the funding levels necessary for community colleges to meet their promise of providing an affordable and accessible pathway to a postsecondary certificate or degree. Regional Educational Laboratory Southwest conducted this study to help leaders at the Texas Higher Education Coordinating Board better understand the extent to which Texas community colleges have adequate funding for reaching the desired levels of student success, as measured by success points milestones used in the state’s performance-based funding system. The study involved three types of analyses: a needs analysis, an equity analysis, and a cost function analysis. The needs analysis found that community colleges with higher percentages of first-generation college students, students who are economically disadvantaged, students who are academically disadvantaged, students older than 24 years, and English learner students earn fewer success points milestones per full-time equivalent student. The equity analysis found that community colleges with higher percentages of students who are academically disadvantaged spent less per full-time equivalent student, suggesting that there may be resource inequities for these students. The cost function analysis found that spending was not high enough to cover the cost of providing an equal opportunity for first-generation college students, students who are economically disadvantaged, students older than 24 years, and English learner students to achieve the same level of outcomes as students without these needs. The findings from this study can inform Texas policymakers’ efforts to distribute funding for community colleges to support equitable opportunities for all students to succeed in college.
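
Cost function analyses of this kind typically regress a spending measure on the outcome level and on student-need characteristics, so that the spending predicted to be needed to reach a target outcome can be compared with actual spending. The sketch below shows one generic log-log specification estimated by ordinary least squares; it is not the study's model, and the variable names, functional form, and example values are assumptions.

import numpy as np

def fit_cost_function(spending, outcomes, need_shares):
    """spending: per-student spending; outcomes: success measure (for example,
    success points per full-time equivalent student); need_shares: matrix of
    student-need percentages, one row per college. Returns OLS coefficients."""
    X = np.column_stack([np.ones(len(spending)), np.log(outcomes), need_shares])
    coef, *_ = np.linalg.lstsq(X, np.log(spending), rcond=None)
    return coef

def predicted_spending(coef, target_outcome, need_shares_row):
    """Spending predicted for a college to reach target_outcome, given its needs."""
    x = np.concatenate([[1.0, np.log(target_outcome)], need_shares_row])
    return float(np.exp(x @ coef))

# Example with made-up data for four colleges and one need measure:
spending = np.array([9000.0, 11000.0, 10500.0, 9800.0])
outcomes = np.array([1.1, 1.4, 1.2, 1.15])
need_shares = np.array([[0.40], [0.25], [0.35], [0.38]])
coef = fit_cost_function(spending, outcomes, need_shares)
print(predicted_spending(coef, 1.4, np.array([0.40])))

Under a model of this kind, colleges whose actual spending falls below the amount predicted for the target outcome level, given their students' needs, would be flagged as underfunded.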
10/26/2022
REL 2023141 Early Progress and Outcomes of a Grow Your Own Grant Program for High School Students and Paraprofessionals in Texas
The Texas Education Agency launched the Grow Your Own (GYO) grant program in 2018 to encourage districts to develop or expand existing high-quality education and training courses for high school students and to support district-employed paraprofessionals (including instructional aides and long-term substitute teachers) in pursuing certifications that would allow them to enter full-time teaching roles. This study aimed to help state education leaders in Texas understand the progress of districts in implementing the GYO program and the early outcomes of participants. This study analyzed data from 2015/16 through 2020/21 for districts that received GYO funding in the first two grant cycles and districts in the same geographic locales within the same regions that did not receive GYO funding (comparison districts). The study found that the majority of districts awarded a GYO grant were in rural areas and small towns. GYO districts tended to have smaller enrollments and higher average percentages of Hispanic students than comparison districts. The findings suggest that the program appears to meet the Texas Education Agency’s goal of providing opportunities for students and paraprofessionals in rural and small school settings, as well as students of color, to participate in GYO activities. The study also found that the percentage of students completing education and training courses in GYO districts was low (about 10 percent) during the grant years, and the percentage was similar in comparison districts before and after the grant awards. A disproportionate share of students who completed education and training courses in GYO districts were female. Although it is too soon to tell whether the GYO program will, over time, increase the size and diversity of the state’s teacher pool, leaders at the Texas Education Agency can use these early findings to both understand the progress of districts in achieving the GYO grant program aims and help identify aspects of the program that might need further investigation.
10/24/2022