Search Results: (1-15 of 40 records)
|NCES 2018158||NAEP 2015 NIES Restricted-Use Data Files (Y46NIES)
|NCES 2017249||Collaborative Problem Solving Skills of 15-Year-Olds: Results From PISA 2015
The focus of this Data Point is on the performance of students in the United States relative to their peers in 50 other education systems that participated in the PISA collaborative problem solving assessment in 2015. The PISA assessment of collaborative problem solving measured students’ ability to solve a problem by sharing the understanding and effort required to come to a solution, and pooling their knowledge, skills, and effort to reach that solution. Readers interested in more detailed data related to collaborative problem solving should also visit the NCES PISA website for data tables and figures. Please visit https://nces.ed.gov/surveys/pisa/pisa2015/index.asp to learn more.
|WWC SSR82160||The impact of computer usage on academic performance: Evidence from a randomized trial at the United States Military Academy
The 2016 study, "The Impact of Computer Usage on Academic Performance: Evidence from a Randomized Trial at the United States Military Academy," examined the impacts of computer usage on the academic performance of college students. The study found that students who were permitted to use Internet-enabled devices in class scored lower on final exams than those in classes that prohibited the use of such devices. The impact estimate for the combined multiple choice and short answer portion of the final exam meets WWC group design standards without reservations. The impact estimate for the essay question portion of the final exam does not meet WWC group design standards because each essay was graded only once, so the authors were unable to report a measure of reliability for these scores.
|REL 2017212||How are middle school climate and academic performance related across schools and over time?
The purpose of this study was to examine the relationship between school climate and academic performance in two different ways: (1) by comparing the academic performance of different schools with different levels of school climate and (2) by examining how changes in a school's climate were associated with changes in its students' academic achievement. To examine how school climate and academic performance are related, this study analyzed grade 7 student data from 2004/05 to 2010/11 from the California Healthy Kids Survey, the California Standardized Testing and Reporting program, and the California Basic Educational Data System for 978 middle schools in California. School climate was measured by a set of student survey questions that assessed students' perceptions about six domains of school climate. Schools with positive school climates were those in which students reported high levels of safety/connectedness, caring relationships with adults, and meaningful student participation, as well as low levels of substance use at school, bullying/discrimination, and student delinquency. Regression models were used to estimate the relationship between student-reported school climate and students' average academic performance across schools. Regression models were also used to estimate how, for a given school, academic performance changes as school climate changes. All models included controls for racial/ethnic composition, percentage of English learners, and percentage of students eligible for free/reduced-price meals. 
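The between-school analysis described above can be sketched as an ordinary least squares regression of average academic performance on student-reported climate plus demographic controls. The sketch below uses simulated school-level data and illustrative coefficient values (none of the numbers come from the report); it only shows the structure of such a model, not the study's actual specification.

```python
import numpy as np

# Hypothetical school-level data: one row per school (values are simulated,
# not from the REL study).
rng = np.random.default_rng(0)
n = 200
climate = rng.normal(0, 1, n)            # mean student-reported climate score
pct_el = rng.uniform(0, 0.5, n)          # share of English learners
pct_frpm = rng.uniform(0, 1, n)          # share eligible for free/reduced-price meals
perf = 700 + 15 * climate - 20 * pct_el - 40 * pct_frpm + rng.normal(0, 10, n)

# Design matrix: intercept, climate, and the demographic controls, so the
# climate coefficient is estimated holding the controls fixed.
X = np.column_stack([np.ones(n), climate, pct_el, pct_frpm])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
print(f"Estimated climate coefficient, controls held fixed: {beta[1]:.1f}")
```

A within-school ("fixed effects") version of the same idea would additionally difference out each school's average, which is why the study can report both across-school and within-school estimates.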
The study found that (1) middle schools with higher levels of positive student-reported school climate exhibited higher levels of academic performance; (2) increases in a school's level of positive student-reported school climate were associated with simultaneous increases in that school's academic achievement; and (3) within-school increases in academic achievement associated with school climate increases were substantially smaller than the academic performance differences across schools with different school climate levels. As positive school climate continues to gain attention as a lever for improving student learning, there is increasing interest in how improvements in school climate are related to improvements in academic performance. Most studies examining the school climate-academic performance relationship compare academic achievement across schools with different levels of school climate. Although this study found that schools with high levels of positive school climate exhibited substantially higher levels of academic performance than their counterparts with low levels of positive school climate, such differences across schools were not an accurate guide for predicting the magnitude of school-specific gains in academic performance associated with increases in school climate.
|REL 2017172||English learner students' readiness for academic success: The predictive potential of English language proficiency assessment scores in Arizona and Nevada
The purpose of this study was to examine the relationship between the English language proficiency (ELP) assessment scores of English learner students in Arizona and Nevada and the students' subsequent performance on academic content tests in two key subject areas: English language arts (ELA) and mathematics. This study provided analyses focused on two samples of students: a cohort of students who were in grade 3 (elementary) in 2009/10 and a cohort of students who were in grade 6 (middle school) in 2009/10. In general, the study found that the higher the English learner students' English language proficiency level, the higher their passing rates on the academic content assessments. In order to have at least a 50-percent probability of passing the academic content assessments in the two years following the ELP assessment, grade 6 English learner students in both Arizona and Nevada needed an ELP scale score that exceeded the threshold for English language proficiency, the minimum level for reclassification as fluent English proficient students and for full-time placement in mainstream, English-only classes. To have a 50-percent probability of passing the academic content assessments, grade 6 English learner students had to score between 1 and 46 scale score points above the reclassification minimum, depending on the state and academic content test. On the other hand, grade 3 English learner students in Arizona and Nevada did not have to reach the proficiency threshold on the initial ELP assessment in order to have a 50-percent or higher probability of passing the ELA and mathematics content tests in the two years following the assessment. To have a 50-percent probability of passing the academic content assessments, grade 3 English learner students could score between 15 and 46 scale score points below the reclassification minimum, depending on the state and academic content test.
The probability of passing the ELA and mathematics content tests when scoring at the minimum level for reclassification was always higher for grade 3 English learner students than for those in grade 6: at least 39 percentage points higher in mathematics and at least 55 percentage points higher in ELA.
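A "50-percent probability" threshold of the kind reported above is typically read off a fitted logistic model: the passing probability crosses 0.5 exactly where the model's linear predictor equals zero. The sketch below uses invented coefficients and an invented reclassification cut score (the report does not publish its model parameters) purely to show the arithmetic.

```python
import math

# Hypothetical logistic-model coefficients linking an ELP scale score x to the
# probability of passing a later content test (illustrative values only):
#   p(x) = 1 / (1 + exp(-(b0 + b1 * x)))
b0, b1 = -7.2, 0.02
reclass_cut = 368  # hypothetical minimum scale score for reclassification

def pass_prob(x):
    """Predicted probability of passing the content test at ELP score x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

# p(x) = 0.5 exactly where b0 + b1 * x = 0, i.e. at x = -b0 / b1.
threshold = -b0 / b1
print(f"50% threshold: {threshold:.0f} scale score points "
      f"({threshold - reclass_cut:+.0f} relative to the reclassification cut)")
```

A negative gap, as in this invented example, corresponds to the grade 3 pattern (students could sit below the reclassification minimum and still have even odds of passing); a positive gap corresponds to the grade 6 pattern.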
|REL 2016160||Survey methods for educators: Selecting samples and administering surveys (part 2 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data is often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about sampling or analysis approaches. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides that provide an overview of the survey research process designed for educators. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part two of this series, "Sampling Respondents and Survey Administration," outlines the following steps, drawn from the research literature:
1. Define the population
2. Specify the sampling procedure
3. Determine the sample size
4. Select the sample
5. Administer the survey
The guide provides detailed, real-world examples of how these steps have been used in a REL research alliance project. With this guide, educators will be able to develop their own sampling and survey administration plans.
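Steps 1 through 4 above can be sketched in a few lines. The sample-size step below uses Cochran's formula with a finite-population correction, a standard survey-methods calculation (the guide itself may recommend different parameters); the roster, confidence level, and margin of error are all hypothetical.

```python
import math
import random

# Steps 1-2: define the population and choose simple random sampling.
population = [f"teacher_{i:04d}" for i in range(1200)]  # hypothetical roster

# Step 3: sample size via Cochran's formula with a finite-population
# correction. 95% confidence gives z = 1.96; p = 0.5 is the most
# conservative assumed proportion; margin is the tolerated error.
z, p, margin = 1.96, 0.5, 0.05
n0 = (z ** 2) * p * (1 - p) / margin ** 2
n = math.ceil(n0 / (1 + (n0 - 1) / len(population)))

# Step 4: select the sample without replacement (seeded for reproducibility).
random.seed(42)
sample = random.sample(population, n)
print(f"Administer the survey (step 5) to {n} of {len(population)} teachers")
```

Step 5, administration, is then a matter of distributing the instrument to the selected respondents and tracking response rates.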
|REL 2016163||Survey methods for educators: Collaborative survey development (part 1 of 3)
Educators at the state and local levels are increasingly using data to inform policy decisions. While student achievement data is often used to inform instructional or programmatic decisions, educators may also need additional sources of data, some of which may not be housed in their existing data systems. Creating and administering surveys is one way to collect such data. However, documentation available to educators about administering surveys may provide insufficient guidance about the survey development process. Furthermore, some educators may not have training or experience in survey methods. In response to this need, REL Northeast & Islands created a series of three complementary guides that provide an overview of the survey research process designed for educators. The guides describe (1) survey development, (2) sampling respondents and survey administration, and (3) analysis and reporting of survey data.
Part one of this series, "Collaborative Survey Development," outlines the following steps, drawn from the research literature:
1. Identify topics of interest
2. Identify relevant, existing survey items
3. Draft new survey items and adapt existing survey items
4. Review draft survey items with stakeholders and content experts
5. Refine the draft survey using cognitive interviewing
The guide provides detailed, real-world examples of how these steps have been used in REL research alliance projects. With this guide, educators will be able to develop their own surveys in collaboration with other practitioners, researchers, and content experts.
|NCES 2016144||The Condition of Education 2016
NCES has a mandate to report to Congress on the condition of education by June 1 of each year. The Condition of Education 2016 summarizes important developments and trends in education using the latest available data. The 2016 report presents 43 key indicators on the status and condition of education, grouped under four main areas: (1) population characteristics, (2) participation in education, (3) elementary and secondary education, and (4) postsecondary education. Also included in the report are 3 Spotlight indicators that provide a more in-depth look at some of the data.
|REL 2016119||Stated Briefly: How methodology decisions affect the variability of schools identified as beating the odds
This "Stated Briefly" report is a companion piece that summarizes the results of another report of the same name. Schools that show better academic performance than would be expected given characteristics of the school and student populations are often described as "beating the odds" (BTO). State and local education agencies often attempt to identify such schools as a means of identifying strategies or practices that might be contributing to the schools' relative success. Key decisions on how to identify BTO schools may affect whether schools make the BTO list and thereby the identification of practices used to beat the odds. The purpose of this study was to examine how a list of BTO schools might change depending on the methodological choices and selection of indicators used in the BTO identification process. This study considered whether choices of methodologies and types of indicators affect the schools that are identified as BTO. The three indicators were (1) the type of performance measure used to compare schools, (2) the types of school characteristics used as controls in selecting BTO schools, and (3) the school sample configuration used to pool schools across grade levels. The study applied statistical models involving the different methodologies and indicators and documented how the lists of schools identified as BTO changed based on the models. Public school and student data from one Midwestern state for the 2007/08 through 2010/11 academic years were used to generate BTO school lists. By performing pairwise comparisons among BTO school lists and computing agreement rates among models, the project team was able to gauge the variation in BTO identification results. Results indicate that even when similar specifications were applied across statistical methods, different sets of BTO schools were identified. In addition, for each statistical method used, the lists of BTO schools identified varied with the choice of indicators.
Fewer than half of the schools were identified as BTO in more than one year. The results demonstrate that different technical decisions can lead to different identification results.
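The pairwise-comparison step described above amounts to measuring overlap between the school lists that different model specifications produce. The report does not state its exact agreement formula, so the sketch below uses one plausible definition (a Jaccard-style overlap: schools flagged by both models as a share of schools flagged by either) on two invented lists.

```python
# Hypothetical BTO lists produced by two model specifications.
model_a = {"sch01", "sch02", "sch05", "sch09", "sch12"}
model_b = {"sch01", "sch03", "sch05", "sch09", "sch14"}

def agreement_rate(list_a: set, list_b: set) -> float:
    """Schools flagged by both models as a share of schools flagged by
    either model (Jaccard overlap; one of several possible definitions)."""
    return len(list_a & list_b) / len(list_a | list_b)

rate = agreement_rate(model_a, model_b)
print(f"Pairwise agreement between the two BTO lists: {rate:.0%}")
```

Low agreement rates across many such pairs are what signal that the identification results are sensitive to methodological choices.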
|WWC SSR232||WWC Review of the Report "The Short-Term Effects of the Kalamazoo Promise Scholarship on Student Outcomes"
Researchers examined the impacts of the Kalamazoo Promise Scholarship program on academic and behavioral outcomes of students in grades 9–12 in Kalamazoo Public Schools (KPS). The Kalamazoo Promise Scholarship program offers college scholarships to graduating high school students in the KPS district. The percentage of tuition and fees covered depends on how long a student has attended school in the district. Students attending since kindergarten receive the full 100% of tuition and fees. Students attending since ninth grade receive a scholarship covering 65%. Students who enter KPS in tenth grade or later are not eligible to receive the scholarship. To assess the program’s impacts, researchers compared the academic and behavioral outcomes of students in high school before and after the Kalamazoo Promise Scholarship program was introduced. The student outcomes included in the study were: student grade point averages, whether students earned course credits, the number of course credits earned, incidence of and number of days spent in suspension, and incidence of and number of days spent in in-school detention. The study used a quasi-experimental design in which baseline equivalence of the groups cannot be demonstrated. Therefore, the research does not meet WWC group design standards.
|NCES 2014103||Problem Solving Skills of 15-year-olds: Results from PISA 2012
This Data Point uses data from the 2012 administration of the Program for International Student Assessment (PISA) problem solving assessment. PISA is an international assessment that measures 15-year-old students' reading, mathematics, and science literacy and, in 2012, general problem solving skills and financial literacy. PISA is coordinated by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of industrialized countries. The PISA computer-based assessment of problem solving assessed how well prepared students are to confront the kinds of problems that are encountered almost daily in 21st century life.
|NCES 2014102||Financial Literacy of 15-year-olds: Results from PISA 2012
This Data Point uses data from the 2012 administration of the Program for International Student Assessment (PISA) financial literacy assessment. PISA is an international assessment that measures 15-year-old students' reading, mathematics, and science literacy and, in 2012, general problem solving and financial literacy. PISA is coordinated by the Organization for Economic Cooperation and Development (OECD), an intergovernmental organization of industrialized countries. The PISA financial literacy assessment assessed students’ knowledge and understanding of fundamental elements of the financial world, including financial concepts, products, and risks, as well as their ability to apply what they know to real-life situations involving financial issues and decisions.
|NCES 2014028||Program for International Student Assessment (PISA) 2012 U.S. Public-use Data Files
The PISA 2012 U.S. public-use data files and documentation include the following: U.S. national PISA 2012 data in ASCII text format, including variables unique to the United States; SPSS and SAS control files for reading the data and producing SPSS and SAS system files; codebooks; illustrative code for merging student and school-level files; a Read Me file; and a Quick Guide.
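The student/school merge that the illustrative code performs is a left join keyed on a school identifier: each student record picks up the variables of its school. The actual files are fixed-width ASCII read via the provided SPSS/SAS control files, but the join logic can be sketched with tiny invented CSV extracts and hypothetical column names (`school_id`, `score`, and so on are placeholders, not the real PISA variable names).

```python
import csv
import io

# Hypothetical miniature extracts: one school-level file, one student-level file.
schools_csv = "school_id,school_type\nS1,public\nS2,private\n"
students_csv = "student_id,school_id,score\nA,S1,480\nB,S1,512\nC,S2,530\n"

# Index school records by ID, then attach each school's variables to every
# student record from that school (a left join keyed on school_id).
schools = {row["school_id"]: row
           for row in csv.DictReader(io.StringIO(schools_csv))}
merged = []
for stu in csv.DictReader(io.StringIO(students_csv)):
    rec = dict(stu)
    rec.update(schools[rec["school_id"]])
    merged.append(rec)

print(merged[0])  # student A's record now carries its school's variables
```

In practice one would use the distributed control files (or a statistical package's merge facilities) rather than hand-rolled code, but the keying logic is the same.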
Users of this data should also consult the PISA 2012 U.S. Technical Report and User Guide available for viewing and downloading at http://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2014025.
|NCES 2014055||Program for International Student Assessment (PISA) 2012 Massachusetts Restricted-use Data File
This CD-ROM contains PISA 2012 restricted-use data for Massachusetts, including variables unique to U.S. data collection. Massachusetts is one of three states to participate separately from the nation in 2012. The CD-ROM includes the complete MA data file, a codebook, and a cross-walk to assist in merging with other public datasets, such as the Common Core of Data (CCD) and Private School Survey (PSS). A restricted-use license must be obtained before access to the data is granted. Click on the restricted-use license link below for more details.
|NCES 2014056||Program for International Student Assessment (PISA) 2012 Connecticut Restricted-use Data File
This CD-ROM contains PISA 2012 restricted-use data for Connecticut, including variables unique to U.S. data collection. Connecticut is one of three states to participate separately from the nation in 2012. The CD-ROM includes the complete CT data file, a codebook, and a cross-walk to assist in merging with other public datasets, such as the Common Core of Data (CCD) and Private School Survey (PSS). A restricted-use license must be obtained before access to the data is granted. Click on the restricted-use license link below for more details.