Search Results: (1-15 of 56 records)
|NCES 2018305||The Feasibility of Collecting School-Level Finance Data: An Evaluation of Data from the School-Level Finance Survey (SLFS) School Year 2013–14
This Research and Development (R&D) report presents school-level finance data on expenditures by function from the School-Level Finance Survey (SLFS). The SLFS is an extension of two existing collections conducted by NCES in collaboration with the Census Bureau: the School District Finance Survey (F-33) and the state-level National Public Education Financial Survey (NPEFS). The SLFS is essentially an expansion of the F-33 to include some school-level variables. The SLFS pilot study was cleared to collect school-level finance data for school year (SY) 2013–14 from 12 state education agencies (SEAs). In the second year (SY 2014–15), the SLFS pilot was cleared to collect data from up to 20 SEAs, and NCES has recently obtained clearance to collect school-level finance data on a volunteer basis from all 50 states and the District of Columbia for SY 2015–16.
|NCES 2018120||An Evaluation of Data From the Teacher Compensation Survey: School Year 2007–08 Through 2009–10
This Research and Development report summarizes the results of the data collected through NCES’ Teacher Compensation Survey (TCS) for the 2007–08, 2008–09, and 2009–10 school years. It provides an overview of the survey methodology; comparisons of TCS data with other data sources; a discussion of the advantages and disadvantages of TCS data; and findings and descriptive statistics from the three years of data collected.
|NCES 2017056||Certification Status and Experience of U.S. Public School Teachers: Variations Across Student Subgroups
This report uses two available datasets to provide a snapshot of the extent to which U.S. public school students are taught by certified and experienced teachers. The Schools and Staffing Survey (SASS) provides a comprehensive picture, as it includes teachers of K–12 students in all subjects, and the National Assessment of Educational Progress (NAEP) provides a picture specific to grades 4 and 8. In addition, NAEP data are directly related to teachers of two key subjects: reading and mathematics. SASS data are available for the 2011–12 school year, and NAEP data are available for 2013 and 2015.
|NCES 2013336||Strategies for Longitudinal Analysis of the Career Paths of Beginning Teachers: Results From the First Through Fourth Waves of the 2007–08 Beginning Teacher Longitudinal Study
The purpose of this R&D report is to develop a strategy for the longitudinal analysis of the BTLS data that can be used to better understand teacher attrition, retention, and mobility. NCES may use this strategy to analyze and present data on all five years of the BTLS in future reports. The R&D report has three research objectives: (1) define the concept of a career path for beginning teachers that can be implemented with all waves of the BTLS; (2) operationalize the assignment of a career path using this definition (i.e., examine methods for assigning career paths); and (3) investigate the best approach for analyzing the relationships between beginning teachers’ career paths and selected teacher and school characteristics.
|NCES 2010329||An Evaluation of the Data from the Teacher Compensation Survey: School Year 2006–07
This report provides an overview of the Teacher Compensation Survey (TCS) data collection in 17 states for school year 2006–07. It also compares state administrative records with other data sources and discusses data availability and quality. The report discusses the uses of the data and the limitations and advantages of the TCS.
|NPEC 2010832||Suggestions for Improving the IPEDS Graduation Rate Survey Data Collection and Reporting
In 1990, Congress enacted the Student Right-to-Know (SRTK) Act, which requires colleges and universities to disclose the rate at which students complete academic programs at postsecondary education institutions. The National Center for Education Statistics (NCES) at the U.S. Department of Education developed the Graduation Rate Survey (GRS) to help institutions comply with the SRTK requirements. The purpose of this paper is to present recommendations for reducing the complexity and confusion of completing the GRS as well as improving the standardization of the data. The paper summarizes findings from two activities: deliberations of the NPEC GRS Working Group (with feedback from the full NPEC membership) and an analysis of graduation rate survey perceptions using entries in the Common Dataset listserv.
|NCES 2009482||Indirect County and State Estimates of the Percentage of Adults at the Lowest Literacy Level for 1992 and 2003
The 2003 National Assessment of Adult Literacy (NAAL) assessed the English literacy skills of a nationally representative sample of 18,500 U.S. adults (age 16 and older) residing in private households. NAAL is the first national assessment of adult literacy since the 1992 National Adult Literacy Survey (NALS). The NAAL and NALS produced direct estimates of Prose, Document, and Quantitative literacy, each reported on a 0 to 500 scale and on four performance levels based on this scale: Below Basic, Basic, Intermediate, and Proficient. This report describes the statistical methodology used to produce the model-dependent (indirect) estimates of the percentages of adults at the lowest literacy level for individual states and counties for 1992 and 2003. The county and state indirect estimates themselves are provided at the NAAL website http://nces.ed.gov/NAAL (the state indirect estimates are also provided in appendices to this report). The measure chosen for the indirect estimation is the percentage of adults lacking Basic prose literacy skills (BPLS). The literacy of adults who lack BPLS ranges from being unable to read and understand any written information in English to being able to locate easily identifiable information in short, commonplace prose text, but nothing more advanced. Adults who were not able to take the assessment because they could not communicate in English or Spanish (i.e., language-barrier cases) are included in the indirect estimates and classified as lacking BPLS, because they can be considered to be at the lowest level of English literacy.
A companion report published by ERIC conveys in nontechnical terms the statistical methodology used to develop the estimates. It also provides a profile of adults lacking Basic prose literacy, and a description of various potential users and usages of the findings.
|NCES 2009453||Measuring the Status and Change of NAEP State Inclusion Rates for Students with Disabilities
This report examines the relationship between various characteristics of students with disabilities (SD) and the probability that they would be included in the National Assessment of Educational Progress (NAEP) assessments. Characteristics examined included the type of disability, the severity level of the disability, and whether the student requires accommodations not permitted by NAEP. For various reasons, inclusion of SDs varies from state to state, and sometimes within states from year to year. Some students, for example, cannot participate meaningfully in the assessments due to the nature of their disabilities or because their Individualized Education Programs (IEPs) specify an accommodation that is not permitted in NAEP assessments. To address the concern that such fluctuations may affect the validity of reports on achievement trends, this report measures the status of, and changes in, NAEP state inclusion rates for students with disabilities.
|NCES 2008601||An Exploratory Analysis of the Content and Availability of State Administrative Data on Teacher Compensation
This report identifies state education agencies (SEAs) that maintain records on pay for public school teachers, examines the comparability of these records, and considers whether the data might be available to the research community. The report finds that many states maintain teacher-level records with earnings and other teacher characteristics and are willing to share these data with researchers. It is feasible to use teacher employment and compensation data collected by SEAs to conduct large multistate comparative studies of teacher pay. Such studies would permit not only overall comparisons of pay but also comparisons of teacher pay at various points along typical career trajectories, with breakdowns by teacher demographics and state or district characteristics.
|NCES 2008440||An Exploratory Evaluation of the Data from the Pilot Teacher Compensation Survey: School Year 2005–06
This brief publication contains a discussion of the experiences with this new data collection: the methodologies employed and the problems encountered in reviewing the data. Illustrative summary data from the research and development effort to collect individual salary and demographic data on public school teachers are included. Seven states participated in this effort: Arizona, Arkansas, Colorado, Florida, Iowa, Missouri, and Oklahoma. Examples of data from full-time public school teachers who teach at only one school are also included in the analysis. Median salaries and counts for different groupings by experience, age, race, and gender are presented.
|NCES 2008474||Comparison Between NAEP and State Reading Assessment Results: 2003
In late January through early March of 2003, the National Assessment of Educational Progress (NAEP) grade 4 and 8 reading and mathematics assessments were administered to representative samples of students in approximately 100 public schools in each state. The results of these assessments were announced in November 2003. Each state also carried out its own reading and mathematics assessments in the 2002–2003 school year, most including grades 4 and 8. This report addresses the question of whether the results published by NAEP are comparable to the results published by individual state testing programs. OBJECTIVES: The comparisons address four questions and are based purely on the results of testing; they do not compare the content of NAEP and state assessments: (1) How do states’ achievement standards compare with each other and with NAEP? (2) Are NAEP and state assessment results correlated across schools? (3) Do NAEP and state assessments agree on achievement trends over time? (4) Do NAEP and state assessments agree on achievement gaps between subgroups? On the first question, both NAEP and state education agencies have set achievement, or performance, standards for reading and have identified test score criteria for determining the percentages of students who meet the standards. Most states have multiple performance standards. These can be categorized into a primary standard, which, since the passage of No Child Left Behind, is generally the standard used for reporting adequate yearly progress (AYP), and standards that are above or below the primary standard. Most states refer to their primary standard as proficient or meets the standard.
By matching percentages of students reported to be meeting state standards in schools participating in NAEP with the distribution of performance of students in those schools on NAEP, cutpoints on the NAEP scale can be identified that are equivalent to the scores required to meet a state’s standards.
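The cutpoint identification described above amounts to an equipercentile mapping: if p percent of students meet the state standard, the equivalent NAEP cutpoint is the score that the same share of students reaches or exceeds. A minimal sketch using synthetic scores and a hypothetical `naep_equivalent_cutpoint` helper (neither is from the report, whose actual procedure works with school-level aggregates and sampling weights):

```python
import random

def naep_equivalent_cutpoint(naep_scores, pct_meeting_standard):
    """Return the NAEP score at or above which the same share of students
    falls as is reported to meet the state standard.

    If p% meet the state standard, the cutpoint is approximately the
    (100 - p)th percentile of the NAEP score distribution.
    """
    ordered = sorted(naep_scores)
    k = int(round((1 - pct_meeting_standard) * (len(ordered) - 1)))
    return ordered[k]

# Synthetic NAEP-like scores (0-500 scale) for illustration only.
random.seed(0)
scores = [random.gauss(238, 35) for _ in range(5000)]

# Suppose 62% of students in the NAEP-sampled schools met the state standard.
cut = naep_equivalent_cutpoint(scores, 0.62)
share_at_or_above = sum(s >= cut for s in scores) / len(scores)
```

By construction, about 62 percent of the synthetic scores fall at or above the identified cutpoint, which is exactly the equivalence the matching procedure exploits.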
|NCES 2008475||Comparison Between NAEP and State Mathematics Assessment Results: 2003
In late January through early March of 2003, the National Assessment of Educational Progress (NAEP) grade 4 and 8 reading and mathematics assessments were administered to representative samples of students in approximately 100 public schools in each state. The results of these assessments were announced in November 2003. Each state also carried out its own reading and mathematics assessments in the 2002–2003 school year, most including grades 4 and 8. This report addresses the question of whether the results published by NAEP are comparable to the results published by individual state testing programs. OBJECTIVES: The comparisons address four questions and are based purely on the results of testing; they do not compare the content of NAEP and state assessments: (1) How do states’ achievement standards compare with each other and with NAEP? (2) Are NAEP and state assessment results correlated across schools? (3) Do NAEP and state assessments agree on achievement trends over time? (4) Do NAEP and state assessments agree on achievement gaps between subgroups? On the first question, both NAEP and state education agencies have set achievement, or performance, standards for mathematics and have identified test score criteria for determining the percentages of students who meet the standards. Most states have multiple performance standards. These can be categorized into a primary standard, which, since the passage of No Child Left Behind, is generally the standard used for reporting adequate yearly progress (AYP), and standards that are above or below the primary standard. Most states refer to their primary standard as proficient or meets the standard.
By matching percentages of students reported to be meeting state standards in schools participating in NAEP with the distribution of performance of students in those schools on NAEP, cutpoints on the NAEP scale can be identified that are equivalent to the scores required to meet a state’s standards.
|NPEC 2008850||Deciding on Postsecondary Education
The report examined the data and information that potential students use and need in making decisions about postsecondary education. Special emphasis was given to underserved students (nontraditional-aged students, minority students, and students of low and moderate socioeconomic status) participating in the college search and decision-making process. Qualitative data were gathered and analyzed from 11 focus groups with 90 participants in eight states. Secondary data were collected via a review of over 80 sources in the research literature. The literature review indicated that parents, guidance counselors, mainstream media, college brochures, and institutions are primary sources of information about college. For each group of focus group participants, cost, major/program of study, and convenience/location were major determinants in the college search, application, and matriculation processes. Online, web-based resources are quickly gaining prominence among the current and recent high school graduates who participated in the focus groups. Findings from this research suggest the need for comprehensible information, additional resources, and improved assistance for prospective college students and their families.
|NCES 2006321||A Comparable Wage Approach to Geographic Cost Adjustment
In this report, NCES extends the analysis of comparable wages to the labor market level using a Comparable Wage Index (CWI). The basic premise of a CWI is that all types of workers—including teachers—demand higher wages in areas with a higher cost of living (e.g., San Diego) or a lack of amenities (e.g., Detroit, which has a particularly high crime rate) (Federal Bureau of Investigation 2003). This report develops a CWI by combining baseline estimates from the 2000 U.S. census with annual data from the Bureau of Labor Statistics (BLS). Combining the census with the Occupational Employment Statistics (OES) makes it possible to produce yearly CWI estimates for states and local labor markets for each year after 1997. OES data are available each May and permit the construction of an up-to-date, annual CWI. The CWI methodology offers many advantages over previous NCES geographic cost adjustment methodologies, including relative simplicity, timeliness, and intrastate variations in labor costs that are undeniably outside of school district control. However, the CWI is not designed to detect cost variations within labor markets; thus, all the school districts in the Washington, DC metro area would have the same CWI cost index. Furthermore, as with other geographic cost indices, the CWI methodology does not address possible differences in the level of wages between college graduates outside the education sector and education sector employees. Nor does the report explore the use of these geographic cost adjustments as inflation adjustments (deflators). These could be areas for fruitful new research on cost adjustments by NCES.
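The premise behind a CWI can be illustrated with a toy index: the ratio of local to national wages for comparable non-teacher workers, with a baseline year updated by annual wage growth. This is a minimal sketch with made-up wage and growth figures; the report's actual CWI is estimated from census and OES microdata with statistical controls, not simple ratios:

```python
def comparable_wage_index(local_wage, national_wage):
    """Ratio of local to national wages for comparable (non-teacher,
    college-educated) workers; 1.0 means labor costs at the national average."""
    return local_wage / national_wage

# Hypothetical baseline-year wages for comparable workers (not report data).
baseline = {"San Diego": 52_000, "Detroit": 48_500}
national_baseline = 45_000

# Hypothetical one-year wage growth factors, as an annual OES-style update.
growth = {"San Diego": 1.03, "Detroit": 1.01}
national_growth = 1.02

# Update each area's baseline wage, then index it against the updated
# national wage to obtain a current-year CWI value.
cwi = {
    area: comparable_wage_index(wage * growth[area],
                                national_baseline * national_growth)
    for area, wage in baseline.items()
}
```

With these made-up figures, both areas index above 1.0 (labor costs above the national average), and the higher-cost area carries the larger index, which is the pattern a CWI-based funding adjustment is meant to capture.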
|NCES 2006031||Teacher Qualifications, Instructional Practices, and Reading and Mathematics Gains of Kindergartners
This Research and Development (R&D) report uses data from the Early Childhood Longitudinal Study, Kindergarten Class of 1998–99 (ECLS-K) to explore relationships between kindergarten teachers' reports of their qualifications and instructional practices and direct assessments of children's reading and mathematics achievement during the kindergarten year. Using hierarchical linear modeling (HLM), the study estimated the degree to which specific aspects of teacher training (the teaching credential and coursework in pedagogy) and teaching experience were associated with student achievement. In addition, the study identified teacher-reported instructional practices associated with student achievement gains and examined the qualifications of teachers and aspects of teacher training that were related to the use of these practices. Spending more time on a subject and working within a full-day kindergarten structure were found to be associated with relatively large gains in achievement. Also, certain teacher background variables—particularly the self-reported amount of coursework in methods of teaching reading and mathematics—were positively related to the teacher-reported frequency of various instructional practices that were, in turn, associated with higher achievement.