Search Results: (16-30 of 111 records)
Pub Number | Title | Date |
---|---|---|
NCES 2011607 | National Institute of Statistical Sciences Configuration and Data Integration Technical Panel: Final Report
NCES asked the National Institute of Statistical Sciences (NISS) to convene a technical panel of survey and policy experts to examine potential strategies for configuration and data integration among successive national longitudinal education surveys. In particular, the technical panel was asked to address two related issues: (1) how NCES could configure the timing of its longitudinal studies (e.g., Early Childhood Longitudinal Study [ECLS], Education Longitudinal Study [ELS], and High School Longitudinal Study [HSLS]) in a maximally efficient and informative manner, with the main, but not sole, focus at the primary and secondary levels; and (2) what NCES could do to support data integration for statistical and policy analyses that cross breakpoints between longitudinal studies. The NISS technical panel delivered its report to NCES in 2009. The principal recommendations in the report are:
1. The technical panel recommended that NCES configure its K-12 studies as a series of three studies: (i) a K-5 study, followed immediately by (ii) a 6-8 study, followed immediately by (iii) a 9-12 study. One round of such studies, ignoring postsecondary follow-up to the 9-12 study, requires 13 years to complete.
2. The technical panel also recommended that, budget permitting, NCES initiate a new round of K-12 studies every 10 years. This can be done in a way that minimizes the number of years in which multiple major assessments occur.
The panel found that there is no universal strategy by which NCES can institutionalize data integration across studies. One strategy was examined in detail: continuation of students from one study to the next. Based on experiments conducted by NISS, the technical panel found that:
3. The case for continuation on the grounds that it supports cross-study statistical inference is weak. Use of high-quality retrospective data that are either currently available or are likely to be available in the future can accomplish nearly as much at lower cost.
4. Continuation is problematic in at least two other senses: first, principled methods for constructing weights may not exist; and second, no matter how much NCES might advise to the contrary, researchers are likely to attempt what is likely to be invalid or uninformative inference on the basis of continuation cases alone.
5. The technical panel urged that, as an alternative means of addressing specific issues that cross studies, NCES consider the expense and benefit of small, targeted studies that address specific components of students' trajectories. |
3/28/2011 |
NCES 2011608 | National Institute of Statistical Sciences Data Confidentiality Technical Panel: Final Report
NCES asked the National Institute of Statistical Sciences (NISS) to convene a technical panel of survey and policy experts to examine NCES's current and planned data dissemination strategies for confidential data with respect to: mandates and directives that NCES make data available; current and prospective technologies for protecting and accessing confidential data, as well as for breaking confidentiality; and the various user communities for NCES data and these communities' uses of the data. The principal goals of the technical panel were to review NCES's current and planned data dissemination strategies for confidential data, assessing whether these strategies are appropriate in terms of both disclosure risk and data utility, and then to recommend to NCES any changes that the panel deemed desirable or necessary. The NISS technical panel delivered its report to NCES in 2008. The report included four principal recommendations, the first three of which confirmed existing NCES strategies and practices. |
3/10/2011 |
NCES 2011603 | Statistical Methods for Protecting Personally Identifiable Information in Aggregate Reporting
This Statewide Longitudinal Data Systems (SLDS) Technical Brief examines what protecting student privacy means in a reporting context. To protect a student's privacy, the student's personally identifiable information must be protected from public release. When schools, districts, or states publish reports on students' educational progress, they typically release aggregated data (data for groups of students) to prevent disclosure of information about an individual. However, even with aggregation, unintended disclosures of personally identifiable information may occur. Current reporting practices are described, and each is accompanied by an example table that is used to consider whether the intended protections are successful. The Brief also illustrates that some practices work better than others in protecting against disclosures of personally identifiable information about individual students. Each data protection practice requires some loss of information. The challenge rests in identifying practices that protect information about individual students while minimizing the negative impact on the utility of the publicly reported data. Drawing upon the review and analysis of current practices, the Brief concludes with a set of recommended reporting rules that can be applied in reports of percentages and rates used to describe student outcomes to the public. These reporting rules are intended to maximize the amount of detail that can be safely reported without allowing disclosures from student outcome measures that are based on small numbers of students. NCES welcomes comments on the recommended reporting rules. A hedged sketch of a generic small-cell suppression rule follows this entry. |
12/21/2010 |
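The sketch below is a minimal, illustrative example of the kind of small-cell protection discussed above. The threshold, column names, and suppression logic are assumptions for illustration only, not the Brief's recommended reporting rules.

```python
# Illustrative only: blank out reported counts and percentages for any subgroup
# whose student count falls below an assumed minimum cell size.
import pandas as pd

MIN_CELL = 10  # assumed reporting threshold, not an NCES-prescribed value

def suppress_small_cells(df, count_col="n_students", pct_col="pct_proficient"):
    """Blank counts and percentages when the underlying count is below MIN_CELL."""
    out = df.copy()
    too_small = out[count_col] < MIN_CELL
    for col in (count_col, pct_col):
        out[col] = out[col].where(~too_small)  # NaN where suppressed
    out["suppressed"] = too_small
    # Note: primary suppression alone may still allow a suppressed value to be
    # recovered from row totals; the Brief discusses exactly this kind of risk.
    return out

report = pd.DataFrame({
    "subgroup": ["All students", "Group A", "Group B"],
    "n_students": [250, 8, 242],
    "pct_proficient": [72.0, 62.5, 72.3],
})
print(suppress_small_cells(report))
```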
NCSER 20103006 | Statistical Power Analysis in Education Research
This paper provides a guide to calculating statistical power for the complex multilevel designs that are used in most field studies in education research. For multilevel evaluation studies in the field of education, it is important to account for the impact of clustering on the standard errors of estimates of treatment effects. Using ideas from survey research, the paper explains how sample design induces random variation in the quantities observed in a randomized experiment, and how this random variation relates to statistical power. The paper illustrates how statistical power depends on the values of intraclass correlations, sample sizes at the various levels, the standardized average treatment effect (effect size), the multiple correlation between covariates and the outcome at different levels, and the heterogeneity of treatment effects across sampling units. Both hierarchical and randomized block designs are considered. The paper demonstrates that statistical power in complex designs involving clustered sampling can be computed simply from standard power tables using the idea of operational effect sizes: effect sizes multiplied by a design effect that depends on features of the complex experimental design. These concepts are applied to provide methods for computing power for each of the research designs most frequently used in education research. A hedged numerical sketch of the operational-effect-size idea follows this entry. |
4/27/2010 |
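As a hedged illustration of the operational-effect-size idea described above, the sketch below uses common two-level cluster-randomization formulas; the function names and the normal-approximation shortcut are assumptions for illustration, not the paper's exact derivations.

```python
# Hedged illustration (not the report's exact formulas): operational effect size
# for a two-level cluster-randomized design, following the usual design-effect idea.
from scipy.stats import norm

def operational_effect_size(delta, n, icc):
    """Standardized effect on cluster means: delta * sqrt(n / (1 + (n - 1) * icc))."""
    return delta * (n / (1.0 + (n - 1) * icc)) ** 0.5

def approx_power(delta, n_per_cluster, clusters_per_arm, icc, alpha=0.05):
    """Normal-approximation power for a two-arm comparison of cluster means."""
    d_op = operational_effect_size(delta, n_per_cluster, icc)
    ncp = d_op * (clusters_per_arm / 2.0) ** 0.5   # noncentrality for a two-sample test
    z_crit = norm.ppf(1.0 - alpha / 2.0)
    return norm.cdf(ncp - z_crit)

# Example: effect size 0.25, 20 students per cluster, 30 clusters per arm, ICC 0.15
print(round(approx_power(0.25, 20, 30, 0.15), 2))
```

Once the operational effect size is in hand, the same number can be looked up in a standard two-sample power table with the number of clusters per arm treated as the sample size.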
NCES 201002 | 2008/09 Baccalaureate and Beyond Longitudinal Study (B&B:08/09) Field Test Methodology Report - Working Paper Series
This report describes the methodology and findings of the field test for the 2008/2009 Baccalaureate and Beyond Longitudinal Study (B&B:08/09). Students in this cohort, who completed their bachelor's degree requirements during the 2007-08 academic year, were first interviewed as part of the 2008 National Postsecondary Student Aid Study (NPSAS:08) field test; B&B:08/09 is the first follow-up of this cohort. The B&B:08/09 field test was used to plan, implement, and evaluate methodological procedures, instruments, and systems proposed for use in the full-scale study scheduled for the year following graduation from a bachelor's degree program. The report describes the sampling design and methodologies used in the field test. It also describes data collection outcomes, including response rates, interview burden, and results of incentive, mailing, and prompting experiments. In addition, the report provides details on the evaluation of data quality for reliability of responses, item nonresponse, and question delivery and data entry error. Recommendations for the full-scale study are provided for the sampling design, locating and tracing procedures, interviewer training, data collection, and instrumentation. |
3/4/2010 |
NCEE 20090049 | What to Do When Data Are Missing in Group Randomized Controlled Trials
This NCEE Technical Methods report examines how to address the problem of missing data in the analysis of Randomized Controlled Trials (RCTs) of educational interventions, with a particular focus on the common educational situation in which groups of students, such as entire classrooms or schools, are randomized. Missing outcome data are a problem for two reasons: (1) the loss of sample members can reduce the power to detect statistically significant differences, and (2) the introduction of non-random differences between the treatment and control groups can lead to bias in the estimate of the intervention's effect. The report reviews a selection of methods available for addressing missing data and then examines their relative performance using extensive simulations that varied a typical educational RCT on three dimensions: (1) the amount of missing data; (2) the level at which data are missing, whether for whole schools (the assumed unit of randomization) or for students within schools; and (3) the underlying missing data mechanism. The performance of the different methods is assessed in terms of bias in both the estimated impact and the associated standard error. A hedged simulation sketch of this kind of comparison follows this entry. |
10/13/2009 |
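The sketch below is an illustrative simulation in the spirit of the comparison described above; the sample sizes, intraclass correlation, missingness mechanism, and the complete-case estimator are assumptions for illustration, not the report's actual simulation design or recommended methods.

```python
# Hedged sketch (illustrative only): generate a clustered RCT, impose missingness
# at the student level, and compare the complete-case impact estimate with the
# full-data estimate.
import numpy as np

rng = np.random.default_rng(0)
n_schools, n_students, true_impact, icc = 40, 30, 0.20, 0.15

school_effect = rng.normal(0, np.sqrt(icc), n_schools)
treated = np.repeat(rng.permutation([0, 1] * (n_schools // 2)), n_students)
school_id = np.repeat(np.arange(n_schools), n_students)
y = (true_impact * treated + school_effect[school_id]
     + rng.normal(0, np.sqrt(1 - icc), n_schools * n_students))

# Missingness that depends on the outcome itself (a simple MNAR-like mechanism):
# lower-scoring students are more likely to have missing outcomes.
p_missing = 1.0 / (1.0 + np.exp(2.0 + 1.5 * y))
observed = rng.random(y.size) > p_missing

full = y[treated == 1].mean() - y[treated == 0].mean()
cc = (y[(treated == 1) & observed].mean()
      - y[(treated == 0) & observed].mean())
print(f"full-data impact {full:.3f}, complete-case impact {cc:.3f}")
```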
NCES 2009310 | 2005-06 Private School Universe Survey (PSS) Data File User's Manual and Survey Documentation
This is the data file user's manual and survey documentation for the 2005-06 Private School Universe Survey (PSS). |
3/18/2009 |
NCES 2009001 | Highlights From TIMSS 2007: Mathematics and Science Achievement of U.S. Fourth- and Eighth-Grade Students in an International Context
The Trends in International Mathematics and Science Study (TIMSS) 2007 is the fourth administration of this international comparison, which was first conducted in 1995. TIMSS is used to compare the mathematics and science knowledge and skills of fourth- and eighth-graders over time. TIMSS is designed to align broadly with mathematics and science curricula in the participating countries. The results, therefore, suggest the degree to which students have learned mathematics and science concepts and skills likely to have been taught in school. In 2007, 58 countries and educational jurisdictions participated in TIMSS at the fourth- or eighth-grade level, or both. The report focuses on the performance of U.S. students relative to their peers in other countries in 2007 and on changes in mathematics and science achievement since 1995. For a number of participating countries, changes in achievement can be documented over the 12 years from 1995 to 2007. The report also describes achievement within the United States in additional detail, such as trends in the achievement of students by sex, race/ethnicity, and enrollment in schools with different levels of poverty. In addition to numerical scale results, TIMSS includes international benchmarks. The TIMSS international benchmarks provide a way to interpret the scale scores by describing the types of knowledge and skills students demonstrate at different levels along the TIMSS scale. |
12/9/2008 |
NCES 2008805 | Handbooks Online - Version 6.0
Handbooks Online - Version 6.0 is a searchable web tool that provides access to the NCES Data Handbooks for elementary, secondary, and early childhood education. These Handbooks offer guidance on consistency in data definitions and in maintaining data so that they can be accurately aggregated and analyzed. The online Handbook database provides the Nonfiscal Handbooks in a searchable web tool. This database includes data elements for students, staff, classrooms, and education institutions. |
10/1/2008 |
NCES 2005372 | Handbooks Online - Version 5.0
Handbooks Online - Version 5.0 is a searchable web tool that provides access to the NCES Data Handbooks for elementary, secondary, and early childhood education. These Handbooks offer guidance on consistency in data definitions and in maintaining data so that they can be accurately aggregated and analyzed. The updated database includes data elements for students, staff, and education institutions; added data elements for food service; and a link to the current NCES Accounting Handbook. |
10/15/2007 |
NCES 2007397 | Data Files: NCES Comparable Wage Index
The Comparable Wage Index (CWI) is a measure of the systematic, regional variations in the salaries of college graduates who are not educators. It can be used by researchers to adjust district-level finance data in order to make better comparisons across geographic areas. The CWI was developed by Dr. Lori L. Taylor at the Bush School of Government and Public Service, Texas A&M University. This documentation describes four geographic levels of the CWI, which are presented in four separate files: a school district file, a labor market file, a state file, and a combined regional and national file. The school district file provides a CWI for each local education agency (LEA) in the NCES Common Core of Data (CCD) database. For each LEA there is a series of indexes for the years 1997-2005. The file can be merged with school district finance data, and the merged file can be used to produce finance data adjusted for geographic cost differences (a hedged sketch of such a merge follows this entry). This file also includes four agency typology variables. The additional files allow for similar geographic cost adjustments for larger geographic areas. |
9/4/2007 |
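A minimal sketch of the merge described above, under assumed file names and column names (LEAID, CWI2005, TOTALEXP, ENROLL); the actual CWI and CCD finance files use their own variable names and layouts.

```python
# Hedged sketch: merge a district-level CWI file with CCD finance data and express
# spending in cost-adjusted dollars by dividing by the local wage index.
import pandas as pd

cwi = pd.read_csv("cwi_district.csv")      # assumed export of the district-level CWI file
finance = pd.read_csv("ccd_finance.csv")   # assumed district finance file keyed by LEAID

merged = finance.merge(cwi[["LEAID", "CWI2005"]], on="LEAID", how="inner")

# Cost-adjusted per-pupil expenditure: nominal spending deflated by the local index.
merged["adj_pp_exp"] = merged["TOTALEXP"] / merged["ENROLL"] / merged["CWI2005"]
print(merged[["LEAID", "adj_pp_exp"]].head())
```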
IES 20076004 | IES 2007 Biennial Report to Congress
The Institute of Education Sciences has issued the Director's Biennial Report to Congress, covering activities and accomplishments of the Institute in 2005 and 2006. Transmitted by Director Grover J. (Russ) Whitehurst as required by the Education Sciences Reform Act of 2002, the report includes a description of the activities of IES and its four National Education Centers, as well as a summary of all IES grants and contracts during the biennium in excess of $100,000. Since IES's first Biennial Report two years ago, said Whitehurst, "IES has been transformed from an organization under construction to one that is fully formed and operational." |
5/13/2007 |
NCES 96860REV | USER'S MANUAL: Restricted-Use Data Procedures
The Restricted-Use Data Procedures Manual describes the laws, licensing procedures, security procedures, and on-site inspections that apply to restricted-use data. |
3/30/2007 |
NCES 2006865 | Documentation for the NCES Comparable Wage Index Files
The Comparable Wage Index (CWI) is a measure of the systematic, regional variations in the salaries of college graduates who are not educators. It can be used by researchers to adjust district-level finance data in order to make better comparisons across geographic areas. The CWI was developed by Dr. Lori L. Taylor at the Bush School of Government and Public Service, Texas A&M University, and William J. Fowler, Jr. at NCES. Dr. Taylor's research was supported by a contract with the National Center for Education Statistics. The complete description of the research is provided in the NCES Research and Development report A Comparable Wage Approach to Geographic Cost Adjustment (NCES 2006-321). This documentation describes four geographic levels of the CWI, which are presented in four separate files: a school district file, a labor market file, a state file, and a combined regional and national file. The school district file provides a CWI for each local education agency (LEA) in the NCES Common Core of Data (CCD) database. For each LEA there is a series of indexes for the years 1997-2004. The file can be merged with school district finance data, and the merged file can be used to produce finance data adjusted for geographic cost differences. This file also includes four agency typology variables. The additional files allow for similar geographic cost adjustments for larger geographic areas. NCES has sponsored the development of other geographic adjustment indexes in the past; the latest was for the 1993-94 school year. |
6/15/2006 |
NCES 2006321 | A Comparable Wage Approach to Geographic Cost Adjustment
In this report, NCES extends the analysis of comparable wages to the labor market level using a Comparable Wage Index (CWI). The basic premise of a CWI is that all types of workers, including teachers, demand higher wages in areas with a higher cost of living (e.g., San Diego) or a lack of amenities (e.g., Detroit, which has a particularly high crime rate) (Federal Bureau of Investigation 2003). This report develops a CWI by combining baseline estimates from the 2000 U.S. Census with annual data from the Bureau of Labor Statistics (BLS). Combining the Census with the Occupational Employment Statistics (OES) makes it possible to produce yearly CWI estimates for states and local labor markets for each year after 1997 (a hedged sketch of this update step follows this entry). OES data are available each May and permit the construction of an up-to-date, annual CWI. The CWI methodology offers many advantages over previous NCES geographic cost adjustment methodologies, including relative simplicity, timeliness, and the ability to capture intrastate variations in labor costs that are undeniably outside of school district control. However, the CWI is not designed to detect cost variations within labor markets. Thus, all the school districts in the Washington, DC metro area would have the same CWI cost index. Furthermore, as with other geographic cost indices, the CWI methodology does not address possible differences in the level of wages between college graduates outside the education sector and education sector employees. Nor does the report explore the use of these geographic cost adjustments as inflation adjustments (deflators). These could be areas for fruitful new research on cost adjustments by NCES. |
5/4/2006 |
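One heavily hedged reading of the update step described above, shown only to convey the general idea of carrying a Census-based baseline forward with OES wage growth; the notation is assumed for illustration and is not the report's estimator.

```latex
% Illustration only (assumed notation): a Census-based baseline for labor market m,
% carried forward to year t with the growth in average OES wages for that market.
\mathrm{CWI}_{m,t} \;=\; \mathrm{CWI}_{m,2000} \times
  \frac{\bar{w}^{\,\mathrm{OES}}_{m,t}}{\bar{w}^{\,\mathrm{OES}}_{m,2000}}
```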
Page 2 of 8