Concurrent Session XI Presentations

Friday, July 19, 2013
10:15 am – 11:15 am

XI–A: Michigan’s State Longitudinal Data System (SLDS)—Process Implementation—It’s Really Not Micromanagement

Fawn Dunbar and Carol Jones, Michigan Center for Educational Performance and Information

    Michigan’s Center for Educational Performance and Information (CEPI) has implemented a State Longitudinal Data System (SLDS). In this session, presenters will share details about the process employed to ensure the most efficient, effective, and timely production of EDFacts reports. They will walk through the production process from beginning to end and offer key lessons learned and pitfalls to avoid. Process elements to be discussed include agile board use, business rules documents, the use of Team Foundation Server (TFS) tickets for tracking tasks and hand-offs, and configuration management for tracking changes to code or business rules.

XI–B: Statewide Longitudinal Data System (SLDS) and Data Quality Auditing

John Brandt and Sarah Wald, Utah State Office of Education
Breanne Humphries, Utah System of Higher Education
Andrew Mingl, Utah College of Applied Technology

    Many statewide longitudinal data system (SLDS) initiatives find that data quality assurance becomes a time-consuming activity that can prove to be a significant roadblock to project completion. In this session, partners of the Utah Data Alliance will share their experiences and lessons learned about data quality assurance over the past year as they have populated the Alliance’s P–20W warehouse. Topics to be addressed include creating awareness of the time and resources necessary for a good data quality audit, formulating and executing a data quality audit plan, identifying and resolving discrepancies between source and research-ready data, and determining when the data are of sufficient quality.

XI–C: Data Quality in a District-Created Assessment System

Margaret Bailey and Brandan Keaveny, Syracuse City School District (New York)

    As part of the New York State Annual Professional Performance Review, the Syracuse City School District embarked on a collaborative process to develop district-created assessments to be used to measure student growth. These pre- and post-tests are the key components to student learning objectives (SLOs). These SLOs result in a growth score that represents 20 percent of a teacher’s performance rating. Participants in this session will learn about the SLO journey and the lessons learned along the way. Further discussion will include plans to improve assessment administration and data quality in the coming year.

XI–D: Come and Get It? You Need More Than Data to Share It

Melissa Beard, Washington State Office of Financial Management

    Having data in a warehouse is nice, but data requestors need more than the data in one place in order to request and use it. Come learn about the Education Research and Data Center’s (ERDC) request process and see the materials ERDC has produced to help people request and use data. Also be prepared to share ideas you have implemented that help make education data accessible to others.

XI–E: Determining Student Growth Without Regression

Neal Gibson, Arkansas Research Center
Eric Hedberg, NORC at the University of Chicago

    The Arkansas Research Center is developing an alternative method for calculating student growth using an ordinal ranking of students. This approach is easier for educators to understand because it does not require regression. The results of these rankings behave nearly identically to those of the Student Growth Percentile (SGP) model, correlating at .99. To ensure methodological rigor, Arkansas has partnered with NORC at the University of Chicago, which performed an evaluation of the Pilot Growth Model Program for the analysis. This presentation will be an overview of this methodology and its application in determining teacher growth.

XI–F: Race to the Top Assessment: Consortium Progress and Interoperability

Jessica McKinney, U.S. Department of Education

    Since receiving U.S. Department of Education grants in 2010, two consortia of states, representing 44 states and the District of Columbia, have been developing next-generation student assessment systems. Their work has the potential to dramatically transform the student achievement data landscape. The consortia must also, as part of the terms of the grant, develop all assessment items and produce student-level data consistent with industry-recognized, open-licensed technology interoperability standards. The aim of this session is to provide a brief introduction to the program and progress to date, including work on technology interoperability standards. This session is a follow-up to the CEDS-AIF session held at the STATS-DC 2012 conference, providing an update on the CEDS-AIF Version 1.0 release in January 2013.

XI–G: National Perspective on Coordinated Early Childhood Statewide Longitudinal Data Systems (SLDS)—Lessons Learned

Missy Cochenour, AEM Corporation
Kathy Hebbler, SRI International
Carlise King, Child Trends
Meredith Miceli, U.S. Department of Education

    This session will provide an overview of where states are in creating early childhood data systems and linking them to other sectors through the statewide longitudinal data system (SLDS), leveraging the work of multiple national groups such as DaSy, the Early Childhood Data Collaborative (ECDC), and SLDS. Together, these federal technical assistance efforts on data systems show the full national picture and where the field is headed in the coming year. Participants will have an opportunity to engage around their current work and challenges, expand into early childhood, and learn more from the national surveys.

XI–H: Districtwide Analytics (CANCELLED)

Paul Velit and Girish Rajput, Arlington Public Schools (Virginia)
Steven Pummill, Worldgate, LLC

    Arlington County Public Schools (APS) in Virginia has implemented DataBlocs K12 Analytics to serve as the district’s dashboard, analytic, and reporting tool. APS uses this tool to enable administration, principals, and teachers to make data-driven decisions with a high level of data confidence. The district’s tool is built on a robust K–12 data model, and APS has made available more than 50 K–12 dashboards and reports along with industry-standard key performance indicators (KPIs) to support district stakeholders. This session will discuss how the district’s solution is empowering stakeholders at all levels to access information easily and securely via the Web and has alleviated the burden placed on the IT organization to provide data reports.

XI–I: Postsecondary Success and Outcomes of High School Graduates: A Colorado Story

Beth Bean and Brenda Bautsch, Colorado Department of Higher Education

    As the lines between high school and higher education continue to blur, Colorado is emerging as a pacesetter in understanding this educational intersection and the importance of data in connecting these communities. This session will address how Colorado tracks high school graduates into college and how it feeds outcome and performance information back to districts. The presenters will also discuss the outcomes of linked research around dual enrollment, remedial education, and workforce outcomes.

XI–J: From Start to Finish: Using an Early Warning Indicators Approach to Identify Dropouts as Early as First Grade

Thomas West, Montgomery County Public Schools (Maryland)

    This presentation will demonstrate how an early warning indicators approach can be used to identify dropouts across elementary, middle, and high school grades. Using data for two cohorts of first-time ninth-grade students, this study compares attendance, suspension, reading and mathematics ability, and grade-point average cut-points (the ABCs) of eventual dropouts and nondropouts as well as the effect of each cut-point on students’ odds of later dropping out of high school. Results and limitations will be discussed from the standpoint of a school district.