Concurrent Session IX Presentations
Thursday, July 18, 2013
4:15 pm – 5:15 pm
IX–A: Who Moved My EDEN Queries: How to Make the Change From Manual Processes
Joseph Cowan, Pennsylvania Department of Education
Since 2006, the Pennsylvania Department of Education (PDE) has been collecting data into its
Pennsylvania Information Management System (PIMS) for the purpose of accountability reporting.
Over the past two years, PDE has teamed with eScholar to use the existing data being collected to
simplify and automate EDEN/EDFacts reporting. This session will cover the technologies used and
the processes enacted to make the project successful.
IX–B: Fiscal Coordinators’ Round Table Discussion (Part 2)
Glenda Rader, Michigan Department of Education
Susan Barkley, Kentucky Department of Education
Stephen Cornman, National Center for Education Statistics
Once a year we submit school district financial data to the National Public Education Financial
Survey (NPEFS) and the Survey of Local Government Finances: School Systems (F-33). As you complete
these surveys, you may encounter odd little revenue and expenditure items and wonder, “Am I
reporting this correctly?” Here is your opportunity to discuss various financial reporting dilemmas
with your colleagues in other states. Bring your questions and answers and be prepared to discuss
issues like these: where to code revenue and expenditure categories on the F-33 and NPEFS;
when to record various facility acquisition costs as capital vs. contracted services; where to code
expenditures incurred by one district but paid on behalf of students in another district without
distorting the per-pupil amounts; how to account for Indirect Cost Recovery without distorting
actual expenditures; how to record sub-grantee revenue and expenditures so they do not distort
the individual or statewide reports; how to record Charter School Operations; how to record
Post-Employment Benefit costs under new GASB pronouncements; when to consider an activity a
district rather than an agency/student activity; and how various states check district data quality
before submitting to the NPEFS and F-33.
IX–C: Collect Once and Use Twice
Sharon Gaston, Texas Education Agency
Shawn Bay, eScholar LLC
Alan Hartwig, Deloitte Consulting LLP
In 2013, the Texas Education Agency (TEA) will begin the statewide deployment of the Texas Student
Data System (TSDS) to more than 1,200 independent school districts that serve nearly 5 million
students. TSDS will serve dual purposes. It will support TEA’s mandated state reporting and
replace a 25-year-old legacy system; at the same time, it will deliver the StudentGPS Dashboards
to teachers and administrators at local education agencies (LEAs) throughout the state. TSDS will
also serve as the primary data collection mechanism for future TEA data collections. This
presentation will examine the TSDS architecture, how TSDS supports current data requirements,
and plans for supporting future ones.
IX–D: Illinois State Board of Education (ISBE) and Bloomington District 87: Vision of Real-Time Data Collection and Validation
Jim Peterson, Bloomington Public Schools District 87 (Illinois)
Brandon Williams, Illinois State Board of Education
Gay Sherman and Aziz Elia, CPSI, Ltd.
The Illinois State Board of Education (ISBE) and Illinois Interactive Report Card (IIRC) are piloting
the use of real-time data collection and validation toolsets as a way to gather data from 40 school
districts in Illinois. The objective is to allow educators access to data, resources, and tools that will
enhance student performance. The project incorporates real-time Extract, Transform, and Load
(ETL) and validation options to provide data to a central, cloud-based data store available for
Illinois school districts, including data validation and correction, error-reporting services, and a set
of analytical tools to allow interoperability between student data, assessments, and other data
related to student achievement and learning. Bloomington District 87 will present its vision of
the real-time architecture, how it fits in with its current schools interoperability framework (SIF)
deployment, and the potential impact this project has on its students and educators. In addition,
Bloomington District 87 will discuss its involvement with the inBloom initiative and how it passes
data to inBloom through its underlying data center infrastructure IaaS/SaaS pilot called IlliniCloud.
IX–E: Common Education Data Standards (CEDS) 101: Tools and Use
Beth Young, Quality Information Partners, Inc.
Jim Campbell and Nancy Copa, AEM Corporation
This introductory session will familiarize users with the Common Education Data Standards (CEDS).
It will describe why CEDS is needed, what the parts of CEDS are, and how CEDS can be used. The
session will also include a demonstration of both CEDS tools: Align and Connect.
IX–F: Kentucky’s Continuous Instructional Improvement System
Maritta Horne, Kentucky Department of Education
Amy King, Pearson
Kentucky’s Continuous Instructional Improvement Technology System (CIITS) connects standards,
electronically stored instructional resources, curriculum, formative assessments, instruction,
professional learning, and evaluation of teachers and principals in one place. This session will
address how CIITS improves instructional outcomes, teacher effectiveness, and leadership.
IX–G: An Early Warning System in the Yonkers Public Schools
David Weinberger and Shanit Halperin, Yonkers Public Schools (New York)
The Yonkers Public Schools is establishing an Early Warning System to identify students at risk of
not graduating from high school. The system is managed at the district level and is based upon
district-specific indicators using local data to provide relevant and replicable information to its
schools. This session will present the components of the system as well as the challenges of
delivering data-intensive information to school staff and organizing data use at the district
level.
IX–H: Data Lifecycle—Success Strategies From Washington State
Jason Alvarado and Emily Rang, Washington State Office of Superintendent of Public Instruction
With the recent public release of our Statewide Longitudinal Data System (SLDS), data transparency
and data quality have driven better state education agency (SEA) data lifecycle processes.
Washington State has more than four unique student information system vendors supporting 296
local education agencies (LEAs) statewide. In this session, Washington State will explain its data
lifecycle process, from collection and verification to master-data management and beyond. The
presenters will share strategies, technologies, and lessons learned that have proven successful.
Future goals will also be discussed.
IX–I: Data Use for Early Childhood
Missy Cochenour, AEM Corporation
Jaci Holmes, Maine Department of Education
Phil Koshkin, Maryland State Department of Education
Kathryn Tout, Child Trends
This interactive session will highlight how a statewide longitudinal data system (SLDS) that
includes early childhood data can be used to address key issues in early childhood education,
including kindergarten entry assessments, quality rating and improvement systems, child outcomes
data, and policy questions. Common Education Data Standards (CEDS) will also be highlighted as a
tool that can support the work of addressing these challenges using data.
IX–J: Major Edit and Imputation Methods Employed in the Processing of Common Core of Data (CCD)
Robert Stillwell, National Center for Education Statistics
Beth Goldberg and Jeff Little, U.S. Census Bureau
The NCES Common Core of Data (CCD) has employed an array of cleaning techniques in the
collection and processing of administrative data. These techniques include data validation checks,
editing procedures, and imputation methods. The presenters in this session will explore the
various techniques used to process the current CCD data, provide some metrics regarding the
efficacy and burden of those procedures, and discuss the ongoing process of improving these
techniques to improve data quality and timeliness and to reduce overall program burden from
respondent to end user.