Concurrent Session VIII Presentations
Thursday, July 28, 2011
2:45 pm – 3:45 pm
VIII–A: Written—Taught—Tested: How Kansas Is Completing the Cycle of Instruction
Kathy Gosa and Tom Foster, Kansas State Department of Education
This presentation will provide an overview of a project that bridges the gap between
information systems and instructional practice. The Unified Standards Multiple Resource System (USMRS) is a tool that will
dynamically link the Kansas Common Core Standards and other standards with applications that include student achievement,
demographic, and program participation data. The Kansas State Department of Education's (KSDE) libraries of
instructional resources will then be linked to the USMRS to allow real-time, programmatic suggestion of resources based on
student performance data. By creating an opportunity for field-based educators to interact with the written, taught, and tested
components of our data systems, we encourage a focus on data-based instructional decisions. We hope that the educator
interaction facilitated by the USMRS tool will provide additional depth of support to users and promote the continual
development and sharing of high-quality instructional resources.
VIII–B: District’s Longitudinal Data System (Growth Model): Impact on Instruction and Learning in Classrooms
Mwarumba Mwavita,
Western Heights School District (Oklahoma)/Oklahoma State University
Joe Kitchens, Western Heights School District (Oklahoma)
This session will present the Western Heights School District's growth model, which tracks the academic growth of individual
students and cohorts of students by teacher using a district-adopted standardized formative assessment administered three times
within a school year. Findings on the impact of the growth model on the district's stakeholders (administrators, teachers,
students, and parents) will be discussed. Lessons learned and future use of the growth model in predicting students'
performance on state, norm-referenced, and postsecondary assessments will be shared.
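The abstract does not specify how growth is computed; as a minimal sketch, assuming growth is summarized as the least-squares slope across the three within-year administrations and averaged by teacher (a hypothetical simplification, not the district's actual model):

```python
def growth_slope(scores):
    """Least-squares slope across equally spaced test administrations.

    scores: scale scores from the fall, winter, and spring administrations.
    Returns points gained per administration.
    """
    n = len(scores)
    xs = range(n)                       # administrations 0, 1, 2
    mean_x = sum(xs) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, scores))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

def teacher_growth(rosters):
    """Average per-student growth slope for each teacher.

    rosters: {teacher: {student: [fall, winter, spring]}} -- illustrative shape.
    """
    return {
        teacher: sum(growth_slope(s) for s in students.values()) / len(students)
        for teacher, students in rosters.items()
    }
```

A student scoring 100, 110, and 120 across the year would show a slope of 10 points per administration; averaging these slopes over a roster gives a per-teacher growth figure.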
VIII–C: Utilizing Statewide Longitudinal Data System (SLDS) Data Beyond Accountability
Shara Bunis and David Ream, Pennsylvania Department of Education
Pennsylvania Department of Education’s PK–20 Statewide Longitudinal Data System (SLDS) is a rich data source that
is being linked with early childhood data in the Pennsylvania Department of Public Welfare’s Early Learning Network,
using common unique student identifiers. In this presentation, the Pennsylvania Department of Education will provide
updates on two projects that combine early learning, PK–12, and postsecondary data to produce actionable information
for education improvement.
VIII–D: State Visits Under the State Education Information Support Services (SEISS) Contract
Patrick Sherrill, U.S. Department of Education
Ross Lemke, AEM Corporation
The Performance Information Management Service (PIMS) is visiting states to evaluate state systems' capabilities to
provide high-quality data to EDFacts and to deliver onsite EDFacts technical support to states that request it. The
PIMS/AEM Corporation team will discuss the results of the visits to participating states.
VIII–E: Four-Year Adjusted Cohort Calculation: Considerations and Decisions
Ted Carter and Kelly Holder, Kansas State Department of Education
Currently, there are several approved methods for determining graduation rates under the No Child Left Behind
accountability requirements. To standardize the method used across states, the U.S. Department of
Education has mandated a four-year adjusted cohort calculation for all states, beginning in the 2010–11 school year.
Kansas has implemented this cohort calculation and, in the process, identified numerous points requiring business rules and
greater specification. In this presentation, Kansas State Department of Education staff will share their experiences
operationalizing the calculation and highlight some of the considerations and decisions made when creating
the logic for the software application.
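The core of the four-year adjusted cohort calculation can be sketched in a few lines; the business rules the abstract mentions (which students count as transfers, how diplomas are verified) are where the real work lies, and this illustration assumes those counts are already resolved:

```python
def adjusted_cohort_rate(first_time_9th, transfers_in, removals, graduates):
    """Four-year adjusted cohort graduation rate.

    The cohort starts with first-time 9th graders, is adjusted upward for
    students who transfer in and downward for documented removals (transfers
    out, emigration, death). The rate is the share of the adjusted cohort
    earning a regular diploma within four years.
    """
    adjusted = first_time_9th + transfers_in - removals
    if adjusted <= 0:
        raise ValueError("adjusted cohort must be positive")
    return graduates / adjusted
```

For example, a cohort of 100 first-time 9th graders with 10 transfers in, 20 documented removals, and 72 on-time graduates yields a rate of 72 / 90 = 0.8.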
VIII–F: Estimating Causal Effects With Large Scale Longitudinal Data (Session Cancelled)
Carolyn Barber and Sarah Frazelle,
Kansas City Area Education Research Consortium
Mark Ehlert, University of Missouri
K–12 schools are rich with data collected for a variety of accountability and reporting purposes. Increasingly,
districts have been interested in using these data to answer questions about the effectiveness of teachers, initiatives,
or specific elements of curriculum. This panel discussion, hosted by the Kansas City Area Education Research Consortium
(KC-AERC), will focus on models used to approximate causation in non-experimental designs, highlighting the use of
value-added models and propensity score matching through two specific cases from its current work with districts. The
American Educational Research Association's recent publication, Estimating Causal Effects Using Experimental and
Observational Designs, will serve as a foundation for the discussion.
VIII–G: Comparing Apples to Apples: Identifying Heterogeneity in Schools
Jennifer Lambert and Kristin Campbell, Utah State Office of Education
No two schools are identical, and comparing their outcomes and progress has always been difficult. Schools are mixtures
of student demographics, income levels, and other characteristics. Utah has sought to address this problem using Polytopic
Vector Analysis (PVA), which identifies and quantifies school heterogeneity. PVA analyzes school variables and identifies
school types (it is a self-training classifier) and the percentage of each type in each school. School likeness can then be
determined and used to identify schools that are performing well or poorly relative to similar schools.
VIII–H: Making School-Level Expenditure Data Meaningful
Peggy O’Guin, California Department of Education
Vaughn Altemus, Vermont Department of Education
Stephen Cornman, National Center for Education Statistics
There is escalating interest in collecting school-level revenue and expenditure data, but there is currently
no standardized protocol for attributing centralized costs to individual schools. This lack of a protocol threatens
the viability of any school-level financial data collection by compromising consistency in how school-level costs are defined.
Arriving at such a protocol would require careful consideration of a number of important questions, with involvement and input
from stakeholders at the school, district, county, state, and federal levels. Further, if school-level finance data collections
are instituted in the future, the variable definitions must match, or be closely aligned with, the standardized protocol.
This session will seek to identify those questions, explore whether data of value in decisionmaking could result if
they were answered, and consider the costs and benefits of producing those data.
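The session's premise is that no standard attribution rule exists; as one illustration of what a candidate rule looks like, here is a sketch that allocates a district's centralized costs to schools in proportion to enrollment (only one of many possible rules, and not one the session endorses):

```python
def allocate_central_costs(central_cost, enrollments):
    """Attribute a district's centralized cost to schools by enrollment share.

    enrollments: {school: student count}. Other rules (square footage,
    staff counts, direct time studies) would give different school-level
    figures, which is exactly the consistency problem the session raises.
    """
    total = sum(enrollments.values())
    return {school: central_cost * n / total for school, n in enrollments.items()}
```

Under this rule, a $1,000 central cost split across a 300-student and a 700-student school yields $300 and $700 respectively; a square-footage rule could produce very different numbers, so comparisons across districts require everyone to use the same rule.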
VIII–I: Data.ED.Gov
Ross Santy and Jason Hoekstra, U.S. Department of Education
In 2010, the U.S. Department of Education launched a new Internet tool that allows public access to federal grant
application information. In the year since the tool was made available, these data have been enhanced with
significantly more valuable education information. This session will provide an overview of these data and discuss
the possible future hosting of EDFacts information on the site.