Concurrent Session VIII Presentations


Thursday, July 12, 2012
2:45 pm – 3:45 pm


VIII–B: Mapping Performance Within U.S. Department of Education Data Releases

Ross Santy and Jane Clark, U.S. Department of Education

    For the past two years, the U.S. Department of Education has been increasing its capability to flexibly develop and host geographic presentations of its data. This interactive session outlines the technical and policy decisions behind current efforts to visually display K–12 performance data in online maps. Presenters from the EDFacts initiative and from the Office of Elementary and Secondary Education share prototypes of pages being considered for data.ed.gov and ED Data Express. The session encourages discussion and feedback to help shape products slated for deployment online later this year.

VIII–C: Common Education Data Standards (CEDS) and Race to the Top (RTTT) Assessments

Jessica McKinney, U.S. Department of Education
Beth Young, Quality Information Partners
Rob Abel, IMS Global Learning Consortium, Inc.
Larry Fruth, SIF Association

    Learn about work being done as part of the Common Education Data Standards (CEDS) project to support Race to the Top (RTTT) Assessments. This work supports the next-generation assessment systems being built by the Consortia and will ensure that next-generation assessment data are interoperable with respect to assessment item format, storage, display, transmission, and other areas. These voluntary common standards will enable comparisons across and within states to measure student performance and inform instruction. The work involves creating an Assessment Interoperability Framework and developing standards to support the movement of assessment data elements.

VIII–D: Oregon—Creating Funding Opportunities Through Accurate Time Tracking

Josh Klein, Oregon Department of Education

    The Oregon Department of Education requires staff to track time on all technology projects. The resulting data set creates the foundation for a sophisticated funding model that allows technology projects to be billed to a variety of funding streams while providing accurate project costs and complete supporting documentation. This presentation highlights the processes supporting this distributed funding model and explores how time-tracking data can be used for staff development, project management, technology budgeting, and the calculation of agency performance metrics. A demonstration of the “Tracker” and “eTimesheet” applications that enable this funding model is also included.

VIII–E: Growing Up With a Growth Model: The Evolution of Virginia’s Student Growth Percentile Reports

Deborah Jonas and Nathan Carter, Virginia Department of Education

    The Virginia Department of Education (VDOE) experienced a unique set of challenges in using student growth percentiles (SGP) for federal and state accountability purposes. Some of the challenges included establishing appropriate business rules for applying the SGP model, understanding how new tests in reading and math could impact measurements of growth, developing different types of SGP reports that communicate results effectively to different stakeholder groups, and offering professional development opportunities so stakeholders would understand how to use the new information responsibly. In this session, the presenters provide more details about these types of challenges and share the technical and capacity-building strategies VDOE employed in response.

VIII–F: “What Makes for a Good Test?”

Carolyn Fidelman, National Center for Education Statistics

    Many of us depend on meaningful test scores for a variety of research goals, but how much do we know about what is behind that magical number? This session provides a brief overview of the basics of good standardized test design and use and the interpretation of test quality indicators; differences in how attitude and opinion measures and measures of academic ability are developed, with particular focus on content and construct validity; interpretation of the information in technical reports, such as basic descriptive statistics, alpha reliability, point biserial values, and item response theory (IRT) parameters; and ways to evaluate the comparability of scores from different tests.

VIII–G: The School District Demographics System (SDDS) Goes Mobile!

Tai Phan, National Center for Education Statistics
Michael Lippmann, Blue Raster

    Mobile devices (e.g., smartphones and tablets) are revolutionizing the way our nation consumes information. With the increasing ubiquity of mobile devices, mobile applications have the potential to reach more users and offer location-relevant data. The School District Demographics System (SDDS) is now available for use on both Apple iOS and Android mobile devices. This session presents an overview of current efforts to bring the SDDS to mobile devices, including relevant use cases.

VIII–H: Leveraging the Power of Geographic Information System (GIS) Applications to Display Information From the Tennessee Longitudinal Data System

David Wright and Indrani Ojha, Tennessee Higher Education Commission

    The Tennessee Longitudinal Data System (TLDS) is being constructed with federal Race to the Top funds to connect statewide K–12, postsecondary, and labor market participation data. Web-based geographic information system (GIS) applications make it possible to bring together large amounts of information from disparate sources for graphic display at the state, regional, or county level. Detailed data tables included in the web-tool design provide even more granular drill-down capability. This session demonstrates a map-driven interface developed by the Tennessee Higher Education Commission and the Office of Information Resources to display TLDS data in user-defined formats.

VIII–I: Data Issues Resolution Process

Christina Tydeman, Hawaii State Department of Education

    If at first you don’t succeed, try and try again. In 2011, the Hawaii State Department of Education overhauled its data governance process. As a single state education agency (SEA)/local education agency (LEA), Hawaii was challenged to redefine data ownership roles related to IT and program stakeholders while developing an issues resolution process that addressed both LEA and SEA needs. An internal Data Issues Resolution workspace was created and has been a key tool for monitoring and facilitating progress, as well as providing documentation and ongoing access to the resulting decisions. The presenter demonstrates the workspace and shares resources about the process, structure, and lessons learned.
