
Winter 2013 Forum Meeting Notes


National Forum on Education Statistics
February 11-12, 2013
Washington, DC



Opening Session

Monday, February 11, 2013

Forum Agenda Review and Introduction MS PowerPoint (215 KB)
Forum Chair Tom Ogle (Missouri Department of Elementary and Secondary Education) welcomed Forum members to the Winter 2013 Forum Meeting in Washington, DC. He introduced the Forum officers, welcomed new and returning members, and encouraged veteran members to introduce themselves to new members during the standing committee meetings. Tom spent a moment reviewing the Forum's mission and recent publications, including the Forum Guide to Taking Action with Education Data (http://nces.ed.gov/forum/pub_2013801.asp) and the Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective (http://nces.ed.gov/forum/pub_2012809.asp). The Forum anticipates the release of a new resource this spring—a technical implementation guide to teacher-student data links. Tom concluded his remarks by briefly reviewing the agenda for the meeting and welcoming John Easton, Director of the Institute of Education Sciences (IES), to deliver the welcoming address.

Welcome to the Winter 2013 Meeting
John Easton, Director of the Institute of Education Sciences (IES), welcomed Forum members to Washington, DC. He encouraged new members to become involved in Forum activities and commended the Forum Data Use Working Group on the quality and usefulness of the new Forum Guide to Taking Action with Education Data (http://nces.ed.gov/forum/pub_2013801.asp). IES's centers share a commitment to using data, and IES is working to support expanded partnerships between researchers and practitioners to maximize uses of education data in research. The IES Regional Educational Laboratories (RELs) currently sponsor approximately seventy research alliances that are conducting research and using findings to improve education. IES is also currently reviewing proposals on researcher-practitioner partnerships. IES FY14 research grants will help promote continuous research in education organizations and will improve the use of rapid analysis, evaluation studies, and other research that can support quick improvements. John thanked members for their participation and wished everyone a successful meeting.

Emerging Educational Technologies
Richard Culatta, Deputy Director of the U.S. Department of Education’s Office of Educational Technology (OET), presented on the topic of emerging educational technologies. Richard provided an overview of the work of the OET, which develops national educational technology policy for populations ranging from elementary/secondary through postsecondary, adult, and special education.  In addition to advocating for the transition to digital learning, OET explores ways to leverage technology to reimagine and personalize learning. Richard shared points from a recent TED talk on reimagining learning, available at http://www.youtube.com/watch?v=Z0uAuonMXrg. Emerging technologies are constantly changing, and Richard focused his presentation on the following four areas where new tools and capabilities are fueling education innovation:

  • Research—Richard recommended reports available through OET at http://www.ed.gov/edblogs/technology/research/ that discuss topics such as how technology can enable new approaches to research and how technology can help measure non-cognitive factors, such as the effects of motivation and stress on learning.
  • Data and Analytics—Richard explained that education is only beginning to see the uses of what has been termed "Big Data." New developments include online games that can teach science concepts while contributing to scientific progress (http://fold.it/portal/), interactive and engaging textbooks, and tests that identify patterns in right and wrong answers. "Small data" are also evolving, and Richard provided the example of the MyData Initiative (http://www.ed.gov/edblogs/technology/mydata/), which uses a model developed in the healthcare field to allow individuals to download their own data. OET is working to improve the U.S. Department of Education's data sites, and Richard encouraged Forum members to explore http://alpha.data.gov/.
  • Digital Learning Content—The expansion of digital learning content, such as Massive Open Online Courses (MOOCs), has highlighted problems with finding digital learning content—search engines are not good at identifying learning resources. OET is developing the Learning Registry, which combines crowd-sourced content and an open infrastructure to assemble educational resources.
  • Assessing Learning—Online courses are creating new opportunities for assessments and for verifying student identities through assessments. Richard discussed programs for virtual exam proctoring, simulation-based assessments, learning positioning systems that function similarly to global positioning systems, and open badges (http://openbadges.org/), which offer competency-based assessments and allow users to create a digital skill portfolio.

Forum members engaged Richard in discussions on the following issues:

  • The pace of new and emerging technological developments makes it difficult for users to know what is useful and worthwhile. Change for the sake of change isn't useful, and it is important to look at approaches that are sustainable. Richard explained that OET doesn't focus on devices, which change quickly, but instead focuses on approaches and infrastructures that use open and interoperable standards.
  • Data are powerful, and it is important to protect student privacy and to teach responsible data use. Richard advocates actively teaching good data privacy skills, and provided examples of how instructions for proper data use and behavior can be built into data projects.
  • Schools are a good starting point for improving access to technology. Richard suggested that broadband should be included in school design in the same manner as other utilities. Schools can use http://www.educationsuperhighway.org/ to test and improve their building’s connectivity.


Joint Session: Assessment Consortia Updates

Tuesday, February 12, 2013

Assessment Consortia
In 2010, the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC) won competitive grants to develop assessment systems that will meet the dual needs of accountability and instructional improvement. The Forum regularly welcomes representatives of the consortia to provide updates on the work of each group and to answer questions about assessment preparations and technology readiness.

Wes Bruce (Indiana Department of Education) provided a presentation on PARCC MS PowerPoint (425 KB). The PARCC assessment is designed to be device-agnostic so that schools can leverage the devices they currently have to implement the assessment. PARCC has developed lists of both minimum and recommended specifications, and Wes noted that PARCC will support tablets. PARCC is interested in studying student use of tablets to inform tablet specifications; for example, external keyboards may need to be required so that on-screen keyboards do not leave students with inadequate screen space for assessment questions. PARCC recommends bandwidth of 100 kbps per student, but has not yet set a minimum requirement. Further information on PARCC is available at http://www.parcconline.org/.
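
To put the 100 kbps recommendation in concrete terms (an illustrative calculation, not a PARCC figure): a lab testing 30 students simultaneously would need roughly 30 × 100 kbps = 3 Mbps of available bandwidth, and a school testing 300 students at once would need roughly 30 Mbps.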

SBAC and PARCC are working to synchronize their technology announcements, and they activated a technology readiness tool in January. The tool allows schools, districts, and states to see technology gaps, and it has been updated to tell the user the number of devices in the state that meet the minimum technology standards. The tool remains open and can be used at any time. Wes provided an example from New Mexico, where the tool provided information on technology gaps that were then used to inform education budget meetings. 

Michael Muenks (Missouri Department of Elementary and Secondary Education) provided a presentation MS PowerPoint (47 KB) on SBAC. Work groups of SEA and LEA staff are developing the SBAC assessment system, and each member state actively engages with institutions of higher education. Michael encouraged Forum members to check with their state technology coordinators to learn more about each state's use of the readiness tool. The technology tool can help education agencies identify gaps, and SBAC can also use the tool to identify challenges to test implementation. For example, SBAC found that many schools were including computers that are not available to students in their counts of available devices, and SEAs and LEAs identified the challenge of keeping computers continually connected to a server. Current SBAC tasks include determining how data will be entered into the SBAC system, exploring the idea of vendors working with LEAs, and determining how the end-of-high-school assessment works with the SAT and ACT. SBAC is also addressing the issue of sustainability—how will SBAC be structured when the initial assessment design is finished? The current SBAC timeline includes a pilot test in spring 2013, a field test in spring 2014, and operational administration in 2014-2015. Further information is available at http://www.smarterbalanced.org/.

Forum members were interested in learning more about topics such as

  • data submission to the assessment consortia by SEAs and LEAs;
  • whether states will need to shift their data collections to accommodate the needs of the consortia; and
  • consortia allowances for bring-your-own-device policies.


Joint Session: Teacher Evaluation

Tuesday, February 12, 2013

The Forum welcomed a panel of SEA and LEA representatives to discuss teacher evaluation MS PowerPoint (5.17 MB). Jan Petro (Colorado Department of Education) and Glenn McClain (Platte Valley School District, Weld Re-7 [CO]) discussed Colorado's State Model Evaluation System. Colorado's system is guided by five principles:

  • Data should inform decisions, but human judgment will always be an essential component of evaluations.
  • The implementation and evaluation of the system must embody continuous improvement.
  • The purpose of the system is to provide meaningful and credible feedback that improves performance.
  • The development and implementation of educator evaluation systems must continue to involve all stakeholders in a collaborative process.
  • Educator evaluations must take place within a larger system that is aligned and supportive.

Glenn reviewed the elements that comprise the quality standards for principals and teachers, the framework of the system, and the evaluation rubrics. Teacher and principal evaluations use the same system and structure, and are based equally on student academic growth and measures of professional practice. Districts can use locally developed standards, but those standards must meet the state's requirements. Jan discussed the timeline for system rollout, which began with development and beta testing in 2011-2012, progressed to pilot and rollout in years two and three (2012-2013 and 2013-2014), and will conclude with full statewide implementation in 2014-2015. The 2013-14 school year will be a "hold harmless" year, and the system provides two probationary years for teachers deemed not effective. Twenty-seven Colorado districts are participating in the evaluation system pilot. Jan and Glenn discussed challenges that are specific to SEAs and LEAs, and encouraged Forum members to learn more about the system by visiting www.cde.state.co.us/EducatorEffectiveness.

Patricia Hardy (Pennsylvania Department of Education) presented an SEA perspective on measuring educator effectiveness. Patricia began by noting that the challenges experienced in Colorado are similar to those experienced in Pennsylvania. Development of Pennsylvania's teacher evaluation system began in 2010, and multiple pilots are now occurring simultaneously. Patricia emphasized that the system is focused on improving educator effectiveness, not on reducing the teaching force. Pennsylvania has approximately 500 districts, and local control is very important. Evaluations are therefore implemented at the district level, and while the state recommends the Danielson framework, districts are allowed to submit alternative approaches. Teacher-specific data are averaged over three years to reduce "noise," and in the first year, teachers are provided with their value-added data for informational purposes. Effectiveness measures incorporate both building-level and teacher-specific data, and teachers in both tested and non-tested subjects, as well as non-teaching professional employees, are held responsible for building-level data. Part of the evaluation system involves allowing individuals to identify targets and provide evidence for how they are meeting those targets. The value assigned to this aspect of the evaluation is higher for non-teaching professionals and teachers in non-tested subjects than for teachers in tested subjects. Pennsylvania has found that the use of data in educator evaluation has highlighted the importance of data quality, and one of the challenges facing the system is the collection of accurate data from LEAs. Patricia also discussed the challenge of creating a teacher-student data link that is sufficiently robust to support the new evaluation system. More information on Pennsylvania's system is available at www.education.state.pa.us.

Linda Rocks (Bossier Parish Schools [LA]) provided an LEA Data Manager perspective on educator effectiveness. Linda noted that the development of Louisiana's Compass system is similar to the multi-year projects underway in Colorado and Pennsylvania. Compass was developed in 2010 and piloted in 2011. The 2012-2013 school year marks the statewide inauguration of the system, and all districts will receive value-added data for eligible teachers. The system allows teachers to review their rosters and to select targets for review. End-of-year ratings are compiled by the state based on multiple sources of information, divided equally between measures of student growth and professional practices. Linda reviewed the tools available on the Compass website, and noted that the sections for evaluations and reports are still in development. Linda recommends that data managers and others with a background in data should be involved in policy discussions to better address data issues when designing systems.


National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, February 11, 2013

Morning Session

Emerging Educational Technologies Follow-up Discussion
Richard Culatta, Deputy Director of the U.S. Department of Education’s Office of Educational Technology (OET), facilitated a discussion on emerging technologies to follow up on his general session presentation. Forum members asked questions about the Learning Registry and setting up and utilizing online badges. Richard suggested the websites www.openbadges.org and free.ed.gov. Forum members discussed the complexity of increasing funding and access to technology in schools.

Welcome, Introductions, and Agenda Review
NESAC Chair Ray Martin (Connecticut Department of Education) led the group in introductions, welcomed everyone to the Forum and the NESAC committee meeting, and reviewed the NESAC agenda.

Working Group Updates
School Codes for the Exchange of Data (SCED) Working Group: Kathy Gosa (Kansas State Department of Education) updated NESAC on the School Codes for the Exchange of Data (SCED) Working Group. The Working Group has met five times since the July Forum meeting. They have three main goals for their work: 1) update the codes, 2) create a best practices document, and 3) create a change management process for the future. The group is also working with data collections at the National Center for Education Statistics (NCES) to align course code systems with SCED.

Alternative Socio-Economic Status (SES) Measure Working Group: Matt Cohen (Ohio Department of Education) updated NESAC on the progress of the Alternative SES Measure working group. This group met in September and is meeting again later in the week. This group will not solve the issues SEAs and LEAs are facing following changes to the measurement of free and reduced price meal (FRPM) eligibility, but will instead establish a list of existing and emerging issues. Matt emphasized that this is an opportunity to focus on students in poverty. By addressing a wide range of data and data use issues surrounding this topic, the Working Group hopes to frame efforts across the country in a consistent and meaningful way.

Afternoon Session

Family Policy Compliance Office and National School Lunch Program Joint Guidance
Kathleen Styles, Chief Privacy Officer at the U.S. Department of Education (ED), and Julie Brewer, U.S. Department of Agriculture, shared an update on joint guidance from the Family Policy Compliance Office and the National School Lunch Program. NESAC members asked questions and discussed topics including

  • expanding the use of Free and Reduced Price Meal (FRPM) eligibility data for additional resources and services;
  • determining who has access to FRPM data and maintaining very limited access for very specific instances; and
  • identifying the implications for teacher accountability for schools that do not give teachers access to these data.

Working Group Presentation: Teacher-Student Data Link
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) provided a presentation on an upcoming publication by the Teacher-Student Data Link Working Group. The document is now going through final editing, and Forum members will soon receive the publication for review. The first few chapters focus on developing an understanding of the teacher-student data link and data quality. Later chapters focus on data use, governance, policies, and best practices. Use cases located throughout the document provide insight into specific applications of the teacher-student data link and their intended impact. Use case topics include the roster verification process, educator preparation program feedback and evaluation, compliance reporting, educator evaluation, and teacher compensation. Lee highlighted the appendices, which contain a Summary Matrix of the use cases and a section on the use of the teacher-student data link for emerging technologies.

Assessment Consortia Updates Follow-up Discussion
Wes Bruce (Indiana Department of Education) and Michael Muenks (Missouri Department of Elementary and Secondary Education) answered follow-up questions from the joint presentation on the Assessment Consortia. Questions from the committee included:

  • Who are the other players in Common Core State Standards (CCSS) implementation?
  • What will happen if schools are not ready in time?
  • How would issues be addressed with schools that have limited bandwidth?

Common Core Granularity
Jim Goodell (Quality Information Partners) provided a presentation on Common Core Granularity, a project that helps educators unpack the Common Core State Standards. Looking at standards at a more granular level is appropriate when students need to understand and use concepts that require both knowledge and skills. Since standards are designed to work together, a balanced approach needs to be maintained. The State Educational Technology Directors Association (SETDA) leads the project, which is a collaborative effort with the Partnership for Assessment of Readiness for College and Careers (PARCC), the Smarter Balanced Assessment Consortium (SBAC), and the Council of Chief State School Officers (CCSSO). The technical scope and design information draft is available for public review in February, and the complete content and solution are expected in June 2013. For more information, visit the GIM-CCSS Public Updates website at http://assess4ed.net/group/gim-ccss-public-updates.

Group Discussion
NESAC Vice-chair Allen Miedema engaged NESAC members in a discussion on open education resources. Topics included:

  • “Flipping the script” on textbook curriculum licensing versus curriculum ownership (and cost savings from crowdsourcing).
  • How to create infrastructure that supports open education resources.
  • What are strategies for balancing and introducing new technologies to make sure students are able to comfortably use them in ways that are educationally beneficial?
  • The increasing need to discuss security firewalls and liability issues with all stakeholders.
  • How to leverage devices.
  • Teaching digital citizenship or granting digital driver’s licenses.

Tuesday, February 12, 2013

Morning Session

Data Use and Access
Federal Data Collection Use (EDFacts): Ross Santy and Polly Hayes (U.S. Department of Education [ED]) provided insight on how federal data collections in ED make use of the data collected. Ross reported that EDFacts uses are continuing to grow. EDFacts aggregate data on reading and math assessments are now released as restricted data files. The data also support the administration's goals of transparency and research. Internal uses include supporting the U.S. Secretary of Education with talking points for site visits; state-level summaries of information for programmatic purposes; and informing policy developments such as Elementary and Secondary Education Act (ESEA) flexibility, School Improvement Grants (SIG), Race to the Top (RTT), and compliance reporting.

Polly Hayes focused her presentation MS PowerPoint (897 KB) on how the Civil Rights Data Collection (CRDC) is used. Polly discussed both internal and external uses of the CRDC data. Internally, the Office for Civil Rights uses the data for knowledge gathering, publications, enforcement, technical assistance, and policy guidance. Externally, state legislatures and boards of education, research institutes, advocacy groups, professors, and news organizations use CRDC data. At a higher level, the Office for Civil Rights looks at the data to see the successes and challenges of civil rights. CRDC data have also been used to start investigations of districts that are allegedly violating civil rights laws. For more information, please visit the Civil Rights Data Collection website at http://ocrdata.ed.gov.

Researchers
SLDS Grant Program PDF File (327 KB): Dorthyjean Cratty (National Center for Education Statistics [NCES]) provided a presentation on NCES support for increasingly demand-driven research and data use. When an educational issue arises, stakeholders such as policymakers, students, parents, education agencies, educators, researchers, and institutions need better opportunities to communicate and, as a result, create better educational research. A few NCES initiatives help to bridge these worlds and increase access to resources.

The Regional Educational Laboratories (RELs) MS PowerPoint (669 KB): Ruth Curran Neild (National Center for Education Evaluation and Regional Assistance [NCEE]) provided a presentation on the Regional Educational Laboratories (RELs) and research data use. The RELs play an important role in the supply and demand of data and research among LEAs and SEAs. Because they know their audiences, RELs are equipped to conduct and release research in a timely way, to analyze and present data with easy-to-use tools, and to provide research summaries in a variety of formats and media. Ruth highlighted the work of research alliances, which bring together practitioners and researchers collaborating toward a predetermined, common, and specific goal. Ruth encouraged NESAC members to contact their local REL to determine whether there are opportunities for research collaboration. For more information about the RELs, please visit http://ies.ed.gov/ncee/edlabs/regions/.

Forum Guide to Data Access for Researchers – SEA Perspective MS PowerPoint (4.58 MB)
Kathy Gosa (Kansas State Department of Education) provided a presentation on the recently released SEA guide for data access. The guide was written on the premise that data use should drive data collection, that the research community is an important user of education data, and that developing mutually beneficial relationships between education agencies and the research community makes sense. The main section of the guide covers data partnerships, foundations for data sharing, and challenges to data sharing. The guide also presents many core practices to illustrate these points. Discussion and questions included:

  • When the SEA provides data to researchers, are findings reported back to the SEA?
  • The competing process of public record requests.
  • Dealing with repeat requestors (i.e., "data stalkers").
  • What types of requests fall under the Freedom of Information Act (FOIA)?

The publication can be found at http://nces.ed.gov/forum/pub_2012809.asp.

Forum Guide to Data Access for Researchers – LEA Perspective
Christina Tydeman (Hawaii Public Schools) provided an update on the upcoming data access guide written from the LEA perspective, which is based on the Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective (http://nces.ed.gov/forum/pub_2012809.asp). The group has met twice and will meet again later in the week. The guide will be available by the STATS-DC conference and addresses the following topics:

  • LEAs not only receive data requests; they also have people who want to come into schools to conduct primary research.
  • Looking at best practices for districts that have different capacities.
  • Providing templates for LEAs.
  • LEA Core practices:
    • Using the data findings at the local level.
    • Creating request forms that differentiate between new and existing data collections.
    • Managing the data request process efficiently, including engaging the schools of education themselves.
    • Releasing data appropriately.
    • Monitoring data use.
    • Monitoring the publication or public release of data.

Small Group Discussions
NESAC members broke up into smaller LEA and SEA groups to discuss data use. Members discussed their own policies and procedures in the area of data sharing and in using the information that comes back from researchers.

Afternoon Session

Teacher Evaluation Follow-up Discussion
Patricia Hardy (Pennsylvania Department of Education), Glenn McClain (Platte Valley School District [CO]), Jan Petro (Colorado Department of Education) and Linda Rocks (Bossier Parish School System [LA]) came to NESAC for a follow-up discussion on teacher evaluations. Discussion and questions included:

  • Some class sizes are small. Would you set a minimum number of students for measuring growth? (In Louisiana the minimum is set to 10. The growth isn’t just from one year to the next. It’s cumulative over many years.)
  • Kentucky adopted the Colorado growth model for the first time this past fall. Each student receives one point for typical or better growth (measured as being at the 40th percentile or better).
  • What do teachers think of the growth model?
  • How do you deliver the growth information back to the teachers?
  • You can’t just look at growth levels alone; you must look at proficiency levels.
  • Is there a way to get student voices into the assessment system?

Topics from the Floor
NESAC Vice-chair Allen Miedema engaged NESAC members in an open discussion about the assessment consortia. Issues and questions included the following:

  • There is still anxiety around this work and the current messaging.
  • Members want comparative analyses against their current assessments.
  • We are the consortia. Each state is involved but it’s like the headless horseman. There has to be a person or group we can talk to.
  • Concerned about how much time this test is going to take.
  • What do teachers need to know?
  • Time is running out and budgeting cycles are being missed.

EDFacts and Common Education Data Standards (CEDS) MS PowerPoint (909 KB)
Ross Santy (ED) and Beth Young (Quality Information Partners) provided an update on EDFacts and the Common Education Data Standards (CEDS). For the EDFacts update, Ross focused on Office of Management and Budget (OMB) clearance, Elementary and Secondary Education Act (ESEA) flexibility, and publicly available assessment data. The latest OMB packet remains largely unchanged, with no entirely new areas of collection. There is general interest in the educational experience of military-connected students, but this is not a new data item. This January, for the first time, EDFacts released data and information on mathematics and reading/language arts proficiency. There is a concerted effort to also roll out 2011-12 data while retaining value and consistency in one file. Ross and the EDFacts team are looking forward to improving transparency.

Beth provided the current status of CEDS. Version 3 was released at the end of January, and Beth shared statistics on CEDS use and the CEDS hierarchy. The number of elements has almost doubled between Version 2 and Version 3; many of the new elements come from early learning and from support for RTT assessments. The increased interest from vendors using the Connect tool indicates forward movement. CEDS currently has over 30 public Align maps and public connections.

The Connect tool has been updated with a "MyConnect" button, which allows users to run an instant comparison of their own public maps against any connections to see if they have the elements needed to make the connection. Ross then discussed work with seven states to add Align maps necessary for some EDFacts reporting (Common Core of Data [CCD], assessment, and cohort graduation rate). EDFacts will also release its own connections for these data files, and two of the seven states will do so as well.

Meeting Review/Summer Planning
A variety of topics were discussed:

  • Members gave positive feedback on the emerging technologies presentation.
  • Members brainstormed the possibility of increasing individual Forum members' involvement with social media.
  • Adjusting the listserv to make it easier to use and to read comments.
  • Members would like more time to engage in good discussions—especially following the joint sessions.
  • Virtual courses and open course enrollment.
  • Connections to postsecondary education.
  • Professional development evaluation (Thomas Guskey, University of Kentucky).
  • NESAC members would like a more complete report from the Assessment Consortia.


Policies, Programs and Implementation (PPI) Standing Committee Meeting Summary

Monday, February 11, 2013

Morning Session

Welcome and Introductions
PPI Chair Sonya Edwards (California Department of Education) welcomed everyone to the meeting and led the group in introductions. Vendors were reminded that by participating in Forum meetings, they may have access to information that could potentially disqualify them from a competitive bidding process at a national, state, or local level. When vendors identify an item of potential conflict on a committee’s agenda, it is appropriate for them to excuse themselves while that topic is being discussed.

Agenda Review and Summer 2012 PPI Meeting Review
Sonya Edwards outlined the PPI agenda for the winter meeting and invited members to suggest additional topics for discussion. She then reviewed the work PPI accomplished at the Summer 2012 Forum.

Supporting Data Access for Researchers (SEAs) MS PowerPoint (271 KB)
PPI Members Tom Howell (Michigan Center for Educational Performance and Information) and Levette Williams (Georgia Department of Education) were part of the Forum Working Group that produced the Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective, available at http://nces.ed.gov/forum/pub_2012809.asp. Tom and Levette provided PPI members with examples of how their states support data access. Michigan has pursued several successful strategies, including tailoring agreements to specific requests, creating clearly defined review and approval processes, and presenting research opportunities at different venues to encourage research partnerships. Michigan continues to refine the research request process to make it more systematic and efficient, and is developing a cohesive research agenda. Tom reviewed Michigan's data request form and discussed planned enhancements to the data request system, such as posting white papers and publications to direct researchers to existing publications. Tom noted that partnerships with researchers benefit the state education agency (SEA), but they can also create challenges. PPI members were interested in learning how the process in Michigan has evolved over time. Tom explained that Michigan is promoting ongoing engagement between researchers and education agencies to ensure proper data use. Michigan is also working to systemize interactions with researchers to ensure that the data are understood and that all data use is governed by a consistent set of rules.

Levette explained that Georgia has also created a system for handling data requests that begins with meetings with researchers followed by a review of each research request by the SEA’s legal and policy offices. Upon approval, the SEA and researcher sign a Memorandum of Understanding (MOU). Levette provided information on two research projects that used SEA data; one involved matching student data to foster care databases, and another involved providing juvenile court judges with data. The Georgia Office of Student Achievement audits student data and is working to build a P20 data warehouse. The warehouse will include a research database for seven agencies, and will create new identification numbers to protect privacy. Researchers from the agencies whose data are included in the database will be granted first access to pilot the database.

Emerging Educational Technologies Follow-up Discussion
Richard Culatta, Deputy Director of the U.S. Department of Education’s Office of Educational Technology, joined PPI for a discussion of topics raised during his Opening Session presentation. PPI members were interested in the concept of “learning positioning”, especially with regard to learning maps. Richard demonstrated how learning maps might look using vendor examples and explained that he encourages states to employ flexibility when designing and using systems and tools. PPI members were interested in knowing how information on new technologies can be made available at both state and local levels, and Richard noted that his office encourages tool developers to work with the research community and to engage in outreach. The group discussed best practices for implementing new technologies in an era of budget and resource constraints, including

  • determining the cost and expected returns of each investment;
  • moving to digital media instead of print-based media;
  • providing metadata tools; and
  • allowing for new tools to be purchased in affordable parts.

Richard explained that his office is focused on interoperability and is looking into future-proofing new technologies by aligning to standards.

Afternoon Session

Assessment Consortia Updates Follow-up Discussion
Wes Bruce (Indiana Department of Education) and Michael Muenks (Missouri Department of Elementary and Secondary Education) met with PPI to answer questions following their joint session presentation. PPI members asked for clarification on the concept of compliance with the Smarter Balanced Assessment Consortium (SBAC) after the completion of SBAC’s work. Michael explained that a vendor could use the SBAC item bank as well as the SBAC platform, or the vendor could use a different platform. SBAC would review the use and determine if the vendor is in compliance. PPI members were also interested in how the consortia are working to minimize burden. Wes and Michael explained that the consortia are leveraging existing systems as much as possible and are also developing reports that will benefit SEAs and LEAs. Wes added that schools are invited to participate in Partnership for Assessment of Readiness for College and Careers (PARCC) field tests.

Alternative Socioeconomic Status (SES) Measure Working Group Update
Matt Cohen (Ohio Department of Education) met with PPI to provide an update on the work of the Forum’s Alternative Socioeconomic Status (SES) Measure Working Group. The Working Group is comprised of SEA and LEA representatives, as well as representatives from the U.S. Department of Education. Matt explained that reporting changes brought about by changes to the National School Lunch Program provide an opportunity to address the question of how to measure SES. The Working Group is looking for a way to replace Free and Reduced Price Meal (FRPM) status as a proxy measure for SES. The new measure should provide meaningful data while also meeting the reporting needs of states for both accountability and funding.

Family Policy Compliance Office and National School Lunch Program Joint Guidance
Kathleen Styles, Chief Privacy Officer, U.S. Department of Education, and Julie Brewer, Department of Agriculture, met with PPI to discuss plans for joint guidance on the use of National School Lunch Program data. Kathleen and Julie presented the group with a list of scenarios and asked for feedback to inform their efforts. Topics suggested for the guidance include

  • the use of FRPM data in aggregate measures of poverty;
  • recent FERPA amendments that address foster care; and
  • "soft" data practices, such as issuing backpacks to FRPM students, that can undermine the privacy of the data.

Innovative Data Use
PPI members expressed an interest in learning more about data use after the Summer 2012 Forum, and members Jared Knowles (Wisconsin Department of Public Instruction), Ellen Mandinach (Regional Educational Laboratory—West), and David Weinberger (Yonkers Public Schools, NY) each agreed to speak about their knowledge and experience with innovative uses of education data.

Jared Knowles gave a presentation MS PowerPoint (1.14 MB) on the development of a statewide early warning system (EWS) in Wisconsin. The Wisconsin Department of Public Instruction will pilot the EWS beginning in March 2013, with a planned statewide rollout in September 2013. The EWS aims to identify students at risk of not completing high school early enough to allow time for effective interventions, to convey possible (not absolute) negative outcomes for the student, and to engage in a cycle of identification, intervention, evaluation, and learning. The system uses a free and open source platform and was designed to be flexible enough to accommodate more data. Work to improve the EWS includes identifying interventions and developing a district protocol to use local knowledge to improve the system and better understand outcomes. Marilyn Seastrom, Deputy Director of NCES, encouraged Jared and others to optimize models that identify variables with actionable interventions. Some states are doing this by focusing systems on behavioral factors rather than demographic factors.
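
The presentation did not detail the system's internals, but the general approach behind an early warning system can be sketched briefly. In the sketch below, the feature names, sample data, and the choice of logistic regression are hypothetical illustrations of the technique, not the actual Wisconsin model:

    # Generic early warning system sketch: train a classifier on prior
    # cohorts with known outcomes, then score current students. All
    # feature names and values here are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Historical cohort: one row per student; outcome is known
    # (1 = did not complete high school).
    history = pd.DataFrame({
        "attendance_rate":   [0.98, 0.82, 0.95, 0.70, 0.99, 0.88],
        "discipline_events": [0, 4, 1, 6, 0, 2],
        "courses_failed":    [0, 2, 0, 3, 0, 1],
        "non_completion":    [0, 1, 0, 1, 0, 0],
    })
    features = ["attendance_rate", "discipline_events", "courses_failed"]
    model = LogisticRegression().fit(history[features], history["non_completion"])

    # Current students: same features, unknown outcome; the model output
    # is a risk score that flags students early enough to plan interventions.
    current = pd.DataFrame({
        "attendance_rate":   [0.91, 0.75],
        "discipline_events": [1, 5],
        "courses_failed":    [0, 2],
    })
    current["risk_score"] = model.predict_proba(current[features])[:, 1]
    print(current)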

Ellen Mandinach gave a presentation MS PowerPoint (4.47 MB) on examples of innovative data use in five SEAs and nine LEAs. SEAs included the Virginia, Arkansas, Hawaii, Maryland, and Oregon Departments of Education. LEAs included Jefferson and Kenton County Public Schools in Kentucky; Charlottesville City Schools and Loudoun County Public Schools in Virginia; Tucson Unified School District in Arizona; Washoe County School District in Nevada; Mamaroneck Union Free School District in New York; Long Beach Unified School District in California; and Metropolitan Nashville Public Schools in Tennessee. Ellen identified the following common themes in SEAs and LEAs that effectively use data:

  • Leadership and vision are essential.
  • Finding resources requires creativity.
  • Enculturation takes time and commitment.
  • Sustainability across administrations is possible with sufficient embedding of the data culture.
  • The status quo often must be shaken up.
  • Training for all educators and stakeholders is important.

She explained that not every exemplar has all components. The transition to innovative and effective data use is a systemic process that takes time, money, energy, and patience.

David Weinberger gave a presentation MS PowerPoint (1.0 MB) on the development of an early warning system (EWS) in Yonkers Public Schools, New York. A staff member taking a graduate course in data mining led the initiative, which replicated a 2008 study from the Consortium on Chicago School Research. The Yonkers study examined the impact of variables on high school graduation and found four variables that significantly predict variation in high school graduation in Yonkers. These variables were used to assign students to five risk levels. David discussed challenges around the implementation of the system, including the need for funding and staffing, coordinating with the district's summer institute to allow adequate preparation time, and updating the analysis with recent data. PPI members were interested in applying the ideas discussed in Ellen Mandinach's presentation to Yonkers. Suggestions included moving data into the hands of available staff, enlisting institutes of higher education to assist with staff training, and using the EWS to target the highest-risk students.
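
As an illustration of the final step (assigning risk levels), the brief sketch below shows one way model scores could be mapped into five levels; the cut points and labels are hypothetical, since the presentation did not specify how Yonkers set its thresholds:

    # Hypothetical mapping of model risk scores into five risk levels.
    import pandas as pd

    scores = pd.Series([0.05, 0.18, 0.42, 0.67, 0.91], name="risk_score")
    levels = pd.cut(
        scores,
        bins=[0.0, 0.2, 0.4, 0.6, 0.8, 1.0],
        labels=["1-lowest", "2-low", "3-moderate", "4-high", "5-highest"],
        include_lowest=True,
    )
    print(pd.DataFrame({"risk_score": scores, "risk_level": levels}))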

Tuesday, February 12, 2013

Morning Session

Data Use Discussions

PPI members divided into small groups to discuss data use by teachers and principals and the role of LEAs, SEAs, and the Forum in promoting data use. Small groups reported their results back to the committee.

  1. What knowledge should teachers and principals have to make them competent users of data?
    Teachers should be knowledgeable about
    • what data are actionable and can be used to implement instructional and behavioral changes in the classroom;
    • what conclusions can (and cannot) be drawn from data and how to distinguish data that are merely measurable from data that are meaningful;
    • sources of data, data collection procedures, available elements, and element definitions;
    • data limitations;
    • the intended uses of data;
    • what data exist in each system and how data flow between the SEA and LEA;
    • data use tools;
    • the intrinsic value of the data and the significance of data elements;
    • privacy laws such as FERPA and HIPAA;
    • the relevance of the data and the framework for what the data are supposed to do and what action to take; and
    • processes and theories of action.

    In addition to the knowledge possessed by teachers, principals should be knowledgeable about the appropriate use of data to support student learning and organizational management.

  2. What abilities should teachers and principals have to make them competent users of data?
    Teachers should have the ability to
    • read graphs and use spreadsheets and basic software;
    • understand concepts such as confidence intervals, the difference between scale and percentile scores, and norm-referenced and criterion-referenced assessments;
    • understand that a student’s score on a given day is not a perfect measure of his/her knowledge;
    • accept the role of data in teaching and understand, apply, synthesize, and generalize data;
    • understand the accountability system;
    • access the data from their particular system;
    • explain data to parents;
    • use formative assessment data; and
    • provide needed data to the principal.

    In addition to the abilities possessed by teachers, principals should have the ability to

    • aggregate and synthesize data and determine what they can do within their span of control in their building;
    • add context and meaning to data for student learning and organizational management;
    • translate data into action;
    • communicate with stakeholders; and
    • match instruction to data.
  3. What data are most useful to principals and teachers?
    Data that are most useful to teachers include
    • student-level information, such as assessment, behavioral, and attendance data;
    • benchmark and formative data;
    • projections and planning data;
    • information on linkages across systems, such as information on a child’s specific needs;
    • longitudinal student information; and
    • components of the evaluation system.

    Data that are most useful to teachers are also useful to principals. Additional data that are useful to principals include

    • trend data and aggregated data;
    • classroom and school level information;
    • information on which teachers are best suited to dealing with students with specific needs;
    • teaching assignments;
    • information on budget flexibility;
    • services needed for students;
    • operations management data; and
    • instruction management data.

  4. What are the roles of SEAs and LEAs in advancing knowledge and abilities?
    Both SEAs and LEAs can advance the knowledge and abilities of staff by
    • making it easy for teachers/principals to access and consume data;
    • providing data visualizations;
    • transforming data into actionable information;
    • summarizing information to make it as simple as possible;
    • offering suggestions on actions to take based on data; and
    • directing data users to resources that explain the data.

    The role of the SEA in advancing the knowledge and abilities of staff will vary depending on the data available at the state level, but may include

    • making sure data are provided in a useful way;
    • making the ability to use data a component of licensure expectations;
    • encouraging institutes of higher education to prepare teachers and administrators to use data;
    • determining when data use should be taught and allocating resources;
    • working with teachers/principals to understand uses of data, developing solutions, packaging information, and creating tools and training to assist in data use;
    • taking burden from teachers and principals who cannot be expected to become masters of the data systems;
    • reporting district data to the U.S. Department of Education;
    • understanding how different vendor solutions create different rules of data entry;
    • providing dashboards and planning data;
    • generating reports and dashboards;
    • keeping data current, reliable, and relevant;
    • developing training materials;
    • providing customer service and relationship management; and
    • providing teacher training.

    The role of LEAs in advancing the knowledge and abilities of staff may include

    • maintaining attention on supporting the existing staff in data use skills;
    • making better connections between mandated initiatives, such as the Common Core and teacher evaluation, and the data use needed to achieve those goals;
    • holding principals accountable for raising the bar for evidence of data use in school improvement plans;
    • providing quick turnaround of data to teachers/principals so they can be data-informed rather than data-driven (this may be an SEA role in some states);
    • providing quality data to the SEA;
    • creating a culture of data use;
    • helping the SEA understand LEA data needs and sources of data;
    • training schools in data use;
    • collecting feedback from schools and providing it to the SEA;
    • allowing flexibility for new functionality;
    • performing formative assessment work; and
    • taking ownership of data validation and accuracy.

  5. Is there a role for the Forum in advancing data use?
    The role of the Forum could include
    • serving as a repository for existing best practices on data use and showcasing existing tools;
    • increasing access to tools, and sharing information on how to use tools;
    • providing examples of the misuse of data;
    • disseminating Forum publications, especially the recently-released Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective (available at http://nces.ed.gov/forum/pub_2012809.asp), and soliciting feedback on Forum resources;
    • partnering with RELs to identify the key questions about data use that SEAs and LEAs want to learn from researchers;
    • explaining best practices for SEA/LEA roles, creating models of relationships for coordinating roles and responsibilities, and explaining strategies for working with different models;
    • providing a sounding board for leveraging educator training;
    • making video materials available to members;
    • promoting American Statistical Association efforts for classroom teachers;
    • identifying ten professional organization conferences and scheduling presentations to discuss Forum work;
    • discussing Forum work at state professional organizations;
    • pursuing strategic partnerships;
    • distributing Forum publications to the media; and
    • publicizing SLDS grant successes.

Teacher Evaluation Follow-up Discussion
Jan Petro (Colorado Department of Education), Patricia Hardy (Pennsylvania Department of Education), Glenn McClain (Platte Valley School District, Weld Re-7, CO), and Linda Rocks (Bossier Parish School System, LA) joined PPI to discuss teacher evaluation systems. PPI members raised the following discussion topics with panelists:

  • How do evaluation systems combine measures of professional practice, such as observations, with student growth scores? The panelists discussed methods for merging data that involve assigning points to each type of measure, which may then be weighted according to the system (a simplified illustration follows this list). Some states have created systems in which data are merged entirely at the state level.
  • What approaches are SEAs and LEAs taking to the issue of teacher rating history and portability? Panelists noted that there is considerable variation in SEA and LEA approaches to addressing the comparability of evaluations between districts, how teachers are rated if they transfer districts, and how evaluators in different districts are trained and calibrated.
  • How are school employees who are not teachers or administrators evaluated? Panelists from Colorado discussed how their state brings together communities of professionals to identify standards for nurses and other professionals in schools.
  • How can states allow districts to use different evaluation systems? Pennsylvania recommends one model and if a district would like to use another model they must show that the alternative model is able to meet state evaluation requirements.
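
As a simplified illustration of the point-and-weight approach described above (the numbers are hypothetical, not any state's actual scale): a teacher who earns 2.4 points for student growth and 3.1 points for professional practice on a 4-point scale, with the two measures weighted equally, would receive 0.5 × 2.4 + 0.5 × 3.1 = 2.75 overall.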

Data Retention Discussion
PPI Chair Sonya Edwards (California Department of Education) invited members to share their organization’s data retention policies and to discuss data retention challenges. PPI members discussed how data retention needs have changed and noted that policies developed for paper records do not work well with electronic data. Some states have statewide archival guidelines, while others have policies based on use; for example, student records may be kept for thirteen years based on an average timeline for progression from kindergarten to graduation. Marilyn Seastrom, Deputy Director of NCES, suggested discussing this topic with the Statewide Longitudinal Data Systems (SLDS) Grant Program to get information on how each state retains data.

EDFacts and Common Education Data Standards (CEDS) MS PowerPoint (909 KB)
Ross Santy (U.S. Department of Education), Beth Young (Quality Information Partners), and Jim Campbell (AEM Corporation) joined PPI to provide brief updates on EDFacts and CEDS and to discuss a project that is assisting states with mapping their EDFacts systems to CEDS. Ross focused on Office of Management and Budget (OMB) clearance, Elementary and Secondary Education Act (ESEA) flexibility, and publicly available assessment data. The latest OMB packet remains largely unchanged. There is general interest in the educational experience of military-connected students, but this is not a new data item. This January, for the first time, EDFacts released data and information on mathematics and reading/language arts proficiency. There is a concerted effort to roll out 2011-12 data while also retaining value and consistency in one file. Ross and the EDFacts team are looking forward to improving transparency.

Beth provided the current status of CEDS. Version 3 was released at the end of January, and Beth shared statistics on CEDS use and the CEDS hierarchy. The number of elements has almost doubled between Version 2 and Version 3; many of the new elements come from early learning and from support for RTT assessments. The increased interest from vendors using the Connect tool indicates forward movement. CEDS currently has over 30 public Align maps and public connections. The Connect tool has been updated with a "MyConnect" button, which allows users to run an instant comparison of their own public maps against any connections to see if they have the elements needed to make the connection. Ross then discussed work with seven states to add Align maps necessary for some EDFacts reporting (Common Core of Data [CCD], assessment, and cohort graduation rate). EDFacts will release its own connections for these data files, and two of the seven states will do so as well.

Steering Committee Business/Report
PPI Chair Sonya Edwards provided members with a briefing on the work of the Forum Steering Committee.

  • The Steering Committee met on Monday, February 11th to discuss the Forum. Members agreed that SEAs and LEAs need more information on the technology requirements and other aspects of the SBAC and PARCC assessments. The Steering Committee will draft a letter to the U.S. Department of Education outlining the information SEAs and LEAs need to be able to implement the assessments within the current timeline.
  • NCES has hired the National Opinion Research Center (NORC) to improve the functionality of the CRDC data collection. Sonya encouraged PPI members to attend an informational session offered Thursday, February 14, 2013, as part of the Management Information Systems (MIS) conference.
  • Sonya will report the outcomes of the PPI meeting to the Steering Committee, and will note the PPI suggestions for Forum outreach.

Afternoon Session

College and Career Readiness MS PowerPoint (274 KB)
Dean Gerdeman and Jennifer Stephan of the Regional Educational Laboratory—Midwest met with PPI to provide a presentation on the REL—Midwest’s work on college and career readiness (CCR). REL—Midwest is one of ten Regional Educational Laboratories tasked with improving academic outcomes for students by promoting evidence-based decision-making; conducting and supporting high-quality research and evaluation; and helping states, school districts, and schools systematically use data. Dean is the Acting Director of REL—Midwest and Jennifer is the Alliance Lead for College and Career Success. The College and Career Success Research Alliance aims to “identify and understand factors that support college readiness, college completion, and workforce success.” Alliance members come from state-level education agencies, higher education agencies, and not-for-profit organizations involved in CCR initiatives. These initiatives have faced challenges ranging from limited financial resources to difficulties in promoting effective collaboration among multiple stakeholders. The alliance serves these initiatives by identifying research and technical assistance needs and interests and developing projects to address those needs. PPI members engaged the presenters in a discussion of what it means to be college and career ready, and Jennifer explained that part of the work of the alliance is to address questions of how to determine readiness. A project currently underway in Indiana aims to identify predictors of college readiness using state data, and has raised questions of how to measure non-academic factors such as college-going culture. The REL has multiple alliances, and some of the topics of interest to PPI members may be addressed by other alliances. For example, a study by a rural research alliance will address the question of whether proximity to a college affects readiness. PPI members suggested that the REL should add more K-12 representatives to the alliance, add a focus for career readiness, look at the effects of scholarships, and address the question of financial readiness for college.

LEA Data Access Working Group Update
Christina Tydeman (Hawaii State Department of Education) chairs the Forum's LEA Data Access for Researchers Working Group, and she joined PPI to provide an update on the group's work toward a new Forum resource. This new Working Group is adapting the Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective (http://nces.ed.gov/forum/pub_2012809.asp) into a companion guide that is more relevant to LEAs. Christina noted that while the practices detailed in the SEA guide are appropriate for LEAs, there are additional LEA concerns that will be addressed in the new guide. LEAs deal with researchers who want to do research in districts, which can create substantial burdens for schools. The guide will explain practices that can reduce the burden on LEAs, such as

  • working with institutions of higher education to teach college advisors the LEA’s research processes so that they can better prepare their students who wish to work with LEA data;
  • differentiating requests for existing and new data;
  • determining what research is needed and useful for the LEA;
  • suggesting considerations and timelines for higher education students interested in undertaking projects in schools;
  • developing appropriate consent forms for parents;
  • engaging individual schools in the research approval process; and
  • monitoring data use and prohibiting the release of protected information.

The group anticipates releasing the guide as an online Forum document in Summer 2013. The guide will include templates and forms aimed at LEA staff and researchers.

Graduation Rate Discussion
Chris Chapman (U.S. Department of Education) joined PPI to discuss the publication of two graduation rates—the Average Freshman Graduation Rate (AFGR) and the new Adjusted Cohort Graduation Rate. Sonya Edwards asked why the AFGR is still published, and Chris explained the history of the two rates and the reasons for publishing both. The No Child Left Behind Act required data on on-time graduation rates, but there were no standard measures for states to use at that time. NCES undertook a study to find the best rate that could be computed from aggregated data, and determined that the AFGR was the best stopgap measure pending the creation of a standard (a sketch of the AFGR computation appears at the end of this discussion). The new Adjusted Cohort Graduation Rate improves upon the AFGR because it follows individual students over time. The AFGR is no longer the standard, but it is still produced for several reasons, including

  • allowing time to ensure that the new rate is comparable across states;
  • allowing trend analysis in research; and
  • serving as a national rate, since a subset of states is not yet reporting the new Adjusted Cohort Graduation Rate.

PPI members engaged Chris in a discussion of how graduation rates may vary among states, including differences in how transfers, dropouts, and homeschooled students are verified. Chris noted that some states are affected by international transfers, such as students moving to Mexico, and there are also questions about whether states count students who do not pass exit exams in their rates. To obtain a standard cohort graduation rate, students should be kept in their original cohort when possible. For example, schools should not be penalized for early graduates. The use of the old rate will serve as a statistical bridge until any issues are resolved.
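
Because the AFGR is built entirely from aggregate counts, the calculation is simple. A minimal sketch, following the standard NCES definition (diplomas divided by an estimated incoming freshman class); all counts below are invented for illustration:

    # Minimal sketch of the Average Freshman Graduation Rate (AFGR).
    # The formula follows the standard NCES definition; the counts are invented.

    def afgr(diplomas_year_t, grade8_t_minus_5, grade9_t_minus_4, grade10_t_minus_3):
        """Diplomas awarded in year t divided by the estimated freshman class:
        the average of grade 8 enrollment five years earlier, grade 9 four
        years earlier, and grade 10 three years earlier."""
        estimated_freshmen = (grade8_t_minus_5 + grade9_t_minus_4 + grade10_t_minus_3) / 3
        return diplomas_year_t / estimated_freshmen

    print(round(afgr(8200, 10100, 10200, 9700), 3))  # 0.82, i.e., an 82% AFGR

The Adjusted Cohort Graduation Rate, by contrast, cannot be computed from aggregates alone; it requires student-level records so that each student can be followed, transferred out, or retained in the original cohort.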

Meeting Review and Summer 2013 Planning
Sonya Edwards (California Department of Education) thanked PPI members for their participation and invited members to comment on the Forum sessions and the PPI meeting. Members reported that they liked the small group discussions and appreciated time spent in in-depth discussions. They also enjoyed hearing about the work of their PPI colleagues and encouraged others to consider presenting at future meetings. The group revisited the idea of Forum engagement with other professional organizations and suggested that the Forum should draft an introductory letter to several national professional associations. Marilyn Seastrom, Deputy Director of NCES, suggested partnering with the American Statistical Association. Sonya reminded members of their responsibility to act as ambassadors of the Forum in their home organizations. PPI members asked if the Forum could help transmit information to members in the event of sequestration. Marilyn suggested that the Steering Committee could speak with Jack Buckley about using the Forum as a conduit for information.

Recommended topics for the Summer 2013 meeting include

  • a session that combines presentations from PPI members with follow-up small group discussions;
  • reports from PPI members on how they promoted Forum resources;
  • information on how Common Core Assessments are affecting accountability;
  • a discussion of partnerships with non-governmental agencies;
  • ways of sharing information across states;
  • data policies around Schools Interoperability Framework (SIF) and State Educational Technology Directors Association (SETDA);
  • data use culture change; and
  • data governance.


Technology (TECH) Committee Meeting Summary

Monday, February 11, 2013

Morning Session

Welcome, Introductions, and Winter 2013 TECH Meeting Review
TECH Chair Laurel Krsek (San Ramon Valley Unified School District [CA]) welcomed everyone to the Winter 2013 TECH meeting and led the group in introductions. TECH members introduced themselves and described some of their current professional interests and challenges. Laurel also reviewed the TECH mission statement and members agreed that it was starting to look outdated. TECH agreed to consider updating the statement via email this spring.

Any vendors in the room were reminded that by participating in Forum meetings, they may have access to information that could potentially disqualify them from a competitive bidding process at a national, state, or local level. When vendors identify an item of potential conflict on a committee’s agenda, it is appropriate for them to excuse themselves while that topic is being discussed.

Summer 2012 Agenda Review
Chair Laurel Krsek reviewed the agenda and opened the floor for suggestions to add items. Laurel pointed TECH members to the notes from the last meeting in July 2012 and reminded everyone of the discussions that helped shape the agenda for this meeting:

  • Teacher-Student Data Link Working Group Update - Lee Rabbitt, Newport Public Schools (RI)
  • Western Interstate Commission for Higher Education (WICHE) Multi-State Longitudinal Data Exchange Update - Josh Klein, Oregon Department of Education and Hans L’Orange, State Higher Education Executive Officers
  • Common Education Data Standards (CEDS) Update: Version 3, Race to the Top, and Schools Interoperability Framework (SIF) - Beth Young and Jim Goodell, Quality Information Partners, Inc. and Larry Fruth, SIF Association
  • Data Use and the Regional Educational Laboratory (REL) Program - Ruth Neild, National Center for Education Evaluation and Regional Assistance, U.S. Department of Education
  • P-20 Metrics / College and Career Readiness - John Kraman, Oklahoma State Department of Education
  • Family Policy Compliance Office (FPCO) Update - Kathleen Styles, Chief Privacy Officer and Dale King, Family Policy Compliance Office, U.S. Department of Education
  • National School Lunch Program - Julie Brewer, Food and Nutrition Service, U.S. Department of Agriculture and Ross Santy and Lily Clark, U.S. Department of Education
  • Enhancing Teaching and Learning Through Data Mining and Learning Analytics - Karen Cator, U.S. Department of Education, Office of Educational Technology
  • Early Warning Systems - Brian Snow, Maine Department of Education; Tom Olson, South Carolina Department of Education; and Laurel Krsek, Vice Chair, Napa Valley Unified School District (CA)
  • Bring Your Own Device (BYOD) - Laurel Krsek, Vice Chair, Napa Valley Unified School District (CA)
  • TECH Election – Laurel and Jay elected chair and vice chair of the Technology Committee.
  • EDFacts Update - Ross Santy, U.S. Department of Education
  • Washington State’s Total Cost of Ownership - Peter Tamayo, WA State Office of Superintendent of Public Instruction
  • Assessment Consortia - Wes Bruce, Indiana Department of Education and Susan Van Gundy, Achieve, Inc.

Early Learning Data
TECH was pleased to welcome Missy Cochenour from the Applied Engineering Management Corporation (AEM) to help get the group thinking about how education data systems need to incorporate, or at least plan to incorporate, early childhood (EC) data—including addressing why EC data and data systems have become a priority (e.g., Race to the Top [RTT]), reviewing the governance for EC, and discussing the current state of EC data use.

Missy specializes in using data to inform decisions, particularly within EC. Before joining AEM, she was the Program Control Consultant and Program Manager of Research and Evaluation for the Los Angeles County Office of Education- Head Start State Preschool. Through her years as a teacher, administrator, and researcher, Missy gained knowledge and experience in a variety of areas within education. Within each role, she worked with various stakeholders—ranging from parents to school boards and governors—to use data to inform decisions. Missy currently provides technical assistance to states on integrating EC data into their state longitudinal data system and leads the early learning team on the Common Education Data Standards. Recently, she has taken on an additional role working with the new Office of Early Learning at the U.S. Department of Education and providing technical assistance to the RTT-Early Learning Challenge grantees, particularly around planning, design, and use of EC data.

Discussion noted that EC data are collected to evaluate EC program effectiveness and to assess how students with and without EC learning experiences perform relative to other students. One challenge is that EC data actually refer to a multifaceted set of data about care and education, health, mental health, nutrition, special needs, and family support. These data are often stored in multiple locations (even within the same agency), are uncoordinated and often require probabilistic matching, usually have significant gaps, and are “owned” by multiple local, state, and federal partners (e.g., state programs, independent programs, birth-3 programs, Head Start, Early Head Start, Individuals with Disabilities Education Act [IDEA] Part B)—meaning the “P” in “P-20” varies across states and districts.
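
Because these programs rarely share a common student identifier, linking their records typically requires probabilistic matching. A minimal sketch of the idea; the field names, weights, and threshold are invented and do not reflect any particular state's algorithm:

    # Illustrative probabilistic matching of two EC records without a shared ID.
    from difflib import SequenceMatcher

    def similarity(a, b):
        """Return a 0-1 similarity score for two strings."""
        return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

    # Exact date-of-birth agreement is weighted as stronger evidence than names.
    WEIGHTS = {"last_name": 0.35, "first_name": 0.25, "dob": 0.40}

    def match_score(rec_a, rec_b):
        score = WEIGHTS["last_name"] * similarity(rec_a["last_name"], rec_b["last_name"])
        score += WEIGHTS["first_name"] * similarity(rec_a["first_name"], rec_b["first_name"])
        score += WEIGHTS["dob"] * (1.0 if rec_a["dob"] == rec_b["dob"] else 0.0)
        return score

    head_start = {"first_name": "Jon", "last_name": "Smith", "dob": "2008-05-02"}
    pre_k = {"first_name": "Jonathan", "last_name": "Smith", "dob": "2008-05-02"}

    # Pairs scoring above a tuned threshold (e.g., 0.85) go to clerical review.
    print(round(match_score(head_start, pre_k), 2))  # ~0.89: likely the same child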

Research is showing that success for students starts at birth and that support is critical from infant care through grade three; thus, many federal programs and states want to answer policy and programmatic questions about their young learners—including discussions of school readiness (EC Advisory Councils & EC Data Collaborative). Moreover, due to changing economic environments, federal programs and states are looking for ways to allocate resources where they will make the largest impact, meaning that EC data will most likely continue to grow in importance.

Teacher-Student Data Link Working Group Update
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) chairs the Forum’s Teacher-Student Data Link Working Group. She provided an update and in-depth introduction to this new Forum resource, which is expected to be released later this spring. Lee reviewed the Table of Contents in the review draft and emphasized the nine use cases that make up a large part of the document.

Monday, February 11, 2013

Afternoon Session

Emerging Educational Technologies Follow-up Discussion
Richard Culatta (Deputy Director, Office of Educational Technology [OET], U.S. Department of Education) joined TECH in a follow-up to his presentation at the Opening Session. The discussion was rich, enlightening, and thought-provoking:

  • Too many people get caught up on devices which, in reality, change quite frequently. It is better to focus on how technology can improve teaching and learning with devices viewed simply as tools for accomplishing the goal of improving education. As such, there need to be learning goals that do not focus on devices. Interoperability and standards (both technology and data standards) permit one to become more device independent.
  • A learning registry is just a database for registering metadata (content). It is not intended to compete with more proprietary resources, but is supposed to help others access information that will help with teaching and learning.
  • It would be great to have an ongoing collaboration between TECH and OET—both are about technology and education. Richard was invited to view this meeting as the beginning of an ongoing relationship in which TECH exchanges ideas with OET and serves as a sounding board and access point to SEAs and LEAs.
  • Data mining—where are the books about mining education data? They are not really out there, but education will eventually have much more data than other industries. We need data scientists in education. Carnegie Mellon offers a track in learning for data scientists.
  • The Amazon retail website is an example of making data access easy and seamless—they built algorithms so that their users don’t need to be data analysts. We need to do this for teachers as well. We build the tools and they can just leverage the data (e.g., identify which students are in need of which support).

Partnership for Assessment of Readiness for College and Careers (PARCC) Follow-up Discussion
Wes Bruce (Indiana Department of Education, representing PARCC) and Michael Muenks (Missouri Department of Elementary and Secondary Education, representing the Smarter Balanced Assessment Consortium [SBAC]) joined TECH in a follow-up to their presentation at the General Session. Topics of discussion included:

  • RFPs for assessment vendors will be released soon. The American Institutes for Research (AIR) and Pearson have infrastructures that have been demonstrated to work well for SBAC and PARCC, respectively.
  • Test prep will be much harder for these assessments because they are not like “old-fashioned tests” where you can study a new tool/content/technique for 20 minutes a day. Instead, students might be asked to read two passages and write an essay and THEN read a third passage and tie all three together into a conclusion in a third essay.
  • PARCC achievement levels: 5 = ready for high-caliber universities; 4 = college/technical career/military ready; 3 = no need for remediation at the community college level; etc.
  • SBAC achievement levels: 4 = as a junior in high school, you are ready for college-level material; 3 = on track for college but need to continue with preparation senior year (non-remedial in state universities unless senior year expectations are not completed).
  • These scores are for placement rather than admission at this point.

Alternative Socioeconomic Status (SES) Measures Working Group
Matt Cohen (Ohio Department of Education) chairs the Forum’s Alternative SES Measures Working Group. He provided some background information on the project and an update on the effort to date:

  • The U.S. Department of Agriculture modified procedures for determining student eligibility for free- and reduced-price meals (FRPM) and subsequently changed the collection of these data in order to reduce burden on LEAs and serve more students. The net result is that the education community will lose the commonly used student-level FRPM eligibility flag, so education will soon need a new way of knowing which individual students are low SES in order to target educational services. Continuing to collect data about who gets free lunch is still an option, but there is no incentive or demand from Agriculture, so this is an opportunity to improve the approach to these data.
  • It is understood that FRPM eligibility is not the best indicator of low SES—e.g., Hurricane Sandy destroyed million-dollar homes, and the children in those families became FRPM eligible because they were homeless (even though their families may actually be quite wealthy).
  • The big questions are: How can SEAs and LEAs develop meaningful poverty data? How do we know poverty when we see it? How do we get data on it (both a policy and a technical issue)? How can we get these data into shape for accountability subgroups and federal program funds compliance? And how can they be useful for instructional services? (A purely illustrative sketch of a composite measure follows this list.)
  • This could likely require data from different agencies. A Forum document will be the product.
  • The working group is looking for a national solution with consistent data across states.
  • The working group met last September to review this problem and will meet on Wednesday following the Forum to finalize its vision/task.
  • This could produce substantial shifts in where federal Title funds go in states.
  • It would be great if there were studies from the research community to validate alternative measures of SES and how they could/should be used.
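
As a purely illustrative sketch of what an alternative, composite measure might look like (the working group has not endorsed any particular inputs, weights, or cutoff; everything below is invented):

    # Hypothetical composite low-SES indicator combining several income-related
    # flags and a neighborhood poverty rate. Weights and threshold are invented.

    def low_ses_flag(direct_cert, medicaid, housing_assist, neighborhood_poverty_rate):
        """Combine individual and neighborhood indicators into one flag."""
        score = 0.0
        score += 0.4 if direct_cert else 0.0      # e.g., SNAP/TANF direct certification
        score += 0.3 if medicaid else 0.0
        score += 0.1 if housing_assist else 0.0
        score += 0.2 * neighborhood_poverty_rate  # 0-1 rate, e.g., from census data
        return score >= 0.4                       # illustrative cutoff

    # A directly certified student is flagged regardless of other factors.
    print(low_ses_flag(True, False, False, 0.10))   # True
    print(low_ses_flag(False, False, False, 0.15))  # False

Validation studies of the kind mentioned below would be needed before any such weights or inputs could be trusted for accountability or funding purposes.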

Education Technology Open Discussion
Chair Laurel Krsek and Vice Chair Jay Pennington (Iowa Department of Education) led a breakout session for state and local education agencies (SEAs and LEAs) to discuss technology issues facing the education data community.

SEA discussion:

  • Assessment consortia uncertainties/concerns include:
    • Devices
    • Bandwidth/network inadequacy
    • Interoperability of student data
    • Lack of coordination between school administrators and curriculum staff
    • Far too many unknowns to expect high quality administration and data
    • Accountability impact of waivers
    • Preparation for the assessment, including training students (“practice tests” including not only content, but also test taking strategies for adaptive assessment)
    • Feedback mechanisms for data quality reviews
  • Virtual education—the Forum Guide to Virtual Education should be updated (http://nces.ed.gov/pubs2006/2006803.pdf).
    • Missing data
    • School and program evaluation (of rigor and quality of education services)
    • Attendance – time versus competency
  • SLDS sustainability
    • Return on Investment (What are SEAs going to ultimately achieve through their SLDS systems?)
  • LEA data system capacity appears to be reaching its limits
    • New collections become increasingly burdensome
    • New/better systems will be needed at the local level

LEA discussion:

  • Assessments
  • Guidance from states to LEAs—what is the role of the SEA in LEA technology decisionmaking?
  • Perhaps there is a role for national bodies like the Forum as an advisory voice for LEA decisionmaking?
  • Technology implementation will always be critical.
  • Return on Investment is always important but especially so in these difficult economic times.
  • Technology planning should be strategic, but it is often reactionary.
  • Budgets for sustainable technology versus always choosing the least expensive option even if it doesn’t meet long-term needs.
  • Data storage, access, mining, and use—all warrant best practices (such as the Forum Technology Suite, http://nces.ed.gov/forum/pub_tech_suite.asp, which should be updated).
  • Virtual education is only becoming more important—the Forum Guide to Virtual Education should be updated (http://nces.ed.gov/pubs2006/2006803.pdf).

Tuesday, February 12, 2013

Morning Session

Family Policy Compliance Office and National School Lunch Program
Kathleen Styles (Chief Privacy Officer, U.S. Department of Education) and Julie Brewer (U.S. Department of Agriculture [AG]) joined TECH in a follow-up to their presentation at the General Session. Topics of discussion included:

  • Joint guidance is nearly ready to release.
  • Discussion questions were passed out asking TECH to confirm that the types of scenarios listed made sense to SEA and LEA representatives (e.g., what are the “school official” uses in the real world? Teachers, librarians, etc.?).
  • If you rename FRPM eligibility data as SES data without adding any other information, the data are still FRPM eligibility data, and the same protections apply to the newly named element. If FRPM eligibility data are only one part of another measure (combined with other data), then the new SES measure does not need to be protected according to AG’s FRPM regulations and standards. (A sketch of this rule follows the list.)
  • Foster care children are now automatically FRPM eligible regardless of economic circumstances.
  • A literal read of AG’s law means teachers can’t see which kids are FRPM eligible to help individual students get services/benefits. This is another reason for education to develop its own measure because then it can be used for these types of educational purposes. Similarly, teachers can’t know who is FRPM eligible even though they are accountable as a performance subgroup.
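
A minimal sketch of the protection rule as described in the discussion above; the function and the source labels are invented for illustration:

    # Hypothetical encoding of the rule discussed above: a merely renamed
    # FRPM eligibility element keeps FRPM protections, while a composite
    # measure that blends FRPM data with other inputs does not.

    def requires_frpm_protections(element_sources):
        """element_sources: set of labels for the data behind an element."""
        return element_sources == {"frpm_eligibility"}

    print(requires_frpm_protections({"frpm_eligibility"}))                    # True: renamed only
    print(requires_frpm_protections({"frpm_eligibility", "census_poverty"}))  # False: composite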

Teacher Evaluation Follow-up Discussion
Glenn McClain (Platte Valley School District, Weld Re-7 [CO]), Patricia Hardy (Pennsylvania Department of Education), Jan Petro (Colorado Department of Education), and Linda Rocks (Bossier Parish School System [LA]) joined TECH to follow up on their morning general session. Topics of discussion included:

  • This is highly sensitive politically and important to students and staff from an instructional perspective.
  • It is understood that some LEAs might wish to take a “wait and see” approach on teacher evaluation because they don’t want to invest time, energy, and resources until the system gets finalized at the SEA level.
  • In PA, alternative systems need to “meet or exceed” the SEA system. Doing so will be challenging because the complexity of a multiple measure approach means that the bar to achieve is quite high and complicated.
  • Joining an SEA system especially makes sense when the SEA provides training and software, whereas an LEA that chooses to go on its own (i.e., “meet or exceed” the SEA system) will not receive such assistance in PA.
  • These systems really need to be thought out thoroughly and comprehensively. “Fuzziness” in policies and application is not acceptable when someone might lose their job over it.
  • Getting the teachers’ union on board was critical. AFT and NEA were on board with the new law in PA even though they didn’t like everything in it (e.g., some issues can no longer be negotiated via collective bargaining).
  • Other policies must get aligned as well. For example, in CO, probationary teachers need to be notified by June 1 if their contracts will not be renewed, but the final evaluation ratings will not be ready until late June. This simply isn’t going to work.
  • There is other “noise” in the system as well (e.g., inter-evaluator reliability).
  • Budgeting – How do you plan for teacher raises if some unknown number of teachers will not get raises because of their evaluation results?
  • Remember, these should be viewed as development tools—the goal is to help teachers improve their performance, not to weed out (fire) bad teachers.
  • There is great variability in the complexity of teaching assignments. For example, some teachers teach the same subject all day, others change from period to period (more prep time). Is it fair to hold them to the same standards?
  • Will teachers get any formative feedback from the evaluation process or will it all be summative—here is what you got?
  • Some teachers are concerned about individual teacher data being housed in a state system (as opposed to in the LEA).
  • Data quality: unlike some other initiatives, PA is assigning more staff to this project, including someone focused on data quality.

Midwest Education Information Consortium
Jay Pennington (Iowa Department of Education) led a presentation on the Midwest Education Information Consortium (MEIC).

  • Participating states include Missouri, Iowa, Minnesota, Kansas, Nebraska, South Dakota, and North Dakota.
  • MEIC has been around for more than 20 years and focuses on regional information sharing.
  • MEIC started discussions for a dropout tracking project in 2010.
  • Kansas, Nebraska, Iowa, and Missouri collaborated with eScholar to design an interstate solution. The design focused on providing the capability to identify and locate individuals who are mobile across states by locating drop-outs who have relocated and enrolled in another state. The foundation to the interstate capability is the unique identification of the student and linking the unique IDs across states.
  • To date, the project has generated non-duplicated IDs across states (eScholar Uniq-ID® products), provided an ID crosswalk/alias ID structure, designed and developed the eScholar Interstate™ ID eXchange, and released it to Iowa, Missouri, Nebraska, and Kansas. It is now finalizing MOUs for pilot participants. (A minimal sketch of a crosswalk structure follows this list.)
  • Required data elements include: student last name, student first name, student middle name, date of birth, and gender.
  • The U.S. Department of Education Privacy Technical Assistance Center (PTAC) provided a great deal of technical assistance during the development of the MOU.
  • With all the benefits of sharing data like this across states, is there any interest in a voluntary national unique student ID? The Migrant Student Records Exchange Initiative (MSIX) might be a model for this.
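
A minimal sketch of an ID crosswalk/alias structure of the kind described above, with invented IDs and field names (this is not the actual eScholar design):

    # Hypothetical interstate ID crosswalk: one exchange ID per student,
    # aliased to each state's local unique ID.

    from dataclasses import dataclass, field

    @dataclass
    class CrosswalkEntry:
        exchange_id: str                               # non-duplicated ID across states
        state_ids: dict = field(default_factory=dict)  # state -> local unique ID

    class Crosswalk:
        def __init__(self):
            self.entries = {}    # exchange_id -> CrosswalkEntry
            self.by_state = {}   # (state, local_id) -> exchange_id

        def link(self, exchange_id, state, local_id):
            entry = self.entries.setdefault(exchange_id, CrosswalkEntry(exchange_id))
            entry.state_ids[state] = local_id
            self.by_state[(state, local_id)] = exchange_id

        def lookup(self, state, local_id):
            """Find a student's IDs in other states from one state's local ID."""
            exchange_id = self.by_state.get((state, local_id))
            return self.entries[exchange_id].state_ids if exchange_id else {}

    cw = Crosswalk()
    cw.link("X-0001", "IA", "IA-123456")  # same student enrolled in Iowa...
    cw.link("X-0001", "MO", "MO-987654")  # ...later located in Missouri
    print(cw.lookup("IA", "IA-123456"))   # {'IA': 'IA-123456', 'MO': 'MO-987654'}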

Southeast Data Exchange (SEDE)
Tom Olson (South Carolina Department of Education) provided a summary of the SEDE project.

  • Participating states include Alabama, Florida, Georgia, Kentucky, North Carolina, and South Carolina and, believe it or not, Colorado and Oklahoma.
  • GA is building the system as part of its SLDS grant. It manages the hub, and the other states run an app provided by GA. It is used to search for dropouts, etc., similar to MEIC.
  • SEDE uses Common Education Data Standards (CEDS) as the data element standard for the matching elements. First name, last name, date of birth (DOB), gender, race, ethnicity, city of birth, parent last name are all elements for trying to match (and are optional, not required).
  • Exact matching will be difficult (e.g., if using first name, last name, and DOB only). One outcome of testing will be an upper boundary on the number of matches that get returned (i.e., sending thousands of potential matches isn’t really helpful; see the capped-matching sketch after this list).
  • Like MEIC, SEDE recommends that SEAs maintain a matching database with only the elements that are options for matching in order to protect the actual student information system (SIS) data and minimize the user volume on the operational SIS.
  • It is hoped that the ongoing costs of maintenance will be limited to help ensure sustainability.
  • The Council of Chief State School Officers (CCSSO) plans to help support and advise these types of collaborative activities.
  • Having some central support is important—tracking milestones, reminders of commitments, etc.—because the SEAs themselves are busy with the day-to-day operation and use.
  • In some ways, this work is setting the foundation for other data sharing—across agencies as well, potentially, as with students and parents (e.g., send me all the data via a MyData button).
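
A minimal sketch of matching with a capped response size, as suggested by the testing discussion above; the element names, agreement threshold, and cap are invented:

    # Illustrative search that returns at most MAX_CANDIDATES potential matches
    # so a requesting state is not flooded with thousands of weak candidates.

    MAX_CANDIDATES = 25

    def score(query, candidate):
        """Count agreeing elements (first name, last name, DOB, gender, ...)."""
        return sum(1 for k in query if query[k] and query[k] == candidate.get(k))

    def find_matches(query, records):
        strong = [r for r in records if score(query, r) >= 3]  # require 3+ agreements
        strong.sort(key=lambda r: score(query, r), reverse=True)
        return strong[:MAX_CANDIDATES]                          # cap the response size

    records = [
        {"first": "Ana", "last": "Lopez", "dob": "2001-09-14", "gender": "F"},
        {"first": "Ana", "last": "Lopes", "dob": "1999-01-30", "gender": "F"},
    ]
    query = {"first": "Ana", "last": "Lopez", "dob": "2001-09-14", "gender": "F"}
    print(find_matches(query, records))  # only the first record clears the threshold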

Topics From the Floor
Tom Szuba (Quality Information Partners) provided a short review of the Forum Unified Education Technology Suite (http://nces.ed.gov/forum/pub_tech_suite.asp) and the Forum Guide to Virtual Education (http://nces.ed.gov/pubs2006/2006803.pdf). Both resources are still largely relevant and could be updated in the near future with minimal effort. Laurel will present the idea to the Steering Committee. Any revision work is envisioned as an online task.

Afternoon Session

EDFacts and Common Education Data Standards (CEDS) MS PowerPoint (909 KB)
Ross Santy (U.S. Department of Education), Beth Young (Quality Information Partners), and Jim Campbell (AEM) joined TECH to update members on EDFacts and CEDS.

  • The EDFacts package is now at the Office of Management and Budget (OMB). Some charter school data will be removed.
  • School Improvement Grant (SIG) – EDFacts inquired about collecting these data for all schools but, based on input during the public comment period, will not proceed with this idea.
  • What about children whose parents are overseas in the military? OMB has interest in collecting data about the educational progress of these kids, so EDFacts asked for feedback on the idea of getting some data about this population. This is NOT currently being proposed as part of EDFacts, but a lot of people are interested in this issue.
  • The EDFacts Coordinators meeting will now coincide with the Summer Data Conference.
  • TECH members were pleased that feedback about additional burden was heard from the public comment period.

www.apps4va.org
Bethann Canada (Virginia Department of Education) reported on "Apps4VA," a software application development program jointly launched by the Virginia Department of Education (VDOE) and the Center for Innovative Technology (CIT). This program challenges the public to review K-12 data and use it to design innovative new tools to aid in better data-driven decisions and intervention strategies.

  • Apps4VA stems from a federal grant funded through the American Recovery and Reinvestment Act of 2009 (ARRA) “to support the development and implementation of a data system that tracks students from EC to postsecondary and beyond.” The Virginia Longitudinal Data System (VLDS) integrates student and workforce data from across the Commonwealth.
  • Program goals included raising awareness of the availability of the VLDS, developing new and innovative ways to use VLDS Data, engaging the public, and increasing transparency.
  • More than 200 participants generated more than 87 ideas for data use related to developing new measures for student performance; revealing factors that affect student performance; providing an enriched educational environment for Virginia students; revealing ways to better prepare Virginia’s students to enter the workplace and to compete in the global marketplace; equipping Virginia teachers with tools to help tailor their programs to student needs; and providing employers in Virginia with a workforce suited to their needs.
  • A “start-up” event hosted by Microsoft included 30 mentors for more than 100 participants. Thirteen teams launched 37 “pitches.”
  • Finalist ideas included: Edu Data Mashup & Simulator, in which an online service simulates and mashes up data for a "visual representation of the facts”; a Goal Tracker service that provides “personalized analytics” for K-12 students using their individualized data and comparing progress against current educational standards; and a My School Budget service that provides local school districts and schools a dashboard to help manage budgets and addresses the need for transparency. Members of the community will have access to “infographic-like” charts that track budgetary management.
  • The winning idea was Pigeon: a service that promotes student achievement by providing grammatically accurate templates for teachers to communicate student needs or progress to their non-English-speaking parents.
  • An overnight “hackathon” resulted in more than $6,000 being awarded to winning teams from across the Commonwealth. The challenge was to create apps based on education data provided by the Virginia Department of Education that will assist in better data-driven decisionmaking for policymakers, educators, and students, or to develop apps that will facilitate communication among Virginia school divisions to compare best or promising practices. Twenty-seven individuals competed on 12 teams. The Best in Commonwealth award went to Predictive Outcomes, a tool to develop dropout risk profiles that will allow educational administrators to develop an intervention strategy for current Virginia students.
  • An Open Competition will occur later this winter.

Technology Security
Baron Rodriguez and Mike Tassey (Privacy Technical Assistance Center [PTAC]) led TECH in a review of current technology threats facing education systems and strategies for mitigating them. Take-aways included numerous requests for PTAC guidance related to high-level security reviews and/or audits (e.g., checking networks for vulnerabilities); requests for best practices in data security, including basic information such as definitions of various IT terms, since most staff are unfamiliar with them; and encouragement for PTAC to reach out even more to LEAs—not all LEAs and SEAs are aware of all the different services and guidance PTAC offers. Many third-party vendors are unfamiliar with the laws, and some have been known to intentionally violate privacy laws. Moreover, some vendors tend to outsource work to smaller contractors overseas. It was recommended that education agencies clarify in detail all terms of agreement, including ramifications for violating terms. Educational entities should have written contracts with vendors, especially cloud providers.

Closing Thoughts
Chair Laurel Krsek thanked TECH members for their contributions to the great meeting. Because the agenda was so full and the conversation so comprehensive, requests for the Summer 2013 meeting will be solicited via an online inquiry to the TECH listserv.


Closing Session MS PowerPoint (1.43 MB)

Tuesday, February 12, 2013

NCES Update MS PowerPoint (589 KB)
Jack Buckley, Commissioner of the National Center for Education Statistics (NCES), joined the Forum to offer his thoughts on delivering on the promise of data in education. He began by discussing the effects that increasing accountability pressure can have on data quality, and presented the interaction of accountability pressure and data quality as a curve. If there is no accountability, there is very little incentive for accurate reporting. The area below the peak of the curve is where increased accountability can lead to more accurate reporting. The area after the peak of the curve is where too much pressure can distort the data. Recent education cheating scandals show how increasing pressure will cause a percentage of individuals to subvert the system—in the case of cheating scandals, by intentionally misreporting data. Methods for reducing accountability pressure include sampling and aggregate reporting. Often, accountability pressure is determined by policymakers, so there is little that practitioners can do to reduce pressure. Instead, practitioners can focus on shifting the curve upwards by improving data quality. Methods for improving data quality include practices exemplified by the Forum and related initiatives, such as cooperation, training, technical assistance, and best practices, as well as improvements in collection technology. NCES is pursuing these strategies through efforts such as developing Common Education Data Standards (CEDS) and the CEDS Align and Connect Tools; assisting with the redesign of the Civil Rights Data Collection (CRDC) collection tool; working with EDFacts and state partners to use CEDS; maintaining and improving the Forum, data conferences, and SLDS conferences; and working with partners to improve collection systems. Jack concluded by cautioning against over-reliance on data and technology and against over-promising the benefits of data.
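
One simple way to picture the relationship Jack described is as a concave curve; the functional form and numbers below are invented for illustration, not taken from his presentation:

    # Hypothetical inverted-U: reported data quality rises with accountability
    # pressure up to a peak, then falls as pressure induces misreporting.

    def reported_quality(pressure, a=2.0, b=1.0, quality_floor=0.0):
        """q(p) = floor + a*p - b*p^2, with the peak at p* = a / (2b).
        Raising quality_floor "shifts the curve upwards," which is what better
        training, standards, and collection technology aim to do."""
        return quality_floor + a * pressure - b * pressure ** 2

    for p in (0.0, 0.5, 1.0, 1.5, 2.0):  # peak at p = 1.0
        print(p, reported_quality(p))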

Standing Committee Progress Reports

Recognition of Completed Projects
On behalf of the Forum, Jack Buckley presented plaques to members of the working groups responsible for developing the Forum Guide to Supporting Data Access for Researchers: A State Education Agency Perspective (http://nces.ed.gov/forum/pub_2012809.asp) and the Forum Guide to Taking Action with Education Data (http://nces.ed.gov/forum/pub_2013801.asp).

Closing Remarks

Tom Ogle commended the Forum on accomplishing the goal of becoming a group that others turn to for expertise and best practices. He noted that the Forum has grown and expanded, and with the use of technology, members are collaborating and developing new projects throughout the year. He thanked members for their efforts and encouraged everyone to complete meeting evaluation forms.


Steering Committee

Monday, February 11, 2013

Welcome and Agenda Review
Forum Chair Tom Ogle (Missouri Department of Elementary and Secondary Education) welcomed members of the committee and reviewed the agenda.

Sunday Review
The Forum School Codes for the Exchange of Data (SCED) Working Group met prior to the Forum meeting on Sunday, February 10, 2013. Kathy Gosa and Ghedam Bairu reported that the National Center for Education Statistics (NCES) and the SCED Working Group will align other NCES course codes with SCED. The alignment will affect the High School Transcript Studies and the National Assessment of Educational Progress (NAEP). This new charge will expand the scope of the SCED revision, which included updating Visual and Performing Arts codes, updating Career and Technical Education (CTE) codes, and developing a best practice guide. The Working Group aims to begin releasing new codes in September 2013.

Monday Review
New Member Welcome
The New Member Welcome provided new members with an overview of the Forum and information on the work of each standing committee. Steering Committee members noted that thirty minutes was sufficient time for the welcome.

Emerging Educational Technologies
Committee members reported that Richard Culatta was a dynamic and engaging speaker, and his presentation on Emerging Educational Technologies was an excellent addition to the Opening Session. Members appreciated his effort to link his presentation to data issues, and expressed interest in inviting him to future meetings.

Assessment Consortia Updates
Representatives from the Assessment Consortia provided a good discussion of topics; however, members were concerned that there is insufficient information available to state and local education agencies (SEAs and LEAs) to adequately prepare for the assessments within the current timeframe. The Steering Committee will draft a letter to the U.S. Department of Education offices that coordinate the work of the consortia. The letter will highlight the information SEAs and LEAs need to prepare for the assessments and concerns about assessment implementation, including

  • the need for a data file map (file layouts for elements);
  • concern that the consortia are not adequately prepared for future technological changes, such as the increasing prevalence of bring-your-own-device (BYOD) policies;
  • the need for the consortia to address the fact that many schools operate wirelessly and wired configurations will be difficult to implement; and
  • the need for adequate information for state procurement processes.

Standing Committee Time
Chairs reported that time spent in standing committees was informative and productive. TECH discussed emerging educational technologies and then moved to discussions about the assessment consortia and the Forum’s upcoming guide to data access for researchers from an LEA perspective. PPI leadership asked members for topics of interest related to data use prior to the meeting, and then based a session on innovative data use on the responses. The session included presentations from Regional Education Laboratory (REL), SEA, and LEA members. NESAC reported that members were especially interested in discussions around data privacy and educational technology.

Outside Agency Requests for Forum Advice
Several agencies requested Forum assistance with data-related projects.

  • The National Endowment for the Arts (NEA) met with Forum leadership in October to discuss the collection of arts-related data in SEAs and LEAs. The NEA and the Forum scheduled a follow-up meeting for February 14th, 2013.
  • The American Clearinghouse on Educational Facilities (ACEF) expressed interest in establishing a committee of nationwide experts to update the 2003 Forum Planning Guide for Maintaining School Facilities. Steering committee members agreed to collaborate with ACEF and will review the update.
  • Andy Zuckerberg of NCES planned an open discussion to obtain input on improving the Civil Rights Data Collection (CRDC) system and submission process. The U.S. Department of Education has engaged the research firm NORC to assist with modifications to the CRDC data collection tool to improve its usability and to streamline the submission process. Forum members were encouraged to attend the session.
  • Ghedam reported that the White House Initiative on Asian Americans and Pacific Islanders (WHIAAPI) has requested Forum assistance with disaggregating data in states with rapidly-growing AAPI populations. The Steering Committee agreed to meet with Akil Vohra of the Initiative at the end of Tuesday’s meeting.

Tuesday, February 12, 2013

Review of Tuesday’s Events
Teacher Evaluation Panel
Panelists presented different approaches to teacher evaluation, different stages of implementation, and different perspectives from SEAs and LEAs. Members noted that this was an effective approach to the discussion because states are at different stages of evaluation system development, especially with regard to roster verification.

Standing Committee Time
Standing Committee Chairs reported on recommendations and ideas from each committee:

  • TECH members are interested in updating the 2005 Forum Unified Education Technology Suite and the 2006 Forum Guide to Elementary/Secondary Virtual Education.
  • PPI members are interested in exploring ways of promoting Forum publications, beginning with members’ responsibility to act as ambassadors of the Forum in their home agencies. PPI members suggested that the Forum should approach national associations to share publications, and Sonya Edwards suggested that information should be shared using the tagline, “When you need to know about _______, go to the Forum.” Ghedam noted that this topic can be addressed by the Forum Communications Committee, which has prepared outreach materials in the past for promoting the Forum and its publications.
  • PPI members were also interested in exploring whether there is a role for the Forum or for RELs to bring together professional development resources and other information on data use in a central repository. Marilyn Seastrom, Chief Statistician and Acting Deputy Commissioner with the National Center for Education Statistics (NCES), also suggested that the Forum could collaborate with the American Statistical Association (ASA)’s statistical literacy project.
  • REL participation in Forum meetings provides an additional resource, and Forum members are interested in learning more about the information and assistance RELs can provide.

Upcoming Working Group Meetings
The LEA Data Access for Researchers Working Group and the Alternative Socioeconomic Status (SES) Measures Working Group have Wednesday meetings scheduled.

Summer 2013 Planning
The Summer 2013 Forum Meeting will be combined with the EDFacts meeting. Members discussed the possibility of focusing the professional development session on data governance, and will submit additional ideas for topics to Ghedam.

White House Initiative on Asian Americans and Pacific Islanders (WHIAAPI)
Commissioner Jack Buckley provided Steering Committee members with a brief overview of WHIAAPI’s efforts to disaggregate data on Asian American and Pacific Islander (AAPI) populations to identify differences in educational achievement and attainment, as well as AAPI subpopulations in need of educational interventions. Deputy Commissioner Marilyn Seastrom suggested that WHIAAPI use U.S. Census data to identify states with growing AAPI populations.

Akil Vohra, an attorney with WHIAAPI, asked the group for suggestions on how to manage data and work with states with growing AAPI populations. WHIAAPI has scheduled a June 7, 2013, meeting to convene representatives of states with emerging AAPI populations to discuss data disaggregation, and Akil asked for suggestions of whom to include in the meeting. WHIAAPI has also been tasked with exploring public-private partnerships, providing technical assistance, and scaling existing models for use in other states. The initiative is interested in using data to inform policies, develop programs, and measure outcomes. Steering Committee members noted that this effort may require changes to data collection processes. If immediate data are needed, information could be disaggregated based on home language. Commissioner Buckley suggested that this is part of the larger issue around collecting socio-economic data on students. The Steering Committee will develop suggestions and communicate with Akil through NCES.

Steering Committee monthly conference calls will resume on Thursday, March 7, 2013, at 12:00 p.m. EST. The agenda for the call will include reviewing meeting evaluations.



Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.

