
Winter 2012 Forum Meeting Notes


National Forum on Education Statistics
February 13-14, 2012
San Diego, California



Opening Session


Monday, February 13, 2012

Forum Agenda Review and Introduction MS PowerPoint (928 KB)

Forum Chair David Weinberger (Yonkers Public Schools, NY) welcomed Forum members to the 2012 Winter Forum Meeting in San Diego, California. After expressing his hope that the meeting would be both productive and enjoyable, he introduced the Forum officers and welcomed the following new SEA and LEA members to the Forum meeting:

  • Laura Boudreaux, Louisiana Department of Education
  • Paul Butler-Nalin, South Carolina Department of Education
  • Sarah Cox, Arkansas Department of Education
  • Drew Dilly, Wyoming Department of Education
  • Sally Gordon, Minnesota Department of Education
  • Reylam Guerra-Goderich, Puerto Rico Department of Education
  • Kent Hatcher, Indiana Department of Education
  • Chandra Haislet, Maryland State Department of Education
  • Jared Knowles, Wisconsin Department of Public Instruction
  • Edward Moreno-Alonso, Puerto Rico Department of Education
  • Gary Niehaus, McLean County Unit School District 5 (IL)
  • Sameano Porchea, Nebraska Department of Education
  • Annette Severson, Colorado Department of Education
  • Peg Votta, Rhode Island Department of Elementary and Secondary Education
  • John Walker, Henryetta Public Schools (OK)
  • Susan Williams, Wyoming Department of Education

In addition to new SEA and LEA members, the Forum is working to increase cooperation with IES’ Regional Educational Laboratories (RELs), and David welcomed the following nine new REL representatives to the meeting:

  • Chris Brandt, Regional Educational Laboratory—Midwest
  • Chris Cobitz, Regional Educational Laboratory—Mid-Atlantic
  • Edith Gummer, Regional Educational Laboratory—Northwest
  • John Hughes, Regional Educational Laboratory—Southeast
  • Wendy Kekahio, Regional Educational Laboratory—Pacific
  • Julie Kochanek, Regional Educational Laboratory—Northeast and Islands
  • Ellen Mandinach, Regional Educational Laboratory—West
  • Steve Meyer, Regional Educational Laboratory—Central
  • Robert Muller, Regional Educational Laboratory—Appalachia

David encouraged new members to visit each of the three standing committees to learn more about the work of the Forum and to join working groups when opportunities become available. Following introductions, he reviewed the mission of the Forum and announced the pending release of the new Forum Guide to Facilities Information Management: A Resource for State and Local Education Agencies. He also noted the success of the Forum Guide to Data Ethics Online Course. David provided a brief overview of the agenda for the meeting, and introduced Sonya Edwards, a longtime Forum member and the SEA representative for the California Department of Education.

Welcome to California
Sonya Edwards, California Department of Education, welcomed Forum members to San Diego and provided information on the city and planned events.

Elementary and Secondary Education Act (ESEA) Flexibility and National School Lunch Program MS PowerPoint (2.3 MB)
Ross Santy and Lily Clark (Office of Planning, Evaluation and Policy Development, U.S. Department of Education) discussed ESEA flexibility. Core policies of ESEA flexibility include setting a high bar for students and schools, protecting all students, and providing flexibility to move forward with reform. Waiver requests must address four core principles:

  1. College- and career-ready (CCR) expectations for all students. States seeking waivers are expected to adopt CCR standards in reading and math. The timeline for waivers allows states time to engage individual schools so that standards and assessments can be used statewide.
  2. State-developed differentiated recognition, accountability, and support. States are expected to set ambitious but achievable goals for schools.
  3. Supporting effective instruction and leadership. Teacher and principal evaluation and support systems should provide meaningful feedback to teachers.
  4. Reducing duplication and unnecessary burden. Burdensome administrative requirements should be removed, and reporting should be streamlined when possible.

Flexibility and waivers are designed to transition states to state and local systems with the ultimate goal of improving results for all students. The first waivers were granted on February 9, 2012, to Colorado, Florida, Georgia, Indiana, Kentucky, Massachusetts, Minnesota, New Jersey, Oklahoma, and Tennessee, and multiple rounds of waivers will follow. Waivers are not competitive. The process for waiver submission and approval includes peer review as well as U.S. Department of Education (ED) feedback and technical assistance. Lily discussed some of the key characteristics of the waivers that have been approved thus far, along with state-specific examples of implementation. Approved waivers promote continuous improvement for all students, transition all students to higher standards, demonstrate a renewed focus on closing achievement gaps, provide accountability based on student growth and progress, increase state and district capacity for school improvement, provide a holistic view of success, and provide teachers and principals with support and effective professional development. Importantly, waivers include consideration of factors beyond student performance in reading and math, leading to a renewed focus on a well-rounded education.

As a condition of the waiver, states must report data to ED documenting progress toward plan implementation. Whenever possible, existing data collections will be leveraged to demonstrate progress within the new systems. Consolidated State Performance Reports (CSPR) and EdFacts will continue to be used to monitor state implementation of plans. Changes are planned for EdFacts to accommodate the revised reporting requirements. Reporting requirements will vary according to the specific details outlined in each approved state's plan. More information on ESEA flexibility is available at http://www.ed.gov/esea/flexibility, and questions and comments may be sent to ESEAflexibility@ed.gov.

Forum interest in the National School Lunch Program motivated Ross to provide a brief update on the Healthy, Hunger-Free Kids Act of 2010 (PL 111-296). In-depth information on the program is available through the U.S. Department of Agriculture (USDA), and Ross offered to bring questions generated by the Forum to the attention of the USDA. Three major changes to the act affect data: the use of direct certification is encouraged; a fourth provision on community eligibility was introduced; and the Secretary of Agriculture is directed to identify alternatives to annual eligibility applications. Student eligibility for the program is determined by a household application or by “categorical eligibility,” which refers to a student's or household’s qualification through another program. Direct certification, which is now encouraged, is a type of categorical eligibility whereby public records are used to determine whether a student is eligible for the program. Ross noted that use of direct certification of student eligibility may impact the availability of annual individual eligibility data, and it may impact connections between statewide longitudinal data systems and other state systems.

The newly added Provision 4 of the Act, Community Eligibility, will be gradually introduced to states. Community eligibility allows communities in which at least 40 percent of students are eligible for the program via direct certification to forgo collecting individual household applications and to provide free lunch and breakfast to all students. In July 2011, community eligibility was introduced in Illinois, Kentucky, and Michigan. Additional states will be added in 2012 and 2013, and it will be available nationwide in 2014. The USDA is investigating individual application options for the program, and plans to release a formal report in 2013. Ross listed key questions for state and local education agencies to consider based on the 2010 Act, including:

  • Is your state using Free and Reduced Price Lunch (FRPL) eligibility for accountability reporting?
  • What risks need to be addressed if districts or schools choose community eligibility?
  • Can your agency continue to collect individual information used to directly certify students if NSLP no longer requires individual data?
  • How do you currently deal with the variety in certification methods? What if variations increase in coming years?

ED will also have to consider how changes affect ED programs and data collections. An EdFacts white paper will be posted to ED’s website addressing the National School Lunch Program. In addition, information on the Healthy, Hunger-Free Kids Act of 2010 is available at http://www.fns.usda.gov/cnd/governance/Legislation/CNR_resources.htm, and information on the National School Lunch Program is available at http://www.fns.usda.gov/cnd/lunch/.

The Learning Journey: Solana Beach School District MS PowerPoint (5.4 MB)
Leslie Fausset, Superintendent of the Solana Beach School District in Southern California, and her colleagues Lisa Denham, Principal of Skyline School, and Christie Kay, a teacher at Solana Pacific Elementary School, joined the Forum to discuss data use. Leslie summarized her presentation as the "story of a district," which can be seen as a microcosm of California—encompassing wealthy and impoverished areas; urban, suburban, and rural areas; and areas with significant numbers of English language learners. The district is generally considered to be high-performing and over time has increased the number of students achieving advanced proficiency, but there are nevertheless students who are in need of assistance and support. Through the use of data and technology, the growth of each student is measurable. As part of the learning journey, the district has promoted professional development and has implemented a district-wide language arts curriculum.

Lisa noted that when she arrived in the district, Skyline School staff had little experience using data. Now, staff members review trend data, identify improvement areas, use multiple sources of student data to inform decisions, and develop site-strategic plans. Student data are reviewed in school-wide meetings as well as at individual grade-level team meetings both prior to and throughout the school year. Data are used to inform school improvement plans, and each grade develops “smart goals” for reading and math attainment. Reports are designed to provide clear information to teachers, including information on individual student performance organized around each identified goal. The school also invites parents to review data. Lisa discussed the ways in which data improve education for English language learners, including monitoring benchmarks to ensure that students continue to improve after they reach proficiency. English language learners also benefit when data are shared with parents, and when teachers and parents create plans to foster ongoing student learning and address ways to prevent the loss of knowledge over the summer months.

School-wide data are also useful in determining areas in which staff development is needed. Teachers employing effective strategies can be identified and their strategies shared with colleagues. Moreover, data can point to areas of the curriculum where specific interventions are needed.

Christie, a 5th grade teacher, discussed the ways in which data can be used in the classroom. She reviewed a number of reports that she uses with students and parents to set goals and monitor student and classroom progress. She noted that computer-based assessments have improved opportunities for data use, because data are now readily available to teachers in the form of useful reports. Christie uses reports to differentiate instruction in her classroom and to guide instruction for small groups of students. Reports can also quickly provide information on where students are in the process of mastering standards, and can identify the specific standards that individual students need to learn. Students are taught to read the reports, and they become involved in the analysis of data through goal setting, whereby the class as a whole sets goals based on aggregate class scores and individual students set goals for themselves based on test scores. Such goals act as a motivational tool for students throughout the school year. Reports that are tailored to students and to parents show district average scores, individual student growth over time, typical student growth, and national percentiles. Reports that identify students’ Lexile skills are useful for encouraging students to select books that are appropriate to their reading ability.

Forum members were interested in knowing more about how Solana Beach School District budgeted time and money for professional development, and how teachers know which specific interventions to implement when the data identify areas for improvement. The speakers discussed how they budget for professional development, noting that they draw on strategies used by successful teachers in the district, and how data use and professional development have been incorporated into regularly scheduled meetings. They also discussed the district's success in using research-based strategies as well as various strategies identified in the standard curriculum when targeted interventions are needed.

Joint Session: FERPA Updates

Monday, February 13, 2012

FERPA Updates MS PowerPoint (246 KB)
Kathleen Styles, Chief Privacy Officer with the U.S. Department of Education, joined the Forum to provide an update on the Family Educational Rights and Privacy Act (FERPA). Kathleen underscored the importance of data privacy protections with a brief overview of data breaches in educational institutions. She reflected on the information she was able to share with the Forum at the Summer 2011 Meeting, and noted the many changes that have been accomplished since that time. Major accomplishments include finalized FERPA regulations released in December 2011, numerous new guidance and best practice documents, resumed FERPA trainings, increased coordination between the Privacy Technical Assistance Center (PTAC) and the Family Policy Compliance Office (FPCO), and the promotion of a two-way line of communication on privacy issues between ED and the educators in SEAs and LEAs. Webinars have been very popular, and basic FERPA webinars will be repeated periodically.

Kathleen provided Forum members with a list of FERPA resources and then explained the FERPA regulatory changes which went into effect on January 3, 2012. To illustrate the changes, she presented three case studies intended to help education agencies better understand the new FERPA regulations. The case studies are available at the PTAC website (http://www2.ed.gov/policy/gen/guid/ptac/index.html). In 2012, Kathleen’s list of priorities includes continuing to expand PTAC’s assistance to LEAs, offering more guidance and best practices, promoting inter-agency cooperation, publishing data while protecting personally identifiable information, and balancing privacy and transparency. Because of the need for information on developing written data sharing agreements, a top priority will be the creation of a template or checklist for written agreements. In addition, she plans to issue best practices on the electronic transmission of PII, determining what types of video are “education records,” transparency, and distinctions between de-identified and aggregate data. A breach response checklist is planned, and ED will collaborate with the U.S. Department of Agriculture on the topic of Free and Reduced Price Lunch (FRPL) data. Forum members were encouraged to submit their suggestions and ideas for long-term projects.

SEAs and LEAs were encouraged to review how their agencies report on the use of student data. Kathleen emphasized that transparency is key, and PTAC can help agencies to provide basic information on their websites about what is being done with student data and how it is protected. She stated that SEAs and LEAs should be proud of the insights they gain through the use of student data, and should publish research results. In summary, she reiterated her interest in hearing feedback from Forum members, and encouraged members to attend conference sessions on privacy protection.

National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, February 13, 2012

Afternoon Session

Welcome, Introductions, and Agenda Review
NESAC Chair Cheryl McMurtrey (Mountain Home SD 193, ID) welcomed the committee members to San Diego and to the NESAC meeting. Members introduced themselves, and Cheryl took a moment to remind associate members, business partners, and vendors that while their attendance in these sessions is appreciated, they may risk disqualification in certain state, local, or federal competitive bidding processes by being part of sensitive conversations during the committee proceedings. Cheryl then reviewed the meeting agenda and the proceedings of the summer meeting in Bethesda, Maryland.

National School Lunch Program & ESEA Reauthorization and Waivers
Ross Santy and Lily Clark (ED) asked Forum members for feedback to help inform the National School Lunch Program conversation. Members brought forward the following questions and issues:

  • What organizations can provide direct certification?
  • Other programs use free lunch application totals; if direct certification is used, applications will not go out and these totals will not be available.
  • What other federal programs will be impacted by changes to free and reduced price lunch certification procedures?
  • Provision 2 districts report 100 percent economically disadvantaged, which affects accountability data because non-economically disadvantaged students are included in the low SES sub-category.
  • Is it possible that some LEAs using new eligibility measures will “escape” accountability measures for SES sub-groups?
  • Is there a difference in the reimbursement processes or amounts in the different provisions? A grid outlining the differences among the provisions would be helpful.
  • How is “community” defined?
  • Direct certification still requires human interaction.

Teacher-Student Data Link Working Group
Lee Rabbitt (Newport Public Schools, RI), Working Group Chair, provided an update on the working group's status. Work on a new resource started at the Summer Forum. The first meeting occurred in December 2011, with multiple web meetings following and a meeting at the Winter Forum. The new resource should be ready for publication by the Summer Forum (or shortly thereafter). It will include recommendations for LEAs and SEAs on implementing the teacher-student data link. The product is not intended to replace past work and research in this area. It will reference prior work; components and considerations of linking teacher and student data; implementation of the link; and use case scenarios (ranging from federal reporting to local use). The document is designed to promote best practices.

NESAC members asked questions and provided feedback in several areas:

  • Job sharing should be addressed.
  • The use of “dosages” (percentages) or other means of assigning responsibility (contributing educators) should be addressed.
  • It would be useful to show types of analysis that can be done once a link is established.
  • Is there any information on the effectiveness of the link?

Teacher-Student Data Link Discussion
Sheri Ballman (Princeton City School District, OH), Susan Williams (Virginia Department of Education), Chris Woolard (Ohio Department of Education), and Patricia Sullivan (Texas Education Agency) led a discussion on the teacher-student data link and the progress underway in each of their states on this issue. The group discussed the use of a dosage-type measure and how such information is gathered, use of the link for multiple purposes, how the link can accommodate special education out-placements, and roster verification.

Tuesday, February 14, 2012

Morning Session

Data Release Policy
Kathleen Styles, Chief Privacy Officer, U.S. Department of Education, and Jack Buckley, Commissioner, National Center for Education Statistics, facilitated a discussion on data release policies. ED data committees are working on data release issues (not including NCES statistical data), and Forum feedback is welcomed on preliminary thoughts about the policies. ED is working to establish processes, such as cell suppression or blurring, for releasing program data, and NESAC members are invited to share their thoughts on what is working. Issues noted by NESAC members include:

  • Rules that require the suppression of zero students (e.g., zero students who are hearing impaired) seem overly restrictive.
  • There are different thresholds for different types of sharing.
  • Teacher dashboards being used in the classroom bring up data release issues because of the danger of unauthorized users walking in when a teacher is using the dashboard.

Data Sharing Discussion
NESAC Vice Chair Ray Martin (Connecticut State Department of Education) led a discussion on data sharing. Members are still hesitant to say what types of sharing are permitted or prohibited under the new regulations. Scenarios detailing what types of data sharing are and are not allowed would be helpful.

Statewide Longitudinal Data Systems (SLDS) Update
Nancy Sharkey and Emily Anthony (NCES) provided an update on the SLDS Program. The FY12 Competition is in full swing. A Review Panel will meet at the end of February, and the panel's recommendations will be brought back to ED. The awards should be announced in late spring or early summer, with start dates in late summer. They also discussed the Public Domain Clearinghouse and other resources NCES is developing regarding data use and working with administrative records data.

Civil Rights Data Collection (CRDC) Update
Rebecca Fitch and Ross Santy (ED) provided an update on the CRDC. Two-thirds of school districts have already logged on to provide contact information. ED will follow up with the school districts that have not yet logged on. NESAC members had the following comments:

  • Can we get the file structures now? Are the software vendors engaged? If so, the burden of the collection will be reduced.
  • Some of the items are very challenging, for example, students subjected to physical restraint.
  • Can you work with the SAT folks to have test takers put in their high schools?
  • What will be pre-populated?
  • Better definitions and instructions are needed for some of the items.

Assessment Consortia Discussion
NESAC members broke into LEA and SEA discussion groups to discuss what is being done now in member organizations to prepare for the new assessments; whether or not the Forum should get involved and if so, in what capacity; and what the Forum can provide the larger community.

LEA Issues:

  • What computer capacity do the LEAs have for assessments?
  • Capacity is mixed, both currently and in terms of resources to buy new materials.
  • How will reports be organized?
  • What data will be returned to the LEA?
  • What rules will be used when LEAs get the data back?
  • How valuable will the data be to teachers?
  • Will LEAs have to upload a database of their students, and if so, how will that work and how will mobile students be included?
  • Virginia SEA has a guide for LEAs for on-line testing.
  • Questions on online testing software and appropriate platforms.

SEA Issues:

  • It seems that the Assessment heads are working on this but not bringing in other state folks (like SIS folks). Conversations should be broadened to include more stakeholders.
  • Idea – data structures and how they work together for success in this area.
  • Questions on technological capacity.
  • Idea – Assist states that haven't made a decision yet (or are in two Consortia)
  • Eventually how will this be reported to EDEN?

Joint Issues:

  • What are the Consortia’s feedback loops, and do they provide any LEA feedback?
  • Assessment Consortia should be in attendance at the Summer Meeting.
  • Technology readiness.
  • Forum role – data coming through in a way that sustains and improves what we are already doing.
  • Communicate on behalf of the members that this isn’t just an assessment exercise; we should all be integrally involved in this now. Better coming from this larger voice.
  • Half-page best practices on Common Core implementation (need a team now that includes… data sharing, confidentiality)
  • Prototype of the assessment
  • Have the Forum go speak (an SEA and LEA) to both Consortia about important issues to Forum members (learn from our work).
  • Be careful about too closely associating with the Consortia as there may be some consequences to Forum participation for those states that are not part of either consortium.

Afternoon Session

Common Education Data Standards (CEDS) Update MS PowerPoint (6.16 MB)

Jim Campbell (AEM Corporation) joined NESAC to discuss the Common Education Data Standards (CEDS) and to demonstrate tools associated with the project. CEDS is an NCES initiative to develop voluntary, common data standards for a key set of education data elements. CEDS Version 2 was released in January 2012, and includes elements from early learning and postsecondary education as well as an expansion of K12 elements. Jim reviewed why CEDS is needed, identified the parts of the standards, and described the uses of CEDS. CEDS is not solely an ED undertaking; stakeholders from various education sectors contribute to the development of each new version. In addition to the ongoing work of stakeholders, each round of CEDS development includes consideration of existing standards, alignment with the field, and a public review period prior to release. During the development of Version 2, several support tools were created to improve user interaction with CEDS; as a result, the CEDS website (http://ceds.ed.gov/) allows users to filter CEDS elements by domain and includes a guide for using the data model. He also provided information on the CEDS Alignment Tool and the future Use Case Generator. The web-based Alignment Tool allows users to import or input data dictionaries, align data to CEDS, and compare them with other data dictionaries. Jim noted that several SEA, early learning, and postsecondary maps are being added to the tool. NESAC members were interested in the uses of the Alignment Tool, especially with regard to generating comparison reports between data dictionaries.

Data Governance and Career and Technical Education (CTE) Data in Washington MS PowerPoint (335 KB)
Bill Huennekens, Washington State Office of Superintendent of Public Instruction, provided an overview of how CTE data stewards are integrated into data governance discussions in Washington.

Topics from the Floor
NESAC Chair Cheryl McMurtrey (Mountain Home School District 193, ID) asked members for topics that are of interest to them and topics they would like to hear about at the July meeting. Suggestions included:

  • Should there be a Forum EDEN group put together so LEAs can be involved? (LEAs cannot participate in the EIMAC task force.)
  • College and career readiness working group
    • Panel of states with ESEA waivers
    • A white paper would be helpful
  • Social media and privacy (ask Kathleen Styles to discuss)
  • CRDC – room for continuous improvement
  • Data security breaches and appropriate responses
  • LEA privacy policies like data release, etc.
  • Dropouts – discussion of the change from an event rate to cohort or longitudinal rates made possible by access to longitudinal data
  • Get NCES groups together (SASS, CCD, finance, etc.) to reduce burden
  • What is NCES’s data governance process?
  • Assessment Consortia
  • Best practices on postsecondary remediation rates
  • Knowledge transfer of data or tech personnel

Policies, Programs and Implementation (PPI) Committee Meeting Summary


Monday, February 13, 2012


Afternoon Session

Welcome and Introductions
PPI Chair Tom Howell (Michigan Center for Educational Performance and Evaluation) welcomed everyone to the meeting and led the group in introductions. He reminded participants that the purpose of PPI is to address the policy implications of the Forum’s work on national education data issues.

Any vendors in the room were reminded that by participating in Forum meetings, they may have access to information that could potentially disqualify them from a competitive bidding process at a national, state, or local level. When vendors identify an item of potential conflict on a committee’s agenda, it is appropriate for them to excuse themselves while that topic is being discussed.

Agenda Review
Tom Howell outlined the PPI agenda for the winter meeting and invited members to suggest additional topics for discussion.

Summer 2011 PPI Meeting Review
Tom Howell reviewed the work PPI accomplished at the Summer 2011 Meeting, noting that discussions that occurred at the Summer Meeting informed the development of the Winter agenda.

Western Interstate Commission for Higher Education (WICHE) Pilot Update MS PowerPoint (131 KB)
Hans L’Orange (State Higher Education Executive Officers, SHEEO) provided an update on the work of the WICHE Multi-State Data Exchange. This four-state pilot is funded by the Bill and Melinda Gates Foundation and managed by WICHE. Hans briefly reviewed the background and goals of the project and then discussed the types of policy questions that the pilot is designed to answer. He shared some of the challenges faced when exchanging data among different states and education sectors, as well as some of the successes. Aspects of the project include the development of a Memorandum of Understanding (MOU) to guide the data sharing process, and processes for data matching that preserve the privacy and confidentiality of data. PPI members were curious to learn more about the outcomes of the project and were interested in the design of the MOU.

Data Use Working Group Update
SEA Data Use Working Group Update
Kathy Gosa (Kansas State Department of Education) provided an update on the Data Use Working Group and the SEA Data Use Working Group. Products developed by the Data Use Working Group will promote taking action with data. An introduction and two briefs are nearing completion, and work on a third brief is currently underway. The group plans to hold a series of meetings via web conference, and will convene again in April with the goal of having documents ready for Forum review in May.

The objective of the SEA Data Use Working Group is to document best practices and challenges involved in sharing data with researchers. The group’s forthcoming guide, directed at SEAs, will include resources and templates to facilitate data sharing. PPI members were interested in whether the guide defines “research” and specifies who is considered a “researcher.” Kathy responded that the goal of the group is to outline a process for data requests that states can use to address different types of research requests from different sources. PPI members were also curious about research partnerships, which will be addressed in an appendix to the document. In response to questions from PPI members, she noted that the guide will refer readers to other Forum resources as appropriate, and it will include important information on tracking requests, confirming data destruction, and identifying the resources necessary to implement a data sharing infrastructure.

Data Quality Campaign 10 State Actions
PPI members broke into two groups to discuss the results of the Data Quality Campaign’s (DQC) report, Data for Action 2011. Discussion focused on the DQC’s Ten State Actions for Effective Data Use. Some members noted that it would be helpful if states had the opportunity to not only indicate whether they had completed the actions, but also provide supporting information (e.g. links to information on the SEA’s website or to relevant legislation) that could be useful to other states working towards achieving each goal. Moreover, PPI members noted that the questions about data governance could be improved by giving states the opportunity to give more information on which stakeholders are involved in the data governance process. PPI members suggested that it might be worthwhile for the DQC to collect states’ definitions of college and career-readiness.

National School Lunch Program and Elementary and Secondary Education Act (ESEA) Flexibility
Ross Santy and Lily Clark (ED) joined PPI to answer questions and discuss issues relating to changes to the National School Lunch Program as well as ESEA flexibility. PPI members were curious about the regulations governing the release of Free and Reduced Price Lunch (FRPL) data, the use of FRPL eligibility as a proxy indicator of socio-economic status (SES), and the effects of Provision 4 of the Healthy, Hunger-Free Kids Act of 2010 on the collection of FRPL data. Ross informed the group that representatives of ED are currently working with the USDA to clarify issues relating to FRPL data use, and SEA and LEA representatives can send questions to the Privacy and Technical Assistance Center (PTAC). In addition, PTAC is interested in FRPL data use cases, which can inform discussions with the USDA. SEA and LEA representatives were interested in learning how their colleagues were measuring and defining SES, whether FRPL data contributed to states’ measurements, and what other data elements are considered when defining SES. Ross recommended a forthcoming ED white paper, which will include information gathered from different states.

Following the discussion of changes to the National School Lunch Program, Lily engaged the group in a brief discussion of issues relating to ESEA flexibility. PPI members were especially concerned about the development of indicators for college and career readiness (CCR).

Tuesday, February 14, 2012


Morning Session

Assessment Consortia and LEAs
At the Summer 2011 Meeting, the Forum held a panel discussion on the work of the two assessment consortia: Smarter Balanced Assessment Consortium (SBAC) and the Partnership for Assessment of Readiness for College and Careers (PARCC). PPI members expressed the need to further discuss the work of these two consortia, and on Tuesday morning the group spent time developing a list of questions and concerns to share with the consortia regarding the effects of new assessments on policies and procedures.

Issues identified by PPI members include:

  • How will Common Core State Standards be incorporated into the assessments?
  • There is the potential for a gap year between the old and new assessments. How will this gap year, and the implementation of new assessments, affect growth models?
  • How will instructional practices and curricula be altered to align with new standards and assessments?
  • SEA and LEA representatives reported concerns that there has been little information available from PARCC regarding implementation issues. LEAs need information to address the number of computers that must be available simultaneously, space issues and lack of computer labs, and how implementation will proceed with little or no financial support.
  • Will testing occur simultaneously or will there be a window of time for tests to occur?
  • What costs are associated with upgrades and implementation? Information is needed on technical costs and the mechanics of implementation.
  • The validity of the data will be affected by logistics, for example, the timing of the test and whether it is offered simultaneously or if there is a testing window.
  • SEAs and LEAs would benefit from knowing if there is a timeline or window for implementation. It is difficult to plan without knowing what is expected and when, especially with regard to budget cycles.
  • What are the staff needs?
  • What accommodations are needed for computer-adaptive tests? Can accommodations be administered to a group of students? (Note: Considerable LEA time is spent planning for EL and IEP student accommodations, and more information is needed to gauge whether we have enough staff to provide accommodations.)
  • How will scores on SBAC and PARCC tests compare?
  • How will the fact that cut scores for College and Career Readiness (CCR) vary by state and institution be handled?
  • How will the work of the consortia be evaluated?
  • How much bandwidth will LEAs need?

Teacher-Student Data Link Working Group Update
Lee Rabbitt (Newport Public Schools, RI) joined PPI to discuss the work of the new Teacher-Student Data Link Working Group. Although the group is sponsored by NESAC, Lee visited each of the standing committees to gather information and suggestions to inform the Working Group’s next meeting. This new group is developing a guide to best practices for SEAs and LEAs implementing the data link, and intends to have a document ready by the summer. The guide will include use cases, components of the link, and information on implementing the link. Representatives of the Gates Foundation as well as several states with experience implementing the link are participating in the development of the new guide. PPI members provided feedback on topics that they are most interested in learning about, including how links can accommodate multiple teachers, flexible student regrouping, and virtual education. Some members were interested in how this work will be affected by the work of the testing consortia. Lee thanked PPI members for their input and encouraged them to submit further recommendations to her prior to the Working Group’s Wednesday meeting.

PPI Discussion: Teacher to Student Data Link
PPI Vice Chair Sonya Edwards (California Department of Education) opened the floor to further discussion about teacher-student data links as well as any comments on the previous day’s presentation, The Learning Journey: Solana Beach School District. PPI members enjoyed the presentations by representatives from the Solana Beach School District, and were curious as to how the district is able to implement what appear to be extensive professional development opportunities. Members were also interested in the district’s testing results following the implementation of a new district-wide language arts curriculum.

PPI Round Robin Demonstration: Transformation of Data to Information
Bruce Dacey (Delaware Department of Education) and Josh Klein (Oregon Department of Education) joined PPI to take part in a discussion on the transformation of data to information. Chandra Haislet (Maryland State Department of Education) and Carmen Jordan (Arkansas Department of Education) rounded out the panel of presenters. Each presenter discussed ways in which their state is transforming data into useful information that can be accessed and used by stakeholders.

Delaware is a Race to the Top (RTTT) state and is also participating in the Harvard Strategic Data Project, a two-year program that assigns a Harvard Data Fellow to the SEA while also training two SEA employees as Agency Fellows; Harvard also provides data analysis for the SEA. To promote data use, Delaware has pursued the strategy of providing schools with “coaches”: data coaches work with professional learning communities within schools, development coaches focus on teacher evaluation, and leadership coaches work with struggling principals. Over 1,500 applications were received for data coach positions. High school teachers participate in 90-minute professional learning communities that alternate between focusing on data and on instructional improvement. Bruce provided PPI members with a list of the data systems used in Delaware and demonstrated their use. When discussing data systems, Bruce emphasized that states must be sure to discuss the impact new data systems will have on individual districts. The state is in Phase 2 of the Gates Foundation-funded Shared Learning Initiative (SLI), which includes the development of a data warehouse and dashboard; the Council of Chief State School Officers (CCSSO) is a proponent of SLI. The state has also mapped local course codes to the Secondary School Course Classification System: School Codes for the Exchange of Data (SCED).

The Oregon DATA Project is “building educators’ capacity for using data to improve achievement.” The project is funded through a Statewide Longitudinal Data Systems (SLDS) grant. Its Data Literacy Professional Development component is led by certified data trainers and has been very successful; Josh noted that it was developed in part using the Forum Guide to Building a Culture of Quality Data: A School and District Resource. SEAs and LEAs interested in the project can access the DATA Project Evaluation Report and the Data Project Year 5 Report at www.oregondataproject.org. In addition to the DATA Project, seven regional data warehouse providers serve Oregon. Funded through competitive grants, these providers offer dashboards for teachers and administrators, among other services. Oregon also encourages stakeholders to access assessment information, school information, and other reports through the Education Data Explorer (www.educationdataexplorer.com). Work is currently underway to expand this tool.

Maryland was a recipient of SLDS and Race to the Top (RTTT) grants, and has undertaken a considerable expansion of information systems, including establishing P20W collaboration. Chandra discussed the work that Maryland has undertaken to build consensus and ensure the involvement of LEAs in the development of new systems, including a help desk for RTTT questions, multimedia training, user acceptance testing, data coaching, and a data portal that offers a single point of entry for information. She demonstrated the Maryland Data Projects Portal, which includes a longitudinal data systems library, as well as the Child Care Maryland portal. PPI members were interested in the ways in which LEAs have been involved in the process, and noted the usefulness of surveys for informing training models.

Arkansas has several projects that facilitate the transformation of data to information. At the Summer meeting, Forum members learned about the Hive, Arkansas’ portal for information on assessments. Carmen introduced PPI members to the Arkansas Department of Education Data Center (http://adedata.arkansas.gov/), which is a public site that offers a “one stop shop” for data tools and reports of interest to stakeholders. Tools are organized by stakeholder groups, and approved users can log into dashboards through the site.

FERPA Update and Data Release Policy
Kathleen Styles, Chief Privacy Officer, U.S. Department of Education, and Jack Buckley, Commissioner, National Center for Education Statistics, joined PPI to discuss the development of a data release policy. A Data Strategy Team co-chaired by Jack and Kathleen is currently working to create a clear and concise policy for protecting students from unlawful disclosure and determining an appropriate disclosure review process. Kathleen discussed with PPI members some of the approaches that have been considered by the Data Strategy Team, and she sought the input of the group on best methods for protecting data, such as blurring and self-suppression. In many cases, research applications do not require personally identifiable data. ED is considering developing downloadable school-level files for researchers and is pursuing the development of an open source tool LEAs and SEAs can use to suppress data. The group discussed problems that arise when statistical disclosure rules prevent the release of data that show, for example, 100 percent proficiency. PPI members also suggested that the Data Release Policy should be applicable to LEAs; Jack and Kathleen agreed and emphasized that it should be applicable at the school level. Kathleen noted that PTAC has expanded services to assist LEAs, and LEAs have also benefitted from the “FERPA 101” training offered through PTAC.
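The suppression issues the group discussed can be made concrete with a minimal sketch. The minimum cell size of 10 and the rule names below are hypothetical illustrations, not an actual ED or NCES policy: primary suppression hides small counts, and complementary suppression then hides a second cell so the first cannot be recovered from the published total. The 100 percent proficiency case is the limiting example, since a zero cell always falls below any threshold.

```python
# Minimal sketch of cell suppression for aggregate reporting.
# The threshold of 10 is an illustrative assumption.

MIN_CELL = 10  # hypothetical minimum reportable group size

def suppress(counts):
    """Replace small cells with None (primary suppression), then
    suppress one complementary cell so the hidden value cannot be
    recovered by subtracting the remaining cells from the total."""
    total = sum(counts.values())
    out = {k: (v if v >= MIN_CELL else None) for k, v in counts.items()}
    hidden = [k for k, v in out.items() if v is None]
    if len(hidden) == 1:  # a single suppressed cell is recoverable from the total
        # complementary suppression: also hide the smallest remaining cell
        k2 = min((k for k, v in out.items() if v is not None), key=counts.get)
        out[k2] = None
    return out, total

cells = {"proficient": 95, "not_proficient": 5}
masked, total = suppress(cells)
# "not_proficient" is suppressed, and "proficient" must be suppressed too,
# since (total - 95) would otherwise reveal the hidden count.
```

A rule like this is what makes the 100 percent proficiency case awkward: when every student is proficient, the zero cell forces suppression of a result the public already effectively knows.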

Jack recommended as a resource the NCES technical brief, Statistical Methods for Protecting Personally Identifiable Information in the Disclosure of Graduation Rates of First-Time, Full-Time Degree- or Certificate-Seeking Undergraduate Students by 2-Year Degree-Granting Institutions of Higher Education. Although designed for two-year colleges, this brief nevertheless contains information that could benefit SEAs and LEAs.

PPI members asked a number of follow-up questions to Kathleen’s presentation on FERPA updates. She addressed questions regarding the difference between disclosure and re-disclosure, as well as recordation requirements. The group then engaged in a discussion regarding the lifespan of data, addressing issues such as determining the advantages and disadvantages of different timeframes for data destruction, ensuring that any data use complies with the original data collection agreements, and managing multiple ad hoc data releases to prevent disclosure.

Forum Education Privacy Document Review
Levette Williams (Georgia Department of Education) and Ghedam Bairu (NCES) reviewed the history and future of Forum work on data privacy. In 2008, when new FERPA regulations were established, PPI undertook a revision of the Forum’s data privacy guides. The draft document was put on hold pending a further round of FERPA regulations, which was released in 2011, and the document is now being updated by PTAC. PPI members noted that it would be useful for the Forum to review the document before it is released; the Forum could be especially helpful in ensuring that the document is relevant to LEAs. Ghedam agreed to extend an offer of Forum assistance to Kathleen Styles.

PPI members discussed the idea that Forum documents that are subject to changing regulations or that may become outdated should include a note directing the reader to visit the Forum website for up-to-date publications as well as links to relevant regulations.

Topics from the Floor
Sonya Edwards (California Department of Education) asked the REL representatives present for their perceptions on how their work will integrate into the work discussed at the Forum. In addition to large-scale research design, RELs are working to provide technical support and assistance to help states improve data use. One of the goals of the RELs is to determine states’ current capacity for data use, and then identify areas where RELs can assist. REL representatives were therefore very interested in the morning’s discussion of data transformation, and they are using the Forum as a means of determining the current state of the field and identifying what information they can bring back to their regions. RELs serve both SEAs and LEAs, and they can facilitate the dissemination of regional information.

While RELs can offer technical assistance, they are only one of several programs (including EDTAP and ERDCs) that provide technical assistance. It was suggested that these competing assistance streams should be coordinated. Because REL contracts were recently awarded and work is just beginning, this is an opportune time for conversations to begin between the RELs and the regions they serve, as well as with the Forum. Some potential areas where RELs may be useful are in looking at the issue of free and reduced price meal data as a proxy for poverty, as well as at other proxy indicators of poverty. SEAs can be the conduit for the RELs, and it would be useful to develop an inventory of the projects the RELs are working on, their timeframes, and the point of contact for each SEA.

Afternoon Session

Governance and Career and Technical Education (CTE) Data in Washington MS PowerPoint (335 KB)

Bill Huennekens (Washington State Office of Superintendent of Public Instruction) discussed the inclusion of Career and Technical Education (CTE) in the data governance process, as well as Education Research and Data Center (ERDC) data governance in Washington state related to data sharing, data definitions, and master data management. A working group develops definitions and brings them to the data management committee, which includes four LEA representatives. Bill explained data governance, outlined organizational processes for establishing and implementing it, and discussed the benefits of including CTE data stewards in the data governance process. CTE data stewards can inform the discussion on the direction of data systems, especially statewide longitudinal data systems; they are often informed about up-to-date data practices; they are in a position to build connections with other agencies, such as those collecting workforce data; and their inclusion in data governance helps to ensure that all data systems are aligned. He shared his experiences and recommendations and then answered questions regarding matching data, resolving differences between the groups involved in the data governance process, and establishing timelines for data governance.

Statewide Longitudinal Data Systems Grant Program and Public Domain Clearinghouse Update
Emily Anthony and Nancy Sharkey (NCES) provided an update on the Statewide Longitudinal Data Systems (SLDS) Grant Program and discussed the products and assistance available through the Public Domain Clearinghouse (PDC). The SLDS Request for Applications (RFA) was released in February, and information is available at http://nces.ed.gov/programs/slds/index.asp. The PDC is intended to allow states to build off of SLDS products developed and owned by states, such as memoranda of understanding (MOUs) and project management information, as well as to request that states share existing documents and products. Currently there is information about 19 such projects in the PDC, including Nebraska’s Reporting Portal, Arkansas’ Oyster system for identity authentication, and Colorado’s Data Dictionary for its Growth Program. The PDC’s state profile pages can help SEAs and LEAs to identify states that are doing similar work.

Nancy engaged PPI members in a discussion on research and data use, and encouraged members to suggest ways in which ED and RELs can facilitate data use. PPI members were interested in learning more about how SEAs can broker productive relationships with RELs so that both parties benefit, and in understanding how individual states can build on work done by RELs. NCES is developing information to foster partnerships between RELs and states, including guides, checklists for RFPs, and information on what a data sharing memorandum of understanding should include. She asked members to share with her information and suggestions on these and other documents that would be useful to SEAs and LEAs, and Ghedam Bairu suggested that the Forum website could be used to solicit feedback when such documents are developed. The RELs’ work emphasizes data use, and several PPI members felt it is important that the RELs make sure researchers understand the data (e.g., by providing a bridge document explaining changes to race/ethnicity data). RELs should develop data that lead to policy decisions, and they should help states to improve data quality (e.g., directory data over the years). One PPI member encouraged Forum members to be strategic about the effort: SEAs or NCES should get copies of any code created by researchers so it can be shared. LEAs will be able to access the PDC, and the REL contracts are for five years.

Common Education Data Standards MS PowerPoint (6.16 MB)

Jim Campbell (AEM) joined PPI to discuss the Common Education Data Standards (CEDS) and to demonstrate tools associated with the project. CEDS is an NCES initiative to develop voluntary, common data standards for a key set of education data elements. CEDS Version 2 was released in January 2012, and it includes elements from early learning and postsecondary as well as an expansion of K12 elements. Jim reviewed why CEDS is needed, identified the components of the standards, and described the uses of CEDS. CEDS is not solely an ED undertaking; stakeholders from various education sectors contribute to the development of each new version. In addition to the ongoing work of stakeholders, each round of CEDS development includes consideration of existing standards, alignment with the field, and a public review period prior to release. During the development of Version 2, several support tools were created to improve user interaction with CEDS; as a result, the CEDS website (http://ceds.ed.gov/) allows users to filter CEDS elements by domain and includes a guide for using the data model. Jim also provided information on the CEDS Logical Data Model, the Alignment Tool, and the forthcoming Use Case Generator. The CEDS Logical Data Model provides a high-level framework for translating the standards into a physical model and includes two views: a domain entity schema and a normalized data schema. The Alignment Tool is web-based and allows users to import or input data dictionaries, align data to CEDS, and compare them to other data dictionaries. Jim noted that several SEA, early learning, and postsecondary maps are being added to the tool. PPI members were interested in the uses of the Alignment Tool, especially with regard to generating comparison reports between data dictionaries. The Use Case Generator will allow users to query policy questions and identify the list of elements needed to answer them.
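The kind of dictionary comparison the Alignment Tool enables can be sketched as follows. The local element names and CEDS mappings below are invented for illustration and are not drawn from the actual tool or the CEDS element list: once two agencies have each mapped their local elements to common CEDS names, their coverage can be compared directly.

```python
# Sketch of a data dictionary comparison via a shared standard.
# Element names and mappings are hypothetical examples.

sea_map = {"StuDOB": "Birthdate", "EnrollStatus": "Enrollment Status",
           "SchoolNum": "School Identifier"}
lea_map = {"birth_date": "Birthdate", "school_id": "School Identifier",
           "lep_flag": "Limited English Proficiency Status"}

def compare(map_a, map_b):
    """Report standard elements covered by both, or by only one, dictionary."""
    a, b = set(map_a.values()), set(map_b.values())
    return {"shared": sorted(a & b),
            "only_a": sorted(a - b),
            "only_b": sorted(b - a)}

report = compare(sea_map, lea_map)
# report["shared"] lists elements both dictionaries can exchange directly.
```

The value of the shared standard is that the comparison runs on the common names, so neither agency needs to understand the other's local naming conventions.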

Steering Committee Business/Report
Meeting Review/Summer 2012 Planning
Tom Howell (Michigan Center for Educational Performance and Information) thanked PPI members for their participation, and noted the contributions of the new REL representatives. Tom informed members that the issues discussed and action items identified in PPI would be reported to the full Forum during the Closing Session and discussed with the Steering Committee. Members reported that they appreciated the update from Hans L’Orange, State Higher Education Executive Officers (SHEEO), on the WICHE data exchange, and they enjoyed the presentations from states on transforming data to information. Recommended topics for the Summer 2012 Meeting include:

  • Follow-up on the WICHE project
  • Free and Reduced Price Meals
  • PTAC and FERPA
  • Civil Rights Data Collection
  • College and Career Readiness
  • Common Core State Standards
  • Data Use WebEx sessions
  • State presentations
  • REL initiatives

Technology (TECH) Committee Meeting Summary

Monday, February 13, 2012

Afternoon Session

Welcome, Introductions, and Summer 2011 TECH Meeting Review
TECH Chair Peter Tamayo (Washington State Office of Superintendent of Public Instruction) welcomed everyone to the TECH meeting, led the group in introductions, and reviewed proceedings from TECH's meeting at the Summer 2011 Meeting.

Any vendors in the room were reminded that by participating in Forum meetings, they may have access to information that could potentially disqualify them from a competitive bidding process at a national, state, or local level. When vendors identify an item of potential conflict on a committee’s agenda, it is appropriate for them to excuse themselves while that topic is being discussed.

Winter 2012 Agenda Review
Chair Peter Tamayo reviewed the agenda for the meeting.

John Kraman, Oklahoma Department of Education, was added to the list of presenters for P-20W Feedback Reports.

Teacher-Student Data Link Working Group Update
Lee Rabbitt (Newport Public Schools, RI) chairs the Forum’s new Teacher-Student Data Link Working Group. She reported that a considerable amount of work is currently underway related to the teacher-student data link, most notably through the Teacher-Student Data Link (TSDL) Project and the Data Quality Campaign (DQC).

This Forum group met for the first time in December, followed by a series of meetings via web conference. The group will develop a new resource that focuses on overcoming implementation challenges, particularly at the local level, and hopes to complete its work by the end of 2012. The document will build on existing work, extending it into a practical guide for SEAs and LEAs that includes use cases (e.g., professional development, teacher access to data, teacher evaluation, merit pay, and individualized learning systems).

TECH members offered advice regarding implementing the link: There is an easy way to do it and a fair way to do it. The classroom setting and teacher assignments are tricky, and a good system needs to accommodate this complexity. Planners should consider working from the student level up, rather than from the teacher level down. Lee replied that this document will not assume there is one right answer but will, instead, focus on best practices. For example, dosage and non-dosage methods are both valid, so the document will highlight both approaches and leave it to the reader to decide about applicability. The document will focus heavily on LEA implementation and will try to define some important terms, although the definitions will be necessarily broad.
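The dosage versus non-dosage distinction can be illustrated with a small sketch, using invented enrollment data: a dosage method weights each teacher-student link by the share of the student's instructional time, whereas a non-dosage method treats every link equally.

```python
# Illustrative sketch of a dosage-weighted teacher-student data link.
# The enrollment spans are hypothetical; a real implementation must also
# handle co-teaching, flexible regrouping, and mid-year moves.

links = [  # (student, teacher, days_of_instruction)
    ("s1", "t_smith", 120),
    ("s1", "t_jones", 60),   # student regrouped mid-year
    ("s2", "t_smith", 180),
]

def dosage_weights(links):
    """Weight each teacher's link to a student by share of instructional days."""
    totals = {}
    for student, _, days in links:
        totals[student] = totals.get(student, 0) + days
    return {(s, t): days / totals[s] for s, t, days in links}

weights = dosage_weights(links)
# t_smith is credited with 2/3 of s1's year and all of s2's;
# a non-dosage method would credit t_smith and t_jones equally for s1.
```

Starting from the student record and rolling up, as suggested above, is what makes the denominators (each student's total instructional time) straightforward to compute.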

TECH hopes this document will describe the complexity of the issue because very powerful decisionmakers may read it, and they need to understand that using these data is not as simple as it may initially appear.

Common Education Data Standards MS PowerPoint (6.16 MB)

Jim Campbell (AEM) joined TECH to discuss the Common Education Data Standards (CEDS) and to demonstrate tools associated with the project. CEDS is an NCES initiative to develop voluntary, common data standards for a key set of education data elements. CEDS Version 2 was released in January 2012, and it includes elements from early learning and postsecondary as well as an expansion of K12 elements. Jim reviewed why CEDS is needed, what components make up the standards, what CEDS is not intended to be, and how CEDS can be used. CEDS is not solely an ED undertaking; stakeholders from various education sectors contribute to the development of each new version. In addition to the ongoing work of stakeholders, each round of CEDS development includes consideration of existing standards, alignment with the field, and a public review period prior to release. During the development of Version 2, several support tools were created to improve user interaction with CEDS; as a result, the CEDS website (http://ceds.ed.gov/) allows users to filter CEDS elements by domain and includes a guide for using the data model. He also provided information on the CEDS Alignment Tool and the forthcoming Use Case Generator. The Alignment Tool is web-based and allows users to import or input data dictionaries, align data to CEDS, and compare them to other data dictionaries. Jim noted that several SEA, early learning, and postsecondary maps are being added to the tool. TECH members were interested in the uses of the Alignment Tool, especially with regard to generating comparison reports between data dictionaries.

Statewide Longitudinal Data Systems Grant Program and Public Domain Clearinghouse Update
Emily Anthony and Nancy Sharkey (NCES) from the IES Statewide Longitudinal Data Systems (SLDS) grant program joined TECH to provide updates on the program. The FY 12 competition is underway, with an expectation that results will be announced by late spring. With respect to the Public Domain Clearinghouse (PDC), many SEAs are now using it to share resources and reduce their own development costs. The PDC hopes to begin including resources for LEAs in the near future as well.

On a related front, the SLDS team is working hard to develop several new best practice resources, including a guide about forming research partnerships, a technical brief about working with administrative data, a checklist for writing RFPs, and a model memorandum of understanding (MOU).

SLDS Discussion: Total Cost of Ownership and Return on Investment
Peter Tamayo (Washington State Office of Superintendent of Public Instruction), Josh Klein (Oregon Department of Education), and Laurel Krsek (Napa Valley Unified School District, CA) shared their perspectives on measuring the cost and value of their agency’s investments in education data systems.

Josh described Project ALDER, funded by Oregon’s FY09 ARRA SLDS Grant. The Oregon State Senate expected a Return on Investment (ROI) component to this work for various levels in education:

  • Early Learning – Measure student growth between early learning program entry and exit and also at each learning stage using valid assessments.
  • K-12 – Adopt the Center for American Progress methodology that evaluates the level of student attainment of state standards given the challenges of the student population served and available resources.
  • Postsecondary – Measure performance of community colleges and universities by using tools that link degree and certificate completion to the resources used by program area.

Oregon used measures from the Center for American Progress, which attempted to examine “the efficiency of the nation’s public education system” and included an effort to evaluate the productivity of almost every major school district in the country. These indicators were used to measure academic achievement relative to a district’s educational spending, while controlling for factors outside districts’ control, such as cost of living and degree of student poverty.
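As a rough illustration of this style of productivity measure (with invented district data and a deliberately simplified model, not the Center for American Progress's actual methodology), one can regress achievement on a factor outside a district's control and then relate the residual to spending:

```python
# Sketch of an adjusted-productivity calculation: regress district
# achievement on poverty rate, then compare the achievement residual
# (performance above or below expectation) to per-pupil spending.
# All data are invented for illustration.

def ols(xs, ys):
    """Slope and intercept of a one-variable least-squares fit."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

districts = [  # (poverty_rate, achievement_index, per_pupil_spending)
    (0.10, 82, 11000), (0.25, 74, 9500), (0.40, 65, 10500), (0.55, 60, 12000),
]
slope, intercept = ols([d[0] for d in districts], [d[1] for d in districts])

# residual achievement per $1,000 of spending: a crude productivity score
productivity = [(a - (intercept + slope * p)) / (s / 1000)
                for p, a, s in districts]
```

The residual step is what "controlling for factors outside districts' control" amounts to: two districts are compared against what their own poverty rates predict, not against each other's raw scores.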

In the state of Washington, agencies were charged with instituting a method for accounting for IT-related expenditures, including common definitions of what constitutes an IT investment in applications and infrastructure. Washington State used a Total Cost of Ownership (TCO) metric for evaluating relevant cost structures and productivity levels. The state is now planning to rely on the Gartner framework to document TCO for its education data systems (including IT and program area costs), with a goal of documenting the value and benefits of education data systems to the state. These data will inform the 2012 Washington State Legislature as it develops its 2013-15 budget decision package.

Laurel explained how things were different at the school district level. Napa Valley relies on an indicator referred to as the “Value of Investment” (VOI), which looks at anticipated costs and benefits of technology projects. In particular, LEAs differ from their SEA counterparts by needing to take into account qualitative benefits that relate directly to school mission, goals, and mandates—i.e., instructional value.

TECH requested that this important issue be revisited at the Summer 2012 Meeting.

Tuesday, February 14, 2012

Morning Session

National School Lunch Program & ESEA Waivers
Ross Santy and Lily Clark (ED) joined TECH to receive questions following their Monday Opening Session presentation about ESEA flexibility and the National School Lunch Program. In addition to responding to questions from the TECH Committee, they communicated several additional points and clarifications:

  • Waivers are a rolling process (round one has concluded, a second round will occur in late February, and a final round in fall 2012).
  • The CRDC is currently an all-or-nothing model: full files but no partial submissions (e.g., a state cannot complete 80 percent of the CRDC and log out so that it can be finished by districts). Since moving to a universe survey, ED is trying to improve communication with SEAs and LEAs so they can collaborate on a single flat file for submission (i.e., the 80/20 percent split could be completed prior to submission). Ross acknowledged that it can be difficult to reconcile CCD “schools” against CRDC “educational settings,” which have some discrepancies.
  • ED and PTAC invite Forum members to send examples of how FRPM data access may conflict with privacy concerns and the sound use of data in general. For example, food services staff might say that school principals can’t see the data; therefore, instructional staff don’t know which kids are in the socio-economically disadvantaged cells that need improvement. How can educators target kids for instructional support if they can’t know who they are?
    • The community eligibility option will be available to everyone as an option, but not a requirement. Also, the community eligibility option may very well lead to the loss of data about kids at the local and state levels (because there will be fewer household applications). ED invites the Forum to help them determine what the implications might be on other data usage.
    • Food services can serve meals to kids who aren’t in an SEA’s SMS, meaning numbers reported by food services will not match data sent by schools. E.g., meals served in AG “educational settings” in addition to ED “schools.”
    • Some states are considering collecting their own socio-economic data. For example, they hope to develop an SES index for each student; the composite index may be used without anyone seeing the actual data that were used to calculate the index.
    • Ross would like to continue this discussion at future Forum meetings to make sure ED is aware of how the FRPM data are being used for alternative purposes. Does the Forum want to consider developing an alternative index indicator model? Should TECH consider a task force?
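The composite-index idea raised above could work roughly as follows. This is a minimal sketch with invented indicator names and weights, not an actual state model:

```python
def ses_index(indicators, weights):
    """Combine normalized socio-economic indicators (each scaled 0-1)
    into a single composite score, so downstream users see only the
    index and never the underlying household data."""
    return round(sum(weights[k] * indicators[k] for k in weights), 3)

# Hypothetical student record and weighting scheme (invented values).
student = {"household_income": 0.3, "parent_education": 0.5, "neighborhood": 0.4}
weights = {"household_income": 0.5, "parent_education": 0.3, "neighborhood": 0.2}
print(ses_index(student, weights))  # → 0.38
```

Because only the weighted composite is exposed, educators could rank or group students by socio-economic need without anyone seeing the actual source data used to calculate the index.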

21st Century Skills Assessment
Vice Chair Laurel Krsek (Napa Valley Unified School District, CA) and Kathy Gosa (Kansas State Department of Education) shared a presentation on what is happening in classrooms across the nation—as data people in SEAs and LEAs, TECH members need to stay abreast of these issues.

Authentic 21st century assessments are an essential component (and even the foundation) of a 21st century education. In an increasingly complex, demanding, and competitive 21st century, students need to learn more than the 3 Rs. In Napa Valley (CA), instruction now focuses on the 4 Cs: Communication, Collaboration, Critical thinking, and Creativity. Napa Valley also believes that they need to “measure what we value,” meaning that assessment should include complex rubrics and student portfolios. They are monitoring initiatives emerging from Smarter Balanced to identify how the new standardized assessments will address these elements. They are also adopting a Bring Your Own Device approach to technology. Laurel showed examples of new assessment questions. They are complex and multi-dimensional—not your old bubble sheet responses.

In Kansas, the state board of education believes that authentic 21st century assessments are the essential foundation of a 21st century education. Kansas State Department of Education (KSDE) is in the process of defining and developing a new state accreditation system which reflects a focus on the five Rs: Relationships, Relevance, Rigor, Results, and Responsive Culture.

Kansas has been using online assessments for a number of years – paper and pencil is an accommodation. The Kansas Writing Instruction & Evaluation Tool (KWIET) is an online environment where students compose pieces of writing in response to writing tasks, and where teachers evaluate, score, and provide feedback. It is used at all grade levels and can track student writing progress over time (a student’s writing portfolio).

KSDE is partnering with Kansas’ state assessment vendor (University of Kansas’s Center for Educational Testing and Evaluation) and other states to develop a Career Pathways Collaborative (CPC) with “end of Pathways” assessments. The goal is to assess high school students’ readiness for postsecondary technical education or entry into the workforce in areas related to general agriculture, plant systems, maintenance, animal systems, production, general business, finance, marketing, and education.

Data Governance and Career and Technical Education (CTE) Data in Washington MS PowerPoint (335 KB)
Bill Huennekens (Washington State Office of Superintendent of Public Instruction) has been talking to data stewards from across the nation at SLDS meetings, EDFacts meetings, and related settings. He joined TECH to share his thoughts on CTE data and governance.

Bill asserted that the essential premise behind establishing a K-12 data governance system is that decisions are only as good as the data on which they are based. As we transform data into information to facilitate wise decisionmaking, users and managers of K-12 data need to establish data definitions, data and process ownership/authority, accountability, security, and reporting needs and requirements.

Because of the importance of sound data governance, he advocated the need for a CTE data steward in any P-20-workforce SLDS. How is this accomplished? By CTE staff building relationships with IT, Student Information, and EDFacts Coordinators. He recommended that CTE staff “invite yourself to meetings...it’s usually not against the rules.” Moreover, clear documentation will demonstrate how CTE data align with most state and federal reporting requirements. He recommended that TECH members expand their scope of work and relationships with postsecondary and workforce partners.

P-20 Feedback Reports
Peter Tamayo (Washington State Office of Superintendent of Public Instruction), Kathy Gosa (Kansas State Department of Education), and Tom Purwin (Jersey City Public Schools, NJ) explained how feedback reporting was being envisioned and/or implemented in their agencies.

In the Jersey City Public Schools (NJ), Tom views feedback reports as potentially spanning from early childhood education (and younger) through the workforce. In addition to colleges and universities telling K-12 how well prepared and successful their students were in higher education, feedback can go the other direction as well. For example, as an employer of higher education graduates (teachers and administrators), K-12 can provide a Higher Education Feedback report with information such as the number of graduates hired, the number of graduates who received tenure, the number of graduates who left the district (by choice and/or denied re-employment), evaluation summaries of graduates, (future) teacher-student accountability data, etc. Similarly, feedback reports apply across other levels of education, even within K-12 (e.g., high school to middle school, middle school to elementary school, etc.). Elements might possibly include “repeated 1 or more grades, overage status (2 or more years), number of in/out-of-school suspension incidents, attendance, rate of freshman on-track for graduation, as well as a host of assessment data.” According to Tom, a critical indicator would be some measure of if/when/how a student gets off track for graduation.

In Kansas, feedback reporting is a byproduct of SFSF and is managed at the SEA level. Existing sources of data include KSDE and the Kansas Board of Regents, as well as the National Student Clearinghouse and the ACT. KSDE provides detailed high school feedback dashboards through secure authentication by school and district administrators. Enhancements currently envisioned include the addition of SAT data, HS course data, Department of Labor data, postsecondary completion metrics, and Department of Defense career readiness data.

Peter displayed statewide P-20 feedback reports from the Washington Education Research & Data Center (ERDC). Reports included “student enrollment by type of institution enrolled in postsecondary education,” “demographic characteristics of graduates enrolled in postsecondary education,” “high school academic performance by postsecondary enrollment,” and a range of participation/engagement data, such as number of credit hours enrolled, etc.

Challenges to accomplishing this in Washington included agreeing on a method for data suppression of small numbers when reporting on student characteristics and finding the time and resources to work with multiple data contributors to clarify methods, data definitions, and appropriate usage. Lessons learned included: having an effective communication strategy for when the reports are released (with a goal of getting the information used); listening to report users who can document concerns and suggestions for future reports; and assuming that the reports will change over time, which requires ongoing flexibility in report content, display, and delivery.
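The small-number suppression the contributors had to agree on might look like the following minimal sketch. The threshold of 10 and the category names are invented for illustration, not Washington’s actual rule:

```python
def suppress_small_cells(counts, threshold=10):
    """Replace counts below the threshold with a mask marker.

    `counts` maps a reporting category to a student count; a
    threshold of 10 is a common but illustrative choice -- each
    agency sets its own suppression policy.
    """
    return {category: (count if count >= threshold else "<10")
            for category, count in counts.items()}

# Hypothetical report cells (invented values).
report = {"Asian": 42, "Black": 7, "Hispanic": 118, "White": 260}
print(suppress_small_cells(report))
```

A real policy would also need complementary suppression, since a masked cell can sometimes be recovered by subtracting the visible cells from a published total.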

John Kraman (Oklahoma State Department of Education) asked TECH to think about P-20 performance indicators for the purpose of measuring outcomes over a student’s advancement throughout the P-20 experience. These data could be used for accountability as well as early warning indicator systems. For example, readiness for kindergarten might be measured by factors related to language development, physical and motor skills, and social/emotional behaviors; whereas readiness for 4th grade might focus on literacy and reading scores, numeracy and math scores, core subject grades, and school engagement. John described the college, career, and citizenship indicator system he envisions and invited TECH to contribute to its development. TECH will revisit this topic in July.

Teacher Evaluation
Rhode Island is implementing new policies for evaluating teachers. Lee Rabbitt (Newport Public Schools, RI) has talked to TECH about this in the past and updated us on the ongoing experience in her district and state.

There are five goals of the Rhode Island Educator Performance and Support System (EPSS): (1) provide a forum for support and actionable feedback for educators, (2) guide professional development and management, (3) inform student and teacher assignments, (4) inform personnel actions and certification decisions, and (5) provide information on educator preparation programs. The EPSS is designed to provide an easy-to-use interface to collect and manage data on all three components of RIDE’s Evaluation System – Student Learning, Professional Practice, and Professional Responsibility. Components of the evaluation system include Professional Practice (PP), Professional Responsibilities (PR), Student Learning Objectives (SLO), and Student Growth (SG).

As a component of this work, Rhode Island is establishing and implementing a teacher-course-student link in which teachers of record are the primary educators of a student; contributing educators are any teachers who contribute to the core subject(s); the student’s entire growth in the core subject is attributed to the teacher of record and the contributing educators; and multiple teachers can be linked to the same student’s growth. Four files are necessary to determine the teacher-course-student link: Course File, Section File, Teacher File, and Student File. Roster verification occurs in October, February, and June, and teachers are expected to verify their rosters, although principals have the final approval.
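The four-file join behind such a link might be sketched as follows. All field names and values are invented for illustration; Rhode Island’s actual file layouts are not shown here:

```python
# Hypothetical records from the four files (invented fields/values).
courses  = [{"course_id": "MATH6", "subject": "Math"}]
sections = [{"section_id": "S1", "course_id": "MATH6", "teacher_id": "T9"}]
teachers = [{"teacher_id": "T9", "role": "teacher_of_record"}]
students = [{"student_id": "P100", "section_id": "S1"}]

def link_teachers_to_students(courses, sections, teachers, students):
    """Join Course, Section, Teacher, and Student files into
    teacher-student link records, one per student enrollment."""
    by_course  = {c["course_id"]: c for c in courses}
    by_section = {s["section_id"]: s for s in sections}
    by_teacher = {t["teacher_id"]: t for t in teachers}
    links = []
    for stu in students:
        sec = by_section[stu["section_id"]]
        links.append({
            "student_id": stu["student_id"],
            "teacher_id": sec["teacher_id"],
            "subject": by_course[sec["course_id"]]["subject"],
            "role": by_teacher[sec["teacher_id"]]["role"],
        })
    return links

links = link_teachers_to_students(courses, sections, teachers, students)
```

In this sketch the Section File drives the link: each student’s section points to both a course (for the subject) and a teacher, whose role field would distinguish teachers of record from contributing educators.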

The Rhode Island Growth Model (RIGM) is a statistical model that provides a new way of looking at student achievement. It enables decisionmakers to look at growth in addition to proficiency to get a fuller picture of student achievement. RIGM is based on the Colorado Growth Model and state tests (NECAP) will be used to measure growth. PARCC will replace NECAP in ELA and math. For more information, visit http://www.ride.ri.gov/EducatorQuality/EducatorEvaluation/.

Assessment Consortia and Technology: Discussion
Last July, Tom Foster (Kansas State Department of Education) and Wes Bruce (Indiana Department of Education), representing the SMARTER Balanced Assessment Consortium (SBAC) and the Partnership for the Assessment of Readiness for College and Careers (PARCC), respectively, told us that they are developing assessments that are agnostic with respect to technology. While there is still some uncertainty about what technology will be required for planned online assessments, there is an expectation that a "typical" computer that can handle internet, sound, etc. will suffice as an assessment station. Thus, we are told that “any system” will work as long as it meets a set of fairly basic minimum capacity standards.

TECH members need to understand the details that their agencies will need to have in place to permit assessments throughout states and districts. For example, how will assessments be administered securely (both onsite and technologically speaking)? Will they occur in computer labs or classrooms? What bandwidth will be needed? How will make-up assessments be handled? How will data flow to and from the consortia?

These are just some questions that have arisen informally during recent conversations with Forum members. TECH members used this session to think proactively and systematically about the information needed to effectively participate in assessment consortia activities. TECH would like to submit the following questions to the consortia with an expectation that the assessment consortia will attend the July meeting in person to provide answers:

  • When will technical requirements for the assessment consortia be released?
  • What metrics for technology demand are available? What about for broadband, wireless, and mobile accessibility?
  • Will stress testing or other technology readiness tools be available?
  • When will the roll out plan be available?
  • Will the size or specifications of a student’s monitor possibly influence performance?
  • How will mobile testers be authenticated?
  • Has the testing period or window been defined? In other words, will all students be assessed at the same time or can computers/bandwidth be used to assess students over time (e.g., all students in one week or one third of students each of three consecutive weeks)?
  • To what extent are assessments going to include 21st Century skills?
  • What content will be assessed, if any, beyond the two subject areas?
  • How are the consortia addressing the sustainability of the work after the contract/funding expires?
  • Will the time needed by teachers to administer the tests go up or down?
  • What is the plan for mobile device support for the formative or summative assessments?
  • What does professional development look like?
  • In the collection of the assessment data, who owns the data?
  • Are there resources for states/districts to interpret results?
  • How are the PARCC and SBAC models going to be integrated?
  • Will there be interoperability specifications?
  • What does the tech readiness tool actually do?

Afternoon Session

FERPA Update and Data Release Policy
Kathleen Styles, Chief Privacy Officer, U.S. Department of Education, and Jack Buckley, Commissioner, National Center for Education Statistics, joined each of the Forum's standing committees to make themselves available for questions and discussion. Many interesting issues arose during the conversation, including:

  • There needs to be more consistency across ED with respect to data release practices. ED’s data strategy team is working on a data release policy. SEAs also release data. How can ED help? For example, would a model for governing data releases be helpful? TECH members agreed that most SEAs don’t have the resources to tackle this issue, so a federal model that can be adapted/adopted could be very helpful.
  • There are numerous national level data releases that might relieve some SEA burden.
  • ED chooses not to engage in perturbation given the importance of program data, although cell suppression and blurring may still be useful.
  • As with many issues, there may not be a one-size-fits-all answer. Solutions depend on the intended use of the data… research, public accountability, program evaluation, etc.
  • SEAs need guidance that they can reference to explain why they mask data. A lot of stakeholders don’t understand why “transparency” doesn’t mandate the release of unmodified data sets.
  • Could ED create a data masking tool (accompanied by disclaimers) for SEA and LEA use?
  • Common sense must apply: If Congress passed a law for 100 percent proficiency (which it did), why is it a violation of FERPA to report that 100 percent of the student population is in a cell? That may be a disclosure, but doesn’t pass the “common sense” test.

Emerging Technologies and Education Data Security
At recent Forum meetings, the TECH Committee has talked in broad terms about how emerging technologies such as cloud computing and Google apps are affecting the field of education data. But each time these topics have arisen, we realize that we do not have the right expertise in the room to have an informed discussion. As such, Julia Stiglitz, Manager, Google Apps for Education in North America, was invited to help inform our discussion.

  • Rocketship Schools – high performing charter schools in low SES communities; they are using technology in a blended learning model, which allows them to hire up to 25 percent fewer teachers.
  • Khan Academy – the flipped classroom model is another example of how technology can be used to improve achievement.
  • Virtual Learning: Stanford’s Artificial Intelligence (AI) class was offered free and online to students worldwide from October 10th to December 18th 2011.
  • Other cutting-edge topics in the discussion included a push toward the Cloud and digital textbooks. Major concerns about these developments include:
    • The digital divide is growing, so children without access to computing and internet resources could be left further behind.
    • Content, storage, and retention are not free, although discounts could be negotiated at the state level.
    • The security of Cloud is not yet clear. Admittedly, the security of servers in education agencies isn’t always sound either (e.g., old buildings, inappropriate infrastructure and access, etc.).
  • Bethann Canada will share information about VA’s experience using safer iPhones/iPads—but it came at a cost of functionality.
  • Some access concerns are policy driven – teachers and administrators may not permit the use of digital devices, etc., in school.

Meeting Review and Summer 2012 TECH Planning
Suggested topics for the Summer 2012 meeting included:

  • Assessment consortia data demands – how will they report data, generate indicators, etc.? – can CEDS, SIF, etc. help establish reporting standards for the consortia?
  • ESEA Reauthorization
  • P-20 Feedback Reports (using IPEDS data)
  • Teacher evaluation systems
  • FERPA/PTAC news
  • Sustainability of SLDS systems (more about TCO, VOI, etc.)
  • Moving systems to cloud computing
  • VA’s experience using iPhones and iPads to improve security.
  • Early Warning Systems
  • Early childhood collections
  • Examples of organizations with policies, examples, lessons, of Bring Your Own Device (BYOD) or Bring Your Own Technology (BYOT)
  • Education Technology initiatives
  • Western Interstate Commission for Higher Education (WICHE) Pilot (see PPI Feb 13 afternoon session with SHEEO representative)
  • Transforming data to information
  • Update on the ROI/TCO/VOI work in WA, OR and Napa Valley SD
  • Continued discussion of the College and Career Readiness topic
  • Continue discussion of the P-20 metrics (John Kraman, Oklahoma)

TECH Closing Thoughts
TECH Chair Peter Tamayo thanked the TECH members for an especially interesting and productive meeting. Peter asked TECH members to look for email over the next few months as we begin our preparation for the Summer 2012 Forum in Washington, DC. Note that we will be meeting on July 9-11, which is a departure from our normal meeting calendar.

Closing Session

Closing Session Forum W2012 MS PowerPoint (869 KB)

Regional Educational Laboratory Program
Forum Chair David Weinberger reviewed the success of the Forum in fostering local-state-federal collaboration—noting contributions the Forum has made to projects such as the Common Core of Data (CCD) and Office of Civil Rights (OCR) surveys; EdFacts development; the NCES Handbooks Online and the National Education Data Model (NEDM); the Statewide Longitudinal Data System (SLDS) Grant Program; and FERPA and privacy improvements. In support of the newest partnership between the Forum and the Regional Educational Laboratory (REL) Program, David again welcomed the new REL representatives and spoke about the Forum’s goals for collaboration between Forum members and RELs.

John Easton, Director of the Institute of Education Sciences at the U.S. Department of Education, provided Forum members with more information on the Regional Educational Laboratory Program. John visited the Standing Committees during the course of the Forum, and expressed his appreciation for the quality of conversation and the depth of knowledge shared in the committees. He then spoke of a new generation of regional laboratories that will pursue the goal of fostering more involvement with SEAs and LEAs and increasing work related to the use of data. John’s vision for RELs includes more actionable research that will have clear implications for school and district improvement. RELs have been charged with creating research alliances and developing research agendas that directly address the need for actionable research. RELs will work to produce multiple types of research, and each type is designed to build upon previous research as well as the REL’s research agenda. Research projects are expected to include descriptive research that is both qualitative and quantitative in design, literature reviews, correlational research, and impact studies that involve REL, SEA, and LEA engagement in experiments. RELs will also be expected to produce papers on applied research methods.

NCES Update MS PowerPoint (2.81 MB)
Jack Buckley, Commissioner of NCES, joined the Forum to provide an update on NCES activities relating to the release of the Common Education Data Standards (CEDS) Version 2 in January 2012, and plans for Version 3. Jack provided an overview of the CEDS project, which is a national, collaborative effort to develop voluntary, common data standards for a key set of education data elements. He reviewed the need for common data standards and corrected common misconceptions about the project. CEDS expanded from 161 elements in the Version 1 release to 628 elements in the Version 2 release. In addition to an expanded list of data elements that span early learning, K-12, and postsecondary, CEDS Version 2 includes a data model, a data alignment tool, and a searchable database of elements. Soon after the release of Version 2, work began on Version 3. Jack explained the development process for new CEDS releases and discussed the composition and work of the CEDS stakeholder group. Interest in CEDS is evident from the number and variety of comments received about Version 2. Jack concluded by outlining plans for Version 3.

Standing Committee Progress Reports

Working Group Updates
Kathy Gosa, Kansas State Department of Education, gave a brief presentation on the work of the Forum Data Use Working Group, which is working on a series of Data Use briefs for education stakeholders. The group is nearing completion of an introduction to the series as well as two briefs. A third brief is currently in development, and more briefs may follow. Kathy also discussed the work of the SEA Data Use Working Group, which is nearing completion of a document that will provide best practices, resources, and templates to assist SEAs working with researchers.

Lee Rabbitt, Newport Public Schools (RI), provided an update on the new Teacher-Student Data Link Working Group, which is developing a resource for SEA and LEA staff. This new technical guide will be informed by previous work and will identify best practices for implementing the data link. Lee thanked the three standing committees for taking time to discuss the group’s work and provide comments and suggestions.

Meeting Evaluations
David Weinberger closed the meeting by thanking members for their efforts and encouraging them to share opinions and suggestions for the Forum via the evaluation forms.

Steering Committee

Monday, February 13, 2012

Welcome and Agenda Review
Forum Chair David Weinberger, Yonkers Public Schools (NY), welcomed members of the committee and reviewed the agenda.

Sunday Review
The Forum Data Use and SEA Data Use working groups met prior to the Forum meeting on Sunday, February 12, 2012. Kathy Gosa reported on the progress both groups have made toward developing new Forum resources. The Data Use Working Group is nearing completion of the introduction and first two planned data use briefs, and work on a third brief is underway. The SEA Data Use Working Group is finalizing changes to a draft of the planned document before submitting it for Forum and NCES review. As the documents near completion, she noted that it will be important for the groups to be aware and informed of the work of other organizations that are undertaking similar work. She emphasized that the work done by these groups needs to be consistent with ED guidelines and current practices.

Monday Review
Steering committee members were pleased that a large group of new REL members were present at the Forum New Member Orientation. Forum Chair David Weinberger expressed interest in seeing what new perspectives REL representatives will bring to the Forum over time. Many of the REL representatives attended the PPI Standing Committee meeting on Monday, while others chose to attend the NESAC meeting. Forum Vice Chair Tom Ogle noted that, in addition to the surge in REL membership, there were many new SEA representatives.

Committee members reported that the responses they received to the Opening Session were positive, and it was noted that David’s tenure as Forum Chair advances the interests of LEAs at the Forum. The information on ESEA flexibility and the National School Lunch Program provided by Ross Santy and Lily Clark was useful, and the Solana Beach information was well-presented and informative. Members agreed that it would be good to encourage applause for speakers at the Opening Session.

Chairs then commented on Monday’s Standing Committee time. NESAC members had a lively discussion with Ross and Lily, and members were pleased that Ross offered to communicate questions and concerns about the National School Lunch Program to the U.S. Department of Agriculture. Lee Rabbitt visited NESAC to discuss the Teacher-Student Data Link working group, and she provided an overview of the vision for the publication now under development.

PPI members received an update from Hans L’Orange (SHEEO) on the Western Interstate Commission for Higher Education (WICHE) data sharing project. Members were interested in the data sharing framework used by the project as well as master agreements developed for the project. Kathy Gosa visited PPI to provide an update on the Data Use and SEA Data Use Working Groups. PPI also engaged in a short discussion of the Data Quality Campaign’s Ten State Actions, and noted items that members felt were missing from the survey. PPI Chair Tom Howell noted that the PPI meeting concluded with a useful session with Ross Santy and Lily Clark about the impact changes to the National School Lunch Program will have on the use of proxy poverty indicators. Tom put forth the idea of convening a Forum Task Force on the topic of poverty indicators. While discussing ESEA with Lily, PPI members expressed concern about metrics for measuring college and career readiness. PPI members were concerned that CCR accountability is preceding the development of reliable measures. CCR was noted as a topic that should be revisited, potentially by the full Forum.

TECH also reported a full afternoon agenda. Members engaged in a discussion about the teacher-student data link and found great value in the update on the Statewide Longitudinal Data Systems (SLDS) Grant Program and the Public Domain Clearinghouse (PDC). Jim Campbell presented on CEDS and TECH was able to continue a discussion from the Summer 2011 Forum meeting on the topic of sustainability, including issues such as return on investment (ROI) and total cost of ownership.

David reported that the Joint Session presentation on FERPA updates was well-received, and follow-up discussions in Standing Committees promised to be useful and informative.

Other Issues
Ghedam Bairu (NCES) noted that there has been some confusion over the participation of vendors and associate members in Forum meetings, and the Steering Committee should carefully consider how roles are assigned and how membership is granted. Ghedam will meet with Ruth Neild of the National Center for Education Evaluation (NCEE) to promote collaboration between RELs and the Forum, and to further clarify the role of REL representatives at the Forum.

Tuesday, February 14, 2012

Review of Tuesday’s Events
During Tuesday’s Standing Committee meetings, members discussed a number of issues that were reported back to the Steering Committee. Data reporting was a topic of concern for many Forum members. NESAC members discussed EDFacts reporting, and were pleased that Ross Santy was interested in supporting an advisory group of LEAs. LEA members of NESAC were particularly interested in providing input before changes to the collection are made. An advisory group currently exists for the Education Information Management Advisory Consortium (EIMAC). Sonya Edwards, PPI Vice Chair, suggested that the Forum should consider developing a reasonable “cycle for changing data collections,” including a timeline for how long LEAs and SEAs need to implement changes to data collections/reporting emanating from ED. Steering Committee members recalled similar resources developed by other groups. Sonya noted that if the Forum develops a data cycle resource, it can be a means for demonstrating how implementing an SLDS can streamline and improve the data cycle. NESAC members were also concerned that the schedule for reporting to the Civil Rights Data Collection (CRDC) is affected by the fact that many SEA and LEA data staff work on nine month contracts and are not available to work during the summer months.

Cheryl McMurtrey, NESAC Chair, expressed the need for best practices on the topic of succession planning for retiring data stewards. Steering Committee members discussed whether it would be useful for the Forum to develop a general framework with a list of considerations and resources for succession planning. This topic was partly addressed in the Forum Guide to Building a Culture of Quality Data: A School and District Resource, but more detail is needed. Additional suggestions included creating a web portal with resources and creating a certification program. It was also noted that many critical data roles are structured differently among states.

Other Issues
Continuing the previous day’s discussion regarding Forum membership, Ghedam noted that the Postsecondary Electronic Standards Council (PESC) requested associate membership in the Forum. Steering Committee members agreed to accept PESC as the newest associate member, and suggested that it might be useful to further extend Forum membership to representatives from the U.S. Departments of Agriculture and Labor.

Summer 2012 Forum Planning
Succession planning was recommended as potential professional development topic at the upcoming Summer 2012 Forum Meeting. The topic of emerging technologies, especially social networking, was also recommended for discussion. Steering Committee members were interested in consulting with Kathleen Styles, Chief Privacy Officer at ED, to learn more about appropriate use of these technologies.

NESAC interest in the topic of College and Career Readiness led to the suggestion that the Forum consider convening a working group to address issues related to what it means to be “ready” for college or a career and how waiver states are approaching CCR.

Steering Committee monthly conference calls will resume on April 20, 2012, at 12:00 p.m. EST. Agenda items for the call include reviewing the meeting notes and the evaluation results.


Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.

