Summer 2015 Forum Meeting Notes

National Forum on Education Statistics
July 6-8, 2015
Washington, DC

Forum Opening Session
Welcome
Joint Session: Office of Educational Technology Initiatives
Joint Session: Education Data Privacy
Joint Session: Educator Equity Plans
National Education Statistics Agenda Committee (NESAC) Meeting Summary
Policies, Programs, and Implementation Committee (PPI) Meeting Summary
Technology Committee (TECH) Meeting Summary
Forum Closing Session
Steering Committee




Forum Opening Session

Monday, July 6, 2015

Forum Agenda Review pdf file (285 KB)
Forum Chair Thomas Purwin (Jersey City Public Schools [NJ]) welcomed Forum members to the Summer 2015 Forum Meeting in Washington, DC. He introduced the Forum officers, reviewed the agenda, and encouraged everyone to help welcome the following new members into standing committees and working groups:

  • Kara Arzamendia, Minnesota Department of Education
  • Rebecca Bolnick, Arizona Department of Education
  • Heather Boughton, Ohio Department of Education
  • Brett Carter, Montana Office of Public Instruction
  • Megan Clifford, Oklahoma State Department of Education
  • Stephanie Stoll Danton, Regional Educational Laboratory - Pacific
  • Kristen DeSalvatore, New York State Education Department
  • Deborah Donovan, Mississippi Department of Education
  • Jamey Ereth, Montana Office of Public Instruction
  • Michael Ferry, Rhode Island Department of Elementary and Secondary Education
  • Beverly Flanagan, Indiana Department of Education
  • Leigh Ann Grant-Engle, Missouri Department of Elementary and Secondary Education
  • Cynthia Hearn, South Carolina Department of Education
  • Angela Hemingway, Idaho State Department of Education
  • Georgia Hughes-Webb, West Virginia Department of Education
  • Randall Kirk, West Virginia Department of Education
  • James Lane, Goochland County Public Schools (VA)
  • Brian Laurent, Alaska Department of Education and Early Development
  • Kathleen Moorhead, New York State Education Department
  • Dave Moyer, Hawaii State Department of Education
  • Lan Neugent, State Educational Technology Directors Association
  • Jim Robb, West Memphis School District (AR)
  • John Q. Porter, Mississippi Department of Education
  • Ben Tate, Oregon Department of Education
  • Lane Wiley, Kansas State Department of Education


Welcome

Marilyn Seastrom, Chief Statistician and Program Director of Statistical Standards (National Center for Education Statistics [NCES]), welcomed Forum members and commended them on a job well done in developing the Forum's recent resources.

She highlighted Forum activities that assist federal initiatives:

  • Providing feedback to the U.S. Department of Education Office for Civil Rights–Civil Rights Data Collection (CRDC).
  • Working closely with the U.S. Department of Education Family Policy Compliance Office (FPCO) and the Privacy Technical Assistance Center (PTAC).
  • Providing suggestions for the development of NCES’s new web-based school climate survey tool.
  • Reaching out to the Office of State Support at the Department of Education and representatives of the Smarter Balanced Assessment Consortium and the Partnership for Assessment of Readiness for College and Careers so that they can provide updates on various aspects of their work, including implementation, technical issues, and future plans.
  • Inviting the Regional Educational Laboratories (RELs) to become associate members and providing opportunities such as roundtable discussions, presentations, and joint webinars for REL representatives to share their work and help to ensure that REL work is practical and useful to SEAs and LEAs.
  • Developing the Forum Guide to Data Access for Researchers: A State Education Agency Perspective and a companion resource targeting the needs of LEA staff in response to suggestions from Forum members, National Center for Education Evaluation and Regional Assistance (NCEE) Commissioner, Ruth Neild, and former Institute of Education Sciences (IES) Director, John Easton.

Marilyn described how the Forum has supported other groups, including the White House Initiative on Asian Americans and Pacific Islanders, the National Endowment for the Arts, and several National Science Foundation (NSF)-funded projects.

The Forum continues to collaborate around developing new resources. Marilyn highlighted a few of the ways Forum members have facilitated collaborative efforts:

  • Upon completion of the Forum Guide to Alternative Measures of Socioeconomic Status in Education Data Systems, the Forum reached out to the U.S. Department of Agriculture (USDA) to confirm that the information in the guide aligned with new USDA initiatives. The Forum also worked with the USDA to plan back-to-back presentations for the Summer 2015 NCES STATS-DC Data Conference.
  • The Forum’s School Courses for the Exchange of Data (SCED) Working Group continues to work collaboratively with the NCES High School Transcript Study team. In an effort to coordinate comparable course classification systems, Forum members worked with NCES to support mapping SCED to the Classification of Secondary School Courses (CSSC) and implemented SCED as the course coding system for NCES transcript studies. Most importantly, Forum members realized that for SCED to be relevant and useful it has to reflect the needs of practitioners, and they have worked closely with numerous expert organizations.

Marilyn noted that these examples are only highlights of the Forum's recent work and that she could spend much more time discussing Forum member accomplishments, which are especially impressive given that the Forum holds only one annual meeting. She acknowledged that Forum members are currently strategizing and planning new resources and congratulated everyone on all of their accomplishments.


Joint Session: Office of Educational Technology Initiatives

Monday, July 6, 2015

Zac Chase, ConnectED Fellow at the U.S. Department of Education (USED), provided an overview of the ConnectED and Future Ready Initiatives at the Office of Educational Technology (OET). The ConnectED Initiative sets goals for the transition to digital learning, including preparing teachers to use technology effectively to improve student learning, upgrading Internet connectivity, and providing access to educational devices and digital content. Future Ready builds on the momentum of the President’s ConnectED Initiative with the launch of the Future Ready Pledge. The pledge recognizes the importance of building human capacity within schools and districts for effectively using increased connectivity and new devices to transform teaching and learning.

Zac stated that the 2015 National Education Technology Plan (NETP) is in development. The current draft of the plan is focused on five key areas: leadership, learning, teaching, assessment, and infrastructure. Forum members suggested that infrastructure be moved to the top of the list of key areas. As E-rate funding increases, so does access to technology across the nation. Zac shared with the Forum a video about acquiring everyday life skills at Rupley Elementary School, Community Consolidated School District 59 in Illinois, illustrating how individuals can communicate stories about equity in a wide variety of education contexts. Zac encouraged Forum members to share other examples to highlight.

OET is working to move the 2015 Future Ready Regional Summits forward: supporting the network, doubling the number of signers of the Future Ready District Pledge, and aligning and maximizing the work of the Future Ready Coalition. Zac explained that there is much to learn from Future Ready leaders who inspire others through research-based practices, peer site visits, and sharing stories (http://tech.ed.gov/stories/). Zac encouraged Forum members to send other inspiring videos to include on the website so that OET can do its part to amplify great stories, share good practices, and connect practitioners.

Forum members discussed the following topics with Zac in the Q&A portion of the session:
  • Forum members were interested in learning more about how the OET is working with the Federal Communications Commission (FCC) to address inequity in internet access for students across the nation. In Illinois, superintendents are working with local internet service providers to provide home internet access to students who qualify for Free and Reduced Price Lunch (FRPL). There are also multiple federal agencies working together to increase broadband opportunities in schools and libraries. Efforts at the school level to hold technology-focused events are also useful for parents to learn about low-cost internet options.
  • Superintendents who signed the pledge have a direct line to other superintendents who can serve as stronger channels for expanding their network. The OET supports the network of pledge signers and hopes to build a community to talk about the seven activities mentioned in the Future Ready Pledge:
    • Fostering and leading a culture of digital learning within our schools
    • Helping schools and families transition to high-speed connectivity
    • Empowering educators through professional learning opportunities
    • Accelerating progress toward universal access to quality devices for all students
    • Providing access to quality digital content
    • Offering digital tools to help students and families
    • Mentoring other districts and helping them transition to digital learning
  • There should be more dialogue between teacher education programs and schools. Professors could make a greater effort to equip graduate students with training on the importance of digital literacy in schools. Zac encouraged Forum members to submit stories they find about increasing dialogue between teacher preparation programs and schools to the OET.


Joint Session: Education Data Privacy pdf file (771 KB)

Tuesday, July 7, 2015

Peter Tamayo (Washington State Office of Superintendent of Public Instruction) introduced Michael Hawes, Statistical Privacy Advisor at the U.S. Department of Education (USED), who gave a presentation on the national landscape of education data privacy. Michael addressed pending bills regarding student privacy, many of which focus on how well third-party vendors are protecting students’ privacy. Some bills are also looking at what the Family Educational Rights and Privacy Act (FERPA) is doing to protect student privacy and whether it needs to be updated. The digitization of student information at all levels has created a very different world from the one in which FERPA was first enacted, and keeping pace with changes has been a challenge. There is a tremendous amount of misinformation about how student data are being used, what they are being used for, and what data are being collected about students. Questions about student privacy often turn out to be about the Common Core State Standards, or state and federal involvement in local education issues. These are not privacy questions, yet such concerns are still included in the discussion of how to deal with changes in education data privacy laws. More efforts are needed to explain to parents the value of collecting and using student data and how students and their families benefit from the data collected. Aggregate student information is necessary for accountability, and there is tremendous value in education data.

The pending bills are the result of action at the state and federal levels, and could radically change current practices. Even if the bills do not pass, there is still a chance that the provisions could affect Elementary and Secondary Education Act (ESEA) reauthorization or be moved into a new bill. Either way, the education field needs to be prepared to change the way business is done at the local, state, and federal levels; to be more forthcoming with information on data use, data collection, and data sharing; and to be more responsive to concerns.

Public concern could be the result of lack of information. The education industry should be focusing on concrete issues policy makers and researchers face as they use data; publishing Memoranda of Understanding (MOUs), research results, and data dictionaries; explaining why different data elements are being collected; showing parents and community members where to find information; and working with parents to find out what their concerns are so LEAs and SEAs can mitigate those concerns. Transparency will help shape the debate and help make it a productive one. Michael encouraged Forum members to use the surge in proposed education privacy legislation as a call to action.

Peter thanked Michael for his presentation and introduced Raymond Martin (Connecticut Department of Education), Susan M. Williams (Virginia Department of Education), and Janice Petro (Colorado Department of Education) who gave presentations on the impact of privacy legislation changes in their states.

Ray outlined new data sharing agreement changes in the state of Connecticut. Connecticut legislators were active in their efforts to ensure education data privacy, and the state has been proactive in incorporating FERPA and privacy issues in legislation. Susan continued the presentation and discussed the Virginia Privacy Act, which drove action throughout the state. She encouraged Forum members to view the video on the Virginia Longitudinal Data System, which serves as a communication tool to increase transparency around student data. Janice provided a brief summary of how the Colorado Department of Education (CDE) is working to protect the privacy of data collected, used, shared, and stored. The CDE brought privacy to the forefront of action by passing the Student Data Privacy Act, making time for employee awareness training around data sharing, appointing a privacy team, and strengthening its website. The Forum Guide to the Privacy of Student Information: A Resource for Schools and Forum Guide to Protecting the Privacy of Student Information: State and Local Education Agencies are posted on the CDE's FERPA Resources website, along with guidance for districts and schools; information on federal laws and policies; information about state laws and policies; and data privacy and security procedures. Peter thanked Raymond, Susan, and Janice for their presentations and opened the floor for Q&A. Forum members were interested in the following issues:

  • LEAs were interested in learning how to increase transparency around vendor agreements while also protecting vendors from becoming targets for hackers.
  • A large vendor allows identity automation with other third-party vendors. A Forum member wanted to learn more about how FERPA addresses what vendors can do with identity automation data. Michael responded that FERPA addresses this issue and that accounts should be protected. Agreements made with vendors should include restrictions on how vendors can use the data. There is a consumer aspect to privacy. Education is a service, and certain data are essential to the proper functioning of an educational institution. This implies there is some mandatory element of data collection, data use, and data sharing that is necessary for education to function effectively. Laws are established to ensure that data sharing is limited. Students and their families have the choice of whether or not to enroll in public schools, although data are also collected in other schooling systems. Any time there is a public service, there will be data collected to ensure public money is being properly spent.

Peter Tamayo thanked the panel for their presentations. He welcomed Michael to come back and give updates on bills and new legislation regarding student privacy.


Joint Session: Educator Equity Plans pdf file (771 KB)

Tuesday, July 7, 2015

Patrick Rooney, Deputy Director of the Office of State Support (OSS) at the U.S. Department of Education (USED), led a presentation on the development and implementation of educator equity plans. In a letter to Chief State School Officers, U.S. Secretary of Education Arne Duncan described how state education agencies (SEAs) can ensure that every student in every public school has equal access to great educators. The letter stated “The Department will ask that, in April 2015, each State educational agency (SEA) submit to the Department a new State Educator Equity Plan in accordance with the requirements of Title I of the Elementary and Secondary Education Act of 1965 (ESEA). As required by ESEA, in its plan, each SEA must, among other things, describe the steps it will take to ensure that ‘poor and minority children are not taught at higher rates than other children by inexperienced, unqualified, or out-of-field teachers.’ To prepare a strong plan, each SEA will analyze what its stakeholders and data have to say about the root causes of inequities and will craft its own solutions. The Department will issue guidance this fall to support SEAs in plan development and implementation.”

Patrick explained that the core principles of the work are that all students deserve equal educational opportunity and that teachers and principals deserve support. The purpose of Educator Equity Plans is to: 1) show states where there might be gaps in teacher equity; 2) compare certain teacher characteristics; and 3) identify districts with many of the state’s high poverty or high priority schools. A profile with descriptive statistics summaries was created for each state based on the data available. Examples of items include educator and classroom characteristics such as teacher qualifications and salary. Profiles also showed equity gaps in the highest poverty schools by district and location. The large State Data Files OSS creates are available to states if they do not have better data to create their own equity plans. The collected data are available to states to do their own analyses or to combine with other data sets. Convenience and transparency drive the approach. The data sources and elements for the files produced by OSS include the Civil Rights Data Collection (CRDC), EDFacts, Common Core of Data (CCD), and Comparable Wage Index (CWI).

A Forum member asked if there is a national average or benchmark for schools to meet. Patrick responded that there is not a national benchmark due to the wide variability across state and LEA needs. Another Forum member brought up the concern that the equity plans would increase burden and the data may not be accurate.

Patrick introduced Danielle Smith (USED) who gave a presentation on the use of the data OSS has received. All states were given the opportunity to receive technical assistance and feedback on their state plans through panel reviews. Each state plan had six major requirements:

  • consult with stakeholders;
  • identify equity gaps;
  • explain equity gaps;
  • identify strategies;
  • measure & report progress; and
  • identify equity gaps in three categories: inexperienced, unqualified, and out-of-field teachers.

Some states implemented the High Quality Teacher (HQT) Toolkit to define the three categories above. To report data on students in poverty, some states used Free and Reduced Price Lunch (FRPL), Temporary Assistance for Needy Families (TANF), or Supplemental Nutrition Assistance Program (SNAP) data. States were asked to calculate existing equity gaps in order to identify their root causes. Common themes included

  • inadequate preparation;
  • insufficient professional development;
  • adverse working conditions;
  • ineffective leadership;
  • ineffective human capital; and
  • insufficient funding.

Strategies from state plans are focused on providing supports to schools and districts and improving the supply of teachers to high needs schools, the management of human capital, and compensation for excellent educators in high needs schools. States work with the OSS to measure progress, address equity gaps, and report on progress. Technical assistance increases access to resources and a support network. A Forum member asked Danielle about exploring teacher effectiveness through the equity plans. Danielle responded that more than 10 states included teacher effectiveness data. Other states specifically left it out because they wanted to wait in order to ensure that the data would be reliable, but indicated that this is an area they would like to explore.


Forum Closing Session pdf file (521 KB)

Wednesday, July 8, 2015

NCES Update pdf file (521 KB)

Peggy Carr (Acting Commissioner, NCES) provided an update on NCES initiatives to the Forum. She began by reviewing recent Forum resources and explaining how the Forum and NCES help to ensure quality education statistics. She discussed many of the findings from The Condition of Education 2015 and encouraged Forum members to view new data videos available at https://www.youtube.com/user/EdNCES/videos. She then reviewed the landscape of national and global education assessments and shared findings from studies that compared NAEP with the Common Core Math standards and the Next Generation Science Standards.

New NCES initiatives include

  • enhanced and expanded outreach through social media;
  • improved core NCES annual reports; and
  • improved visibility for the Civil Rights Data Collection and EDFacts.

Peggy invited questions and comments from the audience.

Standing Committee Progress Reports

Recognition of Completed Projects
The Forum presented certificates to those members who helped work on the Forum Guide to Alternative Measures of Socioeconomic Status in Education Data Systems, and then to those who worked on the Forum Guide to College and Career Readiness.

Recognition of Forum Officers
The Forum also presented plaques to recognize the contributions of the Forum Officers.

Forum Election
Chair Thomas Purwin presented the slate of proposed 2015–2016 officers for a vote. The slate was seconded and then the Forum voted to approve the following members as 2015–2016 officers:

Chair: Peter Tamayo, Washington State Office of Superintendent of Public Instruction
Vice Chair: Laurel Krsek, San Ramon Valley Unified School District (CA)
Past Chair: Thomas Purwin, Jersey City Public Schools (NJ)
NESAC Chair: Kristina Martin, Macomb Intermediate School District (MI)
NESAC Vice Chair: Susan M. Williams, Virginia Department of Education
PPI Chair: David Weinberger, Yonkers Public Schools (NY)
PPI Vice Chair: Levette Williams, Georgia Department of Education
TECH Chair: Dean Folkers, Nebraska Department of Education
TECH Vice Chair: James Hawbaker, Appleton Area School District (WI)

Closing Remarks
The 2015-2016 Forum Chair Peter Tamayo (Washington State Office of Superintendent of Public Instruction) thanked Thomas Purwin for his leadership of the Forum. He highlighted the usefulness of Forum products and encouraged the use of publications and the Forum360 website. He also asked Forum members to complete the evaluation forms.


Steering Committee

Monday, July 6, 2015

Welcome and Agenda Review
Forum Chair Thomas Purwin (Jersey City Public Schools [NJ]) welcomed members of the committee and invited members to share their thoughts and comments on the day’s events.

Review of Monday’s Events
A large number of new members attended the New Member Orientation Session. Steering Committee members were pleased that many new members were familiar with Forum resources and several reported having used Forum resources in their state and local education agencies (SEAs and LEAs). Conversations about member uses of Forum resources continued in the Standing Committee meetings. Several new members were appointed to their positions immediately prior to the Forum and they will need to have mentors assigned to them.

Representatives of the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC) visited each Standing Committee throughout the day. Forum members were interested in reports from the consortia, and noted that since testing is underway the consortia have new topics to discuss.

  • NESAC Chair Janice Petro (Colorado Department of Education) reported that the NESAC agenda included Forum Working Group updates and a presentation from the Office of Special Education. New members were active in NESAC discussions and it was clear that mentorship helps new members to understand Forum work.
  • PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) reported that PPI spent considerable time discussing issues related to education data privacy. Members suggested that the Forum should consider developing a best practices guide that could include references to Privacy Technical Assistance Center (PTAC) guides and other resources. PPI members also began using a new table structure that replaced the hollow square with separate round tables. David noted that the new structure was working well.
  • TECH Chair Michael Hopkins (Rochester School Department [NH]) reported that Thomas Purwin’s presentation on the Consortium on School Networking (CoSN) Meeting was well-received and several members requested a copy of Thomas’s PowerPoint presentation. TECH divided into two groups (SEAs and LEAs) for an open discussion on member-generated topics. Members discussed data visualization and sustainability at length.

Forum Vice Chair Peter Tamayo (Washington State Office of Superintendent of Public Instruction) noted that he had not been expecting a forty-five-minute break following the opening session. In the future, such a long break could be used for roundtable discussions or other member-led discussions. NESAC Vice Chair Kristina Martin (Macomb Intermediate School District [MI]) noted that the long break gave members an opportunity to network.

Zac Chase (Office of Educational Technology) gave an engaging and interactive presentation. Members were interested in learning more about the research behind the initiatives Zac discussed. Ghedam Bairu (NCES) suggested that the Forum can follow up with Zac to learn more.

Steering Committee members explored the idea of a new working group on the topic of privacy. Ideas included:

  • Partnering with PTAC. PTAC has excellent resources, but it can be difficult for practitioners to determine what PTAC resources are best suited to their work. Guides and/or briefs that target specific audiences are needed to help practitioners identify appropriate resources.
  • Identifying resources and best practices for teachers. Teachers need to know what types of software they can use, and how privacy regulations apply to their work.
  • Compiling related resources. In addition to PTAC, groups such as the Consortium on School Networking, the National Association of State Boards of Education, the Data Quality Campaign, and others have privacy resources.
  • Focusing on examples. Real-life examples of potential privacy concerns and alternate approaches that protect data would be useful to include in the document. The document should include both negative and positive examples.

Tuesday, July 7, 2015

Review of Tuesday’s Events
Steering Committee members reported that Michael Hawes gave an excellent presentation on education data privacy, and members would benefit from more opportunities to discuss privacy with him.

Tuesday’s Standing Committee agendas included updates on the National Assessment of Educational Progress (NAEP) transition to digitally-based assessments, follow-up discussions on data privacy, presentations on National Science Foundation (NSF)-funded research into science, technology, engineering, and mathematics (STEM) indicators, updates on the Civil Rights Data Collection (CRDC), and information on School Climate Surveys.

During the NESAC data privacy discussion, members noted that LEAs would appreciate additional information and best practices related to how teachers use software. NESAC members provided suggestions and information to Martin Orland and Ellen Mandinach on STEM-related data available in student information systems and engaged the CRDC presenters in a discussion about the upcoming data collection. The school climate survey tool under development by the National Center for Education Statistics (NCES) promises to be useful to many SEAs and LEAs. NESAC members were interested in learning more about

  • data visualization;
  • LEAs and privacy;
  • reauthorization of the Elementary and Secondary Education Act (ESEA); and
  • data governance best practices for LEAs.

PPI members continued to discuss privacy issues on Tuesday and welcomed Baron Rodriguez from the Privacy Technical Assistance Center (PTAC), who discussed new resources. The NCES School Climate Survey tool interested PPI members, but there was some concern that the surveys could be co-opted by other groups. PPI members were interested in learning more about data burden and the CRDC.

TECH members appreciated the opportunity to provide feedback on the CRDC and to discuss whether EDFacts could be useful as a pre-population source. Members were interested in receiving future information on the NAEP transition to digitally-based assessments and were curious about why the assessment is not web-based. TECH members were interested in ways the Forum can gather information related to education data privacy.

Steering Committee members reported that the Educator Equity Plans Joint Session was a useful and informative presentation, but members shared concerns about how the plans will affect their SEAs and LEAs, including

  • the plans may increase burden;
  • reauthorization of ESEA may change the requirements for the plans; and
  • some collective bargaining agreements may affect the ability of agencies to implement the solutions.

New Working Groups
The Steering Committee voted to approve the creation of two new Forum Working Groups:

  • Education Data Privacy
  • Data Visualization

Forum Elections
Standing Committee Chairs reported the results of their elections. Proposed Chairs and Vice Chairs for the 2015-16 year were

  • NESAC Chair: Kristina Martin, Macomb Intermediate School District (MI)
  • NESAC Vice Chair: Susan Williams, Virginia Department of Education
  • PPI Chair: David Weinberger, Yonkers Public Schools (NY)
  • PPI Vice Chair: Levette Williams, Georgia Department of Education
  • TECH Chair: Dean Folkers, Nebraska Department of Education
  • TECH Vice Chair: Jim Hawbaker, Appleton Area School District (WI)

The Steering Committee proposed Peter Tamayo (Washington State Office of Superintendent of Public Instruction) as the Forum Chair and Laurel Krsek (San Ramon Valley Unified School District [CA]) as Vice Chair for 2015-16.

Wednesday, July 8, 2015

Welcome to New Steering Committee Chairs
Newly-elected Chair Peter Tamayo (Washington State Office of Superintendent of Public Instruction) welcomed new Steering Committee Members to the meeting and reviewed the responsibilities of Vice Chairs, which include

  • mentoring new members;
  • promoting Forum products;
  • assisting the Chair; and
  • serving on the Forum Voice Editorial Board.

Review of Wednesday’s Events
Wednesday’s Standing Committee agendas included updates on EDFacts and the Regional Educational Laboratory (REL) Program. Chairs reported on the day’s events and shared recommendations from the committees.

NESAC members were interested in

  • convening a webinar in August to allow Forum members to discuss EDFacts comments;
  • data visualization;
  • ESEA reauthorization;
  • LEA data governance; and
  • data privacy.

PPI members were interested in

  • early childhood data;
  • marketing to parents; and
  • increasing transparency.

TECH members were interested in

  • identifying a tangible project/piece of work for TECH;
  • identity management and single sign-ons;
  • data visualization;
  • communications;
  • facilitating SEA/LEA discussions around common concerns (e.g., what are members’ recent accomplishments, what keeps you up at night);
  • establishing a calendar of regular Forum or TECH webinar events; and
  • experimenting with a new room design (in the style of NESAC) at the next in-person meeting.

Other Issues
Steering Committee members agreed that core dates and times for webinars should be set well in advance. Members were also interested in engaging new members through webinars. Peter Tamayo noted that the Forum’s New Member Handbook and Forum360 site (http://forum.grads360.org/) are useful for both new and established members. Laurel Krsek suggested that new members would understand more about the standing committees if the Forum offered a webinar that approaches a single topic from the vantage point of each committee.

Top

National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, July 6, 2015

Morning Session
Welcome & Introductions
NESAC Chair Janice Petro (Colorado Department of Education) and Vice Chair Kristina Martin (Macomb Intermediate School District [MI]) began the meeting with introductions and reminded members that Forum meetings are working meetings designed for Forum members and invited guests. Others interested in the work of the Forum were invited to attend the STATS-DC Data Conference immediately following the Forum.

Agenda Review and Review of Recent Activities
NESAC Chair Janice Petro (Colorado Department of Education) provided a summary of recent Forum activities such as webinar presentations, Steering Committee calls, working group meetings, and School Courses for the Exchange of Data (SCED) tool development. She also gave a brief review of agenda items for the NESAC meeting.

Review of NESAC Mission
NESAC Chair Janice Petro (Colorado Department of Education) recited the NESAC mission statement to remind NESAC members and new Forum members of the ways NESAC fulfills the Cooperative System’s legislative mandate to improve the quality of data at the federal, state, and local levels.

Suggestions for New Forum Members
NESAC Chair Janice Petro (Colorado Department of Education) opened the floor for NESAC members to offer tips and advice to new Forum members attending the NESAC standing committee. NESAC members suggested ways that new Forum members can become active in the Forum, including

  • utilize listservs;
  • attend the three standing committees (NESAC, TECH, and PPI) to get a better idea of which they would like to join;
  • pursue additional opportunities to engage with current Forum members, such as participating in quarterly meetings or getting involved with working groups; and
  • participate in feedback for other state education agency (SEA) challenges, such as being willing to share their own experiences around managing subgroup data.

College and Career Ready Working Group Update
Dean Folkers (Nebraska Department of Education) provided an overview of the Forum’s College and Career Ready (CCR) Working Group. Dean emphasized the opportunity for the working group to bring together many different perspectives around CCR from local education agencies (LEAs) and SEAs. The Forum Guide offers five use cases for using data to support college and career ready goals:

  • Tools to Support Individualized Learning
  • Educator Support Systems to Address Student-Specific Needs
  • Postsecondary Feedback Loops to Guide CCR Programmatic Decisions
  • Metrics, Accountability, and Continuous Improvement
  • Maximizing Career Opportunities for All Students

By approaching CCR from the philosophical standpoint that students have the opportunity to choose their pathway, whether college or career, the working group was able to bring together a wide variety of conversations regarding CCR. The Forum Guide to College and Career Ready Data is intended for anyone with an interest in preparing K-12 students to be college and career ready.

School Courses for the Exchange of Data (SCED) Working Group Update
NESAC Chair Janice Petro (Colorado Department of Education) provided an overview and update on SCED Working Group activities. The Forum established a SCED Working Group to regularly review and update SCED with the assistance of subject matter experts at the local, state, and national levels. Since the last Forum meeting, the Working Group released SCED 3.0, which focused on new and updated standardized courses, including International Baccalaureate and Project Lead the Way; new and updated Family and Consumer Sciences courses; and new courses from the National Center for Education Statistics (NCES) Transcript Studies. NCES also supported mapping SCED to the Classification of Secondary School Courses (CSSC) and implemented SCED as the course coding system for NCES Transcript Studies.

NCES recently supported the development of the SCED Finder (http://nces.ed.gov/SCEDfinder/home). This new tool helps users select predefined SCED codes and then assign elements and attributes to courses. The SCED Finder makes it easier for LEAs to map locally adopted codes and to identify available SCED codes. The Working Group is currently reviewing suggestions for Version 4.0, which will include a comprehensive update. NESAC committee members discussed ways that SEAs can work with LEAs to increase buy-in and implementation of SCED, such as through webinars or in-person training sessions.

Virtual Education Working Group Update
Laurel Krsek (San Ramon Valley Unified School District [CA]) updated NESAC on the Virtual Education Working Group and the development of a new Forum Guide to Virtual Education Data. The Forum convened the Virtual Education Working Group to review and revise the Forum Guide to Elementary/Secondary Virtual Education (2006). The virtual education environment has grown and changed since 2006, and includes vastly different technologies and approaches to teaching and learning. At the same time, new developments in the field of data standards such as the Common Education Data Standards (CEDS) and SCED have made it easier for state and local education agencies to collect, manage, compare, and use education data to inform and improve education. After reviewing the 2006 document, the Working Group began development of a new resource that will assist state and local education agencies as they

  • consider the impact of virtual education on established data elements and methods of data collection; and
  • address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education.

Much of the structure and content of the old guide will remain in the update. However, the Working Group has added information to address the challenges currently facing SEAs and LEAs, such as establishing data governance and determining LEA/SEA responsibilities in situations where the virtual provider is out-of-district or out-of-state. One topic that has sparked considerable discussion is the challenge of collecting data when information relevant to virtual education does not fit established reporting practices. For example, many systems track course credit based on “seat time,” which is problematic when collecting data on virtual courses that award credit based on competency measures.

Very few data elements are used solely for virtual education. However, a number of data elements that often exist in traditional data systems are particularly useful for collecting virtual education data. For example, CEDS includes a number of elements that can be used to identify organizational responsibilities, such as the “Responsible Organization Identifier.” CEDS also offers elements that can be used to identify achievement criteria for competency-based education. The Working Group has highlighted these elements in the document, along with other elements that are well suited to collecting virtual education data.

The Working Group has also found that virtual education regulatory structures vary; some states have state endorsements for online programs, while others do not. The Forum Guide includes an appendix of state links to information on virtual education. Forum members are encouraged to send input for the appendix to Ghedam Bairu (NCES). The Working Group expects to release this new resource later this year.

Afternoon Session
State and Local Education Agency Uses of Forum Resources
Robert Rodosky (Jefferson County Public Schools [KY]) shared his experience with using Forum resources in his school district. In addition to disseminating Forum Guides, he has utilized Forum resources to develop trainings around building a culture of data quality. Furthermore, he included the Forum Guide to Supporting Data Access for Researchers: A Local Education Agency Perspective in an article published in Phi Delta Kappan titled “School Districts as Partners in Research Efforts.” NESAC members shared examples of how they use Forum products:

  • The Forum representative from Guam shared Forum resources with local university faculty who work with graduate student researchers at the Graduate School of Education. As a result, the Guam Department of Education received significantly more research proposals that contribute to the LEA’s strategic plan.
  • Another Forum member explained that learning about best practices from SEAs can be useful for other Forum members to see if their LEAs or SEAs are aligned with their own education agency’s goals.
  • Forum members use Forum resources to start conversations around increasing data quality throughout their education agencies.
  • The Forum Guide to Facilities Information Management: A Resource for State and Local Education Agencies is useful for graduate students because textbooks are not always relevant to the day-to-day work of SEAs and LEAs.
  • Forum guides are a useful way to engage leadership around decisions about how to build research and data sharing contracts, or how to develop indicators.
  • Forum Guides are sponsored by NCES but they reflect the voices and perspectives of Forum members. The guides include best practices that help Forum members promote better practices in their state. Guides are also utilized to provide clarity on education data governance.
  • An SEA reported that the Forum Guide to School Courses for the Exchange of Data (SCED) Classification System helped to open up discussion around the use of new SCED codes in ways that are compatible with the existing state system.
  • The Forum Guide to Building a Culture of Quality Data: A School and District Resource is useful for staff who are involved with data and who need to know about the Family Educational Rights and Privacy Act (FERPA).

Update on OSEP-Funded TA and Data Centers pdf file (2.9 MB)
David Guardino and Meredith Miceli (Office of Special Education Programs [OSEP], U.S. Department of Education [USED]) joined NESAC to provide information on the OSEP Technical Assistance (TA) Centers. Their presentation, “Update on OSEP-Funded TA and Data Centers,” gave NESAC members a broad overview of initiatives, programs, and technical assistance and dissemination efforts funded through OSEP. OSEP supports improved outcomes for all students by leading results-driven accountability components. These components are aligned in a manner that best supports states in improving results for infants, toddlers, children, and youth with disabilities, and their families. The purpose of the Individuals with Disabilities Education Act (IDEA), Part D, is to support the development and implementation of comprehensive strategies that improve educational results for children with disabilities. The OSEP discretionary portion covers five programs:

  • Education Technology, Media, and Materials
  • Parent Information Centers Program
  • Personnel Preparation
  • State Personnel Development Grants (SPDG)
  • Technical Assistance and Dissemination (TA&D)

Another related but separate source of support from OSEP is the Technical Assistance on State Data Collection Grant Program and the four national TA centers within the program: the Center for IDEA Early Childhood Data Systems (DaSy), the Center for IDEA Fiscal Reporting (CIFR), the Center for the Integration of IDEA Data (CIID), and the IDEA Data Center (IDC). The purpose of the grant program is to provide technical assistance that improves states’ capacity to

  • collect, report, analyze, and utilize high quality IDEA data;
  • meet IDEA fiscal requirements; and
  • meet IDEA data reporting requirements.

From a data quality perspective, it is important for OSEP to clarify reporting requirements through resources on the centers’ websites. Collaboration among the OSEP Data TA Centers supports a continuum of approaches, including information sharing, conferences, webinars, TA resources, and direct TA to states.

David and Meredith also provided a number of handouts to NESAC members.

Assessment Consortia
Jessica McKinney (Office of State Support, USED), Jeffrey Cuff (Partnership for the Assessment of Readiness for College and Careers [PARCC]), and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) joined NESAC to provide an update and discussion on the work of the assessment consortia. Jessica began the presentation with a review of the Race to the Top (RTT) pdf file (724 KB) context and RTT summative assessment goals. Jeffrey provided an overview pdf file (718 KB) of PARCC, summative administration, non-summative administration, the PARCC Platform Solution Diagram, student response capability features, interoperability standards, data reporting to states/districts, and technical challenges. Brandt continued with a presentation pdf file (900 KB) that highlighted Smarter Balanced achievements in the past year, the three pillars of SBAC assessments, accessibility features, the variety of compatible devices, test results data that are CEDS and Schools Interoperability Framework Association (SIF) compliant, interoperability standards, and technical challenges.

Tuesday, July 7, 2015

Morning Session
E-NAEP pdf file (697 KB)
William Ward (Assessments Division: National Assessment Branch, NCES) provided an update on lessons learned from the pilot technology-based National Assessment of Educational Progress (NAEP). The NAEP program is in the midst of transitioning all assessments to digitally based content and delivery. Beginning in 2017, the NAEP mathematics, reading, and writing assessments will be administered to students throughout the nation on NAEP-provided tablets. The first stage began with a pilot test in 2015 for mathematics, reading, and science.

Bill led the Q&A portion of the presentation and discussed the following topics with NESAC members:

  • increasing communication with principals and district staff about the online tools;
  • maintaining smooth administration of the assessment despite obstacles in the electronic delivery of information;
  • enabling superintendents to populate the web application to ensure states have the correct student list;
  • considering the potential differences between paper and digital delivery of the pilot assessment;
  • learning through the use of tools available to students throughout the assessment;
  • including support for sub-populations and students with disabilities;
  • surveying student access to computers across the country;
  • collecting data on Free and Reduced Price Lunch (FRPL); and
  • developing plans for a more seamless student experience through adaptive assessment.

Committee Discussion: Education Data Privacy
NESAC Chair Janice Petro (Colorado Department of Education) and Kristina Martin (Macomb Intermediate School District [MI]) led a discussion on education data privacy. Linda Rocks (Bossier Parish School System [LA]) began the discussion by describing some of the obstacles to being transparent with data while maintaining third-party vendor agreements. When teachers decide to utilize cloud applications, they may click “I agree” without knowing exactly how the data they enter into that application will be used by the vendor. It is a significant challenge for LEAs and SEAs to monitor every agreement made with the wide range of digital resources available on the internet. NESAC suggested that the Forum could develop a resource that provides best practices around data governance and education data privacy with third-party agreements.

NESAC members shared obstacles they have experienced while maintaining education data privacy at the LEA and SEA levels. Members also discussed the importance of properly de-identifying personally identifiable information (PII) and scholastic records. When data elements are requested over multiple years, the accumulated information can later be used to identify students. NESAC members were interested in better methods for protecting PII for students and families who might be targeted for unwarranted or inappropriate services.

The Virginia Department of Education data governance team put together a video on the Virginia Longitudinal Data System titled “What is VLDS?” The video is posted on the Virginia Longitudinal Data System website for all visitors to learn how Virginia secures the privacy of its student data. The video provides clarification to parents on what happens to the data once they are collected.

Forty-six states have proposed legislation that could affect education data privacy at the local and state levels. The evolving political landscape of education data privacy requires states to face challenges head-on and to build the capacity to improve their data governance policies on their own. A NESAC member brought up the concern that new legislation at the state level may add further burden and financial cost to improvement efforts. Colorado, for example, has seen legislative movement because of activism on the part of parent groups.

Managing requests for data takes considerable time and resources. Forum members noted that they are not always able to limit the data that are requested because laws around privacy are inconsistent. Navigating a high volume of requests presents challenges, especially large requests that require a significant amount of work before the data can be handed off to the requestor. Some agencies have staff who are dedicated to dealing with merged data sets. The combined calculations from a single request can become a significant burden for SEAs as they work to maintain high standards of education data privacy. Moreover, there should be boundaries on the purpose and use of the data that are made available to requestors.

Not everyone knows what FERPA is, and many staff in education agencies may not understand the importance of following FERPA guidelines or the impact that failing to follow them has on the privacy of student information. State superintendents have the ability to redefine the list of elements that are part of directory information; however, it is up to districts to decide which directory information may be shared for specific purposes. When a charter school takes advantage of open records, it can go door-to-door in neighborhoods and directly market services to certain students and families. Some states may consider this a violation of FERPA, while others may not. Lawyers may not understand FERPA guidelines well enough to adequately protect the education data privacy rights of students, and finding legal counsel who are knowledgeable about student information and relevant laws has become a concern in some states. Lastly, implementing data agreements with vendors is essential to prevent lists of students, numbers, and addresses from being published.

Science, Technology, Engineering, and Math (STEM) Indicators pdf file (707 KB)
Karen King (National Science Foundation [NSF]), Ellen Mandinach (WestEd), and Martin Orland (WestEd) led a presentation on NSF-funded research on STEM indicators. The research examines the alignment between indicators that the STEM communities feel are critical and elements available in administrative systems that can inform those indicators. The panel shared findings to date and opened a discussion for feedback and recommendations in this area.

Of the 14 STEM Indicators, the following are most closely connected to Statewide Longitudinal Data Systems (SLDSs):

  • Indicator 1: the number of and enrollment in different types of STEM schools and programs in each district.
  • Indicator 2: teachers’ estimates of the amount of time (instructional minutes per week) that they devote to teaching science.
  • Indicator 3: the range of in-school but non-classroom science-related learning opportunities that elementary schools may offer, arrange, or help broker.
  • Indicator 6: teachers’ science and mathematics content knowledge for teaching.
  • Indicator 7: teachers’ participation in professional development activities targeted for one of the STEM fields or STEM in general.
  • Indicator 8: instructional leaders’ participation in professional development on creating conditions that support STEM learning. Some of these conditions may be specific to STEM, while many are aspects of a positive school climate that supports all kinds of academic learning.

Members were interested in learning more about STEM indicator research. The following topics were discussed in the Question & Answer portion of the NESAC presentation:

  • The STEM Indicator research is a feasibility study. The researchers are not aiming to create an additional data collection. The study is probing how NSF can inform the STEM indicators without adding burden.
  • Researchers are looking into use of the CEDS.
  • In state-level data for K-5, it is unknown how much time is spent on science; for example, a 3rd grade teacher is not necessarily a science teacher. There is great variation in how each district provides that information.
  • Defining STEM presents challenges when considering certain programs to include in the study. Not only do program definitions differ, but the specificity of what is being measured also brings in unique frameworks and classifications of the STEM course or STEM teacher, which may complicate the research analyses.
  • Out-of-field STEM teaching is common in states that do not have any STEM teacher certification programs.
  • In other states, STEM is defined by funding stream. For example, in some states, at the district level, STEM definitions are based on which grants they received from specific sources (Bill & Melinda Gates Foundation, seed companies, etc.). In many of these cases, STEM opportunities are created by interested teachers who want to do more to support their students in these fields.
  • The researchers are interested in building on top of work that has already been done. Using language educators already use, such as School Courses for the Exchange of Data (SCED) codes, CEDS, and data standards, may help to engage more SEA and LEA staff in the STEM Indicators research.
  • Forum members suggested adding a place/variable for schools that consider themselves STEM, but do not fall into the definition set through the research.
  • Even though STEM is a popular topic at the SEA level, in some LEAs STEM is still low on the radar. They have career and technical education (CTE) directors, CTE courses, and CTE definitions, but not STEM.

Karen, Ellen, and Martin discussed whether SLDSs may be a resource and repository for the needed data elements to address the STEM indicators of success at the state and national levels. They have also done an analysis of CEDS elements using the Align tool to see what elements states are currently using in this area. Ellen invited Forum members to participate in focus groups to contribute more feedback.

Afternoon Session
Civil Rights Data Collection (CRDC) pdf file (527 KB)
Abby Potts (NCES), Ross Santy (NCES), Janis Brown (Office for Civil Rights [OCR]), and Rebecca Fitch (OCR) joined NESAC to provide an update on the CRDC. Ross began the update by explaining how the CRDC has evolved over the years to incorporate user feedback. Abby gave a PowerPoint presentation outlining updates to the CRDC and lessons learned from previous years’ collections.

The website has been significantly reworked based on feedback from 2013-14 so that it can handle more widespread use and data entry for 2015-16. The website needs to be compatible with the demands of complex student information systems in large districts. The collection was scheduled to open in early April but had to shut down because it was not performing acceptably from a technical perspective; it officially opened on April 30. The pre-population and file upload feature on the website has been successful with eight states. OCR is working to keep the FAQs and tip sheets on the website updated as more questions come in. There is also a need to post specific resources on school expenditures so that staff in each state’s finance office can learn how to use the site.

NESAC members discussed the following topics with the presenters:

  • OCR changed the organization of the tool to reflect feedback gathered from LEAs during site visits.
  • Summertime is a difficult time to collect data from smaller districts, whose staff are often unavailable during the summer months. Smaller districts may run into more challenges and lag behind larger districts.
  • The capability to prepopulate file uploads that match the CRDC submission system streamlines the process for LEAs to review the data for accuracy.
  • Cognitive interviews were useful to inform changes to the website design. The site has improved usability with updated FAQs that address common questions. In addition, downloads of short tip sheets and smaller worksheets about how to navigate the website have increased into the thousands.
  • OCR is also working to make the site more efficient for 2015-16 by refining the edit process so that the error messages are not so voluminous.
  • It will be useful to include additional resources for data administrators who are in charge of school level expenditures.
  • Depending on who is asked, there is no optimal time for the CRDC submission window. NESAC Committee members agreed that a generous window would be preferable at any time of the year, and that summertime is generally not a good time for the window.
  • Even for those who submit data on a regular basis, the collection can seem like a new experience each time. It would be useful if the CRDC could provide more support to LEAs so that they understand where the data collection comes from and know what to do and how to do it.
  • Sending information to a single main contact has a limited impact. Messages should also be sent to SEA contacts.
  • The more alignment the CRDC has with data collections LEAs are already doing, the easier it is to enter the data. The CRDC is working to align more with EDFacts so that LEAs can prepopulate data on the site, reducing the burden on LEAs.
  • LEAs must be the locus of control over the data that get reported to the CRDC. The certification statement is proof that the CRDC is being completed correctly. The CRDC is unable to collect data directly from EDFacts.
  • OCR carefully considers each extension request, especially when a school needs to be added or deleted and the LEA needs more time to enter data.
  • Many LEAs are burdened with putting together CRDC data for students who took the SAT and ACT. There are many challenges with data validation and with merging these data into the student information system (SIS). NCES has a workgroup dedicated to tackling the issue of submitting student lists of SAT and ACT test takers. Forum members suggested that, when working with the College Board, it would be easier for LEAs to use their own student IDs.
  • OCR is looking forward to the 2015-16 CRDC and as of July 2015, the Office of Management and Budget (OMB) package is out for public comment. Minor clarifications and changes have been made to the package.
  • IDEA counts and enrollment counts might not be the same depending on when the window ends. Furthermore, if the windows move, SEAs may no longer be able to prepopulate those data, and the CRDC could potentially reintroduce some burden that had been removed.
  • The public comment period is an important time for LEAs and SEAs to submit this exact kind of feedback.

School Climate Surveys (SCLS) pdf file (527 KB)
Isaiah O’Rear (NCES) provided an update on the development of a web-based data collection platform for the administration of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. Isaiah provided an overview of the SCLS background and highlighted the SCLS respondent demographics, administration of the survey, online platform, and reporting of SCLS results. Isaiah led the Q&A discussion to address questions from the Forum about the surveys. Through this discussion, Isaiah offered more information on the following aspects of the survey:

  • Subgroups are self-reported.
  • SCLS researchers considered several different models for school climate. Many of those models included teaching and learning. The SCLS is not based on a specific model and does not directly address academic issues.
  • LEAs have the ability to assign student IDs randomly or to create a crosswalk that links to their own data sources as a *.csv document.
  • Every respondent receives a PIN upon logging on to the survey. The PIN allows respondents to complete the survey in multiple sittings.
  • The SCLS will be available in October.
  • The SCLS will provide benchmarks based on 250 middle schools and 250 high schools that reflect a nationally representative sample collected by NCES.
  • No statistical tests have been planned. Aggregate school level scores will be available.
  • The SCLS does not have a comment section for respondents. Administrators do, however, have the functionality to create their own custom questions.
  • Each respondent group will have about 70 questions, which take an average of 20 to 25 minutes to complete. NCES is working to reduce the cognitive burden on respondents.
  • The database lives on the administrator’s local server; the data are not transmitted to NCES.
  • There are no plans to have a post-secondary version of SCLS.
  • Regarding discipline and disciplinary infraction data, the SCLS does not ask students about their individual behavior or their own experiences. The survey focuses on environment, not incident counts. If LEAs are interested in adding their own items, there is the flexibility to add custom questions and response items for things that are not on the survey.

Isaiah encouraged NESAC members to visit the SCLS website at http://safesupportivelearning.ed.gov/scls to learn more about the survey and to watch the SCLS Administration Guide videos.

Topics from the Floor
Susan Williams (Virginia Department of Education) opened the floor to NESAC members to engage in discussion on data visualization. Virginia contracted with the Center for Innovative Technology to do a nationwide research project on best practices for displaying report card-like data for the public using graphics and charts. Susan provided a brief overview of this project and asked NESAC members about any best practices around displaying data and communicating information with their audience. NESAC members shared the following feedback:

  • For Tennessee LEAs, data visualizations are often utilized for internal consumption by teachers and school staff. Making complex data consumable greatly depends on data use and what is appropriate for different audiences.
  • LEAs are interested in determining rules for flagging information on a dashboard (as red, yellow, or green), and determining how to effectively use impact data.
  • More discussion on helping LEAs navigate PII issues associated with digital learning programs would be useful.
  • A NESAC member reminded colleagues of the importance of paying attention to accessibility requirements.
  • Online interactive tools may be a more inviting way to encourage users to consume data. Applying geographic information systems in the field of education is a challenge, and it is not easy to find experts in this area.
  • Funding sources for certain products are not always available. Other state departments can take LEA or SEA data and publish it with better technology, but because they do not have an education background, the data may be presented incorrectly.
  • The goal is to put out sensible data visualizations that are accessible to all.
  • Another goal would be to help users understand what the data visualization means. There is a mismatch of interest between an audience of colleagues and the public audience.
  • Graphic designers are skilled at knowing their audiences and experts could assist with data visualizations.
  • Audiences should also be aware of the balance between simplicity and error. For example, some software performs complex functions in the background; if a graphic designer does not understand them, they could end up presenting error-filled data.

Other topics from the floor included:

  • Volunteers are encouraged to provide feedback to NSF researchers on STEM Indicators.
  • Forum members are interested in future updates about ESEA reauthorization and the Every Child Achieves Act of 2015.
  • Forum members would like to hear more about data governance at the LEA level, and how to maximize support from executive leadership on their data system frameworks.

NESAC Election
Kristina Martin (Macomb Intermediate School District [MI]) was elected NESAC Chair and Susan M. Williams (Virginia Department of Education) was elected NESAC Vice Chair for 2015-16.

Wednesday, July 8, 2015

Morning Session
Committee Discussion: Education Data Standards
NESAC Chair Janice Petro (Colorado Department of Education) opened the floor for an informal committee discussion on the topic of education data standards. Forum members discussed the most recent version of CEDS:

  • SEAs seem to be doing more with CEDS than LEAs.
  • CEDS is particularly useful when blending different data systems into one consolidated system. Moreover, several vendors are now using CEDS, which reduces burdens and enables SEAs to map and manage data over one common framework.
  • More professionals are realizing the value in data management as a discipline.
  • The alignment tool is useful for specific outputs when looking at the same field in different data systems. This ensures that the right information is being pulled out for the right purpose.
  • When data are mapped to CEDS, SEAs can use CEDS to help clarify data element content.
  • Some SEAs use CEDS for outreach. CEDS can be used to start conversations around the importance of having a common language to improve communication between education agencies when students move. CEDS outreach materials help to support SEA efforts to maintain data quality across their state.
  • LEAs can benefit from using a common language that clarifies understanding. The more people that use CEDS, the easier it becomes to ask for specific element fields, locate the data, and understand how it is collected. CEDS can make data governance easier for LEAs.
  • If all data collections were defined in CEDS, it could work as a self-monitoring data governance tool.
  • Some SEAs work with vendors that are unable to tell the SEA which data the vendor needs to pull from the SEA’s data system.
  • If LEAs and SEAs demand that vendors must be CEDS-compliant, LEAs and SEAs can make CEDS an industry standard.
  • EDFacts is mapped to CEDS.

Steering Committee Business/Report
NESAC Chair Janice Petro (Colorado Department of Education) reviewed topics discussed by the Steering Committee, as well as meeting events, action items, and issues that will be reported at the Closing Session. The Steering Committee approved two new working groups on the topics of data visualization and data privacy.

EDFacts
Challis Breithaupt, Barbara Timm, and Kelly Worthington (NCES) provided an update on EDFacts and the School Year 2015-16, 2016-17, and 2017-18 OMB Information Package. The OMB package is the official registration of the EDFacts data collection with the Office of Management and Budget (OMB). Every three years, EDFacts must go through a clearance process that includes two public review cycles, during which anyone can comment on the items being collected through EDFacts. Changes to EDFacts are proposed by the ED program offices that steward the data. The new or changed items being proposed include: directory definitional changes, direct certification, chronic absenteeism, a homeless category in the adjusted cohort graduation rate, kindergarten entry assessments (where applicable), and scale score means and standard deviations.

The package has since been posted for public comment: http://www.regulations.gov/#!documentDetail;D=ED-2015-ICCD-0090-0001. Comments are due on September 8th. This is the first, 60-day public comment period, after which ED will revise the package, respond to the comments, and repost it for a 30-day comment period. After that period, a final package is presented to OMB for approval. Forum members should review and respond to the directed questions in Appendix D.

NESAC members were interested in discussing the following comments and topics with the presenters from EDFacts:

  • Chronic absenteeism is often collected in hours per year, not in days. EDFacts wants to collect the exact information states have, and is interested in hearing whether its requests are reasonable and feasible for states to implement.
  • The Office of Civil Rights (OCR) and the School Improvement Grant (SIG) program both have proposed definitions for absenteeism. It would be useful for an FAQ or Q&A to go out to states to help clarify the two different definitions.
  • The way that schools take attendance varies greatly across states.
  • EDFacts is interested in comments about the burden and associated costs of the proposed changes. The EDFacts team will work to address each comment individually.
  • EDFacts is interested in finding the best way to get the most complete data sets from SEAs.
  • Forum members are highly encouraged to participate in the 60-day comment period for the proposed EDFacts package to receive official feedback on specific questions.
  • Data collection for homeless students remains a challenge for SEAs, and the associated costs of tracking homeless student information continue to be a burden. EDFacts is aware of the difficulty of collecting accurate data.
  • EDFacts is interested in learning more about state definitions, economic disadvantage, and direct certification data. If states are concerned with following certain guidelines, EDFacts wants to better understand how to improve communication.
  • EDFacts is aware of the Forum Guide to Alternative Measures of Socioeconomic Status in Education Data Systems, which discusses broader ways of defining socioeconomic status with regard to direct certification.

Regional Educational Laboratory (REL) Program pdf file (689 KB)
Joy Lesnick and Lisa Sanders (National Center for Education Evaluation and Regional Assistance, USED) led a presentation on the REL Program. They outlined REL product types, Institute of Education Sciences (IES) publications, infographics, and logic model maker software. Joy also provided an overview of the REL Program Research Alliances: groups of practitioners and/or policymakers who work with researchers to use data and research to understand and address a specific education problem. The 79 REL Research Alliances are diverse in topical focus, structure, membership, and geography. Joy encouraged NESAC members to learn more about REL publications by signing up for the IES Newsflash at www.ies.ed.gov/newsflash.

Joy led the Q&A portion of the presentation and provided the following information:

  • REL Publications can be found at http://ies.ed.gov/ncee/edlabs/projects/index.asp.
  • The Ask A REL tool provides annotated bibliographies to users who submit an education research question. The turnaround time for an answer may vary because RELs search databases, literature, and libraries to provide accurate information, especially for highly specific questions. Ask A REL can serve as technical support and service to each region. Some RELs publish all of their Ask A REL questions.
  • Currently, infographics are located in multiple locations. Forum members would benefit more from the REL website if all of the infographics were in the same place or easier to find.
  • Ghedam encouraged NESAC members to reach out to the associate Forum member from their respective REL. She also welcomed discussion around better ways to communicate with RELs.

Meeting Review/Planning Next Steps
Janice Petro (Colorado Department of Education) opened the floor to comments and suggestions from NESAC members on the 2015 Summer Forum Meeting. NESAC members provided the following feedback:

  • The format in NESAC was appreciated because there was ample discussion time and conversations did not feel rushed.
  • The NESAC sessions were well-sized and well-timed.
  • Members are interested in using Outlook Calendar invitations with the Forum Virtual Meeting schedule so that Forum members can better remember Forum events.
  • A virtual meeting on EDFacts, held at the midpoint of the 60-day comment period, would be useful. Forum members could meet virtually and work to ensure that the comments raised at the Forum Meeting are included.

Top

Policies, Programs, and Implementation (PPI) Committee Meeting Summary

Monday, July 6, 2015

Morning Session
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) opened the meeting by reporting that PPI Chair John Kraman was no longer a Forum member. As Vice Chair, David accepted the responsibility of moderating this PPI meeting in the Chair’s absence. He welcomed everyone to the Summer 2015 PPI meeting and led the group in introductions. In addition to stating their name and organization, each participant was invited to share information on how Forum work has benefitted them and their organization. It was clear that both new and experienced members are effectively using Forum publications in state education agencies (SEAs) and local education agencies (LEAs) throughout the nation.

David reminded participants that Forum meetings are intended for Forum members and invited guests. Others interested in these topics are encouraged to attend the STATS-DC data conference. It is important for vendors to recognize that by participating in Forum meetings, they may have access to information that could potentially disqualify them from a competitive bidding process at a national, state, or local level.

David then provided a brief overview of the PPI Agenda.

Privacy Kickoff Discussion
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) facilitated an open discussion about privacy issues. PPI focused on the following topics:

  • Sonya Edwards (California Department of Education) opened the discussion with a short update on privacy issues in her agency, including social media, 3rd party contracts, and personal information protection. New or proposed legislation states that:
    • LEAs must inform parents if social media is monitored. If so, it must be for the purpose of student safety.
    • LEAs must include specific provisions for protecting student privacy in all contracts with 3rd parties.
    • Companies are prohibited from using or selling personally identifiable information (PII) for marketing.
Professional development on these topics is provided by both the California Educational Technology Professionals Association and the California County Superintendents Association.
  • Privacy is becoming increasingly important and more complicated in states and districts.
  • Some state laws do not differ from the Family Educational Rights and Privacy Act (FERPA), but they are still needed to formally acknowledge that privacy is important at the state as well as the federal level.
  • New laws being considered in some states could substantially derail work that is currently being undertaken (e.g., Statewide Longitudinal Data Systems [SLDS]).
  • The software industry suggested that there would be “unintended consequences” to various proposed laws that would limit the use of student data for marketing purposes, but no one in PPI could think of a negative effect.
  • A new privacy law in Georgia goes into effect in 2016. The Student Data Privacy, Accessibility, and Transparency Act is very comprehensive and requires a process for updating security practices over time.
    • A lot of precautions that are already common practice will be formalized in policy, especially direction to LEAs about how they deal with vendors.
    • Georgia will appoint a Chief Privacy Officer as explicitly called for in the law.
    • GA plans on gathering exemplary contracts to leverage what is already available in LEAs.
    • GA will also create an incident response system as well as a mechanism for parents to protest when they do not receive access to their student’s data in a timely manner.
  • Transparency is important, but there are concerns that SEAs and LEAs cannot post everything to a website in the name of transparency or it will be viewed as a “hacker’s handbook.”
  • West Virginia had a new law last year as well. It is very similar to new/prospective laws elsewhere. It appears as though states are modeling legislation based on a common template.
  • In some states, advocacy groups are very interested in how data are shared and want parental consent for disclosure of any PII, including to researchers. Other groups feel strongly that no information should be shared with the U.S. Department of Education (USED). SEAs wanted it to be clear that they do not “sell” data to USED in return for funding. PPI thought there was a lot of misinformation that is negatively affecting how SEAs and LEAs conduct business. Could the Forum help to correct the misinformation?
  • In some states, there are very vocal advocacy groups “... Against the Common Core,” even in states that have not adopted the Common Core. When asked what they object to, they often respond with vague or misinformed ideas about schools collecting biometric data, maternal weight gain during pregnancy, etc. Some people incorrectly assume that SEAs collect this type of information and store it in an SLDS.
  • Can the Forum help to correct misinformation and counter the preponderance of “street mythology” by explaining to stakeholders what data are collected, what is done with the data, which parts are reported to USED (all of which are in aggregate form), etc.? Because some audiences do not want to read this information (which is already available in some cases), a YouTube video would be helpful to debunk some of the misinformation. For example, a narrator could say, “Data collected on Joey are used to improve his school experience and are masked or aggregated before being shared... then this happens to it, then that... see...” Some people still may not trust a government video, but at least correct information would be readily available. Additional video suggestions included showing a blank page and saying, “This is the PII we share with the federal government,” and linking directly to federal sites so viewers can see the types and level of detail shared with USED.
  • Ensuring data privacy and providing accurate information on privacy practices are components of data governance.
  • A recent Georgia bill provides a list of reasonable components for such legislation.

Afternoon Session
Topics from the Floor
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) opened the floor for topic suggestions. Member suggestions included discussion around the following topics:

  • There is a need to train teachers to properly use data. The process could be similar to how a researcher is trained to follow institutional review board (IRB) rules. Too often training is not offered until after an incident has occurred.
    • Some PPI members liked the idea of a preparatory program for data privacy and security, but it would need to be maintained over time to reflect ever-changing rules, laws, best practices, etc.
    • Note that some LEAs do not want SEAs to tell them what to do. To build trust between SEAs and LEAs, SEAs can offer assistance rather than issuing requirements or mandates. Similarly, people often appreciate checklists, lists of questions to consider, etc.
    • There are many obstacles to implementing new privacy trainings. Some districts have more than 40 schools, and many teaching staff unfortunately do not have enough time to become educated on privacy.
    • Classroom teachers may not be the only appropriate audience for this type of training. Unfortunately, staff in central offices may draft Google forms asking which kids did “x” (did not pass a test, were disciplined, etc.). Perhaps establishing basic rules about what can and cannot be done would be more useful than new training.
    • Not everyone needs to become an expert on FERPA; they just need to know to not email PII (or another specific behavior).
  • Marilyn suggested the production of a Forum product in coordination with the Privacy Technical Assistance Center (PTAC). PTAC materials are useful, but LEA stakeholders might benefit even more from a handbook. PTAC resources are different from Forum guides; they are not developed by practitioners in SEAs and LEAs for use by other practitioners. Perhaps the Forum and PTAC could collaborate and develop three online courses, one each focused on the local, state, and federal level. The Forum resource should be simple and easily consumable for practitioners, with practical recommendations about what to do or not to do (and the rationale kept separate).
    • There are a lot of resources that help individuals better understand FERPA, but not as many about how to implement it. In some states, PTAC only covers part of the information that is needed for different audiences, which leaves some needs unmet. The Forum can canvass resources to see what is available and then work to fill in the gaps. The goal is not to turn teachers into FERPA experts, but to effectively communicate general principles and specific practices.

Assessment Consortia
Jessica McKinney (USED), Jeffrey Cuff (Partnership for the Assessment of Readiness for College and Careers [PARCC]), and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) joined PPI to provide an update and discussion on the work of the assessment consortia. Jessica began the presentation with a review of the Race to the Top (RTT) pdf file (724 KB) context and RTT summative assessment goals. Jeffrey provided an overview pdf file (718 KB) of PARCC, summative administration, non-summative administration, the PARCC Platform Solution Diagram, student response capability features, interoperability standards, data reporting to states/districts, and technical challenges. Brandt continued with a presentation pdf file (900 KB) that highlighted Smarter Balanced achievements in the past year, the three pillars of SBAC assessments, accessibility features, the variety of compatible devices, test results data that are CEDS and Schools Interoperability Framework Association (SIF) compliant, interoperability standards, and technical challenges.

Privacy Discussion
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) opened the floor to resume PPI discussion of pressing privacy issues. Topics included:

  • Forum members want more data points tied to accountability to focus attention on them.
  • Far too many data reports are organized so that it is easy to get all of the data about one school, or one piece of data about all schools, but they do not offer other approaches to data access and analysis.
  • Accreditation is driving reporting now because Schools of Education at colleges and universities need to know how their graduates are doing in the field of teaching.
  • One SEA reported that the State Department of Environmental Quality asked for student attendance day-by-day to see if air quality correlated with students missing school.
  • A very strict interpretation of some privacy laws has resulted in “over-redacting” data reports. This needs to be resolved because some SEAs are unnecessarily making reports less useful.
  • The effective use of data requires a certain level of skill. To monitor use, however, it helps to have the user’s perspective. As the Forum works to produce a new resource, it may be difficult for Forum members to know how data are being used. The Forum may want to consider including users on working groups or as reviewers.
  • One successful approach for SEAs is to think of districts as clients. For example, we know that high absenteeism before 3rd grade is important, so an SEA can push data out to LEAs that identifies early elementary school students who have missed more than “x” days during the past year. Or, an SEA could say, “We noticed that you suspend minority students at x times the state rate and x times your suspension rate for white students. The Office of Civil Rights (OCR) tells us this is common and we believe you should review this very intentionally.”
  • What are parents (or teachers) doing with individual reports coming out of SBAC or PARCC? Are they using them appropriately, drawing appropriate conclusions, or avoiding inappropriate conclusions? Some teachers are not prepared to assimilate reports so that they can engage in data-driven instruction (which is a high profile concept these days).
  • We can’t assume that parents or teachers are data literate. Data literacy should start as soon as possible in an educator’s career – developing skills across the education school curriculum. Praxis will be integrating data literacy skills, which will increase emphasis in courses. The more we can do to help teacher training in universities, the better. Perhaps the Forum can put together a Forum Guide to Data for Student Teachers.
  • Data are second nature to Forum members, but that skill had to be learned and there is not an easy way to convey that concept. PPI members could not teach a classroom full of kids. Data requires different training. Rather than having a teacher do the data specialist’s job, the data people need to do a better job conveying the message in the data – perhaps this could be a theme for the Forum Data Visualization Working Group.
  • There are school-based data teams that are doing a lot with data. Some of that work is very good, but sometimes outputs reflect limited data use skills, which can result in poor reporting and misinformation.

Tuesday, July 7, 2015

Morning Session
Privacy Technical Assistance Center
Michael Hawes (USED), and Baron Rodriguez (PTAC), joined PPI to follow up on the Privacy general session. PPI members were interested in discussing the following topics with Michael and Baron:

  • Does the Freedom of Information Act (FOIA) apply to SEAs? It is federal in focus, but federal grants and program funds go throughout SEAs, which could put related data in scope.
  • There has been an investment in SLDSs and other systems, which could be used as an argument to retain them or else it becomes a wasted investment. The data in those systems also have value. For example, in some states, there is a law that every LEA needs to have an early warning system. The SLDS can serve this purpose, which provides cost avoidance for LEAs.
  • PTAC will release several new resources by September. These include videos and tri-folds that are easily customizable by SEAs and LEAs (e.g., with space to insert relevant state laws alongside the FERPA summary).
  • Online computer-based training courses would be helpful as a tool to learn more about FERPA.
  • Providing accurate, reliable information is a great way to counter concerns about privacy.

Science, Technology, Engineering, and Math (STEM) Indicators pdf file (707 KB)

Karen King (NSF), Ellen Mandinach (WestEd), and Martin Orland (WestEd) led a presentation on NSF-funded research on STEM indicators. The research is looking at the alignment between indicators that the STEM communities feel are critical and elements available in administrative systems that can inform those indicators. The panel shared what has been found so far and opened a discussion for feedback and recommendations in this area.

Of the 14 STEM Indicators, the following are most connected to SLDSs:

  • Indicator 1 is the number of and enrollment in different types of STEM schools and programs in each district.
  • Indicator 2 measures teachers’ estimates of the amount of time (instructional minutes per week) that they devote to teaching science.
  • Indicator 3 is concerned with the range of in-school but non-classroom science-related learning opportunities that elementary schools may offer, arrange, or help broker.
  • Indicator 6 addresses teachers’ science and mathematics content knowledge for teaching.
  • Indicator 7 is concerned with teachers’ participation in professional development activities targeted for one of the STEM fields or STEM in general.
  • Indicator 8 concerns instructional leaders’ participation in professional development on creating conditions that support STEM learning. Some of these conditions may be specific to STEM, while many are aspects of a positive school climate that supports all kinds of academic learning.

PPI members focused the discussion on the following topics:

  • National data are helpful, but more granular level (LEA) data are especially important to this research given that the goal is to have information that can be used to inform decisionmaking and improve STEM education in schools and districts.
  • STEM is not just coursework. In broad terms, it can include teaching methods, school climate, etc. Karen King noted that related studies are being undertaken as well. For example, there are a range of “STEM schools” – some require applications for admission; some require course prerequisites; and others permit students to just show up. PPI members asked whether or not all of these schools would be considered “STEM schools.”

Karen, Ellen, and Martin discussed how SLDSs may serve as a resource and repository for the data elements needed to address the STEM indicators of success at the state and national levels. They have also analyzed CEDS elements using the Align tool to see which elements states are currently using in this area. Ellen invited Forum members to participate in focus groups to contribute more feedback.

E-NAEP pdf file (697 KB)
William Ward (NCES) provided an update on lessons learned from the pilot technology-based National Assessment of Educational Progress (NAEP). The NAEP program is in the midst of transitioning all of its assessments to digitally based content and delivery. Beginning in 2017, the NAEP mathematics, reading, and writing assessments will be administered to students throughout the nation on NAEP-provided tablets. The first stage began with a pilot test in 2015 for mathematics, reading, and science. A PPI member found the discussion to be very informative with respect to the issues NAEP is considering and the direction it appears to be taking for the future. PPI noted that in ten years, we will likely spend more time talking about how kids “interact” with exams and less about the “administration” of exams.

Afternoon Session
School Climate Surveys (SCLS) pdf file (527 KB)
Isaiah O’Rear (NCES) provided an update on the development of a web-based data collection platform for the administration of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. Isaiah provided an overview of the SCLS background and highlighted the SCLS respondent demographics, administration of the survey, online platform, and reporting of SCLS results. Isaiah led the Q&A discussion to address questions from the Forum about the surveys. Through this discussion, Isaiah offered more information on the following aspects of the survey:

  • Subgroups are self-reported.
  • SCLS researchers considered several different models of school climate, many of which included teaching and learning. The SCLS is not based on a specific model and does not directly address academic issues.
  • LEAs have the ability to assign student IDs randomly or to create a crosswalk that links to their own data sources as a *.csv document.
  • Every respondent receives a PIN upon logging on to the survey. The PIN allows respondents to complete the survey in multiple sittings.
  • The SCLS will be available in October.
  • The SCLS will provide benchmarks based on 250 middle schools and 250 high schools that reflect a nationally representative sample collected by NCES.
  • No statistical tests have been planned. Aggregate school level scores will be available.
  • The SCLS does not have a comment section for respondents. Administrators do, however, have the functionality to create their own custom questions.
  • Each respondent group will have about 70 questions, which take an average of 20 to 25 minutes to complete. NCES is working to reduce cognitive burden.
  • The database lives on the local server of the administrator. The data are not transmitted to NCES.
  • There are no plans to have a post-secondary version of SCLS.
  • Regarding discipline and disciplinary infraction data, the SCLS does not ask students about their individual behavior or experiences. The survey focuses on school environment rather than incident counts. LEAs interested in topics that are not on the survey have the flexibility to add custom questions and response items.
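As an illustration of the crosswalk option mentioned above, a *.csv file linking randomly assigned survey IDs to local student IDs might be generated along the following lines. This is a hypothetical sketch only; the column names, file layout, and ID format are assumptions, not the actual SCLS specification.

```python
import csv
import secrets

def build_crosswalk(local_ids, out_path):
    """Assign each local student ID a distinct random survey ID and save
    the mapping as a CSV crosswalk (hypothetical layout, not the SCLS spec)."""
    used = set()
    rows = []
    for local_id in local_ids:
        # Draw random 8-digit survey IDs until an unused one is found.
        while True:
            survey_id = f"{secrets.randbelow(10**8):08d}"
            if survey_id not in used:
                used.add(survey_id)
                break
        rows.append({"local_student_id": local_id, "survey_id": survey_id})
    with open(out_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["local_student_id", "survey_id"])
        writer.writeheader()
        writer.writerows(rows)
    return rows

# Example: three local IDs each receive a distinct random survey ID.
mapping = build_crosswalk(["S-1001", "S-1002", "S-1003"], "crosswalk.csv")
```

Keeping the crosswalk file local to the LEA preserves the separation between survey responses and identifiable student records.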

PPI members discussed the following topics with Isaiah:

  • There are no limitations on who can use this survey. It is a tool that NCES is making available to anyone who accesses it, with the expectation that the majority of users will be from schools, LEAs, or SEAs.
  • Schools, LEAs and SEAs will be able to customize the SCLS, adding their own custom questions to the surveys.
  • If a user needs additional subgroups, NCES may be able to modify the tool. Isaiah O’Rear is the point of contact.
  • SEAs could administer the survey on behalf of LEAs and then feed the data back to the districts. All data are anonymous by design. There are also suppression rules in the reporting module to minimize the likelihood of accidental release of personally identifiable information (PII). For example, only percentages (not counts) are provided in the reporting module.
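The suppression approach described above (reporting percentages rather than counts, and withholding results for small groups) can be sketched as follows. This is a hypothetical illustration, not the actual SCLS reporting module; the minimum group size of 10 is an assumed threshold.

```python
def report_agreement(responses, min_n=10):
    """Return the percentage of affirmative responses, or None (suppressed)
    when the group is too small to report safely. Hypothetical sketch of a
    suppression rule; the threshold is an assumption, not the SCLS rule."""
    n = len(responses)
    if n < min_n:
        return None  # suppress small cells to reduce re-identification risk
    agree = sum(1 for r in responses if r)
    # Report only a rounded percentage, never the underlying count.
    return round(100 * agree / n, 1)

# A group of 12 respondents is reported; a group of 4 is suppressed.
print(report_agreement([True] * 9 + [False] * 3))   # → 75.0
print(report_agreement([True, False, True, True]))  # → None
```

Returning percentages without counts means a reader cannot reconstruct how many individual students gave a particular answer, which is the intent of the rule described by the presenters.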

Isaiah encouraged PPI members to visit the SCLS website at http://safesupportivelearning.ed.gov/scls to learn more about the survey and to watch the SCLS Administration Guide videos.

Topics from the Floor
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) held an open discussion focused on the topic of data burden.

  • USED collects data, but it is limited by Office of Management and Budget (OMB) review. SEAs do not always have comparable oversight, and burden in some states can fall heavily on LEAs. In Rhode Island, however, a data governance board has been established, and any collection has to go through it. This has limited the number of collections and the number of repetitive requests originating in the SEA.
  • The discussion of burden raises the question: what are the benefits of collecting the data? Data can be useful, which is why they are collected. For example, we know there is a cost to reporting absenteeism, especially in small schools with few problems related to chronic absenteeism, but it might be more beneficial to identify chronic absentees in big high schools with serious problems. Being clear about the purpose of a collection may help get respondents out of compliance mode and into data quality improvement mode. Use case analyses would be helpful for making this point.
  • Some states are collecting fewer elements–through both consolidation and elimination. For example, Idaho is eliminating 151 elements.
  • In some states, LEAs are refusing to provide information (e.g., school violence, which they think is a local issue) unless they are mandated or there is program money attached.
  • North Carolina consolidated Civil Rights Data Collection (CRDC) reporting for its districts, covering all but about a dozen elements, which has been a huge relief for LEAs; Georgia did so as well. The new tool allows SEAs to upload what they have rather than requiring LEAs to re-enter data they had already reported to their SEA.
  • In many ways, EDFacts plays a similar role for USED: a single collection rather than a separate collection for each federal program. Georgia applied this model to decrease LEA burden.

PPI Election
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) was elected PPI Chair and Levette Williams (Georgia Department of Education) was elected PPI Vice Chair for 2015-16.

Civil Rights Data Collection (CRDC) pdf file (527 KB)
Abby Potts (NCES), Ross Santy (NCES), Janis Brown (Office for Civil Rights [OCR]), and Rebecca Fitch (OCR) joined PPI to provide an update on the CRDC. Ross began the update by explaining how the CRDC has evolved over the years to incorporate user feedback. Abby gave a PowerPoint presentation outlining updates to the CRDC and lessons learned from previous years’ collections. The website has been significantly reworked based on feedback from 2013-14 so that it can handle more widespread use and data entry for 2015-16, and it needs to be compatible with the demands of complex student information systems, particularly in large districts. The collection was scheduled to open in early April but had to shut down because it was not performing acceptably from a technical perspective; it officially opened on April 30. The pre-population and file upload feature on the website has been successful with eight states. OCR is working to keep the FAQs and tip sheets on the website updated as more questions come in. There is a need to post specific resources on school expenditures so that people from finance offices in each state can learn how to use the site.

PPI members discussed the following topics with the presenters:

  • PPI members asked if it would be possible for the system to generate documentation showing which EDFacts files/elements are the source of an “error” message that LEAs receive when they submit data that do not match SEA EDFacts data. Members also suggested that the system provide a summary of current errors, which presumably could be generated more quickly than the full error report, which takes a long time and sometimes provides unnecessary information. PPI also suggested including a screen shot so it is clear where the incongruence originates and LEAs can resolve the problem.
  • The proposed November-February submission window is very much preferable to this year’s window during testing season.
  • Presenters noted that CRDC mapped its questions to CEDS elements and this information can be shared with vendors.
  • CRDC has received a lot of feedback and support from the Forum and they are grateful–acknowledging how valuable Forum input was for the improvements people are now seeing in the collection.
  • The 2015-16 package should look very familiar to respondents because OCR tried to minimize changes from the 2013-14 collection. Closing date for comments is August 3.

Wednesday, July 8, 2015

Morning Session
EDFacts
Challis Breithaupt, Barbara Timm, and Kelly Worthington (NCES) provided an update on EDFacts and the School Year 2015-16, 2016-17, and 2017-18 OMB Information Package. The OMB package is the official registration of the EDFacts data collection with the Office of Management and Budget (OMB). Every three years EDFacts has to go through a clearance process that includes two public review cycles, during which anyone can comment on the items being collected through EDFacts. Changes to EDFacts are proposed by the ED program offices that steward the data. The few new or changed items being proposed are: directory definitional changes, direct certification, chronic absenteeism, a homeless category in the adjusted cohort graduation rate, kindergarten entry assessments (where applicable), and scale score means and standard deviations.

The package has since been posted for public comment: http://www.regulations.gov/#!documentDetail;D=ED-2015-ICCD-0090-0001. Comments are due on September 8. This first public comment period lasts 60 days, after which ED will revise the package, respond to the comments, and repost it for a 30-day comment period. After that period, a final package is presented to OMB for approval. Forum members should review and respond to the directed questions in Appendix D.

Regional Educational Laboratory (REL) Program pdf file (689 KB)

Joy Lesnick and Lisa Sanders (National Center for Education Evaluation and Regional Assistance, USED) led a presentation on the REL Program. They outlined REL product types, Institute of Education Sciences (IES) publications, infographics, and logic model maker software. Joy also provided an overview of the REL Program Research Alliances: groups of practitioners and/or policymakers who work with researchers to use data and research to understand and address a specific education problem. The 79 REL Research Alliances are diverse in topical focus, structure, membership, and geography.

Joy led the Q&A portion of the presentation and provided the following information:

  • REL Publications can be found at http://ies.ed.gov/ncee/edlabs/projects/index.asp.
  • The Ask A REL tool provides annotated bibliographies to users who submit an education research question. Turnaround time may vary because RELs search databases, literature, and libraries to provide accurate information, especially for highly specific questions. Ask A REL serves as technical support and a service to each region. Some RELs publish all of their Ask A REL questions.
  • Currently, infographics are located in multiple locations. Forum members would benefit more from the REL website if all of the infographics were in the same place or easier to find.

PPI members asked if there is any way to decrease the amount of time it takes to conduct a study. Joy agreed that there is a tension between quality and time for approval and peer review. The RELs believe that a relatively new approach to “just in time” projects will streamline efforts.

Privacy Discussion Wrap-up
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) opened the floor to briefly summarize and review the general session on education data privacy. PPI noted the following:

  • What can an SEA do to help an LEA with data governance? It is helpful if LEAs ask for support so that SEA staff can then respond to needs rather than appear to direct efforts. It may just be a perception issue, but it matters.
  • The Forum should coordinate with the Education Information Management Advisory Consortium (EIMAC) privacy working group.
  • What is the timeline of the various federal privacy bills referenced earlier in the meeting? If a federal privacy bill is added as an amendment to the Elementary and Secondary Education Act (ESEA) it could happen very soon.

Steering Committee Business/Report
PPI Vice Chair David Weinberger (Yonkers Public Schools [NY]) reported that the Forum Steering Committee approved two new working groups: data visualization and privacy. Members who are interested in participating should contact Ghedam Bairu (NCES). The Steering Committee is also looking at how the Forum (and its committees and working groups) can continue to communicate effectively between in-person meetings.

Meeting Review and Next Steps
At the next in-person or virtual meeting, PPI would like to consider the following issues:

  • New members noted that it was not clear how the three committees differ. Experienced members explained that, on the surface, the topics are the same, but the focus changes and there are differences in what is presented and discussed. For example, PPI focused on privacy at this meeting and the other committees did not. We have some flexibility to further distinguish PPI if that is what the group wishes to do as we plan each meeting.
  • Early Learning and Early Childhood Development: There are many challenges in this sector. For example, how does early learning connect to SLDS conversations? Additionally, SEA data teams are not always clear about definitions and measures in the field of early childhood. Connecting to Head Start is a beginning, but independent schools, parochial schools, and similar providers are difficult to communicate with given that they are managed by different organizations. Yonkers Public Schools (NY) has PreK participation-graduation rate data suggesting that PreK experience is an important factor for graduation. Perhaps the Forum could make links to the U.S. Department of Health and Human Services and Head Start so that Forum members can further explore these data issues.
  • Vermont took a new approach to education legislation by focusing on skills instead of grade levels. PPI would like to hear more about the shift to skills-based learning because workforce legislation (the Workforce Innovation Act) has shifted to skill development as well. What kids are learning (versus their grades) is increasingly important to stakeholders. Members would be interested in other innovations going on at the state level.
  • SLDS implementation: Simply having an SLDS does not make it valuable. How will SLDSs be used? How can we support and encourage SLDS use by parents?
  • A wealth of knowledge is available in the PPI meeting. By sharing more (SEA-SEA and SEA-LEA), we will help members work smarter rather than harder.
  • Future Ready is rolling out stories (videos) about how various educational issues are unfolding. Maybe the Forum could be a similar venue for SEAs, LEAs, and schools to articulate how their stakeholders are using data as well (e.g., via videos, which are understandable by more stakeholders).

Top

Technology (TECH) Committee Meeting Summary

Monday, July 6, 2015

Morning Session
Welcome and Introductions
TECH Chair Mike Hopkins (Rochester School Department [NH]) welcomed everyone to the Summer 2015 TECH meeting and led the group in introductions. Mike reminded participants that Forum meetings are working meetings designed for Forum members only. Others interested in the work of the Forum are encouraged to attend the STATS-DC Data Conference immediately following the Forum Meeting.

Virtual Education Working Group Update
Laurel Krsek (San Ramon Valley Unified School District [CA]) updated TECH on the Virtual Education Working Group and the development of a new Forum Guide to Virtual Education Data. The Forum convened the Virtual Education Working Group to review and revise the Forum Guide to Elementary/Secondary Virtual Education (2006). The virtual education environment has grown and changed since 2006, and includes vastly different technologies and approaches to teaching and learning. At the same time, new developments in the field of data standards such as the Common Education Data Standards (CEDS) and School Courses for the Exchange of Data (SCED) have made it easier for state and local education agencies to collect, manage, compare, and use education data to inform and improve education. After reviewing the 2006 document, the Working Group began development of a new resource that will assist state and local education agencies as they

  • consider the impact of virtual education on established data elements and methods of data collection; and
  • address the scope of changes, the rapid pace of new technology development, and the proliferation of resources in virtual education.

Much of the structure and content of the old guide will remain in the update. However, the Working Group has added information to address the challenges currently facing SEAs and LEAs, such as establishing data governance and determining LEA/SEA responsibilities in situations where the virtual provider is out-of-district or out-of-state. One topic that has sparked quite a bit of discussion is the challenge of collecting data when information relevant to virtual education does not fit established reporting. For example, many systems track course credit based on “seat time,” which is problematic when collecting data on virtual courses that award credit based on competency measures.

Very few data elements are used solely for virtual education. However, a number of data elements that often exist in traditional data systems are particularly useful for collecting virtual education data. For example, CEDS includes elements that can be used to identify organizational responsibilities, such as the “Responsible Organization Identifier,” as well as elements that can be used to identify achievement criteria for competency-based education. The Working Group has highlighted these elements in the document along with other elements that are well suited to collecting virtual education data.

The Working Group has also found that virtual education regulatory structures vary; some states have state endorsements for online programs, while others do not. The Forum Guide includes an appendix of state links to information on virtual education. Forum members are encouraged to send input for the appendix to Ghedam Bairu (NCES). The Working Group expects to release this new resource later this year.

Consortium on School Networking (CoSN) Annual Meeting Report pdf file (669 KB)

Tom Purwin (Jersey City Public Schools [NJ]) provided an update from his attendance at the CoSN Annual Meeting. It was Tom’s first time at this conference and he found the sessions very impressive and noted that there were many teachers in attendance. Tom attended sessions on e-rate, management systems, global trends, social media, badges, privacy, and data quality.

School Courses for the Exchange of Data (SCED) Working Group
Susan Williams (Virginia Department of Education) provided an overview and update on SCED Working Group activities. The Forum established a SCED Working Group to regularly review and update SCED with the assistance of subject matter experts at the local, state, and national levels. Since the last Forum meeting, the Working Group released SCED 3.0, which focused on new and updated standardized courses, including International Baccalaureate and Project Lead the Way; new and updated Family and Consumer Sciences courses; and new courses from the National Center for Education Statistics (NCES) Transcript Studies. NCES also supported mapping SCED to the Classification of Secondary School Courses (CSSC) and implemented SCED as the course coding system for NCES Transcript Studies. NCES recently supported the development of the SCED Finder (http://nces.ed.gov/SCEDfinder/home). This new tool helps users select predefined SCED codes and then assign elements and attributes to courses. The SCED Finder makes it easier for LEAs to map locally adopted codes and to identify available SCED codes. SCED Version 4.0 will include a comprehensive update. The Working Group needs assistance and review from Forum members, especially with identifying new codes needed for SEA and LEA uses of SCED.

Afternoon Session
Assessment Consortia
Jessica McKinney (USED), Jeffrey Cuff (Partnership for the Assessment of Readiness for College and Careers [PARCC]), and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) joined TECH to provide an update and discussion on the work of the assessment consortia. Jessica began the presentation with a review of the Race to the Top (RTT) pdf file (724 KB) context and RTT summative assessment goals. Jeffrey provided an overview pdf file (718 KB) of PARCC, summative administration, non-summative administration, the PARCC Platform Solution Diagram, student response capability features, interoperability standards, data reporting to states/districts, and technical challenges. Brandt continued with a presentation pdf file (900 KB) that highlighted Smarter Balanced achievements in the past year, the three pillars of SBAC assessments, accessibility features, the variety of compatible devices, test results data that are CEDS and Schools Interoperability Framework Association (SIF) compliant, interoperability standards, and technical challenges.

Topics from the Floor: Committee Discussions
TECH committee members broke into LEA and SEA groups to discuss data visualization, including what problems it solves, who the stakeholders are, what some solutions might be, and what a Forum resource could provide. Both groups reported back and compiled a list of suggestions for the Forum Data Visualization Working Group.

Earlier in the day, TECH members reviewed topics of interest in addition to data visualization, including

  • sustainability of SLDSs;
  • privacy and security;
  • how SEAs can reduce the burden on LEAs;
  • data quality;
  • personalized learning;
  • managing central data systems; and
  • the movement of data systems toward customer systems.

Tuesday, July 7, 2015

Morning Session
Science, Technology, Engineering, and Math (STEM) Indicators pdf file (707 KB)

Karen King (National Science Foundation [NSF]), Ellen Mandinach (WestEd), and Martin Orland (WestEd) led a presentation on NSF-funded research on STEM indicators. The research examines the alignment between indicators that the STEM communities feel are critical and elements available in administrative systems that can inform those indicators. The panel shared what has been found so far and opened a discussion for feedback and recommendations in this area.

Of the 14 STEM Indicators, the following are most connected to SLDSs:

  • Indicator 1 is the number of and enrollment in different types of STEM schools and programs in each district.
  • Indicator 2 measures teachers’ estimates of the amount of time (instructional minutes per week) that they devote to teaching science.
  • Indicator 3 is concerned with the range of in-school but non-classroom science-related learning opportunities that elementary schools may offer, arrange, or help broker.
  • Indicator 6 addresses teachers’ science and mathematics content knowledge for teaching.
  • Indicator 7 is concerned with teachers’ participation in professional development activities targeted for one of the STEM fields or STEM in general.
  • Indicator 8 concerns instructional leaders’ participation in professional development on creating conditions that support STEM learning. Some of these conditions may be specific to STEM, while many are aspects of a positive school climate that supports all kinds of academic learning.

Karen, Ellen, and Martin discussed whether SLDSs may be a resource and repository for the needed data elements to address the STEM indicators of success at the state and national levels. They have also done an analysis of CEDS elements using the Align tool to see what elements states are currently using in this area.

Civil Rights Data Collection (CRDC) pdf file (527 KB)
Abby Potts (NCES), Ross Santy (NCES), Janis Brown (Office for Civil Rights [OCR]), and Rebecca Fitch (OCR) joined TECH to provide an update on the CRDC. Ross began the update by explaining how the CRDC has evolved over the years to incorporate user feedback. Abby gave a PowerPoint presentation outlining updates to the CRDC and lessons learned from previous years’ collections. The website has been significantly reworked based on feedback from 2013-14 so that it can handle more widespread use and data entry for 2015-16, and it needs to be compatible with the demands of complex student information systems, particularly in large districts. The collection was scheduled to open in early April but had to shut down because it was not performing acceptably from a technical perspective; it officially opened on April 30. The pre-population and file upload feature on the website has been successful with eight states. OCR is working to keep the FAQs and tip sheets on the website updated as more questions come in. There is a need to post specific resources on school expenditures so that people from finance offices in each state can learn how to use the site.

Afternoon Session

E-NAEP pdf file (697 KB)
William Ward (Assessments Division: National Assessment Branch, NCES) provided an update on lessons learned from the pilot technology-based National Assessment of Educational Progress (NAEP). The NAEP program is in the midst of transitioning all assessments to digitally based content and delivery. Beginning in 2017, the NAEP mathematics, reading, and writing assessments will be administered to students throughout the nation on NAEP-provided tablets. The first stage began with a pilot test in 2015 for mathematics, reading, and science.

School Climate Surveys (SCLS) pdf file (527 KB)
Isaiah O’Rear (NCES) joined TECH to provide an update on the development of a web-based data collection platform for the administration of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. Isaiah provided an overview of the SCLS background and highlighted the SCLS respondent demographics, administration of the survey, online platform, and reporting of SCLS results. Isaiah led the Q&A discussion to address questions from the Forum about the surveys. Through this discussion, Isaiah offered more information on the following aspects of the survey:

  • Subgroups are self-reported.
  • SCLS researchers considered several different models for school climate, many of which included teaching and learning. The SCLS is not based on a specific model and does not directly address academic issues.
  • LEAs can assign student IDs randomly or create a crosswalk, stored as a .csv file, that links survey IDs to their own data sources.
  • Every respondent receives a PIN upon logging on to the survey. The PIN allows respondents to complete the survey in multiple sittings.
  • The SCLS will be available in October.
  • The SCLS will provide benchmarks based on 250 middle schools and 250 high schools that reflect a nationally representative sample collected by NCES.
  • No statistical tests have been planned. Aggregate school level scores will be available.
  • The SCLS does not have a comment section for respondents. Administrators do, however, have the functionality to create their own custom questions.
  • Each respondent group will have about 70 questions, which take an average of 20 to 25 minutes to complete. NCES is working to reduce the cognitive burden of the survey.
  • The database resides on the administrator’s local server. The data are not transmitted to NCES.
  • There are no plans to have a post-secondary version of SCLS.
  • Regarding discipline and disciplinary infraction data, the SCLS does not ask students about their individual behavior or experiences. The survey focuses on the school environment, not incident counts. If LEAs are interested in adding their own items, there is flexibility to add custom questions and response options for topics not covered by the survey.
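The crosswalk option noted in the list above can be sketched as follows. This is a hypothetical illustration: the file layout, column names, and ID format are assumptions for the sketch, not SCLS specifications.

```python
# Hypothetical sketch of an LEA crosswalk: randomly assigned survey IDs are
# linked to local student IDs in a .csv file that the LEA retains locally,
# so survey responses stay de-identified while the LEA can still connect
# results back to its own data sources.
import csv
import io
import random

def build_crosswalk(student_ids, seed=None):
    """Assign each local student ID a unique random survey ID and return
    the mapping as CSV text (the file the LEA would keep locally)."""
    rng = random.Random(seed)
    # sample() guarantees the assumed six-digit survey IDs are unique
    survey_ids = rng.sample(range(100000, 1000000), len(student_ids))
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["survey_id", "local_student_id"])
    for survey_id, local_id in zip(survey_ids, student_ids):
        writer.writerow([survey_id, local_id])
    return buf.getvalue()

csv_text = build_crosswalk(["S001", "S002", "S003"], seed=42)
rows = list(csv.reader(io.StringIO(csv_text)))
print(rows[0])  # -> ['survey_id', 'local_student_id']
```

Because only the random survey IDs travel with the responses, the survey data alone cannot identify a student; re-identification requires the locally held crosswalk file.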

TECH committee members asked about how this is being communicated to states and LEAs, how members can see the survey, and how NCES is dealing with any misconception of the survey items. Isaiah encouraged Forum members to visit the SCLS website at http://safesupportivelearning.ed.gov/scls to learn more about the survey and to watch the SCLS Administration Guide videos.

Education Data Privacy Follow-up Discussion
TECH Chair Michael Hopkins (Rochester School Department [NH]) led a discussion on data privacy and ideas for a Forum working group on privacy. Discussion topics included SEAs hiring privacy officers, submission of small cells, state privacy legislation, and parental notification. Suggestions for the Forum resource included checklists for schools and LEAs, information for teachers using apps, and providing examples.

TECH Election
Dean Folkers (Nebraska Department of Education) was elected TECH Chair for 2015-16 and James Hawbaker (Appleton Area School District [WI]) was elected TECH Vice Chair for 2015-16.

Wednesday, July 8, 2015

Morning Session
Regional Educational Laboratory (REL) Program pdf file (689 KB)

Joy Lesnick and Lisa Sanders (National Center for Education Evaluation and Regional Assistance, USED) led a presentation on the REL Program. They outlined REL product types, Institute of Education Sciences (IES) publications, infographics, and logic model maker software. Joy also provided an overview of the REL Program Research Alliances: groups of practitioners and/or policymakers who work with researchers to use data and research to understand and address a specific education problem. The 79 REL Research Alliances are diverse in topical focus, structure, membership, and geography.

Joy led the Q&A portion of the presentation and provided the following information:

  • REL Publications can be found at http://ies.ed.gov/ncee/edlabs/projects/index.asp.
  • The Ask A REL tool provides annotated bibliographies to users who submit an education research question. Turnaround time may vary because RELs search databases, literature, and libraries to provide accurate information, especially for highly specific questions. Ask A REL serves as technical support and a service to each region. Some RELs publish all of their Ask A REL questions.
  • Currently, infographics are located in multiple locations. Forum members would benefit more from the REL website if all of the infographics were in the same place or easier to find.

Virtual Meetings: Reflections and Suggestions for Moving Forward
TECH Chair Michael Hopkins (Rochester School Department [NH]) led a discussion among TECH members regarding the virtual meetings held over the past year and how to move forward with these types of meetings. Many TECH members attended virtual meetings over the past year and found them helpful and high quality. Suggestions for the future included scheduling meetings for the entire year in advance so they can be put on calendars, and recording meetings and posting materials on Forum360. Suggested topics for virtual meetings over the next year included FERPA, action on working group topics (i.e., having working groups come to the committees virtually and ask for assistance on trouble spots), communications, ESEA, identity management and operations, and eNAEP.

EDFacts Update
Challis Breithaupt, Barbara Timm, and Kelly Worthington (NCES) provided an update on EDFacts and the School Year 2015-16, 2016-17, and 2017-18 OMB Information Package. The OMB package is the official registration of the EDFacts data collection with the Office of Management and Budget (OMB). Every three years EDFacts has to go through a clearance process that includes two public review cycles, during which anyone can comment on the items being collected through EDFacts. Changes to EDFacts are proposed by the ED program offices that steward the data. The few new or changed items being proposed include: directory definitional changes, direct certification, chronic absenteeism, a homeless category in the adjusted cohort graduation rate, kindergarten entry assessments (where applicable), and scale score means and standard deviations.

The package has since been posted for public comment: http://www.regulations.gov/#!documentDetail;D=ED-2015-ICCD-0090-0001. Comments are due on September 8. This first public comment period lasts 60 days, after which ED will revise the package, respond to the comments, and repost it for a 30-day comment period. After that period, a final package is presented to OMB for approval. Forum members should review and respond to the directed questions in Appendix D. TECH members had questions on LEA involvement in the review process, chronic absenteeism, and the definition of school.

Meeting Review/Planning Next Steps
TECH Chair Michael Hopkins (Rochester School Department [NH]) led a discussion to review the meeting and plan the next in-person meeting. Suggestions for the next in-person meeting included having actionable activities for TECH members, more panels from states, more small group discussions with directed questions, and deeper working group updates. Members also wanted to reorganize the room to be like the PPI and NESAC rooms.




Top

 

Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.