
Summer 2019 Forum Meeting Notes

National Forum on Education Statistics
July 22–July 24, 2019
Washington, DC

Forum Opening Session
Forum Welcome
Joint Session: Report Card Panel Discussion
Joint Session: Update on Student Data Privacy
Joint Session: Update on Cybersecurity
Forum Closing Session
Steering Committee
National Education Statistics Agenda Committee (NESAC)
Policies, Programs, and Implementation (PPI) Committee
Technology (TECH) Committee





Forum Opening Session

Monday, July 22, 2019

 

Forum Agenda Review

Forum Chair Allen Miedema (Northshore School District [WA]) welcomed Forum members to the Summer 2019 Forum Meeting in Washington, DC, and thanked them for their time, work, and commitment over the past year.

Allen highlighted four recent Forum accomplishments: the publication of the Forum Guide to Early Warning Systems and the completion of three additional resources, the Forum Guide to Personalized Learning Data, the Forum Guide to Technology Management in Education, and the Forum Guide to Planning for, Collecting, and Managing Data About Students Displaced by a Crisis. He also recognized the Forum's new working groups on the topics of data governance and exit codes, the release of the School Courses for the Exchange of Data (SCED) Version 6, and the creation of a new online course on data visualization.

Allen introduced Dr. James “Lynn” Woodworth, Commissioner of the National Center for Education Statistics (NCES).


Welcome and NCES Remarks

Dr. Woodworth, the commissioner of NCES, thanked Forum members for their attendance at the meeting. He also commended Forum members for all their work to improve education data. He summarized the major activities of NCES and provided an overview of NCES plans and activities, with a focus on information that state and local education agencies (SEAs and LEAs) may find most useful:

  • Strong response rates are critically important to ensuring that NCES data are representative and of high quality. The commissioner emphasized the importance of SEA and LEA participation in NCES data collections and encouraged Forum members to respond to data collections and surveys.
  • Recent NCES releases include Student Reports of Bullying: Results From the 2017 School Crime Supplement to the National Crime Victimization Survey; Baccalaureate and Beyond (B&B:16/17): A First Look at the Employment and Educational Experiences of College Graduates, 1 Year Later; Mapping State Proficiency Standards Onto National Assessment of Educational Progress (NAEP) Scales; and Characteristics of Private Schools in the United States: Results From the 2017‑18 Private School Universe Survey.
  • Forthcoming releases include reports on the School Survey on Crime and Safety (SSOCS) and the High School Longitudinal Study of 2009 (HSLS:09). Forthcoming data collections include the NAEP Grades 4 and 8 Mathematics and Reading assessments, the International Computer and Information Literacy Study (ICILS), the Programme for International Student Assessment (PISA), the SSOCS, and the Private School Universe Survey (PSS).
  • The commissioner highlighted the NCES Education Demographic and Geographic Estimates (EDGE) Program, which uses data from the U.S. Census Bureau's American Community Survey (ACS) to create custom indicators of social, economic, and housing conditions for school‑age children and their parents. It also uses spatial data collected by NCES and the Census Bureau to create geographic locale indicators, school point locations, school district boundaries, and other types of educational geography to support spatial analysis.
  • NCES continues its work with the Census Bureau on improving measures of socioeconomic status (SES) using ACS data, with the goal of developing location‑based SES estimates. NCES also continues its work on analyzing process data generated by NAEP digital assessments, including measuring time on task, changes in responses, matching processes, use of accommodations, and demographic analyses.

Forum members were interested in learning more about NCES's work on improved measures of SES and engaged Commissioner Woodworth in a discussion that focused on potential changes to how the Census Bureau collects race and ethnicity data, the continued use of National School Lunch Program (NSLP) data, the cost and accuracy of implementing an improved SES measure, and the timeliness of Census Bureau data.

Recognition of Completed Projects

Ghedam Bairu, Forum Project Director, and Dr. Woodworth presented certificates to recognize the contributions of the members of the working groups that developed the Forum Guide to Personalized Learning Data, the Forum Guide to Technology Management in Education, the Forum Guide to Planning for, Collecting, and Managing Data About Students Displaced by a Crisis, and the Forum's Data Visualization Online Course.


Joint Session: Report Card Panel Discussion

S2019_Report_Cards.pdf (44 KB)

Monday, July 22, 2019

Patrick Rooney (U.S. Department of Education [ED]) introduced a panel presentation on state, district, and school report cards. Report cards are an integral tool that shows how a state and its districts and schools are doing with regard to student achievement and success. They provide comparable, transparent information that enables insight into school performance and helps people make informed decisions. Patrick highlighted two resources developed by ED to help education agencies and parents better understand the purpose of state and local report cards: A Parent Guide to State and Local Report Cards and Opportunities and Responsibilities for State and Local Report Cards (DRAFT, March 2019).

Following Patrick's introduction, DeDe Conner (Kentucky Department of Education), Lee Rabbitt (Pawtucket School Department [RI]), and Linda Roska (Texas Education Agency [TEA]) discussed their states' approaches to report cards.

DeDe summarized how Kentucky engaged stakeholders and collected their feedback throughout the state's report card development process, with the key takeaway being the need to make data easily accessible for stakeholders through a parent‑friendly design. Kentucky local education agency (LEA) participation was vital throughout the process, and state‑local collaboration informed changes to the report card design, helped set priorities, and ensured data quality. The Kentucky School Report Card (http://kyschoolreportcard.com) provides information on each school and district in the state using a variety of metrics and indicators, including academic performance (aggregate, disaggregate, and comparative), educational opportunity, transition readiness, accountability, safety, and financial transparency. Almost all data included in the report card are already collected and available at the state level through the statewide student information system (SIS) and Financial/Human Resources (HR) system, which helps ease the reporting burden for Kentucky LEAs. The state education agency (SEA) implemented a new data collection to include per‑pupil expenditure (PPE) data, school fund allocation, total spending, and funding sources in the Kentucky School Report Card. The SEA and LEAs share responsibility for disseminating report card data to stakeholders, with over 300,000 users accessing the School Report Card Suite student/parent portal monthly. Kentucky faced several challenges in making its report card website accessible and including Civil Rights Data Collection (CRDC) data. Successes include making its report card simpler and visually digestible and reducing the research request burden.

Lee reviewed Rhode Island's new report cards, available on the Rhode Island Department of Education website (https://reportcard.ride.ri.gov/). Stakeholder feedback was used as input on the report card design, with the goal of making the report cards easier for teachers, parents, and community members to understand. Rhode Island report cards use a simple School Star Rating Performance Level system that awards one- to five-star ratings based on metrics and indicators that score achievement and growth in English language arts and math, English language proficiency, absenteeism and suspension, graduation rates, number of low-performing subgroups, and new metrics, including teacher absenteeism. Most of the data included in Rhode Island's district and school report cards are already collected from LEAs by the SEA. PPEs are calculated using an established measure of district financial data, with some expenses excluded. Lee emphasized that communication is essential: in addition to media announcements, report cards are disseminated by the SEA to LEAs, which then share the report cards with their stakeholders using a variety of methods. Moving forward, Rhode Island will be making refinements to its report cards in response to stakeholder feedback, including clarifying growth calculations and better publicizing the data that determine star ratings.

Linda discussed how Texas' report cards have developed in recent years. TEA staff worked with stakeholders to internally redesign the state's report cards and determine how the new federal report card would complement the TEA's state and local report cards. Through an analysis of requirements and consultation with stakeholders and peer agencies, it was determined that Texas report cards met most requirements. Data stewards throughout the agency helped by preparing needed data sets for the report cards. The TEA did not add any new indicators to its federal report card, but will include new indicators on kindergarten readiness and post‑secondary persistence and completion in its state and local report cards. Starting in 2018‑19, Texas report cards will include PPE data drawn from an existing collection system. The TEA publishes the federal report card on its website (https://tea.texas.gov/frc.aspx), which includes the ability to search for and access report cards for schools and LEAs. LEA Title I, Part A, funding recipients are responsible for disseminating the state, LEA, and school‑level report cards. Looking ahead, the TEA will work on producing Spanish language translations of its report cards and address questions regarding masking CRDC data.

Forum members engaged the panelists in a discussion on report cards that focused on

  • aligning school‑year data and fiscal year data in a single report card;
  • helping stakeholders, particularly parents and community members, understand and interpret PPE data;
  • strategies for providing contextual and interpretive information to explain disparities, such as how cost of living expenses relate to PPE;
  • processes and timelines for correcting and republishing incorrect data;
  • the advantages and disadvantages of using report cards as a centralized data source; and
  • data validation and verification processes.


Joint Session: Update on Student Data Privacy

S2019_Privacy.pdf (485 KB)

Tuesday, July 23, 2019

Frank E. Miller, Jr., and Tracy Koumare (U.S. Department of Education [ED]) provided Forum members with an update on federal student privacy. Frank shared that ED recently established the Student Privacy Policy Office (SPPO) to fulfill the responsibilities previously held by the Student Privacy Policy and Assistance Division and the Family Policy Compliance Office. One of the goals of this new office is to reduce the backlog of Family Educational Rights and Privacy Act (FERPA) complaints; thus far, the SPPO has halved the number of complaints and begun providing regular updates to all complainants. In addition to formal complaint investigation, SPPO now provides resolution assistance and intermediation. Other strategic SPPO activities include a review of local education agency (LEA) websites to determine the transparency of education data practices and the release of reports on improving school safety.

Frank then reviewed the purpose and requirements of FERPA, which is a federal privacy law that regulates the access, amendment, and disclosure of student education records. Next, Tracy discussed ensuring student data privacy when using education technology, which has become increasingly prevalent in the classroom. She noted that sharing data with a vendor (including technology vendors) requires specific consent under FERPA, with two exceptions. The Directory Information exception allows for the sharing of information in a student record that would not generally be considered harmful or invasive if disclosed, such as a student's name. The School Official exception allows for the disclosure of personally identifiable information (PII) to school officials (e.g., teachers, principals, and board members) for legitimate educational purposes. The School Official exception may also be extended to third parties (e.g., volunteers and vendors) if school official duties are outsourced through district agreements that conform with FERPA. Tracy emphasized the importance of establishing policies for reviewing terms of service agreements and click-wrap agreements for FERPA compliance and shared the Privacy Technical Assistance Center (PTAC) resource Protecting Student Privacy While Using Online Educational Services: Model Terms of Service.

Frank and Tracy also discussed data destruction and encouraged members to review the recent PTAC publication Best Practices for Data Destruction. Data destruction is the process of removing information in a way that renders it unreadable or irretrievable. Secure, FERPA-compliant destruction may be accomplished through clearing, purging, or destroying data. Furthermore, both the Studies exception and the Audit or Evaluation exception to FERPA require written agreements that set the terms for data destruction. Frank concluded the presentation by encouraging members to contact the SPPO's PTAC for technical assistance and support.

Forum members were interested in learning more about how to balance the need for data deletion and data backups, how to monitor third‑party use and the destruction of disaggregated or de‑identified data, working with researchers to fulfill research requests (including assessing the legitimate educational interest of research requests), and how state laws can impact the education technology practices of vendors.


Joint Session: Update on Cybersecurity

S2019_Cybersecurity.pdf (130 MB)

Tuesday, July 23, 2019

Steven Hernandez, the chief information security officer of the U.S. Department of Education (ED), discussed cybersecurity in state and local education agencies (SEAs and LEAs). He began with an overview of the value of data, as determined by who has the data and what they can be used for, providing illustrative examples of the potential value of personal computers, email, and organizational assets. Email remains a top target, and Steven recommended using multifactor authentication to protect all business and personal accounts and devices. Education information and education technologies can provide digital access to student personally identifiable information (PII), which is considered very valuable to attackers for future use, as evidenced by recent cyber attacks against school districts and other educational institutions.

Steven provided practical countermeasures that SEAs and LEAs can take to protect against multiple types of cybersecurity threats. Ransomware is a type of malware that can compromise a computer, but it can be counteracted through preventive measures that ensure business continuity, including regularly backing up data, securing data backups, and conducting annual penetration tests and vulnerability assessments. Specific actions can also help minimize the spread of ransomware through an infected network. Caution should be taken when considering whether to pay a ransom following an attack. Business email compromise is a targeted scam that seeks to secure financial payment or the transfer of valuable information, such as tax information. To help protect against business email compromise, Steven recommended being mindful of phone and email communications regarding financial payments and valuable information and reaching out to authorities as soon as a fraudulent transaction has been discovered.

Steven also shared best practice information that SEAs and LEAs can use to secure their data and information against cyber attacks. It is important to be proactive and take appropriate action before an intrusion occurs, such as by creating an action plan, having appropriate technology and services in place, and engaging with law enforcement. By building security in from the start, SEAs and LEAs can leverage technologies and systems to help prevent cybersecurity threats. Steven concluded by highlighting federal resources available through ED, the Federal Bureau of Investigation, the U.S. Department of Justice, the National Security Agency, the U.S. Department of Homeland Security, and the General Services Administration, including the FedRAMP training program.

Forum members engaged Steven in a discussion that addressed counteracting phishing threats, how the age of systems can impact security, minimizing security vulnerabilities in single sign‑on systems, and the FedRAMP marketspace.


Forum Closing Session

Wednesday, July 24, 2019

Forum Resources Panel Discussion

S2019_Forum_Resources.pdf (201 MB)

Six Forum working group chairs delivered a panel presentation on the Forum's new and forthcoming resources. Georgia Hughes‑Webb (West Virginia Department of Education) introduced the panel and reported that the Forum has had a very productive year. This year the Forum developed several new resources that address timely topics in the education data community:

  • Georgia began the presentation by highlighting the Forum Data Visualization Online Course. The Data Visualization Online Course Project Group developed this new resource to introduce the concept of data visualization. The online course describes how to apply key data visualization principles and practices to education data and explains how the data visualization process can be implemented to support effective data analysis and communication throughout an education agency. It builds upon the Forum Guide to Data Visualization: A Resource for Education Agencies, published in 2016.
  • Georgia also shared an update on the Data Governance Working Group, which is currently developing a new resource that will identify best practices for developing, implementing, and maintaining an effective data governance plan. This resource will highlight examples from education agencies that have implemented effective data governance programs, discuss common challenges and solutions in data governance, and provide links to existing data governance resources.
  • Marilyn King (Bozeman School District #7 [MT]) shared an update on the Exit Codes Working Group, which is developing a new resource that will document best practices for tracking students who transfer, complete high school, drop out of school, or otherwise exit from an education agency. This resource will update the information included in the 2006 publication Accounting for Every Student: A Taxonomy for Standard Student Exit Codes and include a revised taxonomy, case studies from education agencies, and best practices for collecting and maintaining exit code data.
  • Jan Petro (Colorado Department of Education) discussed the forthcoming Forum Guide to Planning for, Collecting, and Managing Data About Students Displaced by a Crisis. The Crisis Data Management Working Group developed this new resource to provide best practice information for collecting and managing data about students who have temporarily or permanently enrolled in another educational setting because of a crisis. This resource updates the information included in the 2010 publication Crisis Data Management: A Forum Guide to Collecting and Managing Data about Displaced Students; highlights best practices that education agencies can adopt before, during, and after a crisis; and features case studies and real‑world examples from agencies that have either experienced a crisis that displaced students or received students who were displaced by a crisis.
  • Dean Folkers (Nebraska Department of Education) reviewed the Forum Guide to Personalized Learning Data. The Personalized Learning Data Working Group developed this new resource to help education agencies as they consider whether and how to use personalized learning. The resource includes an overview of the topic and best practices on collecting and using data for personalized learning. It also includes case studies from districts and states that have implemented personalized learning to illustrate how data are used in different locations depending on their approach to personalized learning.
  • Steve Smith (Cambridge Public Schools [MA]) provided an overview of the Forum Guide to Technology Management in Education. The Education Technology Working Group developed this new resource to assist education agency staff with understanding and applying best practices for selecting and implementing technology to support teaching and learning in the classroom. It addresses the widespread use and integration of technology in modern education systems and focuses on technology governance and planning, including needs assessments, as well as technology implementation, integration, maintenance, support, training, privacy, security, and evaluation.
  • Susan Williams (Virginia Department of Education) concluded the presentation with an update on the School Courses for the Exchange of Data (SCED). SCED provides voluntary, common course codes for prior‑to‑secondary and secondary school courses that can be used to compare course information, maintain longitudinal data about student coursework, and efficiently exchange coursetaking records. The SCED Working Group has published SCED Version 6, updated the SCED master list and SCED Finder, and published two case studies on Iowa’s and Virginia’s use of SCED course codes.

The speakers concluded their presentation by encouraging Forum members to visit the Forum website, share Forum resources with their colleagues, and attend the Forum's presentation at the 2019 NCES STATS‑DC Conference.

Standing Committee Progress Reports

Recognition of Forum Officers

The Forum presented certificates to recognize the contributions of the Forum officers. Marilyn Seastrom, chief statistician of NCES, commended the Forum on its work.

Forum Election

Forum Chair Allen Miedema (Northshore School District [WA]) presented the slate of proposed 2019‑20 officers for a vote. The slate was seconded, and then the Forum voted to approve the following members as 2019‑20 officers:

Chair: Dean Folkers, Nebraska Department of Education
Vice Chair: Marilyn King, Bozeman School District #7 (MT)
NESAC Chair: Cheryl L. VanNoy, Saint Louis Public Schools (MO)
NESAC Vice Chair: Gunes Kaplan, Nevada Department of Education
PPI Chair: Bradley McMillen, Wake County Public School System (NC)
PPI Vice Chair: Linda Jenkins, Arkansas Department of Education
TECH Chair: DeDe Conner, Kentucky Department of Education
TECH Vice Chair: Dawn Gessel, Putnam County Schools (WV)

Forum Chair Allen Miedema (Northshore School District [WA]) thanked NESAC Chair Laura Boudreaux (Louisiana Department of Education), PPI Chair Charlotte Ellis (Maine Department of Education), and TECH Chair Ken Hutchins (Brandywine School District [DE]) for their service.

Closing Remarks

2019‑20 Forum Chair Dean Folkers (Nebraska Department of Education) thanked Allen Miedema (Northshore School District [WA]) for his leadership of the Forum. Dean encouraged members to attend the six Forum presentations occurring at the NCES STATS–DC Data Conference. Dean also reminded Forum members to complete the evaluation forms.


Steering Committee

Monday, July 22, 2019

Welcome and Review of Monday's Events

Forum Chair Allen Miedema (Northshore School District [WA]) welcomed the Steering Committee to the meeting and invited members to share their thoughts and comments on the day's events.

  • The reformatted new member orientation was well received, and the small group discussions went well. Members liked the changes made to the format and appreciated having time for discussions between new members and standing committee leaders.
  • Members appreciated the Welcome Address and National Center for Education Statistics (NCES) Update by Commissioner James "Lynn" Woodworth.
  • The Report Card Panel generated good conversations during the joint session and followup discussions in the Forum standing committees.
  • Approximately 50 Forum members were in attendance at the National Education Statistics Agenda Committee (NESAC), with many new members staying for both the morning and afternoon sessions. Approximately 25 Forum members, including some local education agency (LEA) representatives who moved from other committees, were in attendance at the Policies, Programs, and Implementation (PPI) Committee. Approximately 30 Forum members were in attendance at the Technology (TECH) Committee.
  • Members noted that there was unexpected downtime between the joint sessions and standing committee meetings. While this downtime can slow momentum, members noted that it also provided additional time for informal conversations among Forum members who wanted to network and meet with each other.

Other Issues
During the Report Card Panel Discussion and follow-up discussions in standing committees, some members noted concerns about Civil Rights Data Collection (CRDC) data, including a concern about using CRDC data in report cards because they are not the most current data available in the state. Members discussed CRDC data quality and began compiling ideas and suggestions for NCES and the Office for Civil Rights (OCR) on ways to improve the collection.

Tuesday, July 23, 2019
Review of Tuesday's Events
Steering Committee members discussed the joint sessions on student data privacy and cybersecurity, time spent in standing committees, and the new member meeting.

  • The joint session on student data privacy provided helpful information, including some recent updates that may be new to Forum members on improvements to the effectiveness and efficiency of Family Educational Rights and Privacy Act (FERPA) enforcement.
  • Steven Hernandez's (U.S. Department of Education [ED]) joint session presentation on cybersecurity was well received and provided practical information. Members noted that Steven has also presented at regional meetings of state education agencies (SEAs) and LEAs.
  • NESAC held several in‑depth group discussions on topics of relevance to members, which provided many opportunities for new member participation.
  • PPI invited several speakers and scheduled time for discussions, but noted that there were too many topics scheduled in too short a time without breaks, which resulted in the meeting going over schedule. The chairs suggested that next year's agenda include fewer topics and more breaks so members have adequate time for discussion. PPI members suggested that a working group on the topic of data destruction (including end‑of‑lifecycle issues and third‑party agreements with service providers) would be beneficial.
  • TECH members devoted time to reviewing and discussing the TECH white paper on cybersecurity. The committee plans to continue its discussions and work on updating the paper this fall. The white paper can be either posted to the TECH Forum360 website (a password‑protected virtual space that is only accessible to registered TECH members) or published on the Forum website (which requires review and vote by the full Forum and review by NCES prior to publication). If members determine that the paper is a valuable resource that will help education agencies, they would be interested in pursuing formal publication to the Forum website.
  • The new member meeting was a new addition to the Forum meeting agenda and provided new Forum members an opportunity to reconvene with their mentors, the Steering Committee, and the Communications Subcommittee. The Steering Committee noted that the meeting went well and briefly discussed strategies to further improve the new member experience:
    • Encourage new members to form a "network" and continue to connect with each other between meetings.
    • Provide new members with a list of all Forum members and their job titles.
    • Match Forum mentors and mentees based on their job titles and responsibilities.

Forum Elections
Standing committee chairs reported the results of their elections. Nominated chairs and vice chairs for the 2019‑20 year were as follows:

  • NESAC Chair: Cheryl L. VanNoy, Saint Louis Public Schools (MO)
  • NESAC Vice Chair: Gunes Kaplan, Nevada Department of Education
  • PPI Chair: Bradley McMillen, Wake County Public School System (NC)
  • PPI Vice Chair: Linda Jenkins, Arkansas Department of Education
  • TECH Chair: DeDe Conner, Kentucky Department of Education
  • TECH Vice Chair: Dawn Gessel, Putnam County Schools (WV)

The Steering Committee nominated Dean Folkers (Nebraska Department of Education) as the Forum Chair and Marilyn King (Bozeman School District #7 [MT]) as the Forum Vice Chair for 2019‑20.

Forum Chair Allen Miedema (Northshore School District [WA]) thanked NESAC Chair Laura Boudreaux (Louisiana Department of Education), PPI Chair Charlotte Ellis (Maine Department of Education), and TECH Chair Ken Hutchins (Brandywine School District [DE]) for their service.

Other Issues
The Steering Committee noted the following:

  • At times, it was difficult to hear some of the joint session presenters due to poor sound quality.
  • The Forum sends letters to chief state school officers (CSSOs) to recognize member participation in the Forum and to alert CSSOs when Forum representative seats are vacant. The Forum does not report on member absenteeism at Forum meetings. Members suggested that vacancy letters be sent when CSSO turnover occurs, given that interim appointments may be for a short period of time.
  • This year, the Navajo Nation Department of Diné Education (NN DODE) joined the Forum as an Associate member.

Ross Santy (NCES) was invited to the Steering Committee meeting for members to continue the previous day's CRDC discussion. Ross noted that OCR and NCES would be open to discussing concerns about the CRDC with the Forum. These conversations could include offices within ED, such as OCR, NCES, and the Office of State Support.

Wednesday, July 24, 2019

Welcome to New Steering Committee Chairs
Newly elected Chair Dean Folkers (Nebraska Department of Education) welcomed new Steering Committee members to the meeting.

Review of Wednesday's Events
Steering Committee members reviewed the time spent in standing committees.

  • The standing committees identified several topics of interest for future consideration, discussion, webinars, or Forum work:
    • Program evaluation and monitoring (NESAC)
    • Cohort graduation rates (NESAC)
    • Teacher absenteeism (PPI)
    • The Protection of Pupil Rights Amendment (PPRA) (PPI)
    • Data standards and comparability (PPI)
    • Cybersecurity white paper (TECH)
    • One‑page privacy checklist (TECH)
  • NESAC held a robust discussion on the topic of program evaluation and monitoring. Ghedam offered to reach out to the National Center for Education Evaluation and Regional Assistance (NCEE) for more information on resources available to SEAs and LEAs, which the Forum will use to determine whether a working group on this topic is warranted.
  • PPI discussed the Common Education Data Standards (CEDS) and Generate, then held roundtable discussions on chronic absenteeism.
  • TECH spent much of the morning continuing its discussions on the TECH white paper. The committee also heard reports on the Forum's participation in the 2019 Consortium for School Networking (CoSN) meeting and the Postsecondary Electronic Standards Council (PESC) Spring 2019 Data Summit.

Other Issues
Steering Committee members discussed strategies for improving Forum operations and engagement:

  • Rather than focusing on formal speaker presentations, PPI members are interested in ways to incorporate more networking opportunities and breakout discussions into their standing committee time. The NESAC and TECH chairs shared that their committees have incorporated more time for breakout sessions and roundtable discussions based on member suggestions, and suggested that PPI could try a similar approach. While small-group discussions are useful, members noted that conversations can be hindered if a handful of participants dominate. Steering Committee members reiterated the important role of chairs in facilitating discussion and encouraging member participation.
  • The standing committees could use member job titles and responsibilities to form discussion groups.
  • PPI had more LEA participants this year than in previous years.
  • If possible, it would be helpful to track simple metrics on Forum resources through Google Analytics, such as how often documents are downloaded from the Forum website.
  • While it is not possible for the Forum to host or partner with a sponsor on after-hours activities, members can share their after-hours plans with one another and are welcome to organize pay-your-own-way activities and meals.
  • The Steering Committee suggested that a new subcommittee, composed of Steering Committee members, be convened to focus primarily on evaluating the Forum meeting structure. This subcommittee could potentially address
    • improving standing committee discussions;
    • reformatting the Forum meeting;
    • analyzing and using Forum member information;
    • identifying Forum resource metrics; and
    • revisiting the Forum Strategic Plan (https://nces.ed.gov/forum/nfestrat.asp).
      Ghedam Bairu (NCES) noted that the Communications Subcommittee was charged with reformatting the new member orientation last year, and could reconvene to discuss methods for improving standing committee discussions and reformatting the Forum meeting.

Steering Committee conference calls will resume in September 2019. They will be scheduled on the third Friday of each month at 12:00 pm (Eastern). The committee will discuss the potential formation of a new subcommittee on its September call.

Top

National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, July 22, 2019

Morning Session

NESAC Committee Kickoff
NESAC Chair Laura Boudreaux (Louisiana Department of Education) and NESAC Vice Chair Cheryl VanNoy (St. Louis Public Schools [MO]) introduced themselves and welcomed members to the meeting. Laura informed members about the National Center for Education Statistics (NCES) Distance Learning Dataset Training (DLDT) system. This resource is an online, interactive tool that allows users to learn about and access NCES data across the education spectrum and evaluate it for suitability for specific research purposes. It is designed to introduce users to many NCES datasets, their design, and special considerations for analysis to facilitate effective use. Laura and Cheryl then led an introductory activity during which members shared information about themselves, their work, and their experiences with the Forum. Laura briefly reviewed major activities and discussions from the 2018 NESAC meeting and reviewed the 2019 meeting agenda.

Afternoon Session

Data Use in State and Local Education Agencies (SEAs and LEAs)

NESAC Group Discussion: Joint Session Follow‑up on Report Cards
NESAC Chair Laura Boudreaux (Louisiana Department of Education) led a follow‑up discussion on topics discussed during the joint session presentation on report cards. Members discussed the following topics:

  • It can be difficult to come to a consensus on the amount of data that stakeholders, such as parents, need in a report card and other reports. Several SEAs noted that they have not only designed their report cards to communicate the data that are required for compliance measures, but also tailored reports to the needs of different stakeholders. Other SEAs are in the process of making their report card websites more user friendly.
  • In addition to report cards, SEAs are developing different online tools and reports to help stakeholders understand agency data.
  • SEAs are taking different approaches to reporting civil rights data and per pupil expenditures.
  • Some states have more stringent suppression rules than those required for the Civil Rights Data Collection (CRDC) and must run their state suppression rules on all data shared in report cards, including CRDC data.
  • Efforts to streamline reporting by linking state report card data and accountability data are hindered by different data definitions. SEA staff must often explain to stakeholders why they have different data that refer to the same concept, such as graduation rates.

SEA/LEA Breakouts: Enterprise Data Systems
NESAC members divided into two groups (one for SEAs and one for LEAs) to compare their agencies' uses of enterprise data systems. SEA topics of discussion included the following:

  • Many SEAs rely on contractors to implement data systems because SEAs often cannot afford the salaries of full‑time developers. Outsourcing these jobs also helps to reduce the effects of staff turnover.
  • SEAs that build solutions in‑house are responsible for documentation and staffing, but find that they have more control over the design of the system.
  • Data governance is a key component when implementing, managing, and using data systems. In some cases, poor data governance has allowed old systems to be used for purposes beyond their original intent, which can impact the quality of data.
  • It is critical for information technology staff to partner with data users.
  • It can be useful to consider solutions that are outside the SEA's standard portfolio. If a tool outside the portfolio will benefit the agency, staff should build a business case for its use.

LEA discussion topics included the following:

  • With the use of cloud‑based technologies, LEAs no longer need extensive technical staff to manage servers. Instead, LEAs need technical staff with skills related to promoting interoperability and the management of application programming interfaces (APIs).
  • While some agencies have privacy concerns about cloud‑based computing, data stored in the cloud are often more secure than data sitting in a room at an LEA office.
  • It is often less expensive to purchase vendor support for systems than to hire additional LEA technical staff. In addition, vendors can offer around‑the‑clock support, which many LEAs cannot. However, LEAs that use vendors find that it is hard to customize solutions.

NESAC Group Discussion: Using Available Data
NESAC Chair Laura Boudreaux (Louisiana Department of Education) and NESAC Vice Chair Cheryl VanNoy (St. Louis Public Schools [MO]) led a discussion on how agencies are using available data, ways to improve data use, and examples of effective data use. The discussion focused on graduation data, alternative assessment data, diploma types, and data sharing. NESAC members shared the following thoughts:

  • Reporting requirements that only take into account four‑year cohort graduation rates obscure the meaningful work in LEAs to graduate students who may need five or more years. Some SEAs report the federal four‑year cohort graduation rate but also track other rates, such as five‑, six‑, and seven‑year graduation rates, for SEA and LEA use.
  • Louisiana introduced a transitional ninth grade to provide additional supports to students. The policy guidance states that "Students placed in transitional 9th grade are considered 8th graders for accountability purposes and are not included in the high school graduation cohort during their first year on the high school campus." (https://www.louisianabelieves.com/docs/default‑source/policy/2018‑2019‑transitional9th‑policy‑guidance‑(002).pdf?sfvrsn=10).
  • Members were interested in knowing whether the U.S. Department of Education (ED) has done any research comparing outcomes for students who graduate in four years with those who graduate in five years.
  • Graduation rates that focus on years, such as the four-year graduation rate, do not take into account the number of credits earned. The number of credits required for graduation would be a more accurate measure for comparison among states.
  • LEAs monitor the number of students taking alternate assessments to ensure that the number does not exceed 1 percent of LEA enrollment. Some SEAs are able to assist LEAs by reporting data on students who might be eligible to take alternate assessments back to the LEAs prior to the testing window.
  • Some states offer alternative diplomas, while others do not. Among states that do, there are different policies regarding whether recipients of alternate diplomas are counted as graduates.
  • In some states, waivers are available that exempt students from some graduation requirements, while still allowing them the opportunity to earn a regular diploma.
  • NESAC members were interested in future discussions around data governance for multiagency data. For example, it would be useful to discuss data governance when a student is concurrently enrolled in two K‑12 schools, or when a student is dually enrolled in a high school and a postsecondary institution.
  • Schools operated by tribal governments are subject to different regulations that impact the ability to share data with SEAs and LEAs.

Tuesday, July 23, 2019

Morning Session

Joint Session Follow‑up: Student Data Privacy
NESAC Group Discussion
NESAC Chair Laura Boudreaux (Louisiana Department of Education) welcomed members back to NESAC and invited members to discuss topics raised during the student data privacy presentation from Frank E. Miller, Jr., and Tracy Koumare (U.S. Department of Education, Student Privacy Policy Office). NESAC members discussed the following topics:

  • Some SEAs and LEAs are working together to create standardized SEA privacy agreements for vendors. Developing these agreements encourages SEA and LEA staff to think critically about what is important to their agencies in terms of privacy.
  • In addition to strong privacy policies, it is crucial to instruct staff and students on principles of good digital citizenship.
  • One downside of the expansion of bring-your-own-device policies is that agencies can no longer know which programs and software students have loaded on their devices or what data those programs and software collect.
  • States have enacted different privacy laws. Examples include laws specifying that student data cannot leave the boundaries of the state (although there may be exemptions for cloud solutions), and laws preventing the SEA from having any personally identifiable information (PII). In cases where the SEA cannot have access to PII, vendors are contracted to strip PII from data that are shared with the state, and vendors or other outside agencies such as university researchers must be contracted to perform research on behalf of the SEA.
  • Parents may opt out of providing directory information if they believe that information may be misused. Some LEAs have established different levels of directory information opt‑outs. For example, one level may give parents the option to restrict the LEA from sharing directory data with outside organizations, but allow it to be shared with the parent‑teacher association (PTA).

Social‑Emotional Data
NESAC Panel Discussion
Isaiah O'Rear (NCES) provided an update on ED's School Climate Surveys (EDSCLS) (https://safesupportivelearning.ed.gov/edscls). The surveys are designed to allow SEAs, LEAs, and schools to collect and act on reliable, nationally validated school climate data in real time. Isaiah discussed the history of the project, including the development of the instruments, cognitive testing, the pilot survey, and benchmarking efforts. Recent work has focused on survey administration, including tracking differences between refusals and nonresponses and improving exportability. The project added a technical assistance guide, and staff perform continual security updates. ED's Office of Safe and Healthy Students is responsible for updates to the survey platform.

Following Isaiah's presentation, Ellis Ott (Fairbanks North Star Borough School District [AK]) and Alvin Larson (Meriden Board of Education [CT]) discussed their agencies' use of school climate surveys. For more information on Fairbanks North Star Borough School District's school climate surveys, see https://www.k12northstar.org/Page/7759; for more information on Meriden Board of Education's surveys, see https://www.meridenk12.org/departments/research-and-evaluation/.

Afternoon Session

Joint Session Follow‑up: Cybersecurity
NESAC Group Discussion
NESAC Chair Laura Boudreaux (Louisiana Department of Education) led a follow‑up discussion about cybersecurity following the presentation from Steven Hernandez (ED). NESAC members shared their experiences in responding to cybersecurity threats, and offered the following thoughts:

  • While multifactor authentication is a good security practice, it can be difficult to implement in agencies that do not offer their staff agency cell phones.
  • Agencies have identified several practices that have been helpful in addressing cybersecurity concerns, including
    • requiring vendors to have cybersecurity insurance;
    • conducting security audits within the SEA or LEA;
    • adding requirements for security audits in vendor contracts; and
    • working with an attorney to establish standard, agency‑wide security language for vendor contracts.
  • NESAC members discussed the benefits of cybersecurity training for staff. Training is especially effective when it is tied to the results of security audits. Some agencies create their own social engineering (spearphishing) attacks within the agency to test staff and identify areas for additional training. However, this should be done carefully to ensure that it does not make staff wary of legitimate emails or result in shaming staff.

Communication and Reporting between ED, SEAs, and LEAs

NESAC Group Discussion
NESAC members used this time to address questions members submitted prior to the Forum.

What strategies have agencies found helpful for promoting SEA and LEA communication on issues such as data quality checks and revisions, and what methods are effective for improving feedback loops and communication routes to resolve data issues?

  • Members noted that the Forum Guide to Building a Culture of Quality Data: A School and District Resource can help agencies answer these questions.
  • Members shared the following best practices from LEAs:
    • Publish daily error reports on dashboards to help staff improve data quality.
    • Follow up with principals periodically (for example, monthly) to ensure that data errors are corrected.
    • Engage networks of superintendents in discussions of data quality issues.
  • Members shared the following best practices from SEAs:
    • Run internal checks on data, compare new data with old data, and do manual spot checks to help resolve errors.
    • Offer test periods prior to submission periods that allow LEAs to submit their data for review.
    • Provide data stewards to assist LEAs with data and ensure that all data staff have access to ongoing training, such as in‑person state data conferences and monthly webinars.
    • Offer incentives, such as certificates, to recognize LEAs that show a commitment to data quality.
    • Consider compiling all SEA reporting information for LEAs into one weekly newsletter to ensure that the SEA is not overburdening LEAs.

What systems and processes do SEAs and LEAs use to inform their staff about data reporting and accountability?

  • SEAs and LEAs use the following approaches to keep their staff informed:
    • Provide data literacy training within the agency, including demonstrations on how to answer questions using information from the report card and examples of information contained within the accountability system.
    • Ensure that the SEA publishes an accountability calendar that is shared with LEAs and that is tied in with other calendars. Note that if the SEA cannot meet the deadlines in the accountability calendar, it can cause ill will with LEAs.
    • Offer SEA-sponsored data training "roadshows," which are particularly useful to rural LEAs.

What are effective practices for communicating information to different stakeholders, such as agency staff or parents?

  • Whit Johnstone (Irving Independent School District [TX]) demonstrated Texas's school report card website (www.texasschools.org). This user‑friendly website helps to make accountability data accessible to a wide range of stakeholders. Whit noted that many stakeholders do not want to look at tables with numbers, and it is often more effective to break accountability data down in different ways. The Texas Schools website allows users to view data by school, district, or campus, and it further divides data by type, such as student achievement data, school progress data, and data that focus on closing achievement gaps between different subpopulations of students.

Discussion of Federal Data in States and Districts with EDFacts Staff (S2019_EDFacts)
Barbara Timm (ED) and Beth Young (EDFacts Support Team) joined the committee to begin discussions on enrollment policies and pre-kindergarten enrollment. Both issues emerged as areas of interest from the last EDFacts Office of Management and Budget (OMB) collection package. These items were proposed and then dropped from the package based on the public response, and OMB asked for more information from states and districts on these topics for the future. The committee shared their thoughts on both topics, and some of the main points were as follows:

Enrollment Policies:

  • Some states have one policy that applies to all districts/schools in the state.
  • Many states have policies in legislation.
  • Some states have different policies for different districts or schools (charter or magnet), meaning that the districts set their own enrollment policies and that these can change.
  • States with different policies are not likely to keep that information in their data systems and would need to start collecting it if it were required.
  • Policies are not always cut and dried; for example, an open enrollment policy may apply only to certain populations, or may close once an enrollment threshold is met.

PreK Enrollment (in publicly funded PreK programs):

  • States have the number of PreK students enrolled in public schools and report those students as part of their EDFacts files.
  • Some states have PreK student data systems that include students in nonpublic schools (i.e., those not reported as part of EDFacts).
  • The PreK programs not in public schools are mostly managed by other state agencies.
  • Most states have surveys for incoming kindergarten students that ask about their preschool experience. However, these surveys include only those students who attend public elementary schools and are prone to parent response error.
  • Terms needed to be clarified as part of this conversation, such as public (i.e., does public funding include any public dollars from any federal agency?) and enrollment (i.e., what is the frequency and length of the PreK program: full time, part time, split time, or something else?).

NESAC Committee Business
Topics From the Floor
NESAC Chair Laura Boudreaux (Louisiana Department of Education) opened the floor for NESAC members to discuss topics of interest that were not addressed during planned conversations. Members discussed the following topics:

  • Identifying schools in need of comprehensive support and improvement (CSI) and targeted support and improvement (TSI). The requirement to rank schools means that even schools that make substantial improvements may consistently get a low ranking. Rankings do not show whether a state is improving. Some states are moving away from thinking of percentile rankings in a punitive manner and instead are thinking about collaborative ways to do targeted technical assistance.
  • Effective methods to demonstrate the need for additional fiscal and personnel resources. One strategy is to partner with financial staff to forecast different scenarios. Another is to quantify the amount of money LEAs save by investing in data quality. Agencies sometimes find that it's effective to make clear precisely what they are, and are not, able to accomplish with existing resources. If it's not possible to decline requests or projects due to a lack of support, then it's important to communicate accurate timelines.

NESAC Election
Cheryl VanNoy (St. Louis Public Schools [MO]) was nominated as the 2019‑20 NESAC chair, and Gunes Kaplan (Nevada Department of Education) was nominated as the 2019‑20 NESAC vice chair.

Wednesday, July 24, 2019

Morning Session

Program Monitoring and Evaluation
SEA/LEA Breakouts
NESAC Chair Laura Boudreaux (Louisiana Department of Education) and Vice Chair Cheryl VanNoy (St. Louis Public Schools [MO]) welcomed members back to NESAC and asked members to once again divide into two groups (one for SEAs and one for LEAs) to discuss the topic of program evaluation. SEAs discussed the benefits of providing data back to LEAs and program teams, but noted that it can be difficult for research teams who are experienced in data use to partner with program area staff who are not experienced. It can be helpful to offer data visualizations and other ways of explaining the data, as well as offering data skills training for program staff. The burden on SEAs of performing program evaluations can be reduced by partnering with outside agencies, such as universities. Regional Educational Laboratories (RELs) may also be able to assist with this topic.

LEAs discussed implementation plans and identified the following crucial questions:

  • Does your LEA have a program evaluation implementation plan?
  • Is it a formal plan?
  • Did you engage in goal setting as part of planning?
  • Was the plan implemented with fidelity? What measures did you use?
  • Did you provide enough time and resources to determine whether the program is effective?
  • After you collect evaluation data, what do you do with it?

LEAs noted that it is crucial to talk about what data will be collected prior to implementing a program, and then to track and monitor data throughout the program. Early warning system data can be very useful for program evaluations. For example, it can be useful to look at at‑risk students who are succeeding, see what programs they were involved in, and then interview them. LEAs also discussed how data are only one part of determining program longevity; stakeholder support for programs is also an important consideration. An overall goal of program evaluation at the LEA level is to ensure that agencies are getting actionable information that can be used to promote student achievement.

NESAC members suggested that program evaluation is a potential Forum working group topic. Ghedam Bairu (NCES) offered to reach out to the National Center for Education Evaluation and Regional Assistance (NCEE) for more information on resources available to SEAs and LEAs.

NESAC Committee Business
Steering Committee Business/Report
NESAC Chair Laura Boudreaux (Louisiana Department of Education) reported that the Steering Committee discussed topics of interest across the Forum's three standing committees.

Meeting Review/Future Planning
NESAC Chair Laura Boudreaux (Louisiana Department of Education) led the committee in discussing plans and ideas for future NESAC meetings. NESAC members shared the following suggestions and topics for future meeting plans:

  • Interim and summative assessments. Are these two separate assessments or should they be linked?
  • Methods for collecting data on economically disadvantaged status and foster care status.
  • Members are also interested in knowing more about how foster care data are used by ED.
  • Methods for collecting data on military-connected status. State definitions at times differ from Department of Defense (DOD) definitions. A member suggested that it would be useful to have a Forum session with representatives from ED and DOD to discuss how these data are being used.
  • Reporting per pupil expenditures and other financial reporting. What agencies are going above and beyond reporting requirements for these data?
  • Interagency data sharing. Some SEAs are being asked and are concerned about sharing data with other state agencies, such as state departments of children and families.
  • Data standards, including a session focused on School Courses for the Exchange of Data (SCED).
  • Additional small roundtable discussions.
  • American Indian education issues. How can SEAs and LEAs work with tribal schools? What data can be shared?
  • Health data in schools. What data are tracked? Do these data reside in the nurse's office? What privacy and security issues are tied specifically to health data? Members were also interested in the possibility of a session from the Centers for Disease Control (CDC) on the Youth Risk Behavior Survey (YRBS). Are YRBS data useful to SEAs and LEAs? If so, how can agencies access these data?
  • How to comment on the Federal Register, and why commenting is important.

Top

Policies, Programs, and Implementation (PPI) Committee Meeting Summary

Monday, July 22, 2019

Morning Session

PPI Committee Kickoff
Welcome and Introductions
PPI Chair Charlotte Ellis (Maine Department of Education) and Vice Chair Bradley McMillen (Wake County Public School System [NC]) introduced themselves and welcomed members to the meeting. Charlotte kicked off the meeting by sharing information about the National Center for Education Statistics (NCES) Distance Learning Dataset Training (DLDT) system. This resource is an online, interactive tool that allows users to learn about and access NCES data across the education spectrum and evaluate it for suitability for specific research purposes. It is designed to introduce users to many NCES datasets, their design, and special considerations for analysis to facilitate effective use. PPI members then engaged in an interactive session to get to know each other and initiate conversations between members that would support PPI deliberations throughout the meeting.

Summer 2018 Meeting Review and Agenda Review
PPI Chair Charlotte Ellis (Maine Department of Education) briefly reviewed major activities and discussions from the Summer 2018 PPI Meeting, including a focus on state‑local education agency (SEA‑LEA) collaboration, cybersecurity and data privacy, and Forum projects and resources, as well as speakers from EDFacts, NCES, and the U.S. Department of Education (ED).

Charlotte then reviewed the agenda for the Summer 2019 PPI meeting, which was designed to reflect member suggestions gathered throughout the meeting planning process. On Monday afternoon, PPI planned to discuss the Forum Opening Session, report cards, indicators of military connectedness, and data governance. On Tuesday, topics would include student data privacy, data retention and destruction, federal data in states and districts with EDFacts staff, social-emotional learning with a panel of federal and LEA staff, and SEA and LEA report cards. On Wednesday morning, PPI expected to hear from members of the Common Education Data Standards (CEDS) team to discuss CEDS and G3-Generate, followed by a discussion about chronic absenteeism data and topics from the floor, before concluding with committee business.

NCES Opening Session Follow-Up
During the Forum Opening Session, Dr. James "Lynn" Woodworth, commissioner of NCES, shared an overview of activities planned and underway at NCES. PPI members found the presentation engaging and informative and made the following comments:

  • Commissioner Woodworth's description of NCES's work with the Census Bureau to develop location‑based socioeconomic status (SES) data was of particular interest to PPI members.
  • Location-based SES is important because the use of National School Lunch Program (NSLP) data continues to be problematic. For example, there are data quality and undercounting concerns because students do not self-identify as they get older. There are also strict limits on analytical use because the NSLP is not an education collection. Several agencies said that they were evaluating whether they needed to devise their own measure but would wait to see what NCES develops, noting that aggregate data are helpful, but student-level analysis is also needed (though not currently allowable).
  • One SEA has at least 10 years of SES data on disadvantaged students from its LEAs. Unfortunately, LEAs are permitted to define SES any way they want for the collection (e.g., Temporary Assistance for Needy Families [TANF] eligibility) as long as they do not use NSLP, which means the data cannot be aggregated and are not comparable from district to district.
  • Some districts use Community Eligibility Provision (CEP), which requires income surveys. The LEA has social workers go door to door to collect eligibility data, which has been effective for getting information.
  • A lot of vendors are trying to capture homelessness, foster care, etc., as a proxy for SES disadvantaged status. One PPI member accessed NCES's Education Demographic and Geographic Estimates (EDGE) (https://nces.ed.gov/programs/edge/) to evaluate whether it matched that member's LEA's records. The match was quite strong, but, admittedly, the person represented a high‑poverty district, so it probably was not representative nationally. Nonetheless, the member thought that the methodology was sound, and it was an ambitious plan to generate the data at the individual student level.
  • Some SEAs are concerned about collecting and using address data. For charter schools, one SEA collects addresses to validate incoming data but does not store them.
  • LEAs are interested in knowing how many of their students are going to other districts. They could also use that type of data for mapping school district boundaries and studying the effects of boundary changes.
  • One SEA is looking at intergenerational poverty. These data are maintained very securely and are not shared with anyone because of the sensitive nature of such analysis.
  • Another SEA said that it is heavily local control and, therefore, only collects data that are mandated by the federal or state government; thus, it does not collect student addresses.
  • Another concern is cyber charter schools with students from any location in the state. For example, one large city does not require students to go to the nearest school, so geospatial assumptions about students are not necessarily valid.
  • One SEA collects student addresses for transportation planning at the state level (but not for NSLP purposes).
  • Economic disadvantage can affect state funding. One SEA plans to apply for an NCES Statewide Longitudinal Data Systems (SLDS) grant to pilot collection of address data in selected LEAs.
  • Some states collect some of these data through the Federal Perkins Loan Program.
  • A final concern about location data was raised: some students have a mailing address that is not the same as the residence address, which needs to be considered in order to minimize the effect on data quality.

Other aspects of the discussion included the following:

  • Commissioner Woodworth reported that a panel was convened to consider whether the Census Bureau should create a Middle Eastern or North African race category, but, given the concerns and complexity involved, there is no reason to expect a change any time soon.
  • Potential changes to race and ethnicity categories can cause concern among data system managers, even while many agree that segmenting race and ethnicity is problematic.

Afternoon Session

Joint Session Follow‑Up: Report Cards
PPI Chair Charlotte Ellis (Maine Department of Education) introduced Lee Rabbit (Pawtucket School Department [RI]) and Tyler Backus (Maine Department of Education) to lead a PPI discussion about report cards.

  • Lee opened the conversation by noting that her LEA's financial data were a year behind the accountability data and were in the form of a uniform chart of accounts, which is good for accounting but not reader friendly for nonexperts. Reorganizing the data for report card presentation really helped nonexpert readers.
  • Her report card uses a five‑star system, which helped readers focus on specifics. For example, one of her LEA's one‑star schools seemed to be a fairly affluent, successful school, but it earned the one star because of the growth measure. Explaining growth was enlightening to the community and helped the LEA better understand the issue as well.
  • Having a narrative section of a report card is key for explaining the data. For example, stating that an LEA funded a 1:1 computing initiative last year is necessary to identify why there is an unusually high technology expense that year.
  • Many SEAs close data reporting at LEA sign‑off; there is no option to change things at a later date. SEAs argued that LEAs need to pay attention to their data before signing off on them. Otherwise, someone looks at the data seriously, perhaps for the first time, only after publication and then asks for the release to be put on hold so the data can be revised. SEAs might change data if their audit catches something, but not because someone who already signed off reviews the data again.
  • As data become "more public" via report cards, agencies are putting more energy into understanding data definitions, formulas, reporting, etc.
  • Members were curious about what states are doing with Civil Rights Data Collection (CRDC) data. Some members expressed concerns about the quality of CRDC data and, therefore, about whether to use those data rather than augment them with local data on report cards. Using an SEA's own data can be attractive from a data quality perspective, but doing so is not compliant with report card requirements. Agencies can't simply point to an external data source and tell users to search for the data themselves.
  • One state already collects more than 60 percent of CRDC data from districts for its own use. The state prepopulates the report card with these data, and each LEA can choose to replace them with its actual CRDC data if it wishes.

Lee and Tyler then addressed per pupil expenditure (PPE) as a report card item:

  • Maine and Rhode Island have published LEA‑level PPE data for a long time. From Maine and Rhode Island's experience, the public seems to be most interested in LEA‑to‑LEA comparisons of PPE, but this could change as people become more familiar with the school‑level data.
  • PPE data are very complex and nuanced. For example, because salaries are the largest part of school budgets, PPE is heavily influenced by staffing costs, but in LEAs with collective bargaining, school principals don't have much input on staff salaries, so there isn't much that can be done to reduce PPE in such a circumstance.
  • In Rhode Island, there was a process decision to designate any teacher who has an English learner (EL) student as an EL teacher, even if there are only one or a few EL students in a large class. Thus, the report card data "overcharacterize" the number of EL teachers because of a policy/process choice.
  • Periodicity matters. Including financial and accountability data on a single report card, but for different time periods, leads to confusion.

Military Connectedness Indicator
PPI Chair Charlotte Ellis (Maine Department of Education) asked Tyler Backus (Maine Department of Education) to discuss Maine's approach to designing and implementing a military connectedness indicator. Major themes of the discussion included the following:

  • There are two purposes in Maine for collecting a military connectedness indicator: (1) it is a federal requirement to collect and report, and (2) Maine has committed to ensuring that the movement of military students is smooth. Maine used the Common Education Data Standards (CEDS) as the foundation of its effort, but had to further divide the CEDS element to limit the data to active duty only.
  • One LEA asks parents four questions through its parent portal registration to designate military connectedness as (1) Not connected; (2) Yes, military connected through active duty; (3) Yes, military connected through reserve; or (4) Yes, military connected through National Guard active or reserve. These data are collected four times per year because they can change.
  • In Hawaii, which also has a large military population, the education community works with base commanders to encourage parents to submit similar data.
  • The Military Interstate Children's Compact Commission (MIC3) (http://www.mic3.net/index.html) incorporates all states, as well as several outlying territories. It is intended to ease school transitions for children and families who move because of military service.

Data Governance
Forum Data Governance Working Group
PPI Chair Charlotte Ellis (Maine Department of Education) introduced Forum members Georgia Hughes‑Webb (West Virginia Department of Education) and Linda Jenkins (Arkansas Department of Education) to update PPI members on the efforts of the Forum Data Governance Working Group.

  • They reported that the working group kicked off following the July 2018 Forum meeting. After a virtual kickoff and two in‑person meetings, the group has agreed upon the intended audience, content, and organization of the resource. There are many existing resources on the topic of data governance, and this new Forum resource aims to synthesize data governance best practices from across Forum and ED resources for implementation in the real world. The primary purpose is to highlight how data governance is critical to the management and operations of any data‑driven organization.
  • The resource will focus on examples of how to best communicate and implement data governance in an organization. It will address all levels of education data. Examples might include how to bring an agency's cycle of collections up to date, approaches to reviewing and approving data requests, and enforcing data sharing agreements. As of now, the group also expects to include a chapter on how data governance needs change over time.
  • The working group is still looking for a case study about how data governance is effectively managed in a small, rural district with limited staff capacity.
  • A draft is expected to be available for Forum review by fall 2019.

SLDS Data Governance Options
Continuing the theme of data governance, PPI then heard from Nancy Sharkey (NCES) and a member of the SLDS Support Team, who shared information about SLDS Data Governance Options. Nancy noted that SLDS support is available to all education agencies, not just those who have received SLDS grants. Key points addressed included the following:

  • Without sound data governance, an SLDS doesn't work; there is too much interconnection among organizations, departments, and programs to go without a robust governance process.
  • The SLDS Data Governance Toolkit (https://slds.grads360.org/#program/datagovernance) includes a video that explains why data governance matters. It also offers resources about policymaking, a manual about how to operationalize data governance, roles and responsibilities (structure), and related topics such as early childhood and research requests. These resources are designed to help agencies avoid starting their data governance planning and implementation from scratch. Agencies are welcome to copy any language directly into their agency's documents.
  • A PPI member asked about the steps toward data governance maturity. After all, it's not as if an organization goes from no data governance to a robust data governance program in a few months. What are the steps of the journey?
  • Many policymakers' eyes glaze over when they hear the words "data governance." They still need to be convinced that it is worth thinking about at an agency‑wide level. Data leaders may need to build the value proposition in order to receive support from executive leadership, which is critical to overall system success. In order to make the concept more approachable to leadership, one LEA referred to its data governance program as the "innovation council."
  • In one SEA, information technology (IT) support is centralized within the state rather than at the organizational level. This creates a gulf between IT "outsiders" and agency data staff because the organizational chart and culture are separate.
  • Rhode Island is sharing resources such as financial systems, geographic information systems (GIS), and IT services between education agencies and municipal authorities. Such collaboration requires even more granular data governance, given the different perspectives of the partners.
  • Many of the more rigorous data governance policies and processes are readily transferable to general good management practices.

Tuesday, July 23, 2019

Morning Session

Joint Session Follow-Up: Student Data Privacy
PPI Chair Charlotte Ellis (Maine Department of Education) welcomed Frank E. Miller, Jr., and Tracy Koumare (ED, Student Privacy Policy Office [SPPO]) to follow up on their general session presentation to the Forum. The discussion focused on the following:

  • People understand when the purpose of a studies exception (sharing education data) is to help the student. But in the K‑12‑to‑workforce (K‑12‑W) continuum, more and more studies focus on strengthening the workforce in general, much more broadly than helping students. How do education agencies justify sharing the data while remaining compliant with the Family Educational Rights and Privacy Act (FERPA)? This is a complex issue, made more challenging because the law, which dates to 1974, did not foresee these technology issues. ED is bound by the statute and can't mandate changes, but Congress could do so. However, it is conceivable that ED could reach out to the U.S. Department of Labor to help figure out how to respond to this issue.
  • The SPPO at ED is working with the Department of Health and Human Services to update the 2010 Joint Guidance on the Application of FERPA and HIPAA to Student Health Records.
  • When an agency contracts with an assessment vendor who offers computer‑adaptive testing, is it allowable for the company to analyze individual student results in order to maintain and update its item banks? After all, this is not what the LEA contracts for, but the vendor says it can't provide needed services without using individual data to improve its testing services. SPPO suggested that while it would be ideal if the vendor anonymized the data for those purposes, it is probably acceptable for the vendor to use the data for product improvement, such as validating its item bank, because doing so is necessary to provide the contracted service. One line that cannot be crossed: the vendor cannot use the student data to market to students or for any other purpose not specified in the contract.
  • LEAs are interested in more guidance concerning sharing data between school resource officers (SROs) and police departments. SPPO recently published School Resource Officers, School Law Enforcement Units, and the Family Educational Rights and Privacy Act (FERPA), but while it answered a lot of questions, it raised even more. From a FERPA perspective, it matters whether the SRO is viewed by the LEA to be a school official or a law enforcement official. SPPO continues to work with the National Association of School Resource Officers to clarify positions and guidance, noting that keeping kids safe is the highest priority.
  • Is a school video part of a student record? The answer may depend on the situation; for example, using video cameras to dispatch or direct police officers in a school during a security incident is allowable because such a scenario presents an imminent threat.

Data Retention and Destruction (S2019_Data_Destruction) pdf file (95 MB)
PPI Chair Charlotte Ellis (Maine Department of Education) welcomed the Privacy Technical Assistance Center (PTAC) team, who led a discussion about how education agencies might design and implement sound data retention and destruction policies. Discussion topics included the following:

  • Data destruction. Data destruction is a part of the data life cycle and, as such, should be undertaken just like the other steps in the cycle (e.g., collection, organization, use, and management).
  • Data retention policies. The steps to creating a data retention policy include (1) check your state laws; (2) check to see how long you are required to keep certain types of records; (3) assess the storage methodology (physical or electronic media); (4) conduct a formal risk analysis; (5) align your policy with the findings from steps 1‑4.
  • National Institute of Standards and Technology (NIST) Guidelines for Media Sanitization, which are technical standards about destroying media that hold data.
  • Questions around FERPA applicability when a school pays for a college entrance exam for its students.
  • FERPA's studies exception. This exception explicitly requires appropriate destruction of shared data. In fact, FERPA requires a "reasonable effort" to protect personally identifiable information (PII), including whether/how agencies require partners to destroy data.

Afternoon Session

Joint Session Follow‑Up: Cybersecurity
PPI Chair Charlotte Ellis (Maine Department of Education) facilitated a PPI discussion about the important cybersecurity issues introduced by Steven Hernandez (ED) during his general session presentation to the Forum. PPI broke out into discussions about SEA and LEA cybersecurity issues. Members reported that small group discussions included the following:

  • Training saturation is a concern. While training is necessary, it can't be assumed that training is always effective. To ensure that it is, one SEA sends staff simulated phishing emails; a staff person who fails the test loses privileges, and if the staff person fails a second test, the person's email gets redirected to his or her supervisor.
  • Some SEAs rely on statewide cybersecurity trainers to provide expertise that is not available in‑house.
  • One SEA outsources training and training testing to a third‑party vendor. This includes a follow‑up phishing campaign. If a staff person clicks on the wrong link, the person is required to undergo additional training.
  • PPI members are concerned about smaller LEAs that might not be able to purchase even basic training modules. The number of users relative to the resources available is a huge problem in an LEA. Training is key, but additional resources will also be necessary.
  • One SEA's cyber liability insurance providers offer training. Another SEA had the regional Federal Bureau of Investigation (FBI) office provide training at a state conference.

PPI Election
Brad McMillen (Wake County Public Schools [NC]) was nominated as the 2019‑20 PPI chair and Linda Jenkins (Arkansas Department of Education) was nominated as the 2019‑20 PPI vice chair.

EDFacts Discussion (S2019_EDFacts) pdf file (95 MB)
Barbara Timm (ED) and Beth Young (EDFacts Support Team) came to the committee to begin discussions on enrollment policies and pre‑kindergarten enrollment. Both issues emerged as areas of interest from the last EDFacts Office of Management and Budget (OMB) collection package. These items were proposed and then dropped from the package based on the public response, and OMB asked for more information from states and districts on these topics for the future. The committee shared its thoughts on both topics, and some of the main points were as follows:

Enrollment Policies:

  • Some states have one policy that applies to all districts/schools in the state.
  • Many states have policies in legislation.
  • Some states have different policies for different districts or schools (charter or magnet), meaning that the districts set their own enrollment policies and that these can change.
  • States with different policies aren't likely to keep that information in their data systems and would need to start collecting that information if it were required.
  • Policies aren't always cut and dried, meaning that there could be an open enrollment policy set, but for certain populations only, or until a threshold is met and then the policy closes, etc.

PreK Enrollment (in publicly funded PreK programs):

  • States have the number of PreK students enrolled in public schools and report those students as part of their EDFacts files.
  • Some states have PreK student data systems that include students in nonpublic schools (i.e., those not reported as part of EDFacts).
  • The PreK programs not in public schools are mostly managed by other state agencies.
  • Most states have surveys for incoming kindergarten students that ask about their preschool experience. However, these surveys include only those students who attend public elementary schools and are prone to parent response error.
  • Terms needed to be clarified as part of this conversation, such as public (i.e., does public funding include any public dollars from any federal agency?) and enrollment (i.e., what is the frequency and length of the PreK program: full time, part time, split time, or something else?).

Social and Emotional Learning Data
PPI Chair Charlotte Ellis (Maine Department of Education) introduced Isaiah O'Rear (NCES), Alvin Larson (Meriden Board of Education [CT]), and Ellis Ott (Fairbanks North Star Borough School District [AK]), who each delivered a presentation on how their organizations are addressing this critical education data issue.

Isaiah offered an update on psychometric benchmarking and use of ED's School Climate Surveys (EDSCLS) by states and districts. Alvin described how his school district was attempting to use social and emotional learning data to minimize student barriers and improve outcomes, stressing that the effort had very real and positive effects on children. Ellis discussed his district's participation in the EDSCLS, which has three domains (engagement, safety, and environment) and multiple topical areas within each domain. The Fairbanks North Star Borough School District publishes a publicly available dashboard of the school climate survey results (https://public.tableau.com/profile/k12northstar#!/). For more information on Fairbanks North Star Borough School District's school climate surveys, see https://www.k12northstar.org/Page/7759; for more information on Meriden Board of Education's surveys, see https://www.meridenk12.org/departments/research-and-evaluation/.

Wednesday, July 24, 2019

Morning Session

Common Education Data Standards (CEDS) and G3‑Generate (S2019_G3_Generate) pdf file (95 MB)
PPI Chair Charlotte Ellis (Maine Department of Education) welcomed Nancy Sharkey (NCES) and a CEDS Support Team member to discuss CEDS and Generate. The Generate Governance Group's (G3's) purpose is to serve as the SEA stakeholder advisory group to support and inform the evolution of Generate. The group's goals are to establish a collaborative network of Generate users and interested states, inform the priorities for future enhancements to Generate, contribute towards the long‑term vision for the role of Generate, and contribute towards the development of a sustainability model for Generate. PPI members were invited to join the group by logging into the GRADS360 website (https://slds.grads360.org) and accessing the G3 Community of Practice (https://slds.grads360.org/#communities/system-design/workgroups/generate-governancegroup).

Forum members Tom Howell and Mike McGroarty (Michigan Center for Educational Performance and Information) then described their agency's experience in adopting data standards from a user's perspective. This required considerable effort and expertise and had to be viewed as a long‑term project, but was worth it to improve the communication and use of data in Michigan. They argued that it is a mistake to focus on the short‑term costs because they are minuscule relative to the time LEAs save in state and federal reporting, which are long‑term improvements.

PPI members noted that as good as data systems have become in the past 10 years, they are not ideal. Implementing data standards should be viewed as a long‑term improvement that will enable even ad hoc data demands to become automated, which will free staff time for other responsibilities.

One reason CEDS is attractive is that many states are using it. Thus, expertise and experience can be shared, and each state can benefit from lessons learned by peer organizations in other states.

In Michigan, the priority is to align state and local data exchanges and then undertake federal reporting and researcher use. LEAs were initially reluctant to sign on, but by establishing a data hub, the SEA is doing a lot of transformation work, saving LEAs up to $56 million per year, which makes standards worthwhile to LEAs as well. It also allows LEAs to use the data for their purposes more easily, without every LEA needing to build its own hub or operational data store.

This solution makes records transfer between LEAs more efficient: when a student transfers, the student's data move much more quickly. For LEAs, help in addressing student mobility is of great value.

Chronic Absenteeism
PPI Chair Charlotte Ellis (Maine Department of Education) facilitated a PPI discussion about chronic absenteeism. Discussion focused on the following topics:

  • The relationship between chronic absenteeism and student performance is strong. By and large, kids need to be in school to be successful. Sometimes SEAs and LEAs define absenteeism differently. In some LEAs, for example, missing class for a field trip or athletic event is still an absence because missing class, for whatever reason, can affect academic performance. Perhaps a student might be able to miss some special classes and still get through high school, but missing core courses would hurt a lot of kids.
  • Full‑day, part‑day, class‑by‑class, and tardy (late) are all valid measures of absenteeism, depending on the purpose of the data.
  • SEA calculations of aggregates for accountability help policymaking, but course‑level data are what principals and students need in a school building (or at the LEA level, where issues are multifaceted). Extracurricular activities, discipline issues, and personal conflicts all affect attendance in a classroom. Even course schedules matter; for example, a student who is late for school all the time might be better off with a first‑period special class rather than math, while a student who often leaves early for sports needs core courses scheduled for the morning. This is a great example of data use: knowing the many dimensions of a student's life schedule to create an academic schedule that best supports success.
  • There are a lot of assumptions about absentee reports. For example, members noted that some wealthy districts say that their kids miss school because they are traveling abroad, but such an assumption should not allow decisionmakers to ignore absenteeism data.
  • LEAs pay a lot of attention to teacher absenteeism and, at least in some LEAs, teachers are absent more than students (between professional development, nonteaching responsibilities, and their own time off). Thus, teacher absenteeism is another concern, but those data are not reported in all states.

Steering Committee Report
PPI Chair Charlotte Ellis (Maine Department of Education) reported that the Steering Committee discussed CRDC in depth and offered to help NCES continue to work with the Office for Civil Rights (OCR) to improve the collection's data quality.

PPI Meeting Review/Future Planning
PPI Chair Charlotte Ellis (Maine Department of Education) led a review of the meeting and discussion of plans for future PPI meetings. Members suggested the following topics for future PPI meetings:

  • Teacher absenteeism.
  • The Protection of Pupil Rights Amendment (PPRA) and how it is handled at the state and local levels.
  • The "implementation" part of PPI‑how are agencies making things happen? What procedures are working in real‑world applications?
  • PreK is a good example of the continued need for standards and standard definitions. There is no need for 50 SEAs to start from scratch on this issue.

Members shared several comments and suggestions for PPI organization:

  • The value of the Forum is to connect LEAs to SEAs to federal agencies.
  • Intimate table talks with colleagues were really helpful. Presentations can provide useful information, but shorter presentations with more collegial (small table) discussions could make for a better format.
  • Adding objectives to agendas might help members prepare their thoughts in advance rather than only organically in response to discussion.
  • Charlotte noted that meeting planners asked for input from PPI members during the agenda‑building process, but little was offered. PPI used a tool to collect input this year, but the items were selected despite minimal member input.
  • Formatting: is this the right setting to engage members for two‑and‑a‑half days? Are there formats other than three committees talking and listening for two‑and‑a‑half days? Steve Smith (Cambridge Public Schools [MA]) volunteered to help lead a committee to review and evaluate meeting formats.
  • Encourage collaboration between SEAs and LEAs from the same state, and encourage Forum attendance.

Closing Thoughts
PPI Chair Charlotte Ellis (Maine Department of Education) thanked members for a productive and enjoyable meeting and year. She then handed the meeting over to PPI Vice Chair Brad McMillen (Wake County Public Schools [NC]) and Linda Jenkins (Arkansas Department of Education), the nominated 2019‑20 PPI vice chair, who asked PPI members for recommendations for future PPI focus. Suggestions included considering whether the Forum's current three‑committee organization continues to best meet the Forum's communication and content needs.

Vice Chair Brad McMillen (Wake County Public Schools [NC]) thanked Chair Charlotte Ellis (Maine Department of Education) for her leadership in 2018‑19 and then closed the Summer 2019 PPI meeting, noting that members should submit their evaluation forms and look for online communications in the coming weeks as PPI continues its efforts to improve education data quality and use.

Top

Technology (TECH) Committee Meeting Summary

Monday, July 22, 2019

Morning Session

TECH Committee Kickoff
TECH Chair Ken Hutchins (Brandywine School District [DE]) and TECH Vice Chair DeDe Conner (Kentucky Department of Education) introduced themselves and welcomed members to the meeting. Ken informed members about the National Center for Education Statistics (NCES) Distance Learning Dataset Training (DLDT) system. This resource is an online, interactive tool that allows users to learn about and access NCES data across the education spectrum and evaluate them for suitability for specific research purposes. It is designed to introduce users to many NCES datasets, their design, and special considerations for analysis to facilitate effective use. TECH participants introduced themselves, noted their home agency/organization and how long they've been attending Forum meetings, and shared a recent challenge and success they faced in their agency. Ken briefly reviewed major activities and discussions from the 2018 TECH meeting and reviewed the 2019 meeting agenda.

The committee also briefly discussed Commissioner Woodworth's presentation at the Forum opening session. TECH members look forward to NCES's continued work on alternative socioeconomic status (SES) measures and would be interested in further discussion on whether and how agencies can access National School Lunch Program (NSLP) data for tribal schools.

Members also expressed interest in learning more about NCES's Education Demographic and Geographic Estimates (EDGE) program, including the difference between EDGE and the U.S. Census Bureau's American Community Survey (ACS), and would like Douglas Geverdt (NCES) to speak at next year's TECH meeting.

Afternoon Session

Report Cards
TECH Vice Chair DeDe Conner (Kentucky Department of Education) led a follow‑up discussion on topics discussed during the joint session presentation on report cards. Several TECH members shared their states' report cards:

  • Ken Hutchins (Brandywine School District [DE]) shared Delaware's Report Card website (https://reportcard.doe.k12.de.us/). State and local education agency (SEA and LEA) collaboration was most prevalent during the early stages of the report card design process. The report cards are designed to be user friendly and present key metrics and data, including enrollment, graduation rates, and subject proficiency. Ken noted that some of the metrics and data in the SEA‑developed report cards are defined differently from those at the LEA level. Looking ahead, LEAs would like to see more narrative context incorporated into the report card landing page and additional time for data reviews prior to public release.
  • DeDe shared Kentucky's School Report Card website (https://www.kyschoolreportcard.com/). The site includes links to other data reports and features data drill‑downs. The school report cards include both federally required and state‑required data and also show district and state data for comparative purposes. Starting in fall 2019, the report cards will include granular performance‑based star ratings that will help identify achievement gaps and opportunities to improve.
  • Steve Young (Washington State Office of Superintendent of Public Instruction) shared the Washington State Report Card website (https://washingtonstatereportcard.ospi.k12.wa.us/), which the SEA's Data Governance Group helped design. The site includes state, district, and school data, with new data releases each month, and data downloads are available for all report card data. The report card data are dynamic, so any future changes or corrections will be reflected in the report cards. There are both secure and unsecure sites; LEAs have a set amount of time to review data on the secure site before the data are released to the public on the unsecure site. The report card landing pages feature summary data and the ability to drill down into the data. A “Contact Us” feature allows users to email appropriate staff, and data pages can be converted into PDF format for accessibility.

Members shared the following best practices and lessons learned:

  • It's important for education agencies and vendors to agree on best practices.
  • It helps LEAs when SEAs take the lead on report cards, but LEAs need to feel that their input is reflected in the report card design and creation process.
  • Metrics and data can be defined differently, even among staff in the same agency. User experience (UX) testing and user acceptance testing (UAT) with different groups (agency staff, engaged parents, disconnected stakeholders, etc.) are helpful for ensuring that users understand key metrics and definitions in report cards.
  • Simplicity in data reports, particularly those geared toward parents, works well.
  • LEA quality assurance is essential, but LEAs need to be provided with adequate time for data reviews, verification, and sign‑offs.
  • School‑to‑school comparison tools can help parents make informed decisions.

Data Visualization
Forum Data Visualization Online Course
Georgia Hughes‑Webb (West Virginia Department of Education) provided an update on the Forum's Data Visualization Online Course. The Data Visualization Online Course Project Group developed this new resource to introduce the concept of data visualization. The online course describes how to apply key data visualization principles and practices to education data and explains how the data visualization process can be implemented to support effective data analysis and communication throughout an education agency. It builds upon the Forum Guide to Data Visualization: A Resource for Education Agencies, published in 2016. The course is available on the Forum Online Courses web page at https://nces.ed.gov/forum/online_courses.asp. Georgia concluded her update by encouraging members to think of ways that future Forum courses and curricula can help the Forum.

TECH members noted that short videos and presentation slides are helpful resources. In addition, Forum courses can be a useful training tool for ensuring that agency staff are on the same page with respect to sound data principles and practices.

Data Visualization Tools
TECH Vice Chair DeDe Conner (Kentucky Department of Education) discussed a new data visualization tool in her state. The Kentucky Department of Education collaborated with a vendor to create a data visualization tool focused on making data actionable for LEAs. The tool is intended to help LEA administrators get meaningful and useful insights into the vast and complex education data available through the state. The tool features a dynamic response and enables data drill‑down. The first data release focused on three areas: demographics, behavior, and attendance. Many visualizations, including charts, heat maps, and spreadsheets, have been developed thus far, and additional areas are planned for the future. TECH members were interested in learning more about the project timeline, the cost, and the SEA's collaboration with the vendor.

Agency Effectiveness
Supporting SEA and LEA Collaboration
(S2019_Delaware_Forum) pdf file (87 MB)
TECH Chair Ken Hutchins (Brandywine School District [DE]) and Adrian L. Peoples (Delaware Department of Education [DDOE]) provided an update on their work establishing and cochairing the Delaware Education Data Forum (Delaware Forum). Adrian and Ken developed the idea for the Delaware Forum based on their experiences as the Delaware SEA and LEA Forum representatives and their desire to foster trust and cooperation between the SEA and LEAs in their state. They established the Delaware Forum in 2016 to focus on supporting Delaware students through improvements to student‑level data. The mission of the Delaware Forum is to promote clarity, confidence, and consensus on student‑level data.

The Delaware Forum is co‑chaired by an SEA and LEA representative to ensure that the SEA and LEAs have equal responsibility, voice, and input on agenda setting and decisionmaking; Adrian currently serves as the SEA co‑chair, and Ken previously served as the LEA co‑chair. The LEA co‑chair serves as a liaison for LEA communication with the SEA. By helping to facilitate communication between the SEA and LEAs, the LEA and SEA co‑chairs can help ensure that minor issues are addressed before they become major concerns. The Delaware Forum holds a closed‑door three‑hour meeting each month; 30 minutes of meeting time is reserved for all LEA participants to meet without the SEA participants, which helps build relationships among LEAs that would not otherwise meet in person on a regular basis. The small size of the state makes it relatively easy for LEAs to travel to the state capital to attend in‑person meetings. The Delaware Forum is beginning to form subcommittees and is moving toward a professional learning community approach so LEAs can learn from each other and work on common issues.

Adrian and Ken engaged in a discussion with TECH members and provided more details on the Delaware Forum:

  • Given the need to reestablish trust in data reports between the SEA and LEAs, the primary focus of the Delaware Forum is data accuracy and quality at the student level. Data validation and quality checks by LEAs were critical for engendering trust. The Delaware Forum has also discussed how data are analyzed, communicated, and visualized. Mandated data collections and policies and procedures have been discussed to a lesser degree, as they are not of primary concern.
  • Currently, most Delaware LEAs and 25 percent of Delaware charter schools participate in the Delaware Forum. Membership is intentionally and primarily comprised of data staff. Ken noted that a regional approach might work well for larger states; in practice, SEA‑LEA meetings could rotate between regions, with each region meeting 3‑4 times per year.
  • The Delaware Forum has worked on several data challenges, including annual assessment and accountability reporting, graduation and dropout rates, the Civil Rights Data Collection (CRDC), and the use of School Courses for the Exchange of Data (SCED) codes.
  • Delaware has a statewide student information system (SIS). Data are extracted from the system by the SEA on a nightly basis, then verified by LEAs before data reports are published. Data verification helps LEAs understand where they need to accept responsibility for data issues. If an error is detected, the reported data are corrected.
  • Conflict resolution is essential to the success of the Delaware Forum. Having LEA representatives and all SEA data staff in attendance helps individuals assume responsibility for issues and concerns and work together to understand how issues can be resolved.

Tuesday, July 23, 2019

Morning Session

Data Privacy
Joint Session Follow‑Up: Student Data Privacy
TECH Chair Ken Hutchins (Brandywine School District [DE]) led a follow‑up discussion on topics raised during the student data privacy presentation from Frank E. Miller, Jr., and Tracy Koumare (U.S. Department of Education [ED], Student Privacy Policy Office).

  • Several student data privacy issues persist, including
    • what constitutes directory information;
    • who may access National School Lunch Program (NSLP) data;
    • which data to include in grant applications that require low socioeconomic status (SES) information; and
    • third-party compliance with state laws regarding data access.
  • TECH members have found that the following strategies successfully strengthen student data privacy:
    • Virtual training from the Privacy Technical Assistance Center (PTAC) that includes both data staff and non‑data staff (e.g., curriculum directors)
    • The Consortium for School Networking's (CoSN) Trusted Learning Environment (TLE) Seal Program
    • Providing resources to teachers
    • The Utah State Board of Education's student data privacy training videos (https://schools.utah.gov/studentdataprivacy)
  • TECH members shared several successes in their agencies and organizations:
    • Kentucky and Connecticut have directly negotiated with Google on data privacy.
    • New data privacy legislation and regulations motivated the hiring of new staff and the adoption of an adapted version of the National Institute of Standards and Technology (NIST) Cybersecurity Framework.
    • Staff awareness has improved. As staff think more intentionally about data privacy, they are more likely to take necessary action, such as ensuring that data sharing agreements are in place before signing new contracts or using new apps.
    • Assigning data sharing responsibilities to one unit/division within an agency, implementing an agency‑wide data request process, and creating a standard statewide memorandum of understanding (MOU) were helpful.
    • The Student Data Privacy Consortium (SDPC) now has 22 state alliances; of these, 11 states use the same standardized privacy contract clauses, which helps districts ensure that vendors conform with student data privacy protections.

Privacy in SEAs and LEAs
TECH Vice Chair DeDe Conner (Kentucky Department of Education) led the committee in discussing other privacy topics of importance to SEAs and LEAs. TECH members made the following points:

  • Phishing emails remain a common threat to SEAs and LEAs. Spoofed emails have also been used to divert payroll and request vendor delivery of goods to false addresses. Spoofed emails are more commonly sent after hours and from overseas. Secure email, Secure File Transfer Protocol (SFTP), and OneRoster certification can enhance email security. The Indiana Department of Education has several resources on email security and will also cover the cost of phishing campaign exercises for enrolled LEAs (https://www.doe.in.gov/cybersecurity/cybersecurity-staff).
  • Having a secure network can help protect an agency, but agency information can still be compromised when staff use non‑agency or personal devices outside of the agency. Premium and freemium applications provided through professional development opportunities can also pose a threat. Potential strategies to improve network security include the NIST Cybersecurity Framework, two-factor authentication, and ethical hacking.
  • Training courses can be effective, but common privacy and security threats persist. Providing both online training and in‑person training (traveling, as needed, for district level or regional training) can be helpful.
  • The TECH Committee could work on this topic further by developing a short resource. Potential resources include a white paper, a self‑assessment rubric, an online library or list of resources, a one‑page handout/flyer, or a slide deck. The short resource should focus on common‑sense, basic recommendations.
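
As an illustration of the spoofed‑email problem noted above, one common technique is flagging sender domains that closely resemble, but do not exactly match, a trusted agency domain. The following Python sketch uses only the standard library; the domain list and similarity threshold are invented for the example and are not drawn from any agency's actual configuration:

```python
import difflib

# Hypothetical trusted agency domains (illustration only)
TRUSTED_DOMAINS = {"doe.in.gov", "education.ky.gov"}

def looks_spoofed(sender_domain: str, threshold: float = 0.8) -> bool:
    """Return True if the sender's domain nearly matches, but is not, a trusted domain."""
    domain = sender_domain.lower()
    if domain in TRUSTED_DOMAINS:
        return False  # exact match: not a lookalike
    # A high similarity ratio to a trusted domain suggests a lookalike (e.g., "d0e.in.gov")
    return any(
        difflib.SequenceMatcher(None, domain, trusted).ratio() >= threshold
        for trusted in TRUSTED_DOMAINS
    )
```

A production filter would combine a heuristic like this with SPF, DKIM, and DMARC validation; this sketch captures only the lookalike‑domain check.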

Information Technology (IT) and Systems
TECH Panel Discussion
(S2019_IT_Systems_Panel) pdf file (87 MB)
Andre Smith (Florida Department of Education) and Dan Dandurand (South Sioux City Community Schools [NE]) discussed their agency's IT and systems, focusing on IT consolidation and legacy systems, technological readiness for the cloud, and staff buy‑in to new systems.

Florida has integrated data across several state agencies (including education, labor, and social services). The state recently passed legislation requiring that all state agencies be cloud based; thus, state agencies are prioritizing technology and systems that are ready for the cloud. The SEA is working toward limiting the number of apps used and consolidating apps into one centralized environment. Andre shared the following best practices and lessons learned based on his agency's experience:

  • Leadership needs to understand and appropriately support technology and systems if changes are to be successful.
  • Legacy systems need to be critically assessed before they are replaced. If a system does not provide critical services that advance the mission of an agency, it can be difficult to justify its replacement.
  • Moving data to the cloud requires certain infrastructure and data protections to be in place.
  • Clearly assigned responsibilities are essential when modernizing technology.
  • Policies and procedures also need to be changed when IT changes.

The South Sioux City Community Schools (NE) experienced challenges when migrating to a new cloud‑based content management system. Even with more than one year's notice of the migration through monthly emails to staff, critical information and data had not been migrated to the new system when the old system was deactivated. The district also encountered challenges when implementing a new learning management system (LMS). Shortly following implementation, the district adopted a new grading plan that could not be accommodated by the new LMS and, thus, necessitated the procurement of a different LMS and the conversion of curricular content. Dan shared the following best practices and lessons learned based on his agency's experience:

  • Clear and targeted communication is essential. Cut back on emails with nonessential information (e.g., sending off‑line notices for overnight system updates).
  • Be prepared if timelines need to change. Have a contingency plan in place for users who encounter potential system migration issues (e.g., set cut‑off dates, steps on what to do if a system shuts off early, and criteria for reactivating an old system if critical data are not migrated).
  • Engage all stakeholders (leadership and staff) when considering new systems and conduct stakeholder pilots before purchasing new systems.
  • In the months prior to implementing a new system, engage staff and provide targeted in‑person training to help them learn how to use the new system. Ask for feedback on the training materials when presenting to small groups.
  • Use contract clauses to protect your agency from vendor nondelivery/nonperformance.
  • Set milestones that must be completed by vendors before payment.

A few TECH members shared their thoughts and lessons learned on the topic of IT and systems:

  • Electronic notifications of system changes and replacement can be insufficient. In‑person training for multiple user groups is often needed.
  • LEA verification of data reported to SEAs can clarify how data differences impact funding.
  • Making apps accessible and Section 508 compliant is essential but not necessarily something that can be hired out to vendors.
  • It can be helpful to narrow the scope of an IT department's/staff's responsibilities and focus on critical IT and systems. Regularly convening department leadership to confirm that projects align with state and district strategic plans can help avoid initiative overload.
  • A Forum white paper or checklist that details the roles and responsibilities of Chief Information Officers (CIOs) would be helpful.
  • The forthcoming Forum Guide to Technology Management in Education addresses many of the topics discussed during this panel.

TECH Committee Business
DeDe Conner (Kentucky Department of Education) was nominated as the 2019‑20 TECH chair and Dawn Gessel (Putnam County Schools [WV]) was nominated as the 2019‑20 TECH vice chair.

Afternoon Session

Cybersecurity and Security in SEAs and LEAs
Joint Session Follow‑Up: Cybersecurity
TECH Chair Ken Hutchins (Brandywine School District [DE]) led a follow‑up discussion about cybersecurity following the presentation from Steven Hernandez (ED). TECH members shared the following points:

  • Systems can be particularly vulnerable to distributed denial‑of‑service (DDoS) attacks during vacation periods.
  • It can be difficult to implement dual‑factor authentication for systems that include non‑agency employees (e.g., an SEA data system that has both SEA and LEA users). Multifactor authentication can also present challenges when staff leave an agency, as it may be difficult for staff to access needed files.
  • More frequent password changes can be counterproductive; users may be less likely to remember their passwords and may write them down instead. Time‑based one‑time passwords can work well for smaller agencies.
  • Technology needs to be prioritized by the education agency for cybersecurity measures to be successful.
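
The time‑based one‑time passwords mentioned above are standardized in RFC 6238 and can be computed with only the Python standard library. The sketch below is a minimal illustration of the SHA‑1 variant of the algorithm, not a vetted implementation or a product recommendation:

```python
import hashlib
import hmac
import struct
import time

def totp(secret: bytes, for_time=None, step: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password (SHA-1 variant)."""
    if for_time is None:
        for_time = int(time.time())
    counter = for_time // step                        # current 30-second time window
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because the code depends only on a shared secret and the current time window, the server and the user's device can independently derive the same short‑lived password, which avoids the memorization and reuse problems of frequently rotated static passwords.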

Review and Discuss TECH White Paper: The Convergence of Physical Security, Cybersecurity, and Data Security in SEAs and LEAs
TECH Chair Ken Hutchins (Brandywine School District [DE]) provided an overview of the motivation for and development of a short white paper developed by the TECH Committee, entitled The Convergence of Physical Security, Cybersecurity, and Data Security in SEAs and LEAs. At the summer 2018 meeting, TECH members suggested that standing committee meeting time could be made more actionable and contribute to the development of a tangible product. The committee's goals for this actionable session are to allow more Forum members to contribute to resource development, to encourage meeting participation, and to provide justification for members' attendance. TECH held two virtual meetings on December 14, 2018, and May 2, 2019, to discuss the intersection of cybersecurity and the broader security context. These discussions informed the development of a short white paper summarizing key issues and resources on this topic. A draft of the white paper was shared with TECH members in early July for review and discussion.

Ken opened the floor for TECH members to discuss the draft white paper. Members offered the following comments:

  • Overall, the white paper is a good draft, but would benefit from some changes to make it more accessible and appealing to a nontechnical audience.
  • The committee's goal is to create a practical document that addresses the convergence of policies, practices, and procedures that encompass the three security domains: physical security, cybersecurity, and data security. The goal of the white paper itself is to create awareness about this topic and motivate action, particularly for nontechnical readers.
  • Many stakeholders are concerned that education agencies are not spending adequate funds on school safety, but stakeholders also need to be concerned with providing comparable resources for network security.
  • Several revisions and additions to the introduction will make the paper more appealing for nontechnical readers. In addition, the graphics and tables in the body of the text could be simplified and streamlined.
  • Additional topics that could be addressed in the paper include the following:
    • Cybersecurity insurance
    • Network segmentation
    • Heating, ventilation, and air conditioning (HVAC) and refrigeration systems
    • Autodialer and automated text message service
    • Threat assessment
    • Data warehouse physical security
    • Disaster recovery
    • Migrating student information systems (SIS)
  • Additional resources‑including resources from CoSN, PTAC, and SDPC and resources included in Steven Hernandez's presentation on cybersecurity‑could be included in the reference list.

Topics From the Floor
Ken invited TECH members to discuss other topics of interest that were not addressed during planned conversations. Members returned to their previous conversation on the topic of report cards:

  • The Data Quality Campaign (DQC) publishes an annual analysis of report cards of all 50 states and the District of Columbia entitled Show Me the Data: DQC's Annual Analysis of Report Cards. DeDe Conner (Kentucky Department of Education) is a member of the DQC's Every Student Succeeds Act (ESSA) reporting group and offered to provide an update on the group at next year's TECH meeting. The Council of Chief State School Officers (CCSSO) also has resources and can assist with report cards.
  • District and school report cards are often generated by the SEA, and links are provided to LEAs. This lessens the burden of report card creation on LEAs, but LEAs remain responsible for local dissemination.
  • Helping stakeholders, particularly parents, understand the meaning behind certain metrics and data points remains a challenge.
  • While ESSA requires that certain information be included in report cards, states have the flexibility to include other data that are considered important to state and local stakeholders.
  • Dean Folkers (Nebraska Department of Education) shared the Nebraska Education Profile website (https://nep.education.ne.gov/). The report cards draw on American Community Survey (ACS) data to identify comparable “peer” schools. Nebraska's accountability model, Accountability for a Quality Education System, Today and Tomorrow (AQuESTT), includes outcome measures (e.g., test scores) and evidence‑based non‑outcome measures (e.g., processes and demographics) to generate evidence‑based analysis (EBA) scores for schools. The model is based on self‑assessment rubric scores, and schools that provide evidence of meeting a certain score receive a slight increase in their accountability score. Dan Dandurand (South Sioux City Community Schools [NE]) noted that the Nebraska SEA encourages LEAs to assume responsibility and supports LEAs in their improvement efforts.

Federal, State, and District Data
Discussion of Federal Data in States and Districts with EDFacts Staff (S2019_EDFacts) pdf file (87 MB)
Barbara Timm (ED) and Beth Young (EDFacts Support Team) came to the committee to begin discussions on enrollment policies and pre‑kindergarten enrollment. Both issues emerged as areas of interest from the last EDFacts Office of Management and Budget (OMB) collection package. These items were proposed and then dropped from the package based on the public response, and OMB asked for more information from states and districts on these topics for the future. The committee shared their thoughts on both topics; some of the main points were as follows:

Enrollment Policies:

  • Some states have one policy that applies to all districts/schools in the state.
  • Many states have policies in legislation.
  • Some states have different policies for different districts or schools (charter or magnet), meaning that the districts set their own enrollment policies and that these can change.
  • States with different policies aren't likely to keep that information in their data system and would need to start collecting that information if it was required.
  • Policies aren't always cut and dried; for example, a state could have an open enrollment policy that applies only to certain populations, or one that closes once an enrollment threshold is met.

PreK Enrollment (in publicly funded PreK programs):

  • States have the number of PreK students enrolled in public schools and report those students as part of their EDFacts files.
  • Some states have PreK student data systems that include students in nonpublic schools (i.e., those not reported as part of EDFacts).
  • The PreK programs not in public schools are mostly managed by other state agencies.
  • Most states have surveys for incoming kindergarten students that ask about their preschool experience. However, these surveys include only those students who attend public elementary schools and are prone to parent response error.
  • Terms needed to be clarified as part of this conversation, such as public (i.e., does public funding include any public dollars from any federal agency?) and enrollment (i.e., what is the frequency and length of the PreK program: full time, part time, split time, or something else?).

Wednesday, July 24, 2019

Morning Session

TECH Discussions
Topics From the Floor: TECH White Paper
TECH Chair Ken Hutchins (Brandywine School District [DE]) began by summarizing the previous day's discussion on the TECH white paper. He invited members to share any additional comments and discuss next steps for the paper:

  • Members suggested several additional revisions to the paper:
    • Emphasize that facilities staff (physical security), data staff (data security), and information technology staff (cybersecurity) need to work together to ensure that security is addressed across all three domains.
    • Conclude with a “call‑to‑action” focused on the key takeaways:
      • Collaboration and communication are essential.
      • You can't just focus on one domain; you need to address all three areas in your agency.
      • Work with your colleagues throughout your agency to ensure that security is addressed across all three domains.
  • Following the meeting, the Forum team will update the draft based on the TECH Committee's comments and suggestions. The updated draft will be shared with the TECH Committee for review and comment, and TECH members will coordinate expert reviews with colleagues in their agencies and organizations, as appropriate.
  • The committee plans to hold a webinar in fall 2019 to finalize the content of the white paper and discuss potential dissemination routes. If the committee determines that the updated white paper is a valuable resource that will help education agencies, it would be interested in pursuing formal publication to the Forum website.
  • The committee also plans to consider
    • revisiting the white paper in the years ahead if updates are needed;
    • requesting that the Forum form a working group that can build on the committee's work (the committee suggested that PTAC or Steven Hernandez would be helpful to engage in this work);
    • whether and how to update the white paper following publication, particularly if links to other resources become inactive.

Forum Meeting Participation
CoSN Report
TECH Chair Ken Hutchins (Brandywine School District [DE]) attended the 2019 CoSN meeting that was held April 1‑4, 2019, in Portland, OR. CoSN 2019 focused on the role technology leaders play in envisioning, shaping, and leading school system transformation to support student needs. The theme of the conference, “Envision 2030: Leadership for Learning,” explored how to provide the class of 2030 with the skills they will need for success. The conference theme addressed what the world will look like as the class of 2030 leaves the education system and how to prepare students now for success in the world to come. Ken attended sessions focused on timely topics in the field of education data, particularly related to cybersecurity. Several of the resources shared at the conference were used to inform the development of the TECH white paper.

Postsecondary Electronic Standards Council (PESC)
Susan Williams (Virginia Department of Education) attended the Postsecondary Electronic Standards Council (PESC) Spring 2019 Data Summit that was held May 8‑10, 2019, in Washington, DC. During the Summit, Susan and Rachel Kruse (Iowa Department of Education) presented PESC with an overview of SCED and SCED resources. Susan reported that the presentation was well received and PESC attendees were very engaged in the presentation. Susan noted that community colleges are using SCED codes; as a result, postsecondary institutions may become more familiar with SCED codes as they increasingly appear on college student transcripts. Ross Santy (ED) shared that NCES recently updated the Classification of Instructional Programs (CIP) codes and asked whether there was any value in aligning SCED codes with CIP codes. TECH members noted that alignment is different for each state's postsecondary system; thus, statewide crosswalks would likely be more feasible than a national‑level alignment project. Susan concluded her update by encouraging members to use the Forum's SCED resources, and Michael Sessa (PESC) invited the Forum to present at next year's Spring Data Summit.

TECH Committee Business
Steering Committee Business/Report
TECH Chair Ken Hutchins (Brandywine School District [DE]) reported that the Steering Committee discussed topics of interest across the Forum's three standing committees.

Meeting Review/Future Planning
TECH Chair Ken Hutchins (Brandywine School District [DE]) concluded the meeting by thanking TECH members for their contributions. TECH members shared that the white paper has been a valuable outcome of the meeting.


 

Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.