Summer 2014 Forum Meeting Notes

National Forum on Education Statistics
July 28-30, 2014
Washington, DC

Forum Opening Session
Joint Session: Family Educational Rights and Privacy Act (FERPA) Update
Joint Session: Office of Educational Technology Initiatives
Joint Session: Graduation Rate Reporting
National Education Statistics Agenda Committee (NESAC) Meeting Summary
Policies, Programs, and Implementation Committee (PPI) Meeting Summary
Technology Committee (TECH) Meeting Summary
Forum Closing Session
Steering Committee



Forum Opening Session

Monday, July 28, 2014

Forum Agenda Review and Introduction pdf file (249 KB)
Forum Chair Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) welcomed members to the Summer 2014 Forum Meeting in Washington, DC, introduced the Forum officers, and encouraged everyone to help welcome the following new members into standing committees and working groups:

  • Charlotte Bogner, Kansas State Department of Education
  • Wendy Fuller, Vermont Agency of Education
  • Dawn Gessel, Putnam County Schools (WV)
  • Laura Hansen, Metropolitan Nashville Public Schools (TN)
  • Carla Howe, West Virginia Department of Education
  • Brad McMillen, Wake County Public School System (NC)
  • Kathleen Moorhead, New York State Education Department
  • Zenaida Natividad, Guam Department of Education
  • David Reeg, Minnesota Department of Education
  • Gerald Reyes, Commonwealth of the Northern Mariana Islands (CNMI) Public School System
  • Julie Riordan, Regional Educational Laboratory – Northeast and Islands
  • Jim Robb, West Memphis School District (AR)
  • Andre Smith, Florida Department of Education
  • Tim Stensager, Washington State Office of Superintendent of Public Instruction
  • Ben Tate, Oregon Department of Education
  • Cheryl VanNoy, St. Louis Public Schools (MO)
  • Andrew Wallace, South Portland School Department (ME)
  • Jon Wiens, Oregon Department of Education

Lee announced that a new Forum resource, the Forum Guide to School Courses for the Exchange of Data (SCED) Classification System, is available on the Forum publications webpage at http://nces.ed.gov/forum/publications.asp. She concluded her opening remarks by briefly reviewing the agenda for the meeting and welcoming John Easton, Director of the Institute of Education Sciences (IES).

Welcome and IES Update pdf file (246 KB)
John Easton, Director of the Institute of Education Sciences (IES), welcomed Forum members to Washington, DC. His discussion focused on new initiatives and ongoing data collections at the National Center for Education Statistics (NCES). The full presentation is available on the IES webpage. John thanked members for attending the Summer Forum and wished everyone a productive meeting.


Joint Session: Family Educational Rights and Privacy Act (FERPA) Update

pdf file (727 KB)

Monday, July 28, 2014

Kathleen Styles, Chief Privacy Officer of the U.S. Department of Education (ED), gave a presentation titled "Change is in the Air: An Update on Student Privacy". She highlighted recent developments around the importance of protecting the privacy of student data at the federal level and encouraged Forum members to go beyond the requirements set forth in the Family Educational Rights and Privacy Act (FERPA) and to focus on the guidelines set in the Federal Trade Commission’s Fair Information Practice Principles (FIPPs). By working to increase transparency and separate fact from fiction, professionals working with student data can build high quality student privacy programs within their agencies. Forum members appreciated the opportunity to discuss student privacy management issues, data security, data sharing agreements, and deductive disclosure with Kathleen.


Joint Session: Office of Educational Technology Initiatives

pdf file (1.98 MB)

Tuesday, July 29, 2014

Joseph South, Deputy Director of the ED Office of Educational Technology (OET), gave a presentation titled “The Future of Educational Technology.” He discussed various technology trends that will fundamentally change how students learn and focused on the big picture of digital data and learning from a technology-focused perspective. His presentation addressed technology trends such as ubiquitous broadband, personalized learning, competency-based learning, seamless transitions to college and career, and rapid-cycle, low-cost research and evaluation. He noted that increasing access to technology learning opportunities will help to level the playing field for rural and under-resourced schools and will give students the same tools available to professionals to solve real problems. Forum members raised topics of discussion including increasing access to technology, tracking big data, increasing investments in classroom technology, and utilizing E-rate Program services.


Joint Session: Graduation Rate Reporting

pdf file (989 KB)

Tuesday, July 29, 2014

Bob Balfanz and Jennifer DePaoli from the Johns Hopkins University School of Education Everyone Graduates Center gave a presentation titled “Further Improving Graduation Rate Measurement to Drive Progress: Starting a Conversation.” They discussed how collective efforts toward improving graduation rate measurements can help to identify ways to increase graduation rates for all students. Research suggests that focusing on student subgroups, big cities, and building data capacity for accurate data will also be key drivers for graduation rate improvements. The goal of the presentation was to spur discussion around how to make current measures more accurate for the common good of all students. Forum members discussed various topics such as tracking student dropouts; maintaining accountability; varying definitions of “graduate” between SEAs; compounding factors outside of the school that affect student attendance data; and utilizing multiple measures to produce graduation rates.


National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, July 28, 2014

Morning Session

Welcome, Introductions, and Agenda Review
NESAC Chair Allen Miedema (Northshore School District [WA]) led the group in introductions and reviewed the NESAC agenda. He welcomed everyone to the NESAC committee meeting, including new members and representatives from the Office of Management and Budget (OMB), and reminded members that Forum meetings are working meetings designed for Forum members and invited guests. Others interested in the work of the Forum were invited to attend the STATS-DC Data Conference immediately following the Forum.

Virtual Education Working Group Update
Laurel Krsek (San Ramon Valley Unified School District [CA]) updated NESAC on the Virtual Education Working Group. To help keep pace with the rapid expansion in virtual education opportunities in K-12 education, the Virtual Education Working Group is exploring new facets of virtual education and identifying virtual education data collection challenges. The group is currently developing a new Forum resource that will offer best practices for state and local education agencies.

College and Career Ready Working Group Update
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) chairs the Forum’s College and Career Ready (CCR) Working Group. The group is moving forward with a draft of a new Forum resource that will help SEAs and LEAs understand how data can be used to support college and career readiness for all students. The group aims to publish the new guide in January 2015.

Afternoon Session

Teacher Evaluation Presentation and Discussion pdf file (190 KB)
NESAC Vice Chair Jan Petro (Colorado Department of Education) introduced Reino Makkonen (Regional Educational Laboratory [REL] West) to discuss teacher evaluation research. REL West works closely with Arizona and Utah to study pilot implementation and initial results of new teacher effectiveness systems. Reino discussed Arizona’s pilot project to introduce a new teacher evaluation system that uses three component metrics: classroom observation, stakeholder survey results, and student academic progress/growth calculations. The REL study, Combining Classroom Observations with Other Measures of Educator Effectiveness During Pilot Implementation of Arizona's New State Teacher Evaluation Model, will provide information gathered from the pilot study on the relationships between these measures. REL West is also providing assistance to the Utah State Office of Education as LEAs pilot new tools for teacher observation and the measurement of educational leadership. Reino noted that teacher effectiveness metrics represent a new type of assessment data that SEAs or LEAs may be unaccustomed to tracking. NESAC members engaged Reino in a discussion on topics including whether or not the evaluations being studied are considered high-stakes, who has access to granular data, the use of learning objectives for non-tested subject assessment, and possible additional elements for future study, such as teacher attendance. Reino noted that future research is planned on data use and how to provide professional development to teachers. He provided NESAC members with a handout with additional information on REL West’s work.

Education Data Standards Discussion
Allen Miedema (Northshore School District [WA]) facilitated a discussion on issues driving states to change standards and introduce legislation around standards. Forum members discussed the complexity of standards-based report cards and the use of badges and micro-credentialing as a way to capture student skills attainment.

Privacy Technical Assistance Center (PTAC) Update and SEA/LEA Privacy Policy Transparency
Dale King, Director of the Family Policy Compliance Office (FPCO), and Baron Rodriguez of the Privacy Technical Assistance Center (PTAC) shared a presentation on FPCO and PTAC.

FPCO implements two laws that seek to ensure student and parental rights in education: the Family Educational Rights and Privacy Act (FERPA) and the Protection of Pupil Rights Amendment (PPRA)—but the vast majority of inquiries they receive relate to FERPA. As required by law, FPCO investigates complaints from parents or eligible students, such as complaints related to inappropriate disclosure, efforts to amend student records, or lack of timely access to records. FPCO receives an average of 8-10 complaints each week, and is currently responding to a Freedom of Information Act (FOIA) request from the Electronic Privacy Information Center (EPIC) to release all response letters from recent years. FPCO also develops resources and issues guidance related to the administration of FERPA and PPRA, including the following resources that have been recently released, or are scheduled for release:

  • Guidance on amendments to FERPA that resulted from the Uninterrupted Scholars Act was released in May 2014.
  • A document providing side-by-side comparisons of the Individuals with Disabilities Education Act (IDEA) and FERPA laws and regulations was released in June 2014.
  • Three new documents providing guidance on sharing information with community based organizations, sharing information with school resource officers, and handling privacy issues around integrated data systems are forthcoming.

FPCO recently launched a public-facing website as a companion to its current site. The new site includes a question and answer bank, which will be populated over the next year; it allows for the electronic submission of questions and requests to file complaints; and in the future it will include a community of practice forum that will allow users to communicate with each other.

Baron then updated members on recent PTAC work, including two new resources: Transparency Best Practices and the Surviving Heartbleed Guide. Forthcoming resources include:

  • a transparency best practices checklist, aimed at teachers and addressing “click-wrap” agreements;
  • information on mapping data flows;
  • strategies for effectively dealing with parental inquiries;
  • additional resources in the student privacy toolkit; and
  • a video providing a summary of privacy considerations for online educational services.

PTAC has also held a number of regional meetings focusing on FERPA from specific user perspectives, and an upcoming meeting focuses on nonprofits working with LEAs. In September PTAC will launch a working group on transparency and data breaches across agencies. PTAC also holds trainings on request, both virtually and in-person. Visit the PTAC website for more information about PTAC and its resources for the education community.

NESAC members asked questions about privacy issues related to school climate surveys, and noted that guidance for LEAs on working with community organizations would be helpful. Members look forward to assistance with click-wrap agreements, noting that it can be overwhelming to educate vendors and staff. Moreover, SEAs and LEAs have to be careful because some companies’ terms of service include a provision that if a company is sold, the data will be transferred to the new company.

Individuals With Disabilities Education Act (IDEA) Discussion
Meredith Miceli and Ross Santy (U.S. Department of Education [ED]) facilitated a discussion around technical assistance relating to IDEA. ED wants to be sure that the needs of SEAs for technical assistance are met, and IDEA data managers have expressed an interest in a community of practitioners that could discuss data issues. Ross and Meredith are interested in Forum member ideas on how to improve communication between the Office of Special Education Programs (OSEP) and SEAs, with the goal of improving data quality and increasing the efficiency of data collections. NESAC members expressed concerns that special education data are often kept separate from other data. Members recommended the use of the Common Education Data Standards (CEDS) as a communication tool. Conferences could serve as other opportunities for data managers to engage in conversations around data use. In addition to the STATS-DC data conference, NESAC members were interested in what other conferences are available for IDEA data managers to access technical assistance. Members also discussed data usage, policy implications, and record keeping.

Topics From the Floor
Allen Miedema (Northshore School District [WA]) asked members for topic suggestions. He noted that topics not addressed during the in-person meeting could be used for later virtual meetings. Member suggestions included

  • online assessment implementation issues;
  • report cards;
  • personalized learning and Bring Your Own Device (BYOD);
  • micro-credentialing;
  • data collection and reporting related to new standards; and
  • teacher evaluation and value added modeling.

Tuesday, July 29, 2014

Morning Session

Alternative Socioeconomic Status (SES) Measures Working Group Update
Matt Cohen (Ohio Department of Education) chairs the Forum’s Alternative Socioeconomic Status (SES) Measures Working Group. He provided background information on the project and an update on Working Group efforts. The Working Group is developing a Forum document designed to help readers better understand the implications of collecting and interpreting a range of SES-related data in education agencies. The group met prior to the Forum meeting to clarify the scope of the document. It should be noted that this document focuses on the needs and possible solutions for administrative records in education data systems and does not reflect the full spectrum of opportunities available to the research community. The final document will include commonly faced challenges and suggestions. The Working Group is on track to present a draft for Forum review in early 2015.

Joint Session Follow-up Discussion
Joseph South, Deputy Director of the ED Office of Educational Technology (OET), came to NESAC for a follow-up discussion on the future of educational technology. NESAC members asked questions on topics including

  • online state assessments;
  • online schools and instruction;
  • teachers’ digital literacy; and
  • access to new technology in schools.

EDFacts Updates
Ross Santy (NCES) provided NESAC with updates on EDFacts data collections, the reorganization of NCES, School Improvement Grant reporting procedures, and LEA reporting for the Common Core of Data (CCD). The 2013-14 EDFacts collection is underway, and programmatic reporting starts in the fall. The 2014-15 collection includes some technical amendments but is otherwise stable. The CCD Data Management System will be introduced with the 2014-15 collection and will provide reports back to states. EDFacts coordinator meetings are now held adjacent to STATS-DC but are kept brief to avoid conflicts with other conference activities.
 
Ross is seeking feedback from EDFacts and CCD coordinators about technical assistance needs. For example, reporting zero versus not applicable is an ongoing issue that will be addressed in training. There is considerable variance across states in how membership is reported on the CCD, so EDFacts has pulled together a working group to discuss the issue. The key criterion for reporting is the set of students for whom the LEA is responsible. NCES will stay in contact with state representatives to ensure that reporting requirements regarding counts of high quality teachers pursuing alternative education are understood, especially because there is not a lot of time to prepare collections and responses. Ross noted that the transition phase for the new contractor is just starting, and any issues should be brought to the attention of Ross or Barbara Timm.

NESAC members raised questions on

  • the accuracy and burden of one-time ad hoc data collections;
  • homeless counts and comparisons; and
  • ESEA flexibility and waivers.

Afternoon Session

Civil Rights Data Collection (CRDC) Updates pdf file (797 KB)
Ross Santy (NCES), Abby Potts (NCES), Rebecca Fitch (CRDC), and Chris Goddards (AEM Corporation) joined NESAC to provide updates on the CRDC and to gather feedback from the Forum on the new website CRDC.GRADS360.ORG. CRDC staff members are currently working to implement improvements and determine what SEAs can do to help LEAs in preparation for the next data collection. The website features publications and tools, such as the capability for users to generate reports. NESAC members suggested that it would be useful if the website also gave users the ability to compare districts and states. Abby welcomed suggestions for additional technical assistance document topics. Forum members offered feedback on the following topics:

  • data collection policies on race/ethnicity categories;
  • effective site visits;
  • school-level personnel and non-personnel expenditures; and
  • a CRDC website document on selecting a principal contact person.

Assessment Consortia Discussion
Jessica Jackson (Partnership for Assessment of Readiness for College and Careers [PARCC]) provided an update on PARCC, and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) shared an update on SBAC. Jessica and Brandt each provided an overview of consortia member states and addressed technology systems, Spring 2014 field tests, data flow, early lessons learned, research resulting from field tests, and the timeline. Brandt also discussed smarterapp.org, a community of organizations devoted to collaborating on an open-licensed software suite for assessment support.

NESAC members discussed the following topics with presenters:

  • There are concerns that the demand for certain optimal devices for the assessments may exceed the supply of devices.
  • Some states are waiting to determine whether PARCC data can be used for teacher evaluation, and it would be useful to have state-specific data back from the pilot.
  • NESAC members were interested in the length of the SBAC tests (2 days per student), the testing window (12 weeks), and the availability of interim assessments (late fall). SBAC is also developing a teacher-facing library of materials.
  • Some states experienced problems with field tests that timed out and required a bulk restart. SBAC strives to avoid the need for students to restart the tests by allowing breaks.
  • It would be interesting to study the effects of instruction on assessment performance over the 12-week testing window (comparing week 1 to week 12).

NESAC Election
Jan Petro (Colorado Department of Education) was elected NESAC chair and Kristina Martin (Macomb Intermediate School District [MI]) was elected NESAC vice chair by unanimous vote.

Wednesday, July 30, 2014

Morning Session

Graduation Rate Reporting Follow-up Discussion
Allen Miedema (Northshore School District [WA]) invited committee members to share thoughts and comments on the Graduation Rate Reporting Joint Session. Public reporting of graduation rates leads to the impression that rates are easily calculated and comparable across states. However, there is wide variation in how states determine who is a graduate, especially with regard to students with disabilities. For example, exit exam requirements in some states preclude students with disabilities from getting diplomas, and therefore may exclude those students from graduation rate calculations. Further, some states publish multiple graduation rates each year. Additional topics of discussion within the NESAC standing committee included LEA and SEA efforts around

  • adding modified curriculum designators on transcripts;
  • summarizing complex student experiences through data;
  • adopting multiple graduation rates through various measures; and
  • communicating data in consideration of public perception.

School Climate Surveys
Isaiah O’Rear (NCES) provided a brief overview of the development of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will be designed to enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. The surveys are designed so that LEAs can store data locally. The survey is voluntary and, if desired, can be administered at any of three levels: SEA, LEA, or school. Data remain at the level of administration but can be shared if desired.

Isaiah previously discussed the SCLS with the Forum via a virtual meeting. Based on Forum recommendations during that meeting, the SCLS design will allow users to augment the surveys with locally-designed questions. NCES is currently testing the survey content. In March 2016 NCES will launch a national benchmark data collection for the surveys that will include 250 middle schools and 250 high schools.

NESAC members suggested allowing for a long survey window and publishing the surveys in multiple languages.

Steering Committee Business/Report
Allen Miedema (Northshore School District [WA]) reviewed topics discussed by the Steering Committee, as well as meeting events, action items, and issues that will be reported at the Closing Session.

Meeting Review and Next Steps/Virtual Meeting Review
Allen Miedema (Northshore School District [WA]) gathered feedback on the virtual meetings held earlier in the year. Members suggested distributing meeting materials to those who are unable to participate. Members also discussed interest in virtual opportunities to speak with each other in a setting that is set up less like a webinar, such as Google Hangouts.

Members also offered feedback on the Forum meeting, including suggestions to

  • ensure presentation readability for everyone in the room;
  • add an additional screen in the middle of the room for Joint Sessions; and
  • print presentation handouts for attendees to take notes and use as a guide.


Policies, Programs, and Implementation (PPI) Standing Committee Meeting Summary

Monday, July 28, 2014

Morning Session

Welcome, Introductions, and Agenda Review
PPI Vice Chair John Kraman (Oklahoma State Department of Education) opened the meeting by reporting that PPI Chair John Metcalfe had retired from his school district. As Vice Chair, John accepted the responsibility of moderating this PPI meeting in the Chair’s absence. John then welcomed everyone to the Summer 2014 PPI meeting, led the group in introductions, and reminded participants that Forum meetings are working meetings designed for Forum members only. Others interested in the work of the Forum are encouraged to attend the STATS-DC Data Conference immediately following the Forum Meeting. John then provided a brief overview of the PPI Agenda.

U.S. Department of Education Data Inventory pdf file (1.27 MB)
Marilyn Seastrom, Chief Statistician and Program Director at NCES, joined PPI to discuss the U.S. Department of Education (ED) Data Inventory, which has been developed to describe data reported to ED. The ED Data Inventory includes data collected as part of grant activities and data collected to allow publication of valuable statistics about the state of education in this country. It provides descriptive information about each collection (metadata), along with information on the specific data elements in individual collections. It is organized into series and studies, and includes a search function that collects results from the series, study, and variable levels of each data collection. The Inventory was created to

  • identify and improve access to existing data;
  • provide data transparency to the public regarding the data collected and maintained by ED;
  • ensure responsible data management at ED;
  • improve the coordination of data collections across ED program offices; and
  • comply with OMB Memorandum M-13-13 (May 9, 2013).

It is anticipated that this resource will help a wide range of NCES stakeholders, including policymakers who would benefit from a better understanding of the full portfolio of ED data. As part of the larger OMB data.gov project, the ED Data Inventory will be updated over time. Whenever possible, the inventory uses precise language (e.g., the exact wording of a question on a survey), which can serve as a library for external users interested in developing their own collections. The goal is to make it a one-stop shop for stakeholders to learn about ED data.

Afternoon Session

Privacy Technical Assistance Center (PTAC) Update and SEA/LEA Privacy Policy Transparency
Dale King, Director of the Family Policy Compliance Office (FPCO), and Baron Rodriguez of the Privacy Technical Assistance Center (PTAC) shared a presentation on FPCO and PTAC.

FPCO implements two laws that seek to ensure student and parental rights in education: the Family Educational Rights and Privacy Act (FERPA) and the Protection of Pupil Rights Amendment (PPRA)—but the vast majority of inquiries they receive relate to FERPA. As required by law, FPCO investigates complaints from parents or eligible students, such as complaints related to inappropriate disclosure, efforts to amend student records, or lack of timely access to records. FPCO receives an average of 8-10 complaints each week, and is currently responding to a Freedom of Information Act (FOIA) request from the Electronic Privacy Information Center (EPIC) to release all response letters from recent years. FPCO also develops resources and issues guidance related to the administration of FERPA and PPRA, including the following resources that have been recently released, or are scheduled for release:

  • Guidance on amendments to FERPA that resulted from the Uninterrupted Scholars Act was released in May 2014.
  • A document providing side-by-side comparisons of the Individuals with Disabilities Education Act (IDEA) and FERPA laws and regulations was released in June 2014.
  • Three new documents providing guidance on sharing information with community based organizations, sharing information with school resource officers, and handling privacy issues around integrated data systems are forthcoming.

FPCO recently launched a public-facing website as a companion to its current site. The new site includes a question and answer bank, which will be populated over the next year; it allows for the electronic submission of questions and requests to file complaints; and in the future it will include a community of practice forum that will allow users to communicate with each other.

PPI members asked whether discussions on the FPCO website could be shared publicly (or otherwise released), and Dale agreed to explore the question further, with the understanding that the answer will affect how stakeholders will use the resource.

Baron then updated members on recent PTAC work, including two new resources: Transparency Best Practices and the Surviving Heartbleed Guide. Forthcoming resources include:

  • a transparency best practices checklist, aimed at teachers and addressing “click-wrap” agreements;
  • information on mapping data flows;
  • strategies for effectively dealing with parental inquiries;
  • additional resources in the student privacy toolkit; and
  • a video providing a summary of privacy considerations for online educational services.

PTAC has also held a number of regional meetings focusing on FERPA from specific user perspectives, and an upcoming meeting focuses on nonprofits working with LEAs. In September PTAC will launch a working group on transparency and data breaches across agencies. PTAC also holds trainings on request, both virtually and in-person. Visit the PTAC website for more information about PTAC and its resources for the education community.

Forum College and Career Ready Working Group
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) chairs the Forum’s College and Career Ready (CCR) Working Group. The group is moving forward with a draft of a new Forum resource that will help SEAs and LEAs understand how data can be used to support college and career readiness for all students. The group aims to publish the new guide in January 2015.

Predictors of College Readiness Using State Data pdf file (567 KB)
The Institute of Education Sciences (IES) Regional Educational Laboratory (REL) Program supports 10 RELs across the country. RELs are tasked with assisting states and districts in their use of research and data to inform policy and practice with the goal of improving student outcomes.

Elizabeth Davis and Matt Soldner joined PPI to discuss a REL Midwest study that is using data from Indiana high school graduates to examine three measures of college readiness based on first-year college indicators and to identify predictors of college readiness among measures collected in Indiana's longitudinal data system. The study, which was requested by the Indiana Commission for Higher Education and REL Midwest's College and Career Success Research Alliance, will inform the efforts of Indiana's policymakers and educators to increase the number of students graduating from high school ready to succeed in college. Indiana recognizes that postsecondary education is highly correlated with economic growth and upward mobility; thus, college and career readiness is a major goal of education reform in the state. The study defined early college success as students taking only non-remedial classes, completing all attempted credits, and persisting to a second year. Research questions included the following: What percentage of enrollees arrived at college ready to succeed? Do the percentages vary by student, high school, or college characteristics? Do the percentages vary by indicator of success? Findings were not yet available to share.

Group discussion included the following topics:

  • Some vendors are selling predictive models to policymakers who do not understand that a statistically significant finding does not enable surefire predictability, which means that instructional remedies are being sold without clear proof of effect. Forum members recommended that the final report present caveats and cautions that are generalizable to modeling in a way that makes sense to policymakers.
  • Another possible variable not addressed in the study is family situation. A participant offered the example of researchers in a state who made assumptions about the effects of academic preparation on retention, only to later learn that retention had a great deal to do with the recession—changes in family finances led students to transfer or drop out. Researchers found that students were transferring to commuter colleges in order to save money because the schools they could afford in the previous year were no longer affordable.
  • Similarly, adult relationships matter – students who had an adult help them fill out Free Application for Federal Student Aid (FAFSA) forms were more likely to apply to and attend college.
  • From an LEA perspective, college readiness (something that happens in K-12) is not the same as early college success (something that happens after K-12). This is a significant difference in meaning for people who are helping high school students prepare for college.
  • Yet another complicating factor is how college education is financed. For example, how much time is a student working to pay for school? This question may be more appropriate for a success study than a readiness study (which is narrowly limited to what happens in K-12).

Data Catalog Tool for College and Career Readiness Indicators pdf file (539 KB)
Julie Riordan (REL Northeast and Islands [REL-NEI]) joined PPI to share information on the development of the REL-NEI Data Catalog Tool for College Readiness. REL-NEI created the tool to determine the availability and reliability of U.S. Virgin Islands (USVI) Department of Education data elements to address the question, What are the strongest indicators of college readiness in USVI? The data catalog tool can be used by other jurisdictions to address the same question.

The goal of the tool is to provide a systematic process to assess the presence of college readiness indicators, identify gaps that may present challenges in developing indicator systems, determine the feasibility of college readiness-related studies using administrative data, and provide a tool for data stewards to track college readiness indicators. It is in the form of a flexible-use spreadsheet that provides a shell for organizing and tracking student data relevant for measuring college readiness and organizes available data at three levels: constructs, indicators, and data elements. REL-NEI created a summary report template to accompany the tool that describes the background and intent of the data catalog, identifies gaps in the data, discusses recommendations for study feasibility and limitations to consider, and may also be useful for briefing others within and outside the entity on the status of data elements.

Group discussion focused on the following topics:

  • Additional indicators might include academic expectations and goals, family education history (e.g., cultural capital for filling out college applications, financial aid forms, etc.), technology readiness (e.g., using the tools needed to succeed in college such as accessing library/research materials electronically).
  • There are fewer college counselors than there used to be, so students need to learn for themselves about the college application process—leaving some kids, families, and communities out of the loop. Thus, “readiness” isn’t just academic preparation; it is also cultural preparation related to knowledge about the college application process and expectations. These are behavioral factors in addition to traditional academic issues.
  • Some students plan to go to college but first enroll in the military, go on a religious mission, or take a gap year/ life experience delay, which are all viable choices that do not necessarily indicate a lack of readiness.
  • Timing is key (e.g., if a student has earned 50% of the credits needed to graduate after four semesters, there is a good chance of completion). Some courses get repeated to improve a grade (e.g., an earned C doesn’t count if the student takes the course again and gets a B, which means that he or she is making progress even if not earning new credits).

Tuesday, July 29, 2014

Morning Session

Psychometric Characteristics of School Climate Indicators for Middle Schools
Thomas Hanson (REL West) joined PPI to share information on a REL West school climate study in California that examined the appropriateness of the California School Climate, Health, and Learning Survey (Cal-SCHLS) for identifying needs and monitoring improvements in school climate in middle schools. Cal-SCHLS consists of three related surveys: a student survey, a staff survey, and a parent survey. The surveys are anonymous and ask about very sensitive issues, including illegal behaviors. Although the study is not complete, it appears that variation is greater within schools than between schools. As a result, one of the long-term goals of the study is to develop building-level measures of school climate.

School Courses for the Exchange of Data (SCED) Working Group
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) chairs the Forum’s SCED Working Group. SCED is a voluntary, common classification system for prior-to-secondary and secondary school courses. It includes elements and attributes that identify basic course information. In the past year, the Working Group released SCED Version 2.0 Course Codes and the Forum Guide to SCED. The group is now developing SCED Version 3.0. Lee addressed the scope of Working Group collaboration with national and local subject matter experts on code development and provided information on the implementation and use of SCED in Rhode Island. PPI members asked questions about the alignment of prior-to-secondary and secondary course subject areas in Version 2.0 and the use of SCED in NCES transcript studies. 

Joint Session Follow-up
Joseph South, Deputy Director of the ED Office of Educational Technology (OET), joined PPI for questions and follow-up discussion following his joint session presentation on OET Initiatives. PPI focused the discussion on the following topics:

  • Active consent can be difficult to obtain in a real world public school setting, which makes the idea of joining pilot studies difficult, no matter how effective something is as a learning tool.
  • ED is very concerned about privacy. Vendors need to understand privacy concerns and regulations and be very clear about how they are using data. Schools need to become more sophisticated purchasers of online resources to ensure privacy (e.g., that all data are anonymized or fully encrypted in transit).
  • Students and parents are very concerned about student data being used for purposes other than those intended.
  • School leaders need to be able to assure parents that everything brought into the district has been privacy certified. Efforts to establish a vendor certification process within the field are growing.
  • Districts need a coherent policy governing how technology tools are brought into the classroom. PPI members offered an example of a situation that could be avoided by coherent district policies: A teacher finds a good learning resource/app, downloads it (accepting terms and conditions without legal review), and begins using it in the classroom. At a later date, the district’s information technology (IT) director receives a phone call from a vendor saying, “Did you know you had 100 students using our app?”
  • School staff using technology tools should rely on certification, read terms and conditions, and require pilots prior to whole-scale use. Procurement reform is key. 
  • OET tries to reason with industry, cautioning that if the technology industry doesn’t address privacy concerns, it may be faced with 50 different privacy laws in 50 different states, which would be a tremendous burden on business.
  • PTAC delivers additional privacy guidance.

Teacher Evaluation in Rhode Island
Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) joined PPI to discuss Rhode Island’s efforts to implement an accurate and fair system of teacher evaluation. Lee provided background on Rhode Island’s system implementation and discussed challenges, lessons learned, and progress.

Topics From the Floor
PPI used this time to continue discussion with Tom Hanson of REL West on the topic of Psychometric Characteristics of School Climate Indicators for Middle Schools and Lee Rabbitt on the Forum’s School Course for the Exchange of Data (SCED) Working Group.

Afternoon Session

Assessment Consortia Discussion
Jessica Jackson (Partnership for Assessment of Readiness for College and Careers [PARCC]) provided an update on PARCC and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) shared an update on SBAC. Jessica and Brandt each provided an overview of consortia member states and addressed technology systems, Spring 2014 field tests, data flow, early lessons learned, research resulting from field tests, and the timeline. Brandt also discussed smarterapp.org, a community of organizations devoted to collaboration on an openly licensed software suite for assessment support.

PPI members discussed the following topics with presenters:

  • To ensure comparability in hand scoring, random samples will be selected and compared. PPI members noted that the gold standard would be to send a single set of samples to each hand scoring subcontractor and compare scoring results.
    • SBAC leaves hand scoring to states to contract out.
    • PARCC is having Pearson handle all hand scoring.
  • PPI members are concerned about the cost of the assessments. Members offered the example of Wisconsin, which was paying $16 per student assessment for delivery, storage, etc., prior to the consortia assessments. The cheapest bid received for consortia assessments was $36 per student. SBAC’s fee for states is about $6 per student, which pays for the assessment. States then need to procure an assessment vendor for delivery and scoring.
  • Members are concerned that Pearson’s delivery is not open source, which will lead to additional costs for states.
  • States advise the consortia, but some districts feel that they will nevertheless carry the burden of implementing the new assessments. SBAC’s default is that states will load the registration data. PARCC defaults to state decisions about SEA or LEA registration. Members noted that spreadsheets will be very burdensome for LEAs if registration is left to the LEAs.

Forum Virtual Meeting Review
The Forum offered several virtual meetings over the winter to help bridge the loss of the Winter Forum meeting. PPI members were asked how these virtual meetings were received and how they might best be convened in the future.

  • A handful of PPI members participated in the virtual meetings. 
  • As a rule, virtual meetings are not as helpful as in-person meetings, but the Forum virtual meetings were as effective as possible. Some, such as the presentation on ambient positional instability, were interesting but not particularly relevant to the needs of practitioners.
  • Forum members forwarded announcements to colleagues, and as a result, the virtual meetings reached additional practitioners outside of the Forum.
  • It is very challenging to exchange information via the virtual meeting format. The process of submitting questions is often ineffective because it is not interactive enough.
  • Notification of Forum virtual meetings often came too late and made planning nearly impossible. The Forum should plan virtual meetings in the same manner as in-person meetings so that dates and times are set well in advance. Members cautioned Forum leadership to avoid scheduling many virtual meetings during assessment season – February is much better. Traditionally, Forum members know to reserve time in February.
  • PPI members suggested blocking out multiple sessions for virtual meetings but cautioned that 60-90 minutes is the effective maximum for people paying attention.
  • Instead of just focusing virtual meetings in February, members suggested offering half of the meetings in February with the rest split between November and April.
  • Members recommended that the Forum establish a community discussion thread that could include small groups for dialogue, staff notetaking, and closed conversation, for example, on a professional development session on a known topic of interest. Sessions should be kept closed and tight knit, and then group representatives can report out at the summer meeting.
  • Perhaps there could even be periodic mini-meetings: 8-10 people to talk about session-type issues and produce a product such as a white paper or notes (as simple or as useful as needed).

Common Education Data Standards (CEDS) Connect pdf file (1.08 MB)
Ross Santy (NCES) and Beth Young joined PPI for an update on CEDS. CEDS released a new virtual development community for those interested in participating in developing the Version 5 standard. The standard is currently in development, with a release planned for January 2015. Beth provided a demonstration of the myConnect feature that allows users to easily combine their maps with Connections. PPI discussion items included:

  • SEAs could use CEDS as a way to make their data dictionaries more transparent to the public and to share supporting documentation, policies, and laws driving collections.
  • SEAs should also consider using CEDS to tell researchers what information is available. If a researcher first builds a connection, the SEA can then run a myConnect to see what data exist and where there may be gaps.
  • SEAs can use the connect map to identify elements in their systems that are particularly sensitive or susceptible to hacking and then customize security resources—a hacking “heat map” that drives security design.
  • RELs can align their study questions and then partner states can compare how close they are to being able to provide the data – resulting in a real roadmap of existing capacity against research questions rather than always starting from scratch with review on a case-by-case basis.
  • Because there is high turnover in agencies, it would be useful if CEDS would periodically inform states of who is registered in their state.

PPI Election
John Kraman (Oklahoma State Department of Education) was elected PPI Chair for 2014-15 and David Weinberger (Yonkers Public Schools, NY) was elected PPI Vice Chair for 2014-15.

Wednesday, July 30, 2014

Morning Session

School Climate Surveys
Isaiah O’Rear (NCES) provided a brief overview of the development of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will be designed to enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. The surveys are designed so that LEAs can store data locally. This will be a voluntary survey. If desired, it can be run at any of three levels – SEA, LEA, or school. Data remain at the level of administration but can be shared if desired.

Isaiah previously discussed the SCLS with the Forum via a virtual meeting. Based on Forum recommendations during that meeting, the SCLS design will allow users to augment the surveys with locally-designed questions. NCES is currently testing the survey content. In March 2016 NCES will launch a national benchmark data collection for the surveys that will include 250 middle schools and 250 high schools.

PPI members raised a concern that the benchmark average will be derived from 500 sampled schools nationally, regardless of size, demography, geography, etc. How will schools under extreme settings be expected to relate to such generic averages? PPI members also suggested that the survey needs a data dictionary for its fields.

EDFacts Update
Ross Santy (NCES) provided PPI with an update on EDFacts. The 2013-14 EDFacts collection is underway, and programmatic reporting starts in the fall. The 2014-15 collection has some technical amendments but is otherwise stable. The Common Core of Data (CCD) Data Management System will be introduced with the 2014-15 collection, and it will provide reports back to states. EDFacts coordinator meetings are now held adjacent to STATS-DC but are limited to avoid conflicts with other conference activities.
 
Ross is looking to EDFacts and CCD coordinators for feedback about technical assistance needs. For example, reporting zero versus not applicable is an ongoing issue that will be addressed in training. There is considerable variance across states with respect to how membership is reported on the CCD, so EDFacts has pulled together a working group to discuss the issue. The key discriminator for reporting is students for whom the LEA is responsible, not just those served. NCES will stay in contact with state representatives to ensure that reporting requirements regarding counts of highly qualified teachers pursuing alternative education are understood, especially because there is not a lot of time to prepare collections and responses. Ross noted that the transition phase for the new contractor is just starting, and any issues should be brought to the attention of Ross or Barbara Timm.

Graduation Rate Reporting Follow-up Discussion
John Kraman (Oklahoma State Department of Education) led a follow up discussion on the Graduation Rate Reporting Joint Session. Discussion focused on the following topics:

  • Graduation rates are naturally inconsistent and vary according to the information that needs to be communicated via the rate. It is possible to standardize collections, but standardization does not seem like a worthwhile goal when different places have different information needs.
  • The goal is to improve learning, not the graduation rate.
  • Graduation rate calculations will be complicated by the increasing use of proficiency-based assessments in addition to credits, GPA, etc.
  • Policies affect graduation rates. For example, reduced standards/expectations can lead to higher graduation rates.
  • The graduation rate metric plays a considerable role in public perception of how our schools are performing, which means that the rate is important even if it isn’t driving real student learning. PPI members agreed that there is no need to put more energy into fine tuning the graduation rate measure (e.g., whether a student is an 8th or 9th grader on July 30). It is better to accept collections as they are and focus energy on improving learning.
  • The Joint Session presentation assumed that improvement in measurements will improve graduation rates, but the argument did not appear to be evidence-based.
  • In some schools, there is 0% proficiency on state assessments in 11th grade and a 90% graduation rate – making graduation rate a metric of very limited value.

Meeting Review and Next Steps
At the next in-person or virtual meeting, PPI would like to consider the following issues:

  • The graduation rate conversation provoked the most reaction and discussion. Seeing how data are being used/misused is a good mirror for PPI members. Specifically, members understand the data and are aware of limitations on the use of graduation rates, but it is provocative to hear how other people are using our data.
  • At this meeting, we moved well beyond the traditional conversations about compliance. We talked about data and the points of individual student transformation – real contributions to education rather than just doing accountability duties. This is not just professional development – these discussions can result in meaningfully better use of data for critical education issues.
  • The ED Data Inventory will be moving to Ross Santy’s section at NCES. Could it be used to align across silos (with the help of CEDS) to help reduce redundancy? This could be a big step for cascading transparency and for addressing privacy issues (putting panic to rest and allowing focus on real issues).
  • How do data become information? People often make incorrect assumptions about data. Could PPI help to explain this process for education stakeholders?

Top

Technology (TECH) Committee Meeting Summary

Monday, July 28, 2014

Morning Session

Welcome and Introductions
TECH Chair Jay Pennington (Iowa Department of Education) welcomed everyone to the Summer 2014 TECH meeting and led the group in introductions. Jay reminded participants that Forum meetings are working meetings designed for Forum members only. Others interested in the work of the Forum are encouraged to attend the STATS-DC Data Conference immediately following the Forum Meeting. 

Alternative Socioeconomic Status (SES) Measures Working Group Update
Matt Cohen (Ohio Department of Education) chairs the Forum’s Alternative Socioeconomic Status (SES) Measures Working Group. He provided background information on the project and an update on Working Group efforts. The Working Group is developing a Forum document designed to help readers better understand the implications of collecting and interpreting a range of SES-related data in education agencies. The group met prior to the Forum meeting to clarify the scope of the document. It should be noted that this document focuses on the needs and possible solutions for administrative records in education data systems and does not reflect the full spectrum of opportunities available to the research community. The final document will include commonly faced challenges and suggestions. The Working Group is making progress toward presenting a draft for Forum review in early 2015.

Afternoon Session

U.S. Department of Education Data Inventory pdf file (1.27 MB)
Marilyn Seastrom, Chief Statistician and Program Director at NCES, joined TECH to discuss the U.S. Department of Education (ED) Data Inventory, which has been developed to describe data reported to ED. The ED Data Inventory includes data collected as part of grant activities and data collected to allow publication of valuable statistics about the state of education in this country. It provides descriptive information about each collection (metadata), along with information on the specific data elements in individual collections. It is organized into series and studies, and includes a search function that collects results from the series, study, and variable levels of each data collection. The Inventory was created to

  • identify and improve access to existing data;
  • provide data transparency to the public regarding the data collected and maintained by ED;
  • ensure responsible data management at ED;
  • improve the coordination of data collections across ED program offices; and
  • comply with OMB Memorandum M-13-13 (May 9, 2013).

It is anticipated that this resource will help a wide range of NCES stakeholders, including policymakers who would benefit from a better understanding of the full portfolio of ED data. As part of a bigger OMB data.gov project, the ED Data Inventory will be updated over time. Whenever possible, the inventory uses precise language (e.g., the exact wording of a question on a survey), which can serve as a library for external users interested in developing their own collections. The goal is to make it a one-stop shop for stakeholders to learn about ED data.

Best Practices around Collecting Teacher Attendance
Jay Pennington (Iowa Department of Education) led a discussion on collecting data on teacher attendance. Tom Purwin (Jersey City Schools [NJ]) shared what his district is doing in this area, including implementing a substitute tracking system and modifying the Forum’s student attendance codes for teachers. Tom reviewed the purposes of the data collection, including measuring instructional time as well as time out of the building. Members discussed issues such as the different reasons teachers leave the classroom, teachers who get moved around by their administration, and efforts to address problems that are difficult to measure.

Privacy Technical Assistance Center (PTAC) Update and SEA/LEA Privacy Policy Transparency
Dale King, Director of the Family Policy Compliance Office (FPCO), and Baron Rodriguez, Privacy Technical Assistance Center (PTAC), shared a presentation on FPCO and PTAC.

FPCO implements two laws that seek to ensure student and parental rights in education: the Family Educational Rights and Privacy Act (FERPA) and the Protection of Pupil Rights Amendment (PPRA)—but the vast majority of inquiries they receive relate to FERPA. As required by law, FPCO investigates complaints from parents or eligible students, such as complaints related to inappropriate disclosure, efforts to amend student records, or lack of timely access to records. FPCO receives an average of 8-10 complaints each week, and is currently responding to a Freedom of Information Act (FOIA) request from the Electronic Privacy Information Center (EPIC) to release all response letters from recent years. FPCO also develops resources and issues guidance related to the administration of FERPA and PPRA, including the following resources that have been recently released, or are scheduled for release:

  • Guidance on amendments to FERPA that resulted from the Uninterrupted Scholars Act was released in May 2014.
  • A document providing side-by-side comparisons of the Individuals with Disabilities Education Act (IDEA) and FERPA law and regulations was released in June 2014.
  • Three new documents providing guidance on sharing information with community-based organizations, sharing information with school resource officers, and handling privacy issues around integrated data systems are forthcoming.

FPCO recently launched a public-facing website as a companion to its current site. The new site includes a question and answer bank, which will be populated over the next year; it allows for the electronic submission of questions and requests to file complaints; and in the future it will include a community of practice forum that will allow users to communicate with each other.

Baron then updated members on recent PTAC work, including two new resources: Transparency Best Practices and the Surviving Heartbleed Guide. Forthcoming resources include:

  • a transparency best practices checklist, aimed at teachers and addressing “click-wrap” agreements;
  • information on mapping data flows;
  • strategies for effectively dealing with parental inquiries;
  • additional resources in the student privacy toolkit; and
  • a video providing a summary of privacy considerations for online educational services.

PTAC has also held a number of regional meetings focusing on FERPA from specific user perspectives, and an upcoming meeting focuses on nonprofits working with LEAs. In September PTAC will launch a working group on transparency and data breaches across agencies. PTAC also holds trainings on request, both virtually and in-person. Visit the PTAC website for more information about PTAC and its resources for the education community.

TECH asked questions related to surveillance videos, protecting staff data within a developing early childhood registry, teachers videotaping their lessons for professional development, and initial results from FPCO’s response to the EPIC FOIA request.

Tuesday, July 29, 2014

Morning Session

Office of Educational Technology Initiatives Joint Session Follow-up
Joseph South, Deputy Director of the ED Office of Educational Technology (OET), joined TECH for a follow-up on his Joint Session earlier in the morning. Topics TECH members discussed with Joseph included

  • digital citizenship;
  • what is happening in postsecondary in this area;
  • postsecondary/private partnerships; and
  • whether federal funds can be used for technology certification.

Assessment Consortia Discussion
Jessica Jackson (Partnership for Assessment of Readiness for College and Careers [PARCC]) provided an update on PARCC and Brandt Redd (Smarter Balanced Assessment Consortium [SBAC]) shared an update on SBAC. Jessica and Brandt each provided an overview of consortia member states and addressed technology systems, Spring 2014 field tests, data flow, early lessons learned, research resulting from field tests, and the timeline. Brandt also discussed smarterapp.org, a community of organizations devoted to collaboration on an openly licensed software suite for assessment support.

TECH members discussed the following topics with presenters:

  • TECH members suggested informing the National Assessment of Educational Progress (NAEP) on the results of pilot tests, specifically on the technology used.
  • Members asked whether SBAC provided feedback to pilot schools and Brandt replied that feedback (and all communications) will go through SEAs.
  • Sub-scores will be available in addition to scale scores.
  • Data definitions for data files will be shared and they are compliant with the Common Education Data Standards (CEDS).
  • Information on whether device type makes a difference in student performance will be very important to decisionmaking, especially since the test results are being rolled into teacher evaluations. Results suggest that familiarity with the device is key to student comfort with the electronic assessment. The presenters clarified that the device type research currently underway addresses a psychometric question about the validity of the test questions, not student performance.
  • The challenge is to get online tools (especially tools such as equation builders for math) into the curriculum to build familiarity.

Afternoon Session

TECH Elections
Mike Hopkins (Rochester School Department [NH]) was elected TECH Chair for 2014-15 and Dean Folkers (Nebraska Department of Education) was elected TECH Vice Chair for 2014-15.

Common Education Data Standards (CEDS) Update pdf file (1.08 MB)
Beth Young joined TECH for an update on CEDS. CEDS released a new virtual development community for those interested in participating in developing the Version 5 standard. The standard is currently in development, with a release planned for January 2015. Beth provided a demonstration of the myConnect feature that allows users to easily combine their maps with Connections.

Topics From the Floor
Jay Pennington (Iowa Department of Education) and Vice Chair Mike Hopkins (Rochester School Department [NH]) led separate SEA and LEA group discussions on issues in their organizations that are currently taking up resources or need to be highlighted.

LEA discussion topics included:

  • Competency-based education: This is an area of growth that will have a significant impact on LEA work.
  • Feeding data back into student systems from the assessment consortia.
  • Privacy/security concerns.
  • Data ethics: There is a need for an increased focus on adult behavior, especially among teachers. TECH members suggested updating the Forum data ethics document or the online courses to address teachers.

SEA discussion topics included:

  • Competency-based education: How does it impact accountability systems? What systems can track competencies versus courses?
  • Resource allocation at SEAs, especially human resources: Staff members are retiring and taking with them institutional history and knowledge.
  • Across-agency data: Data elements are needed for multiple uses, programs, and users, but often users fail to see how their data are applicable to others.

Wednesday, July 30, 2014

Morning Session

Graduation Rate Reporting Follow-up Discussion
Jay Pennington (Iowa Department of Education) led a follow-up discussion to address topics introduced during the Graduation Rate Reporting Joint Session. TECH members discussed

  • how students with disabilities are (or are not) included in graduation rates;
  • inconsistencies within states due to summer graduates and different methods of determining special education and regular education graduation requirements;
  • the importance of approving cohorts before calculating graduation rates;
  • the massive effort required to perfect a national rate; and
  • the importance of student data privacy.

TECH members noted that differences in data are often acceptable and that such differences are not a technology problem.

Meeting Review and Next Steps/Virtual Meeting Review
Jay Pennington (Iowa Department of Education) led a discussion on the current meeting, future meetings, and how to use virtual meetings during the year.

Suggested future topics include:

  • Graduation rates – comparing definitions across states
  • Data privacy – LEA and SEA guidance
  • Classroom-focused data
  • Learning Resource Metadata Initiative (LRMI) and content tagging

Virtual meeting recommendations include:

  • Set the dates of virtual meetings well ahead of time
  • Consider regional meetings or other partnerships
  • Record webinars and post them on the Forum website for those who could not attend
  • Hold smaller, more informal working groups
  • Be clear about which meetings are relevant for which staff (LEA vs. SEA)
  • Keep a master calendar that shows meetings and their relevance

School Climate Surveys
Isaiah O’Rear (NCES) provided a brief overview of the development of the School Climate Surveys (SCLS), a suite of surveys designed to collect data for the measurement of school climate. The SCLS web-based platform will be designed to enable schools, school districts, and states throughout the country to administer the survey suite to middle and high school students, teachers, and school staff. The surveys are designed so that LEAs can store data locally. Participation will be voluntary, and the surveys can be administered at any of three levels – SEA, LEA, or school. Data remain at the level of administration but can be shared if desired.

Isaiah previously discussed the SCLS with the Forum via a virtual meeting. Based on Forum recommendations made during that meeting, the SCLS design will allow users to augment the surveys with locally designed questions. NCES is currently testing the survey content. In March 2016, NCES will launch a national benchmark data collection for the surveys that will include 250 middle schools and 250 high schools.


Forum Closing Session

pdf file (333 KB)

Wednesday, July 30, 2014

National Assessment of Educational Progress (NAEP) and Race to the Top Assessments pdf file (330 KB)
Peggy Carr, Associate Commissioner of NCES, provided a presentation titled “NAEP and International Assessments as Independent Indicators of Student Performance.” Her presentation addressed federally sponsored assessments including NAEP, the Trends in International Mathematics and Science Study (TIMSS), the Program for International Student Assessment (PISA), and the Progress in International Reading Literacy Study (PIRLS). Each assessment plays a unique role in providing valuable information on student achievement. Peggy also provided Forum members with updates on the transition to technology-based assessment (TBA) and the transitions in the 2015 NAEP assessment. She explained that because technology has become ubiquitous, student assessments are making the transition to computer administration. Peggy concluded by encouraging Forum members to stay up to date as student assessment evolves across the country.

Standing Committee Progress Reports

Recognition of Forum Officers
The Forum thanked the 2013–2014 Officers for their service and presented them with plaques.

Recognition of Completed Projects
The Forum also presented plaques to members of the SCED Working Group to recognize their contributions to the development of the new Forum resource, the Forum Guide to School Courses for the Exchange of Data (SCED) Classification System.

Forum Election
Chair Lee Rabbitt presented the slate of proposed 2014–2015 officers for a vote. The slate was seconded and then the Forum voted to approve the following members as 2014–2015 officers:

Chair: Tom Purwin, Jersey City Public Schools (NJ)
Vice Chair: Peter Tamayo, Washington State Office of Superintendent of Public Instruction
Past Chair: Lee Rabbitt, Rhode Island Department of Elementary and Secondary Education
NESAC Chair: Jan Petro, Colorado Department of Education
NESAC Vice Chair: Kristina Martin, Macomb Intermediate School District (MI)
PPI Chair: John Kraman, Oklahoma State Department of Education
PPI Vice Chair: David Weinberger, Yonkers Public Schools (NY)
TECH Chair: Mike Hopkins, Rochester School District (NH)
TECH Vice Chair: Dean Folkers, Nebraska Department of Education

Closing Remarks
Lee Rabbitt reflected on her time as Forum Chair and commended her Forum colleagues for staying strong throughout an exciting year of transition and changes. Tom Purwin led the Forum in recognizing Lee for her leadership and noted that she has been instrumental in the development of Forum resources and the achievements of Forum working groups.


Steering Committee

Monday, July 28, 2014

Welcome and Agenda Review
Forum Chair Lee Rabbitt (Rhode Island Department of Elementary and Secondary Education) welcomed members of the committee and reviewed the agenda.

Sunday Review
Virtual Education Working Group
The Virtual Education Working Group met Sunday, July 27, prior to the Forum meeting. The group was convened to review the existing Forum Guide to Elementary/Secondary Virtual Education, and is developing a new Forum resource that will explore new facets of virtual education and offer best practices for collecting virtual education data in state and local education agencies (SEAs and LEAs). Virtual education is a dynamic topic, and the group is working to ensure that the document reflects up-to-date practices. The document is on track for January publication.

College and Career Ready Working Group
The College and Career Ready Working Group met Sunday, July 27. The group reviewed a draft document, discussed changes to use cases, and assigned members to expand and elaborate on each use case. The group intends to publish the document in January.

Alternative Socioeconomic Status (SES) Measures Working Group
The Alternative SES Measures Working Group also met on Sunday, July 27 to review a draft document. The discussion provided the group with additional clarity on the content of the resource and resulted in a name change to the document. The new resource will focus on measures of economic disadvantage.

Review of Monday’s Events
New Member Orientation
The Orientation was well attended by new members, who received an overview of the Forum and information on the work of the Standing Committees.

Opening Session Presentation by John Easton on Institute of Education Sciences Initiatives
John Easton’s presentation was well received by Forum members, who asked for the slides to be posted on the Forum’s website. The Steering Committee noted that, given Forum member interest in many of the topics, it would be useful to follow up with virtual meetings. Suggested virtual meeting topics included the use of tablets for assessments and transparency around the data the federal government collects from states and why those data are collected.

Family Educational Rights and Privacy Act (FERPA) Update (General Session)
Kathleen Styles provided an overview of recent FERPA policy changes and reminded Forum members of the tools her agency has available to assist SEAs and LEAs. Steering Committee members appreciated that the presentation addressed topics of interest to the Forum membership, including the relationship between effective data governance and privacy. Further, Steering Committee members appreciated that the presentation was reinforced by afternoon Standing Committee discussions with Dale King of the Family Policy Compliance Office (FPCO) and Baron Rodriguez of the Privacy Technical Assistance Center (PTAC).

Standing Committee Time
Chairs reported on the time spent in Standing Committees:

  • NESAC members received updates on the Virtual Education Working Group, College and Career Ready Working Group, and the School Courses for the Exchange of Data (SCED) Working Group. Other discussion and presentation topics included teacher evaluations, education data standards, data collection related to the Individuals with Disabilities Education Act (IDEA), and privacy issues.
  • PPI members received an update from the College and Career Ready Working Group and then heard presentations on topics including the U.S. Department of Education’s Data Inventory, privacy issues, predictors of college readiness using state data, and the Regional Educational Laboratory (REL) Northeast and Islands Data Catalog Tool for College and Career Readiness Indicators.
  • TECH members received an update on the Alternative SES Measures Working Group followed by a presentation on the U.S. Department of Education’s Data Inventory by Marilyn Seastrom, a discussion of privacy issues led by FPCO and PTAC, and a group conversation on collecting teacher attendance data. Steering Committee members noted that it would be helpful to have more discussion about the data used in REL presentations.

Tuesday, July 29, 2014

Review of Tuesday’s Events
Office of Educational Technology (OET) Initiatives (Joint Session)
Joseph South, Deputy Director of OET, gave a presentation on the future of technology in education that was very well received by the audience. Based on the high level of interest in the work underway at OET, Steering Committee members suggested a virtual meeting to follow up with Joseph.

Graduation Rate Reporting (Joint Session)
Bob Balfanz and Jen DePaoli from the Everyone Graduates Center of the Johns Hopkins University shared research on graduation rates, including how rates are calculated, how calculations have changed, and why those changes may have occurred. Forum members were engaged in conversations with each other and the presenters on different methods for calculating graduation rates and on whether it is worthwhile to focus on factors affecting the calculations or to compare rates. Steering Committee members expressed appreciation that the presenters did not show preference for one measure over another, and agreed that this is an important issue in many states.

Standing Committee Time
Standing Committee Chairs reported on the time spent in each committee:

  • NESAC members received an update on the Alternative SES Measures Working Group and reported that the conversation following the update was so lively that its time was extended. NESAC members participated in a follow-up discussion with Joseph South from OET, and presentations and discussions from Ross Santy on EDFacts, Ross and Abby Potts on the Civil Rights Data Collection (CRDC), and from Brandt Redd and Jessica Jackson on the work of the Partnership for Assessment of Readiness for College and Careers (PARCC) and the Smarter Balanced Assessment Consortium (SBAC).
  • PPI members received an update on the SCED Working Group and participated in presentations on Psychometric Characteristics of School Climate Indicators for Middle Schools from REL West, teacher evaluation in Rhode Island from Lee Rabbitt, a follow-up discussion with Joseph South from OET, and presentations and discussions from Ross Santy on EDFacts, Beth Young and Jim Campbell on CEDS Connect, and from Brandt Redd and Jessica Jackson on the assessment consortia.
  • TECH members participated in a follow-up discussion with Joseph South (OET) and noted that the discussion could have continued beyond the allotted time. Members also received an update on the assessment consortia from Brandt Redd and Jessica Jackson, with a follow-up discussion focused on SEA and LEA experiences with testing. The day ended with a breakout discussion for SEAs and LEAs on topics from members.

Communications Subcommittee Meeting
The Communications Subcommittee met on Monday evening to review the new draft of the Policies and Procedures Manual, the Member Handbook, and the template for one-page document overviews. Members agreed to submit comments electronically on the new Member Handbook and to meet again via conference call to discuss the documents.

Standing Committee Chairs reported the results of their elections. Proposed Chairs and Vice Chairs for the 2014–2015 year were:

  • NESAC Chair: Jan Petro, Colorado Department of Education
  • NESAC Vice Chair: Kristina Martin, Macomb Intermediate School District (MI)
  • PPI Chair: John Kraman, Oklahoma State Department of Education
  • PPI Vice Chair: David Weinberger, Yonkers Public Schools (NY)
  • TECH Chair: Michael Hopkins, Rochester School District (NH)
  • TECH Vice Chair: Dean Folkers, Nebraska Department of Education

Forum Election
The Steering Committee proposed Thomas Purwin (Jersey City Public Schools [NJ]) as the Forum Chair and Peter Tamayo (Washington State Office of Superintendent of Public Instruction) as Vice Chair for 2014–2015.

Wednesday, July 30, 2014

Welcome to New Steering Committee Chairs
Newly-elected Chair Thomas Purwin (Jersey City Public Schools [NJ]) welcomed new Steering Committee members to the meeting.

Review of Wednesday’s Events
Standing Committee Time
The Standing Committee Chairs reported on the day’s events and shared recommendations from the committees:

  • NESAC had a follow-up discussion on Tuesday’s graduation rate presentation and discussed the School Climate Survey with Isaiah O’Rear from NCES. The group also discussed strategies to improve virtual meetings.
  • PPI had discussions on the School Climate Survey and EDFacts, in addition to a follow-up discussion on the graduation rate presentation.
  • TECH had a follow-up discussion on Tuesday’s graduation rate presentation and discussed the School Climate Survey with Isaiah O’Rear. The group also discussed the need to address data ethics at the teacher level and suggested reviewing the Forum data ethics publication or online course with teacher-level information in mind. The Committee also discussed strategies to improve virtual meetings.

Closing Session (Joint Session)
Steering Committee members reported that the Closing Session went well. Members appreciated Peggy Carr’s presentation, especially the discussion around using tablets for NCES assessments.

Review of SCED Working Group Meeting on Tuesday, July 29, 2014
The SCED Working Group met Tuesday evening to work on SCED Version 3. The group reviewed new and updated codes in several subject areas and discussed the scope of SCED Version 4, which will focus on the mathematics and foreign language subject areas.

Other Issues
The Steering Committee discussed ways to make virtual meetings more useful and to improve attendance, especially since the Forum will meet in person only once each year. Strategies suggested by Steering Committee members included:

  • Set meeting dates well in advance, even if meeting content is not yet confirmed. This will allow Forum members to block out the time in their calendars. Members suggested 1-2 hour blocks each month, and/or 3-hour blocks quarterly.
  • Include small group meetings to allow members the opportunity to make connections and discuss “burning topics.”
  • Record the public meetings and post them on the Forum website so they are available to members who are unable to attend.
  • Tailor the agenda to the group and provide hourly agendas for longer meetings in case members need to step out for portions of the meeting.
  • Virtual meetings could include professional development, committee meetings, and conference planning. Steering Committee members are interested in follow-up discussions with the following meeting presenters:
    • Peggy Carr—follow-up to her discussion on the transition to electronic testing
    • Abby Potts—follow-up on changes to the Civil Rights Data Collection. Steering Committee members recommended setting a virtual meeting with her for each Standing Committee

Next Conference Call
The next Steering Committee conference call is scheduled for Friday, September 19, 2014, at 2:00 pm Eastern.


Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.