
Summer 2011 Forum Meeting Notes


National Forum on Education Statistics
July 25-27, 2011
Bethesda, Maryland



Professional Development Session: Primer on Growth Models and Their Application

Monday, July 25, 2011

Primer on Growth Models and Their Application MS PowerPoint (4.28 MB)

Neal Gibson, Director of the Arkansas Research Center, led a professional development session on types of growth models, the commonalities and differences among them, and ways in which data from growth models can be used. Dr. Gibson began by introducing the idea that growth models can provide a feedback loop to education administrators and practitioners – once data are available, they can be used to inform practice. States receiving Race to the Top (RTTT) or State Fiscal Stabilization Fund (SFSF) grants are required to use growth models. Dr. Gibson acknowledged that the implementation of these models is sometimes met with apprehension because they are often discussed in the context of teacher evaluations. Evaluating teachers based on student growth is controversial and can be problematic, but Dr. Gibson asserted that the usefulness of growth models extends far beyond hiring and firing decisions. Emphasizing that something does not have to be perfect to be useful, Dr. Gibson explained that while each of the three main types of growth models has constraints, each also has a particular utility. The discussion focused on four aspects of modeling growth: (1) assessment and scoring, (2) the main types of growth models and their application, (3) problems presented by the use of growth models, and (4) an example of the application of models. Dr. Gibson did not advocate any particular type of growth model, noting that in his home state of Arkansas, all three types are used.

Because assessment creation for growth model development is based on item response theory, Dr. Gibson provided an overview of item analysis, raw scores versus scale scores, and basic statistical measures necessary for understanding how growth is measured. Importantly for growth model development, vertically scaled assessments are adjusted so that a higher score is expected in higher grades. Tests can be either criterion referenced, in which performance is measured relative to a determined standard (pass-fail), or norm referenced, in which an individual's performance is measured relative to others in a group. In practice, the primary difference between these types of tests is in the way scores are interpreted. Dr. Gibson gave the ACT as an example of a test that can be both criterion and norm referenced: there is a set pass/fail score, and scores can also be interpreted relative to other test-takers.
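To make the criterion/norm distinction concrete, the sketch below applies both interpretations to one set of scores. This is a minimal illustration with an invented cut score and invented scores, not material from the session.

```python
# Criterion- vs. norm-referenced interpretation of the same raw scores.
# The cut score and all scores are invented for this sketch.

scores = {"Ana": 21, "Ben": 17, "Cal": 25, "Dee": 19}
CUT_SCORE = 20  # criterion-referenced: pass/fail against a fixed standard

group = sorted(scores.values())

def percentile(score):
    """Norm-referenced: percent of the group scoring below this score."""
    below = sum(1 for s in group if s < score)
    return 100 * below / len(group)

for name, score in scores.items():
    status = "pass" if score >= CUT_SCORE else "fail"
    print(f"{name}: {score} -> {status} (criterion), "
          f"{percentile(score):.0f}th percentile (norm)")
```

The same score of 21 "passes" under the criterion interpretation and sits at the 50th percentile under the norm interpretation, which is why a single test such as the ACT can be read both ways.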

The three main types of growth models are trajectory, value/transition table, and projection/linear. Trajectory models focus on a student's growth to proficiency, beginning with two scores: a current score and the score needed for future proficiency. The difference between the current and future scores is calculated and then divided evenly into annual target scores. Growth models that use value/transition tables create subdivisions of performance. A table provides the number of points earned for a transition from one subdivision to another, and schools using this model must earn a set number of points to attain Adequate Yearly Progress (AYP). Both the trajectory and value/transition table methods of modeling growth are criticized because they offer no incentive for student achievement beyond proficiency – a potential limitation that poses policy concerns. The projection/linear model of growth uses current and past scores to predict future performance, which allows a student to be counted as proficient even if one year's scores are low. The complexities of linear models, which hinge on regression, often make them difficult for stakeholders to understand. This raises the difficult policy question of whether it is fair to base accountability on formulas that are not easily explained or understood.
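As a rough illustration of how the three approaches differ, the sketch below applies each to invented numbers. The targets, performance bands, point values, and regression coefficients are assumptions chosen for demonstration, not any state's actual model.

```python
# Minimal sketches of the three growth-model types described above.
# All numbers (cut scores, bands, points, coefficients) are invented.

# 1. Trajectory: split the gap to proficiency into equal annual targets.
def trajectory_targets(current, proficient, years):
    step = (proficient - current) / years
    return [round(current + step * (y + 1)) for y in range(years)]

print(trajectory_targets(current=420, proficient=500, years=4))
# -> [440, 460, 480, 500]: one target score per year

# 2. Value/transition table: points awarded for moving between bands.
BANDS = [(0, "below basic"), (400, "basic"), (450, "proficient"),
         (500, "advanced")]
POINTS = {("below basic", "basic"): 50,
          ("basic", "proficient"): 75,
          ("proficient", "proficient"): 100}  # no credit beyond proficiency

def band(score):
    label = BANDS[0][1]
    for cut, name in BANDS:
        if score >= cut:
            label = name
    return label

print(POINTS.get((band(430), band(465)), 0))  # basic -> proficient: 75

# 3. Projection/linear: regress prior and current scores to predict a
# future score; a student projected to reach the cut can count as on
# track even if one year's score dips.
def projected_score(prior, current):
    return 0.45 * prior + 0.55 * current + 30  # invented coefficients

print(projected_score(prior=430, current=455))  # -> 473.75
```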

Forum members were curious as to whether it is better to have many years of data to develop a growth model. Dr. Gibson responded that it depends on the desired results. Generally, using many years of data may be the best method of predicting future outcomes. Yet, the inclusion of many years of data makes the model less sensitive to changes in a student's life that may cause a spike or fall in growth, and drops in growth caused by weak teachers may be hidden. Therefore, including only a few years of data may be a better option if the goal of the model is to determine whether changes are occurring over a short period of time (e.g., from one school year to the next).

For more information on results in states using growth models, Dr. Gibson recommended the U.S. Department of Education's report, "Final Report on the Evaluation of the Growth Model Pilot Project." For different viewpoints on the question of teacher evaluations using growth models, Dr. Gibson recommended the Economic Policy Institute's Briefing Paper #278, "Problems with the Use of Student Test Scores to Evaluate Teachers," and the Brookings Brown Center paper, "Evaluating Teachers: The Important Role of Value-Added." Dr. Gibson also introduced the MET (Measures of Effective Teaching) Project – a resource for anyone interested in learning more about the measurement of effective teaching.

Controversy over growth model use is compounded by the fact that these models are dependent upon the ability to track students over time. This is especially difficult at the teacher level, where determining teacher of record status has proven to be challenging. Dr. Gibson expressed the opinion that many of the concerns around growth models are the result of anxiety about their uses. This is unfortunate because growth models can provide useful data. To demonstrate, Dr. Gibson provided examples of the types of data gathered from growth models. In one vivid example, charts detailing student growth and scale scores by subject and teacher showed that a particular teacher who provided instruction in both math and reading had students with notably higher growth in math than in reading. A worthwhile use of these data would be for the school to send students with math difficulties to this teacher, while offering the teacher professional development to improve his or her efficacy in teaching reading skills.
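A toy version of that kind of analysis might look like the following; the growth values are invented, and "growth" stands in for whatever metric a state's model produces.

```python
# Toy version of the example above: compare one teacher's student growth
# by subject to spot strengths and professional development needs.
# All values are invented.
from statistics import median

growth = {  # (teacher, subject) -> growth values for that teacher's students
    ("T20", "math"): [62, 71, 68, 75],
    ("T20", "reading"): [41, 38, 45, 40],
}

for (teacher, subject), values in growth.items():
    print(teacher, subject, "median growth:", median(values))
# T20's math growth is notably higher than reading growth, suggesting the
# school route students who struggle in math to T20 while supporting
# T20's reading instruction.
```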

Opening Session

Monday, July 25, 2011

Forum Agenda Review and Introduction MS PowerPoint (1 MB)
Forum Chair Kathy Gosa (Kansas State Department of Education) welcomed Forum members to the 2011 Summer Forum Meeting in Bethesda, Maryland. She commented on the excellent quality of the morning's professional development session, and expressed her wish for a productive and enjoyable summer meeting. Kathy then introduced the Forum officers and welcomed the following new members to the meeting:

  • Jeff Baker, New York State Education Department
  • Laurel Ballard, Wyoming Department of Education
  • Kathleen Barfield, Regional Educational Laboratory – Southwest
  • Sarah Cox, Arkansas Department of Education
  • Robert Curtin, Massachusetts Department of Elementary and Secondary Education
  • Daniel Chuhta, Portland Public Schools (ME)
  • Daniel French, Bennington Rutland Supervisory Union (VT)
  • David Goldstein, Data Quality Campaign
  • Sally Gordon, Minnesota Department of Education
  • Kent Hatcher, Indiana Department of Education
  • James Hawbaker, Appleton Area School District (WI)
  • Bob Jensen, Sioux Falls School District (SD)
  • John Kraman, Oklahoma State Department of Education
  • Ellen Mandinach, Regional Educational Laboratory – West
  • Ruth Neild, National Center for Education Evaluation and Regional Assistance
  • Esmeray Ozdemir, Nevada Department of Education
  • Michael Pinkston, Lawrence County School District (AR)
  • Joyce Popp, Idaho State Department of Education
  • Eli Pristoop, Bill & Melinda Gates Foundation
  • John Rundle, Royal Valley USD #337 (KS)
  • Annette Severson, Colorado Department of Education
  • Ryan Smith, Green Hills Area Education Agency (IA)

Following introductions, Kathy reviewed the mission of the Forum and announced the release of three new Forum publications: the Forum Guide to Ensuring Equal Access to Education Websites; the fourth installment of the Traveling Through Time: The Forum Guide to Longitudinal Data Systems series, Book 4: Advanced LDS Usage; and the Forum Guide to Crime, Violence, and Discipline Incident Data. Kathy reviewed the meeting agenda, and shared with Forum members the success of the Forum in reaching constituents through the website and publications. Visits to the Forum website hit a new peak in January 2011, with 17,842 visits. In June, website visits were well over 11,000, and despite monthly variations, overall website visits are steadily trending upwards. In June, Forum publications were downloaded almost 7,000 times.

Welcome to the Summer 2011 Forum
John Easton, Director of the Institute of Education Sciences (IES), thanked Forum members for their work, noting the excellence of Forum publications. Dr. Easton explained his vision for IES, which is responsible for statistics, evaluation, and research. IES, he explained, has a reputation for rigorous research, and his goal is to ensure that this research is also relevant and useful to the field. He is committed to using data to help practitioners and policymakers, and to that end IES encourages and supports partnerships, especially through research grant programs. Some IES grants explicitly require partnerships, and others offer additional points for engaging practitioners in research. Ruth Neild, a Forum member representing the National Center for Education Evaluation and Regional Assistance (NCEE), is now responsible for the oversight of the ten Regional Educational Laboratories (RELs). IES is pursuing its goal of research partnerships through the RELs, which are required to form research alliances with various stakeholders. RELs are also increasingly using statewide longitudinal data systems. Dr. Easton has met with the Forum's SEA Data Use Working Group, which is exploring how SEAs can work more effectively with researchers. It is crucial that education data return to education institutions in the form of useful and relevant research. Dr. Easton emphasized that quality data analysis, research, and evaluation have the power to directly benefit children in schools.

Assessment Consortia Panel
The U.S. Department of Education's Comprehensive Assessment Systems grant supports the development of assessment systems by consortia of states. In 2010, the SMARTER Balanced Assessment Consortium (SBAC) and the Partnership for the Assessment of Readiness for College and Careers (PARCC) earned competitive grants. The Forum convened a panel to discuss the work of the two consortia. Forum Vice Chair David Weinberger moderated the discussion.

Tom Foster, Kansas State Department of Education, reported on SBAC MS PowerPoint (2.69 MB), which is working to determine what an assessment system can do to help ensure that students leave high school college and career ready (CCR). In addition to this goal, SBAC aims to develop "comprehensive and innovative assessments for grades 3-8 and high school in English language arts and mathematics aligned to the Common Core State Standards," and to make assessments operational in consortium states in the 2014–2015 school year. The consortium currently includes 29 states, 18 of which are governing states, with the remainder serving as advisory states. Work groups meet to address specific topics of concern to the consortium, ranging from assessment design to technology and the common core standards. The consortium has partnered with institutions of higher education to create CCR benchmarks and, in turn, break down silos between K-12 and higher education. Tom reviewed the principles underlying the consortium's theory of action and the three system components: a computer adaptive summative assessment, a computer adaptive interim assessment, and formative processes and tools. Key features of the SBAC assessments are computer adaptive testing and tailored, online reporting. Tom expressed the goal that testing should be not an isolated event but part of the learning process. He provided a model of the system and a timeline, which anticipates that the assessments will be piloted in 2014. Tom also listed the benefits of a multi-state assessment system, including lowered costs and increased capabilities, greater control of systems, and better services for students. More information is available at the SBAC website.

Wes Bruce, Indiana Department of Education, reported on PARCC MS PowerPoint (1.54 MB). Speaking generally about the consortia, he explained that such partnerships promote collaboration and allow members to pilot assessments and to innovate at a level that would have been impossible without the grants. At heart, the consortia are about instruction, not tests. Assessments for both consortia will be online, which is new for many states. PARCC has 24 member states, and its work is very much state-led. The overriding task of PARCC is to build a pathway to college and career readiness for all students. PARCC is creating high-quality assessments for grades three through eleven. Unlike SBAC, the PARCC assessment is not adaptive. PARCC assessment items look more like classroom instruction than test items. Through the use of feedback loops, PARCC aims to support educators in classrooms. To this end, assessments are designed to provide information on student performance throughout the school year. PARCC is also promoting the improved use of technology in assessment, and automated scoring will allow a quick turnaround on data to benefit instruction. PARCC aims to advance accountability at all levels of education. More information is available at the PARCC website.

Joint Session: NAEP 12th Grade Reading & Math Data as Indicators of Academic Preparedness and Integrated Postsecondary Education Data Systems Update

Tuesday, July 26, 2011

NAEP 12th Grade Reading & Math Data as Indicators of Academic Preparedness MS PowerPoint (2.24 MB)
Cornelia Orr, Executive Director of the National Assessment Governing Board (NAGB), joined the Forum to discuss the National Assessment of Educational Progress (NAEP), also known as the Nation's Report Card. Dr. Orr explained that NAGB has been concerned with the achievement and engagement of 12th graders for almost a decade. A technical panel convened to examine 12th grade preparedness research determined that such research was feasible. The panel defined preparedness as it specifically relates to either college or the workforce, with college preparedness defined as qualifying for placement into entry-level college courses without the need for remediation, and workforce preparedness defined as qualifying for a job training program without the need for remediation. The panel differentiated between preparedness, which focuses on academics, and readiness, which encompasses non-academic factors. Multiple types of research studies are currently in various stages of completion, including content alignment, judgmental standard setting, statistical relationships, postsecondary institution surveys, and benchmarking studies. At the direction of the Technical Panel, researchers are also studying the relationships between studies. Currently, the data used in the research include the 2009 NAEP Reading and Math results. In addition to national data, eleven states also allowed for the analysis of state-level results. Dr. Orr stated that the project is looking for linkages to state data systems. Next steps in the process include developing a summary report of validity evidence and preparedness reports, with an anticipated release date in early 2012. Dr. Orr shared findings thus far, including that NAEP content is similar to that of other indicators of college preparedness (the ACT and SAT, for example) and that some content is similar to WorkKeys. States can use this research as a resource for standard setting, and as a reference for state readiness and preparedness definitions. More information is available at the NAGB website.

Integrated Postsecondary Education Data Systems (IPEDS) Update MS PowerPoint (2.16 MB)
Elise Miller, IPEDS Program Director at NCES, visited the Forum to provide an update on the work of IPEDS. IPEDS is a system of surveys designed to collect data from all providers of postsecondary education, including private, public, for-profit, and non-profit institutions. IPEDS began in 1987 and currently collects data from nearly 7,000 institutions. IPEDS data collection is guided by three principles: (1) "data elements identify characteristics common to all providers," (2) elements are "interrelated to avoid duplicative reporting and reduce data burden," and (3) components are "compatible, but adapted to meet the needs and characteristics of different sectors of postsecondary education providers." Ms. Miller explained that IPEDS followed a previous study that started in 1965, so some trend data are available. Participation is mandatory for institutions that fall within the legal parameters of the program, and response rates have improved since compliance enforcement began in 2003. Ms. Miller reviewed the components of IPEDS and the release dates for data, noting that the data are at the institution level, not the individual student level. IPEDS data are publicly available in three forms: the College Navigator consumer college search site, feedback reports provided to participating institutions, and the IPEDS data center. In addition, a help desk provides free assistance with IPEDS data. Ms. Miller reviewed a list of well-known foundations and news sources that use IPEDS data and reports. Recent IPEDS work includes making new information available to students, such as FAFSA information. The National Postsecondary Data Consortium is now providing guidance to institutions of higher education on making federal disclosures available on their websites. More information on IPEDS is available on the IPEDS website.

Joint Session: Implementing the Teacher-Student Link

Wednesday, July 27, 2011

Implementing the Teacher-Student Link MS PowerPoint (9.17 MB)
Hella Bel Hadj Amor and Anna Gregory met with the Forum to discuss implementation of the teacher-student link in the District of Columbia Public Schools (DCPS). Dr. Bel Hadj Amor, now with the American Institutes for Research (AIR), spent three years working with DCPS, and initiated the use of value-added models. Anna Gregory is the Director of IMPACT Operations at DCPS. The presentation focused on roster verification. Dr. Bel Hadj Amor reported that the goal of the presentation was to assist others with this task. The importance of roster verification was evident to DCPS in 2008–2009, when Mathematica Policy Research, a firm hired to research value-added approaches, found roster mismatches between teachers and the students they were teaching.

Ms. Gregory provided an overview of IMPACT, DCPS' performance management system for school-based personnel. The goal of IMPACT is to "create a system in which every parent would be satisfied randomly assigning their children to any classroom in DCPS." In 2007, Ms. Gregory explained, only 12% of DCPS students were proficient in reading (as measured by NAEP), yet 95% of teachers met or exceeded expectations. The goals of IMPACT are to reward high-performing teachers, provide useful feedback, and transition out low-performing teachers. Value-added is only one component of the overall evaluation system and does not affect all teachers. Of the 23 IMPACT staff categories, 7 groups are teachers. Of these, only group 1, composed of 4th–8th grade reading and math teachers (15% of the teaching force), has value-added data included in evaluations. IMPACT includes four ratings: highly effective, effective, minimally effective, and ineffective. Rewards and sanctions accompany each level of classification.

Implementation of DCPS' current roster verification was the result of a multi-year process involving nine steps, beginning with a pilot program and concluding with final data. Final data undergo three rounds of review, and participating teachers are asked to verify their rosters. DCPS established four measures for determining whether roster confirmation was successful: (1) a high percentage of educator participation; (2) inclusion of all critical data, such as subject, roster, and dosage (the estimate of the teacher's contribution to the student's instruction); (3) a minimal time commitment; and (4) a minimal need for support. Technical elements of roster confirmation include designing a system with different user profiles, developing an online tool, and establishing ways to identify and manipulate "seed" data (the initial roster data, which are retrieved from other data sources such as staff lists and student records).
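As a rough picture of what seeding might involve, consider the sketch below. The table shapes, field names, and default values are invented for illustration and are not drawn from the DCPS system.

```python
# Hypothetical sketch of building "seed" roster records from existing
# sources (a staff list and student enrollment records), as described
# above. All field names and values are invented.

staff = [{"teacher_id": "T01", "section": "MATH-6A"}]
enrollment = [{"student_id": "S100", "section": "MATH-6A"},
              {"student_id": "S101", "section": "MATH-6A"}]

def seed_rosters(staff, enrollment):
    """Join staff to students by section; teachers later verify each row,
    including dosage (their estimated share of the student's instruction)."""
    rosters = []
    for t in staff:
        for e in enrollment:
            if e["section"] == t["section"]:
                rosters.append({"teacher_id": t["teacher_id"],
                                "student_id": e["student_id"],
                                "dosage": 1.0,       # default; adjusted on review
                                "verified": False})  # flipped only by the teacher
    return rosters

print(seed_rosters(staff, enrollment))
```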

After reviewing the key technical elements of roster confirmation, the presenters reviewed keys to its success, as well as threats. The first key to success is to define priorities (e.g., what data will teachers verify, and how will this be done?). This is followed by establishing a timeline, deciding who should be involved in the process, and determining the number of rounds of roster verification needed. These steps may involve legal and union issues that must be addressed. Dr. Bel Hadj Amor emphasized that no changes to the roster are made without the approval of teachers. The first threat to success is an unclear understanding of the importance of data accuracy, especially among the data team. Evaluations relying on roster verification have high-stakes results, and data errors could have negative consequences. A second is a lack of clarity in messaging and instructions, especially in explaining complex concepts such as "dosage" and alerting participants to the high level of detail needed for accuracy. DCPS would like to improve system performance by improving the process for verifying teacher entries, as well as the processes by which teachers verify their rosters.

The Forum audience asked questions about how dosage was determined, how the system combats fraud, and whether attendance rates are considered. Several members asked about the compensation system and teacher response to its implementation. The presenters explained the rationale driving the compensation system, and also explained that teachers have to opt in to the pay for performance system. For further information, a technical report describing value-added can be found on the DCPS website.

National Education Statistics Agenda Committee (NESAC) Meeting Summary

Monday, July 25, 2011

Afternoon Session

Welcome, Introductions, and Agenda Review
Pat Sullivan, Texas Education Agency, NESAC Chair, welcomed the committee members to Bethesda and to the NESAC Subcommittee. Members introduced themselves and Pat took a moment to remind associate members, business partners, and vendors that while their attendance in these sessions is appreciated, they may risk disqualification in certain state, local, or federal competitive bidding processes by being part of sensitive conversations during the subcommittee proceedings. Pat then reviewed the meeting agenda and the proceedings of the winter meeting in Austin.

Investing in Innovation Grants (i3) MS PowerPoint (9.13 MB)
Erin McHugh, Office of Innovation and Improvement, U.S. Department of Education, joined the committee to share information and updates on the Investing in Innovation Grants (i3). The i3 grants were originally authorized under the American Recovery and Reinvestment Act; the first grants were awarded in 2010 and a second competition will be held in 2011. The purpose of i3 is to provide competitive grants to applicants with a record of improving student achievement, attainment, or retention in order to expand the implementation of, and investment in, innovative practices that are demonstrated to have an impact on:

  1. improving student achievement or student growth, closing achievement gaps, decreasing dropout rates, increasing high school graduation rates; or
  2. increasing college enrollment and completion rates.

The Office of Innovation and Improvement is using the i3 grants to build a portfolio of information that can serve as a solutions resource for schools and districts. There are three types of i3 grants: Development, Validation, and Scale Up. All grantees are required to share lessons learned and any products developed using i3 grant funds. Grantees also participate in a "community of practice" group through the Office of Innovation. The program includes significant technical assistance provided by the Institute of Education Sciences (IES).

The 2010 i3 grants fall into one of four categories: Teacher and Principal Effectiveness; Enhanced Data Systems; College- and Career-Ready Standards and Assessments; or Improving Achievement in Persistently Low-Performing Schools. Eighteen percent of applications in 2010 fell under the data priority, which funded projects such as Project READs, the School of One, Engage ME: PLEASE, the Expansion and Evaluation of Education Pilot Program (EPP), and the Educator Evaluation for Excellence in Teaching and Learning (E3TL) Consortium. The abstracts and contracts for all grants can be found on the i3 website.

The 2011 grant competition will be slightly different. The competition will no longer include data systems as an absolute priority, although data systems will be addressed through the other priorities. The competition will also include two additional priorities: a focus on Science, Technology, Engineering and Math (STEM) and increasing graduation rates in rural areas. The application deadline for the 2011 competition was August 2, 2011.

NESAC Discussion: Assessment Consortia
Monday morning's general session included a presentation with representatives of the two Race to the Top Assessment Consortia. Tom Foster from the Kansas State Department of Education spoke on behalf of the SMARTER Balanced Assessment Consortium (SBAC), and Wes Bruce from the Indiana Department of Education represented the Partnership for the Assessment of Readiness for College and Careers (PARCC). Both representatives joined NESAC to answer questions and engage in discussion with the committee members.

  • The assessment development teams and item writers will work with the authors of the Common Core State Standards to develop tests and test items. Both consortia are committed to assessments that link well with the standards and accurately assess student knowledge on the Common Core State Standards.
  • The goal of both consortia is to engage students and find out what they know. The technology of test design is evolving in exciting ways to be able to truly engage students in their learning through assessments.
  • Operationally, the assessment consortia are determined to work closely to ensure comparability across states. States will maintain control over their own procurement and vendor relations as they begin to implement either assessment system.
  • There are several small, but significant, differences between the two consortia. SBAC will use a computer adaptive test, while PARCC will use a computer-based system. PARCC has a greater focus on through-course assessments, while SBAC administers its summative assessment at the end of the term. SBAC is putting a large emphasis on teacher supports, processes, and tools.
  • Each assessment consortium is focused on making testing part of the instructional process.

Creating Actionable Reports and Developing Dashboards
Kay Ihlenfeldt, Wisconsin Department of Education, joined the committee to discuss Wisconsin's dashboard development and style guides. The use of dashboards to get data into the hands of educators is fundamental to using data to improve student learning. Wisconsin started its dashboard project in April. Wisconsin will allow role-based access to different levels of data, starting with access to assessment and student growth percentiles, enrollments, graduation cohort, attendance, and postsecondary data (based on NSC data). An in-house development team will work with a local vendor to flesh out reports focused on both growth and postsecondary achievement. The goal of the project is to pull in data from the state's longitudinal data system and support more sophisticated analyses, such as dropout prediction, early warning systems, and "recovery" tools that let teachers and principals know what programs are available and for which types of students they work. The security risks that come with classroom-level displays will be handled through professional development.
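A minimal sketch of role-based access of the kind described might look like the following; the role names and data domains are invented and are not Wisconsin's actual scheme.

```python
# Sketch of role-based access to dashboard data domains, as described
# above. Roles and domains are invented for illustration.

ACCESS = {
    "teacher":   {"assessment", "growth", "attendance"},
    "principal": {"assessment", "growth", "attendance", "cohort"},
    "district":  {"assessment", "growth", "attendance", "cohort",
                  "postsecondary"},
}

def can_view(role, domain):
    """Return True if the given role may see the given data domain."""
    return domain in ACCESS.get(role, set())

print(can_view("teacher", "postsecondary"))  # -> False
```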

Tuesday, July 26, 2011

Morning Session

Common Education Data Standards MS PowerPoint (3.76 MB)
Tate Gould, NCES, and Beth Young, QIP, provided an update on the Common Education Data Standards project. Draft One of the Version 2 standard was released on July 18th for public comment. Over 200 comments were received in the first week. The CEDS Stakeholder Group has expanded to include postsecondary and early learning stakeholders. The K-12 scope will include elements needed to support assessments at the local and state levels as well as items for federal reporting and metrics/policy questions. NESAC members had the following comments and questions:

  • Will CEDS be a requirement of SLDS grants in the future?
  • What is the theory of action or adoption? Is there a difference?
  • NESAC members are hopeful for more tools and supplemental materials such as examples of crosswalks to CEDS.
  • What is the motivation for states to adopt? Is it just the need to increase the sharing of data across district and state lines?
  • How can this be shared with vendors in the development stages, and how can the Forum members help with this?
  • What is the value of sharing State Core mapping with LEAs?

NESAC Discussion: IPEDS and P-20 Data Systems
Elise Miller, the lead for the Integrated Postsecondary Education Data System (IPEDS) at the National Center for Education Statistics (NCES), presented at a Forum joint session on the IPEDS system and the work NCES is doing to assist states in creating their P-20 data systems. IPEDS collects data either directly from individual institutions or through a state-level coordinating agency, depending on the state. IPEDS, like K-12 systems, has unique challenges and is working toward better data quality, better public reporting, and better use of data to improve student achievement. Ms. Miller joined the NESAC subcommittee to continue the discussion and answer the committee members' questions. The discussion touched on the following points:

  • IPEDS elements are collected every other year in an effort to relieve data burden. The office is working diligently to adjust its burden estimates by being more mindful of the effort involved at the campus level as well as the effort required of survey staff.
  • IPEDS is working on better collaboration with the vendor community in the higher education space. Currently institutions do not always find vendor solutions that meet their needs.
  • The IPEDS office is working with the CEDS project. This work is aimed at reducing burden and increasing data quality in higher education.
  • NESAC members asked about the likelihood of unit record reporting in higher education. Elise reported that a change of that nature would require Congressional action to modify the Higher Education Act. There is some demand for these changes that is countered by privacy and other concerns. There are no immediate plans for this type of change.
  • IPEDS is making a greater effort to release data to the public. IPEDS has a longitudinal study that gives a good national level picture of student enrollment and achievement. However, this report is not available at the state, institution, or program level.
  • The SFSF assurances include significant data collection requirements for states and require that states publish information about postsecondary enrollment and success for all students. Currently, the IPEDS system is not aligned to the SFSF indicators and is not sufficient for reporting on these indicators. IPEDS would need to collect individual student level data in order to be able to answer these requests.
  • Data quality is an ongoing endeavor for IPEDS. The office is continually working to refine its system to improve data quality. Increased visibility of IPEDS data has been helpful in increasing the quality of information reported.

Professional Development Follow Up: Growth Models
On Monday morning, Forum members participated in a professional development session on using growth models. On Tuesday, Neal Gibson, Arkansas Research Center, was joined by Chris Woolard, Ohio Department of Education, to continue this discussion, answer questions, and discuss the merits of several different growth models.

Arkansas has been using the Colorado growth model, while Ohio uses the Sanders system for measuring growth (a value-added model). Each has experienced challenges in defining what elements are to be included in the growth measure and in deciding how these systems are to be used for high-stakes decisions. There is a difference between data for accountability purposes and data for diagnostic needs and continual improvement; when carefully planned and developed, growth models can assist with both. A significant challenge in developing growth models is defining the teacher-student data link and assigning teacher of record status and attribution.

NESAC Privacy Discussion
Kathleen Styles, the U.S. Department of Education's Chief Privacy Officer (CPO), joined the committee for a conversation about privacy. As the Department's first CPO, she is tasked with managing the Family Educational Rights and Privacy Act (FERPA), data management and data release policies, and the Privacy Technical Assistance Center (PTAC).

Ms. Styles is committed to finalizing the FERPA regulations as soon as possible and releasing them by the end of the calendar year, an ambitious timeline. Input submitted during the public comment period, which ended May 23, 2011, is being reviewed, and the draft regulations are being refined accordingly. The Department aims to make the final regulations helpful to states and districts as they use their data systems to share information in appropriate ways. The Department hopes to issue non-regulatory guidance that will further clarify some of the more complex aspects of the FERPA law.

NESAC members were interested in hearing how PTAC could be leveraged to help inform staff of FERPA regulations at the district and school building level, where understanding of the law is sometimes lacking. The Department is hoping to expand the PTAC activities to focus on the local level depending on the availability of resources.

NCES SLDS Technical Brief #3, Statistical Methods for Protecting Personally Identifiable Information in Aggregate Reporting, continues to generate confusion and questions. The Department will look into refining the briefs to make sure they are useful for states and districts.

NESAC Elections
The committee held leadership elections for the next year. Cheryl McMurtrey, Mountain Home School District (ID), was nominated by Linda Rocks as the committee chair. That nomination was seconded and the committee voted unanimously. Raymond Martin, Connecticut State Department of Education, received a nomination for vice chair by David Weinberger, Yonkers Public Schools (NY). The nomination was seconded and Mr. Martin was also elected by a unanimous vote.

Afternoon Session

Civil Rights Data Collection
U.S. Department of Education representatives Ross Santy, Office of Planning, Evaluation and Policy Development, and Rebecca Fitch, Office for Civil Rights (OCR), joined the committee to discuss the Civil Rights Data Collection (CRDC), including the collection process and changes to the survey based on state and district feedback.

Over the past few years, OCR has been working with EDFacts to find ways to leverage the EDEN Submission System to ease the burden of reporting the CRDC. To assist this effort, the Department has convened a workgroup of LEA and SEA Forum representatives. This workgroup has provided valuable input and feedback, which has shaped the changes to the upcoming collection.

There will not be significant changes to the survey between the 2009–2010 collection cycle and the 2010–2011 collection. The Office of Management and Budget (OMB) has cleared the Department to issue a universe collection for the 2010–2011 CRDC. The collection is typically a sample collection, expecting about 7,000 districts to respond. However, OCR does collect data in all districts once every 10 years.

To ease burden at the district level, the CRDC will prepopulate the survey with data elements already submitted to EDFacts. Wherever possible, the CRDC will access data elements that districts have already submitted to their states. Elements will all come from the same school year, a change from previous surveys, which requested some information that required the previous year's data. OCR is working to improve outreach strategies to districts and states. A letter about this collection will be sent to the superintendents listed in the Common Core of Data, asking them to identify the correct contact within their districts. Where possible, the letter will also be sent to the previous respondent. NESAC members suggested that OCR draft a more detailed letter, including frequently asked questions, for those districts that have not been surveyed recently.

For more information on the CRDC and results of previous surveys, visit ocrdata.ed.gov.

Teacher Student Data Link: Making It Happen
The committee spent a significant portion of Tuesday afternoon discussing issues related to implementing the teacher-student data link and defining the teacher of record (ToR).

Nancy Wilson, Center for Education Leadership and Technology (CELT), joined the committee to discuss the work CELT has been doing in five states with funding from the Bill & Melinda Gates Foundation. The Teacher Student Data Link MS PowerPoint (253 KB) (TSDL) project is set to wrap up in December 2011 and states have made significant progress in implementing the link and defining the ToR. Some of the key takeaways from the work have been:

  • States found that the most important discussion related to this project was defining the use and purpose of making the link.
  • With the high stakes nature of the teacher and student data link, states must have strong validity and reliability within the system.
  • The importance of governance cannot be overstated. Who owns the data for the link is an essential question that must be answered.
  • Providing more access to these data – directly to teachers, parents, and students – helps to increase data quality.

Next, the committee heard from Linda Rocks, Bossier Parish Schools (LA), who described the district's roster verification system MS PowerPoint (2.82 MB). While developing this system was one of the easiest tasks in Louisiana's teacher-student data link project, it was also one of the most important.

The system allows teachers to view and edit the list of students identified as being enrolled in their classes. Any changes are verified, and conflicts are resolved by the building principal. The SEA provides state-level verification, making sure that a student who is "removed" by one school or teacher is "caught" by another. The state currently faces challenges in a few areas, including virtual education and highly migrant populations; these are ongoing challenges for the ToR system that Louisiana will continue to work on and refine. The TSDL project and the ToR conversations were helpful in bringing disparate silos together at both the state and district levels.
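One way to picture the state-level "removed/caught" check is the sketch below; the data shapes are invented and do not represent Louisiana's actual system.

```python
# Sketch of the state-level reconciliation described above: any student
# "removed" from one roster should be "caught" (claimed) on another.
# Identifiers are invented.

removals = {"S200", "S201"}        # students removed from some roster
claims = {"S200": "School B"}      # students claimed on another roster

orphans = {s for s in removals if s not in claims}
if orphans:
    print("Flag for SEA follow-up:", sorted(orphans))  # -> ['S201']
```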

Susan Williams, Virginia Department of Education, presented on the work Virginia is doing to link students to teachers MS PowerPoint (770 KB). Virginia did not participate in the Bill & Melinda Gates funded TSDL project. Virginia is using the Forum-developed SCED codes for both secondary and prior-to-secondary education to link students to classes and sections and then link those classes and sections to teachers. The system is able to handle both simple linkages, such as every student to a teacher of record, and more complicated ones (e.g., special cases and multiple-teacher situations).

In defining how this linkage would play out for high-stakes decision making, staff at the Virginia Department of Education put on their "parent hats." When attending a parent-teacher conference, no parent would want to hear, "Well, I don't know what's going on with your child, I only had him for 45% of the time." This thinking determined the evaluation policy. In Virginia, if a teacher has a student for more than 20 instructional hours, that student is included in the teacher's value-added measure. It does not matter whether the teacher is a collaboration teacher, a team teacher, or a resource teacher; all team teachers are responsible for that child's learning. One student can, and will, be included in more than one teacher's value-added score.
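A minimal sketch of the 20-hour inclusion rule as described, with invented records and an illustrative data shape:

```python
# Sketch of Virginia's inclusion rule as described above: any teacher
# with more than 20 instructional hours for a student includes that
# student in their value-added measure, so one student can count toward
# several teachers. Records are invented.

instruction_hours = [
    ("S300", "T10", 90),   # primary teacher of record
    ("S300", "T11", 35),   # team teacher -> also accountable
    ("S300", "T12", 12),   # below threshold -> excluded
]

THRESHOLD_HOURS = 20

accountable = [(student, teacher)
               for student, teacher, hours in instruction_hours
               if hours > THRESHOLD_HOURS]
print(accountable)  # -> [('S300', 'T10'), ('S300', 'T11')]
```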

Teachers will have access to data on their past year's students based on the rosters submitted. Roster verification is handled in the local student information systems, and data are resubmitted to the state through the master schedule collection. The use and visibility of these data are continually improving data quality.

For more information, including the Master Schedule Collection layout, visit www.doe.virginia.gov.

Wednesday, July 27, 2011

Morning Session

Emerging Lessons from Race to the Top MS PowerPoint (188 KB)
The Race to the Top grant competition placed significant emphasis on using data systems to support classroom instruction and student achievement. The lessons learned by the RTT states will be valuable for all states as they refine and improve their data systems. Rob Curtin, Massachusetts Department of Elementary and Secondary Education, and Robert Swiggum, Georgia Department of Education, shared their experiences and early lessons in implementing the bold plans put forward in their RTT grant applications.

  • The RTT awards cannot be approached in silos. The implementation teams for the RTT plans must be cross-agency to reflect the holistic nature of the plans. In both Massachusetts and Georgia, RTT funding and planning had to be incorporated into the existing Statewide Longitudinal Data Systems grants.
  • A challenge in Massachusetts has been working with early childhood and postsecondary agencies and sharing data across systems. Massachusetts has a very conservative view of FERPA. The state established an executive education agency that sits atop the SEA, early childhood, and postsecondary agencies and is able to act as a third party to help develop memoranda of understanding.
  • A common communications challenge is making sure that districts understand that "burden reduction" does not mean a reduction in jobs. Lessening the reporting burden means more work in ensuring data quality and access.

Workgroup Proposal: Student-Educator Data Link
The committee drafted a proposal to create a working group on the student-educator data link. This proposal will be brought to the Forum Steering Committee and NCES for approval.

NESAC Discussion: NCEE and the Forum
Ruth Neild, National Center for Education Evaluation and Regional Assistance (NCEE) at the U.S. Department of Education, joined the committee to discuss areas of collaboration between NCES, the Forum, and NCEE. Ms. Neild is new to NCEE, having worked previously in a research capacity at Johns Hopkins University and the University of Pennsylvania. As a researcher in Philadelphia, Ms. Neild was tasked with developing a graduation and dropout rate calculation for the city. She relied on the Forum publication Accounting for Every Student: A Taxonomy for Standard Student Exit Codes, which was instrumental in defining the parameters of the graduation/dropout rate and in building credibility for her study.

NCEE is the newest federal member of the Forum. As a part of the U.S. Department of Education, NCEE supports and undertakes a wide range of education research activities, including evaluation, technical assistance, and the dissemination of information from evaluation and research. NCEE is also responsible for promoting the use of scientifically valid research in education. In addition to contributing to the Forum as a federal member, NCEE is collaborating with the Forum on the topic of data use. Discussions between NCEE, NCES, and the Forum revealed a shared concern regarding the need for guidance and best practices for sharing data with education researchers. NCEE and the Forum will work together to develop a new resource that focuses on facilitating the use of education data by researchers, including guidance on how researchers can request data and how education agencies can initiate best practices for evaluating data requests, sharing data, and ensuring data privacy.

NCEE is also responsible for the contracts and operation of the Regional Educational Laboratories (RELs). There are 10 RELs, each serving under a five-year contract with the Department; the contracts are currently out for bid. This competition will ask RELs to focus on three to five topic areas of highest need in the region and to work with a research alliance, including SEA and LEA partners, to provide actionable information and research to educators. The allowable uses of funds include data analysis (baselines, trend lines, etc.); impact studies; technical support, including workshops, webinars, videos, and toolkits; and dissemination efforts. NCEE hopes to encourage RELs to work directly with LEAs through research partnerships.

Topics from the Floor and Winter Planning
Despite a full agenda, several topics could not be addressed; the committee would like to see them on the winter 2012 agenda, either in committee sessions or full Forum sessions, including:

  • continued discussion on higher education linkages and teacher preparation issues;
  • implications for data systems of implementing the Common Core State Standards;
  • more discussion on privacy and the final FERPA regulations;
  • NAEP college and career ready results;
  • international assessments (Program for International Student Assessment (PISA), Trends in International Mathematics and Science Study (TIMSS));
  • Common Education Data Standards: rationale for implementing or incorporating either the State Core or National Education Data Model (NEDM); and
  • linkages between educators and students.

Policies, Programs and Implementation (PPI) Standing Committee Meeting Summary

Monday, July 25, 2011

Afternoon Session

Welcome and Introductions
PPI Chair Laurel Vorachek, Anchorage School District (AK), called the PPI committee to order and asked members to introduce themselves.

Agenda Review
Laurel Vorachek outlined the PPI agenda for the summer meeting.

Winter 2011 PPI Meeting Review
Laurel Vorachek reviewed the work PPI accomplished at the Winter 2011 Forum, noting that many of the topics would be addressed as updates at this meeting.

Data Use Working Group Update
Kathy Gosa (Kansas State Department of Education) provided an update on the Data Use Working Group. The group plans to have a draft of the introduction and educator brief sometime after its November meeting. A brief for administrators will follow. The first brief can serve as a template, so future briefs will not take as long to develop. An introductory framework will connect them.

There is a new sub-group: Data Use for SEAs. Ruth Neild, NCEE, contacted the Forum about finding standard ways for RELs and states to work together. The group discussed resources already in place to assist SEAs and researchers, including core practices, templates, etc. The sub-group met for the first time in the spring and reviewed a comprehensive outline on Sunday. The working group will meet in November to review a draft and templates, and will have something to share with the Forum by the end of the year. Concern was expressed that LEA representatives were not on the committee.

Professional Development Follow Up: Growth Models
Neal Gibson joined PPI to follow up on Monday's professional development session, "Growth Models and Their Application." PPI had questions relating to the use of vertical scales, such as how instruments are developed, and how one determines whether a scale is correct for a growth model. In order to determine which growth model best fits an education institution, Neal advocated presenting data from growth models to administrators and educators, and asking them to determine which model best represents their situation.

PPI members asked about ways to tailor the information gained from growth models to different audiences, especially teachers. Neal suggested showing data in context, and if possible, in an interactive format. He emphasized that this gives users diagnostic tools. PPI was also interested in knowing whether anyone has looked at growth rate changes among teachers, or tracked teacher records for growth over time. For example, is it possible that a teacher's growth rates would change in different schools or different settings? Neal indicated that administrators are interested in watching a single teacher over time, and this is addressed in the Economic Policy Institute's briefing paper, "Problems with the Use of Student Test Scores to Evaluate Teachers." He also recommended www.tsdl.org as a starting point for those interested in the topic.

University Preparation of Educators to Use Data
Pat Sherrill (ED) provided an overview of university-level preparation of educators on the use of growth models. This project began with graduate students who were working with Pat to determine who in colleges of education is teaching teachers to use data. The students did an inventory for a class project, but a bigger, broader project is needed. In February, the Spencer Foundation convened a discussion on the topic. The issue is that large data systems are generating large amounts of student data, but these data are not being used by classroom teachers.

PPI Discussion: Assessment Consortia
David Weinberger (Yonkers Public Schools (NY)), Tom Foster (Kansas State Department of Education), and Wes Bruce (Indiana Department of Education) joined PPI to answer questions and discuss the assessment consortia. A concern was raised about the role of postsecondary education in the process. Wes indicated that both consortia are required to have postsecondary education at the table. Tom stated that the involvement of postsecondary institutions may be one of the defining factors of the new generation of assessments, changing the conversation to P-20. Wes explained that the assessment is about placement, rather than about who gets into a particular college; the question is whether students are prepared for coursework. A PPI member noted a potential scenario: parents will question why their child is not accepted to a college when the tests deem the child college-ready. Tom indicated that there will have to be paradigm shifts. Wes stated that, currently, there is no common standard, and part of the idea of the consortia is to get institutions to agree on "What is 'ready' for Math 101?" Another PPI member asked about access to the assessments for non-consortium members. Wes stated that other states, and possibly districts, will be able to opt in at a later date. Both consortia have grants to study the transition to common core standards. One issue for districts is that the consortia are state-dependent.

Tuesday, July 26, 2011

Morning Session

Privacy Technical Document Reviews
PPI members broke into three groups to review the three technical briefs developed by NCES. Three questions guided the review:

  1. Are there challenges SEAs and LEAs face in trying to meet the recommendations provided in the briefs? Are they different for SEAs and LEAs? Postsecondary?
  2. What do you think about the overall format of the briefs? Are they readable for their intended audience?
  3. What information, if any, may need additional guidance or clarification to support implementation by SEAs, LEAs, postsecondary institutions, early childhood agencies, and other groups such as researchers?

Technical Brief #1 Comments:

  • Great start
  • How do users know that there are other briefs that go with this one?
  • Target audience is not clear. Different audiences are mentioned.
  • Is this intended for LEAs, IHEs, etc.? If not, why not?
  • The brief should be messaged in a way that explains the relevance to the audience and why the audience should take actions on this.
  • Where is the capacity in the organization to do this work?
    • Who in the organization does this work? This should be a full-time position – not an "other duties as assigned" position.
  • References to other laws in this brief might be a distraction because there is not a thorough explanation of them (needs some linking to broader sources).

Technical Brief #2 Comments:

  • It may be worthwhile to reconsider the title of the series of briefs. The title indicates that the briefs provide guidance for Statewide Longitudinal Data Systems and the notation indicates that the briefs are intended to provide "best practices" for consideration by states developing SLDSs. Yet, this brief addresses best practices for other audiences.
  • Since having the proper governance structure in place is so critical, this part of the brief needs to be expanded. Does the structure of data governance look the same at the SEA, LEA, and postsecondary levels? It is not clear who should be represented on the committee—should the decision come from a higher governing body such as the Governor's P-20 Council? What are things to consider when they are done at the different levels rather than together to make sure they don't conflict with each other (e.g. when a district has their own governance committee that works separately from the SLDS governance committee)?
  • More discussion is needed on stewards and organizational commitment.
  • SLDS language is a little outdated (e.g., there needs to be discussion of linkages between multiple agencies).
  • Summaries are good—they also naturally lead into the next section.
  • The examples provided in the document were good, but it would be beneficial to have more examples that are inclusive of early childhood and postsecondary. The document does a good job of identifying what we need to be aware of in maintaining the balance between protecting student privacy and being accountable to the public. "Reasonable efforts" is a challenge for SISs.
  • Data are everywhere, and we have to protect student identity with linking.

Technical Brief #3 Comments:

  • Well written
  • We are being asked to be transparent, but this document recommends masking more information. With all the funds invested in SLDSs, states cannot report out less.
  • Need to add reasonableness to these recommendations
  • Add in other groups beyond SEAs
  • Maybe add in some flow charts or examples of processes

General Technical Brief Comments:

  • Should be updated (i.e., formatting, electronic links, SLDS language)
  • While states are getting millions of dollars from the federal government to build complex systems and be transparent, they are also being asked by these briefs to mask more data.
  • A summary document that describes when to use each brief would be helpful.

A Multi-State Data Exchange MS PowerPoint (240 KB)
Hans L'Orange, SHEEO, provided an update on the four-state longitudinal data exchange pilot project of the Western Interstate Commission for Higher Education (WICHE). The four states are Idaho, Hawaii, Oregon, and Washington. The project includes K-12, postsecondary, and labor representatives, and focuses on the necessary architecture, governance structures, and standard reporting while complying with applicable privacy laws. The conversation has moved from "should we do this" to "how do we do this." A release is possible this fall, but it is not guaranteed. Some of the lessons thus far include:

  • Real value comes from incorporating workforce information.
  • There is a genuine appetite among participating states for linked data.
  • There is a need to manage the expectations of partnering state agencies in terms of what data can be used for what purposes (operational vs. research).

Common Education Data Standards MS PowerPoint (3.76 MB)
Tate Gould, NCES, and Beth Young, QIP, provided an update on the Common Education Data Standards project. Draft One of the Version 2 standard was released on July 18th for public comment. Over 200 comments were received in the first week. The CEDS Stakeholder Group has expanded to include postsecondary and early learning stakeholders. The K-12 scope will include elements needed to support assessments at the local and state level as well as items for federal reporting and metrics/policy questions. PPI Members had the following comments and questions:

  • Researchers need to be involved. Researchers want data that are not currently shared.
  • How can P-20 councils be used in CEDS?
  • Can the code sets be displayed in tables so they can be sorted?

PPI Discussion: IPEDS and P-20 Data Systems
Elise Miller, NCES, joined PPI to answer questions on her IPEDS general session presentation. Members asked about the graduation rates used by IPEDS and the Student Right to Know Act. A member asked a question about the jump in graduation rate measurement from 100% to 150% of normal time to completion, and Elise said there is a publication that describes the technical aspects of this.

Afternoon Session

PPI Privacy Discussion
Kathleen Styles, the new Chief Privacy Officer at USED, came to PPI to introduce herself and hear questions and concerns from PPI members. Serving in this position since April, Kathleen primarily works on data management. PPI members had the following comments/suggestions:

  • Parents are not very supportive of the new privacy rules. There is not a lot of confidence in the ability of technical systems to protect data (examples of breaches are frequently in the news).
  • We need to demonstrate the value that data provide and how students benefit from data sharing.
  • We need an introductory brief explaining the technical briefs to a general audience.
  • What are the aims of the PTAC regional meetings?
  • More information on data governance is needed with respect to the technical briefs.
  • There is an overarching concern that Brief #3 would cause less transparency.
  • What is reasonable when suppressing data? For example, 100 percent proficiency cannot be reported, because doing so would reveal every individual student's result (see the sketch following this list).
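
To make the suppression question concrete, the following is a minimal sketch of a cell-suppression rule in Python. The minimum cell size, the masking labels, and the function name are illustrative assumptions, not requirements drawn from the technical briefs or any federal rule.

```python
MIN_CELL_SIZE = 10  # assumed minimum subgroup size; actual thresholds vary by state

def masked_proficiency(n_students: int, n_proficient: int) -> str:
    """Return a displayable proficiency rate, masking disclosive cells."""
    if n_students < MIN_CELL_SIZE:
        return "*"            # too few students to report safely
    rate = n_proficient / n_students
    if rate == 0.0:
        return "<5%"          # 0% would reveal that every student fell short
    if rate == 1.0:
        return ">95%"         # 100% would reveal that every student was proficient
    return f"{rate:.0%}"

print(masked_proficiency(250, 250))  # ">95%"
print(masked_proficiency(250, 180))  # "72%"
```

Top- and bottom-coding extreme rates in this way is one common compromise between transparency and privacy; the "reasonableness" concern raised above is largely a question of where such thresholds should be set.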

Early Childhood Data Collaborative MS PowerPoint (1.46 MB)
Elizabeth Laird, Data Quality Campaign (DQC), joined the group to discuss the Early Childhood Data Collaborative (ECDC) Inaugural State ECE Analysis. She reported on the work of the ECDC, which works to build political will for coordinated early childhood data systems. The ECDC identified 10 Fundamentals of Coordinated State Early Care and Education Data Systems. Elizabeth reviewed the 10 Fundamentals and gave an overview of the work of the ECDC. She then explained that early care and education is a fragmented system, and many challenges complicate the process of linking child data with program and teacher data. Elizabeth emphasized the importance of data governance; she reported that in the ECDC's attempts to conduct a survey it was difficult to identify who is in charge of early childhood data. The ECDC surveyed 49 states on progress toward the 10 Fundamentals. Survey results showed that every state respondent collects data in at least some early childhood education programs, but data are uncoordinated across programs, and the collection of workforce data is limited. Recommendations from the survey include articulating policy questions, evaluating current and future data collections, and focusing more on governance. The ECDC wants to administer the survey again, with revisions. Elizabeth encouraged the Forum members to send her suggestions on how the survey can help the Forum, and how it can become a more useful state resource.

Civil Rights Data Collection
Rebecca Fitch and Ross Santy, ED, provided an update on the Civil Rights Data Collection (CRDC). The CRDC working group, which includes LEAs and SEAs, met at the Winter Forum meeting and again at the Summer Forum meeting. There is a website where more information can be found on the CRDC, including data from the survey. The upcoming CRDC will include a few new items, such as preschool suspensions and expulsions. The Office for Civil Rights (OCR) is considering different snapshots and may extend the collection over the summer through the December 1 due date. Rebecca reported that non-regular schools (charter schools, juvenile justice facilities, etc.) posed the most significant challenge to covering the entire universe of schools. PPI Members had the following comments/questions:

  • Will it be a universe collection in the future?
  • States expressed concerns about communicating with LEAs about this survey, since states are not involved in the collection. While states appreciate that ED could use their help, there is a hesitance to get involved.

PPI Elections
The committee held leadership elections for the next year. Levette Williams nominated Sonya Edwards, California Department of Education, for PPI vice chair. The vote was unanimous in her favor. Tom Howell, Michigan Center for Educational Performance and Information, was nominated for chair and the vote was unanimous in his favor.

Tuesday, July 26, 2011

Morning Session

Steering Committee Report
PPI Chair Laurel Vorachek, Anchorage School District (AK), gave a brief overview of issues from the Steering Committee, including a potential educator-student data link working group. She will bring PPI's ideas to the Steering Committee, such as an online FERPA training and the completion of the Forum's privacy documents.

Meeting Review/Winter 2012 Planning
PPI members felt there was a good mix of professional development, committee meetings, and general sessions and that the meeting was well organized. Members liked the privacy breakout sessions in PPI and appreciated the opportunity to review the materials ahead of time. There were several suggestions for the Winter meeting:

  • Two professional development topics from which to choose (rather than only one option)
  • FERPA update
  • Assessment Consortia: technology needs
  • Early childhood from a state perspective
  • Update on SFSF now that all states will have reported once
  • Public Domain Clearinghouse

PPI Discussion: Teacher-Student Data Link MS PowerPoint (1.28 MB)
Rob Curtin, Massachusetts Department of Elementary and Secondary Education (ESE), gave an update on the Massachusetts Student Teacher Connection. In this project, Massachusetts set out to gather student course information in October 2010 and course completion data (including grades/marks) in SY 2010–2011, and to load the information into the Education Warehouse in order to provide valuable analytical tools. This would align the ESE data collection systems more closely with the LEA data systems and improve the collection method for highly qualified teacher status, minimizing reporting burden at the local level. Rob shared several lessons learned: ensure that LEAs participate (critical); communicate with vendors throughout the project; map data elements from the student information system to state data elements; and conduct cross-validation between three systems. Massachusetts was dealing with some of the same challenges as other states: elementary schools may not have detailed courses; co-teachers and virtual courses are challenges; there is complexity in validating multiple systems; and the quality of local data is a concern.

PPI Discussion: NCEE and the Forum
Ruth Neild, National Center for Education Evaluation and Regional Assistance (NCEE), came to PPI to provide a quick update on the work being done at NCEE and how NCEE would like to leverage the work of the Forum. Ruth oversees the 10 Regional Education Laboratories (RELs) and would like to bring the Forum to NCEE meetings and train others on how to use Forum work. PPI members noted that they would like to see more meaningful LEA relations and collaboration with the RELs.

Technology (TECH) Committee Meeting Summary

Monday, July 25, 2011

Afternoon Session

TECH Discussion: Assessment Consortia
In a follow-up to the Opening Session presentation, TECH members had a chance to ask questions of panelists David Weinberger (Yonkers Public Schools, NY), Tom Foster (Kansas State Department of Education), and Wes Bruce (Indiana Department of Education).

PARCC representative Wes Bruce said that PARCC is developing assessments that are agnostic with respect to technology. While there is still some uncertainty about what technology will be required for planned online assessments, there is an expectation that a "typical" computer that can handle internet, sound, etc. will suffice as an assessment station. Any system will work as long as it meets a set of fairly basic minimum capacity standards. Having said that, they are designing for the future—a handheld technology tool may be more realistic in the future than the old model of a computer lab. SMARTER Balanced Assessment Consortium (SBAC) representative Tom Foster added that technology capacity and bandwidth are also issues when one considers moving data back to schools for interim use.

Both representatives agreed that the bottom line is that assessment will be different in 2014 than it is today. For example, many students are not in a traditional classroom anymore. How will those "virtual" teachers use data? How does this affect the return flow of data? TECH members noted that assessment designers should consider how students might access and take assessments outside the classroom setting. Wes replied that authentication becomes a problem under that scenario. The technology is capable of handling the authentication, but there is still an issue of who is actually answering the question if there isn't an assessment administrator onsite.

While SBAC uses an adaptive testing approach, PARCC chose to include many open-ended questions in its assessment, which do not lend themselves to adaptive testing. Each approach has advantages and disadvantages, which permits a comparison of approaches and provides people with choices.

For this to work, there needs to be a "marriage" of technology and assessment. While assessment needs should drive decisionmaking, technology staff need to be kept informed if they are to be expected to effectively deliver the assessment and serve data use.

Welcome, Introductions, and Winter 2011 TECH Meeting Review
TECH Chair Lee Rabbitt (Newport Public Schools, RI) welcomed everyone to the TECH meeting, led the group in introductions, and reviewed proceedings from TECH's meeting at the Winter 2011 Forum.

Summer 2011 Agenda Review
TECH Vice Chair Peter Tamayo (Washington Office of Superintendent of Public Instruction) reviewed the agenda for the committee's time together in TECH.

Section 508 Working Group
Lee Rabbitt (Newport Public Schools, RI) chaired the Forum Section 508 Accessibility Working Group and was happy to report that the document has been released online at NCES. Print versions should be available in the next few weeks. Congratulations to the Section 508 Working Group that developed the resource.

TECH Election
Peter Tamayo (Washington Office of Superintendent of Public Instruction) was elected TECH Chair and Laurel Krsek (Napa Valley Unified School District, CA) was elected Vice Chair for 2011–2012.

TECH Panel Discussion: SEA Progress on Race to the Top
Panel members included Lee Rabbitt (Newport Public Schools, RI), Rob Curtin (Massachusetts Department of Elementary and Secondary Education), and Charlene Swanson (New York State Education Department), who each described the many challenges and opportunities facing their agencies related to a Race to the Top grant. While each state was benefitting from the resources associated with Race to the Top funding, they noted that there appeared to have been some instances in which the funding was driving local/state policy rather than policy driving the vision. It was noted that these are data projects – not IT projects. Nonetheless, advances were being made to align systems across the P-20 spectrum at state, regional, and local levels. In many instances, it is impossible to separate RTTT from SLDS because they are so closely aligned. Because this work will be ongoing, TECH will likely wish to continue to receive updates at future meetings.

Tuesday, July 26, 2011

Morning Session

TECH Discussion: IPEDS and P-20 Data Systems
Elise Miller (NCES) leads the Integrated Postsecondary Education Data System (IPEDS) program and joined TECH to answer questions following her joint session presentation earlier in the morning. Questions and discussion included the following:

  • TECH members asked about the collection of data about staff benefits through IPEDS. Elise said that while base salaries of faculty-level staff are collected, fringe benefit details are no longer collected; benefits as a percentage of total compensation are collected instead.
  • One member noted College Navigator's tremendous value to high school students. Elise agreed and noted that while the tool does get considerable usage, with about one million website visits per month, NCES would like to see more. Therefore, they are considering the possibility of allowing states, districts, and schools to incorporate the tool into their own websites to increase visibility at the local level.
  • Elise said that there is no relationship between the National Student Clearinghouse (NSC) and IPEDS, although IPEDS does use some NSC data.
  • Another TECH member suggested that IPEDS data would be even more useful if they were collected at the individual student level, but acknowledged the historical resistance to such a collection. Elise suggested there may be a change on this issue, which has probably been influenced by the states. However, there is legislation in place that prevents IPEDS from moving in this direction.
  • Analyses to assess postsecondary programs in terms of gainful employment will use Social Security numbers to link to workforce data.
  • IPEDS has tried to get institutions of higher education more involved in the CEDS project. Since all Title IV institutions have to report to IPEDS, alignment with CEDS could reduce reporting burden.

TECH Privacy Discussion
Kathleen Styles is the Chief Privacy Officer (CPO) at ED. She joined each of the Forum's standing committees to introduce herself to our members and learn about the privacy, confidentiality, and security needs of our SEA and LEA representatives. Kathleen also reported that the U.S. Department of Education is taking a comprehensive and proactive approach to providing technical assistance related to these issues—in the form of the Privacy Technical Assistance Center (PTAC), a one-stop shop for technical assistance related to the privacy, confidentiality, and security of education data. Discussion topics raised by Forum members included:

  • FERPA
    • Kathleen reported that the NPRM is undergoing revision following the public comment period and should become a final rule prior to the end of calendar year 2011.
    • Forum members noted that some LEAs have school improvement grants. In these situations, teaching staff might be given access to individual student data for students throughout the school (not just in a classroom). Is this appropriate from a FERPA perspective? Kathleen replied that deciding who has an articulable, reasonable "need to know" is a local decision. If school officials believe that a school-wide management approach is articulable and reasonable, then the practice complies with FERPA.
    • LEAs in particular said that many school and district staff know very little about FERPA requirements. They suggested that PTAC develop a FERPA 101 online course – the very basics about understanding, implementing, and complying with FERPA for data clerks, school administrators, etc. The course (and related materials) should be designed and formatted for a very broad and generally uninformed audience. The biggest bang for the buck would suggest keeping it simple and practical rather than legal or theoretical. Kathleen said that the Forum privacy documents are very good. TECH members agreed but said an annual online training would have more impact than even a relatively short print publication like the Forum guides. Kathleen agreed that ED and the Forum should consider whether and how to advance this type of good idea.
      • Kathleen said that PTAC was planning an issue brief on training, but maybe this request for online training should inform how that task progresses as well.
      • From a FERPA compliance perspective, it is critical to train LEA staff. When, for example, SEA staff reject data requests on the grounds of FERPA, some researchers try to circumvent this decision by asking for the data directly from less-informed LEA staff.
    • Kansas is planning on delivering more FERPA training to its school and LEA staff members.
      • The Forum will likely wish to revise its privacy resources following the release of new FERPA rules. Perhaps there will be an opportunity to collaborate with PTAC on this front. Is this resource a chance for the Forum to advance the P-20 concept?
  • Food services eligibility data – LEA representatives said that they are being told by the U.S. Department of Agriculture (AG) that free and reduced price meal (FRPM) eligibility data are restricted to staff who administer food services programs—not education officials. SEA representatives added that they can drill down to the individual student level for all variables except FRPM data. Kathleen thought that this sounded like it contradicted the joint memo guidance issued collaboratively by ED and AG.
    • One SEA reported that AG has broadened the eligibility of FRPM to all foster children, so it no longer is a strong proxy for low income status (if it ever was).
    • Kansas State Department of Education (KSDE) sends student data to the state services agency, which matches education records with services records and then shares them with LEAs to pre-certify eligible students. Note, however, that the SEA cannot access the matched records because of privacy rights.
    • This is a big deal in many LEAs – e.g., many use FRPM data for e-rate applications, which is a very important revenue (offset) source for them.
    • Oklahoma had to get the governor to sign an interagency data agreement in order for the SEA and social services agencies to share FRPM data. It took a long time, but the system seems to be working now.
  • Cloud computing solutions – Can PTAC help SEAs and LEAs decide whether student records hosted in the cloud are secure? Kathleen's position was that ED and FERPA require that education records be maintained securely, but the technology used to do so is a local decision. If an LEA or SEA determines that the cloud is safe, they comply with FERPA. Having said that, PTAC should be able to provide some practical recommendations about best practices on this front.

TECH Panel: E-Transcripts and Student Record Exchange
Panelists included Peter Tamayo (Washington State Office of Superintendent of Public Instruction), Larry Fruth (SIF Association), John Brandt (Utah State Office of Education), Bethann Canada (Virginia Department of Education), Kathy Gosa (Kansas State Department of Education), Raymond Yeagley (Northwest Evaluation Association), and Lee Rabbitt (Newport Public Schools, RI).

The need for this level of effort in exchanging student records is driven by student mobility: in some states, 30-55 percent of a school's student population switches schools each year, and 41 percent of highly mobile students are low achievers. Moreover, high mobility can have dramatic effects on school funding. Until recently, there was not a national, easy-to-use, cross-protocol system for the secure electronic transfer of student data, but the emergence of XML-based technologies has made the definition, transmission, validation, and interpretation of data between applications or organizations possible.
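
As a rough illustration of how an XML payload supports definition, transmission, validation, and interpretation, here is a minimal sketch using the Python standard library. The element names are invented for illustration; they do not represent the SIF or PESC specifications discussed below.

```python
import xml.etree.ElementTree as ET

# A transferred student record (the transmission payload); element names are hypothetical.
record = """<studentRecord>
  <stateId>WA-001</stateId>
  <lastName>Rivera</lastName>
  <enrollment school="1234" entryDate="2011-09-06"/>
</studentRecord>"""

root = ET.fromstring(record)                      # parse the payload

required = {"stateId", "lastName", "enrollment"}  # definition: the required elements
missing = required - {child.tag for child in root}
if missing:                                       # validation
    raise ValueError(f"record failed validation; missing: {sorted(missing)}")

print(root.findtext("stateId"))                   # interpretation: prints "WA-001"
```

In full-scale exchanges, a formal XML schema (such as those published with the SIF and PESC specifications) plays the role of the hand-rolled `required` set here.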

In the state of Washington, student record exchange stems from the Data Governance Committee, which heard a need from the districts for quicker access to student data so that they can place transferring students in appropriate courses and programs as quickly as possible to ensure student success. When a student arrives in a new district, the student enrollment record is submitted to the state so that the district can receive access to information about the student's history in other Washington schools (e.g., assessment, enrollment, programs). To match records, all K-12 student records are searched based on general demographics. Possible matches are returned with high level enrollment information to allow for the informed selection of the correct student. Upon positive confirmation of a match, appropriate access to the student record is given. All access is logged and audited to ensure appropriate privacy protections are maintained.
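
A minimal sketch of that match-and-confirm flow appears below. The record fields, matching criteria, and logging format are illustrative assumptions, not Washington's actual implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StudentRecord:
    state_id: str
    last_name: str
    first_name: str
    birth_date: str  # ISO date, e.g., "2001-04-17"

def candidate_matches(enrollment: dict, records: list) -> list:
    """Return possible matches on general demographics for informed selection."""
    return [r for r in records
            if r.last_name.lower() == enrollment["last_name"].lower()
            and r.birth_date == enrollment["birth_date"]]

audit_log = []  # all access is logged and audited

def confirm_match(user: str, record: StudentRecord) -> StudentRecord:
    """Release the student's history only after positive confirmation of a match."""
    audit_log.append((datetime.utcnow().isoformat(), user, record.state_id))
    return record

records = [StudentRecord("WA-001", "Rivera", "Ana", "2001-04-17")]
hits = candidate_matches({"last_name": "Rivera", "birth_date": "2001-04-17"}, records)
if len(hits) == 1:
    history = confirm_match("district_registrar", hits[0])
```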

In Utah, the Utah eTranscript and Record Exchange (UTREx) has transformed student data exchanges into efficient, electronic, and standardized processes that improve data quality, including accuracy, timeliness, accessibility, and integrity. UTREx provides the framework and mechanism to move individual student data from district information systems to the state level (for state use and federal reporting) as well as to postsecondary student information systems via an e-transcript adapter. The project began in June 2010 with the release of an RFP. By July 2011, the pilot was underway, and Utah expects to have UTREx in full production by October 2011.

Virginia is a strong local control state, with numerous different student information systems in use by its 132 local education agencies (LEAs), which serve 1.2 million students. SIF is used to connect these systems via a student locator system (128 LEAs), e-transcripts (48 LEAs), and a student schedule system (25 LEAs). Several state universities also use SIF and the National Transcript Center for e-transcripts.

Kansas is using an SLDS grant and ARRA grant to streamline the exchange of K-12 student records for transferring high school students and exchange of transcripts between postsecondary institutions. Participation is advocated rather than mandated and requires the adoption of a standard state data set (based on SIF and PESC specifications). By May 2012, the K-12 Initiative expects that all Kansas postsecondary institutions will be signed up to receive high school electronic transcripts and 39% of the 365 accredited high schools will be sending transcripts electronically. Five postsecondary institutions have volunteered to pilot postsecondary electronic transcript exchange (with hopes of full scale use by 2013). Challenges in Kansas include the integration of multiple SIS vendors as well as funding beyond the existing grant cycle.

The Rhode Island Department of Education (RIDE) has begun the process of creating a universal e-Transcript for high schools. All high school courses are being mapped to the School Codes for Exchange of Data (SCED) and Rhode Island's LEAs have been part of the design process. As part of a currently funded SLDS grant, the K-8 transcript work will begin upon completion of the high school transcript. Given the state's move to the Common Core, there is increased interest in pursuing a standards-based transcript at the K-8 levels. RIDE has formalized its statewide contract with the National Student Clearinghouse (NSC). Though the NSC does not collect student data by subgroup, the data formats allow RIDE to attach each student's state-assigned student identifier to the student record, which will enable matching.

Afternoon Session

Professional Development Follow Up: Growth Models
Neal Gibson (Arkansas Research Center) joined TECH to follow up on the professional development session he provided to the Forum on Monday morning. Neal began by clarifying a point from the professional development session. The lack of correlation between a student's growth in one year and the next seems to result from the fact that, in the model, students are divided into cohorts each year based on their level of growth in the previous year. Students in higher cohorts generally have difficulty maintaining high growth, and so end up in a lower cohort the next year, where it is easier to achieve high growth. This causes a see-saw-like trend in the data and the low correlation that Neal had displayed. Neal then fielded questions from the committee. When asked about professional development related to the growth model and associated tools, Neal said there is a course built into Arkansas's data visualization tool, Hive. Neal has also spent a good deal of time traveling around Arkansas. On the use of these growth data, Neal stressed that much of the benefit and usage will and should be at the local level. These "eyes on the ground" are also best equipped to parse out the meaning of the data (i.e., to make sense of the results and take appropriate action). Neal emphasized the need to get these data into the hands of teachers, parents, and others at the local level.
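
The see-saw Neal described is essentially regression to the mean. The following toy simulation, which assumes each test score is a student's underlying ability plus independent measurement noise (an assumption made for illustration only; this is not the Arkansas model itself), shows how noise alone drives the year-to-year growth correlation low.

```python
import random
random.seed(1)

n = 10_000
growth_year1, growth_year2 = [], []
for _ in range(n):
    ability = random.gauss(0, 1)                                   # stable underlying ability
    s0, s1, s2 = (ability + random.gauss(0, 1) for _ in range(3))  # three noisy test scores
    growth_year1.append(s1 - s0)
    growth_year2.append(s2 - s1)

def corr(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = lambda vs, m: sum((v - m) ** 2 for v in vs)
    return cov / (var(xs, mx) * var(ys, my)) ** 0.5

# An unusually high score in year 1 inflates that year's growth and deflates the
# next year's, producing the see-saw: the correlation comes out negative here.
print(f"year-to-year growth correlation: {corr(growth_year1, growth_year2):+.3f}")
```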

Common Education Data Standards MS PowerPoint (3.76 MB)
Tate Gould (NCES) and Beth Young (QIP) provided an update on the Common Education Data Standards (CEDS) project. Draft One of the Version 2 standard was released on July 18th for public comment. Over 200 comments were received in the first week. The CEDS Stakeholder Group has expanded to include postsecondary and early learning stakeholders. The K-12 scope will include elements needed to support assessments at the local and state levels as well as items for federal reporting and metrics/policy questions. TECH had the following comments and questions:

  • How is Ed-Fi integrating with CEDS?
  • Will CEDS accommodate the Common Core Standards?
  • Is it possible to integrate CEDS into NEDM so the user can see how they are related to the larger context?
  • How does a state get technical assistance?
  • Can CEDS make mappings more accessible (not in Excel)?
  • If a state has already been through the State Core mapping, will the CEDS mapping be any different?

Tate and Beth will work to address these issues in the near future.

TECH Panel and SEA/LEA Breakout Discussion: Data Governance
Panelists included Bethann Canada (Virginia Department of Education), Laurel Krsek (Napa Valley Unified School District, CA), Allen Miedema (Northshore School District, WA), and Peter Tamayo (Washington State Office of Superintendent of Public Instruction). This presentation and discussion addressed many high-priority issues related to data governance, including: How is data governance evolving over time? How are SEAs and LEAs working together? How are other agencies working together? What about system governance? And, what have we learned so far?

Virginia is taking a federated approach to data governance that spans numerous state agencies, including the Virginia Department of Education, the State Council of Higher Education for Virginia, the Virginia Employment Commission, and the Virginia Community College System (Workforce Office). The project was driven by Virginia's Privacy Act and undertaken with the approval of the Virginia Attorney General. Work began with six half-day, off-site facilitated workshops in which staff from participating agencies defined their scope and priorities (availability, usability, integrity, privacy, security, consistency, and transparency) and created a "Book of Governance" table of contents. The council now meets bi-weekly, and staffing is in place to write the "Book of Governance." They also developed a unified lexicon containing information on every data element, such as its definition, format, "friendly name," codesets (using the lexicon to map to CEDS), location within an agency (database, schema, table, and element), and logical relationships to other elements. Processes were established for requesting data, approving requests, appealing denials, establishing data use agreements, adding/removing elements, and bringing on additional partners, as well as other workflow items.
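
To make the lexicon idea concrete, here is a minimal sketch of what a single entry might hold. The field names and sample values are hypothetical, not Virginia's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class LexiconEntry:
    name: str                    # canonical element name
    friendly_name: str           # label shown to non-technical users
    definition: str
    data_format: str
    codeset: dict                # local codes mapped to CEDS option values
    location: str                # agency.database.schema.table.column
    related_elements: list = field(default_factory=list)

# A hypothetical entry for an enrollment-status element.
enrollment_status = LexiconEntry(
    name="EnrollmentStatus",
    friendly_name="Enrollment Status",
    definition="The student's current enrollment standing.",
    data_format="code (2 characters)",
    codeset={"01": "Active", "02": "Withdrawn"},
    location="VDOE.SLDS.dbo.Enrollment.Status",
    related_elements=["ExitDate", "SchoolId"],
)
```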

In Napa Valley Unified School District (CA), the School Board has proactively established a culture of data use in which data are used to focus on student achievement, set direction and establish priorities, monitor results, and communicate, explain, and justify decisionmaking. The sources of data are many and varied, including student data, achievement data, resource data, process data, and perception data (e.g., how staff, students, and community members "think" the schools are doing). To coordinate this body of information, a Data Management Team identifies system-wide issues, requests ad hoc advisories as necessary, meets weekly with check-ins, documents requests, publishes a data collection and reports calendar, escalates issues to the appropriate authorities, and recommends and facilitates training. The next step is to more formally establish a data governance policy and designate one person to be responsible for overseeing the governance structure. Subsequent work would focus on creating a mission statement and core goals/objectives for the governance group, defining and documenting the roles and responsibilities of the data manager, and implementing a data quality curriculum.

In the state of Washington, the Washington State Legislature passed legislation in 2009 to jump-start work on K-12 education data governance. The legislation established the K-12 Data Governance Group, whose primary role is to assist in the design and implementation of a K-12 education data improvement system for use by school districts, the state, and additional stakeholders. The legislation further outlined the functionality and information to be contained in the K-12 data system and required certain reports to be available on the internet. Other tasks outlined in the legislation included responsibilities to:

  • identify critical research and policy questions that need to be addressed by the K-12 data system
  • identify reports and other information that should be available on the internet
  • create a needs requirement document detailing the technical capacity needed by districts and the state to meet legislative expectations for the K-12 data system
  • conduct a gap analysis of the current and planned information compared to the needs requirements
  • identify where existing data can be reduced or simplified and where existing software can be used
  • focus on financial and cost data to support the new funding formulas and ensure the capacity to link data across financial, student, and educator systems
  • define the operating rules and governance structure for an objective and orderly process for determining when changes are needed and how they are implemented
  • establish minimum standards for school, student, financial, and teacher data systems

A data governance coordinator was hired and a data management committee was formed to help manage the SEA's data collections and reporting needs. Panel discussions were conducted to inform the work and get buy-in from key stakeholders such as principals, teachers, superintendents, business managers, and counselors. Today, a P-20 governance model produces routine feedback reports, special reports, and research datasets for parents, teachers, administrators, planners, policymakers, and researchers. This P-20 model attempts to generate clear processes that will accelerate the ability to share data among the P-20 partners for the collaborative analysis of education data. Members include early learning, K-12, the State Board for Community and Technical Colleges, the Council of Presidents, the Higher Education Coordinating Board, and representatives from a school district, an education service district, and a community technical college. The model establishes workflow, roles, and responsibilities and fosters relationships between Washington's education partners.

Wednesday, July 27, 2011

Morning Session

TECH Discussion: Ed-Fi
The Dell Foundation announced the release of a new data standard for education in July. Many TECH members were surprised by the announcement, noting that they had not been aware of the project. This surprise also caused some concern—if TECH's SEA and LEA members were not aware of the standards development project, is it possible that state and local perspectives were not included in the work? Additional concern stemmed from a perceived overlap with the SIF standard. Members cautioned that the presence of more than one "standard" might defeat the purpose of having a standard at all. Unfortunately, TECH members felt that they did not have enough information about the Ed-Fi initiative to fully evaluate its scope, content, and quality. Hopefully we can have a more informed discussion at the Winter 2012 Forum.

TECH Discussion: NCEE and the Forum
Ruth Neild is a new Forum member from the National Center for Education Evaluation and Regional Assistance (NCEE) at the U.S. Department of Education. She joined TECH to introduce herself and NCEE to the Forum.

As a part of the U.S. Department of Education, NCEE supports and undertakes a wide range of education research activities, including evaluation, technical assistance, and the dissemination of information from evaluation and research. NCEE is also responsible for promoting the use of scientifically valid research in education. In addition to contributing to the Forum as a federal member, NCEE is collaborating with the Forum on the topic of data use. Discussions between NCEE, NCES, and the Forum revealed a shared concern regarding the need for guidance and best practices for sharing data with education researchers. NCEE and the Forum will work together to develop a new resource that focuses on facilitating the use of education data by researchers, including guidance on how researchers can request data and how education agencies can initiate best practices for evaluating data requests, sharing data, and ensuring data privacy. Several SEA members of the TECH Committee are participating in the working group.

TECH members mentioned that it would be great to encourage mutually beneficial partnerships between SEAs and institutions of higher education (IHEs) through Regional Education Laboratories (RELs). For example, student researchers could help SEAs investigate their research priorities while learning how to use SEA data. Ruth also noted that she used Forum products while conducting research on Philadelphia schools—so she understands how useful the Forum is to educators and researchers, and looks forward to a productive partnership with the Forum.

Meeting Review and Winter 2012 TECH Planning
TECH members suggested that planners consider experimenting with a new room configuration at the Winter 2012 Forum.

Suggested topics for the Winter 2012 meeting included:

  • P-20 Feedback Reports (using IPEDS data) – Kathy Gosa and Peter Tamayo volunteered to help with this topic
  • Teacher evaluation systems
  • ESEA Reauthorization
  • FERPA news (guidance should be final by then)
  • Sustainability of SLDS systems (perhaps a best practices discussion)
  • Assessment consortia data demands – how will they report data, generate indicators, etc.? – can CEDS, SIF, etc. help establish reporting standards for the consortia?
  • Moving systems to cloud computing
  • Classroom data issues (21st Century skills assessment) – Laurel Krsek and Kathy Gosa

TECH Closing Thoughts
TECH Chair Lee Rabbitt thanked the TECH members for an especially interesting and productive meeting. Incoming Chair Peter Tamayo thanked Lee for her stellar service as the TECH Chair in 2010-11.

Peter asked TECH members to look for email over the next few months as we begin our preparation for the Winter 2012 Forum in San Diego, California. Note that we will be meeting February 13–14, which is a departure from our normal meeting calendar.

Closing Session MS PowerPoint (600 KB)

NCES Update MS PowerPoint (2.14 MB)
Jack Buckley, Commissioner of NCES, joined the Forum to provide an update on NCES activities as well as preliminary results from several projects, including the High School Longitudinal Study, the National Assessment of Educational Progress (NAEP), and the Common Education Data Standards (CEDS) project.

  • "First Look" 2009 High School Longitudinal Study Findings – This project, which is following a nationally-representative sample of youth into the labor market, has released the first wave of data. Results thus far provide a wide range of information on 9th graders, including skill attainment, the effect of parents' education on students, course-taking patterns, and students' educational and occupational expectations.
  • NAEP – NAEP expects to release 15 reports in 2011. Commissioner Buckley reviewed 2010 NAEP results in Civics, U.S. History, and Geography. NAEP data can be used to show longitudinal trends in student achievement, as well as the widening or narrowing of achievement gaps over time.
  • CEDS – This project is a collaborative effort to develop voluntary, common data standards for a key set of variables. The stakeholder list is expanding as interest in the project grows. Commissioner Buckley emphasized that CEDS adoption is voluntary. Version 1 was released in September 2010. Version 2 aims to support all federal reporting, as well as to add a postsecondary focus. Importantly, the goal of CEDS is not only to support data collection for federal reporting, but also to get data back in the hands of school systems.

Standing Committee Progress Reports

Recognition of Completed Projects
The members of the Crime, Violence, and Discipline working group and the Section 508 working group were recognized and presented with plaques for the completion of their tasks and the publication of two new Forum documents, The Forum Guide to Crime, Violence, and Discipline Incident Data and The Forum Guide to Ensuring Equal Access to Education Websites. In addition, members of the Longitudinal Data Systems Task Force were recognized for the publication of the final book in the Traveling Through Time: The Forum Guide to Longitudinal Data Systems series, Book IV: Advanced LDS Usage.

Recognition of Forum Officers
The Forum thanked the 2010–2011 Officers for their service, and presented them with plaques.

Forum Election
The slate of proposed 2011–2012 officers was presented for a vote. The slate was seconded and then the Forum voted to approve the following members as 2011–2012 officers:

Chair: David Weinberger, Yonkers Public Schools (NY)
Vice Chair: Tom Ogle, Missouri Department of Elementary and Secondary Education
Past Chair: Kathy Gosa, Kansas State Department of Education
NESAC Chair: Cheryl McMurtrey, Mountain Home School District 193 (ID)
NESAC Vice Chair: Raymond Martin, Connecticut State Department of Education
PPI Chair: Tom Howell, Center for Educational Performance and Information (MI)
PPI Vice Chair: Sonya Edwards, California Department of Education
TECH Chair: Peter Tamayo, Washington State Office of Superintendent of Public Instruction
TECH Vice Chair: Laurel Krsek, Napa Valley Unified School District (CA)

Closing
Kathy Gosa expressed her amazement at the work the Forum is able to accomplish. This year has renewed her appreciation of the Forum and the productive partnerships it facilitates. She thanked everyone who supported the Forum in the past year.

In his opening comments as the Forum Chair, David Weinberger stated how impressed he was by the vast and expanding range of topics covered in-depth by the Forum. He finds the work of the Forum interesting, challenging, and energizing, and looks forward to the year to come.

Steering Committee

Monday, July 25, 2011

Welcome and Agenda Review
Forum Chair Kathy Gosa, Kansas State Department of Education, welcomed members of the committee and reviewed the agenda.

Sunday Review
Kathy reported to the committee on the Data Use Working Group, which met on Sunday morning. The group is developing a series of briefs to help key audiences take action with education data. There will be an introduction and three initial briefs that will be able to stand alone. The group also met on June 20 and on Sunday convened again to review drafts of the introduction and first brief, which is geared towards educators. The group planned a Webex for the middle of September. By the end of the calendar year, the group hopes to finalize Brief I and move on to a brief for administrators.

Next, Kathy, who also serves as Chair of the SEA Data Use Working Group, reviewed that group's recent work. The working group met for the second time on Sunday to advance its work to establish a common language and standard templates to help SEAs improve their processes for receiving and dealing with researchers' requests for data. The group also intends to develop a set of core practices around the sharing of data with researchers and will build use case examples around each template. In light of the regional education laboratory (REL) contracts, which are currently out for competition, there is a sense of urgency to this work. The group was joined by John Easton and Ruth Neild (NCEE) for its first meeting in June, and Ruth joined them again on Sunday. The group reviewed a draft outline on Sunday and plans to have draft templates ready for review in September. Steering Committee members suggested that a new name be given to the group, perhaps related to "researcher access," because the current name has caused some confusion.

The Communications Subcommittee convened for a short meeting Sunday afternoon. The group discussed some membership issues, focusing on the policy for accepting RELs as associates, as well as how to deal with membership vacancies. The committee also reviewed product and website statistics, some major enhancements to improve the user-friendliness and appearance of the Forum's website (e.g., new publications page), as well as a new Forum Overview PowerPoint presentation. The group decided to distribute Forum brochures to all Forum members for dissemination.

Review of Monday's Events
The New Member Orientation session went well, but low attendance was an issue. Low attendance (or late arrival) by new members to the Opening Session was also a concern. New members should be reminded to get to the Opening Session on time because they are introduced early in the session. Committee members suggested starting the Orientation session a bit later (perhaps 8:30 am) to improve attendance.

Several, but not all, vendor representatives removed themselves from meetings as a result of the vendor statement read at the beginning of each Standing Committee meeting.

Committee members were very pleased with the professional development session delivered by Neal Gibson, Arkansas Research Center. Though the topic was a difficult one, Neal delivered a very engaging and informative session.

Committee members said they greatly appreciated John Easton's (NCES) compliment of the Forum at the Opening Session. The assessment consortia session was also very well received and committee members agreed that it was a good decision to have state representatives, rather than consortia staff, speak. Throughout the day, members noted that there was good movement of speakers between committees, and good follow-up questions.

Other Issues
Committee members suggested that the time may be ripe to discuss teacher evaluation, and the Winter 2012 Meeting may be a good time to have a session on this topic.

Ghedam Bairu (NCES) gave the group a brief overview of the Education Data Cooperative Initiative (EDCI), the new family of programs at NCES. It seeks to ensure greater coordination and communication across several of the Center's activities related to data systems (including the Forum, CEDS, the SLDS Grant Program, PTAC, and other technical assistance efforts).

Tuesday, July 26, 2011

Review of Tuesday's Events
Committee members agreed that both the National Assessment of Educational Progress (NAEP) and the Integrated Postsecondary Education Data System (IPEDS) presentations were well received. The suggestion was made to consider inviting Elise Miller to present further updates on IPEDS in the future.

Chairs then reported on standing committee time. Laurel reported that PPI undertook a review of the Technical Briefs in a format that successfully blended SEA and LEA participants. The Technical Briefs would benefit from additional information on the topic of data governance. Throughout the day, discussions in PPI left the committee "fighting the clock." In NESAC, members were very interested in CEDS and Ed-Fi, and had a very good discussion on the CRDC. TECH reported that the day went well, with a lot of discussion surrounding CEDS. Members enjoyed discussions with Elise Miller and Kathleen Styles. The agenda was amended due to time constraints.

The committee then discussed meeting rooms. In PPI, the table was too long. NESAC, which used a configuration of small round tables, found that the setup was successful after the head table was moved closer to other tables.

Action Items
NESAC put forth the idea of an online FERPA course that would provide basic privacy lessons (e.g., do not email test scores from a home computer). Laurel suggested that the Forum privacy document should be a priority, and perhaps the focus should be a document for LEA use. Currently, the document is on hold pending new FERPA regulations. Ghedam agreed to check with Emily Anthony at NCES to determine what can be done with respect to FERPA and ensure that the Forum does not duplicate any similar PTAC efforts. Pat suggested that the document include information on how to transmit workforce data securely.

The committee also discussed the need for an educator-student data link working group, sponsored by NESAC. Pat expressed the opinion that Teacher of Record definitions are divergent, and the Forum should develop a best practices document before the definition diverges too much. David added that the target product may need to be short-lived, aimed at addressing current needs, with regular updates necessary to keep up with developments in the field. Members suggested an online-only release. Laurel agreed, and indicated that it would be best as an online document. This project will need to consider the Gates project on a similar topic, and will have to be sensitive to "values" that may be ascribed to "Teacher of Record" definitions.

All committees reported the results of their elections. Proposed Chairs and Vice Chairs for the 2011–2012 year are as follows:

NESAC Chair: Cheryl McMurtrey, Mountain Home SD 193 (ID)
NESAC Vice Chair: Raymond Martin, Connecticut State Department of Education
PPI Chair: Tom Howell, Center for Educational Performance and Information (MI)
PPI Vice Chair: Sonya Edwards, California Department of Education
TECH Chair: Peter Tamayo, Washington State Office of Superintendent of Public Instruction
TECH Vice Chair: Laurel Krsek, Napa Valley Unified School District (CA)

Wednesday, July 27, 2011

PPI discussed the teacher-student link topic in some detail and met with Ruth Neild (NCEE). Members reported liking the tempo and content of this Summer Meeting, but wondered whether there could be two options for professional development so that people felt like they had more choice (e.g., if the topic did not relate to their work). PPI also will look for clarification on how to archive/retire old Forum publications that are still accessible via the NCES publications page.

NESAC requested guidance on developing an educator-student link working group. Ghedam and staff will look into previous work on this topic.

TECH reported that the morning session included a continuance of the discussion on student record exchanges and e-Transcripts. TECH has also discussed the topic of the Ed-Fi standard from Dell. TECH members were very interested in CEDS, and the conversation extended beyond the committee meeting. TECH recommended topics for the meeting in San Diego, including P-20 feedback reports from IPEDS, reauthorization, FERPA, sustainability, PARCC, and teacher-student data links. Peter also expressed an interest in enterprise architecture, and Sonya agreed that it is a misunderstood topic.

The group discussed room configurations: TECH was interested in the NESAC configuration, which worked well for that committee. Standing committees agreed that it would be useful to get information on room configurations prior to the meetings (e.g., dimensions or even a photo).

PPI is concerned with membership. Some Forum members assume that PPI focuses solely on Forum policies. The purpose of each standing committee sometimes overlaps with those of the other committees. Ghedam agreed, and recommended that the Steering Committee review the purpose of each committee and work to ensure that agenda topics align with each committee's stated focus and purpose. NCES will also take a look at the Forum's Strategic Plan to ensure that the purpose of PPI is clear, as some members have come to believe, incorrectly, that the committee oversees the Forum. Committee members also suggested that the Policies and Procedures Manual be amended to mention the postsecondary and early childhood sectors, as the scope of our work and involvement with these other sectors has expanded and should be reflected in the Manual.

David Weinberger thanked Jack Buckley for his presence at the Closing Session and the Steering Committee.

Steering Committee calls occur the 3rd Friday of every month. The next call, which will focus on meeting evaluation results, was scheduled for September 16, 2011 at 12:00 pm EST.


Publications of the National Forum on Education Statistics do not undergo the formal review required for products of the National Center for Education Statistics. The information and opinions published here are the product of the National Forum on Education Statistics and do not necessarily represent the policy or views of the U.S. Department of Education or the National Center for Education Statistics.

