
NAEP Presentations at the 2012 CCSSO National Conference on Student Assessment,
June 26–29, Minneapolis, Minnesota

Below is a list of NAEP-related sessions excerpted from the preliminary conference program (as of May 2) posted on the Council of Chief State School Officers (CCSSO) National Conference on Student Assessment (NCSA) web pages. Consult the program for pre- and post-sessions. When you arrive at the conference, please check for any schedule changes.

Be sure to visit the NAEP booth (#206) at NCSA: see demonstrations of the web tools, browse the latest reports and publications, get answers to your questions, learn about NAEP and social media, or just say hello.

NAEP presentations are scheduled for:
Wednesday, 6/27/2012
Friday, 6/29/2012
 

==========

Wednesday, 6/27/2012

10:30 a.m.–11:30 a.m.   Marquette 3 (Hilton Minneapolis)

Using NAEP to Understand the Past & Champion the Future: Data-Driven Decisions and Best Practices
Lead Presenter: Iris L. Garner
Co-presenters: Carla Collins, Wendy Geiger, and Rhonda Patton

During a time when change is the norm and challenges are more prominent, best practices are in great demand. The National Assessment of Educational Progress (NAEP), also known as the Nation's Report Card, can bring a better understanding of assessment design, data analysis, and reporting. While striving to establish what teaching, learning, and assessment will be over the next decade is essential to the mission at hand, it is imperative that an extensive understanding of the past decade be compiled, reviewed, and included in the conversation. These conversations will help fuel the efforts of our nation's educators.

"History cannot give us a program for the future, but it can give us a fuller understanding of ourselves, and of our common humanity, so that we can better face the future."
-- Robert Penn Warren

The panelists will share experiences from four different states: Alabama, New York, North Carolina, and Virginia. Their combined expertise represents more than ten decades in education and a wide range of roles: parent, teacher, principal, grant writer, program director, NAEP coordinator, policy consultant, central office administrator, and assessment director. Their backgrounds will contribute to rich dialogue and problem solving.

During this session, the following questions will be discussed: What do students in the United States know in mathematics and reading at grades four and eight? What strategies are available today to all stakeholders to further their understanding?

In an effort to address the real issues, the panelists and participants will revisit the past and engage in authentic dialogue about data analysis and resources.

The first panelist will provide a timeline of NAEP and the nation’s performance in reading and mathematics. A snapshot of what the students know in mathematics and reading along with influences, milestones, and trends will be illustrated.

The second panelist will showcase data tools: the NAEP Data Explorer, the NAEP Questions Tool, and the NAEP State Comparisons tool. The tools cover 10 subject areas, state comparisons, and comprehensive data dating back to 1990. This exploration will highlight the relevancy of data and the urgency of multiple data streams. This component of the presentation will be unique in that all participants will have the opportunity to obtain strategies for data teams.

The third panelist will explore the building blocks of assessment literacy. Strategies for building, implementing, and sustaining assessment literacy using NAEP as the main source will be outlined. This component will be fashioned so that participants can easily implement and/or improve assessment literacy in their states. Assessment literacy blueprints will be made available, with more than three hundred key assessment terms and concepts, database tutorials, and interactive data tools. An implementation plan will be shared and milestones will be highlighted.

The last contributor will share how NAEP has influenced assessment literacy in Yonkers, New York, and how the district has engaged students, parents, teachers, and principals.

This interactive session is specifically crafted for exploration and exchange: an environment conducive to the blossoming of innovation. Don't miss the opportunity to capture the history and strategically plan for the coming decades!

“In order to plan your future wisely, it is necessary that you understand and appreciate your past.”
-- Jo Coudert



1:30 p.m.–3:00 p.m.   Marquette 1 (Hilton Minneapolis)

Quality Assurance in a Chaotic World
Lead Presenter: D. E. (Sunny) Becker
Discussant: Lauress Wise
Moderator: Yvette Nemeth-Jones
Co-presenters: Jennifer Dugan, Kris Ellington, Roger Ervin, Maridyth McBee, and Jennifer Stegman

Recent expansions of assessment programs and the need to report high-stakes results accurately and quickly have strained the capacities of state agencies and testing companies. Any flaw in these programs, such as a scoring/analytic error, inadvertent miscoding, or deliberate cheating, is subject to the hot spotlight of public scrutiny. We will explore six quality assurance (QA) perspectives: two states with well-established QA systems, two states with QA systems in transition, and two researchers who have advised national and other testing programs.

The sheer amount of information required to assess the achievement of every student in several grades is daunting. Add to this the pressure to report results quickly enough to inform remediation, grade-level promotion (or denial), high school graduation, and school AYP ratings, and the opportunity for errors increases at an alarming rate. QA comprises a broad set of activities designed to prevent errors, with safety nets to detect and correct errors when they do occur. QA principles are relevant to every step of the testing cycle, including item and test development, administration, scoring, analysis, and reporting. Session presenters will describe their efforts to ensure quality on a variety of state- and national-level testing programs.

Examples of well-established state QA systems will be explicated by representatives of Kentucky and Florida. Kentucky has employed redundant psychometric processing (either internal or via external contractor) as a quality control step since the late 1990s. Kentucky's quality assurance system proactively limits potential data errors and routinely evaluates feedback on system validity by collecting data that can be linked to important system goals and outcomes. Florida employs best practices in quality assurance through direct involvement of experts, redundancy at critical points, external data checkers, and quality assurance auditors. Areas of focus for QA include calibration and equating, handscoring, data quality, document processing, and erasure/similarity of response pattern analysis.

Representatives of two states will describe systems in transition: Oklahoma and Minnesota. Oklahoma has only recently used an external quality assurance provider to certify the accuracy of its state testing program. The state needed verification that the testing contractor had used the correct aggregation procedures when producing summary data files and score reports. In the future, Oklahoma intends to expand the role of the external quality assurance provider to scoring algorithms and calibration/equating. Minnesota transitioned to a different vendor during the past year. A state representative will share QA processes and experiences from that transition across all aspects of the assessment program.

Finally, two presenters will describe quality assurance issues from the perspectives of external agencies: the Human Resources Research Organization (HumRRO) and edCount. Since 2002 the National Center for Education Statistics (NCES) has contracted with HumRRO as the quality assurance contractor on the National Assessment of Educational Progress (NAEP). In that role HumRRO monitors the work of operational NAEP contractors and provides feedback and guidance to strengthen routine QA practices and anticipate future areas of risk. edCount has helped assure the quality of data systems by providing technical assistance for the Puerto Rico Department of Education and the Laurent Clerc Center. This assistance contributed to documentation to aid in the coordination of QA principles across divisions, and to recommendations that will ultimately improve the quality and utility of their assessments.

In the latest incarnation of state testing, the PARCC and SBAC consortia are proceeding with ambitious schedules to develop newer, highly visible, and thoroughly scrutinized assessments. Now is the time to build in measures that ensure high-quality assessment programs. We aspire to highlight lessons learned and provide early guidance for consideration by PARCC and SBAC as the CCSS assessments come to fruition.


1:30 p.m.–3:00 p.m.  Marquette 4 (Hilton Minneapolis)

Enhancing Accessibility, Validity, and Utility in Next Generation Science Assessments
Lead Presenter: Nancy Doorey
Discussant: Cheryl Alcaya
Moderator: Nancy Doorey
Co-presenters: Geneva Haertel, Mark Wilson, and Peggy Carr

This symposium examines three assessment projects seeking to improve the accessibility, validity, and utility of technology-enhanced science assessments. These projects include Principled Science Assessment Design for Students with Disabilities (PADI-SE) to ensure tasks are accessible and valid for all students, an ATC21S-sponsored effort led by UC Berkeley researchers to assess students' competence on four dimensions of ICT literacy, and the National Assessment of Educational Progress' (NAEP's) plans to integrate traditional science assessment items with interactive computer tasks and hands-on tasks.

Technological innovations hold abundant promise for improving educational assessment, but they can also pose significant threats to the accessibility, validity, and utility of those assessments.  This symposium examines three distinct approaches to developing technology-enhanced science assessments and the practical implications of the lessons learned to date on these projects.  Through examining diverse approaches to assessing science with technology, this symposium seeks to provide useful guidance to states and policymakers implementing technology-enhanced assessments across K12 subject areas.

The discussant is a supervisor in a state department of education who oversaw the development of one of the most innovative state science assessment programs in the nation.  The chair currently directs program development work for an innovative education center that focuses on advancing K12 assessment.

PADI-SE, a USED-funded project led by SRI International, aims to improve the design of science assessments by identifying any and all assumptions and expectations of assessments that are distinct from the targets of measurement.  The PADI-SE presentation will discuss how the team has applied the principles of Universal Design for Learning in an evidence-centered design (ECD) framework to maximize the accessibility of their science assessments for all students, including those with high-incidence disabilities.

With funding from ATC21S, a team of researchers at UC Berkeley's BEAR Center is developing innovative tasks that assess students' competence on four dimensions of ICT literacy, specifically in the area of "learning in a digital community": being a consumer, being a producer, increasing their social capital, and increasing their intellectual capital through networking. This session will examine how the measurement of ICT skills such as these can be used to create a richer portrait of students' learning and communication skills than traditional assessment methods.

NAEP is undergoing an exciting rethinking and renewal of its science assessment, in which the assessments will be computer-delivered and adaptive, and will integrate more traditional selected- and constructed-response questions with interactive computer tasks and hands-on tasks.  The NAEP presentation will discuss the challenges and promises of developing authentic assessments with disparate components, and of reporting the results on a single scale that can be linked back to paper-and-pencil assessments from previous years.

Taken collectively, the three approaches will provide a coherent picture of the challenges and promise inherent in using technology to assess science ability across a range of purposes, including formative and summative environments, linear and adaptive formats, and selected response and hands-on tasks.  The symposium thus intends to provide conference attendees with diverse, authentic examples that they can apply to other K12 technology-enhanced assessment projects they are working on, regardless of whether these assessments are in science or a different content area.

 

3:30 p.m.–5:00 p.m.   Marquette 1 (Hilton Minneapolis)

Tracking Readiness: Perspectives From Three States
Lead Presenter: Catherine Welch
Co-presenters: Scott Norton, Gayle Potter, and Stephen Dunbar

Since the announcement of the Race to the Top program, college and career readiness (CCR) has become another measure of the quality of K-12 education. As ESEA is reshaped, the implementation of college- and career-ready standards, high-quality assessments, and the measurement of student growth toward college readiness remain a critical component of the newly identified waiver requirements. The purpose of this symposium is to examine a variety of ways that states are monitoring, measuring, and understanding college and career readiness.

Recent research by the College Board (2010), ACT (2011), and NAEP (2010) suggests that a high percentage of students need remediation after high school to be ready for college-level courses. This research has contributed to the call to identify as early as possible students who are not on track to be college or career ready. Early identification allows time for careful course planning and appropriate interventions for students.

The symposium consists of four papers, ranging from policy issues to cross-state validation approaches. The papers represent different critical issues related to the topic of measuring college and career readiness, including the methodology adopted and results reported, policies and initiatives, and the impact on various groups of the student population.

The first paper will address Louisiana's initiative to build on the state's strengths through the High School Redesign initiative and the High Value Schools resource. Louisiana educators are working to create and support schools that are designed to prepare students to succeed, whether they go directly into the workforce or continue their academic pursuits after earning a high school diploma or GED.

The second paper will discuss Arkansas’ Smart Future initiative, which addresses the state’s priority of enhancing postsecondary opportunities and ensuring student success. The presenter will describe reform efforts in curriculum and coursework as well as review indicators the state is using to report and document the effects of the initiative such as rates of remedial college coursework and timely graduation.

The third paper describes the strong link between assessment results in Iowa and admissions test scores associated with ACT readiness benchmarks. Based on the Iowa standard score and vertical scale, Iowa schools are able to link student performance in earlier grades to the likelihood of success in credit-bearing postsecondary courses. Interpretations related to being on track for readiness are being provided to students and parents beginning in grade 6.

Various studies have been conducted to define, implement, and assess college readiness for diverse student populations (e.g., Bustamante, Slate, Edmonson, Combs, Moore, & Onwuegbuzie, 2010). To better understand performance and college readiness in culturally and linguistically diverse groups, the fourth paper presents long-term analyses conducted to monitor the growth trends of these students in reaching CCR benchmarks. Comparability of readiness indicators across diverse student populations is examined.


Friday, 6/29/2012

9:00 a.m.–10:30 a.m.  Marquette 7 (Hilton Minneapolis)

Linking Scores From Different Assessments: Evaluating Approaches for Comparisons of Different Groups Across Different Tests
Lead Presenter: Wayne Camara
Discussant: G. Gage Kingsbury
Moderator: Hillary Michaels
Co-presenters: Lauress Wise, Kristen Huff, and Joseph Martineau

The desire to treat scores obtained from different tests as if they were interchangeable, or at least comparable in some more limited sense, is hardly new. Linn (2005) describes various efforts to compare scores across different tests going back 50 years. This panel discussion will focus on the increased demand for cross-state comparisons of student performance and what level of comparability can be expected. Presenters will discuss conditions and elements needed for comparing assessments such as PARCC to SBAC and NAEP to state assessments. They will also explore limitations and common misperceptions.

 

11:00 a.m.–12:00 p.m.   Marquette 3 (Hilton Minneapolis)

The Hidden Gem of NAEP: Contextual Data
Lead Presenter: Laura Coward Egan
Co-presenters: Paula Hutton, Angie Mangiantini, Jan Martin, and Paul Stemmer Jr.

NAEP contextual variables offer a wealth of descriptive data that are publicly available but commonly underutilized. Although certain cautions must be taken when using NAEP contextual data, which are cross-sectional with independent measures over time, they can be used to answer specific research questions as well as for exploratory data analysis. With depth, breadth, and trend data for many variables, NAEP non-cognitive data can enhance understanding of the changing environment that shapes student learning both within and outside of the classroom, as well as the shifting educational context in which education and assessment practitioners work.

This session will begin with a brief overview of NAEP contextual data, including the types of variables available, how these data are collected and from whom, and how to analyze these data using the publicly available NAEP Data Explorer. The presenters will then share replicable examples of how NAEP contextual data are used within their state education agencies to inform decisions, monitor changes, and foster dialogue about key issues facing state education agencies. Presenters will share how NAEP contextual data have informed the following research questions using industry best practices:

    South Dakota: How are teachers structuring the English Language Arts curriculum in grade 8? As part of baseline data for a literacy integration project at the middle and high school levels, state and NAEP survey data from students, teachers, and administrators were used to provide the context to better understand the need to integrate literacy strategies across the curriculum, and for secondary Career and Technical Education courses in particular.
 
    Michigan: How do cognitive mediational strategies and coursetaking affect student achievement? NAEP contextual data have been used to examine cognitive mediational strategies for reading literacy and the effects of mathematics course selection on achievement. Michigan has shared these findings with the general public, in combination with student achievement data, to create a dialogue about how these variables may be affecting student achievement and what further, more direct studies may be required.

    Washington: What is the current state of science and STEM education in Washington, and how does this impact students’ opportunity to learn science? A White Paper was developed to describe the current state of science and STEM education in Washington State and garner support from policymakers, educators, and the public to support a statewide system of science education. This white paper included NAEP data on instructional time and concluded that time spent teaching science at the secondary level is likely not a significant factor in students’ opportunity to learn science.

    Maine: Are students in Maine ready to transition to the more rigorous coursework that the common core standards entail? In preparation for the transition to the common core standards, the Maine education agency is examining the perceived rigor of current coursework from the perspectives of both students and teachers, as well as opportunities for enhanced professional development.

These best practices will be framed in a broader discussion of how NAEP contextual data can be used at the state level to communicate about policy, education, and assessment topics with diverse audiences, and to target areas for more direct research.

==========

Presentations from the 2011 NCSA and 2010 NCSA have been archived, as have those from previous AERA conferences.



Last updated 03 May 2012 (NB)
National Center for Education Statistics - http://nces.ed.gov
U.S. Department of Education