
Non-Cognitive Items (Questionnaires)

      

  • Student Questionnaire
  • Teacher Questionnaire
  • School Questionnaire
  • SD/ELL Questionnaire
  • Department Head Questionnaire


In addition to assessing subject area achievement, the National Assessment of Educational Progress (NAEP) collects information related to student achievement from participating students, teachers, and schools through non-cognitive (contextual) questionnaires. This information serves, in part, to fulfill reporting requirements of federal legislation. Specifically, under the No Child Left Behind Act, NAEP is required to collect information on and report achievement results disaggregated by the following variables, when possible: gender, race and ethnicity, socioeconomic status (SES), disability status (SD), and English language learner (ELL) status. (Note that the term English language learner has been used in NAEP reports since and including 2005; the term limited English proficient (LEP) was used on SD/ELL questionnaires administered to schools up to and including 2005.) Information from the contextual items also serves to give context to NAEP results and/or to allow researchers to track factors associated with academic achievement.

Recent History

In early 2002, the National Assessment Governing Board was granted final authority over the non-cognitive items. The Governing Board adopted a policy to focus NAEP non-cognitive data on the primary purpose of the National Assessment—to provide sound, timely information on the academic achievement of students in the United States (National Assessment Governing Board, 2003). The Board also initiated a process to prepare a general framework to guide the collection and reporting of non-cognitive (contextual) data. The Background Information Framework for NAEP, developed in 2003, defines the purpose and scope of NAEP non-cognitive (contextual) data, and calls for a long-term plan for continued development. In response to this call, the National Center for Education Statistics (NCES) developed the NCES Plan for NAEP Background Variable Development, which provides a general procedural map for the development and review of each type of contextual data (National Center for Education Statistics, 2004).

Types of Non-Cognitive (Contextual) Items

There are three types of non-cognitive (contextual) data: student reporting categories, other contextual/policy information, and subject-specific information. While there are some differences in the approaches to the development of each type of data, shared principles underlie all three: 

  • The Governing Board provides initial guidance on what will be developed;
  • The Governing Board has multiple opportunities to review and provide input; and
  • The overall development process seeks to reduce burden on respondents and ensure data quality while continuing to meet the needs of the NAEP program.

Descriptions of the three types of contextual data are below.

  • General Student Reporting Categories

    • Since the first NAEP assessment in 1969, achievement results have been disaggregated by subgroups of the population. Achievement has also been presented for and compared across subgroups. As mentioned earlier, since the inception of the No Child Left Behind Act, NAEP has collected information on and reported achievement results disaggregated by the following variables: gender, race and ethnicity, socioeconomic status, disability status, and English language learner status.

    • NCES monitors the quality of the data collected using the current measures and will develop new approaches to measuring student reporting variables when warranted.

  • Contextual/Policy Information

    • In every assessment, NAEP collects data on basic characteristics of the school and its student body; teacher background, qualifications, and experience; and several student characteristics. These variables provide a basic context for achievement.

  • Subject-Specific Information

    • The subject-specific items in NAEP are limited in number and narrowly focused. A set of key issues within each subject area is addressed in depth across the life of each assessment framework.

    • When a new assessment framework is approved, NCES reviews the recommendations for contextual data made by the framework committee. NCES then develops an issues paper to reflect those priorities, and identifies the data needed to address the issues.

Non-cognitive (contextual) items and contextual variables associated with the categories described above are placed within student, teacher, school, and/or SD/ELL questionnaires, as appropriate. The placement of items and content of each questionnaire depend on the questionnaire respondent and the specific subject(s) NAEP is assessing in a given year. Often questionnaires measure similar constructs across respondents and/or subjects to provide additional information and, in some cases, to validate findings.

  • Student questionnaires ask respondents to provide information about factors such as race or ethnicity, school attendance, and academic expectations. Responses to items on the questionnaires also provide information about factors associated with academic performance, students’ educational settings and experiences, students’ effort on the assessment, and students’ perceptions of the difficulty and importance of the assessment.

  • Teacher questionnaires ask respondents to indicate teacher background, training, and instructional practices. (Teacher questionnaires are completed by teachers at grades 4 and 8; NAEP does not collect teacher information at grade 12.)

  • School questionnaires, which are completed by the principal or assistant principal, ask respondents to provide information on school policies and characteristics. There is also a supplemental charter school questionnaire designed to collect information on charter school policies and characteristics. Principals (or other administrators) of charter schools with students sampled to participate in NAEP complete both the school questionnaire and the charter school supplement.

  • Questionnaires about students with disabilities (SD) or English language learners (ELL) ask respondents to provide information about sampled students who have disabilities or limited English proficiency. These questionnaires are completed by a special education teacher, a bilingual education/English-language-learner teacher, or the staff member most familiar with the student.

In 2006 and 2012 NCES also administered a department head questionnaire for grade 12 economics. Within each participating school, the questionnaire was administered to the chair or lead teacher of every department that offered at least one economics-related course. The questionnaire asked the respondent to provide information about the characteristics of the department’s faculty, hiring requirements, and courses offered by the department. There are currently no plans to administer the department head questionnaire in every NAEP assessment.

Non-Cognitive Item Development Process

Non-cognitive items are developed through a process similar to that used for developing the cognitive items. It includes reviews by external advisory groups, cognitive interviews, and pilot testing. When developing the items, NAEP ensures that the items do not infringe on respondents' privacy, that they are grounded in educational research, and that the answers can provide information relevant to the subject being assessed. The following is an overview of the development process for non-cognitive items:

  1. The National Assessment Governing Board oversees the development of the content framework that influences the contextual factors of interest to be measured via the background items. More details about this process are provided in the Background Information Framework (National Assessment Governing Board, 2003).
  2. When a new assessment framework is approved, or when new policy issues are identified for NAEP to address, NCES develops an issues paper to reflect the new priorities, and identifies the data needed to address the issues. The development of the issues paper involves conducting a literature review to identify recent developments for the respective issues.
  3. NAEP contractors that specialize in survey development draft and revise non-cognitive items based on the recommendations of the issues paper and the input of a panel of experts in the relevant fields, convened to help ensure that the items are appropriate and relevant to practice and policy.
  4. NCES then reviews the non-cognitive items to ensure fairness and quality so that NAEP’s mission of providing a fair and accurate measure of student achievement and achievement trends over time is fulfilled (see NCES Statistical Standards).
  5. New and revised items undergo cognitive interview testing, in which respondents are interviewed to identify potential problems with their comprehension of the contextual items and their ability to provide reliable and valid answers. Based on the results of the cognitive interviews, some items may be dropped or further revised (e.g., to reduce potentially confusing language or improve response options) prior to the pilot.
  6. The items are piloted, and the results are analyzed.
  7. Based upon pilot data results, some items are revised.
  8. The non-cognitive items once again undergo reviews by item development contractors and then by NCES.
  9. NCES presents items to the Governing Board before a pilot or operational administration for its approval, as specified in the Education Sciences Reform Act (P.L. 107-279). The Board has "final authority on the appropriateness of all assessment items" and is required "to take steps to ensure that all items selected for use in the National Assessment are free from racial, cultural, gender, or regional bias and are secular, neutral, and non-ideological."
  10. Prior to cognitive interviews, pilot testing, and operational administration, the items are submitted for clearance by NCES to the Office of Management and Budget, which checks to make sure the items comply with government policies.
  11. After pilot or operational administration clearance is received, print-ready files are created for each student, teacher, school, and/or SD/ELL questionnaire.

Non-Cognitive Data

The purpose of administering non-cognitive items is to give context to NAEP results and/or to track factors associated with academic achievement. The data are also the basis for NAEP’s major reporting groups. It is important to note, however, that because NAEP is based on a cross-sectional design, it is not possible to infer cause-and-effect relationships; NAEP cannot prescribe what should be done. Rather, its descriptions of the educational circumstances of students at various achievement levels, considered in light of research from other sources, may provide important information for public discussion and policy action (National Assessment Governing Board, 2003). For more information regarding how NAEP data are analyzed and reported, refer to the “Results” section of NAEP’s Frequently Asked Questions or the Background Information Framework for the National Assessment of Educational Progress (NAEP) developed by the Governing Board. For more information on how you can explore and manipulate NAEP data, go to the NAEP Research e-Center or the NAEP Data Explorer. Please note that in the NAEP Data Explorer, the results of the contextual questionnaires are sorted into eight broad categories:

  • major reporting groups,
  • student factors,
  • factors beyond school,
  • instructional content and practice,
  • teacher factors,
  • school factors,
  • community factors, and
  • government factors.


Last updated 02 June 2014 (GF)