
Frequently Asked Questions

In addition to the following questions about ICILS, more FAQs about international assessments are available at http://nces.ed.gov/surveys/international/faqs.asp.


ICILS is a computer-based international assessment of eighth-grade students. ICILS assesses students’ capacities to use information and communications technologies (ICT) productively for a range of different purposes in ways that go beyond basic use. ICILS assesses two domains: computer and information literacy (CIL) and computational thinking (CT).

  • CIL, first assessed in 2013, focuses on understanding computers, gathering information, producing information, and digital communication.
  • CT, first assessed in 2018, focuses on conceptualizing problems and operationalizing solutions. CT is an optional assessment for education systems participating in ICILS.

More information about ICILS can be found in the ICILS 2023 Assessment Framework on the IEA website.

The ICILS 2023 Assessment Framework defines CIL as an “individual’s ability to use computers to investigate, create, and communicate in order to participate effectively at home, at school, in the workplace, and in society.” The framework divides CIL into four strands: (1) understanding computer use, (2) gathering information, (3) producing information, and (4) digital communication. The strands are the overarching conceptual categories framing the skills and knowledge assessed by the instruments. Each strand comprises aspects, which refer to the specific content categories within the strand.

Distribution of score points and percentages across CIL strands and aspects

Strand/aspect                                            Score points   Percentage of total
Strand 1. Understanding computer use                     18             12%
  Aspect 1.1: Foundations of computer use                3              2%
  Aspect 1.2: Computer-use conventions                   15             10%
Strand 2. Gathering information                          33             21%
  Aspect 2.1: Accessing and evaluating information       22             14%
  Aspect 2.2: Managing information                       11             7%
Strand 3. Producing information                          76             48%
  Aspect 3.1: Transforming information                   22             14%
  Aspect 3.2: Creating information                       54             34%
Strand 4. Digital communication                          30             19%
  Aspect 4.1: Sharing information                        12             8%
  Aspect 4.2: Using information responsibly and safely   17             11%
Total                                                    150            100%

More information about CIL can be found in the ICILS 2023 Assessment Framework on the IEA website.

The ICILS 2023 Assessment Framework defines CT as an “individual’s ability to recognize aspects of real‐world problems that are appropriate for computational formulation and to evaluate and develop algorithmic solutions to those problems so that the solutions could be operationalized with a computer.” The framework divides CT into two strands: (1) conceptualizing problems and (2) operationalizing solutions. The strands are the overarching conceptual categories framing the skills and knowledge assessed by the instruments. Each strand comprises aspects, which refer to the specific content categories within the strand.

Distribution of score points and percentages across CT strands and aspects

Strand/aspect                                                  Score points   Percentage of total
Strand 1. Conceptualizing problems                             20             31%
  Aspect 1.1: Knowing about and understanding digital systems  9              14%
  Aspect 1.2: Formulating and analyzing problems               4              6%
  Aspect 1.3: Collecting and representing relevant data        7              11%
Strand 2. Operationalizing solutions                           45             69%
  Aspect 2.1: Planning and evaluating solutions                19             29%
  Aspect 2.2: Developing algorithms, programs and interfaces   26             40%
Total                                                          65             100%
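As a quick arithmetic check, the percentage column above can be reproduced from the score points. This is a minimal Python sketch; the aspect labels are copied from the table, and rounding to whole percents is an assumption that happens to match the published column.

```python
# Reproduce the CT table's "Percentage of total" column from its score
# points (values copied from the table above).
points = {
    "Knowing about and understanding digital systems": 9,
    "Formulating and analyzing problems": 4,
    "Collecting and representing relevant data": 7,
    "Planning and evaluating solutions": 19,
    "Developing algorithms, programs and interfaces": 26,
}
total = sum(points.values())                              # 65 score points overall
share = {k: round(100 * v / total) for k, v in points.items()}

print(total)                                              # 65
print(share["Planning and evaluating solutions"])         # 29
```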

More information about CT can be found in the ICILS 2023 Assessment Framework on the IEA website.

CIL assessment modules include the following essential features:

  • Students complete tasks solely on a computer.
  • The tasks have a real-world, cross-curricular focus.
  • The tasks combine technical, receptive, productive, and evaluative skills.
  • The tasks reference safe and ethical uses of computer-based information.

CIL test modules follow a linear narrative and comprise a set of questions and tasks based on real-world themes (e.g., design a web page representing a band featured in a school band competition). Modules start with a series of smaller tasks that focus on skill execution and information management. These smaller tasks typically take less than 1 minute to complete and prepare the student to carry out a larger task, which typically takes 10–15 minutes to complete.

Example CIL task from Breathing Module

Copyright © 2023 International Association for the Evaluation of Educational Achievement (IEA).

The CIL assessment takes approximately 80 minutes to complete: a 20-minute tutorial followed by two modules of 30 minutes each. After students complete the CIL assessment, they complete a 30-minute questionnaire.

Following the ICILS 2023 assessment, the IEA released two CIL modules (Breathing and School Trip). Visit the ICILS Released Assessment Questions page for access to the modules.

CT test modules share the same essential features as CIL modules but have an additional focus on systems thinking and development of algorithmic solutions. Modules comprise a set of questions and tasks based on real-world themes that can be addressed with computer‐based solutions. The tasks assess a range of technical competencies as well as critical thinking, problem‐solving, and evaluation skills. In addition, some tasks require students to create and execute block‐based algorithms that are designed such that students can demonstrate aspects of computational and algorithmic thinking without the need to learn the syntax or features of a specific programming language.

Example CT block‐based algorithms task from the Farm Drone Module

Copyright © 2023 International Association for the Evaluation of Educational Achievement (IEA).

In countries participating in the CT option, students complete two of the CT test modules in randomized order after they complete the CIL test and the student questionnaire. Students have 50 minutes to complete the CT test.

Following the ICILS 2023 assessment, the IEA released two CT modules (Farm Drone and Automated Bus). Visit the ICILS Released Assessment Questions page for access to the modules.

The ICILS reporting scales for computer and information literacy (CIL) and computational thinking (CT) were established in ICILS 2013 and ICILS 2018, respectively, setting the mean of national average scores to 500 and the standard deviation to 100. Subsequent ICILS assessments transformed achievement data to this metric so that CIL scale scores across ICILS cycles are directly comparable, as are CT scale scores across cycles. Items from each assessment and their relative difficulty were analyzed to identify knowledge, skills, and understanding demonstrated by students that could be used to characterize the different levels (i.e., ranges) on the respective achievement scales. From this process, proficiency levels were established for each scale. The proficiency levels are syntheses of the common elements of knowledge, skills, and understanding described by the items within each level. They also describe the typical ways in which students working at a level demonstrate their proficiency.

The proficiency levels are considered hierarchical in that a student’s proficiency becomes more sophisticated as student achievement progresses up the scale. Therefore, it is assumed that students performing at a higher proficiency level are also able to perform tasks/skills at lower proficiency levels.
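The mean-500, SD-100 metric described above can be illustrated with a small sketch. The national averages below are made-up numbers, and standardizing with the population SD across countries is a simplifying assumption; actual ICILS scaling uses IRT-based procedures and cross-cycle linking.

```python
import statistics

# Hypothetical national average scores (invented numbers, not ICILS data).
national_averages = [487.0, 512.5, 530.0, 471.5, 499.0]

mu = statistics.mean(national_averages)
sigma = statistics.pstdev(national_averages)   # population SD across countries

# Linear transformation that fixes the mean at 500 and the SD at 100.
rescaled = [500 + 100 * (x - mu) / sigma for x in national_averages]

print(round(statistics.mean(rescaled), 6))     # 500.0
print(round(statistics.pstdev(rescaled), 6))   # 100.0
```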

ICILS proficiency level ranges

Proficiency level   CIL scale ranges    CT scale ranges
Level 4             661.001 and above   660.001 and above
Level 3             576.001‒661         550.001‒660
Level 2             492.001‒576         440.001‒550
Level 1             407.001‒492         330.001‒440
More information about the proficiency levels and the skills students demonstrate at each level can be found in the ICILS 2023 International Report on the IEA website.
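The cut points in the table above can be turned into a simple lookup. This is an illustrative sketch, not an NCES or IEA tool; the function name and the "Below Level 1" label for scores under the Level 1 lower bound are assumptions of this example.

```python
# Lower bounds (inclusive) for each proficiency level, highest level first,
# taken from the ICILS proficiency level ranges table above.
CIL_CUTS = [(661.001, "Level 4"), (576.001, "Level 3"),
            (492.001, "Level 2"), (407.001, "Level 1")]
CT_CUTS = [(660.001, "Level 4"), (550.001, "Level 3"),
           (440.001, "Level 2"), (330.001, "Level 1")]

def proficiency_level(score, cuts):
    """Return the proficiency level whose range contains `score`."""
    for lower_bound, level in cuts:        # highest level checked first
        if score >= lower_bound:
            return level
    return "Below Level 1"                 # hypothetical label for this sketch

print(proficiency_level(700, CIL_CUTS))    # Level 4
print(proficiency_level(500, CT_CUTS))     # Level 2
print(proficiency_level(300, CIL_CUTS))    # Below Level 1
```

Because the levels are hierarchical, a single lower-bound check per level is enough: any score at or above a level's lower bound but below the next level's bound lands exactly once.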

ICILS assesses students’ ability to use computers to investigate, create, and communicate in order to participate effectively in a wide variety of environments—at home, at school, in the workplace, and in the community—rather than assessing only what they learn in specific classes.

Because computer and information literacy is not always attained in or confined to a specific classroom, the ICILS assessment is accompanied by questionnaires for students, teachers, information and communications technologies (ICT) coordinators, and school principals that aim to get a better understanding of students’ opportunities to gain computer and information literacy skills both inside and outside the classroom and across subjects.

ICILS collects and analyzes data to provide a better understanding of today’s pervasively digital context as well as of students’ skills in information management, communication, and computational thinking. This includes aspects related to digital citizenship, reflecting young people’s increasing opportunities for online citizenship participation and helping measure progress toward UNESCO’s Sustainable Development Goal 4.4 (increasing the number of young people who have relevant skills for employment).

For a list of education systems that have participated in every ICILS cycle, see the table of participating countries.

Assessment year and survey   Number of participating schools   Number of participants   Overall weighted participation rate
2018
  Student survey1            263                               6,790                    70%
  Teacher survey1            259                               3,218                    65%
2023
  Student survey1            118                               2,352                    56%
  Teacher survey2            95                                898                      48%

1 Did not meet the guidelines for a sample participation rate of 85 percent for the student assessment and is therefore not included in the international average.
2 The response rate for sampled teachers in 2023 in the United States was below 50 percent. As a result, the U.S. teacher data are not included in the international average or presented in the main tables of any ICILS 2023 international report; instead, they appear in the appendix of the international report that presents teacher data.

ICILS has a stratified two-stage probability cluster sampling design. During the first stage of sampling, schools are selected systematically with probability proportional to their size (PPS), as measured by the total number of enrolled target-grade students. For ICILS 2013 and ICILS 2018, a random sample of students enrolled in the target grade was selected within each school during the second stage. For ICILS 2023, a random sample of target-grade classes was selected within each school during the second stage.  
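The first stage described above (systematic selection of schools with probability proportional to size) can be sketched roughly as follows. The school names and enrollment counts are invented, and real ICILS sampling adds stratification, replacement schools, and handling of very large (certainty) schools.

```python
import random

# Frame of schools with target-grade enrollment (invented numbers).
schools = {"School A": 120, "School B": 45, "School C": 250,
           "School D": 80, "School E": 210, "School F": 95}
n_sample = 3

total = sum(schools.values())            # 800 students in the frame
interval = total / n_sample              # systematic sampling interval
start = random.uniform(0, interval)      # single random start
hits = [start + i * interval for i in range(n_sample)]

# Walk the cumulated enrollment; a school is selected when a hit falls
# inside its cumulative range, so P(selection) is proportional to size.
selected, cum = [], 0.0
remaining = iter(hits)
hit = next(remaining)
for name, size in schools.items():
    cum += size
    while hit is not None and hit <= cum:
        selected.append(name)
        hit = next(remaining, None)

print(selected)   # three schools; larger schools are more likely to appear
```

Because every school here is smaller than the sampling interval, no school can be selected twice; in practice, schools larger than the interval are treated as certainty selections.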

Since 2013, ICILS has been administered every 5 years, with the United States participating in 2018 and 2023.

Schools cannot sign up to participate in ICILS as part of the national U.S. sample. To have fair comparisons across countries, it is important that each country only includes in its national sample those schools and students scientifically sampled by the international contractor to fairly represent the country.

ICILS 2023 and 2018 data for the United States are available only at the national level. States that want to obtain state-level data can sign up to participate in future ICILS assessments at their own cost. Sample size requirements apply. Please contact NCES for more information.

The most recent ICILS administration was in 2023, with results released in November 2024. The next administration of ICILS is scheduled for 2028.
