Teaching critical digital literacy

Critical digital literacy is a set of skills, competencies, and analytical viewpoints that allows a person to use, understand, and create digital media and tools. Related to information literacy skills such as numeracy, listening, speaking, reading, writing and critical thinking, the goal of critical digital literacy is to develop active and engaged thinkers and creators in digital environments. Digital literacy is more than technological understanding or computer skills and involves a range of reflective, ethical, and social perspectives on digital activities.

See also studies of "Critical Media Literacy."

This article is based on work done by Robin Davis as part of the Folger Institute’s Early Modern Digital Agendas (2013) institute. We welcome the addition of resources and readings, particularly those focused on teaching digital literacy from an early modernist perspective and critically analyzing the digital tools related to early modern studies.

  • 1 Multi-literacies of digital environments
  • 2 Teaching critical early modern digital literacy
  • 3 Pedagogical settings for digital literacy
  • 4 Teaching resources and exercises
      • 4.2 Exercises
      • 4.3 Information about tools and infrastructure
  • 5 Further reading
      • 5.1 Introductions
      • 5.2 Short reads
      • 5.3 Long reads

Multi-literacies of digital environments

Juliet Hinrichsen and Antony Coombs at the University of Greenwich propose a "5 Resources Model" for articulating the scope and dimensions of digital literacies. The five resources are:

  • Decoding : Learners need to develop familiarity with the structures and conventions of digital media, sensitivity to the different modes at work within digital artifacts and confident use of the operational frameworks within which they exist. These skills focus on understanding the navigational mechanisms and movement of the digital landscape (buttons, scrolling, windows, bars); understanding the norms and practices of digital environments (safety and online behavior, community norms, privacy and sharing); understanding common operations (e.g. saving, upload and download, organizing files); recognizing and evaluating stylistics (e.g. the social codes embedded in color, font, transition, and layout choices); and recognizing that different modes of digital texts (twitter streams, video, immersive games) have different characteristics and conventions.
  • Meaning Making : This aspect maps onto other well-recognized literacy concepts in recognizing the agency of the learner as a participant in constructing meaning. This dimension of digital literacy focuses on reading, the fluent assimilation of digital content and the ability to follow a narrative across diverse semantic, visual and structural elements; relating, or recognizing relationships between new and existing knowledge and adapting mental models; and expressing, or the capacity to translate a purpose, intention, feeling or idea into a digital form.
  • Using : Learners need to develop the ability to deploy digital tools effectively and creatively. This includes finding tools and evaluating different tool options, applying tools and techniques effectively, employing problem solving, and having the confidence to explore, experiment, and innovate to create solutions with imaginative approaches, techniques, or content.
  • Analyzing : Learners need to develop the ability to make informed judgements and choices in the digital domain. This includes deconstructing digital resources to analyze their constituent parts, selecting resources and tools, and interrogating the provenance, purpose, and structures that affect digital content and influence its output.
  • Persona : Sensitivity to the issues of reputation, identity and membership within different digital contexts. In identity building, learners develop a sense of their own role in digital environments; reputation management emphasizes the importance of building and maintaining both individual and community reputations as assets; while participation focuses on the nature of the collaborative (both synchronous and asynchronous) contributions that make up many digital projects and their ethical and cultural challenges.

Teaching critical early modern digital literacy

Students encounter early modern texts, images, and objects through an increasing array of digital tools and sources. Whether reading an edited version of Macbeth via the Folger Digital Texts, critiquing a digital facsimile of a pamphlet on Early English Books Online, or navigating the Agas Map at the Map of Early Modern London, students can use digital texts and tools for interactive and expansive exploration of early modern topics. In encouraging students to engage with early modern literature and history through these tools, we also need to provide them with the skills to analyze each tool, its intended audiences, and its affordances and limitations. By teaching students to assess digital editions and tools critically, we prepare them not only to select the best tools for their purposes, but also to further develop digital texts, tools, and resources. Critical digital literacy is the first step towards digital authorship.

Pedagogical settings for digital literacy

Critical digital literacy, writ broadly, is taught in a variety of settings, including humanities and information courses; public, academic, and special collections libraries; and K-12 classrooms and after-school programs. The articles listed below present a variety of perspectives on the benefits of critical digital literacy and point to some of the fields in which discussions of critical digital literacy are taking place.

  • Graduate education; Rhetoric and Composition
  • Undergraduate education; English department
  • High school and postsecondary education; English, art, preservice teacher education
  • High school; academic standards and student attitudes
  • Undergraduate education; academic libraries

Teaching resources and exercises

Many digital literacy exercises are designed to make the affordances of specific tools visible. Many students treat digital humanities tools as "black boxes": opaque systems in which a user provides input and receives a product, but does not have information about what happens in between. The goal of many of these exercises is to make the box a little less opaque and open up digital humanities tools, programs, and databases to criticism. We want students to question the assumptions and choices made in the selection, organization, and presentation of content and understand what their tools are doing for them.

DH Box (http://dhbox.org/) streamlines installation processes and provides a digital humanities laboratory in the cloud through simple sign-in via a web browser.

  • How to see through the cloud : using traceroute to walk through the physical network of the Internet
  • Governing Algorithms: Provocation piece : short essay split into 38 provocations concerning algorithms, policy, and practice. #21 is a great discussion starter, courtesy of Solon Barocas, Sophie Hood, and Malte Ziewitz.
  • Teaching with Lingfish : Using the Early Modern Recipe Archive, Luna, the OED, and EEBO to examine how different repositories deal with fish, courtesy of Nancy Simpson-Younger.

Information about tools and infrastructure

  • The Basement : what an internet hub for a major American city looks like, courtesy of Cabel Sasser.

Further reading

Introductions

  • Folgerpedia's Glossary of digital humanities terms
  • A career-focused and skills-focused list of resources by the Department of Commerce. This page allows practitioners in service-oriented organizations—such as libraries, schools, community centers, community colleges, and workforce training centers—to find digital literacy content.
  • This section looks at the various aspects and principles relating to digital literacy and the many skills and competencies that fall under the digital literacy umbrella. The relationship between digital literacy and digital citizenship is also explored and tips are provided for teaching these skills in the classroom.
  • Kellner, Douglas and Jeff Share, "Toward Critical Media Literacy: Core concepts, debates, organizations, and policy," Discourse: Studies in the Cultural Politics of Education 26, no. 3 (September 2005): 369–86.

Short reads

  • Practicing Freedom in the Digital Library by Barbara Fister in Library Journal (2013)
  • Beyond Citation student project at the CUNY Graduate Center (2014)
  • Digital Humanities Pedagogy: Practices, Principles, and Politics edited by Brett D. Hirsch (2012)
  • "Hacking the Classroom: Eight Perspectives" Eds. Mary Hocks and Jentry Sayers. Computers and Composition Online (Spring 2014)
  • Suggested readings from CRTCLDGTL, a reading group at Northwestern
  • Never Neutral: Critical Approaches to Digital Tools & Culture in the Humanities, essay by Josh Honn (2013)


Springer Nature - PMC COVID-19 Collection

A systematic review on digital literacy

Hasan Tinmaz

1 AI & Big Data Department, Endicott College of International Studies, Woosong University, Daejeon, South Korea

Yoo-Taek Lee

2 Endicott College of International Studies, Woosong University, Daejeon, South Korea

Mina Fanea-Ivanovici

3 Department of Economics and Economic Policies, Bucharest University of Economic Studies, Bucharest, Romania

Hasnan Baber

4 Abu Dhabi School of Management, Abu Dhabi, United Arab Emirates

Associated Data

The authors present the articles used for the study in “Appendix A”.

Abstract

The purpose of this study is to discover the main themes and categories of research on digital literacy. To serve this purpose, the databases of WoS/Clarivate Analytics, Proquest Central, Emerald Management Journals, Jstor Business College Collections and Scopus/Elsevier were searched with four keyword combinations, and a final set of forty-three articles was included in the dataset. The researchers applied a systematic literature review method to the dataset. The preliminary findings demonstrate a growing prevalence of digital literacy articles starting from the year 2013. The dominant research methodology of the reviewed articles is qualitative. The four major themes revealed by the qualitative content analysis are: digital literacy, digital competencies, digital skills and digital thinking. Under each theme, the categories and their frequencies are analysed. Recommendations for further research and for real-life implementation are provided.

Introduction

The extant literature on digital literacy, skills and competencies is rich in definitions and classifications, but there is still no consensus on the larger themes and the categories subsumed under them (Heitin, 2016). To exemplify, existing inventories of Internet skills suffer from ‘incompleteness and over-simplification, conceptual ambiguity’ (van Deursen et al., 2015), and Internet skills are only a part of digital skills. While there is already a plethora of research in this field, this paper aims to provide a general framework of digital areas and themes that can best describe digital (cap)abilities in the novel context of Industry 4.0 and the accelerated, pandemic-triggered digitalisation. These areas and themes can represent the starting point for drafting a contemporary digital literacy framework.

Sousa and Rocha (2019) explained that digital skills are at stake for disruptive digital business, connecting them to the latest developments such as the Internet of Things (IoT), cloud technology, big data, artificial intelligence, and robotics. The topic is all the more important given the large disparities in digital literacy across regions (Tinmaz et al., 2022). More precisely, digital inequalities encompass skills, along with access, usage and self-perceptions. These inequalities need to be addressed, as they are credited with a ‘potential to shape life chances in multiple ways’ (Robinson et al., 2015), e.g., academic performance, labour market competitiveness, health, and civic and political participation. Steps have been successfully taken to address physical access gaps, but skills gaps are still looming (Van Deursen & Van Dijk, 2010a). Moreover, digital inequalities have grown larger due to the COVID-19 pandemic, influencing the very health of the most vulnerable categories of the population and their employability at a time when digital skills are required (Baber et al., 2022; Beaunoyer, Dupéré & Guitton, 2020).

The systematic review the researchers propose is a useful, updated instrument of classification and inventory for digital literacy. Considering the latest developments in the economy and current digitalisation needs, such an inventory may assist policymakers in various fields (e.g., education, administration, the healthcare system) as well as managers of companies and other organisations that need to stay competitive and employ a competitive workforce. It is therefore vital to comprehend the big picture of digital literacy related research.

Literature review

Since the advent of digital literacy, scholars have been concerned with identifying and classifying the various (cap)abilities related to it. Drawing on the most cited academic papers in this stream of research, several classifications of digital-related literacies, competencies, and skills have emerged.

Digital literacies

Digital literacy, which is one of the challenges of integrating technology into academic courses (Blau, Shamir-Inbal & Avdiel, 2020), has been defined in the current literature as the competencies and skills required for navigating a fragmented and complex information ecosystem (Eshet, 2004). A ‘Digital Literacy Framework’ was designed by Eshet-Alkalai (2012), comprising six categories: (a) photo-visual thinking (understanding and using visual information); (b) real-time thinking (simultaneously processing a variety of stimuli); (c) information thinking (evaluating and combining information from multiple digital sources); (d) branching thinking (navigating in non-linear hyper-media environments); (e) reproduction thinking (creating outcomes using technological tools by designing new content or remixing existing digital content); (f) social-emotional thinking (understanding and applying cyberspace rules). According to Heitin (2016), digital literacy groups the following clusters: (a) finding and consuming digital content; (b) creating digital content; (c) communicating or sharing digital content. Hence, the literature describes digital literacy in many ways by associating a set of various technical and non-technical elements.

Digital competencies

The Digital Competence Framework for Citizens (DigComp 2.1), the most recent framework proposed by the European Union, which is currently under review and undergoing an updating process, contains five competency areas: (a) information and data literacy, (b) communication and collaboration, (c) digital content creation, (d) safety, and (e) problem solving (Carretero, Vuorikari & Punie, 2017). Digital competency had previously been described in a technical fashion by Ferrari (2012) as a set comprising information skills, communication skills, content creation skills, safety skills, and problem-solving skills, a set that later shaped the areas of competence in DigComp 2.1, too.

Digital skills

Ng (2012) pointed out the following three categories of digital skills: (a) technological (using technological tools); (b) cognitive (thinking critically when managing information); (c) social (communicating and socialising). A set of Internet skills was suggested by Van Deursen and Van Dijk (2009, 2010b), which contains: (a) operational skills (basic skills in using internet technology); (b) formal Internet skills (navigation and orientation skills); (c) information Internet skills (fulfilling information needs); and (d) strategic Internet skills (using the internet to reach goals). In 2014, the same authors added communication and content creation skills to the initial framework (van Dijk & van Deursen). Similarly, Helsper and Eynon (2013) put forward a set of four digital skills: technical, social, critical, and creative skills. Furthermore, van Deursen et al. (2015) built a set of items and factors to measure Internet skills: operational, information navigation, social, creative, mobile. More recent literature (van Laar et al., 2017) divides digital skills into seven core categories: technical, information management, communication, collaboration, creativity, critical thinking, and problem solving.

It is worth mentioning that the various methodologies used to classify digital literacy are overlapping or non-exhaustive, which confirms the conceptual ambiguity mentioned by van Deursen et al. (2015).

Digital thinking

Thinking skills (along with digital skills) have been acknowledged to be a significant element of digital literacy in the context of the educational process (Ferrari, 2012). In fact, critical thinking, creativity, and innovation are at the very core of DigComp. Information and Communication Technology as a support for thinking is a learning objective in any school curriculum. In the same vein, analytical thinking and interdisciplinary thinking, which help solve problems, are yet other concerns of educators in Industry 4.0 (Ozkan-Ozen & Kazancoglu, 2021).

However, we have recently witnessed a shift of focus from learning how to use information and communication technologies to using them while staying safe in the cyber-environment and being aware of alternative facts. Digital thinking would encompass identifying fake news, misinformation, and echo chambers (Sulzer, 2018). Not least important, concern about cybersecurity has grown, especially in times of political, social or economic turmoil, such as elections or the COVID-19 crisis (Sulzer, 2018; Puig, Blanco-Anaya & Perez-Maceira, 2021).

Ultimately, this systematic review paper addresses the following major research questions:

  • Research question 1: What is the yearly distribution of digital literacy related papers?
  • Research question 2: What are the research methods for digital literacy related papers?
  • Research question 3: What are the main themes in digital literacy related papers?
  • Research question 4: What are the concentrated categories (under revealed main themes) in digital literacy related papers?

Method

This study employed the systematic review method, whereby the authors scrutinized the existing literature around the major research question of digital literacy. As Uman (2011) pointed out, in a systematic literature review the findings of earlier research are examined for the identification of consistent and repetitive themes. The systematic review method differs from a traditional literature review in its well-managed and highly organized qualitative scrutiny process; in a traditional literature review, researchers tend to cover less material from a smaller number of databases (Kowalczyk & Truluck, 2013; Robinson & Lowe, 2015).

Data collection

To address the major research objectives, the following five databases were selected for their dominance in digital literacy focused research: 1. WoS/Clarivate Analytics; 2. Proquest Central; 3. Emerald Management Journals; 4. Jstor Business College Collections; 5. Scopus/Elsevier.

The search was made in the second half of June 2021, in abstracts and keywords written in English. We only kept research articles and book chapters (herein referred to as papers). Our purpose was to identify a set of digital literacy areas, or an inventory of such areas and topics. To serve that purpose, a systematic review was conducted with the following synonymous keywords: ‘digital literacy’, ‘digital skills’, ‘digital competence’ and ‘digital fluency’, to find the mainstream literature dealing with the topic. These keywords emerged from consultation with subject matter experts (two board members of the Korean Digital Literacy Association and two professors from a technology studies department). Below are the four keyword combinations used in the search: “Digital literacy AND systematic review”, “Digital skills AND systematic review”, “Digital competence AND systematic review”, and “Digital fluency AND systematic review”.
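The composition of the four search strings above can be sketched in a few lines; this is purely illustrative of how the keyword combinations are formed, and does not reflect any particular database's query syntax.

```python
# Build the four keyword combinations used in the search, as listed in the text.
# The synonym list comes from the paper; the plain "AND" format is an
# assumption, since each database provider has its own query syntax.
keywords = ["Digital literacy", "Digital skills", "Digital competence", "Digital fluency"]
queries = [f"{kw} AND systematic review" for kw in keywords]
for q in queries:
    print(q)
```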

A sequential systematic search was made in the five databases mentioned above. From one database to the next, duplicate papers were manually excluded in a cascade so that only unique results were retained, making the review smoother to conduct. At this stage, we kept 47 papers. Further exclusion criteria were then applied: only full-text items written in English were selected, which excluded three papers with no full text available and one paper written in Spanish rather than English. We therefore investigated a total of 43 papers, as shown in Table 1. “Appendix A” lists these papers with full references.

Number of papers identified sequentially after applying all inclusion and exclusion criteria

Each column gives the number of papers found for one keyword combination (“<keyword> AND systematic review”).

Database                                Digital literacy   Digital skills   Digital competence   Digital fluency   Total
1. WoS/Clarivate Analytics                     4                 3                 5                   –            12
2. Proquest Central                            7                 4                 –                   1            12
3. Emerald Management Journals                 3                 1                 1                   –             5
4. Jstor Business College Collections          9                 –                 –                   1            10
5. Scopus/Elsevier                             4                 –                 –                   –             4
Total per keyword combination                 27                 8                 6                   2            43
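The cascading de-duplication described in the data collection step can be expressed as a minimal sketch; the function name and the placeholder paper titles are illustrative, not taken from the study's dataset.

```python
# Sketch of the cascading manual de-duplication: results are merged database
# by database in search order, keeping only papers not already seen.
def cascade_dedupe(result_sets):
    seen, unique = set(), []
    for results in result_sets:      # one list of hits per database, in order
        for paper in results:
            if paper not in seen:
                seen.add(paper)
                unique.append(paper)
    return unique

# Placeholder hit lists standing in for the five databases' results.
hits = [["Paper A", "Paper B"], ["Paper B", "Paper C"], ["Paper C", "Paper D"]]
print(cascade_dedupe(hits))  # ['Paper A', 'Paper B', 'Paper C', 'Paper D']
```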

Data analysis

The 43 papers selected after applying the inclusion and exclusion criteria were reviewed independently by two researchers from two different countries. The researchers identified all topics pertaining to digital literacy as they appeared in the papers. Next, a third researcher independently analysed these findings and excluded duplicates. A qualitative content analysis was performed manually by calculating the frequency of major themes across all papers, comparing and contrasting the raw data (Fraenkel et al., 2012). All three reviewers independently listed the words and the contexts in which they appeared, and then collectively decided how each should be categorized. Lastly, it is worth noting that the literature review of this article was written after the themes had been identified through the qualitative analyses; the authors therefore shaped the structure of the literature review around these themes.
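The manual frequency count at the heart of the content analysis amounts to tallying category occurrences across the coded papers. A minimal sketch follows; the coded categories below are invented for illustration and are not the study's raw data.

```python
from collections import Counter

# Each inner list stands for one paper and the digital literacy categories
# the reviewers coded in it (illustrative placeholders only).
coded_papers = [
    ["digital literacy", "computer literacy"],
    ["digital literacy", "media literacy"],
    ["media literacy", "digital competence"],
]

# Tally how often each category occurs across all papers.
frequencies = Counter(cat for paper in coded_papers for cat in paper)
print(frequencies.most_common(2))
```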

Findings

As an answer to the first research question (the yearly distribution of digital literacy related papers), Fig. 1 shows the yearly distribution of the reviewed papers. There is a clear increasing trend in digital literacy publications.

[Figure 1: Yearly distribution of digital literacy related papers]

Research question two (the research methods for digital literacy related papers) concentrates on the research methods employed in these papers. As Fig. 2 shows, most of the papers used a qualitative method (‘not stated’ refers to book chapters).

[Figure 2: Research methods used in the reviewed articles]

When the forty-three articles were analysed for the main themes, as in research question three (the main themes in digital literacy related papers), the overall findings clustered around four major themes: (i) literacies, (ii) competencies, (iii) skills, and (iv) thinking. Under every major theme, the categories were listed and explained, as required by research question four (the concentrated categories under the revealed main themes).

The authors used overt categorization to assign findings to these major themes. For example, when ‘creativity’ was labelled as a skill, the authors categorized it under the ‘skills’ theme; when ‘creativity’ was mentioned as a competency, they listed it under the ‘competencies’ theme. It is therefore possible to encounter the same finding under different major themes.
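This overt categorization can be sketched as a simple grouping by the label each source attaches to a finding; the labelled findings below are illustrative examples, not the study's coded data.

```python
from collections import defaultdict

# (finding, theme-as-labelled-in-the-source) pairs; because categorization
# follows the source's own label, the same finding ('creativity' here) can
# legitimately land under more than one theme.
labelled_findings = [
    ("creativity", "skills"),
    ("creativity", "competencies"),
    ("problem solving", "competencies"),
]

themes = defaultdict(list)
for finding, theme in labelled_findings:
    themes[theme].append(finding)

print(dict(themes))
```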

Major theme 1: literacies

Digital literacy, the major concern of this paper, was explicitly mentioned in five papers out of forty-three. One of these articles described digital literacy as the human proficiencies needed to live, learn and work in the current digital society. In addition to these five articles, two further papers used the term ‘critical digital literacy’, describing it as a person’s or a society’s level of access to and assessment of digital technologies in order to utilize and/or create information. Table 2 summarizes the major categories under the ‘literacies’ major theme.

Categories (more than one occurrence) under 'literacies' major theme

Category                   n     Category                n     Category             n
Digital literacy           5     Disciplinary literacy   4     Web literacy         2
Critical digital literacy  2     Data literacy           3     New literacy         2
Computer literacy          5     Technology literacy     3     Mobile literacy      2
Media literacy             5     Multiliteracy           3     Personal literacy    2
Cultural literacy          5     Internet literacy       2     Research literacy    2

Computer literacy, media literacy and cultural literacy were also each mentioned in five papers (n = 5). One of the articles branches computer literacy into tool literacy (software and hardware use) and resource literacy (the information-processing capacity of a computer). Cultural literacy was emphasized as a vital element for functioning in an intercultural team on a digital project.

Disciplinary literacy (n = 4) referred to utilizing different computer programs (n = 2) or technical gadgets (n = 2), with a specific emphasis on the cognitive, affective and psychomotor skills required to work in any digital context (n = 3), serving the use (n = 2) and the creation and application (n = 2) of digital literacy in real life.

Data literacy, technology literacy and multiliteracy were the third most frequent categories (n = 3). ‘Multiliteracy’ referred to the innate nature of digital technologies, which have been infused into many aspects of human lives.

Last but not least, Internet literacy, mobile literacy, web literacy, new literacy, personal literacy and research literacy (n = 2 each) were also discussed. Web literacy focused on being able to connect with people on the web (n = 2), discover web content (especially navigation on hyper-textual platforms), and learn web-related skills through practical web experiences. Personal literacy highlighted digital identity management. Research literacy concentrated not only on the ability to conduct scientific research but also on finding available scholarship online.

Twenty-four other categories emerged from the results sections of the forty-three articles. Table 3 presents these other literacies, sorted in ascending alphabetical order without any other criterion. Primarily, search, tagging, filtering and attention literacies mainly underline roles in information processing. Furthermore, social-structural literacy was described as the recognition of the social circumstances of information generation. Another information-related literacy was publishing literacy, the ability to disseminate information via different digital channels.

Other mentioned categories (n = 1)

Advanced digital assessment literacy      Intermediate digital assessment literacy   Search literacy
Attention literacy                        Library literacy                           Social media literacy
Basic digital assessment literacy         Metaliteracy                               Social-structural literacy
Conventional print literacy               Multimodal literacy                        Tagging literacy
Critical literacy                         Network literacy                           Television literacy
Emerging technology literacy              News literacy                              Transcultural digital literacy
Film literacy                             Participatory literacy                     Transliteracy
Filtering literacy                        Publishing literacy

While personal literacy, listed above, referred to digital identity management, network literacy was explained as one’s ability to manage digital relationships with other people through social networking. Additionally, participatory literacy was defined as the abilities necessary to join an online team producing online content.

Emerging technology literacy was stipulated as the essential ability to recognize and appreciate the most recent and innovative technologies, along with making smart choices about them. Additionally, critical literacy was added as the ability to make smart judgements on the costs and benefits of these recent technologies.

Last of all, basic, intermediate, and advanced digital assessment literacies were specified for educational institutions planning to integrate various digital tools for instructional assessment.

Major theme 2: competencies

The second major theme was competencies. The authors directly categorized under this theme all findings explicitly labelled with the word ‘competency’. Table 4 summarizes the entire category set for the competencies major theme.

Categories under 'competencies' major theme

Category                             n     Category                                n
Digital competence                   14    Cross-cultural competencies             1
Digital competence as a life skill   5     Digital teaching competence             1
Digital competence for work          3     Balancing digital usage                 1
Economic engagement                  3     Political engagement                    1
Digital competence for leisure       2     Complex system modelling competencies   1
Digital communication                2     Simulation competencies                 1
Intercultural competencies           2     Digital nativity                        1

The most common category was ‘digital competence’ (n = 14), which one article termed ‘generic digital competence’, referring to one’s creativity in multimedia development (video editing was emphasized). The following sub-categories were associated with this broad category:

  • Problem solving (n = 10)
  • Safety (n = 7)
  • Information processing (n = 5)
  • Content creation (n = 5)
  • Communication (n = 2)
  • Digital rights (n = 1)
  • Digital emotional intelligence (n = 1)
  • Digital teamwork (n = 1)
  • Big data utilization (n = 1)
  • Artificial Intelligence utilization (n = 1)
  • Virtual leadership (n = 1)
  • Self-disruption (keeping up with the pace of digitalization) (n = 1)

In addition to ‘digital competence’, five further articles specifically coined the term ‘digital competence as a life skill’. Deeper analysis revealed the following points: social competences (n = 4), communication in the mother tongue (n = 3) and in a foreign language (n = 2), entrepreneurship (n = 3), civic competence (n = 2), fundamental science (n = 1), technology (n = 1) and mathematics (n = 1) competences, learning to learn (n = 1) and self-initiative (n = 1).

Moreover, three articles linked competencies to workplace digital competencies, highlighting them as significant for employability (n = 3) and ‘economic engagement’ (n = 3). Digital competencies were also detailed for leisure (n = 2) and communication (n = 2). Furthermore, two articles framed digital competencies as intercultural competencies and one as a cross-cultural competency. Lastly, ‘digital nativity’ (n = 1) was clarified as one’s innate competency of feeling contented and satisfied with digital technologies.

Major theme 3: skills

The third major theme was ‘skills’, which clustered mainly around information literacy skills (n = 19) and information and communication technologies (ICT) skills (n = 18). Table 5 lists the categories with more than one occurrence.

Categories under 'skills' major theme

Category | n | Category | n
Information literacy skills | 19 | Decision making skills | 3
ICT skills | 18 | Social intelligence | 3
Communication skills | 9 | Digital learning | 2
Collaboration skills | 9 | Digital teaching | 2
Digital content creation skills | 4 | Digital fluency | 2
Ethics for digital environment | 4 | Digital awareness | 2
Research skills | 3 | Creativity | 2

Table 6 summarizes the sub-categories of the two most frequent categories of the ‘skills’ major theme. Information literacy skills concentrate on the steps of information processing: evaluating (n = 6), using (n = 4), finding (n = 3) and locating (n = 2) information. The importance of trial and error, lifelong learning, feeling a need for information, and so forth were also listed under this category. ICT skills, in turn, were grouped into cognitive and affective domains: technical skills in general, and the use of social media, coding, multimedia, chat or email in particular, fell within the cognitive domain, while attitudes, intentions, and beliefs towards ICT were mentioned as elements of the affective domain.

Sub-categories under ‘information literacy’ and ‘ICT’ skills

Sub-category for information literacy skills | n | Sub-category for ICT skills | n
Evaluating information | 6 | Technical skills | 4
Using obtained information | 4 | Attitude towards ICT | 4
Legal use of information | 3 | Use of social media | 3
Finding information | 3 | Intention to use ICT | 2
Locating information | 2 | Beliefs about the use of ICT | 1
Feeling the need for information | 1 | General knowledge of ICT | 1
Documenting information | 1 | Use of chat | 1
Life-long learning | 1 | Use of email | 1
Trial and error | 1 | Digital text skills | 1
Dealing with the excessiveness of information | 1 | Use of multimedia technologies | 1
— | — | Coding | 1
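Counts of the kind reported here (the ‘n = …’ values) come from tallying coded labels across the reviewed articles. As a minimal illustration only, assuming a flat list of hypothetical codes (the labels below are invented for the example, not the study’s raw data), frequencies can be tallied like this:

```python
from collections import Counter

# Hypothetical coded labels: each entry is one annotation extracted from
# an article (illustrative values only, not the review's actual data).
codes = [
    "evaluating information", "evaluating information", "technical skills",
    "using obtained information", "technical skills", "evaluating information",
    "finding information", "use of social media",
]

# Tally occurrences, mirroring the "n = ..." counts reported in the tables.
counts = Counter(codes)

# Sort by frequency (descending), then alphabetically for a stable order.
ranked = sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))

for label, n in ranked:
    print(f"{label} (n = {n})")
```

The same tallying step, applied per category and sub-category, produces tables in the shape of Tables 5 and 6.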

Communication skills (n = 9) were described as multi-dimensional, spanning different societies, cultures, and globalized contexts and requiring linguistic skills. Collaboration skills (n = 9) were also recurrently cited, with explicit emphasis on virtual platforms.

‘Ethics for digital environment’ encapsulated the ethical use of information (n = 4) and of different technologies (n = 2), knowing digital laws (n = 2) and responsibilities (n = 2) along with digital rights and obligations (n = 1), having digital awareness (n = 1), following digital etiquette (n = 1), and treating other people with respect (n = 1), including no cyber-bullying (n = 1) and not stealing from or harming other people (n = 1).

‘Digital fluency’ involved digital access (n = 2) through the use of different software and hardware (n = 2) on online platforms (n = 1), with communication tools (n = 1) or within programming environments (n = 1). Digital fluency also underlined keeping up with recent technological advancements (n = 1) and knowledge (n = 1), including a digital health and wellness dimension (n = 1).

‘Social intelligence’ related to understanding digital culture (n = 1), the concept of digital exclusion (n = 1) and the digital divide (n = 3). ‘Research skills’ were detailed as searching for academic information (n = 3) in databases such as Web of Science and Scopus (n = 2), as well as citing, summarizing, and quoting it (n = 2).

‘Digital teaching’ was described as a skill (n = 2) in Table 5, whereas it was also labelled a competence (n = 1) in Table 4. Similarly, while learning to learn (n = 1) was listed under competencies in Table 4, digital learning (n = 2, Table 5) and life-long learning (n = 1, Table 6) were stated as learning-related skills. Moreover, learning appeared in three further terms: learning readiness (n = 1), self-paced learning (n = 1) and learning flexibility (n = 1).

Table 7 shows the remaining categories under the ‘skills’ major theme. The list covers not only software such as GIS, text mining, mapping, or bibliometric analysis programs, but also conceptual skills such as readiness for the fourth industrial revolution and information management.

Categories (one-time occurrence) under 'skills' major theme

Category | Category | Category
Digital connectivity skill | Culture transformation | Text mining
Digital systems skill | Readiness to Industry 4.0 | GIS (geographic information system)
Re(design) skill | Internet of Things (IoT) | Bibliometric analysis
Digital readiness | Technology-human adaptation | Mapping
Digital commerce | Information management | —

Major theme 4: thinking

The last major theme identified comprised different types of ‘thinking’. As Table 8 shows, ‘critical thinking’ was the most frequent thinking category (n = 4). Apart from computational thinking, the other categories were not elaborated.

Categories under ‘thinking’ major theme

Category | n | Category | n
Critical thinking | 4 | System thinking | 1
Computational thinking | 3 | Interdisciplinary thinking | 1
Analytical thinking | 1 | Purposeful thinking | 1
Innovative thinking | 1 | Quick thinking | 1

Computational thinking (n = 3) was associated with the general logic of how a computer works and was sub-categorized into the following steps: construction of the problem (n = 3), abstraction (n = 1), decomposition of the problem (n = 2), data collection (n = 2), data analysis (n = 2), algorithmic design (n = 2), parallelization and iteration (n = 1), automation (n = 1), generalization (n = 1), and evaluation (n = 2).
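As an illustration only (the toy task and function names below are ours, not drawn from the reviewed articles), several of these steps can be made concrete on a small problem: finding the most frequent word in a short text.

```python
# Illustrative sketch of computational-thinking steps on a toy task:
# find the most frequent word in a short text.

def decompose(text):
    # Decomposition of the problem: reduce "find the most frequent word"
    # to tokenization plus counting.
    return text.lower().split()

def collect_and_analyse(tokens):
    # Data collection and analysis: count occurrences of each token.
    freq = {}
    for token in tokens:
        freq[token] = freq.get(token, 0) + 1
    return freq

def design_algorithm(freq):
    # Algorithmic design: select the token with the highest count.
    return max(freq, key=freq.get)

def evaluate(result, expected):
    # Evaluation: check the output against a known case.
    return result == expected

tokens = decompose("to be or not to be")
freq = collect_and_analyse(tokens)
top = design_algorithm(freq)
print(top)                  # -> to
print(evaluate(top, "to"))  # -> True
```

Generalization, the remaining step in the list, would correspond to noticing that the same decompose-count-select pipeline applies to any frequency problem, not just words.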

A transversal analysis of digital literacy categories reveals the following fields of digital literacy application:

  • Technological advancement (IT, ICT, Industry 4.0, IoT, text mining, GIS, bibliometric analysis, mapping data, technology, AI, big data)
  • Networking (Internet, web, connectivity, network, safety)
  • Information (media, news, communication)
  • Creative-cultural industries (culture, publishing, film, TV, leisure, content creation)
  • Academia (research, documentation, library)
  • Citizenship (participation, society, social intelligence, awareness, politics, rights, legal use, ethics)
  • Education (life skills, problem solving, teaching, learning, education, lifelong learning)
  • Professional life (work, teamwork, collaboration, economy, commerce, leadership, decision making)
  • Personal level (critical thinking, evaluation, analytical thinking, innovative thinking)
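The transversal mapping above is essentially a classification of keywords into application fields. A minimal sketch, assuming a hand-built lookup table and an ‘Unclassified’ fallback of our own choosing (the helper and its behaviour are illustrative, not part of the study):

```python
# Map a keyword to one of the application fields listed above.
# The table is a small hand-built excerpt for illustration only.
FIELDS = {
    "iot": "Technological advancement",
    "gis": "Technological advancement",
    "safety": "Networking",
    "news": "Information",
    "publishing": "Creative-cultural industries",
    "library": "Academia",
    "ethics": "Citizenship",
    "lifelong learning": "Education",
    "teamwork": "Professional life",
    "critical thinking": "Personal level",
}

def field_of(keyword):
    # Unknown keywords fall back to "Unclassified" (our own convention).
    return FIELDS.get(keyword.lower(), "Unclassified")

print(field_of("GIS"))         # -> Technological advancement
print(field_of("Ethics"))      # -> Citizenship
print(field_of("blockchain"))  # -> Unclassified
```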

This systematic review of digital literacy concentrated on forty-three articles from the databases WoS/Clarivate Analytics, ProQuest Central, Emerald Management Journals, JSTOR Business College Collections and Scopus/Elsevier. The initial results revealed an increasing trend in digital-literacy-focused academic papers. Research on digital literacy is critical in a context of disruptive digital business and, more recently, of pandemic-triggered accelerated digitalisation (Beaunoyer, Dupéré & Guitton, 2020; Sousa & Rocha, 2019). Most of these papers employed qualitative research methods. The raw data of these articles were analysed qualitatively, using a systematic literature review, to reveal major themes and categories. Four major themes emerged: digital literacy, digital competencies, digital skills and thinking.

Whereas the mainstream literature describes digital literacy as a set of photo-visual, real-time, information, branching, reproduction and social-emotional thinking skills (Eshet-Alkalai, 2012) or as a set of precise operations, i.e., finding, consuming, creating, communicating and sharing digital content (Heitin, 2016), this study reveals that digital literacy revolves around, and is connected with, the concepts of computer literacy, media literacy, cultural literacy and disciplinary literacy. In other words, the present systematic review indicates that digital literacy is far broader than specific tasks, encompassing the entire sphere of computer operation and media use in a cultural context.

The digital competence yardstick, DigComp (Carretero, Vuorikari & Punie, 2017 ) suggests that the main digital competencies cover information and data literacy, communication and collaboration, digital content creation, safety, and problem solving. Similarly, the findings of this research place digital competencies in relation to problem solving, safety, information processing, content creation and communication. Therefore, the findings of the systematic literature review are, to a large extent, in line with the existing framework used in the European Union.

The investigation of the main keywords associated with digital skills revealed that information literacy, ICT, communication, collaboration, digital content creation, research and decision-making skills are the most representative. The existing literature groups these skills into technological, cognitive, and social categories (Ng, 2012) or, more extensively, into operational, formal, information, strategic, communication and content creation skills (van Dijk & van Deursen, 2014). Over time, the literature has grown richer in frameworks, and prolific authors have refined their results. More recent research (van Laar et al., 2017) uses the following categories: technical, information management, communication, collaboration, creativity, critical thinking, and problem solving.

Whereas digital thinking was observed here to be mostly related to critical thinking and computational thinking, DigComp connects it with critical thinking, creativity, and innovation, while researchers highlight fake news, misinformation, cybersecurity, and echo chambers as focal issues for digital thinking (Sulzer, 2018; Puig, Blanco-Anaya & Perez-Maceira, 2021).

This systematic review offers an initial step and guideline for the development of a more contemporary digital literacy framework, including its major themes and factors. The researchers provide the following recommendations for both researchers and practitioners.

Recommendations for prospective research

Given the predominance of qualitative research, more quantitative studies are needed. Although they require more effort and time, mixed-method studies would help in understanding digital literacy holistically.

As digital literacy is an umbrella term for many different technologies, specific case studies need to be designed, such as digital literacy for artificial intelligence or for drone usage.

Digital literacy affects different areas of human lives, such as education, business, health, governance, and so forth. Therefore, different case studies could be carried out for each of these unique dimensions of our lives. For instance, it is worth investigating the role of digital literacy on lifelong learning in particular, and on education in general, as well as the digital upskilling effects on the labour market flexibility.

Further experimental studies on digital literacy are necessary to understand how certain variables (for instance, age, gender, socioeconomic status, or cognitive abilities) affect this concept overtly or covertly. Moreover, the digital divide needs to be analysed through the lens of its main determinants.

New bibliometric analysis methods can be applied to digital literacy documents to reveal how these works are related and which major topics they centre on. Such a visual approach would help convey the big picture of the digital literacy framework.
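One common way to implement such a bibliometric mapping is a keyword co-occurrence network. The sketch below uses invented keyword lists (not data from this review) and counts how often keyword pairs appear in the same paper; the resulting pair weights are the edges of the network:

```python
from itertools import combinations
from collections import Counter

# Hypothetical author-keyword lists for a handful of papers
# (illustrative values only, not data from this review).
papers = [
    ["digital literacy", "digital divide", "covid-19"],
    ["digital literacy", "digital skills"],
    ["digital skills", "digital divide"],
    ["digital literacy", "digital divide"],
]

# Count how often each pair of keywords appears in the same paper; these
# pair weights form the edges of a keyword co-occurrence network.
edges = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        edges[(a, b)] += 1

for (a, b), w in edges.most_common(3):
    print(f"{a} -- {b}: {w}")
```

In practice, the edge list would be fed to a network-visualization tool to draw the topic map; the counting step itself is this simple.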

Recommendations for practitioners

Digital literacy stakeholders, such as policymakers in education and managers in private organizations, need to be aware that implementing digital literacy involves many dimensions and variables. Stakeholders must therefore understand their beneficiaries or participants more deeply to increase the effect of digital-literacy-related activities. For example, critical thinking and problem-solving abilities are reported to affect digital literacy; hence, stakeholders first have to establish whether participants possess sufficient entry-level critical thinking and problem-solving skills.

Developing digital literacy for different groups of people requires more effort, since each group might require a different set of skills, abilities, or competencies. Hence, different subject matter experts, such as technologists, instructional designers, and content experts, should join the team.

It is vital to develop different digital frameworks for different technologies (basic or advanced) and different contexts (levels of schooling, various industries).

These frameworks should be updated regularly, as digital fields evolve rapidly. Committees should convene every year to review new technological trends and decide whether to incorporate the changes into their frameworks.

Understanding digital literacy thoroughly can enable decision makers to implement policies that address the digital divide, which is reflected in various aspects of life, e.g., health, employment, and education, especially in turbulent times such as the COVID-19 pandemic.

Lastly, it is also essential to state the study limitations. This study is limited to the analysis of a certain number of papers, obtained from using the selected keywords and databases. Therefore, an extension can be made by adding other keywords and searching other databases.

See Table 9.

List of papers (n = 43) included in the qualitative analysis—ordered alphabetically by title

# | Author and year | Title | Journal/Book
1 | Sulzer, M. A. (2018) | (Re)conceptualizing digital literacies before and after the election of Trump | English Teaching: Practice and Critique
2 | Gunduzalp, S. (2021) | 21st Century Skills for Sustainable Education: Prediction Level of Teachers’ Information Literacy Skills on Their Digital Literacy Skills | Discourse and Communication for Sustainable Education
3 | Palts, T., Pedaste, M. (2020) | A Model for Developing Computational Thinking Skills | Informatics in Education
4 | Starkey, L. (2020) | A systematic review of research exploring teacher preparation for the digital age | Cambridge Journal of Education
5 | Ozkan-Ozen, Y. D., Kazancoglu, Y. (2021) | Analysing workforce development challenges in the Industry 4.0 | International Journal of Manpower
6 | Barna, C., Epure, M. (2020) | Analyzing youth unemployment and digital literacy skills in Romania in the context of the current digital transformation | Review of Applied Socio-Economic Research
7 | Reis, D. A., Fleury, A. L., Carvalho, M. M. (2021) | Consolidating core entrepreneurial competences: toward a meta-competence framework | International Journal of Entrepreneurial Behavior & Research
8 | van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M., de Haan, J. (2020) | Determinants of 21st-Century Skills and 21st-Century Digital Skills for Workers: A Systematic Literature Review | SAGE Open
9 | Kim, M., Choi, D. (2018) | Development of Youth Digital Citizenship Scale and Implication for Educational Setting | Journal of Educational Technology & Society
10 | Eyal, L. (2012) | Digital Assessment Literacy — the Core Role of the Teacher in a Digital Environment | Journal of Educational Technology & Society
11 | Spante, M., Hashemi, S. S., Lundin, M., Algers, A. (2018) | Digital competence and digital literacy in higher education research: Systematic review of concept use | Cogent Education
12 | Zhao, Y., Pinto Llorente, A. M., Cruz Sanchez Gomez, M. (2021) | Digital competence in higher education research: A systematic literature review | Computers & Education
13 | Batanero, J. M. F., Montenegro Rueda, M., Cerero, J. F., Garcia Martinez, I. (2020) | Digital competences for teacher professional development. Systematic review | European Journal of Teacher Education
14 | Murawski, M., Bick, M. (2017) | Digital competences of the workforce – a research topic? | Business Process Management Journal
15 | Gibson, P. F., Smith, S. (2018) | Digital literacies: preparing pupils and students for their information journey in the twenty-first century | Information and Learning Science
16 | Mcclurken, J., Boggs, J., Wadewitz, A., Geller, A. E., Beasley-Murray, J. (2013) | Digital Literacy and the Undergraduate Curriculum | Book: Hacking the Academy: New Approaches to Scholarship and Teaching from Digital Humanities. The University of Michigan Press
17 | Radovanovic, D., Holst, C., Belur, S. B., Srivastava, R., Houngbonon, G. V., Le Quentrec, E., Miliza, J., Winkler, A. S., Noll, J. (2020) | Digital Literacy Key Performance Indicators for Sustainable Development | Social Inclusion
18 | Soomro, M. A., Hizam-Hanafiah, M., Abdullah, N. L. (2020) | Digital readiness models: A systematic literature review | Compusoft, An International Journal of Advanced Computer Technology
19 | Martinez-Bravo, M. C., Sadaba-Chalezquer, C., Serrano-Puche, J. (2020) | Fifty years of digital literacy studies: A meta-research for interdisciplinary and conceptual convergence | Profesional de la informacion
20 | Kolle, S. R. (2017) | Global research on information literacy: a bibliometric analysis from 2005 to 2014 | The Electronic Library
21 | Dominguez Figaredo, D. (2017) | Heuristics and Web Skills Acquisition in Open Learning Environments | Educational Technology & Society
22 | Bawden, D. (2001) | Information and digital literacies: a review of concepts | Journal of Documentation
23 | Coklar, A. N., Yaman, N. D., Yurdakul, I. K. (2017) | Information literacy and digital nativity as determinants of online information search strategies | Computers in Human Behavior
24 | Fosmire, M. (2014) | Information literacy and lifelong learning | Book: Integrating Information into the Engineering Design Process. Purdue University Press
25 | Buschman, J. (2009) | Information Literacy, “New” Literacies, and Literacy | The Library Quarterly
26 | Reis, C., Pessoa, T., Gallego-Arrufat, M. J. (2019) | Literacy and digital competence in Higher Education: A systematic review | Revista de Docencia Universitaria
27 | Oh, S. S., Kim, K.-A., Kim, M., Oh, J., Chu, S. H., Choi, J. Y. (2021) | Measurement of Digital Literacy Among Older Adults: Systematic Review | Journal of Medical Internet Research
28 | Santandreu Calonge, D., Shah, M. A., Riggs, K., Connor, M. (2019) | MOOCs and upskilling in Australia: A qualitative literature study | Cogent Education
29 | Mahiri, J. (2011) | New literacies need new learning | Book: Digital Tools in Urban Schools: Mediating a Remix of Learning. The University of Michigan Press
30 | Hicks, T., Hawley Turner, K. (2013) | No Longer a Luxury: Digital Literacy Can't Wait | English Journal
31 | Khuraisah, M. N., Khalid, F., Husnin, H. (2020) | Preparing graduates with digital literacy skills toward fulfilling employability need in 4IR Era: A review | International Journal of Advanced Computer Science and Applications
32 | da Silva, C. R. S., Carvalho Teixeira, T. M., Bentes Pinto, V. (2019) | Research methodology in information literacy: A systematic review | Digital Journal of Library and Information Science
33 | Garcia-Perez, L., Garcia-Garnica, M., Olmedo-Moreno, E. M. (2021) | Skills for a Working Future: How to Bring about Professional Success from the Educational Setting | Education Sciences
34 | Stordy, P. (2015) | Taxonomy of literacies | Journal of Documentation
35 | Aesaert, K., Vanderlinde, R., Tondeur, J., van Braak, J. (2013) | The content of educational technology curricula: a cross-curricular state of the art | Educational Technology Research and Development
36 | de Greef, M., Segers, M., Nijhuis, J., Lam, J. F., van Groenestijn, M., van Hoek, F., van Deursen, A. J. A. M., Bohnenn, E., Tubbing, M. (2015) | The development and validation of testing materials for literacy, numeracy and digital skills in a Dutch context | International Review of Education
37 | Rodriguez-Garcia, A. M., Caceres Reche, M. P., Garcia, S. A. (2018) | The digital competence of the future teacher: bibliometric analysis of scientific productivity indexed in Scopus | International Journal of Educational Research and Innovation
38 | Sanchez-Caballe, A., Gisbert-Cervera, M., Esteve-Mon, F. (2020) | The digital competence of university students: a systematic literature review | Aloma
39 | Keshavarz, M. (2020) | The effect of distance education on information literacy case study: Iran | The Quarterly Review of Distance Education
40 | van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M., de Haan, J. (2017) | The relation between 21st-century skills and digital skills: A systematic literature review | Computers in Human Behavior
41 | Esteban-Navarro, M. A., Garcia-Madurga, M. A., Morte-Nadal, T., Nogales-Bocio, A. I. (2020) | The Rural Digital Divide in the Face of the COVID-19 Pandemic in Europe—Recommendations from a Scoping Review | Informatics
42 | Rifai, I., Setiadi, C. J., Renaldo, J., Andreani, W. (2021) | Toward society 5.0: Indonesia and Japan on the twenty-first century literacy skills | IOP Conf. Series: Earth and Environmental Science
43 | Kozanoglu, D. C., Abedin, B. (2020) | Understanding the role of employees in digital transformation: conceptualization of digital literacy of employees as a multi-dimensional organizational affordance | Journal of Enterprise Information Management

Author contributions

The authors worked together on the manuscript equally. All authors have read and approved the final manuscript.

This research is funded by Woosong University Academic Research in 2022.

Availability of data and materials

Declarations.

The authors declare that they have no competing interests.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Hasan Tinmaz, Email: htinmaz@endicott.ac.kr

Yoo-Taek Lee, Email: ytlee@wsu.ac.kr

Mina Fanea-Ivanovici, Email: [email protected] .

Hasnan Baber, Email: [email protected] .

  • Baber H, Fanea-Ivanovici M, Lee YT, Tinmaz H. A bibliometric analysis of digital literacy research and emerging themes pre-during COVID-19 pandemic. Information and Learning Sciences. 2022. doi: 10.1108/ILS-10-2021-0090.
  • Beaunoyer E, Dupéré S, Guitton MJ. COVID-19 and digital inequalities: Reciprocal impacts and mitigation strategies. Computers in Human Behavior. 2020;111:106424. doi: 10.1016/j.chb.2020.106424.
  • Blau I, Shamir-Inbal T, Avdiel O. How does the pedagogical design of a technology-enhanced collaborative academic course promote digital literacies, self-regulation, and perceived learning of students? The Internet and Higher Education. 2020;45:100722. doi: 10.1016/j.iheduc.2019.100722.
  • Carretero S, Vuorikari R, Punie Y. DigComp 2.1: The Digital Competence Framework for Citizens with eight proficiency levels and examples of use (No. JRC106281). Joint Research Centre; 2017. https://publications.jrc.ec.europa.eu/repository/handle/JRC106281
  • Eshet Y. Digital literacy: A conceptual framework for survival skills in the digital era. Journal of Educational Multimedia and Hypermedia. 2004;13(1):93–106. https://www.learntechlib.org/primary/p/4793/
  • Eshet-Alkalai Y. Thinking in the digital era: A revised model for digital literacy. Issues in Informing Science and Information Technology. 2012;9(2):267–276. doi: 10.28945/1621.
  • Ferrari A. Digital competence in practice: An analysis of frameworks. JRC IPTS, Sevilla; 2012. https://ifap.ru/library/book522.pdf
  • Fraenkel JR, Wallen NE, Hyun HH. How to design and evaluate research in education. 8th ed. McGraw Hill; 2012.
  • Heitin L. What is digital literacy? Education Week. 2016. https://www.edweek.org/teaching-learning/what-is-digital-literacy/2016/11
  • Helsper EJ, Eynon R. Distinct skill pathways to digital engagement. European Journal of Communication. 2013;28(6):696–713. doi: 10.1177/0267323113499113.
  • Kowalczyk N, Truluck C. Literature reviews and systematic reviews: What is the difference? Radiologic Technology. 2013;85(2):219–222.
  • Ng W. Can we teach digital natives digital literacy? Computers & Education. 2012;59(3):1065–1078. doi: 10.1016/j.compedu.2012.04.016.
  • Ozkan-Ozen YD, Kazancoglu Y. Analysing workforce development challenges in the Industry 4.0. International Journal of Manpower. 2021. doi: 10.1108/IJM-03-2021-0167.
  • Puig B, Blanco-Anaya P, Perez-Maceira JJ. “Fake News” or Real Science? Critical thinking to assess information on COVID-19. Frontiers in Education. 2021;6:646909. doi: 10.3389/feduc.2021.646909.
  • Robinson L, Cotten SR, Ono H, Quan-Haase A, Mesch G, Chen W, Schulz J, Hale TM, Stern MJ. Digital inequalities and why they matter. Information, Communication & Society. 2015;18(5):569–582. doi: 10.1080/1369118X.2015.1012532.
  • Robinson P, Lowe J. Literature reviews vs systematic reviews. Australian and New Zealand Journal of Public Health. 2015;39(2):103. doi: 10.1111/1753-6405.12393.
  • Sousa MJ, Rocha A. Skills for disruptive digital business. Journal of Business Research. 2019;94:257–263. doi: 10.1016/j.jbusres.2017.12.051.
  • Sulzer A. (Re)conceptualizing digital literacies before and after the election of Trump. English Teaching: Practice & Critique. 2018;17(2):58–71. doi: 10.1108/ETPC-06-2017-0098.
  • Tinmaz H, Fanea-Ivanovici M, Baber H. A snapshot of digital literacy. Library Hi Tech News. 2022 (ahead-of-print).
  • Uman LS. Systematic reviews and meta-analyses. Journal of the Canadian Academy of Child and Adolescent Psychiatry. 2011;20(1):57–59.
  • Van Deursen AJAM, Helsper EJ, Eynon R. Development and validation of the Internet Skills Scale (ISS). Information, Communication & Society. 2015;19(6):804–823. doi: 10.1080/1369118X.2015.1078834.
  • Van Deursen AJAM, van Dijk JAGM. Using the internet: Skills related problems in users’ online behaviour. Interacting with Computers. 2009;21:393–402. doi: 10.1016/j.intcom.2009.06.005.
  • Van Deursen AJAM, van Dijk JAGM. Measuring internet skills. International Journal of Human-Computer Interaction. 2010;26(10):891–916. doi: 10.1080/10447318.2010.496338.
  • Van Deursen AJAM, van Dijk JAGM. Internet skills and the digital divide. New Media & Society. 2010;13(6):893–911. doi: 10.1177/1461444810386774.
  • van Dijk JAGM, van Deursen AJAM. Digital skills, unlocking the information society. Palgrave Macmillan; 2014.
  • van Laar E, van Deursen AJAM, van Dijk JAGM, de Haan J. The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior. 2017;72:577–588. doi: 10.1016/j.chb.2017.03.010.


7 Digital literacies and the skills of the digital age

Cathy L. Green, Oklahoma State University

Oklahoma State University

Abstract – This chapter is intended to provide a framework and understanding of digital literacy: what it is and why it is important. The following pages explore the roots of digital literacy, its relationship to language literacy and its role in 21st-century life.

Introduction

Unlike previous generations, learning in the digital age is marked by the use of rapidly evolving technology, a deluge of information and a highly networked global community (Dede, 2010). In such a dynamic environment, learners need skills beyond the basic cognitive ability to consume and process language. In other words, to understand what the characteristics of the digital age and of digital learners mean for how people learn in this new and changing landscape, one may turn to the evolving discussion of literacy or, as one might now say, of digital literacy. The history of literacy contextualizes digital literacy and illustrates changes in literacy over time. By looking at literacy as a historical phenomenon, the characteristics of which have evolved over time, we can glean the fundamental characteristics of the digital age. Those characteristics in turn illuminate the skills needed to take advantage of digital environments. The following discussion is an overview of digital literacy, its essential components and why it is important for learning in a digital age.

Moving from Literacy to Digital Literacy

Literacy refers to the ability of people to read and write (UNESCO, 2017). Reading and writing, then, are about encoding and decoding information between written symbols and sounds (Resnick, 1983; Tyner, 1998). More specifically, literacy is the ability to understand the relationship between sounds and written words such that one may read, say and understand them (UNESCO, 2004; Vlieghe, 2015). Literacy is often considered a skill or competency, and children and adults alike can spend years developing the appropriate skills for encoding and decoding information.

Over the course of thousands of years, literacy has become much more common and widespread, with a global literacy rate ranging from 81% to 90% depending on age and gender (UNESCO, 2016). From a time when literacy was the domain of an elite few, it has grown to include huge swaths of the global population. There are a number of reasons for this, not the least of which are the advantages the written word can provide. Kaestle (1985) tells us that “literacy makes it possible to preserve information as a snapshot in time, allows for recording, tracking and remembering information, and sharing information more easily across distances among others” (p. 16). In short, literacy led “to the replacement of myth by history and the replacement of magic by skepticism and science. Writing allowed bureaucracy, accounting, and legal systems with universal rules and has replaced face-to-face governance with depersonalized administration” (Kaestle, 1985, p. 16). This is not to place a value judgement on the characteristics of literacy but rather to explain some of the many reasons why it spread.

There are, however, other reasons for the spread of literacy. In England throughout the Middle Ages, literacy grew in part because people who acquired literacy skills were able to parlay those skills into work with more pay and social advantages (Clanchy, 1983). The great revolutions of the 19th and 20th centuries also relied on leaders who could write and compatriots who could read as a way to spread new ideas beyond the street corners and public gatherings of Paris, Berlin, and Vienna. Literacy was perceived as necessary for spreading information to large numbers of people. In the 1970s, Paulo Freire insisted that literacy was vital for people to participate in their own governance and civic life (Tyner, 1998). His classic “Pedagogy of the Oppressed” begins from the premise that bringing the traditionally illiterate and uneducated into learning situations as partners with their teachers awakens the critical conscience necessary as a foundation for action to foment change (Freire, 1973). UNESCO (2004) also acknowledges the role that literacy plays in enabling populations to effect change and achieve social justice aims. It speaks even more broadly, moving beyond the conditions necessary for revolution, contending that literacy is a fundamental right of every human being that provides employment opportunities and the fundamental skills necessary to accrue greater wealth and improve one’s quality of life.

Although the benefits of literacy were a driving force in its spread, technological advances also enabled the spread of literacy to greater and greater numbers of people. From stamped tokens, tally sticks and clay tablets, to ancient scrolls, handwritten volumes, the printing press, typewriters, and finally computers, technology is largely responsible for driving the evolution of literacy into the particular forms of encoding and decoding information associated with the digital age. Technology has made it possible for literacy to move from the hands of the few to the hands of the masses and to morph into a digital environment with characteristics extending far beyond anything that has been seen before.

Not only did computers and electronic technology deliver literacy into the hands of many, but they also created an environment that made it possible to store vast amounts of information. Books and libraries led the way in making information easily available to the public, but in the age of computers and the internet the volume of accessible information is larger than ever, more readily available than ever, and changing more quickly than ever before. In the early 21st century, technology continues to develop more quickly than at any time in the past, creating an environment that is constantly changing. These changes contribute to the need for skills beyond traditional literacy, sometimes called new media literacy (Jenkins, 2018). For a short video on why digital literacy is important, see “The New Media Literacies” on YouTube, created by the research team at Project New Media Literacies.

Literacy in the Digital Age

If literacy involves the skills of reading and writing, digital literacy requires the ability to extend those skills in order to effectively take advantage of the digital world (ALA, 2013). More general definitions express digital literacy as the ability to read and understand information from digital sources as well as to create information in various digital formats (Bawden, 2008; Gilster, 1997; Tyner, 1998; UNESCO, 2004). Developing digital skills allows digital learners to manage a vast array of rapidly changing information and is key to both learning and working in an evolving digital landscape (Dede, 2010; Koltay, 2011; Mohammadyari & Singh, 2015). As such, it is important for people to develop certain competencies specifically for handling digital content.

People who adapt well to the digital world exhibit characteristics that enable them to develop and maintain digital literacy skills. Lifelong learning is a key characteristic necessary for handling rapid changes in technology and information and is thus critical to digital literacy. Successful digital learners have a high level of self-motivation, a desire for active modes of learning, and the ability to learn how to learn. Maintaining and learning new technical skills also benefits learners in the digital age, and an attitude of exploration and play will help learners stay engaged and energized in a world where the speed of change and volume of information could otherwise become overwhelming (Dede, 2010; Jenkins, 2018; Visser, 2012). A final characteristic of the digital learner is the ability to engage in a global network with a greater awareness of one’s place and audience in that network. Together, these characteristics guide us in understanding what traits a learner requires to be successful in the digital environment. The following section reviews existing digital literacy frameworks to explore what lies at the intersection of digital skills and the traits of successful digital learners.

Reviewing Existing Frameworks for Digital Literacy/ies

Digital literacy is alternately described as complicated, confusing, too broad to be meaningful, and always changing (Heitin, 2016; Pangrazio, 2014; Tyner, 1998; Williams, 2006). Due to this confusion, some feel it best to avoid the term digital literacy altogether and instead opt for terms such as digital competencies (Buckingham, 2006), 21st century skills (Williamson, 2011), or digital skills (Heitin, 2016). Another way to sort out the confusion is to look at digital literacy as multiple literacies (Buckingham, 2006; Lankshear & Knobel, 2008; UNESCO, 2004).

Here, I take the latter approach and look at digital literacy as a collection of literacies, each of which plays a significant role in learning in a digital world. Ng (2012) operationalizes digital literacy as a framework of multiple, specific competencies which, when combined, form a cohesive collection of skills. By taking this approach, we link the characteristics of the digital environment, as well as those of the digital learner, not to a single digital skill but rather to a set of digital literacy practices. In this way, we can consider the various skills needed to navigate the digital world in an organized and consistent manner.

Ng (2012) proposes a three-part schema for discussing the overlapping functional characteristics of a digitally competent person: technical, cognitive, and social (see Figure 1).

Figure 1. The technical, cognitive, and social dimensions of digital literacy (Ng, 2012).

Technical literacy, also referred to as operational literacy, refers to the mastery of the technical skills and tasks required to access and work with digital technology, such as how to operate a computer; use a mouse and keyboard; open software; cut, copy, and paste data and files; and acquire an internet connection (Lankshear & Knobel, 2008). The cognitive area of digital literacy focuses on activities such as critical thinking, problem solving, and decision making (Williamson, 2011) and includes the ability to “evaluate and apply new knowledge gained from digital environments” (Jones-Kavalier & Flannigan, 2006, p. 5). The third of Ng’s three categories – social literacies – covers a wide range of activities which together constitute the ability to communicate in a digital environment both socially and professionally, understand cyber security, follow “netiquette” protocols, and navigate discussions with care so as not to misrepresent or create misunderstandings (Ng, 2012). Of particular note, Ng captures the essence of digital literacy by showing how it exists at the intersection of the technical, cognitive, and social aspects of literacy, which are referred to as dimensions. Ng’s framework is not, however, a digital literacy framework itself. Instead, it provides a vehicle for exploring the various components of digital literacy at a conceptual level while remaining clear that the individual skills are at all times connected to and dependent upon each other.

A number of organizations publish their own frameworks for digital literacies, including the International Society for Technology in Education (ISTE), the American Association of College and Universities (AACU), the Organization for Economic Cooperation and Development (OECD), the American Library Association (ALA), and the Partnership for 21st Century Skills, among others (Dede, 2010). These frameworks differ somewhat in terminology and organization, but they all include similar skills. What follows is a brief overview of the different frameworks. See Figure 2 for a composite.


Figure 2. Major Frameworks for 21st Century Skills (American Library Association, 2013; Dede, 2010; SCONUL, 2016; Vockley & Lang, 2008)

Each framework comes from a slightly different angle and at times reflects the background from which it comes. The American Library Association (ALA) framework evolved out of the information literacy tradition of libraries; the American Association of College and Universities (AACU) and the Society of College, National and University Libraries (SCONUL) evolved from a higher education perspective; the Partnership for 21st Century Learning addresses K-12 education; and the ISTE is steeped in a more technical tradition. Even with these different areas of focus, the components of each framework are strikingly similar, although some are more detailed than others. Three of the six specifically address the skills necessary for accessing, searching, and finding information in a digital environment, while the other three have broader categories in which one might expect to find these skills, including research and information fluency, intellectual skills, and ICT literacy. Cognitive skills required for digital literacy are also covered by all of the frameworks in varying degrees of specificity. Among them one will find references to evaluating, understanding, creating, integrating, synthesizing, creativity, and innovation. Finally, four of the six frameworks acknowledge the necessity of solid communication skills. These are variously referred to as life skills, personal and social responsibility, communication, collaboration, digital citizenship, and collective intelligence.

What seems oddly missing from this list of skills is the technical component, which appears explicitly only in the ISTE list. The Partnership for 21st Century Learning uses ICT literacy as a designation for the ability to use technology, and the ALA, in discussing its framework, makes it clear that technical proficiency is a foundational requirement for digital literacy skills. Even with these references, the digital literacy frameworks are overwhelmingly partial to the cognitive and social dimensions, and technical proficiency tends to be glossed over. Even though technical skills receive relatively little attention by comparison, we will assume for this discussion that technical skills are a prerequisite to the other digital skills, and we will look more carefully at each of them in the next section.

To fully understand the many digital literacies, we will use the ALA framework as a point of reference for further discussion using the other frameworks and other materials to further elucidate each skill area. The ALA framework is laid out in terms of basic functions with enough specificity to make it easy to understand and remember but broad enough to cover a wide range of skills. The ALA framework includes the following areas:

  • Finding,
  • Understanding,
  • Evaluating,
  • Creating, and
  • Communicating (American Library Association, 2013).

Finding

Finding information in a digital environment represents a significant departure from the way human beings have searched for information for centuries. The learner must abandon older linear or sequential approaches to finding information, such as reading a book or using a card catalog, index, or table of contents, and instead use lateral approaches like natural language searches, hypermedia text, keywords, search engines, and online databases (Dede, 2010; Eshet, 2002). The shift from sequential to lateral involves developing the ability to construct meaningful search parameters (SCONUL, 2016), whereas before, finding information would have meant simply looking up page numbers in an index or sorting through a card catalog. Although finding information may depend to some degree on the search tool being used (library, internet search engine, online database, etc.), the search results also depend on how well a person is able to generate appropriate keywords and construct useful Boolean searches. Failure in these two areas can easily return too many results to be helpful, vague or generic results, or no useful results at all (Hangen, 2015).
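The effect of Boolean operators on a result set can be sketched with a toy example. The short Python sketch below is purely illustrative (the document titles and search terms are invented for demonstration, not drawn from any real search engine): it shows how requiring all terms (AND) and excluding terms (NOT) narrows a set of results.

```python
# Toy illustration of Boolean search logic: each "document" is just a title,
# and a query is a tuple of required terms plus a tuple of excluded terms.

documents = [
    "digital literacy frameworks in higher education",
    "digital photography basics",
    "information literacy for K-12 classrooms",
    "digital literacy skills for K-12 teachers",
]

def boolean_search(docs, all_of=(), none_of=()):
    """Return docs containing every term in all_of and no term in none_of."""
    results = []
    for doc in docs:
        if all(term in doc for term in all_of) and not any(term in doc for term in none_of):
            results.append(doc)
    return results

# "digital AND literacy" matches two titles...
print(boolean_search(documents, all_of=("digital", "literacy")))
# ...while "digital AND literacy NOT K-12" narrows the set to one.
print(boolean_search(documents, all_of=("digital", "literacy"), none_of=("K-12",)))
```

The same principle applies at the scale of a real search engine or library database: vague terms return an unmanageable flood, while well-chosen keywords and exclusions produce a result set small enough to evaluate.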

Not immediately obvious, but part of the challenge of finding information, is the ability to manage the results. Because there is so much data, changing so quickly and in so many different formats, it can be challenging to organize and store results in a useful way. SCONUL (2016) describes this as the ability to organize, store, manage, and cite digital resources, while the Educational Testing Service also specifically mentions the skills to access and manage information. Some ways to accomplish these tasks are through the use of social bookmarking tools such as Diigo, clipping and organizing software such as Evernote and OneNote, and bibliographic software. Many sites, such as YouTube, allow individuals with an account to bookmark videos as well as create channels or collections of videos for specific topics or uses. Other websites have similar features.
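As a minimal sketch of what "organizing and storing" results means in practice, the snippet below models the tag-based bookmark collections that tools like Diigo provide (the class, tags, and URLs here are invented for illustration, not an actual Diigo API):

```python
# A minimal tag-based bookmark store, in the spirit of social bookmarking
# tools such as Diigo: each saved resource carries descriptive tags, and
# retrieval is by tag rather than by a single folder hierarchy.

from collections import defaultdict

class BookmarkStore:
    def __init__(self):
        self._by_tag = defaultdict(set)  # tag -> set of URLs

    def add(self, url, tags):
        """Save one URL under any number of tags."""
        for tag in tags:
            self._by_tag[tag].add(url)

    def find(self, tag):
        """Return all URLs saved under a tag, sorted for stable output."""
        return sorted(self._by_tag[tag])

store = BookmarkStore()
store.add("https://www.sconul.ac.uk", ["digital-literacy", "frameworks"])
store.add("https://www.iste.org", ["frameworks", "standards"])

# One resource can live under several tags at once, so the lateral question
# "what have I saved about frameworks?" replaces the older linear question
# "which folder did I file this in?"
print(store.find("frameworks"))
```

The design choice worth noticing is that tagging is many-to-many: unlike a folder, a bookmark can appear in every collection it is relevant to, which is what makes these tools suited to fast-changing, multi-format digital material.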

Understanding

Understanding in the context of digital literacy perhaps most closely resembles traditional literacy in that it, too, is the ability to read and interpret text (Jones-Kavalier & Flannigan, 2006). In the digital age, however, the ability to read and understand extends much further than text alone. For example, searches may return results with any combination of text, video, sound, and audio, as well as still and moving pictures. As the internet has evolved, a whole host of visual languages has evolved with it, such as moving images, emoticons, icons, data visualizations, videos, and combinations of all of the above. Lankshear & Knobel (2008) refer to these modes of communication as “post-typographic textual practice.” Understanding the variety of modes of digital material may also be referred to as multimedia literacy (Jones-Kavalier & Flannigan, 2006), visual literacy (Tyner, 1998), and digital literacy (Buckingham, 2006).

Evaluating

Evaluating digital media requires competencies ranging from judging the importance of a piece of information to determining its accuracy and its source. Evaluating information is not new to the digital age, but the nature of digital information can make it more difficult to understand who the source of information is and whether it can be trusted (Jenkins, 2018). When there is abundant and rapidly changing data across heavily populated networks, anyone with access can generate information online, making decisions about its authenticity, trustworthiness, relevance, and significance daunting. Learning evaluative digital skills means learning to ask questions about who is writing the information, why they are writing it, and who the intended audience is (Buckingham, 2006). Developing critical thinking skills is part of the literacy of evaluating and assessing the suitability of a specific piece of information for a given use (SCONUL, 2016).

Looking for secondary sources of information can help confirm the authenticity and accuracy of online data, and researching the credentials and affiliations of the author is another way to find out whether an article is trustworthy or valid. One may find other places the author has been published and verify that they are legitimate. Sometimes one can review affiliated organizations to attest to the expertise of the author, such as finding out where the author works, whether they are a member of a professional organization, or whether they are a leading researcher in a given field. All of these provide essential clues for evaluating information online.

Creating

Creating in the digital world makes explicit the production of knowledge and ideas in digital formats. While writing is a critical component of traditional literacy, it is not the only creative tool in the digital toolbox. Other creative activities include podcasting, making audio-visual presentations, building data visualizations, 3D printing, writing blogs, and using new tools that haven’t even been thought of yet. In short, a digitally literate individual will want to be able to create in all of the formats in which digital information may be consumed. A key component of creating with digital tools is understanding what constitutes fair use and what is considered plagiarism. While this is not new to the digital age, it may be more challenging to find the line between copying and extending someone else’s work.

In part, the reason for the increased difficulty of finding the line between plagiarism and new work is the “cut and paste culture” of the internet, referred to as “reproduction literacy” (Eshet, 2002, p. 4) and as appropriation in Jenkins’ New Media Literacies (Jenkins, 2018). The question is, what can one change, and how much can one change a work, without it being considered copying? This skill requires the ability to think critically, evaluate a work, and make appropriate decisions. There are tools and information to help answer those questions, such as Creative Commons licensing. Learning about these resources and how to use them is part of this digital literacy.

Communicating

Communicating is the final category of digital skills in the ALA digital framework. The capacity to connect with individuals all over the world creates unique opportunities for learning and sharing information, for which developing digital communication skills is vital. Some of the skills required for communicating in a digital environment include digital citizenship, collaboration, and cultural awareness. This is not to say that one does not need to develop communication skills outside of the digital environment, but the skills required for digital communication go beyond what is required in a non-digital environment. Most of us are adept at personal, face-to-face communication, but digital communication requires the ability to engage in asynchronous environments such as email, online forums, blogs, and social media platforms, where what we say can’t always be deleted but can easily be misinterpreted. Add to that an environment where people number in the millions, and opportunities for misunderstandings and cultural miscues become much more likely.

The communication category of digital literacies covers an extensive array of skills above and beyond what one might need for face-to-face interactions. It includes competencies around ethical and moral behavior and responsible communication for engagement in social and civic activities (Adam Becker et al., 2017), an awareness of audience, and an ability to evaluate the potential impact of one’s actions online. It also includes skills for handling privacy and security in online environments. These activities fall into two main categories: digital citizenship and collaboration.

Digital citizenship refers to one’s ability to interact effectively in the digital world. Part of this skill is good manners, often referred to as “netiquette.” There is a level of context which is often missing in digital communication due to physical distance, lack of personal familiarity with the people online, and the sheer volume of people who may come in contact with our words. People who know us well may understand exactly what we mean when we say something sarcastic or ironic, but those vocal and facial cues are missing in most digital communication, making it more likely we will be misunderstood. We are also more likely to misunderstand or be misunderstood if we remain unaware of cultural differences among people online. So, digital citizenship includes an awareness of who we are, what we intend to say, and how it might be perceived by people we do not know (Buckingham, 2006). It is also a process of learning to communicate clearly and in ways that help others understand what we mean.

Another key digital skill is collaboration, which is essential for effective participation in digital projects via the internet. The internet allows people to engage and work toward common goals with others they may never see in person, be those goals social, civic, or business oriented. Creating a community and working together requires a degree of trust and familiarity that can be difficult to build given the physical distance between participants. Greater attention must be paid to inclusive behavior, and more explicit efforts need to be made to make up for perceived or actual distance and disconnectedness. So, while the promise of digital technology to connect people is impressive, the transition is not automatic, and it requires new skills.

Parting thoughts.

It is clear from our previous discussion of digital literacy that technology and technical skills underpin every other digital skill. A failure to understand hardware, software, the nature of the internet, and cloud-based technologies, or an inability to learn new concepts and tools going forward, handicaps one’s ability to engage with the cognitive and social literacies. While there are sometimes tacit references to technical skills and ability, extant digital literacy frameworks tend to focus more on the cognitive and social aspects of digital environments. There is an implied sense that once technical skills are learned, the digitally literate person can forget about them and move on to the other skills. Given the rapid pace of technological change over the last 40 years, however, anyone working in a digital environment would be well advised to keep in mind that technical concepts and tools continue to develop. It does not seem likely that we will ever reach a point where people can simply take technological skills for granted, and to do so would undermine our ability to address the other digital skills.

Another way to think of this is to recognize that no matter what the skill, none of them operate independently of one another. Whether searching, creating, evaluating, understanding or communicating, it is a combination of skills (or literacies) that allow us to accomplish our goals. Thinking critically, and evaluating information and sources leads to sound decision-making. Understanding and synthesizing information is necessary for creating and again the technical tools are necessary for completing the product. Finding information is of little use if one is unable to analyze its usefulness and creating a great video or podcast will not mean much if one is unable to navigate social and professional networks to communicate those works to others. If only understood in isolation, digital literacies have little meaning and can be of little use in approaching digital environments.

Ng’s (2012) conceptual framework reminds us that digital literacy is the space where technical, cognitive, and social literacies overlap. A digital skill is not the same thing as digital literacy, but the two are fully intertwined. Acquiring digital skills is only the beginning of a study of digital literacies, however, and it would be a mistake to stop there. Digital literacies span multiple areas, including both the cognitive and the social. The real value of digital literacy lies in understanding the synergistic effect of individual digital literacy skills integrated into sets of competencies that enable one to work effectively in the digital world.

Learning Activities.

Literacy Narratives are stories about reading and composing in any form or context. They often include poignant memories that involve a personal experience with literacy. Digital literacy narratives can sometimes be categorized as narratives that focus on how the writer came to understand the importance of technology in his/her life or teaching pedagogy. More often, they are simply narratives that use a medium beyond the print-based essay to tell the story.

See, for example, the webtext by Bourelle et al. in Kairos: A Journal of Rhetoric, Technology, and Pedagogy, 20(1), available at http://kairos.technorhetoric.net/20.1/praxis/bourelle-et-al

  • Combining both aspects of the genre, write a piece based on your technological literacy, choosing a medium you feel best conveys the message you want to share with your audience.
  • Find and read 2-4 literacy narratives online that emphasize the use of technology and write a short reflection that discusses the main digital literacies used, summarizes the main points made and describes the elements you felt were most important. Also, describe any digital literacy skills you utilized to complete the assignment.
  • Create your literacy narrative that tells the story of a significant experience of your own with digital literacy. Use a multi-modal tool that includes audio and images or video. Share with your classmates and discuss the most important ideas you noticed in others’ narratives.
  • Compare two of the literacy frameworks in Figure 2. How are they alike? How are they different? Do you like one better than the other? Why or Why not?
  • Digital Literacy and why it matters – https://www.youtube.com/watch?v=p2k3C-iB88w
  • The essential elements of digital literacies https://www.youtube.com/watch?v=A8yQPoTcZ78
  • What is a Literacy Narrative? https://www.youtube.com/watch?v=_Mhl2j-cpZo
  • Benji Bissman’s computer literacy narrative – http://daln.osu.edu/handle/2374.DALN/2327 [site can’t be reached, KE 6.12.24]  
  • Global Digital Literacy Council [page not found, KE 6.12.24]
  • International Society for Technology in Education
  • Information and Communication Technologies [site can’t be reached, KE 6.12.24]
  • Education Development Center, Inc.
  • International Visual Literacy Association
  • http://mediasmarts.ca/digital-media-literacy-fundamentals/digital-literacy-fundamentals
  • https://www.microsoft.com/en-us/digitalliteracy/overview.aspx [page not found, KE 6.12.24]
  • http://info.learning.com/hubfs/Corp_Site/Sales%20Tools/12EssentialSkills_Brochure_Apr16.pdf [page not found, KE 6.12.24]
  • http://www.digitalliteracy.us
  • https://k12.thoughtfullearning.com/FAQ/what-are-literacy-skills

References.

Adam Becker, S., Cummins, M., Davis, A., Freeman, A., Hall Gieseinger, C., & Ananthanarayanan, V. (2017). NMC Horizon Report: 2017 Higher Education Edition. The New Media Consortium. ISBN 978-0-9977215-7-7

American Library Association. (2013, January). Digital literacy, libraries, and public policy. Washington, D.C. Retrieved from http://www.districtdispatch.org/wp-content/uploads/2013/01/2012_OITP_digilitreport_1_22_13.pdf

Bawden, D. (2008). Origins and concepts of digital literacy. In C. Lankshear & M. Knobel (Eds.), Digital literacies: Concepts, policies and practices (pp. 17–32).

Buckingham, D. (2006). Defining digital literacy. District Dispatch, 263–276. https://doi.org/10.1007/978-3-531-92133-4_4

Clanchy, M. (1983). Looking back from the invention of printing. In Resnick (Ed.), Literacy in historical perspective (pp. 7–22). Library of Congress.

Dede, C. (2010). Comparing frameworks for 21st century skills. 21st Century Skills: Rethinking How Students Learn, 51–76.

Eshet, Y. (2002). Digital literacy: A new terminology framework and its application to the design of meaningful technology-based learning environments. Association for the Advancement of Computing in Education, 1–7.

Gilster, P. (1997). Digital Literacy. New York: Wiley Computer Pub.

Hangen, T. (2015). Historical digital literacy, one classroom at a time. Journal of American History. https://doi.org/10.1093/jahist/jav062

Heitin, L. (2016). Digital Literacy: Forging agreement on a definition. Retrieved from www.edweek.org/go/changing-literacy

Jenkins, H. (2018). Project New Media Literacies. Retrieved from http://www.newmedialiteracies.org/

Jones-Kavalier, B. R., & Flannigan, S. L. (2006). Connecting the digital dots: Literacy of the 21st century. Workforce, 29(2), 8–10.

Kaestle, C. F. (1985). The history of literacy and the history of readers. Review of Research in Education, 12, 11–53. Retrieved from http://www.jstor.org/stable/1167145

Koltay, T. (2011). The media and the literacies: Media literacy, information literacy, digital literacy. Media, Culture & Society, 33(2), 211–221. https://doi.org/10.1177/0163443710393382

Lankshear, C., & Knobel, M. (2008). Introduction. In C. Lankshear & M. Knobel (Eds.), Digital literacies: Concepts, policies and practices. ISBN 9781433101694

Mohammadyari, S., & Singh, H. (2015). Understanding the effect of e-learning on individual performance: The role of digital literacy. Computers and Education, 82, 11–25. https://doi.org/10.1016/j.compedu.2014.10.025

Ng, W. (2012). Can we teach digital natives digital literacy? Computers and Education, 59(3), 1065–1078. https://doi.org/10.1016/j.compedu.2012.04.016

Pangrazio, L. (2014). Reconceptualising critical digital literacy. Discourse: Studies in the Cultural Politics of Education, 37(2), 163–174. https://doi.org/10.1080/01596306.2014.942836

Reynolds, R. (2016). Defining, designing for, and measuring social constructivist digital literacy development in learners: a proposed framework. Educational Technology Research and Development. https://doi.org/10.1007/s11423-015-9423-4

SCONUL. (2016). The SCONUL 7 Pillars of Information Literacy through a digital literacy “lens.” Retrieved from https://www.sconul.ac.uk/sites/default/files/documents/Digital_Lens.pdf

Tyner, K. (1998). Literacy in a digital world: Teaching and learning in the age of information (Kindle ed.). Routledge.

UNESCO. (2004). The plurality of literacy and its implications for policies and programmes (UNESCO Education Sector Position Paper).

Visser, M. (2012). Digital literacy definition. Retrieved from http://connect.ala.org/node/181197

Vlieghe, J. (2015). Traditional and digital literacy. The literacy hypothesis, technologies of reading and writing, and the “grammatized” body. Ethics and Education. https://doi.org/10.1080/17449642.2015.1039288

Vockley, M., & Lang, V. (2008). 21st century skills, education & competitiveness. Retrieved from http://www.p21.org/storage/documents/21st_century_skills_education_and_competitiveness_guide.pdf

Williams, B. T. (2006). Girl power in a digital world: Considering the complexity of gender, literacy, and technology. Journal of Adolescent & Adult Literacy, 50(4), 300–307. https://doi.org/10.1598/JAAL.50.4.6

Williamson, R. (2011). Digital literacy. EPI Education Partnerships, inc. Retrieved from http://www.iste.org/standards/aspx

This resource is available at no cost at https://open.library.okstate.edu/learninginthedigitalage/

Links checked 6.12.24 KE

Learning in the Digital Age Copyright © 2020 by Cathy L. Green, Oklahoma State University is licensed under a Creative Commons Attribution 4.0 International License , except where otherwise noted.


Project New Yorker

What Is Digital Literacy and Why Is It Important?

  • Updated on March 6, 2024


Digital literacy comprises the skills required to understand and use digital technologies. It involves people’s critical thinking and their ability to navigate online environments.

In the modern world, digital literacy means more than just using technology. It is about the competence to access, analyze, create, and communicate in an increasingly digital world.

Digital literacy is not a single skill; it spans various dimensions, including knowledge of basic hardware and software, an understanding of internet navigation, and the ethical use of online platforms.

Everyone, from students to professionals, must cultivate these competencies to thrive in modern society. As the digital landscape evolves, staying digitally literate is a continuous journey, essential for participation in the global economy and for engaging with a broader community in an effective, safe, and secure manner.

What Is Digital Literacy

Digital literacy has become a fundamental skill in the 21st century, much as reading and writing were in previous centuries. It encompasses a range of abilities that are essential for navigating the technology-saturated world we live in today. Digital literacy is not merely about knowing how to use a computer; it involves critical thinking and digital problem-solving skills that enable individuals to engage, express, collaborate, and communicate in all aspects of life.

According to the American Library Association (ALA), digital literacy is the ability to search, evaluate, and communicate information through technology. Beyond the use of digital tools, it is also a matter of understanding how to use them effectively and responsibly.

This type of literacy equips you with the skills to navigate the digital realm safely and confidently.

Importance Of Digital Literacy

The relevance of digital literacy cannot be overstated. In the digital age, the ability to understand and utilize technology influences every aspect of life, from personal growth to participation in the global digital economy. Some key points highlighting its significance are:

  • Employment Opportunities: Proficiency in digital tools is often a prerequisite for job candidates.
  • Educational Resources: Digital skills open up a wealth of learning materials online.
  • Global Communication: The internet breaks down geographical barriers, allowing instant interaction with diverse cultures.
  • Information Literacy: Being digitally literate aids in discerning the credibility of online content.

Evolution Of Digital Literacy

The concept of digital literacy has evolved as technology has advanced. It began with the basic ability to operate a computer and now includes a wide range of competencies:

  • 1990s: Typing, Basic Computing
  • 2000s: Email, Internet Browsing
  • 2010s: Social Media, Cloud Services
  • 2020s: AI Interaction, Data Analytics

This progression highlights an ongoing shift toward more complex and integrated uses of technology. To stay current, individuals and organizations must continuously adapt and refine their digital skill sets.

Digital Literacy Skills

Understanding digital literacy skills is essential in our technology-driven age. To thrive in the digital realm, mastering these competencies is as crucial as learning to read and write. In essence, digital literacy encompasses a wide array of skills, ranging from information evaluation to the responsible use of online platforms. Here, we delve into the core abilities that define a digitally literate individual.

Critical Thinking And Evaluation

In this era of globalization, critical thinking and evaluation are paramount. Digital users must discern credible information from misleading content. This requires:

  • Analysis of digital resources for accuracy and bias
  • Assessment of the relevancy and currency of online data
  • Understanding the nuances between opinion and fact

Equipped with these skills, users can navigate the vast digital landscape with a discerning eye, ensuring the veracity and usefulness of the information they consume.

Communication And Collaboration

The digital environment extends beyond individual use; it has become a social space where communication and collaboration reign supreme. Core skills include:

  • Effective digital communication using various platforms like email, forums, and social media
  • Collaboration using online tools such as shared documents, project management software, and virtual meeting spaces
  • Understanding digital etiquette, including tone, privacy, and respect for intellectual property

With these capabilities, one can participate in global conversations, contribute to group projects, and foster community regardless of physical boundaries.

Privacy And Security Awareness

Digital users must be vigilant about privacy and security. Key elements of this critical skill set include:

  • Personal Information Management: Controlling what personal data is shared and understanding how it can be used against you
  • Use of Privacy Settings: Adjusting configurations on social media and other online platforms to protect your identity
  • Security Practices: Implementing strong passwords, using reliable antivirus software, and recognizing phishing attempts

A solid grounding in these areas is indispensable for ensuring personal and professional data remains secure from various threats.
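The security practices above—strong passwords in particular—can be made concrete with a short sketch. The function below is a toy illustration, not a recommendation from this article: it scores a password against simple length and character-class heuristics. The name `password_strength` and its scoring thresholds are invented for this example; production systems should use vetted strength estimators rather than ad hoc rules.

```python
import re

def password_strength(password: str) -> str:
    """Classify a password as 'weak', 'fair', or 'strong' using
    simple heuristics (illustrative only -- real systems should
    rely on vetted strength estimators)."""
    score = 0
    if len(password) >= 12:                                        # reasonable length
        score += 1
    if re.search(r"[a-z]", password) and re.search(r"[A-Z]", password):  # mixed case
        score += 1
    if re.search(r"\d", password):                                 # contains a digit
        score += 1
    if re.search(r"[^A-Za-z0-9]", password):                       # special character
        score += 1
    if score >= 4:
        return "strong"
    if score >= 2:
        return "fair"
    return "weak"
```

Under these invented rules, a short mixed-case password with a digit but no special character would rate only "fair", which mirrors the article's advice that length and variety both matter.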

Principles of Digital Literacy

Digital literacy is grounded in four fundamental principles, each forming part of the core of digital literacy skills. The principles are as follows:

Understand What You See Online: Digital literacy means knowing how to really get what you’re looking at on the internet. It’s not just about the surface stuff, but digging deeper to grasp the details on digital platforms.

Realize Everything Is Connected: Digital stuff on the web is like a big web of connections. When you know how things link up, you can use digital content better.

Know People Are Different: Remember, online, people come from all walks of life: different ages, educations, jobs, and families. All of these things influence how content is made, shared, and stored online.

Learn to Organize What You Find: Think of digital curation as creating your own online library. You can pick and choose what you want to keep, so you can easily find it later.

These four principles are like a map for navigating the digital world, helping you understand what you see, connect the dots, respect diversity, and stay organized in the vast world of the internet.

Digital Literacy For Lifelong Learning

Embracing digital literacy is critical in an evolving digital world, especially given its profound impact on lifelong learning. This kind of learning is not confined to the classroom; it extends into every facet of our lives.

Digital literacy enhances the ability to learn new skills, adapt to changing environments, and access vast resources available through technology. Let’s delve into the various ways digital literacy propels learners of all ages towards ongoing personal and professional development.

Professional Development

In today’s fast-paced world, staying competitive means never standing still. Professional development is a perpetual journey, and digital literacy offers the tools necessary for this quest. With digital competencies, individuals can:

  • Enhance job performance by keeping up with industry trends and software updates.
  • Participate in online workshops, webinars, and courses that fit within busy schedules.
  • Collaborate with global experts, gaining insights and exchanging ideas.

Access To Online Resources

The internet is a treasure trove of information and learning materials—from scholarly articles to instructional videos, educational games to digital libraries. Those who are digitally literate unlock the potential to:

  • Discover a wide range of resources tailored to specific learning needs and interests.
  • Drive self-paced learning by accessing online tutorials, courses, and certification programs.
  • Engage with interactive and multimedia content for a more enriching learning experience.

Navigating The Digital Landscape

The digital landscape is vast, and navigating it can be daunting without the necessary skills. Digital literacy empowers individuals to move through the digital world with confidence and ease. Digitally literate users can:

  • Identify credible information and avoid the pitfalls of misinformation.
  • Use various tools and platforms effectively, from social media to cloud storage.
  • Protect personal information and privacy through understanding cybersecurity basics.

Digital Literacy Skills and Their Everyday Use

Digital literacy skills are necessary for people to navigate the digital world effectively. These skills can be applied in the following ways:

  • Digital Media: Digital literacy skills allow people to navigate digital media in everyday life with ease and comfort.
  • Researching: Individuals gain the ability to find information and identify reliable sources across different digital platforms.
  • Social Inclusivity: Digital literacy skills remove barriers of location, increasing social opportunities and connection with others online.
  • Understanding Digital Media: With these skills, people become more familiar with common digital media terms and can navigate digital features in their totality.

How to Develop Digital Literacy Skills

Here are some tips from experts to develop and strengthen digital literacy skills. 

  • Experiment with digital tools to understand how they work.
  • Stay updated about the latest technologies to remain relevant.
  • Focus on technology devices and digital platforms that align with your needs and preferences.
  • Be open to learning new things as the digital landscape is always evolving.
  • Enroll in online courses to gain insight into different digital technologies and their uses.
  • Seek help when encountering new technology devices or platforms.

Frequently Asked Questions About Digital Literacy

What Is Meant By Digital Literacy?

Digital literacy refers to an individual’s ability to find, evaluate, and compose clear information through writing and other mediums on various digital platforms. It is crucial for effective communication and problem-solving in the digital age.

Why Is Digital Literacy Important?

Digital literacy is important as it empowers people to navigate technology confidently. This includes safely managing personal data, understanding online resources, and being proficient in digital tools, which are vital skills in today’s tech-driven world.

How Can You Improve Your Digital Literacy?

Improving digital literacy involves regular use of technology, seeking education through online courses or workshops, and practicing critical evaluation of digital content. Staying updated with technological advancements also greatly enhances digital literacy skills.

What Are The Key Components Of Digital Literacy?

Key components of digital literacy include technical skills to use devices, cognitive skills for critical thinking, and ethical understanding for responsible use of technology.

Bottom Line

Digital literacy has become essential for individuals navigating the digital world. Developing these skills ensures that people can explore digital platforms safely and responsibly.


TeachThought

5 Dimensions Of Critical Digital Literacy: A Framework

Digital Literacy is increasingly important in an age where many students read more from screens than they do from books.

What Are The Dimensions Of Critical Digital Literacy?

by TeachThought Staff

In fact, the very definition of many of these terms is changing as the overlap across media forms increases. Interactive eBooks can function like both long-form blogs and traditional books. Threaded email can look and function like social media. Email and texting and social media messaging are increasingly similar. Videos are live-streamed, then curated to a YouTube channel where a tool transcribes them into a podcast. And so on.

This is the modern digital era.

The above framework was developed by Juliet Hinrichsen and Antony Coombs at the University of Greenwich. Explaining its origins, they describe the model as “a framework to articulate the scope and dimensions of digital literacies. It is based on an established model of literacy which is underpinned by critical perspectives (the Four Resources Model of Critical Literacy, after Luke & Freebody). It has been adapted for the digital context.”

The framework is minimalist in design, forgoing any kind of analysis of each dimension, or examples of how readers may use them, but that’s part of its charm: At a glance, it refracts digital literacy rather succinctly.

5 Dimensions Of Digital Literacy

1. Decoding

Focus: the media–modes, structures, and conventions of digital media

2. Meaning Making

Focus: the reader–style, purpose, interpretation

3. Analyzing

Focus: the author–aesthetics, ethics, and related choices

4. Persona

Focus: a community–how others perceive the issue, topics, and context

5. Using

Focus: a marriage of self and community–problem-solving and data acquisition for a variety of authentic–and changing–purposes

5 Dimensions Of Critical Digital Literacy: A Framework; image attribution: Juliet Hinrichsen and Antony Coombs at the University of Greenwich.

TeachThought is an organization dedicated to innovation in education through the growth of outstanding teachers.

The Digital Literacy Imperative

Photo: Yasuyoshi Chiba/AFP/Getty Images

Brief by Romina Bandura and Elena I. Méndez Leal

Published July 18, 2022

The Issue: 

  • Digital literacy has become indispensable for every global citizen, whether to communicate, find employment, receive a comprehensive education, or socialize. Acquiring the right set of digital skills is not only important for learning and workforce readiness but also vital to fostering more open, inclusive, and secure societies.
  • Digital literacy, like other competencies, should start at school. However, many education systems lack the proper infrastructure, technological equipment, teacher training, or learning benchmarks to effectively integrate digital literacy into curriculums.
  • Effective strategies to address digital literacy and skill-building will require public and private investments in digital infrastructure, policy and governance frameworks, and training in the use of digital technologies.
  • The U.S. government—particularly through the work of the U.S. Agency for International Development (USAID), its premier development agency—can partner with the private sector, local organizations, and civil society to lead and support an international coalition on digital skills. It can lead in this space by convening a multistakeholder working group on digital skills, investing in skills development among vulnerable and excluded populations (such as women and girls), and enhancing digital skills in basic curriculums.

Introduction

Reading, writing, and numeracy: these are foundational skills people learn at school and continue using throughout their lifetimes. But as societies evolve and technology progresses, the learning needs and demands of one generation change for the next. Curriculums in educational institutions must keep up with these changes to reflect the new realities. They do so by removing outdated content, incorporating new disciplines, and innovating with new educational tools and techniques. While previous American generations learned Latin and shorthand, current generations learn Spanish or French and practice typing. In many public schools across the United States, cursive handwriting is no longer taught. Children now practice writing and typing using new technology such as tablets and computers, not typewriters. In advanced countries, educational equipment such as blackboards, chalkboards, and even whiteboards have been replaced with high-tech tools such as Promethean boards.

While numeracy and basic literacy are still fundamental to learning, digital literacy has emerged as another critical life skill and is now, per the World Economic Forum, part of the twenty-first-century toolkit (see Figure 1). Beyond basic literacy, digital skills have become indispensable for every global citizen, whether to communicate, find employment, receive a comprehensive education, or socialize. More than 90 percent of professional roles across sectors in Europe require a basic level of digital knowledge and understanding. This need became even more evident during the Covid-19 pandemic, making it more urgent for countries to embrace digital technologies and their associated skills.

Figure 1: The Skills Toolkit for the Twenty-First Century

To keep up with technological advancements, companies need to hire employees that have the right skills. However, workforces are not always equipped with the requisite digital skills, and businesses often struggle to find qualified talent. Digital skills are at a premium, even in advanced economies. For instance, the European Union’s Digital Economy and Society Index (DESI) shows that approximately 42 percent of Europeans lack basic digital skills, including 37 percent of those in the workforce. Women are particularly underrepresented in tech-related professions; only one in six information and communications technology (ICT) specialists and one in three science, technology, engineering, and mathematics (STEM) graduates are women.

However, acquiring the right set of digital skills is not only important for learning and workforce readiness: digital skills are also vital to fostering more open, inclusive, and secure societies. When people interact with digital infrastructure, they need to be aware of the privacy and data risks as well as cybersecurity challenges (e.g., ransomware and phishing attacks). Thus, digital literacy also includes handling security and safety challenges created by technology. At the same time, with the rise of digital authoritarianism , misinformation, and disinformation , as well as limitations on personal freedoms, it is equally important to maintain a values framework for digital transformation.

Acquiring the right set of digital skills is not only important for learning and workforce readiness: digital skills are also vital to fostering more open, inclusive, and secure societies.

Digital Literacy in a Continuum

What exactly does “digital literacy” entail? There are many competing definitions, but it can be thought of as the ability to use digital technologies—both hardware and software—safely and appropriately. According to the UN Educational, Scientific, and Cultural Organization (UNESCO), this includes competencies such as using ICT, processing information, and engaging with media. However, digital skills do not exist in a vacuum; they interact with other capabilities such as general literacy and numeracy, social and emotional skills, critical thinking, complex problem solving, and the ability to collaborate.

Digital skills also have different degrees of complexity. According to the International Telecommunication Union (ITU), digital skills exist along a continuum ranging from basic to intermediate to advanced (see Figure 2). Basic digital skills comprise the effective use of hardware (e.g., typing or operating touch-screen technology), software (e.g., word processing, organizing files on laptops, and managing privacy settings on mobile phones), and internet/ICT tasks (e.g., emailing, browsing the internet, or completing an online form). Intermediate digital skills comprise the ability to critically evaluate technology or create content; they are characterized as “job-ready skills” and include desktop publishing, digital graphic design, and digital marketing. Finally, specialists use advanced digital skills in ICT professions such as computer programming and network management. Many technology-sector jobs now require advanced digital skills related to such innovations as artificial intelligence (AI), big data, natural language processing, cybersecurity, the Internet of Things (IoT), software development, and digital entrepreneurship.
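As a toy illustration of the ITU continuum described above, the snippet below maps example tasks to the three tiers. The tier names come from the brief; the task lists and the `tier_of` helper are invented for this sketch and are not part of the ITU framework itself.

```python
# Toy mapping of example tasks to the ITU's basic/intermediate/advanced
# digital-skills continuum. Task names are illustrative examples only.
SKILL_TIERS = {
    "basic": ["typing", "emailing", "browsing", "completing an online form"],
    "intermediate": ["desktop publishing", "digital graphic design",
                     "digital marketing"],
    "advanced": ["computer programming", "network management",
                 "cybersecurity"],
}

def tier_of(task: str) -> str:
    """Return the continuum tier for a known task, or 'unknown'."""
    for tier, tasks in SKILL_TIERS.items():
        if task in tasks:
            return tier
    return "unknown"
```

The point of the structure is that the tiers are ordered: each tier presumes the ones below it, so a curriculum can target the lowest tier a learner has not yet mastered.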

Figure 2: The Digital Skills Continuum

Digital literacy, like other competencies, should start at school. But many education systems are not equipped to teach children these skills because they lack the proper infrastructure, technological equipment, teacher training, curriculum, or learning benchmarks. This gap is further pronounced in developing countries. A 2020 study conducted in Chile, Ecuador, Mexico, and Peru assessed teachers’ digital skills and readiness for remote learning, finding that 39 percent of teachers were only able to execute basic tasks, 40 percent were able to perform basic tasks and use the internet to browse or send email, and only 13 percent of teachers could do more complex functions.

Moreover, enhancing digital literacy goes beyond providing access to computers, smartphones, or tablets. Although nearly half of the world is still offline, supplying hardware alone will be insufficient for acquiring digital literacy. Indeed, beyond the estimated $428 billion in investment required to close the digital coverage gap, there is little information on the total investment or demand for addressing this issue. There are alternative models for delivering digital literacy—particularly to vulnerable and under-connected communities—including interactive voice response (IVR), solar-powered devices, downloadable learning, and feature phones. Despite the many innovations in this space, there is scant evidence of what works or what can be scaled.

Although nearly half of the world is still offline, supplying hardware alone will be insufficient to acquire digital literacy.

Multilateral Efforts on Digital Literacy

The need to equip current and future generations with the necessary skills is captured in the United Nations’ sustainable development goals (SDGs). Under target 4.4, the United Nations nudges countries to increase the proportion of youth and adults who have skills relevant for the job market. More specifically, indicator 4.4.1 calls on governments to measure the proportion of youth and adults with ICT skills.

In this regard, international organizations, the private sector, and national governments have established initiatives on digital literacy (see text box). For example, frameworks that measure digital skills target different societal sectors, including primary and secondary schools, government structures, and specific industries. The most relevant frameworks include the ITU’s Digital Toolkit, the Eurostat digital skills indicator survey, and the European Union’s Digital Competence 2.0 Framework for Citizens. In addition, the DQ Framework aggregates 24 leading international frameworks to promote digital literacy and digital skills around the world.

Digital literacy is both an international and local issue. Countries and regions will require tailored approaches to meet their unique needs and contexts. Some governments are putting together strategic plans to increase citizens’ digital literacy, albeit for different purposes. For example, the Republic of Korea has prioritized fostering digital skills in public administration officials to improve efficiency in delivering public services. Meanwhile, Oman has used Microsoft’s Digital Literacy curriculum to improve the ICT industry’s workforce and prepare youth for employment. In 2019, the Ukrainian government launched a national digital education platform called Diia Digital Education, offering over 75 courses and teaching materials to its citizens. Through its skills agenda, the European Union has set a target to ensure that 70 percent of adults have basic digital skills by 2025 and to cut the percentage of teens who underperform in computing and digital literacy from 30 percent in 2019 to 15 percent by 2030. Ghana has partnered with the World Bank’s Digital Economy for Africa initiative, launching a $212 million “eTransform” program to increase training, mentoring, and access to technologies.

Digital literacy is both an international and local issue. Countries and regions will require tailored approaches to meet their unique needs and contexts.

Examples of International Digital-Skills Initiatives

USAID released its first four-year digital strategy plan in 2020 to achieve safe and inclusive digital ecosystems in developing countries. In April 2022, the agency also released its Digital Literacy Primer, which aims to raise awareness of how digital literacy can contribute to global development goals and what role USAID can play in improving digital-literacy programming and initiatives.

UNESCO has partnered with Pearson’s Project Literacy program to create digital-literacy guidelines for nongovernmental organizations, governments, and the private sector to utilize when pursuing digital-literacy projects. UNESCO’s Institute for Information Technologies in Education (IITE) hosts discussion events, provides training programs to educators and schools, and advocates for digital-literacy education policies.

The Organization for Economic Cooperation and Development’s (OECD) Program for International Student Assessment (PISA) conducted a report on access to digital technology, in which it promoted initiatives to combat growing digital divides. In addition, the OECD’s Going Digital project explains to policymakers why digital development is crucial.

The ITU operates projects around the world centered on digital inclusion through “digital transformation centers,” “smart villages,” and digitizing government services. The ITU and Cisco Systems have also partnered to address digital-skills and -literacy gaps in developing countries.

The World Bank’s Development Data Group was established to expand access and promote digital-skills development across multiple sectors. In World Development Report 2021: Data for Better Lives, the World Bank offers recommendations for digital-literacy campaigns and strategies. This builds upon the multiple projects it has instituted in several regions to support digital literacy since 2006, as well as its joint program with the EQUALS Global Partnership and local organizations to increase digital skills and literacy for women and girls in Rwanda, Nigeria, and Uganda.

The UN Children’s Fund (UNICEF) explored digital-literacy frameworks in 2019, seeking to partner with multiple stakeholders and expand its research on this issue. Thus far, UNICEF primarily researches and publishes reports on digital-literacy plans but does include related goals and resources in its various education programs.

DO4Africa aims to expand digital innovation and projects on the continent, including by promoting digital literacy.

The Role of the United States in Digital Literacy

The Covid-19 pandemic has made evident the need to increase the adoption of innovative digital solutions and, in turn, build the skills that can accompany this wave of digitization. While the impact of the pandemic has increased opportunities for digital payments, e-services, and telework, digital technologies will not foster development and inclusion on their own. Effective strategies to address digital literacy and skill-building will require public and private investments in digital infrastructure, policy and governance frameworks, and training in the use of digital technologies.

In this regard, the U.S. government—particularly through the work of USAID, its premier development agency—can partner with the private sector, local organizations, and civil society to lead and support an international coalition on digital skills. Through its Digital Strategy, USAID is breaking silos internally, incorporating “digital” as an all-encompassing issue across the agency’s strategies and operations. This puts the United States ahead of the curve: among bilateral donor agencies in the OECD, only 12 countries have digitalization strategies within their organizations.

First, the United States should convene a multistakeholder working group on digital skills. Although many important institutions are working in this sector (including UNESCO, the ITU, the OECD, and the World Bank), USAID can play an important role in supporting a values-based approach to digital transformation and digital literacy. Donors need to think carefully about the principles and values being embedded into digital systems. Without strong guiding standards (such as the Principles for Digital Development) and values for these systems, we risk empowering malign actors instead of lifting people out of poverty; we risk enabling surveillance, disinformation, and digital authoritarianism instead of personal freedom and financial inclusivity; and we risk the wrong kind of systemic change by destabilizing the financial system and entrenching existing inequalities.

To ensure maximum lasting impact, public and private organizations need to work together in a skills-development ecosystem, with more actors connecting through digital platforms and learning. As the saying goes, “If you want to walk fast, you walk alone; if you want to walk far, you walk together.” In this regard, USAID has a comparative advantage in engaging with the private sector. Specifically, the agency could convene a multi-pronged approach that effectively coordinates efforts from the international donor community, multilateral institutions, local governments, and companies. This includes taking stock of existing collaborations and developing a single channel through which donors can communicate and coordinate. The U.S. government and relevant agencies should engage at not just the bilateral level but also multilaterally in order to maintain leadership roles across donor and recipient initiatives and participate in discussions to address digital-literacy challenges such as infrastructure and access.

Second, the United States and its partners should learn from previous digital-transformation approaches and elevate and invest in skills development among vulnerable and excluded populations such as women and girls. In low- and middle-income countries, fewer than 50 percent of women have access to the internet, and far fewer have the skills to effectively and safely interact online, thus impeding their social and economic opportunities . Investment in general education, in addition to digital education, will also be critical to developing twenty-first-century literacy and skills. Digital literacy is a very nuanced topic with many different elements, but research on related programming and interventions is not as robust. USAID can promote and facilitate evidence-based learning around what types of interventions work best for digital literacy—for now there is a large amount of innovation in this field, but there is also a lack of scale.

Third, in partnership with the donor community, USAID should work to enhance digital skills in basic curriculums and identify critical gaps in education systems. The earlier digital education begins, the more attainable a high degree of digital literacy is. Digital education would improve the overall quality of life in low- to middle-income countries and equip future workforces with necessary skills in a rapidly digitizing world. Where possible, integrating technology and digital skills into curriculums will allow for early development of digital literacy, letting students familiarize themselves with modern methods of communication and accessing information. These initiatives will need to be adapted to different countries’ and communities’ local and cultural contexts to maximize learning impact and ensure minimal exclusion. USAID can also assist governments in establishing upskilling initiatives to train older workers and employers in how to integrate digital technologies into businesses and sectors. These initiatives should emphasize preventing “brain drain” and retaining local digital talent. Improving ICT infrastructure will also increase people’s ability to access programs and integrate digital skills into their daily lives.

Digitalization is no longer a sectoral issue but is all-encompassing across sectors and actors. In that regard, “the future is already here,” and investing in digital-literacy programs will be critical to establishing global leadership in the digital age. Meeting digital demands and supporting digital transitions worldwide will be essential for global development programming and for progress toward a free, sustainable, and equitable future. In a world that is increasingly online, access to technologies and the proper digital skills will be critical for countries’ development, security, and inclusion.

Romina Bandura is a senior fellow with the Project on Prosperity and Development and the Project on U.S. Leadership in Development  at the Center for Strategic and International Studies (CSIS) in Washington, D.C.  Elena I. Méndez Leal is a program coordinator and research assistant with the Project on Prosperity and Development at CSIS.

This brief is made possible by the generous support from USAID and DAI.

CSIS Briefs  are produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s).

© 2022 by the Center for Strategic and International Studies. All rights reserved.

Open access | Published: 08 June 2022

A systematic review on digital literacy

  • Hasan Tinmaz (ORCID: 0000-0003-4310-0848)
  • Yoo-Taek Lee (ORCID: 0000-0002-1913-9059)
  • Mina Fanea-Ivanovici (ORCID: 0000-0003-2921-2990)
  • Hasnan Baber (ORCID: 0000-0002-8951-3501)
Smart Learning Environments volume  9 , Article number:  21 ( 2022 ) Cite this article


The purpose of this study is to discover the main themes and categories of research studies on digital literacy. To serve this purpose, the databases of WoS/Clarivate Analytics, Proquest Central, Emerald Management Journals, Jstor Business College Collections and Scopus/Elsevier were searched with four keyword combinations, and a final set of forty-three articles was included in the dataset. The researchers applied a systematic literature review method to the dataset. The preliminary findings demonstrate a growing prevalence of digital literacy articles starting from the year 2013. The dominant research methodology of the reviewed articles is qualitative. The four major themes revealed by the qualitative content analysis are: digital literacy, digital competencies, digital skills and digital thinking. Under each theme, the categories and their frequencies are analysed. Recommendations for further research and for real-life implementations are generated.

Introduction

The extant literature on digital literacy, skills and competencies is rich in definitions and classifications, but there is still no consensus on the larger themes and the categories subsumed under them (Heitin, 2016 ). To exemplify, existing inventories of Internet skills suffer from ‘incompleteness and over-simplification, conceptual ambiguity’ (van Deursen et al., 2015 ), and Internet skills are only a part of digital skills. While there is already a plethora of research in this field, this paper aims to provide a general framework of digital areas and themes that can best describe digital (cap)abilities in the novel context of Industry 4.0 and the accelerated pandemic-triggered digitalisation. These areas and themes can represent the starting point for drafting a contemporary digital literacy framework.

Sousa and Rocha ( 2019 ) explained that digital skills are at stake for disruptive digital business, connecting them to the latest developments, such as the Internet of Things (IoT), cloud technology, big data, artificial intelligence, and robotics. The topic is all the more important given the large disparities in digital literacy across regions (Tinmaz et al., 2022 ). More precisely, digital inequalities encompass skills, along with access, usage and self-perceptions. These inequalities need to be addressed, as they are credited with a ‘potential to shape life chances in multiple ways’ (Robinson et al., 2015 ), e.g., academic performance, labour market competitiveness, health, and civic and political participation. Steps have been taken successfully to address physical access gaps, but skills gaps are still looming (Van Deursen & Van Dijk, 2010a ). Moreover, digital inequalities have grown larger due to the COVID-19 pandemic, and they have influenced the very state of health of the most vulnerable categories of the population and their employability at a time when digital skills are required (Baber et al., 2022 ; Beaunoyer, Dupéré & Guitton, 2020 ).

The systematic review the researchers propose is a useful, updated instrument of classification and inventory for digital literacy. Considering the latest developments in the economy, and in line with current digitalisation needs, a digitally literate population may assist policymakers in various fields, e.g., education, administration and the healthcare system, as well as managers of companies and other organisations that need to stay competitive and to employ a competitive workforce. Therefore, it is vital to comprehend the big picture of digital literacy related research.

Literature review

Since the advent of digital literacy, scholars have been concerned with identifying and classifying the various (cap)abilities related to its operation. From the most cited academic papers in this stream of research, several classifications of digital-related literacies, competencies, and skills have emerged.

Digital literacies

Digital literacy, which is one of the challenges of integrating technology in academic courses (Blau, Shamir-Inbal & Avdiel, 2020 ), has been defined in the current literature as the competencies and skills required for navigating a fragmented and complex information ecosystem (Eshet, 2004 ). A ‘Digital Literacy Framework’ was designed by Eshet-Alkalai ( 2012 ), comprising six categories: (a) photo-visual thinking (understanding and using visual information); (b) real-time thinking (simultaneously processing a variety of stimuli); (c) information thinking (evaluating and combining information from multiple digital sources); (d) branching thinking (navigating in non-linear hyper-media environments); (e) reproduction thinking (creating outcomes using technological tools by designing new content or remixing existing digital content); (f) social-emotional thinking (understanding and applying cyberspace rules). According to Heitin ( 2016 ), digital literacy groups the following clusters: (a) finding and consuming digital content; (b) creating digital content; (c) communicating or sharing digital content. Hence, the literature describes digital literacy in many ways, associating a set of various technical and non-technical elements.

Digital competencies

The Digital Competence Framework for Citizens (DigComp 2.1.), the most recent framework proposed by the European Union, which is currently under review and undergoing an updating process, contains five competency areas: (a) information and data literacy, (b) communication and collaboration, (c) digital content creation, (d) safety, and (e) problem solving (Carretero, Vuorikari & Punie, 2017 ). Digital competency had previously been described in a technical fashion by Ferrari ( 2012 ) as a set comprising information skills, communication skills, content creation skills, safety skills, and problem-solving skills, which later outlined the areas of competence in DigComp 2.1, too.

Digital skills

Ng ( 2012 ) pointed out the following three categories of digital skills: (a) technological (using technological tools); (b) cognitive (thinking critically when managing information); (c) social (communicating and socialising). A set of Internet skills was suggested by Van Deursen and Van Dijk ( 2009 , 2010b ), which contains: (a) operational skills (basic skills in using internet technology); (b) formal Internet skills (navigation and orientation skills); (c) information Internet skills (fulfilling information needs); and (d) strategic Internet skills (using the internet to reach goals). In 2014, the same authors added communication and content creation skills to the initial framework (van Dijk & van Deursen). Similarly, Helsper and Eynon ( 2013 ) put forward a set of four digital skills: technical, social, critical, and creative. Furthermore, van Deursen et al. ( 2015 ) built a set of items and factors to measure Internet skills: operational, information navigation, social, creative, and mobile. More recent literature (van Laar et al., 2017 ) divides digital skills into seven core categories: technical, information management, communication, collaboration, creativity, critical thinking, and problem solving.

It is worth mentioning that the various methodologies used to classify digital literacy are overlapping or non-exhaustive, which confirms the conceptual ambiguity mentioned by van Deursen et al. ( 2015 ).

Digital thinking

Thinking skills (along with digital skills) have been acknowledged to be a significant element of digital literacy in the educational process context (Ferrari, 2012 ). In fact, critical thinking, creativity, and innovation are at the very core of DigComp. Information and Communication Technology as a support for thinking is a learning objective in any school curriculum. In the same vein, analytical thinking and interdisciplinary thinking, which help solve problems, are yet other concerns of educators in the Industry 4.0 (Ozkan-Ozen & Kazancoglu, 2021 ).

However, we have recently witnessed a shift of focus from learning how to use information and communication technologies to using it while staying safe in the cyber-environment and being aware of alternative facts. Digital thinking would encompass identifying fake news, misinformation, and echo chambers (Sulzer, 2018 ). Not least important, concern about cybersecurity has grown especially in times of political, social or economic turmoil, such as the elections or the Covid-19 crisis (Sulzer, 2018 ; Puig, Blanco-Anaya & Perez-Maceira, 2021 ).

Ultimately, this systematic review focuses on the following major research questions:

Research question 1: What is the yearly distribution of digital literacy related papers?

Research question 2: What are the research methods for digital literacy related papers?

Research question 3: What are the main themes in digital literacy related papers?

Research question 4: What are the concentrated categories (under revealed main themes) in digital literacy related papers?

This study employed the systematic review method, whereby the authors scrutinized the existing literature around the major research question of digital literacy. As Uman ( 2011 ) pointed out, in a systematic literature review the findings of earlier research are examined to identify consistent and repetitive themes. The systematic review method differs from a traditional literature review in its well-managed and highly organized qualitative scrutiny processes; in a traditional literature review, researchers tend to cover fewer materials from a smaller number of databases (Kowalczyk & Truluck, 2013 ; Robinson & Lowe, 2015 ).

Data collection

To address the major research objectives, the following five databases were selected for their dominance in digital literacy focused research: 1. WoS/Clarivate Analytics; 2. Proquest Central; 3. Emerald Management Journals; 4. Jstor Business College Collections; 5. Scopus/Elsevier.

The search was made in the second half of June 2021, in abstracts and keywords written in English. We kept only research articles and book chapters (herein referred to as papers). Our purpose was to identify a set of digital literacy areas, or an inventory of such areas and topics. To that end, the systematic review used the following synonymous keywords to find the mainstream literature on the topic: ‘digital literacy’, ‘digital skills’, ‘digital competence’ and ‘digital fluency’. These keywords emerged from consultation with subject matter experts (two board members of the Korean Digital Literacy Association and two professors from a technology studies department). Below are the four keyword combinations used in the search: “Digital literacy AND systematic review”, “Digital skills AND systematic review”, “Digital competence AND systematic review”, and “Digital fluency AND systematic review”.

A sequential systematic search was made in the five databases mentioned above. From one database to the next, duplicate papers were manually excluded in a cascade manner to extract only unique results and to make the research smoother to conduct. At this stage, we kept 47 papers. Further exclusion criteria were then applied: only full-text items written in English were selected. In doing so, three papers were excluded because no full text was available, and one other paper was excluded because it was written in Spanish rather than English. We therefore investigated a total of 43 papers, as shown in Table 1 . “ Appendix A ” shows the list of these papers with full references.
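The cascading deduplication and exclusion steps described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the authors' actual tooling: the record fields (`title`, `full_text_available`, `language`) and the title-based deduplication key are assumptions made for the example.

```python
# Hypothetical sketch of the screening pipeline described in the text;
# field names and the title-based dedup key are illustrative assumptions.

def screen_papers(results_by_database):
    """Merge search results database by database, dropping duplicates,
    then apply the full-text and English-language exclusion criteria."""
    seen_titles = set()
    unique = []
    # Cascade: walk the databases in order, keeping only first occurrences.
    for database in results_by_database:
        for paper in results_by_database[database]:
            key = paper["title"].strip().lower()
            if key not in seen_titles:
                seen_titles.add(key)
                unique.append(paper)
    # Exclusion criteria: full text available and written in English.
    return [p for p in unique
            if p["full_text_available"] and p["language"] == "English"]

results = {
    "WoS": [
        {"title": "A", "full_text_available": True, "language": "English"},
        {"title": "B", "full_text_available": False, "language": "English"},
    ],
    "Scopus": [
        {"title": "A", "full_text_available": True, "language": "English"},  # duplicate
        {"title": "C", "full_text_available": True, "language": "Spanish"},
    ],
}
kept = screen_papers(results)
print([p["title"] for p in kept])  # → ['A']
```

In this toy run, the duplicate of "A" is dropped in the cascade, "B" fails the full-text criterion, and "C" fails the language criterion, mirroring the 47-to-43 narrowing described above (with invented numbers).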

Data analysis

The 43 papers selected after the application of the inclusion and exclusion criteria were reviewed independently by two researchers from two different countries. The researchers identified all topics pertaining to digital literacy as they appeared in the papers. Next, a third researcher independently analysed these findings and excluded duplicates. A qualitative content analysis was manually performed by calculating the frequency of major themes across all papers, with the raw data compared and contrasted (Fraenkel et al., 2012 ). All three reviewers independently listed the words and the contexts in which they appeared, and then collectively decided how each should be categorized. Lastly, it is vital to note that the literature review of this article was written after the themes had been identified through our qualitative analyses. The authors therefore decided to shape the structure of the literature review around these themes.
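The frequency tallying at the heart of this content analysis (the "n = …" counts reported throughout the findings) can be sketched with Python's `collections.Counter`. The labels below are invented examples, not the study's actual codes.

```python
from collections import Counter

# Illustrative sketch of the frequency count behind the content analysis:
# the reviewers' pooled coded labels are tallied and reported as "n = ...".
# These labels are invented examples, not the study's real coding data.
coded_labels = [
    "digital competence", "problem solving", "digital competence",
    "safety", "critical thinking", "digital competence", "safety",
]

frequencies = Counter(coded_labels)
# Report categories ordered by how often they appeared across papers.
for category, n in frequencies.most_common():
    print(f"{category} (n = {n})")
```

With this toy input the report starts with "digital competence (n = 3)", which is the shape of the category tables presented below.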

As an answer to the first research question (the yearly distribution of digital literacy related papers), Fig.  1 demonstrates this distribution. An increasing trend in digital literacy papers can be seen.

figure 1

Yearly distribution of digital literacy related papers

Research question two (the research methods for digital literacy related papers) concentrates on which research methods were employed in these papers. As Fig.  2 shows, most of the papers used the qualitative method. (‘Not stated’ refers to book chapters.)

figure 2

Research methods used in the reviewed articles

When the forty-three articles were analysed for the main themes, as per research question three (the main themes in digital literacy related papers), the overall findings were categorized around four major themes: (i) literacies, (ii) competencies, (iii) skills, and (iv) thinking. Under every major theme, the categories were listed and explained, as per research question four (the concentrated categories under the revealed main themes).

The authors utilized an overt categorization for the depiction of these major themes. For example, when ‘creativity’ was labelled as a skill, the authors categorized it under the ‘skills’ theme. Similarly, when ‘creativity’ was mentioned as a competency, the authors listed it under the ‘competencies’ theme. Therefore, it is possible to find the same finding under different major themes.
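This overt categorization rule is a simple multi-label mapping: each mention is filed under the theme its source paper used, so one label can legitimately appear under several themes. A minimal sketch, with invented mentions:

```python
from collections import defaultdict

# Sketch of the 'overt categorization' rule described above: a finding is
# filed under the theme its source paper used, so 'creativity' can appear
# under both 'skills' and 'competencies'. The mentions are invented examples.
mentions = [
    ("creativity", "skills"),
    ("creativity", "competencies"),
    ("problem solving", "competencies"),
]

themes = defaultdict(set)
for label, theme in mentions:
    themes[theme].add(label)

print(sorted(themes["skills"]))        # → ['creativity']
print(sorted(themes["competencies"]))  # → ['creativity', 'problem solving']
```

Using sets per theme keeps each label unique within a theme while still allowing it to recur across themes, which is exactly the behaviour the categorization rule calls for.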

Major theme 1: literacies

Digital literacy, being the major concern of this paper, was observed to be explicitly mentioned in five papers out of forty-three. One of these articles described digital literacy as the human proficiency to live, learn and work in the current digital society. In addition to these five articles, two further papers used the term ‘critical digital literacy’, describing it as the level at which a person or a society can access, assess and interact with digital technologies to utilize and/or create information. Table 2 summarizes the major categories under the ‘Literacies’ major theme.

Computer literacy, media literacy and cultural literacy were the second most common literacies (n = 5). One of the articles divides computer literacy into tool literacy (detailing software and hardware uses) and resource literacy (focusing on the information processing capacity of a computer). Cultural literacy was emphasized as a vital element for functioning in an intercultural team on a digital project.

Disciplinary literacy (n = 4) referred to utilizing different computer programs (n = 2) or technical gadgets (n = 2), with a specific emphasis on the cognitive, affective and psychomotor skills required to work in any digital context (n = 3), serving the use (n = 2), creation and application (n = 2) of digital literacy in real life.

Data literacy, technology literacy and multiliteracy were the third most frequent categories (n = 3). ‘Multiliteracy’ referred to the innate nature of digital technologies, which have been infused into many aspects of human lives.

Last but not least, Internet literacy, mobile literacy, web literacy, new literacy, personal literacy and research literacy were also discussed in the findings of the forty-three articles. Web literacy focused on being able to connect with people on the web (n = 2), discover web content (especially navigation on a hyper-textual platform), and learn web related skills through practical web experiences. Personal literacy highlighted digital identity management. Research literacy concentrated not only on the ability to conduct scientific research but also on finding available scholarship online.

Twenty-four other categories emerged from the results sections of the forty-three articles. Table 3 presents the list of these other literacies, which the authors sorted in ascending alphabetical order without any other sorting criterion. Primarily, search, tagging, filtering and attention literacies mainly underlined their roles in information processing. Furthermore, social-structural literacy was indicated as the recognition of the social circumstances and generation of information. Another information-related literacy was publishing literacy, the ability to disseminate information via different digital channels.

While the personal literacy listed above referred to digital identity management, network literacy was explained as someone’s ability to manage digital relationships with other people through social networking. Additionally, participatory literacy was defined as the abilities necessary to join an online team working on online content production.

Emerging technology literacy was stipulated as an essential ability to recognize and appreciate the most recent and innovative technologies, along with making smart choices related to them. Additionally, critical literacy was added as the ability to make smart judgements on the cost-benefit analysis of these recent technologies.

Last of all, basic, intermediate, and advanced digital assessment literacies were specified for educational institutions that plan to integrate various digital tools to conduct instructional assessments.

Major theme 2: competencies

The second major theme revealed was competencies. The authors directly categorized the findings specified with the word ‘competency’. Table 4 summarizes the entire category set for the competencies major theme.

The most common category was ‘digital competence’ (n = 14), where one of the articles refers to this category as ‘generic digital competence’, meaning someone’s creativity for multimedia development (video editing was emphasized). Under this broad category, the following sub-categories were associated:

Problem solving (n = 10)

Safety (n = 7)

Information processing (n = 5)

Content creation (n = 5)

Communication (n = 2)

Digital rights (n = 1)

Digital emotional intelligence (n = 1)

Digital teamwork (n = 1)

Big data utilization (n = 1)

Artificial Intelligence utilization (n = 1)

Virtual leadership (n = 1)

Self-disruption (in line with the pace of digitalization) (n = 1)

In addition to ‘digital competency’, five further articles specifically used the term ‘digital competence as a life skill’. Deeper analysis demonstrated the following points: social competences (n = 4), communication in the mother tongue (n = 3) and a foreign language (n = 2), entrepreneurship (n = 3), civic competence (n = 2), fundamental science (n = 1), technology (n = 1) and mathematics (n = 1) competences, learning to learn (n = 1) and self-initiative (n = 1).

Moreover, three articles linked competencies to workplace digital competencies, highlighted as significant for employability (n = 3) and ‘economic engagement’ (n = 3). Digital competencies were also detailed for leisure (n = 2) and communication (n = 2). Furthermore, two articles framed digital competence as an inter-cultural competency and one as a cross-cultural competency. Lastly, ‘digital nativity’ (n = 1) was clarified as someone’s innate competency of feeling contented and satisfied with digital technologies.

Major theme 3: skills

The third major theme observed was ‘skills’, which was dominantly gathered around information literacy skills (n = 19) and information and communication technologies skills (n = 18). Table 5 demonstrates the categories with more than one occurrence.

Table 6 summarizes the sub-categories of the two most frequent categories of the ‘skills’ major theme. The information literacy skills noticeably concentrate on the steps of information processing: evaluating (n = 6), utilizing (n = 4), finding (n = 3), and locating (n = 2) information. Moreover, the importance of the trial/error process, being a lifelong learner, feeling a need for information and so forth were evidently listed under this sub-category. On the other hand, ICT skills were grouped around the cognitive and affective domains. For instance, while technical skills in general, and the use of social media, coding, multimedia, chat or email in particular, were reported in the cognitive domain, attitudes, intentions, and beliefs towards ICT were mentioned as elements of the affective domain.

Communication skills (n = 9) were multi-dimensional across different societies, cultures, and globalized contexts, requiring linguistic skills. Collaboration skills (n = 9) were also recurrently cited, with an explicit emphasis on virtual platforms.

‘Ethics for the digital environment’ encapsulated the ethical use of information (n = 4) and of different technologies (n = 2), knowing digital laws (n = 2) and responsibilities (n = 2) along with digital rights and obligations (n = 1), having digital awareness (n = 1), following digital etiquette (n = 1), and treating other people with respect (n = 1), including no cyber-bullying (n = 1) and no stealing from or damaging other people (n = 1).

‘Digital fluency’ involved digital access (n = 2) through using different software and hardware (n = 2) on online platforms (n = 1), communication tools (n = 1) or programming environments (n = 1). Digital fluency also underlined following recent technological advancements (n = 1) and knowledge (n = 1), including the digital health and wellness (n = 1) dimension.

‘Social intelligence’ related to understanding digital culture (n = 1), the concept of digital exclusion (n = 1) and the digital divide (n = 3). ‘Research skills’ were detailed as searching for academic information (n = 3) in databases such as Web of Science and Scopus (n = 2) and its citation, summarization, and quotation (n = 2).

‘Digital teaching’ was described as a skill (n = 2) in Table 5 whereas it was also labelled as a competence (n = 1) in Table 4 . Similarly, while learning to learn (n = 1) was coined under competencies in Table 4 , digital learning (n = 2, Table 5 ) and life-long learning (n = 1, Table 5 ) were stated as learning related skills. Moreover, learning was used with the following three terms: learning readiness (n = 1), self-paced learning (n = 1) and learning flexibility (n = 1).

Table 7 shows the other categories listed under the ‘skills’ major theme. The list covers not only software such as GIS, text mining, mapping, or bibliometric analysis programs but also conceptual skills such as the fourth industrial revolution and information management.

Major theme 4: thinking

The last identified major theme was the different types of ‘thinking’. As Table 8 shows, ‘critical thinking’ was the most frequent thinking category (n = 4). Except computational thinking, the other categories were not detailed.

Computational thinking (n = 3) was associated with the general logic of how a computer works and was sub-categorized into the following steps: construction of the problem (n = 3), abstraction (n = 1), disintegration of the problem (n = 2), data collection (n = 2), data analysis (n = 2), algorithmic design (n = 2), parallelization and iteration (n = 1), automation (n = 1), generalization (n = 1), and evaluation (n = 2).

A transversal analysis of digital literacy categories reveals the following fields of digital literacy application:

Technological advancement (IT, ICT, Industry 4.0, IoT, text mining, GIS, bibliometric analysis, mapping data, technology, AI, big data)

Networking (Internet, web, connectivity, network, safety)

Information (media, news, communication)

Creative-cultural industries (culture, publishing, film, TV, leisure, content creation)

Academia (research, documentation, library)

Citizenship (participation, society, social intelligence, awareness, politics, rights, legal use, ethics)

Education (life skills, problem solving, teaching, learning, education, lifelong learning)

Professional life (work, teamwork, collaboration, economy, commerce, leadership, decision making)

Personal level (critical thinking, evaluation, analytical thinking, innovative thinking)

This systematic review of digital literacy concentrated on forty-three articles from the databases of WoS/Clarivate Analytics, Proquest Central, Emerald Management Journals, Jstor Business College Collections and Scopus/Elsevier. The initial results revealed an increasing trend in digital literacy focused academic papers. Research on digital literacy is critical in a context of disruptive digital business and, more recently, the pandemic-triggered accelerated digitalisation (Beaunoyer, Dupéré & Guitton, 2020 ; Sousa & Rocha 2019 ). Moreover, most of these papers employed qualitative research methods. The raw data of these articles were analysed qualitatively using a systematic literature review to reveal major themes and categories. The four major themes that appeared are: digital literacy, digital competencies, digital skills and digital thinking.

Whereas the mainstream literature describes digital literacy as a set of photo-visual, real-time, information, branching, reproduction and social-emotional thinking (Eshet-Alkalai, 2012 ) or as a set of specific operations, i.e., finding, consuming, creating, communicating and sharing digital content (Heitin, 2016 ), this study reveals that digital literacy revolves around, and is connected with, the concepts of computer literacy, media literacy, cultural literacy and disciplinary literacy. In other words, the present systematic review indicates that digital literacy is far broader than specific tasks, encompassing the entire sphere of computer operation and media use in a cultural context.

The digital competence yardstick, DigComp (Carretero, Vuorikari & Punie, 2017 ) suggests that the main digital competencies cover information and data literacy, communication and collaboration, digital content creation, safety, and problem solving. Similarly, the findings of this research place digital competencies in relation to problem solving, safety, information processing, content creation and communication. Therefore, the findings of the systematic literature review are, to a large extent, in line with the existing framework used in the European Union.

The investigation of the main keywords associated with digital skills has revealed that information literacy, ICT, communication, collaboration, digital content creation, research and decision-making skills are the most representative. In a structured way, the existing literature groups these skills into technological, cognitive, and social (Ng, 2012 ) or, more extensively, into operational, formal, information Internet, strategic, communication and content creation skills (van Dijk & van Deursen, 2014 ). In time, the literature has become richer in frameworks, and prolific authors have refined their results. As such, more recent research (van Laar et al., 2017 ) uses the following categories: technical, information management, communication, collaboration, creativity, critical thinking, and problem solving.

Whereas digital thinking was observed to be mostly related to critical thinking and computational thinking, DigComp connects it with critical thinking, creativity, and innovation, on the one hand, while researchers highlight fake news, misinformation, cybersecurity, and echo chambers as exponents of digital thinking, on the other (Sulzer, 2018 ; Puig, Blanco-Anaya & Perez-Maceira, 2021 ).

This systematic review offers an initial step and guideline for the development of a more contemporary digital literacy framework, including the major themes and factors of digital literacy. The researchers provide the following recommendations for both researchers and practitioners.

Recommendations for prospective research

Considering the prevailing qualitative research trend, it seems apparent that more quantitative research-oriented studies are needed. Although they require more effort and time, mixed-method studies will help understand digital literacy holistically.

As digital literacy is an umbrella term for many different technologies, specific case studies need to be designed, such as digital literacy for artificial intelligence or digital literacy for drone usage.

Digital literacy affects different areas of human lives, such as education, business, health, governance, and so forth. Therefore, different case studies could be carried out for each of these unique dimensions of our lives. For instance, it is worth investigating the role of digital literacy on lifelong learning in particular, and on education in general, as well as the digital upskilling effects on the labour market flexibility.

Further experimental studies on digital literacy are necessary to understand how certain variables (for instance, age, gender, socioeconomic status, cognitive abilities, etc.) affect this concept overtly or covertly. Moreover, the digital divide issue needs to be analysed through the lens of its main determinants.

New bibliometric analysis methods can be applied to digital literacy documents to reveal more about how these works are related and what major topics they centre on. This visual approach will help reveal the big picture within the digital literacy framework.

Recommendations for practitioners

Digital literacy stakeholders, such as policymakers in education and managers in private organizations, need to be aware that the implementation of digital literacy involves many dimensions and variables. Stakeholders must therefore understand their beneficiaries or participants more deeply to increase the effect of digital literacy activities. For example, critical thinking and problem-solving skills are reported to affect digital literacy; hence, stakeholders should first establish whether participants have sufficient entry-level critical thinking and problem-solving skills.

Developing digital literacy for different groups of people requires more effort, since each group might need a different set of skills, abilities, or competencies. Hence, different subject matter experts, such as technologists, instructional designers, and content experts, should join the team.

It is vital to develop different digital literacy frameworks for different technologies (basic or advanced) and different contexts (levels of schooling, various industries).

These frameworks should be updated regularly, as digital fields evolve rapidly. Committees should convene every year to review new technological trends and decide whether to incorporate the changes into their frameworks.

A thorough understanding of digital literacy can enable decision makers to correctly implement and apply policies addressing the digital divide, which is reflected in various aspects of life (e.g., health, employment, education), especially in turbulent times such as the COVID-19 pandemic.

Lastly, it is essential to state the study's limitations. This study is limited to the analysis of a certain number of papers, obtained using the selected keywords and databases. It could therefore be extended by adding other keywords and searching other databases.

Availability of data and materials

The authors present the articles used for the study in "Appendix A".

References

Baber, H., Fanea-Ivanovici, M., Lee, Y. T., & Tinmaz, H. (2022). A bibliometric analysis of digital literacy research and emerging themes pre-during COVID-19 pandemic. Information and Learning Sciences. https://doi.org/10.1108/ILS-10-2021-0090

Beaunoyer, E., Dupéré, S., & Guitton, M. J. (2020). COVID-19 and digital inequalities: Reciprocal impacts and mitigation strategies. Computers in Human Behavior, 111, 106424. https://doi.org/10.1016/j.chb.2020.106424

Blau, I., Shamir-Inbal, T., & Avdiel, O. (2020). How does the pedagogical design of a technology-enhanced collaborative academic course promote digital literacies, self-regulation, and perceived learning of students? The Internet and Higher Education, 45, 100722. https://doi.org/10.1016/j.iheduc.2019.100722

Carretero, S., Vuorikari, R., & Punie, Y. (2017). DigComp 2.1: The Digital Competence Framework for Citizens with eight proficiency levels and examples of use (No. JRC106281). Joint Research Centre. https://publications.jrc.ec.europa.eu/repository/handle/JRC106281

Eshet, Y. (2004). Digital literacy: A conceptual framework for survival skills in the digital era. Journal of Educational Multimedia and Hypermedia, 13(1), 93–106. https://www.learntechlib.org/primary/p/4793/

Eshet-Alkalai, Y. (2012). Thinking in the digital era: A revised model for digital literacy. Issues in Informing Science and Information Technology, 9(2), 267–276. https://doi.org/10.28945/1621

Ferrari, A. (2012). Digital competence in practice: An analysis of frameworks. JRC IPTS, Sevilla. https://ifap.ru/library/book522.pdf

Fraenkel, J. R., Wallen, N. E., & Hyun, H. H. (2012). How to design and evaluate research in education (8th ed.). McGraw Hill.

Heitin, L. (2016). What is digital literacy? Education Week. https://www.edweek.org/teaching-learning/what-is-digital-literacy/2016/11

Helsper, E. J., & Eynon, R. (2013). Distinct skill pathways to digital engagement. European Journal of Communication, 28(6), 696–713. https://doi.org/10.1177/0267323113499113

Kowalczyk, N., & Truluck, C. (2013). Literature reviews and systematic reviews: What is the difference? Radiologic Technology, 85(2), 219–222.

Ng, W. (2012). Can we teach digital natives digital literacy? Computers & Education, 59(3), 1065–1078. https://doi.org/10.1016/j.compedu.2012.04.016

Ozkan-Ozen, Y. D., & Kazancoglu, Y. (2021). Analysing workforce development challenges in the Industry 4.0. International Journal of Manpower. https://doi.org/10.1108/IJM-03-2021-0167

Puig, B., Blanco-Anaya, P., & Perez-Maceira, J. J. (2021). "Fake News" or real science? Critical thinking to assess information on COVID-19. Frontiers in Education, 6, 646909. https://doi.org/10.3389/feduc.2021.646909

Robinson, L., Cotten, S. R., Ono, H., Quan-Haase, A., Mesch, G., Chen, W., Schulz, J., Hale, T. M., & Stern, M. J. (2015). Digital inequalities and why they matter. Information, Communication & Society, 18(5), 569–582. https://doi.org/10.1080/1369118X.2015.1012532

Robinson, P., & Lowe, J. (2015). Literature reviews vs systematic reviews. Australian and New Zealand Journal of Public Health, 39(2), 103. https://doi.org/10.1111/1753-6405.12393

Sousa, M. J., & Rocha, A. (2019). Skills for disruptive digital business. Journal of Business Research, 94, 257–263. https://doi.org/10.1016/j.jbusres.2017.12.051

Sulzer, A. (2018). (Re)conceptualizing digital literacies before and after the election of Trump. English Teaching: Practice & Critique, 17(2), 58–71. https://doi.org/10.1108/ETPC-06-2017-0098

Tinmaz, H., Fanea-Ivanovici, M., & Baber, H. (2022). A snapshot of digital literacy. Library Hi Tech News (ahead-of-print).

Uman, L. S. (2011). Systematic reviews and meta-analyses. Journal of the Canadian Academy of Child and Adolescent Psychiatry, 20(1), 57–59.

Van Deursen, A. J. A. M., Helsper, E. J., & Eynon, R. (2015). Development and validation of the Internet Skills Scale (ISS). Information, Communication & Society, 19(6), 804–823. https://doi.org/10.1080/1369118X.2015.1078834

Van Deursen, A. J. A. M., & van Dijk, J. A. G. M. (2009). Using the internet: Skills related problems in users' online behaviour. Interacting with Computers, 21, 393–402. https://doi.org/10.1016/j.intcom.2009.06.005

Van Deursen, A. J. A. M., & van Dijk, J. A. G. M. (2010a). Measuring internet skills. International Journal of Human-Computer Interaction, 26(10), 891–916. https://doi.org/10.1080/10447318.2010.496338

Van Deursen, A. J. A. M., & van Dijk, J. A. G. M. (2010b). Internet skills and the digital divide. New Media & Society, 13(6), 893–911. https://doi.org/10.1177/1461444810386774

van Dijk, J. A. G. M., & Van Deursen, A. J. A. M. (2014). Digital skills: Unlocking the information society. Palgrave Macmillan.

van Laar, E., van Deursen, A. J. A. M., van Dijk, J. A. G. M., & de Haan, J. (2017). The relation between 21st-century skills and digital skills: A systematic literature review. Computers in Human Behavior, 72, 577–588. https://doi.org/10.1016/j.chb.2017.03.010


Funding

This research is funded by Woosong University Academic Research in 2022.

Author information

Authors and Affiliations

AI & Big Data Department, Endicott College of International Studies, Woosong University, Daejeon, South Korea

Hasan Tinmaz

Endicott College of International Studies, Woosong University, Daejeon, South Korea

Yoo-Taek Lee

Department of Economics and Economic Policies, Bucharest University of Economic Studies, Bucharest, Romania

Mina Fanea-Ivanovici

Abu Dhabi School of Management, Abu Dhabi, United Arab Emirates

Hasnan Baber


Contributions

The authors worked together on the manuscript equally. All authors have read and approved the final manuscript.

Corresponding author

Correspondence to Hasnan Baber.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Cite this article

Tinmaz, H., Lee, Y. T., Fanea-Ivanovici, M., et al. (2022). A systematic review on digital literacy. Smart Learning Environments, 9, 21. https://doi.org/10.1186/s40561-022-00204-y

Received: 23 February 2022
Accepted: 01 June 2022
Published: 08 June 2022


Keywords

  • Digital literacy
  • Systematic review
  • Qualitative research


Discover digital literacy courses and resources

What is digital literacy?

Expand economic opportunity for everyone

Digital literacy can play a powerful role in helping people connect, learn, engage with their community, and create more promising futures. Digital literacy is more than simply reading articles online; it also means understanding the variety of content and possibilities accessible online. This digital literacy course can help individuals gain the digital skills necessary to participate in a digital economy and improve their livelihoods. The digital literacy program is used by individuals, nonprofits, schools, and governments all over the world.

Start the digital literacy learning pathway

Microsoft Digital Literacy is for anyone with basic reading skills who wants to learn the fundamentals of using digital technologies. Get started with LinkedIn Learning courses.

System requirements

Additional digital literacy resources

Digital literacy frequently asked questions

Find additional information about digital literacy course requirements, plans for course updates, and other information specific to completing the Digital Literacy course.

Creative Commons license

Microsoft Digital Literacy is licensed under the Creative Commons Attribution Non-Commercial Share-Alike License, which means you may share and adapt this material for non-commercial use as long as you attribute it to Microsoft and license your adapted material under the same terms.

Localized resources and courses

Arabic course resources | موارد الدورة التدريبية باللغة العربية

  • الدورة التدريبية 1 - العمل باستخدام أجهزة الكمبيوتر
  • الدورة التدريبية 2 - الوصول إلى المعلومات عبر الإنترنت
  • الدورة التدريبية 3 - التواصل عبر الإنترنت
  • الدورة التدريبية 4 - المشاركة بأمان ومسؤولية على الإنترنت
  • الدورة التدريبية 5 - إنشاء محتوى رقمي
  • الدورة التدريبية 6 - التعاون وإدارة المحتوى رقميًا

Chinese Simplified course resources | 简体中文课程资源

  • 课程 1 - 使用计算机进行工作
  • 课程 2 - 在线访问信息
  • 课程 3 - 在线交流
  • 课程 4 - 安全且负责任地参与线上活动
  • 课程 5 - 创作数字内容
  • 课程 6 - 以数字方式协作并管理内容

English course resources

  • Course 1 - Work with computers
  • Course 2 - Access information online
  • Course 3 - Communicate online
  • Course 4 - Participate safely and responsibly online
  • Course 5 - Create digital content
  • Course 6 - Collaborate and manage content digitally
  • Transcript files
  • Teacher Resource files

English SCORM packages

Download the English Digital Literacy SCORM packages by course module.

  • Work with Computers
  • Access information online
  • Communicate online
  • Participate safely and responsibly online
  • Create digital content
  • Collaborate and manage content digitally

French course resources | Détails sur les cours en français

  • Cours 1 - Travailler sur ordinateur
  • Cours 2 - Consulter des informations en ligne
  • Cours 3 - Communiquer en ligne
  • Cours 4 - Travailler en ligne de manière sûre et responsable
  • Cours 5 - Créer des contenus numériques
  • Cours 6 - Collaborer sur des contenus numériques et les gérer
  • Fichiers de transcription

Édition Web de LinkedIn Learning

  • Travailler avec des ordinateurs et des périphériques informatiques
  • Travailler et collaborer en ligne

German course resources | Kursressourcen Deutsch

  • Kurs 1 – Arbeiten mit Computern
  • Kurs 2 – Informationen online abrufen
  • Kurs 3 – Online-Kommunikation
  • Kurs 4 – Sicher und verantwortungsvoll online teilnehmen
  • Kurs 5 – Erstellen digitaler Inhalte
  • Kurs 6 – Digitale Inhalte im Team bearbeiten und verwalten
  • Transcript-Dateien

LinkedIn Learning Web-Edition

  • Mit Computern und Mobilgeräten arbeiten
  • Online arbeiten: allein und im Team

Indonesian course resources | Referensi kursus bahasa Indonesia

  • Kursus 1 - Menggunakan komputer (Bagian 1)
  • Kursus 1 - Menggunakan komputer (Bagian 2)
  • Kursus 2 - Mengakses informasi secara online
  • Kursus 3 - Berkomunikasi lewat online
  • Kursus 4 - Berpartisipasi dengan aman dan bertanggung jawab secara online
  • Kursus 5 - Membuat konten digital
  • Kursus 6 - Berkolaborasi dan mengelola konten secara digital
  • File transkrip

Japanese course resources | 日本語コースのリソース

  • コース 1 - コンピューターの操作
  • コース 2 - オンラインでの情報へのアクセス
  • コース 3 - オンラインでのコミュニケーション
  • コース 4 - オンラインでの安全かつ責任ある参加
  • コース 5 - デジタル コンテンツの作成
  • コース 6 - デジタルでのコンテンツのコラボレーションおよび管理
  • トランスクリプト ファイル

Web 版 LinkedIn Learning

  • パソコンやモバイルデバイスを仕事で使うには
  • オンラインで共同作業を行うには

Portuguese (Brazil) course resources | Recursos do curso em português (Brasil)

  • Curso 1 - Trabalhar com computadores
  • Curso 2 - Acessar informações online
  • Curso 3 - Comunicar-se online
  • Curso 4 - Participar da comunidade online com segurança e responsabilidade
  • Curso 5 - Criar conteúdo digital
  • Curso 6 ‑ Colaborar e gerenciar digitalmente conteúdo
  • Arquivos de transcrição

Edição Web do LinkedIn Learning

  • Como Trabalhar com Computadores e Dispositivos Móveis
  • Como Trabalhar e Colaborar On-line

Portuguese (Portugal) course resources | Recursos dos cursos em português (Portugal)

  • Curso 2 - Aceder às informações online
  • Curso 3 - Comunicar online
  • Curso 4 - Participar online de forma segura e responsável
  • Curso 5 - Criar conteúdos digitais
  • Curso 6 - Colaborar e gerir conteúdos digitalmente
  • Ficheiros de transcrição

Spanish course resources | Recursos de los cursos en español

  • Curso 1: Trabajo con ordenadores
  • Curso 2: Acceso a información online
  • Curso 3: Comunicación online
  • Curso 4: Participación online de forma segura y responsable
  • Curso 5: Creación de contenido digital
  • Curso 6: Colaboración y gestión digital del contenido
  • Archivos de transcripciones

LinkedIn Learning edición web

  • Trabajando con computadoras y dispositivos
  • Trabajando y colaborando online

Vietnamese course resources | Tài nguyên khóa học tiếng Việt

  • Khóa học 1 - Làm việc với máy tính (Phần 1)
  • Khóa học 1 - Làm việc với máy tính (Phần 2)
  • Khóa 2 - Truy cập thông tin trực tuyến
  • Khóa 3 - Giao tiếp trực tuyến
  • Khóa 4 - Tham gia an toàn và có trách nhiệm trên mạng
  • Khóa 5 - Tạo nội dung kỹ thuật số
  • Khóa 6 - Cộng tác và quản lý nội dung kỹ thuật số
  • Tệp bảng điểm


Development of a digital literacy measurement tool for middle and high school students in the context of scientific practice

  • Open access
  • Published: 31 August 2024


  • Mihyun Son, ORCID: orcid.org/0000-0002-0093-305X
  • Minsu Ha, ORCID: orcid.org/0000-0003-3087-3833

Digital literacy is essential for scientific literacy in a digital world. Although the NGSS Practices include many activities that require digital literacy, most studies have examined digital literacy from a generic perspective rather than a curricular context. This study aimed to develop a self-report tool to measure elements of digital literacy among middle and high school students in the context of science practice. Using Messick's validity framework, Rasch analysis was conducted to ensure the tool's validity. Initial items were developed from the NGSS, KSES, and other countries' curricula and related research literature. The final 38 items were expertly reviewed by scientists and applied to 1194 students for statistical analysis. The results indicated that the tool could be divided into five dimensions of digital literacy in the context of science practice: collecting and recording data, analyzing and interpreting (statistics), analyzing and interpreting (tools), generating conclusions, and sharing and presenting. Item fit and reliability were analyzed. The study found that most items did not show significant gender or school level differences, but scores increased with grade level. Boys tended to perform better than girls, and this difference did not change with grade level. Analysis and Interpretation (Tools) showed the largest differences across school levels. The developed measurement tool suggests that digital literacy in the context of science practice is distinct from generic digital literacy, requiring a multi-contextual approach to teaching. Furthermore, the gender gap was evident in all areas and did not decrease with higher school levels, particularly in STEM-related items like math and computational languages, indicating a need for focused education for girls. The tool developed in this study can serve as a baseline for teachers to identify students' levels and for students to set learning goals. It provides information on how digital literacy can be taught within a curricular context.


1 Introduction

Fostering scientific literacy is one of the most important goals of science education, and scientific literacy has been defined in various ways. Scientific literacy is sometimes described as the ability to use evidence and data to evaluate the quality of scientific information and claims presented by scientists and the media (NRC, 1996 ), or as the ability to understand and make decisions about changes in the world by drawing evidence-based conclusions using scientific knowledge (AAAS, 1993 ). The Next Generation Science Standards (NGSS), which can be considered a representative guideline for the direction of science education in the United States, also mentions scientific literacy in terms of students' understanding of scientific concepts, evaluation of scientific claims based on evidence, and participation in scientific practices (NRC, 2013 ). In this way, scientific literacy is the most fundamental competency for understanding the world and for continuous scientific engagement. In this context, digital literacy is essential for fostering scientific literacy in the digital world (Demirbag & Bahcivan, 2021 ; Mason et al., 2010 ; Walraven et al., 2009 ).

Korean science education research also emphasizes digital literacy in the context of scientific practices. The Korean Science Education Standards (KSES) divides scientific literacy into three dimensions: competences dimension, knowledge dimension, and participation & action dimension. Among these, the competences dimension includes scientific inquiry ability, scientific thinking ability, communication and collaboration ability, information processing and decision-making ability, and lifelong learning ability in a hyper-connected society (MOE et al. 2019 ). These five areas encompass both the skills traditionally emphasized in science education and those anticipated to be necessary in the future society characterized by the digital revolution. For instance, within the scientific inquiry ability, there are skills such as data transformation, engineering design and creation, and explanation generation and argumentation. Additionally, scientific thinking ability includes mathematical and computational thinking, while communication and collaboration ability includes the ability to express ideas. The 'information processing and decision-making ability' within the competences dimension involves the ability to search for, select, produce, and evaluate information and data. The emphasis on the importance of digital literacy and its integration with subject education can also be found in the curricula of various countries such as Singapore and Europe, as well as in reports from organizations like the OECD (Ei & Soon, 2021 ; Erstad et al., 2021 ; Polizzi, 2020 ).

The trend of science education reform is calling for changes in the relationship between scientific knowledge and scientific methods in science learning (Kawasaki & Sandoval, 2020 ). First, when students handle actual data, learning experiences related to data utilization skills can occur, ultimately aiming to cultivate scientific thinking and problem-solving abilities. The actual data used by students can take various forms, such as data collected by students in inquiry projects, searches in online data repositories, illustrations and tables in textbooks, or scientific publications (Hug & McNeill, 2008 ; Kerlin et al., 2010 ). Students need to select appropriate data from these sources, classify it according to their objectives, and develop skills in collecting, storing, representing, and sharing data. In this process, they should be able to engage in activities such as data analysis and interpretation, utilizing mathematical and computational thinking, and participating in evidence-based arguments (NGSS Lead States, 2013 ; NRC, 2013 ).

Additionally, basic computational thinking is necessary to understand and solve socio-scientific issues related to real life. This requires the ability to use algorithmic thinking, data analysis and representation for modeling thinking, and simulation tools (Rodríguez-Becerra et al., 2020 ). The importance of computers in scientific inquiry has grown due to advancements in artificial intelligence, software platforms, and sensors. While there have been limitations in science education due to the lack of various data sets, the proliferation of sensors has made personalized data collection possible, facilitating the collection of data relevant to scientific inquiry contexts. Furthermore, the establishment of platforms for data sharing and environments that facilitate data analysis and visualization have made computer and digital-based scientific inquiry representative activities of scientific practice.

Digital literacy refers not only to the basic skills related to using digital devices but also to the complex skills that support learners by enhancing learning outcomes in digital environments. These skills include cognitive, social, and emotional skills (Eshet-Alkalai & Soffer, 2012). The meaning of digital literacy has expanded to include communication and content production using information and communication technology (ICT) (Mason et al., 2018), information retrieval and processing through new technologies (Siddiq et al., 2016), and communication with communities (da Silva & Heaton, 2017). Various countries and research organizations have presented diverse aspects of digital literacy, which commonly include three main elements: 1) information and data, 2) communication and collaboration, and 3) technical skills (Bravo et al., 2021). These three elements largely overlap with the components of scientific literacy, indicating that digital literacy can be integrated with subject-specific digital competence education (Kotzebue et al., 2021).

Based on the relationship between these two literacies, many scholars have continued efforts to understand scientific literacy through digital literacy (Bliss, 2019 ; Da Silva & Heaton, 2017 ; Holincheck et al., 2022 ; Mardiani et al., 2024 ). They have introduced terms such as digital scientific literacy (Holincheck et al., 2022 ), aimed to develop critical evaluation skills for digital scientific materials (Bliss, 2019 ; Holincheck et al., 2022 ), engaged in inquiry activities using digital scientific materials (Mardiani et al., 2024 ), and examined the impact of information or data sharing—a component of digital literacy—on students' construction of scientific knowledge (Dewi et al., 2021 ; Mardiani et al., 2024 ). However, the evaluation tools used to assess the effectiveness of education have mostly focused on separately verifying digital literacy and subject content. Given that digital literacy includes both generic and subject-specific aspects (D-EDK, 2014 ; Kotzebue et al., 2021 ), most measurements have emphasized the generic part of digital literacy.

Studies aimed at developing digital literacy assessment tools have also emphasized the cross-curricular aspects of digital literacy, often constructing items in the form of exam questions (ACARA, 2018 ; Chetty et al., 2018 ; Covello & Lei, 2010 ; Jin et al., 2020 ), which makes it difficult for students to develop metacognitive understanding of the level of digital literacy they need to attain. Additionally, most tools are designed for use at a specific school level or age group (Cote & Milliner, 2016 ; Jin et al., 2020 ; Oh et al., 2021 ), making it challenging to longitudinally track changes in students' literacy levels.

Another issue addressed by this study concerns evaluation tools for scientific literacy (or skills), which face challenges in finding forms applicable in the digital age. While traditional scientific literacy competencies have emphasized data analysis, representation, and sharing, these tools are difficult to adapt for the digital era. For instance, the study by Gormally et al. (2012) on developing scientific literacy assessment tools notes that students should have the basic scientific literacy to approach scientific phenomena quantitatively and possess various skills to apply this to problem-solving in everyday life (NRC, 2003). However, traditional tools derived from scientific inquiry and scientific methods carry inherent limitations: they often fail to accurately explain what is important in science, seem to perform inquiries only to explain theories (Osborne, 2014), and do not focus on activities (Ford, 2015). Consequently, to solve everyday problems in the digital world, there is a need for a new term that can encompass a broader meaning and have a sustained and widespread impact on our lives (Da Silva & Heaton, 2017).

Thus, the term 'Practice' is being used in place of scientific method or inquiry to represent the educational goal of teaching students how to reason and act scientifically in an integrated digital world (Osborne, 2014 ; Ford, 2015 ). Based on this discussion, we aim to develop a self-report measurement tool that can be utilized in classrooms, grounded in the important elements of digital literacy within the context of scientific practice. The specific research questions of this study are as follows:

RQ1. What is the content validity of the digital literacy assessment tool in the context of scientific practice?

RQ2. What validity evidence is identified in the statistical tests using evaluation data for the digital literacy assessment tool in the context of scientific practice?

RQ3. Are there significant gender and school level differences in the scores of the digital literacy assessment tool in the context of scientific practice?
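Group differences of the kind asked about in RQ3 are commonly examined with a two-sample t statistic. The sketch below is generic (it is not the authors' actual analysis, and the scores are invented); it computes Welch's t, which does not assume equal variances, in pure Python:

```python
from statistics import mean, variance

def welch_t(group_a, group_b):
    """Welch's t statistic for two independent samples with unequal variances."""
    na, nb = len(group_a), len(group_b)
    va, vb = variance(group_a), variance(group_b)  # sample variances (n - 1 denominator)
    return (mean(group_a) - mean(group_b)) / ((va / na + vb / nb) ** 0.5)

# Invented mean self-report scores (1-5 scale) for two groups
boys = [3.8, 4.1, 3.5, 4.0, 3.9]
girls = [3.4, 3.6, 3.2, 3.7, 3.5]
t = welch_t(boys, girls)
print(round(t, 2))  # 2.83
```

A large positive t indicates a higher mean for the first group; in practice the statistic is compared against a t distribution with Welch-adjusted degrees of freedom to obtain a p value.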

2 Research method

The central aim of this study is to develop a digital literacy assessment tool supported by strong validity evidence. RQ1 concerns the content validity of the developed assessment tool. RQ2 involves collecting validity evidence through statistical methods using actual student data. RQ3 concerns the application of the developed assessment tool. To verify the validity of the assessment tool developed in this study, Messick's (1995) validity framework was used. Messick (1995) defined validity as "an integrated judgment of the degree to which theoretical and empirical evidence supports the adequacy and appropriateness of interpretations and actions based on test scores," and proposed six aspects of validity: content, substantive, structural, generalizability, external, and consequential (Messick, 1995). In this study, among Messick's six validity frames, content-based validity and substantive validity were verified using qualitative methods, through the evaluation of scientists and the analysis of the student survey process during item development. The sections 'Initial Development through Literature Review' and 'Completion through Surveys with Scientists' pertain to content validity and correspond to RQ1. The sections 'Participants, Data Collection, and Analysis' correspond to RQ2 and RQ3.
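For context on the Rasch analysis mentioned above: in the dichotomous Rasch model, the probability that a person of ability θ succeeds on an item of difficulty b is exp(θ − b) / (1 + exp(θ − b)). The sketch below shows only this core model equation; it is not the authors' actual analysis, which applied Rasch techniques (via dedicated software) to Likert-type self-report data:

```python
import math

def rasch_probability(theta, difficulty):
    """Dichotomous Rasch model: P(X = 1 | theta, b) = exp(theta - b) / (1 + exp(theta - b))."""
    return 1.0 / (1.0 + math.exp(-(theta - difficulty)))

# A person whose ability equals the item's difficulty succeeds half the time.
print(rasch_probability(0.5, 0.5))  # 0.5

# Ability two logits above the item difficulty gives roughly an 88% chance.
print(round(rasch_probability(1.0, -1.0), 3))  # 0.881
```

Item fit statistics (such as infit and outfit) then compare observed responses against the probabilities this model predicts, which is how the item fit reported in the abstract is assessed.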

2.1 Initial development through literature review

This study develops self-report items that measure digital literacy related to the scientific practice process. The goal is to present the functional objectives of digital-related scientific inquiry and develop items to identify the current level of students. Since this study defines the necessary skills for middle and high school students according to the 'inquiry process,' it uses the 'science and engineering practices standards' from NGSS and Korea's KSES as the basic framework. Additionally, it incorporates the difficulties and required student skills identified in various studies that combine scientific inquiry contexts with digital literacy. The digital-related inquiry process centers on activities beginning with data collection and analysis, followed by constructing and sharing conclusions. Ultimately, the items were developed in a four-stage structure: data collection, data analysis, drawing conclusions, and sharing. To emphasize the social practice of science learning in the digital age, the process of sharing was included, replacing the term 'communication' from NGSS's Practice with 'sharing (communication)' to reflect the importance of information sharing in the digital era (Elliott & McKaughan, 2014 ).

When examining the eight practices of the NGSS in the United States, terms that did not appear in the general scientific inquiry process are directly mentioned (NRC, 2013 ). Terms such as “Developing and using models,” “Using mathematics and computational thinking,” and “Constructing explanations and designing solutions” highlight the need to focus on these functions in scientific inquiry as science, engineering, and technology become increasingly integrated. Similarly, South Korea has developed and announced science education standards for future generations from 2014 to 2019. The KSES includes not only traditional scientific competencies and skills but also those anticipated to be necessary in a future society characterized by the digital revolution (Song et al., 2019 ). Additionally, the data literacy presented in the OECD 2030 report served as an important basis for item development (OECD, 2019 ). Many countries have recently set data literacy and digital literacy as goals within their educational curricula, and related research has been utilized in item development (Ei & Soon, 2021 ; Erstad et al., 2021 ; Polizzi, 2020 ). Therefore, by referencing research articles on scientific inquiry published between 2018 and 2022 that implemented programs related to cultivating competencies in data literacy or digital literacy or presented specific inquiry processes, the necessary skills were added (Aksit & Wiebe, 2020 ; Arastoopour Irgens et al., 2020 ; Chen et al., 2022 ; Clark et al., 2019 ; Gibson & Mourad, 2018 ; Kjelvik & Schultheis, 2019 ; Lichti et al., 2021 ; Son et al., 2018 ; Tsybulsky & Sinai, 2022 ; Wolff et al., 2019 ).

The tool developed in this study is a self-report measurement tool. Self-report tools in competency assessment can suffer from biases such as overconfidence (Moore & Healy, 2008). However, this tool is not intended to quantify abilities but rather to be used in learning assessment, allowing students to evaluate their own state and goals and to reflect metacognitively. Our goal is for the developed assessment tool to be widely used in digital-based science classes conducted in schools. Therefore, the assessment tool was developed with a Likert scale for self-reporting. Through this tool, students can reflect on and evaluate their practical competencies, as well as their acquisition of skills and knowledge (Demirbag & Bahcivan, 2021), identifying their position in the process of achieving learning goals and their ability to investigate and integrate additional information (Bråten et al., 2011). A self-report assessment tool can help students identify their current position and independently set future learning goals.

2.2 Completion through surveys with scientists

The 48 items developed through the literature review were sent to seven scientists researching advanced digital-based scientific fields to confirm the content validity of the items. Digital literacy in science is an essential inquiry skill for students who will live in future societies, and particularly for high school students who plan to advance to STEM universities. However, as science and technology rapidly develop, the content and methods of education change accordingly, creating a time lag between the development of science and the development of science education. Therefore, to bridge this gap, it is necessary to review the opinions of scientists currently conducting research in relevant fields. A total of seven scientists reviewed the items, each with more than 10 years of research experience and actively engaged in recent research activities (see Table 1). The scientists confirmed the content validity of each item and, when modifications were necessary, described the reasons and directions for the revisions.

After undergoing content validity evaluation, the 48 items were administered to 43 middle school students to verify the substantive aspect of construct validity. This process aimed to confirm whether students could understand the content of the items and respond as intended. We checked whether the terms were appropriate for the students' cognitive level and whether the questions were understood as intended. During this process, some students had difficulty interpreting certain items, so examples were added or the items were revised into language easier for students to understand. The survey took approximately 30 min, and students were able to focus better on the survey when guided by a supervising teacher. The revised items were then finalized, and a large amount of data was collected from middle and high school students for statistical validity verification.

2.3 Participants, data collection and analysis

To verify statistical validity, the finalized items were administered to over a thousand students. A total of 1,194 students participated, including 651 middle school students and 543 high school students. The survey was conducted in five schools: one middle school and one high school located in a major city, and one middle school and two high schools located in a small city. Regarding the gender of the participants, there were 537 male students (331 middle school students) and 657 female students (320 middle school students). To minimize data bias related to educational level and gender, participants were recruited considering various regions and a balanced gender ratio. This study involved minors as vulnerable participants, and the entire process was conducted with approval from the IRB of the relevant research institution before the study commenced.

Using data from over a thousand students, statistical tests were conducted to confirm item fit, reliability, differential item functioning, criterion-related validity, and structural validity. The statistical tests were performed using item response theory-based analyses, such as Rasch analysis, suitable for Messick's validity framework (Wolfe & Smith, 2007 ). In the Rasch analysis, item fit was checked using Infit MNSQ and Outfit MNSQ, with the criterion value set between 0.5 and 1.5 (Boone et al., 2014 ). Person reliability and item reliability were verified using Rasch analysis. To confirm construct validity based on internal structure, dimensionality was tested in Rasch analysis to satisfy unidimensionality (Boone et al., 2014 ). For external validity, five additional self-report items measuring core competencies in Korean science subjects were included in the field test alongside the developed items. These self-report items for measuring core competencies in science subjects had been previously field-tested on more than 2000 Korean adolescents and were known for their high validity and reliability (Ha et al., 2018 ). Additionally, since these core competency items included some scientific inquiry skills such as information processing, data transformation, and analysis, they were appropriate for securing external validity. Lastly, group score comparisons were conducted to identify any gender or school level differences in the scores of the developed tool. Rasch analysis was performed using Winsteps 4.1.0, and all other statistics were analyzed using SPSS 26.
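The fit criteria described above can be illustrated with a small sketch. Infit is the information-weighted mean square of the residuals, while outfit is the unweighted one; the response, expected-score, and variance values below are hypothetical stand-ins for quantities that a Rasch program such as Winsteps estimates from real data.

```python
import numpy as np

# Illustrative computation of Rasch infit/outfit mean-squares for one item.
# All numbers are hypothetical; a real analysis would estimate the expected
# scores and variances from the Rasch model itself.

def fit_mnsq(observed, expected, variance):
    resid_sq = (observed - expected) ** 2
    outfit = np.mean(resid_sq / variance)    # unweighted: sensitive to outliers
    infit = resid_sq.sum() / variance.sum()  # information-weighted
    return infit, outfit

obs = np.array([5, 5, 2, 4, 1, 5])               # Likert responses to one item
exp = np.array([3.8, 4.2, 3.1, 3.0, 2.1, 4.0])   # Rasch-expected scores
var = np.array([1.1, 0.9, 1.2, 1.0, 1.1, 0.9])   # model variance per response

infit, outfit = fit_mnsq(obs, exp, var)
# Both statistics should fall within the 0.5-1.5 band used in the study.
print(0.5 <= infit <= 1.5, 0.5 <= outfit <= 1.5)  # → True True
```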

3 Research results

3.1 RQ1: Content validity of items as judged by scientists

This section presents the results of the scientist evaluation conducted to verify the content validity of the developed items. The scientists agreed that, while science inquiry education in schools is generally well-conducted, there is a need for changes in its approach. The scientists reviewed the items and assessed the content validity regarding whether each skill was necessary for middle and high school students. We calculated the Content Validity Index (CVI) from their evaluations. The acceptability threshold for the CVI depends on the number of panelists; since there were seven scientists in this study, a CVI of 0.83 or higher is required (Lynn, 1986). Most items had values of 0.86 or higher, but a few items had lower values. The seven items out of the total 48 that did not meet the acceptable range are shown in Table 2.
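The CVI decision rule applied here can be sketched as follows; the panelist ratings are hypothetical and assume the common 4-point relevance scale.

```python
# Item-level Content Validity Index (I-CVI): the proportion of experts
# rating an item as relevant. With seven panelists, Lynn (1986) requires
# I-CVI >= 0.83, i.e. at least 6 of 7 experts agreeing.

def item_cvi(ratings, relevant=(3, 4)):
    """ratings: one relevance rating per panelist for a single item."""
    return sum(r in relevant for r in ratings) / len(ratings)

# Hypothetical ratings from 7 panelists on a 4-point relevance scale:
accepted = item_cvi([4, 4, 3, 4, 3, 4, 4])    # all 7 rate it relevant
flagged = item_cvi([4, 2, 3, 4, 2, 3, 4])     # only 5 of 7 do
print(round(accepted, 2), round(flagged, 2))  # → 1.0 0.71
```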

Generally, the items included in the analysis and interpretation process had lower content validity, whereas items related to the data collection, recording, drawing conclusions, and sharing processes had overall high content validity. Analyzing the items with low content validity reveals two main points. First, the scientists expressed negative opinions about requiring students to express scientific discovery results using mathematical models or formulas. Second, while they considered understanding and utilizing pre-developed or pre-written computer programs or code a necessary skill, they did not see the need for the deep understanding required for students to develop or debug programs themselves.

The scientists mentioned that the reason they did not consider these functions important is that there should be a distinction between students who will major in science in university and those who need general scientific literacy. They thought it unnecessary to practice creating mathematical models in general science education, as it might not be important or possible depending on the type of scientific inquiry. Furthermore, they were concerned that overly generalizing results to fit into mathematical models at the students' level of mathematics might lead to misconceptions. Regarding learning computer programming skills, they were apprehensive about the potential focus on programming languages themselves. Since programming languages and software continually evolve, they believed there was no need to become familiar with the grammar of computer languages. Instead, they emphasized the importance of analyzing how to process problems and predicting the outcomes of those processes. Based on expert review, six items deemed more appropriate for university-level science majors were deleted from the study. Additionally, four items with overlapping content were combined into more comprehensive questions, resulting in a final set of 38 items.

3.2 RQ2: Validity evaluation based on statistics

The final set of items was administered to the 1,194 students, and the collected data were analyzed to verify validity through various methods. The first analysis conducted was dimensionality analysis. We categorized the digital competencies in the context of scientific practice into four dimensions: data collection and recording, analysis and interpretation, conclusion generation, and sharing and presentation, and composed various items for each factor. Each item was intended to contribute to measuring its respective construct, and each factor was assumed to be unidimensional. If the items for a specific construct do not satisfy unidimensionality and are instead divided into multiple components internally, they are not valid from a measurement perspective.

We performed a principal component analysis (PCA) of the residuals from the Rasch analysis for this evaluation (Table 3). If there are consistent patterns in the parts of the data that do not align with the Rasch measurement values, this suggests the presence of an unexpected dimension. According to Bond et al. (2020), if the eigenvalue of the first unexplained variance exceeds 2, there is a possibility of another dimension, while if it is below 2, the construct can be assumed to be unidimensional. As shown in Table 3, the first unexplained variance for data collection and recording, conclusion generation, and sharing and presentation does not exceed 2. However, for the analysis and interpretation items, the first unexplained variance is 2.555, which clearly exceeds 2. We therefore conducted an exploratory factor analysis for this construct and found that splitting it into two dimensions (items 1 to 8 and items 9 to 12) meets the unidimensionality assumption. Upon close examination, we discovered that items 1 to 8 pertain to the analysis and interpretation of statistical data and graphs, while items 9 to 12 pertain to the use of analytical tools, indicating a difference in content (see Appendix). Therefore, we concluded that it is more valid to separate this part into two dimensions. Consequently, the valid use of this assessment tool involves the analysis of five categories: data collection and recording, analysis and interpretation 1 (statistics), analysis and interpretation 2 (analytical tools), conclusion generation, and sharing and presentation.
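The unidimensionality check can be illustrated with a simplified stand-in for the contrast analysis a Rasch program performs: a PCA of a residual matrix, here simulated as pure noise, whose largest eigenvalue should stay below the threshold of 2.

```python
import numpy as np

# Sketch of a Rasch-style PCA of residuals: after removing the modeled
# part, the largest eigenvalue of the item residual correlations should
# stay below 2 (Bond et al., 2020). The residual matrix here is random
# noise standing in for real standardized residuals.

rng = np.random.default_rng(0)
residuals = rng.standard_normal((300, 12))   # 300 persons x 12 items

corr = np.corrcoef(residuals, rowvar=False)  # 12 x 12 item correlations
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

first_contrast = eigenvalues[0]
print(first_contrast < 2.0)  # noise residuals: no extra dimension expected
```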

Item fit provides information about whether there are unusual respondent reactions to specific items. For example, if a significantly higher number of respondents agree or disagree with a particular item compared to other items, the item fit decreases. In Rasch analysis, item fit is checked using the mean square (MNSQ) statistics. Rasch analysis also allows for checking several types of reliability. Person reliability (PR) checks how reliably the items measure the respondents' abilities, while item reliability (IR) checks how appropriate the respondents' abilities are for verifying the quality of the items. Additionally, internal consistency is verified using Cronbach's alpha (CA). To see whether a specific item supports or hinders the internal consistency of the construct, Cronbach's alpha if the item is deleted (AIC) is also checked. We recorded all of these results in a single table; the comprehensive information reveals the following (Table 4).
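Cronbach's alpha and the alpha-if-item-deleted column can be computed directly from the standard formula; the response matrix below is hypothetical.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: persons x items matrix of Likert responses."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

def alpha_if_deleted(scores):
    """Alpha recomputed with each item removed in turn (the AIC column)."""
    k = np.asarray(scores).shape[1]
    return [cronbach_alpha(np.delete(scores, i, axis=1)) for i in range(k)]

# Hypothetical responses from 6 students to a 4-item construct:
data = [[4, 4, 5, 4],
        [3, 3, 3, 2],
        [5, 4, 5, 5],
        [2, 2, 1, 2],
        [4, 5, 4, 4],
        [3, 2, 2, 3]]
print(round(cronbach_alpha(data), 2))  # → 0.95
```

Comparing each alpha-if-deleted value against the full-scale alpha shows whether dropping an item would raise or lower internal consistency.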

Overall, all items have adequate fit. The person reliability and item reliability identified in the Rasch analysis both exceeded or approached 0.8 or 0.9, indicating very high reliability. The internal consistency reliability of the items also exceeded 0.8, showing excellent reliability. Additionally, no items were found to significantly affect internal consistency reliability. Based on item fit and reliability information, we can conclude that there are no particular issues that need to be addressed in the developed items.

The following validity evidence pertains to generalizability validity (Table  5 ). Using the measurement values related to digital competence, score comparisons were conducted across various groups such as gender and grade levels. The premise for comparing scores between groups is that the measurement tool functions equally across different groups. Evidence regarding generalizability validity can be confirmed through differential item functioning (DIF) analysis. In Rasch analysis, DIF is checked using the difference in DIF values (DIF C), Rasch-Welch t-test, and Mantel chi-square test. The table presents DIF C (DIF contrast), the significance of the Rasch-Welch t-test (RW p), and the significance of the Mantel chi-square test (MC p).

Regarding the interpretation of DIF differences, a value between 0.43 and 0.64 indicates a moderate DIF difference, while a value exceeding 0.64 indicates a large DIF difference (Zwick et al., 1999). Although no items exceeded 0.64, one item showed a DIF difference exceeding 0.43 for gender, and one item showed a similar difference for school levels. When using the significance values of the Rasch-Welch t-test and Mantel chi-square test, more items were flagged: for gender, five items showed a p-value of 0.00, and for school levels, about eight items showed similar results. We concluded that some items in the digital competence tool exhibit differential item functioning. This may be due to the inconsistent application of various elements within the items across groups. For example, the ability to understand graphs and tables in item 7 of the analysis and interpretation section showed DIF for both gender and school level, indicating that this item functions differently across these groups. Nonetheless, considering that the overall DIF differences are not large and that experiences related to digital competence may vary significantly by gender and school level, we interpret that no severe DIF was found in the items.
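The DIF-contrast thresholds used above translate into a simple classification rule; the contrast values below are hypothetical.

```python
# Classification of DIF contrasts following the thresholds cited in the
# text (Zwick et al., 1999): |DIF C| < 0.43 negligible, 0.43-0.64
# moderate, > 0.64 large. The item labels and values are hypothetical.

def classify_dif(contrast):
    size = abs(contrast)
    if size > 0.64:
        return "large"
    if size >= 0.43:
        return "moderate"
    return "negligible"

dif_contrasts = {"item_03": 0.12, "item_07": -0.51, "item_15": 0.31}
flags = {item: classify_dif(c) for item, c in dif_contrasts.items()}
print(flags)  # only item_07 reaches the moderate band in this sketch
```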

We also examined criterion-related validity. Scores for science-related digital competence should be closely related to core science competencies and to interest in science or information/computer subjects (Table 6). Therefore, the scores from our science digital competence tool should show significant correlations with general science core competency scores and with interest in science and computer subjects. To verify this, we conducted a correlation analysis. We selected five items developed by Ha et al. (2018) to generate scores for science core competencies. We also collected Likert-scale scores for the items "Do you like science?" and "Do you like computer or information subjects?". The correlations between the five factors we developed and the three external criteria (science core competencies, interest in science subjects, and interest in computer/information subjects) are presented in Table 6. Since interest in subjects was collected using single Likert-scale items, Spearman's rho correlation coefficients were used for those analyses, while Pearson's correlation coefficients were used for the others.
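The split between Pearson and Spearman coefficients described above can be sketched with SciPy; all score vectors below are hypothetical.

```python
from scipy.stats import pearsonr, spearmanr

# Criterion-related validity sketch: Pearson r for two continuous scale
# scores, Spearman's rho where the criterion is a single ordinal Likert
# item, mirroring the choice described in the text.

digital_scores = [2.1, 3.4, 2.8, 4.0, 3.1, 1.9, 3.7, 2.5]
core_competency = [2.0, 3.6, 2.5, 4.2, 3.3, 2.2, 3.5, 2.4]  # continuous scale
interest_item = [2, 4, 3, 5, 3, 2, 4, 2]                    # single Likert item

r, r_p = pearsonr(digital_scores, core_competency)          # continuous pair
rho, rho_p = spearmanr(digital_scores, interest_item)       # ordinal criterion
print(round(r, 2), round(rho, 2))
```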

Science digital competence showed a high correlation with science core competency scores. All correlations were significant at the 0.001 level, with r values exceeding 0.7, indicating a very strong correlation. There were also significant correlations at the 0.001 level with interest in science subjects and computer/information subjects. These results confirm that our developed science digital competence assessment tool is related to other similar indicators and operates as a valid measurement tool.

3.3 RQ3: Gender and school level differences in the scores of the digital literacy assessment tool

Our final statistical analysis concerns whether there are score differences in the assessment tool we developed based on gender and school level. As discussed in the introduction, both science and digital competence are known to show gender effects, with males generally showing higher competence or interest (Divya & Haneefa, 2018; Esteve-Mon et al., 2020; Gebhardt et al., 2019). Additionally, as students progress to higher school levels, their learning in science digital competence is expected to improve, resulting in higher competence scores. To confirm whether our data exhibited these trends, we conducted a two-way ANOVA and present the results in graphs and tables (Fig. 1 and Table 7). The graphs show the mean scores and standard errors for each group to provide an intuitive comparison of overall scores. The key statistical results of the two-way ANOVA, including F-values, significance levels, and effect sizes, are summarized in Table 7.

Figure 1. Mean and standard error of scores by gender and school level
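For a balanced design, the two-way ANOVA reported above can be computed by hand as a sketch (the study itself used SPSS 26); the scores and cell sizes below are hypothetical.

```python
import numpy as np

# Balanced two-way ANOVA (gender x school level) computed from sums of
# squares. All scores are hypothetical illustrations, not study data.

def two_way_anova(cells):
    """cells[a][b]: equal-length score lists for each factor combination."""
    cells = np.asarray(cells, dtype=float)       # shape (2, 2, n), balanced
    a_lv, b_lv, n = cells.shape
    grand = cells.mean()
    ss_a = b_lv * n * ((cells.mean(axis=(1, 2)) - grand) ** 2).sum()
    ss_b = a_lv * n * ((cells.mean(axis=(0, 2)) - grand) ** 2).sum()
    cell_means = cells.mean(axis=2)
    ss_cells = n * ((cell_means - grand) ** 2).sum()
    ss_ab = ss_cells - ss_a - ss_b               # interaction
    ss_within = ((cells - cell_means[:, :, None]) ** 2).sum()
    ms_within = ss_within / (a_lv * b_lv * (n - 1))
    f_a = (ss_a / (a_lv - 1)) / ms_within
    f_b = (ss_b / (b_lv - 1)) / ms_within
    f_ab = (ss_ab / ((a_lv - 1) * (b_lv - 1))) / ms_within
    return f_a, f_b, f_ab

# cells[gender][school_level]: male/female x middle/high, 5 scores each
scores = [[[3.2, 3.6, 3.1, 3.8, 3.4], [3.9, 4.1, 3.7, 4.3, 4.0]],
          [[2.8, 3.1, 2.9, 3.3, 3.0], [3.4, 3.6, 3.2, 3.8, 3.5]]]
f_gender, f_school, f_inter = two_way_anova(scores)
print(round(f_gender, 1), round(f_school, 1))  # both main effects sizable here
```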

Examining the scores for the five factors across the four groups divided by gender and school level, we observed consistent trends across all areas. For all five factors of science digital competence, male students scored higher than female students, and high school students scored higher than middle school students. Notably, the largest gender effect size was observed in the analysis and interpretation 2 category. Unlike analysis and interpretation 1, analysis and interpretation 2 involves the use of mathematical tools, computer coding, and programming languages such as Python. This suggests that male students had significantly more experience and learning related to these areas compared to female students.

4 Discussion

4.1 RQ1. The content validity of the digital literacy assessment tool in the context of scientific practice

The purpose of this study was to develop a valid assessment tool to evaluate the level of digital literacy in the context of scientific practice for middle and high school students and to establish indicators of digital literacy in scientific practice. To this end, we developed the initial items through literature review and expert Delphi surveys, applied them to middle and high school students to verify statistical validity, and investigated whether the items could be applied regardless of gender and school level to finalize the items. Through this process, we identified a consensus on the elements and levels of digital literacy required in the context of scientific practice among scientists, national curricula, and empirical experiences in classroom settings. Additionally, considering that digital literacy is not merely the ability to use technology but also complements the enhancement of students' learning abilities in the context of science education (Yasa et al., 2023 ), we can propose specific directions for 'learning by doing' in science classes by providing empirical indicators of scientific practice and digital literacy.

Based on research from various countries and major institutions on specific scientific inquiry activities related to digital literacy, we initially developed 48 items. We then had scientists review whether each item was necessary for science majors or for general middle and high school students through two rounds of validation. Through this process and refinement, we finalized a total of 38 items. This process revealed differences between the digital literacy levels scientists believe students should have and the level of digital literacy needed for scientific inquiry performed in classroom settings. Scientists did not consider the criteria emphasizing complex skills, tool usage, or programming languages to be particularly important. They also expressed concerns that generalizations through formulas without sufficient theoretical background might lead to misconceptions. This indicates that the primary goal of science education, which is to develop students' thinking and problem-solving skills, remains unchanged. It also suggests the need for more detailed standards and application plans to avoid instrumentalism and ensure that the purpose of digital literacy aligns with the level students need to learn.

Digital competence in the context of scientific practice was divided into four dimensions: data collection and recording, analysis and interpretation, conclusion generation, and sharing and presentation, and dimensionality analysis was conducted. The dimensionality analysis revealed that the 'analysis and interpretation' part did not form a single dimension. An exploratory factor analysis showed that it split into statistical processing and the use of analytical tools. Thus, digital competence in the context of scientific practice was confirmed to be divided into five dimensions: data collection and recording, analysis and interpretation (statistics), analysis and interpretation (analytical tools), conclusion generation, and sharing and presentation.

Generally, digital literacy is theoretically composed of several dimensions, but empirical measurements of digital literacy often result in a single dimension or show strong correlations between elements (Aesaert et al., 2014 ; Demirbag and Bahcivan, 2021 ; Fraillon et al., 2019 ). While existing digital literacy developments encompass universal content, this study constructed elements within the context of scientific practice.

This indicates that when digital literacy education is conducted within the context of a specific subject, it is more likely that only certain elements, tailored to the characteristics of the subject, will be learned rather than all elements of digital literacy. It also implies that digital literacy training tailored to specific subjects can facilitate the smooth operation of classes when teaching subjects that require digital literacy.

Furthermore, this implies that general digital literacy and digital literacy within specific subject contexts may differ. In the case of data literacy, which is similar to digital literacy, research has emphasized competencies within particular subject contexts, leading to the development of terms, definitions, and measurement tools such as scientific data literacy (Son & Jeong, 2020; Qiao et al., 2024; Qin & D'Ignazio, 2010). However, there has been limited research on digital literacy within specific subject contexts. This study may serve as practical evidence supporting the argument that universal literacies, such as digital literacy and data literacy, require a different perspective on definition and measurement when learned within the context of specific subjects.

4.2 RQ2. Validity evidence identified in the statistical tests

The analysis of item fit and reliability showed that item fit was generally appropriate across all items. Reliability was measured using person reliability (PR), item reliability (IR), and Cronbach's alpha (CA), all of which were above 0.8, indicating very high reliability. In addition to content validity, we examined criterion-related validity. Since the developed items pertain to digital competence in the context of scientific practice, it was assumed that scientific competence and interest in computers would be closely related to the results of these items. Therefore, additional survey questions on scientific competence and interest in computers and information were analyzed. The results showed significant correlations at the 0.001 level with both interest in science subjects and interest in computer/information subjects. Thus, we confirmed that the tool developed in this study operates validly.

Since we developed digital literacy items in the context of scientific practice for middle and high school students, it is necessary to confirm the generalizability across both school levels and between genders. We conducted DIF analysis to compare scores between groups, assuming that the measurement tool performs equally across different groups. The analysis showed that one item had a moderate difference by school level, and one item had a moderate difference by gender. Using the significance levels of the Rasch-Welch t-test and Mantel chi-square test, we found differences in five items by gender and eight items by school level. Gender differences were evenly distributed across factors, while school level differences mostly occurred in the analysis and interpretation factors.

These items were related to mathematical knowledge and the use of computer languages, indicating that these competencies may vary as students' mathematical concepts and computer language skills increase (Fraillon et al., 2019 ). Lazonder et al. ( 2020 ) found that digital skills are influenced more by early exposure to digital tools than by age. However, higher-order thinking skills such as analysis and interpretation require not only early exposure but also cognitive level and understanding of subjects like mathematics, science, and computer science.

4.3 RQ3. Gender and school level differences in the scores of the digital literacy assessment tool

We conducted a two-way ANOVA to explore the differences by gender and grade level more deeply, confirming that digital and scientific literacy increase with higher grade levels. This trend has been confirmed by various studies (ACARA, 2018 ; Kim et al., 2019 ). When examining gender differences, we found that male students scored higher than female students across all items, with the most significant differences observed in items related to computer coding and software. The effect size was greater for male students, contrasting with the general trend where female students often score higher in science concept learning (Fraillon et al., 2019 ).

In our study, more items focused on functional aspects rather than conceptual ones, possibly giving male students an advantage in technical tasks (Divya & Haneefa, 2018; Esteve-Mon et al., 2020; Gebhardt et al., 2019). Additionally, many items were related to computers and mathematics, where male students tend to exhibit higher overconfidence (Adamecz-Völgyi et al., 2023). The self-report nature of the survey may also have contributed to these results, as female students might underreport their abilities and confidence in STEM fields compared to their actual capabilities (Hand et al., 2017; Sobieraj & Krämer, 2019).

Consequently, students believe that their digital literacy within the context of scientific practices increases with age, and male students tend to rate themselves higher than female students across all categories. This suggests that male students find technical tasks easier and have reached a higher level, particularly in areas where mathematics and computer coding are integrated into scientific practices, compared to female students. Although this study is based on self-reported assessments, it can be inferred that there are actual differences in ability, not just in interest or confidence, among middle and high school students who have some understanding of their capabilities. These findings are consistent with previous research indicating that female students lag behind male students in STEM-related skills (Divya & Haneefa, 2018 ; Esteve-Mon et al., 2020 ; Fraillon et al., 2019 ; Gebhardt et al., 2019 ). Therefore, it is necessary to develop instructional strategies in science education to cultivate these competencies.

5 Conclusion and direction of future studies

In this study, we developed a measurement tool for digital literacy in the context of scientific practice for middle and high school students. Based on a literature review and a Delphi study with scientists, an initial draft was created and then applied to Korean middle and high school students. Through a statistical validation process, the tool was finalized. Assuming that digital competence should combine both general and subject-specific digital competencies, we aimed to establish specific criteria for digital literacy integrated with scientific practice. The developed items are applicable in both middle and high schools, with only a few items showing gender-related differences, which are not significant enough to limit their use.

Since the developed measurement tool consists of self-report items, it is important to consider potential overconfidence bias and the tendency for self-assessed digital literacy to exceed actual performance (Porat et al., 2018). However, this study is significant in that it approached digital literacy in a subject-specific context and presented an assessment tool with concrete and practical science lessons in mind to enhance digital competence. It can be used across various science subjects, providing guidance for teachers and students on the objectives of their participation in science classes. Understanding the characteristics of the various elements of digital literacy in the context of scientific practice can lead to the development of specific teaching and learning methods to enhance the corresponding competencies. This suggests that digital literacy, within the context of specific subjects, requires a different perspective in terms of its definition and measurement.

The items developed in this study are designed to be used in both middle and high schools, making them suitable for longitudinal research by other researchers. Given the technical changes and software developments, some items may need to be modified, and future related studies are expected to adapt these items accordingly. Additionally, it is necessary to more closely examine the reasons why female students have lower digital literacy, particularly in STEM-related fields, within the context of scientific practices compared to male students, and to explore strategies to reduce this gap.

Data availability

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Adamecz-Völgyi, A., Jerrim, J., Pingault, J. B., & Shure, D. (2023). Overconfident boys: The gender gap in mathematics self-assessment. IZA Discussion Paper No. 16180. https://doi.org/10.2139/ssrn.4464593

Aesaert, K., Van Nijlen, D., Vanderlinde, R., & van Braak, J. (2014). Direct measures of digital information processing and communication skills in primary education: Using item response theory for the development and validation of an ICT competence scale. Computers & Education, 76 , 168–181.

Aksit, O., & Wiebe, E. N. (2020). Exploring force and motion concepts in middle grades using computational modeling: A classroom intervention study. Journal of Science Education and Technology, 29 (1), 65–82. https://doi.org/10.1007/s10956-019-09800-z

American Association for the Advancement of Science. (1993). Benchmarks for science literacy. Oxford University Press.

Arastoopour Irgens, G., Dabholkar, S., Bain, C., Woods, P., Hall, K., Swanson, H., Horn, M., & Wilensky, U. (2020). Modeling and measuring high school students’ computational thinking practices in science. Journal of Science Education and Technology, 29 (1), 137–161.

Australian Curriculum, Assessment and Reporting Authority (ACARA). (2018). National Assessment Program – ICT literacy years 6 & 10 2017 report. Sydney, NSW: ACARA. Retrieved from: https://www.nap.edu.au/docs/default-source/default-documentlibrary/2017napictlreport_final.pdf?sfvrsn=2 .

Bliss, A. C. (2019). Adult science-based learning: The intersection of digital, science, and information literacies. Adult Learning, 30 (3), 128–137.

Bond, T., Yan, Z., & Heene, M. (2020). Applying the Rasch model: Fundamental measurement in the human sciences (4th ed.). Routledge.

Boone, W., Staver, J., & Yale, M. (2014). Rasch analysis in the human sciences . Springer.


Bråten, I., Britt, M. A., Strømsø, H. I., & Rouet, J. F. (2011). The role of epistemic beliefs in the comprehension of multiple expository texts: Toward an integrated model. Educational Psychologist, 46 (1), 48–70.

Bravo, M. C. M., Chalezquer, C. S., & Serrano-Puche, J. (2021). Meta-framework of digital literacy: A comparative analysis of 21st-century skills frameworks. Revista Latina De Comunicacion Social, 79 , 76–109.


Chen, C. M., Li, M. C., & Chen, Y. T. (2022). The effects of web-based inquiry learning mode with the support of collaborative digital reading annotation system on information literacy instruction. Computers & Education, 179 , 104428.

Chetty, K., Qigui, L., Gcora, N., Josie, J., Wenwei, L., & Fang, C. (2018). Bridging the digital divide: Measuring digital literacy. Economics, 12 (1), 20180023.

Clark, J., Falkner, W., Balaji Kuruvadi, S., Bruce, D., Zummo, W., & Yelamarthi, K. (2019). Development and implementation of real-time wireless sensor networks for data literacy education. In Proceedings of the 2019 ASEE North Central Section Conference, Morgan Town, WV, USA (pp. 22–23).

Cote, T., & Milliner, B. (2016). Japanese university students’ self-assessment and digital literacy test results. CALL Communities and Culture–Short Papers from EUROCALL, 125–131.

Covello, S., & Lei, J. (2010). A review of digital literacy assessment instruments. Syracuse University, 1 , 31.

Demirbag, M., & Bahcivan, E. (2021). Comprehensive exploration of digital literacy: Embedded with self-regulation and epistemological beliefs. Journal of Science Education and Technology, 30 (3), 448–459.

Deutschschweizer Erziehungsdirektoren-Konferenz (D-EDK) (2014). Lehrplan21 – Rahmen-informationen. Luzern: D-EDK Geschäftsstelle.

Dewi, C., Pahriah, P., & Purmadi, A. (2021). The urgency of digital literacy for generation Z students in chemistry learning. International Journal of Emerging Technologies in Learning (IJET), 16 (11), 88–103.

Da Silva, P. D., & Heaton, L. (2017). Fostering digital and scientific literacy: Learning through practice. First Monday.

Divya, P., & Haneefa, M. (2018). Digital reading competency of students: A study in universities in Kerala. DESIDOC Journal of Library & Information Technology, 38 (2), 88–94.

Ei, C. H., & Soon, C. (2021). Towards a unified framework for digital literacy in Singapore. IPS Work. Pap, 39.

Elliott, K. C., & McKaughan, D. J. (2014). Nonepistemic values and the multiple goals of science. Philosophy of Science, 81 (1), 1–21.

Erstad, O., Kjällander, S., & Järvelä, S. (2021). Facing the challenges of ‘digital competence’ a Nordic agenda for curriculum development for the 21st century. Nordic Journal of Digital Literacy, 16 (2), 77–87.

Eshet-Alkalai, Y., & Soffer, O. (2012). Guest editorial–Navigating in the digital era: Digital literacy: Socio-cultural and educational aspects. Educational Technology & Society, 15 (2), 1–2.

Esteve-Mon, F., Llopis, M., & Adell-Segura, J. (2020). Digital competence and computational thinking of student teachers. International Journal of Emerging Technologies in Learning (iJET), 15 (2), 29–41.

Ford, M. J. (2015). Educational implications of choosing “practice” to describe science in the next generation science standards. Science Education., 99 (6), 1041–1048.

Fraillon, J., Ainley, J., Schulz, W., Duckworth, D., & Friedman, T. (2019). IEA international computer and information literacy study 2018 assessment framework (p. 74). Springer Nature.

Gebhardt, E., Thomson, S., Ainley, J., & Hillman, K. (2019). Gender differences in computer and information literacy: An in-depth analysis of data from ICILS (p. 73). Springer Nature.

Gibson, P., & Mourad, T. (2018). The growing importance of data literacy in life science education. American Journal of Botany, 105 (12), 1953–1956.

Gormally, C., Brickman, P., & Lutz, M. (2012). Developing a test of scientific literacy skills (TOSLS): Measuring undergraduates’ evaluation of scientific information and arguments. CBE—Life Sciences Education, 11(4), 364–377.

Ha, M., Park, H., Kim, Y. J., Kang, N. H., Oh, P. S., Kim, M. J., & Son, M. H. (2018). Developing and applying the questionnaire to measure science core competencies based on the 2015 revised national science curriculum. Journal of the Korean Association for Science Education, 38 (4), 495–504.

Hand, S., Rice, L., & Greenlee, E. (2017). Exploring teachers’ and students’ gender role bias and students’ confidence in STEM fields. Social Psychology of Education, 20 , 929–945.

Holincheck, N., Galanti, T. M., & Trefil, J. (2022). Assessing the development of digital scientific literacy with a computational evidence-based reasoning tool. Journal of Educational Computing Research, 60 (7), 1796–1817.

Hug, B., & McNeill, K. L. (2008). Use of first-hand and second-hand data in science: Does data type influence classroom conversations? International Journal of Science Education, 30 (13), 1725–1751.

Jin, K. Y., Reichert, F., Cagasan, L. P., Jr., de La Torre, J., & Law, N. (2020). Measuring digital literacy across three age cohorts: Exploring test dimensionality and performance differences. Computers & Education, 157 , 103968. https://doi.org/10.1016/j.compedu.2020.103968

Kawasaki, J., & Sandoval, W. A. (2020). Examining teachers’ classroom strategies to understand their goals for student learning around the science practices in the Next Generation Science Standards. Journal of Science Teacher Education, 31 (4), 384–400.

Kerlin, C. K., McDonald, S. P., & Kelly, G. J. (2010). Complexity of Secondary Scientific Data Sources and Students’ Argumentative Discourse. International Journal of Science Education, 32 (9), 1207–1225.

Kim, H. S., Ahn, S. H., & Kim, C. M. (2019). A new ICT literacy test for elementary and middle school students in Republic of Korea. The Asia-Pacific Education Researcher, 28, 203–212.

Kjelvik, M. K., & Schultheis, E. H. (2019). Getting messy with authentic data: Exploring the potential of using data from scientific research to support student data literacy. CBE—Life Sciences Education , 18(2), es2.

Kotzebue, L. V., Meier, M., Finger, A., Kremser, E., Huwer, J., Thoms, L. J., & Thyssen, C. (2021). The framework DiKoLAN (Digital competencies for teaching in science education) as basis for the self-assessment tool DiKoLAN-Grid. Education Sciences, 11 (12), 775. https://doi.org/10.3390/educsci11120775

Lazonder, A. W., Walraven, A., Gijlers, H., & Janssen, N. (2020). Longitudinal assessment of digital literacy in children: Findings from a large Dutch single-school study. Computers & Education, 143 , 103681.

Lichti, D., Mosley, P., & Callis-Duehl, K. (2021). Learning from the trees: Using project budburst to enhance data literacy and scientific writing skills in an introductory biology laboratory during remote learning. Citizen Science: Theory and Practice, 6 (1), 1–12. https://doi.org/10.5334/CSTP.432

Lynn, M. R. (1986). Determination and quantification of content validity. Nursing Research, 35 (6), 382–385.

Mardiani, E., Mokodenseho, S., Matiala, T. F., Limbalo, S. S. A., & Mokodompit, N. Y. (2024). Implementation of Digital Science and Literacy Teaching in Developing Science Literacy in Middle School Students in Indonesia. The Eastasouth Journal of Learning and Educations, 2 (01), 63–74.

Mason, L., Boldrin, A., & Ariasi, N. (2010). Epistemic metacognition in context: Evaluating and learning online information. Metacognition and Learning, 5 (1), 67–90.

Mason, L., Scrimin, S., Tornatora, M. C., Suitner, C., & Moè, A. (2018). Internet source evaluation: The role of implicit associations and psychophysiological self-regulation. Computers & Education, 119 , 59–75.

Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50 (9), 741–749. https://doi.org/10.1037/0003-066X.50.9.741

Ministry of Education (MOE), Ministry of Science and ICT (MSICT), & Korea Foundation for the Advancement of Science and Creativity (KOFAC). (2019). Scientific literacy for all Koreans: Korean science education standards for the next generation. Seoul: KOFAC.

Moore, D. A., & Healy, P. J. (2008). The trouble with overconfidence. Psychological Review, 115 (2), 502–517.

National Research Council, Division of Behavioral and Social Sciences and Education, Board on Science Education, & National Committee on Science Education Standards. (1996). National science education standards. National Academies Press.

National Research Council. (2003). BIO 2010: Transforming undergraduate education for future research biologists . National Academies Press.

National Research Council. (2013). Next Generation Science Standards: For States. The National Academies Press.

NGSS Lead States. (2013). Next generation science standards: For states, by states. National Academies Press.

OECD. (2019). An OECD Learning Framework 2030. The Future of Education and Labor, 23–35.

Oh, S. S., Kim, K. A., Kim, M., Oh, J., Chu, S. H., & Choi, J. (2021). Measurement of digital literacy among older adults: Systematic review. Journal of Medical Internet Research, 23 (2), e26145.

Osborne, J. (2014). Teaching scientific practices: Meeting the challenge of change. Journal of Science Teacher Education, 25 (2), 177–196.

Polizzi, G. (2020). Digital literacy and the national curriculum for England: Learning from how the experts engage with and evaluate online content. Computers & Education, 152 , 103859.

Porat, E., Blau, I., & Barak, A. (2018). Measuring digital literacies: Junior high-school students’ perceived competencies versus actual performance. Computers & Education, 126 , 23–36.

Qiao, C., Chen, Y., Guo, Q., & Yu, Y. (2024). Understanding science data literacy: A conceptual framework and assessment tool for college students majoring in STEM. International Journal of STEM Education, 11 (1), 1–21.

Qin, J., & D’ignazio, J. (2010). The central role of metadata in a science data literacy course. Journal of Library Metadata, 10 (2–3), 188–204.

Rodríguez-Becerra, J., Cáceres-Jensen, L., Diaz, T., Druker, S., Padilla, V. B., Pernaa, J., & Aksela, M. (2020). Developing technological pedagogical science knowledge through educational computational chemistry: A case study of pre-service chemistry teachers’ perceptions. Chemistry Education Research and Practice, 21 (2), 638–654.

Siddiq, F., Hatlevik, O. E., Olsen, R. V., Throndsen, I., & Scherer, R. (2016). Taking a future perspective by learning from the past–A systematic review of assessment instruments that aim to measure primary and secondary school students’ ICT literacy. Educational Research Review, 19 , 58–84.

Sobieraj, S., & Krämer, N. C. (2019). The impacts of gender and subject on experience of competence and autonomy in STEM. Frontiers in Psychology, 10 , 1432.

Son, M., & Jeong, D. (2020). Exploring the direction of science inquiry education in knowledge-information based society. School Science Journal, 14 (3), 401–414.

Son, M., Jeong, D., & Son, J. (2018). Analysis of middle school students’ difficulties in science inquiry activity in view of knowledge and information processing competence. Journal of the Korean Association for Science Education, 38 (3), 441–449.

Song, J., Kang, S. J., Kwak, Y., Kim, D., Kim, S., Na, J., & Joung, Y. J. (2019). Contents and features of “Korean Science Education Standards (KSES)” for the next generation. Journal of the Korean Association for Science Education, 39 (3), 465–478.

Tsybulsky, D., & Sinai, E. (2022). IoT in project-based biology learning: Students’ experiences and skill development. Journal of Science Education and Technology, 31 (4), 542–553.

Walraven, A., Brand-Gruwel, S., & Boshuizen, H. P. (2009). How students evaluate information and sources when searching the World Wide Web for information. Computers & Education, 52 (1), 234–246.

Wolfe, E. W., & Smith, E. V., Jr. (2007). Instrument development tools and activities for measure validation using Rasch models: Part I-instrument development tools. Journal of Applied Measurement, 8 (1), 97–123.

Wolff, A., Wermelinger, M., & Petre, M. (2019). Exploring design principles for data literacy activities to support children’s inquiries from complex data. International Journal of Human-Computer Studies, 129 , 41–54.

Yasa, A. D., & Rahayu, S. (2023). A survey of elementary school students’ digital literacy skills in science learning. In AIP Conference Proceedings (Vol. 2569, No. 1). AIP Publishing.

Zwick, R., Thayer, D. T., & Lewis, C. (1999). An empirical Bayes approach to Mantel-Haenszel DIF analysis. Journal of Educational Measurement, 36 (1), 1–28.


Acknowledgements

This research was supported by the National Research Foundation of Korea (Grant Number NRF-700-20230072) and the 4th BK21 Infosphere Science Education Research Center granted by Ministry of Education of the Republic of Korea.

Open Access funding enabled and organized by Seoul National University.

Author information

Authors and Affiliations

Future Innovation Institute, Seoul National University, 173 Seouldaehak-Ro, Siheung, Gyeonggi-Do, Republic of Korea

Department of Science Education, Seoul National University, 1, Gwanak-Ro, Gwanak-Gu, Seoul, Republic of Korea


Contributions

Mihyun Son was responsible for the design of this study, data collection, data analysis, and writing the paper. Minsu Ha was responsible for data analysis and writing the paper.

Corresponding author

Correspondence to Minsu Ha.

Ethics declarations

Competing interests.

The authors declare that they have no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Data collection and recording

1. I know reliable websites and can search for appropriate papers or books when conducting theoretical research to solve questions (e.g., knowing which sites to access to find existing studies on the population of our neighborhood or air quality)
2. I can use drives (e.g., Google Drive or OneDrive) to store and effectively manage my data
3. I can identify variables according to questions and hypotheses and determine what data or information needs to be collected (e.g., considering and measuring factors like carbon dioxide, air quality, and tree types to know if trees help improve classroom air quality)
4. I know how to enter data into spreadsheets like Excel
5. I understand that sensors or measuring devices do not always measure accurate values
6. I can think about what to consider to determine if the collected data is reliable
7. I know what to consider when selecting sensors to measure data
8. I know how to deal with errors in the devices I use
9. I understand the difference between data being valid and data being reliable and can explain the meaning of each
10. I try to devise and improve various methods of data collection
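Items 2 and 4 concern storing collected data and entering it into spreadsheet tools. As a minimal illustration (the filename and sensor readings below are invented, not from the study), data logged in Python can be written to a CSV file that Excel or Google Sheets opens directly:

```python
import csv

# Hypothetical classroom air-quality log, one row per measurement
rows = [
    ("time", "co2_ppm", "temp_c"),
    ("09:00", 455, 21.5),
    ("10:00", 530, 22.1),
]

# newline="" is the csv module's recommended mode for output files
with open("classroom_air.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

Opening the resulting file in a spreadsheet shows one labeled column per variable, which is the layout the later analysis items assume.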

Analysis and interpretation 1 (statistics)

1. I attempt to analyze data in various ways (e.g., by day of the week, date, time, correlation with other variables)
2. I can find recurring patterns when converting data into graphs (meaning abstraction)
3. I can distinguish between correlation and causation (e.g., 'higher temperatures cause more photosynthesis' is causation, and 'students with higher math scores also have higher language scores' is correlation)
4. I understand and can interpret the meanings of statistical results such as standard deviation, variance, mean, and maximum values
5. When analyzing collected data, I can identify causes of potential errors and limitations in the data collection process to avoid excessive generalization (e.g., explaining why results from our classroom should not be generalized to all classrooms)
6. While interpreting graphs, I can use scientific background knowledge to explain why certain inquiry results occurred
7. I can understand and explain pictures, tables, and data
8. I can compare and use different types of graphs and tables, understanding their characteristics and usage (e.g., knowing when to use pie charts, line graphs, or bar graphs)

Analysis and interpretation 2 (analytical tools)

1. I can use mathematical tools or techniques to calculate data (e.g., setting up Excel formulas for complex calculations)
2. I can use computer languages for statistical analysis (block coding or text coding) (e.g., using Entry, Python, or Scratch to find mean, standard deviation, mode, etc.)
3. I can understand the meaning of codes written by others in Entry or Scratch
4. I can understand the meaning of codes written by others in Python or R
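Items 3 and 4 assess whether students can read code written by others. The following short Python function (invented for illustration, not an instrument item) is the kind of script such an item might present: it finds the mode by tallying how often each value occurs.

```python
def mode(values):
    """Return the most frequently occurring value in the list."""
    counts = {}
    for v in values:
        # count occurrences of each value
        counts[v] = counts.get(v, 0) + 1
    # the value with the highest count is the mode
    return max(counts, key=counts.get)

# Hypothetical repeated temperature readings (°C)
readings = [21, 23, 23, 24, 23, 21]
print(mode(readings))
```

A student demonstrating item 4 would be expected to explain, line by line, that the dictionary accumulates frequencies and `max` with `key=counts.get` selects the most common value.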

Conclusion generation

1. I can ethically consider and evaluate various information and alternatives during the problem-solving process
2. I can thoroughly discuss the impacts of my solutions on other related fields
3. I can derive creative conclusions by combining newly found information with what I already know
4. I can synthesize various information to draw conclusions that help solve problems
5. I can draw trend lines from current data to predict trends
6. I can objectively evaluate the strengths and limitations of my conclusions
7. I can explain how my conclusions are related to scientific and social issues
8. I can self-evaluate and revise the solutions I propose to solve problems
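Item 5, drawing trend lines from current data to predict trends, corresponds to a least-squares linear fit. A pure-Python sketch with invented measurements (the function and data are illustrative, not part of the instrument):

```python
def linear_trend(xs, ys):
    """Least-squares slope and intercept for y ≈ slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical measurements: day number vs. plant height (cm)
days = [1, 2, 3, 4, 5]
height = [2.0, 2.9, 4.1, 5.0, 6.0]
a, b = linear_trend(days, height)

# Extrapolating the trend line to day 7 is the "predict trends" step
print(a, b, a * 7 + b)
```

The same fit is what a spreadsheet produces when a student adds a linear trendline to a scatter chart.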

Sharing and presentation

1. I can make logical arguments using data or scientific results
2. I can communicate with other students using presentations, online software, discussion boards, etc.
3. I can share solutions to problems with other students through shared documents or platforms like Google Docs or Padlet
4. I know methods for sharing information or knowledge (e.g., Google Drive, writing shared documents)
5. I can effectively present new information using computer programs (Excel, PowerPoint, Hangul)
6. I know how to present information in a way that makes my written words, speeches, graphs, pictures, posters, etc., easily understood by others
7. I can use various computer programs (video editing, photo editing, document creation, using formulas, or drawing graphs)
8. I know how to present information beautifully to make it aesthetically pleasing to people

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Son, M., Ha, M. Development of a digital literacy measurement tool for middle and high school students in the context of scientific practice. Educ Inf Technol (2024). https://doi.org/10.1007/s10639-024-12999-z


Received : 16 June 2024

Accepted : 12 August 2024

Published : 31 August 2024

DOI : https://doi.org/10.1007/s10639-024-12999-z


  • Digital literacy
  • Scientific practice
  • Measurement tool
  • Middle and high school students
