The Use of Critical Thinking to Identify Fake News: A Systematic Literature Review

Paul Machete

Department of Informatics, University of Pretoria, Pretoria, 0001 South Africa

Marita Turpin

With the large amount of news currently being published online, the ability to evaluate the credibility of online news has become essential. While there are many studies involving fake news and tools on how to detect it, there is a limited amount of work that focuses on the use of information literacy to assist people to critically assess online information and news. Critical thinking, as a form of information literacy, provides a means to critically engage with online content, for example by looking for evidence to support claims and by evaluating the plausibility of arguments. The purpose of this study is to investigate the current state of knowledge on the use of critical thinking to identify fake news. A systematic literature review (SLR) was performed to identify previous studies on evaluating the credibility of news, and in particular to see what has been done in terms of the use of critical thinking to evaluate online news. During the SLR’s sifting process, 22 relevant studies were identified. Although some of these studies referred to information literacy, only three explicitly dealt with critical thinking as a means to identify fake news. The studies on critical thinking noted it as an essential skill for identifying fake news. The recommendation of these studies was that information literacy be taught at academic institutions, specifically to encourage critical thinking.


The information age has brought a significant increase in available sources of information; this is in line with the unparalleled increase in internet availability and connection, in addition to the accessibility of technological devices [ 1 ]. People no longer rely on television and print media alone for obtaining news, but increasingly make use of social media and news apps. The variety of information sources that we have today has contributed to the spread of alternative facts [ 1 ]. With over 1.8 billion active users per month in 2016 [ 2 ], Facebook accounted for 20% of total traffic to reliable websites and up to 50% of all the traffic to fake news sites [ 3 ]. Twitter comes second to Facebook, with over 400 million active users per month [ 2 ]. Posts on social media platforms such as Facebook and Twitter spread rapidly due to how they attempt to grab the readers’ attention as quickly as possible, with little substantive information provided, and thus create a breeding ground for the dissemination of fake news [ 4 ].

While social media is a convenient way of accessing news and staying connected to friends and family, it is not easy to distinguish real news from fake news on social media [ 5 ]. Social media continues to contribute to the increasing distribution of user-generated information; this includes hoaxes, false claims, fabricated news and conspiracy theories, with primary sources being social media platforms such as Facebook and Twitter [ 6 ]. This means that any person in possession of a device that can connect to the internet is potentially a consumer or distributor of fake news. While social media platforms and search engines do not encourage people to believe the information being circulated, they are complicit in people’s propensity to believe the information they come across on these platforms without determining its validity [ 6 ]. The spread of fake news can cause a multitude of damages to its subject, ranging from reputational damage to an individual to an effect on the perceived value of a company [ 7 ].

The purpose of this study is to investigate the use of critical thinking methods to detect news stories that are untrue or otherwise help to develop a critical attitude to online news. This work was performed by means of a systematic literature review (SLR). The paper is presented as follows. The next section provides background information on fake news, its importance in the day-to-day lives of social media users and how information literacy and critical thinking can be used to identify fake news. Thereafter, the SLR research approach is discussed. Following this, the findings of the review are reported, first in terms of descriptive statistics and then in terms of a thematic analysis of the identified studies. The paper ends with the Conclusion and recommendations.

Background: Fake News, Information Literacy and Critical Thinking

This section discusses the history of fake news, the fake news that we know today, and how information literacy can be used to help with the identification of fake news. It also provides a brief definition of critical thinking.

The History of Fake News

Although fake news has received increased attention recently, the term has been used by scholars for many years [ 4 ]. Fake news emerged from the tradition of yellow journalism of the 1890s, which can be described as a reliance on the familiar aspects of sensationalism—crime news, scandal and gossip, divorces and sex, and stress upon the reporting of disasters and sports sensationalism, as well as possibly satirical news [ 5 ]. The emergence of online news in the early 2000s raised concerns, among them being that people who share similar ideologies may form “echo chambers” where they can filter out alternative ideas [ 2 ]. This emergence came about as news media transformed from one dominated by newspapers printed by authentic and trusted journalists to one where online news from untrusted sources is believed by many [ 5 ]. The term later grew to describe “satirical news shows”, “parody news shows” or “fake-news comedy shows”, where a television show, or a segment of one, was dedicated to political satire [ 4 ]. Some of these include popular television shows such as The Daily Show (now with Trevor Noah), Saturday Night Live ’s “The Weekend Update” segment, and other similar shows such as Last Week Tonight with John Oliver and The Colbert Report with Stephen Colbert [ 4 ]. News stories in these shows were labelled “fake” not because of their content, but because they parodied network news through the use of sarcasm and used comedy as a tool to engage real public issues [ 4 ]. The term “fake news” became further prominent during the course of the 2016 US presidential elections, as members of the opposing parties would post incorrect news headlines in order to sway the decision of voters [ 6 ].

Fake News Today

The term fake news has a more literal meaning today [ 4 ]. The Macquarie Dictionary named fake news the word of the year for 2016 [ 8 ], describing it as a word that captures a fascinating evolution in the creation of deceiving content, which also allows people to believe what they see fit. There are many definitions of the phrase; however, a concise description can be found in Paskin [ 4 ], who states that fake news is news articles, originating from either social media or mainstream (online or offline) platforms, that are not factual but are presented as such and are not satirical. In some instances, editorials, reports and exposés may knowingly disseminate information with intent to deceive for monetary or political benefit [ 4 ].

A distinction amongst three types of fake news can be made on a conceptual level, namely: serious fabrications, hoaxes and satire [ 3 ]. Serious fabrications are news items based on false information, including celebrity gossip. Hoaxes refer to false information provided via social media, aiming to be syndicated by traditional news platforms. Lastly, satire refers to the use of humour in the news to imitate real news through irony and absurdity. Famous satirical news platforms in circulation today include The Onion and The Beaverton , in contrast with real news publishers such as The New York Times [ 3 ].

Although there are many studies involving fake news and tools on how to detect it, there is a limited amount of academic work that focuses on the need to encourage information literacy so that people are able to critically assess the information they have been presented, in order to make better informed decisions [ 9 ].

Stein-Smith [ 5 ] argues that information/media literacy has become a more critical skill since the notion of fake news entered public conversation. Information literacy is no longer a nice-to-have proficiency but a requirement for interpreting news headlines and participating in public discussions. It is essential for academic institutions of higher learning to present information literacy courses that empower students and staff members with the tools needed to identify, select, understand and use trustworthy information [ 1 ]. Outside of its academic uses, information literacy is also a lifelong skill with multiple applications in everyday life [ 5 ]. The choices people make in their lives, and the opinions they form, need to be informed by the appropriate interpretation of correct, timely and significant information [ 5 ].

Critical Thinking

Critical thinking covers a broad range of skills that includes the following: verbal reasoning skills; argument analysis; thinking as hypothesis testing; dealing with likelihood and uncertainties; and decision making and problem solving skills [ 10 ]. For the purpose of this study, where we are concerned with the evaluation of the credibility of online news, the following definition will be used: critical thinking is “the ability to analyse and evaluate arguments according to their soundness and credibility, respond to arguments and reach conclusions through deduction from given information” [ 11 ]. In this study, we want to investigate how the skills mentioned by [ 11 ] can be used as part of information literacy, to better identify fake news.

The next section presents the research approach that was followed to perform the SLR.

Research Method

This section addresses the research question, the search terms that were applied to a database in relation to the research question, as well as the search criteria used on the search results. The following research question was addressed in this SLR:

  • What is the role of critical thinking in identifying fake news, according to previous studies?

The research question was identified in accordance with the research topic. Its intention is to determine whether the studies identified in this review provide insights into the use of critical thinking to evaluate the credibility of online news, and in particular to identify fake news.


In the construction of this SLR, the following related phenomena have been excluded from the definition of fake news, following the suggestion of [ 2 ]:

  • Unintentional reporting mistakes;
  • Rumours that do not originate from a particular news article;
  • Conspiracy theories;
  • Satire that is unlikely to be misconstrued as factual;
  • False statements by politicians; and
  • Reports that are slanted or misleading, but not outright false.

Search Terms.

The database tool used to extract sources to conduct the SLR was Google Scholar ( https://scholar.google.com ). The process for extracting the sources involved executing the search string on Google Scholar and the retrieval of the articles and their meta-data into a tool called Mendeley, which was used for reference management.

The search string used to retrieve the sources is given below:

(“critical think*” OR “critically (NEAR/2) reason*” OR “critical (NEAR/2) thought*” OR “critical (NEAR/2) judge*”) AND “fake news” AND (identify* OR analyse* OR find* OR describe* OR review)

Two factors shaped the search criteria: first, the research topic guided the search string, with its keywords forming the base criteria; second, the search string was constructed according to the search syntax requirements of Google Scholar.
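As an illustration only (not part of the original study’s method), the boolean and wildcard logic of the search string can be sketched in code: each starred term becomes a regular-expression pattern, with the NEAR/2 proximity operator simplified here to plain adjacency. The function name and sample texts are invented for the example.

```python
import re

# Wildcard terms from the search string (e.g. "critical think*") as regexes;
# NEAR/2 proximity is approximated by simple adjacency.
CRITICAL_TERMS = [
    r"critical\s+think\w*",
    r"critically\s+reason\w*",
    r"critical\s+thought\w*",
    r"critical\s+judge\w*",
]
ACTION_TERMS = [r"identif\w*", r"analys\w*", r"find\w*", r"describ\w*", r"review"]

def matches_search_string(text: str) -> bool:
    """True if the text satisfies the simplified query:
    (any critical-thinking term) AND "fake news" AND (any action term)."""
    t = text.lower()
    has_critical = any(re.search(p, t) for p in CRITICAL_TERMS)
    has_fake_news = "fake news" in t
    has_action = any(re.search(p, t) for p in ACTION_TERMS)
    return has_critical and has_fake_news and has_action
```

For example, a title such as “Critical thinking skills to identify fake news” would match, while one lacking any critical-thinking term would not.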

Selection Criteria.

The selection criteria outlined the rules applied in the SLR to identify sources, narrow down the search criteria and focus the study on a specific topic. The inclusion and exclusion criteria are outlined in Table  1 to show which filters were applied to remove irrelevant sources.

Table 1.

Inclusion and exclusion criteria for paper selection

Source Selection.

The search criteria were applied on the online database and 91 papers were retrieved. The criteria in Table  1 were used on the search results in order to narrow down the results to appropriate papers only.

PRISMA Flowchart.

The selection criteria included four stages of filtering, as depicted in Fig.  1 . In the Identification stage, the 91 search results from Google Scholar were returned and 3 additional sources were derived from the sources already identified, making a total of 94 available sources. In the Screening stage, no duplicates were identified; after a thorough screening of the search results, which included checking the availability of each article (free to use), 39 records in total remained, with 55 articles excluded. In the Eligibility stage, nine of the 39 articles were excluded because their titles and abstracts were irrelevant to the topic. A final list of 22 articles was included in this SLR. As preparation for the data analysis, a data extraction table was made that classified each article according to the following: article author; article title; theme (a short summary of the article); year; country; and type of publication. The data extraction table assisted in the analysis of findings presented in the next section.


PRISMA flowchart

Analysis of Findings

Descriptive Statistics.

Due to the limited number of relevant studies, the information search did not have a specified start date. Articles were included up to 31 August 2019. The majority of the papers found were published in 2017 (8 papers) and 2018 (9 papers). This is in line with the term “fake news” being announced word of the year for 2016 [ 8 ].

The selected papers were classified into themes. Figure  2 is a Venn diagram that represents the overlap of articles by themes across the review. Articles that fall under the “fake news” theme had the highest number of occurrences, with 11 in total. Three articles focused mainly on “Critical Thinking”, and “Information Literacy” was the main focus of four articles. Two articles combined all three topics of critical thinking, information literacy, and fake news.
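The main-focus counts above can be expressed as shares of the 22 included studies. The short sketch below is illustrative only; truncating (rather than rounding) to whole percentages is an assumption made so that the figures match the 50%, 18% and 13% quoted in the Discussion of Findings.

```python
# Main-focus counts from the Venn diagram (Fig. 2) of the review.
theme_counts = {
    "fake news": 11,
    "information literacy": 4,
    "critical thinking": 3,
}
TOTAL_INCLUDED = 22  # articles remaining after the SLR sifting process

def theme_share(count: int, total: int = TOTAL_INCLUDED) -> int:
    # Truncate to a whole percentage, matching the figures quoted in the text.
    return int(count / total * 100)

shares = {theme: theme_share(n) for theme, n in theme_counts.items()}
# shares -> {'fake news': 50, 'information literacy': 18, 'critical thinking': 13}
```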


Venn diagram depicting the overlap of articles by main focus

An analysis of the number of articles published per country indicates that the US dominated, with a total of 17 articles, representing 74% of the selected articles in this review. The remaining countries where articles were published are Australia, Germany, Ireland, Lebanon, Saudi Arabia and Sweden, each with one article published.

In terms of publication type, 15 of the articles were journal articles, four were reports, one was a thesis, one was a magazine article and one, a web page.

Discussion of Themes

The following emerged from a thematic analysis of the articles.

Fake News and Accountability.

With the influence that social media has on the spread of fake news [ 2 ], who then becomes responsible for the dissemination and intake of fake news by the general population? The immediate assumption is that in the digital age, social media platforms like Facebook and Twitter should be able to curate information, or do some form of fact-checking when posts are uploaded onto their platforms [ 12 ], but that verges on infringing freedom of speech. While different authors agree that there need to be measures in place to minimise the spread of fake news [ 12 , 13 ], where that accountability lies differs between the authors. Metaxas and Mustafaraj [ 13 ] aimed to develop algorithms or plug-ins that can assist in establishing trust, and postulated that consumers should be able to identify misinformation, thus making an informed decision on whether or not to share that information. Lazer et al. [ 12 ], on the other hand, believe the onus should be on the platform owners to put restrictions on the kind of data distributed. Considering that the work by Metaxas and Mustafaraj [ 13 ] was done seven years ago, one can conclude that fact-checking algorithms and plug-ins have not been successful in curbing the spread of fake news.

Fake News and Student Research.

There were a total of four articles focusing on student research in relation to fake news. Harris, Paskin and Stein-Smith [ 4 , 5 , 14 ] all agree that students do not have the ability to discern between real and fake news. A Stanford History Education Group study reveals that students are not equipped to distinguish real from fake news [ 4 ]. Most students are able to perform a simple Google search for information; however, they are unable to identify the author of an online source, or to tell whether the information is misleading [ 14 ]. Furthermore, students are not aware of how learning information literacy in school would equip them with the skills required to accurately identify fake news [ 5 ]. At the Metropolitan Campus of Fairleigh Dickinson University, librarians have undertaken the role of providing training on information literacy skills for identifying fake news [ 5 ].

Fake News and Social Media.

A number of authors [ 6 , 15 ] agree that social media, the leading source of news, is the biggest driving force behind fake news. It provides a substantial advantage for broadcasting manipulated information, being an open platform without editorial filtering and open to contributions from all. According to Nielsen and Graves as well as Janetzko [ 6 , 15 ], people are unable to identify fake news correctly; they are more likely to associate fake news with low-quality journalism than with false information designed to mislead. Two articles, [ 15 ] and [ 6 ], discussed the role of critical thinking when interacting on social media. Social media presents information to us that has been filtered according to what we already consume, thereby making it a challenge for consumers to think critically. The study by Nielsen and Graves [ 6 ] confirms that students’ failure to verify incorrect online sources requires urgent attention, as it could indicate that students are an easy target for manipulated information.

Fake News That Drive Politics.

Two studies mention the effect of social media and the spread of fake news, and how it may have propelled Donald Trump to win the US election in 2016 [ 2 , 16 ]. Also, [ 8 ] and [ 2 ] mention how a story about the Pope supporting Trump in his presidential campaign was widely shared (more than a million times) on Facebook in 2016. These articles also point out how, in the information age, fact-checking has become relatively easy, but people are more likely to trust their intuition about the news stories they consume than to check the reliability of a story. The use of paid trolls and Russian bots to populate social media feeds with misinformation, in an effort to swing the US presidential election in Donald Trump’s favour, is highlighted in [ 16 ]. The creation of fake news with alarmist headlines (“clickbait”) generates huge traffic to the originating websites, which drives up advertising revenue [ 2 ]. This means content creators are incentivised to create fake news to drive ad revenue on their websites, even though they may not believe the fake news themselves [ 2 ].

Information Literacy.

Information literacy means that a person with access to information can process the parts they need and devise ways in which to best use that information [ 1 ]. Teaching students information literacy skills is key, not only for identifying fake news but also for navigating aspects of life that require managing and scrutinising information, as discussed by [ 1 , 17 ] and [ 9 ]. Courtney [ 17 ] highlights how journalism students, more than students from other disciplines, may need some form of information literacy incorporated into their syllabi to increase their awareness of fake news stories, supporting a narrative of being objective and reliable news creators. Courtney assessed different universities that teach journalism and media-related studies, and established that students generally lack awareness of how useful library services are in relation to information literacy. Courtney [ 17 ] and Rose-Wiles [ 9 ] discuss how the use of library resources should be normalised for students. With millennials and generation Z having social media as their first point of contact, Rose-Wiles [ 9 ] urges universities, colleges and other academic research institutions to promote the use of library resources over those from the open internet, to encourage students to lean on reliable sources. This may prove difficult overall; Rose-Wiles [ 9 ] therefore proposes that by being taught information literacy skills and critical thinking, students can apply these skills to any situation or information source.

Referred to as “truth decay”, people have reached a point where they no longer need to agree on facts [ 18 ]. Due to political polarisation, segments of the general public see themselves as part of an oppressed group, and will therefore believe a political leader who appeals to that narrative [ 18 ]. Tangible effort needs to be put into driving civic engagement, to encourage people to think critically, analyse information and not believe everything they read.

Critical Thinking.

Only three of the articles had critical thinking as a main theme. Bronstein et al. [ 19 ] discuss how certain dogmatic and religious beliefs create a tendency in individuals to believe any information given, without a need to interrogate the information further and decide on its veracity. The article further elaborates that these individuals are also more likely to engage in conspiracy theories and tend to rationalise absurd events. Bronstein et al.’s [ 19 ] study concludes that dogmatism and religious fundamentalism correlate highly with belief in fake news, and suggests interventions that aim to increase open-minded and analytical thinking as a way to help religious individuals curb their belief in fake news. Howlett [ 20 ] describes critical thinking as evidence-based practice: taking the theories, skills and concepts of critical thinking and converting them for use in everyday applications. Jackson [ 21 ] explains how the internet purposely prides itself on being a platform for “unreviewed content”: because people may not see a given piece of content again, it needs to be attention-grabbing in the moment rather than accurate. Jackson [ 21 ] adds that social media has affected critical thinking by changing how published information, in what are now seen as the old forms of information media, is viewed. This presents a challenge to critical thinking, in that a large portion of information found on the internet is not only unreliable but may also be false. Jackson [ 21 ] posits that one of the biggest dangers to critical thinking may be the sense of perceived power people derive from being able to find the answers they seek with a simple web search. People are no longer interested in evaluating the credibility of the information they receive and share, which leads to the propagation of fake news [ 21 ].

Discussion of Findings

The aggregated data in this review provides insight into how fake news is perceived, the level of attention it is receiving and people’s shortcomings in identifying fake news. Since the increase in awareness of fake news in 2016, there has been an increase in academic focus on the subject, with most of the articles published between 2017 and 2018. Fifty percent of the articles focused on fake news, 18% on information literacy, and only 13% on critical thinking.

The thematic discussion grouped and synthesised the articles in this review according to the main themes of fake news, information literacy and critical thinking. The Fake news and accountability discussion raised the question of where accountability for the spreading of fake news lies, between social media platforms and their users. The articles concluded that fact-checking algorithms have not been successful in reducing the dissemination of fake news. The discussion also included a focus on fake news and student research , whereby a Stanford History Education Group study revealed that students are not well trained in thinking critically and distinguishing real from fake news [ 4 ]. The Fake news and social media discussion provided insight into how social media is both the leading source of news and a contributor to fake news. It poses a challenge for consumers who are not able to think critically about online news, or who lack the basic information literacy skills that can aid in identifying fake news. Fake news that drive politics highlighted fake news’ role in politics, particularly the 2016 US presidential elections and the influence it had on voters [ 22 ].

Information literacy related publications highlighted the need for educating the public on being able to identify fake news, as well as the benefits of having information literacy as a life skill [ 1 , 9 , 17 ]. It was shown that students are often misinformed about the potential benefits of library services. The authors suggested that university libraries should become more recognised and involved as role-players in providing and assisting with information literacy skills.

The articles that focused on critical thinking pointed out two areas where a lack of critical thinking prevented readers from discerning between accurate and false information. In the one case, it was shown that people’s confidence in their ability to find information online made them overly confident about the accuracy of that information [ 21 ]. In the other case, it was shown that dogmatism and religious fundamentalism, which led people to believe certain fake news, were associated with a lack of critical thinking and of a questioning mind-set [ 19 ].

The articles that focused on information literacy and critical thinking were in agreement on the value of promoting and teaching these skills, in particular to the university students who were often the subjects of the studies performed.

This review identified 22 articles that were synthesised and used as evidence to determine the role of critical thinking in identifying fake news. The articles were classified according to year of publication, country of publication, type of publication and theme. Based on the descriptive statistics, fake news has been a growing trend in recent years, predominantly in the US since the presidential election in 2016. The research presented in most of the articles was aimed at the assessment of students’ ability to identify fake news. The various studies were consistent in their findings of research subjects’ lack of ability to distinguish between true and fake news.

Information literacy emerged as a new theme from the studies, with Rose-Wiles [ 9 ] advising academic institutions to teach information literacy and encourage students to think critically when accessing online news. The potential role of university libraries in not only teaching information literacy, but also assisting students to evaluate the credibility of online information, was highlighted. The three articles that explicitly dealt with critical thinking all found critical thinking to be lacking among their research subjects. They further indicated how this lack of critical thinking could be linked to people’s inability to identify fake news.

This review has pointed out people’s general inability to identify fake news. It highlighted the importance of information literacy as well as critical thinking, as essential skills to evaluate the credibility of online information.

The limitations in this review include the use of students as the main participants in most of the research - this would indicate a need to shift the academic focus towards having the general public as participants. This is imperative because anyone who possesses a mobile device is potentially a contributor or distributor of fake news.

For future research, it is suggested that the value of the formal teaching of information literacy at universities be further investigated, as a means to assist students in assessing the credibility of online news. Given the very limited number of studies on the role of critical thinking to identify fake news, this is also an important area for further research.




Critical Thinking and Fake News


Since the 2016 US presidential election, the phrase ‘fake news’ has become standard currency. But what does the term actually mean, and how can you distinguish fake from ‘real’ news?

The bad news is that ‘fake news’ is often very believable, and it is extremely easy to get caught out.

This page explains how you can apply critical thinking techniques to news stories to reduce the chances of believing fake news, or at least starting to understand that ‘not everything you read is true’.

What is ‘Fake News’?

‘Fake news’ means news stories that are either completely untrue, or do not contain the whole truth, and that are written with a view to deliberately misleading readers.

Fake news became prominent during the US election, with supporters of both sides tweeting false information in the hope of influencing voters. But it is nothing new.

In May 1897, Mark Twain, the American author, was in London. Rumours reached the US that he was very ill and, later, that he had died. In a letter to Frank Marshall White, a journalist who inquired after his health as a result, Mark Twain suggested that the rumours had started because his cousin, who shared his surname, had been ill a few weeks before. He noted dryly to White,

“The report of my death was an exaggeration”.

It had, nonetheless, been widely reported in the US, with one newspaper even printing an obituary.

Fake news is not:

Articles on satirical or humorous websites, or related publications, that make a comment on the news by satirising them, because this is intended to inform and amuse, not misinform;

Anything obvious that ‘everyone already knows’ (often described using the caption ‘that’s not news’); or

An article whose content you disagree with.

The deliberate intention of fake news to mislead is crucial.

Why is Fake News a Problem?

If fake news has been around for so long, why is it suddenly a problem?

The answer is that social media means that credible fake news stories can spread very quickly.

In the worst cases, they can have major effects. There are suggestions that fake news influenced the 2016 US election. In another case, a gunman opened fire at a pizzeria that had been falsely but widely reported as being the centre of a paedophile ring involving prominent politicians. In less critical cases, fake news reports can result in distress or reputational damage for the people or organisations mentioned in the articles.

It is, therefore, important to be alert to the potential for reports to be fake, and to ensure that you are not party to their spread.

Spotting fake news

Unfortunately, it is not always easy to spot false news.

Sometimes, a story may be obviously false – for example, it may contain typos or spelling mistakes, or formatting errors. Like phishing emails, however, some fake news stories are a lot more subtle than that.

Facebook famously issued a guide to spotting fake news in May 2017. Its advice ranges from the obvious to the much less intuitive. Useful tips include:

Investigate the source

Be wary of stories written by unknown sources, and check their website for more information. Stories from reliable news sources, such as national newspapers or broadcasters, are more likely to have been checked and verified. It is also worth looking at the URL, to make sure it is a genuine news organisation.

Check the evidence

Look at the evidence on which the article bases its claims, and check whether it seems credible. If no sources are given, or the source is an unnamed ‘expert’ or ‘friend’ of someone concerned, be sceptical.

Check whether other, reliable news sources are carrying the story

Sometimes, even otherwise reliable news sources get carried away and forget to do all the necessary checks. But one very good check is to ask whether other reliable sources are also carrying the story. If yes, it is likely to be correct. If not, you should at least be doubtful.

Facebook’s advice boils down to reading news stories critically.

That does not mean looking for their flaws, or criticising them, although this can be part of critical reading and thinking. Instead, it means applying logic and reason to your thinking and reading, so that you make a sensible judgement about what you are reading.

In practice, this means being alert to why the article has been written, and what the author wants you to feel, think or even do as a result of reading it. Even accurate stories may have been written in a way that is designed to steer you towards a particular point of view or action.

For more about this see our pages on Critical Thinking and Critical Reading .

A word about bias

It is worth remembering that everyone has opinions, and therefore potential sources of bias in what they write. These may be conscious or unconscious. News organisations tend to have an organisational ‘view’ or political slant. For example, the UK’s Guardian is broadly left-wing, and most of the UK tabloids are right-wing in their views, and this affects both what they report and how they report it.

As a reader, you also have biases, both conscious and unconscious, and these affect the stories you choose to read, and the sources you use. It is therefore possible to self-select only stories that confirm your own view of the world, and social media is very good at helping with this.

To overcome this, it is important to use more than one source of information, and try to ensure that they have at least small differences in their political views.

A final thought

Fake news spreads so fast because we all like the idea of telling people something that they did not already know, something exclusive, and because we want to share our view of the world. It’s a bit like gossip.

But like false gossip, fake news can harm. Next time, before you click on ‘share’ or ‘retweet’, just take a moment to think about whether the story that you are spreading is likely to be true or not. Even if you think it is true, consider the possible effect of spreading it. Is it going to hurt anyone if it turns out to be false?

If so, don’t go there: think before you share.



Don’t Go Down the Rabbit Hole

Critical thinking, as we’re taught to do it, isn’t helping in the fight against misinformation.


By Charlie Warzel

Mr. Warzel is an Opinion writer at large.

For an academic, Michael Caulfield has an odd request: Stop overthinking what you see online.

Mr. Caulfield, a digital literacy expert at Washington State University Vancouver, knows all too well that at this very moment, more people are fighting for the opportunity to lie to you than at perhaps any other point in human history.

Misinformation rides the greased algorithmic rails of powerful social media platforms and travels at velocities and in volumes that make it nearly impossible to stop. That alone makes information warfare an unfair fight for the average internet user. But Mr. Caulfield argues that the deck is stacked even further against us. That the way we’re taught from a young age to evaluate and think critically about information is fundamentally flawed and out of step with the chaos of the current internet.

“We’re taught that, in order to protect ourselves from bad information, we need to deeply engage with the stuff that washes up in front of us,” Mr. Caulfield told me recently. He suggested that the dominant mode of media literacy (if kids get taught any at all) is that “you’ll get imperfect information and then use reasoning to fix that somehow. But in reality, that strategy can completely backfire.”

In other words: Resist the lure of rabbit holes, in part, by reimagining media literacy for the internet hellscape we occupy.

It’s often counterproductive to engage directly with content from an unknown source, and people can be led astray by false information. Influenced by the research of Sam Wineburg, a professor at Stanford, and Sarah McGrew, an assistant professor at the University of Maryland, Mr. Caulfield argued that the best way to learn about a source of information is to leave it and look elsewhere , a concept called lateral reading .

For instance, imagine you were to visit Stormfront, a white supremacist message board, to try to understand racist claims in order to debunk them. “Even if you see through the horrible rhetoric, at the end of the day you gave that place however many minutes of your time,” Mr. Caulfield said. “Even with good intentions, you run the risk of misunderstanding something, because Stormfront users are way better at propaganda than you. You won’t get less racist reading Stormfront critically, but you might be overloaded by information and overwhelmed.”

Our current information crisis, Mr. Caulfield argues, is an attention crisis.

“The goal of disinformation is to capture attention, and critical thinking is deep attention,” he wrote in 2018 . People learn to think critically by focusing on something and contemplating it deeply — to follow the information’s logic and the inconsistencies.

That natural human mind-set is a liability in an attention economy. It allows grifters, conspiracy theorists, trolls and savvy attention hijackers to take advantage of us and steal our focus. “Whenever you give your attention to a bad actor, you allow them to steal your attention from better treatments of an issue, and give them the opportunity to warp your perspective,” Mr. Caulfield wrote.

One way to combat this dynamic is to change how we teach media literacy: Internet users need to learn that our attention is a scarce commodity that is to be spent wisely.

In 2016, Mr. Caulfield met Mr. Wineburg, who suggested modeling the process after the way professional fact checkers assess information. Mr. Caulfield refined the practice into four simple principles:

1. Stop.

2. Investigate the source.

3. Find better coverage.

4. Trace claims, quotes and media to the original context.

Otherwise known as SIFT.

Mr. Caulfield walked me through the process using an Instagram post from Robert F. Kennedy Jr., a prominent anti-vaccine activist, falsely alleging a link between the human papillomavirus vaccine and cancer. “If this is not a claim where I have a depth of understanding, then I want to stop for a second and, before going further, just investigate the source,” Mr. Caulfield said. He copied Mr. Kennedy’s name in the Instagram post and popped it into Google. “Look how fast this is,” he told me as he counted the seconds out loud. In 15 seconds, he navigated to Wikipedia and scrolled through the introductory section of the page, highlighting with his cursor the last sentence, which reads that Mr. Kennedy is an anti-vaccine activist and a conspiracy theorist.

“Is Robert F. Kennedy Jr. the best, unbiased source on information about a vaccine? I’d argue no. And that’s good enough to know we should probably just move on,” he said.

He probed deeper into the method to find better coverage by copying the main claim in Mr. Kennedy’s post and pasting that into a Google search. The first two results came from Agence France-Presse’s fact-check website and the National Institutes of Health. His quick searches showed a pattern: Mr. Kennedy’s claims were outside the consensus — a sign they were motivated by something other than science.
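Mr. Caulfield's 15-second Wikipedia check can even be scripted. Below is a minimal Python sketch of the same lateral move, using Wikipedia's public REST summary endpoint (`/api/rest_v1/page/summary/{title}`); the list of red-flag phrases is purely an illustrative assumption, not part of SIFT:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

def summary_url(name: str) -> str:
    """Wikipedia REST summary endpoint for a person or topic."""
    return ("https://en.wikipedia.org/api/rest_v1/page/summary/"
            + quote(name.replace(" ", "_")))

# Illustrative phrases worth noticing in a source's Wikipedia intro
# (an assumption for this sketch, not an authoritative list).
RED_FLAGS = ("conspiracy theorist", "anti-vaccine", "hoax", "fabricat")

def flags_in_summary(summary: dict) -> list:
    """Return the red-flag phrases that appear in the summary's intro text."""
    extract = summary.get("extract", "").lower()
    return [flag for flag in RED_FLAGS if flag in extract]

if __name__ == "__main__":
    # Live lateral read: fetch the intro paragraph for a byline and scan it.
    with urlopen(summary_url("Robert F. Kennedy Jr.")) as resp:
        print(flags_in_summary(json.load(resp)))
```

The point is the one Mr. Caulfield makes: a few seconds spent off-site often tells you more than minutes spent reading the post itself.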

The SIFT method and the instructional teaching unit (about six hours of class work) that accompanies it have been picked up by dozens of universities across the country and in some Canadian high schools. What is potentially revolutionary about SIFT is that it focuses on making quick judgments. A SIFT fact check can and should take just 30, 60 or 90 seconds to evaluate a piece of content.

The four steps are based on the premise that you often make a better decision with less information than you do with more. Also, spending 15 minutes to determine a single fact in order to decipher a tweet or a piece of news coming from a source you’ve never seen before will often leave you more confused than you were before. “The question we want students asking is: Is this a good source for this purpose, or could I find something better relatively quickly?” Mr. Caulfield said. “I’ve seen in the classroom where a student finds a great answer in three minutes but then keeps going and ends up won over by bad information.”

SIFT has its limits. It’s designed for casual news consumers, not experts or those attempting to do deep research. A reporter working on an investigative story or trying to synthesize complex information will have to go deep. But for someone just trying to figure out a basic fact, it’s helpful not to get bogged down. “We’ve been trained to think that Googling or just checking one resource we trust is almost like cheating,” he said. “But when people search Google, the best results may not always be first, but the good information is usually near the top. Often you see a pattern in the links of a consensus that’s been formed. But deeper into the process, it often gets weirder. It’s important to know when to stop.”

Christina Ladam, an assistant political science professor at the University of Nevada, Reno, has seen the damage firsthand. While teaching an introductory class as a Ph.D. student in 2015, she noticed her students had trouble vetting sources and distinguishing credible news from untrustworthy information. During one research assignment on the 2016 presidential race, multiple students cited a debunked claim from a satirical website claiming that Ben Carson, a candidate that year, had been endorsed by the Ku Klux Klan. “Some of these students had never had somebody even talk to them about checking sources or looking for fake news,” she told me. “It was just uncritical acceptance if it fit with the narrative in their head or complete rejection if it didn’t.”

Ms. Ladam started teaching a SIFT-based media literacy unit in her political science classes because of the method’s practical application. The unit is short, only two weeks long. Her students latched onto quick tricks like how to hover over a Twitter handle and see if the account looks legitimate or is a parody account or impersonation. They learned how to reverse image search using Google to check if a photo had been doctored or if similar photos had been published by trusted news outlets. Students were taught to identify claims in Facebook or Instagram posts and, with a few searches, decide — even if they’re unsure of the veracity — whether the account seems to be a trustworthy guide or if they should look elsewhere.

The goal isn’t to make political judgments or to talk students out of a particular point of view, but to try to get them to understand the context of a source of information and make decisions about its credibility. The course is not precious about overly academic sources, either.

“The students are confused when I tell them to try and trace something down with a quick Wikipedia search, because they’ve been told not to do it,” she said. “Not for research papers, but if you’re trying to find out if a site is legitimate or if somebody has a history as a conspiracy theorist and you show them how to follow the page’s citation, it’s quick and effective, which means it’s more likely to be used.”

As a journalist who can be a bit of a snob about research methods, it makes me anxious to type this advice. Use Wikipedia for quick guidance! Spend less time torturing yourself with complex primary sources! A part of my brain hears this and reflexively worries these methods could be exploited by conspiracy theorists. But listening to Ms. Ladam and Mr. Caulfield describe disinformation dynamics, it seems that snobs like me have it backward.

Think about YouTube conspiracy theorists or many QAnon or anti-vaccine influencers. Their tactic, as Mr. Caulfield noted, is to flatter viewers while overloading them with three-hour videos laced with debunked claims and pseudoscience, as well as legitimate information. “The internet offers this illusion of explanatory depth,” he said. “Until 20 seconds ago, you’d never thought about, say, race and IQ, but now, suddenly, somebody is treating you like an expert. It’s flattering your intellect, and so you engage, but you don’t really stand a chance.”

What he described is a kind of informational hubris we have that is quite difficult to fight. But what SIFT and Mr. Caulfield’s lessons seem to do is flatter their students in a different way: by reminding us our attention is precious.

The goal of SIFT isn’t to be the arbiter of truth but to instill a reflex that asks if something is worth one’s time and attention and to turn away if not. Because the method is less interested in political judgments, Mr. Caulfield and Ms. Ladam noticed, students across the political spectrum are more likely to embrace it. By the end of the two-week course, Ms. Ladam said, students are better at finding primary sources for research papers. In discussions they’re less likely to fall back on motivated reasoning. Students tend to be less defensive when confronted with a piece of information they disagree with. Even if their opinions on a broader issue don’t change, a window is open that makes conversation possible. Perhaps most promising, she has seen her students share the methods with family members who post dubious news stories online. “It sounds so simple, but I think that teaching people how to check their news source by even a quick Wikipedia can have profound effects,” she said.

SIFT is not an antidote to misinformation. Poor media literacy is just one component of a broader problem that includes more culpable actors like politicians, platforms and conspiracy peddlers. If powerful, influential people with the ability to command vast quantities of attention use that power to warp reality and platforms don’t intervene, no mnemonic device can stop them. But SIFT may add a bit of friction into the system. Most important, it urges us to take the attention we save with SIFT and apply it to issues that matter to us.

“Right now we are taking the scarcest, most valuable resource we have — our attention — and we’re using it to try to repair the horribly broken information ecosystem,” Mr. Caulfield said. “We’re throwing good money after bad.”

Our focus isn’t free, and yet we’re giving it away with every glance at a screen. But it doesn’t have to be that way. In fact, the economics are in our favor. Demand for our attention is at an all-time high, and we control supply. It’s time we increased our price.


An earlier version of this article misattributed a quotation about determining the reliability of a news source. It was Michael Caulfield — not Robert F. Kennedy Jr. — who said, “The question we want students asking is: Is this a good source for this purpose, or could I find something better relatively quickly?”


Charlie Warzel , a New York Times Opinion writer at large, covers technology, media, politics and online extremism. He welcomes your tips and feedback: [email protected]  | @ cwarzel

Shenandoah University

Fake News & Critical Thinking: Critical Thinking


What is Critical Thinking?

Thinking critically about what you read, hear, and see, and about how you absorb information, is a good way to begin figuring out and interpreting fake news, alternative facts, post-truth, and misinformation.

Resources - Articles, Books, & Websites

  • Critical Thinking: Where to Begin? from the Foundation for Critical Thinking
  • Critical Thinking: Why Is It So Hard to Teach (article) Willingham, D. T. (2008). Critical thinking: why is it so hard to teach? Arts Education Policy Review, 109 (4), 21–29.
  • Civic Online Reasoning (COR) A free curriculum developed by the Stanford History Education Group
  • Center for News Literacy
  • FakeOut Can you spot ‘fake news’? Have fun finding the facts with this social media-emulating game.


Figuring Out Images

With advanced image-altering software, images can provide false or misleading information too. Google Reverse Image Search lets you check which websites have used an image. When viewing images, ask yourself these questions:

  • When was the picture taken?
  • Where was the picture taken?
  • Who took the picture?
  • Look closely.
  • Why are you seeing this now?
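The reverse-image check above can be started programmatically; the sketch below simply builds the search link for a publicly hosted image. The `searchbyimage` endpoint is the long-standing Google URL for this and may nowadays redirect to Google Lens, so treat the exact URL format as an assumption:

```python
from urllib.parse import quote

def reverse_image_search_url(image_url: str) -> str:
    """Build a Google reverse-image-search link for an image hosted online.

    Open the returned URL in a browser to see which sites have used the image.
    """
    return ("https://www.google.com/searchbyimage?image_url="
            + quote(image_url, safe=""))
```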

Analyzing News Sources

  • Avoid websites that end in “lo”, e.g. Newslo. These sites take pieces of accurate information and then package that information with other false or misleading “facts” (sometimes for the purposes of satire or comedy).
  • Watch out for websites that end in “.com.co” as they are often fake versions of real news sources.
  • Watch out if known/reputable news sites are not also reporting on the story. Sometimes lack of coverage is the result of corporate media bias and other factors, but there should typically be more than one source reporting on a topic or event.
  • Odd domain names generally equal odd and rarely truthful news.
  • Lack of author attribution may, but not always, signify that the news story is suspect and requires verification.
  • Some news organizations are also letting bloggers post under the banner of particular news brands; however, many of these posts do not go through the same editing process (ex: BuzzFeed Community Posts, Kinja blogs, Forbes blogs).
  • Check the “About Us” tab on websites or look up the website on  Snopes  or  Wikipedia  for more information about the source.
  • Bad web design and use of ALL CAPS can also be a sign that the source you’re looking at should be verified and/or read in conjunction with other sources.
  • If the story makes you REALLY ANGRY it’s probably a good idea to keep reading about the topic via other sources to make sure the story you read wasn’t purposefully trying to make you angry (with potentially misleading or false information) in order to generate shares and ad revenue.
  • If the website you’re reading encourages you to dox individuals (doxing is searching for and publishing private or identifying information about someone on the Internet, typically with malicious intent), it’s unlikely to be a legitimate source of news.
  • It’s always best to read multiple sources of information to get a variety of viewpoints and media frames. Some sources not yet included in this list (although their practices at times may qualify them for addition), such as The Daily Kos, The Huffington Post, and Fox News, vacillate between providing important, legitimate, problematic, and/or hyperbolic news coverage, requiring readers and viewers to verify and contextualize information with other sources.

[from M. Zimbar,  False, misleading, clickbait-y, and / or satirical "news" sources.  Retrieved from Pace University Library Research Guide .  Illustration by Jim Cooke .]
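A few of the mechanical checks in the list above (suspect domain endings, and headlines that rely on ALL CAPS) can be sketched as simple heuristics in Python. This is purely illustrative: the patterns and the 30% caps threshold are assumptions, legitimate names ending in “lo” would be false positives, and a flag only suggests that a story needs verification, never that it is fake:

```python
from urllib.parse import urlparse

def suspicious_domain(url: str) -> bool:
    """Crude check for domain patterns flagged in the guidance above."""
    host = urlparse(url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    # Fake versions of real ".com" news sites, e.g. "abcnews.com.co".
    if host.endswith(".com.co"):
        return True
    # Site names ending in "lo" (e.g. "newslo.com") mix fact with fiction.
    # Note: this will misfire on legitimate names that happen to end in "lo".
    parts = host.split(".")
    if len(parts) >= 2 and parts[0].endswith("lo"):
        return True
    return False

def shouty(text: str, threshold: float = 0.3) -> bool:
    """Flag text whose alphabetic characters are mostly upper case."""
    letters = [c for c in text if c.isalpha()]
    if not letters:
        return False
    return sum(c.isupper() for c in letters) / len(letters) > threshold
```

Checks like these are at best a prompt to read the “About Us” page and look for other coverage, which is the real work the list describes.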

A Word About Search Engines

Search engine creators, like Google, are businesses whose purpose is to turn a profit, not to help you find information. Using several different search engines when seeking information is good practice; don't become loyal to just one. In addition, consider using search engines that are uncensored and anonymous, such as DuckDuckGo or GIBIRU.

  • Last Updated: Apr 2, 2024 6:52 PM
  • URL: https://libguides.su.edu/fake_news

Original Research Article

“Fake News” or Real Science? Critical Thinking to Assess Information on COVID-19


  • 1 Department of Applied Didactics, Universidade de Santiago de Compostela (USC), Santiago de Compostela, Spain
  • 2 IES Ramón Cabanillas, Xunta de Galicia, Cambados, Spain

Few people question the important role of critical thinking in students becoming active citizens; however, the way science is taught in schools continues to be more oriented toward “what to think” rather than “how to think.” Researchers understand critical thinking as a tool and a higher-order thinking skill necessary for being an active citizen when dealing with socio-scientific information and making decisions that affect human life, which the pandemic of COVID-19 provides many opportunities for. The outbreak of COVID-19 has been accompanied by what the World Health Organization (WHO) has described as a “massive infodemic.” Fake news covering all aspects of the pandemic spread rapidly through social media, creating confusion and disinformation. This paper reports on an empirical study carried out during the lockdown in Spain (March–May 2020) with a group of secondary students ( N = 20) engaged in diverse online activities that required them to practice critical thinking and argumentation for dealing with coronavirus information and disinformation. The main goal is to examine students’ competence at engaging in argumentation as critical assessment in this context. Discourse analysis allows for the exploration of the arguments and criteria applied by students to assess COVID-19 news headlines. The results show that participants were capable of identifying true and false headlines and assessing the credibility of headlines by appealing to different criteria, although most arguments were coded as needing only a basic epistemic level of assessment, and only a few appealed to the criterion of scientific procedure when assessing the headlines.

Introduction: Critical Thinking for Social Responsibility – An Urgent Need in the Covid-19 Pandemic

The COVID-19 pandemic is a global phenomenon that affects almost all spheres of our life, aside from its obvious direct impacts on human health and well-being. As mentioned by the UN Secretary General, in his call for solidarity, “We are facing a global health crisis unlike any in the 75-year history of the United Nations — one that is spreading human suffering, infecting the global economy and upending people’s lives.” (19 March 2020, Guterres, 2020 ). COVID-19 has revealed the vulnerability of global systems’ abilities to protect the environment, health and economy, making it urgent to provide a responsible response that involves collaboration between diverse social actors. For science education the pandemic has raised new and unthinkable challenges ( Dillon and Avraamidou, 2020 ; Jiménez-Aleixandre and Puig, 2021 ), which highlight the importance of critical thinking (CT) development in promoting responsible actions and responses to the coronavirus disease, which is the focus of this paper. Despite the general public’s respect of science and scientific advances, denial movements – such as the ones that reject the use of vaccines and advocate for alternative health therapies – are increasing during this period ( Dillon and Avraamidou, 2020 ). The rapid global spread of the coronavirus disease has been accompanied by what the World Health Organization (WHO) has described as the COVID-19 social media infodemic. The term infodemic refers to an overabundance of information (real or not) associated with a specific topic, whose growth can occur exponentially in a short period of time [ World Health Organization (WHO), 2020 ]. The case of the COVID-19 pandemic shows the crucial importance of socio-scientific instruction toward students’ development of critical thinking (CT) for citizenship.

Critical thinking is embedded within the framework of “21st century skills” and is considered one of the goals of education ( van Gelder, 2005 ). Despite its importance, there is not a clear consensus on how to better promote CT in science instruction, and teachers often find it unclear what CT means and requires from them in their teaching practice ( Vincent-Lacrin et al., 2019 ). CT is understood in this study as a set of skills and dispositions that enable students and people to take critical actions based on reasons and values, but also as independent thinking ( Jiménez-Aleixandre and Puig, 2021 ). It is also considered as a dialogic practice that students can enact and thereby become predisposed to practice ( Kuhn, 2019 ). We consider that CT has two fundamental roles in SSI instruction: one role linked to the promotion of rational arguments, cognitive skills and dispositions; and the other related to the idea of critical action and social activism, which is consistent with the characterization of CT provided by Jiménez-Aleixandre and Puig (2021) . Although research on SSIs has provided us with empirical evidence supporting the benefits of SSI instruction, particularly argumentation and students’ motivation toward learning science, there is still scarce knowledge on how CT is articulated in these contexts. One challenge with promoting CT, especially in SSIs, is linked to new forms of communication that generate a rapid increase of information and easy access to it ( Puig et al., 2020 ).

The study was developed in an unprecedented scenario, during the lockdown in Spain (March–May 2020), which forced the change of face-to-face teaching to virtual teaching, involving students in online activities that embraced the application of scientific notions related to COVID-19 and CT for assessing claims published in news headlines related to it. Previous studies have pointed out the benefits of virtual environments to foster CT among students, particularly asynchronous discussions that minimize social presence and favor all students expressing their own opinion ( Puig et al., 2020 ).

In this research, we aim to explore students’ ability to critically engage in the assessment of the credibility of COVID-19 claims during a moment in which fake news disseminated by social media was shared by the general public and disinformation on the virus was easier to access than real news.

Theoretical Framework

We will first discuss the crucial role of CT to address controversial issues and to fight against the rise of misinformation on COVID-19; and then turn attention to the role of argumentation in students’ development of CT in SSI instruction in epistemic education.

Critical Thinking on Socio-Scientific Instruction to Face the Rise of Disinformation

SSIs are compelling issues for the application of knowledge and processes contributing to the development of CT. They are multifaceted problems, as is the case of COVID-19, that involve informal reasoning and elements of critique where decisions present direct consequences to the well-being of human society and the environment ( Jiménez-Aleixandre and Puig, 2021 ). People need to balance subject matter knowledge, personal values, and societal norms when making decisions on SSIs ( Aikenhead, 1985 ) but they also have to be critical of the discourses that shape their own beliefs and practices to act responsibly ( Bencze et al., 2020 ). According to Duschl (2020) , science education should involve the creation of a dialogic discourse among members of a class that focuses on the teaching and learning of “how did we come to know?” and “why do we accept that knowledge over alternatives?” Studies on SSIs during the last decades have pointed out students’ difficulties in building arguments and making critical choices based on evidence ( Evagorou et al., 2012 ). However, literature also indicates that students find SSIs motivational for learning and increase their community involvement ( Eastwood et al., 2012 ; Evagorou, 2020 ), thus they are appropriate contexts for CT development. While research on content knowledge and different modes of reasoning on SSIs is extensive, the practice of CT is understudied in science instruction. Of particular interest in science education are SSIs that involve health controversies, since they include some of the challenges posed by the post-truth era, as the health crisis produced by coronavirus shows. The COVID-19 pandemic is affecting most countries and territories around the world, which is why it is considered the greatest challenge that humankind has faced since the 2nd World War ( Chakraborty and Maity, 2020 ).

Issues like COVID-19 that affect society in multiple ways require literate citizens who are capable of making critical decisions and taking actions based on reasons. As the world responds to the COVID-19 pandemic, we face the challenge of an overabundance of information related to the virus. Some of this information may be false and potentially harmful [ World Health Organization (WHO), 2020 ]. In the context of growing disinformation related to the COVID-19 outbreak, EU institutions have worked to raise awareness of the dangers of disinformation and promoted the use of authoritative sources ( European Council of the European Union, 2020 ). Educators and science educators have been increasingly concerned with what can be done in science instruction to face the spread of misinformation and denial of well-established claims; helping students to identify what is true can be a hard task ( Barzilai and Chinn, 2020 ). As these authors suggest, diverse factors may shape what people perceive as true, such as the socio-cultural context in which people live, their personal experiences and their own judgments, that could be biased. We concur with these authors and Feinstein and Waddington (2020) , who argue that science education should not focus on achieving the knowledge, but rather on gaining appropriate scientific knowledge and skills, which in our view involves CT development. Furthermore, according to Sperber et al. (2010) , there are factors that affect the acceptance or rejection of a piece of information. These factors have to do either with the source of the information – “who to believe” – or with its content – “what to believe.” The pursuit of truth when dealing with SSIs can be facilitated by the social practices used to develop knowledge ( Duschl, 2020 ), such as argumentation understood as the evaluation of claims based on evidence, which is part of CT development.

We consider CT and argumentation as overlapping competencies in their contexts of practice; for instance, when assessing claims on COVID-19, as in this study. According to Sperber et al. (2010) , we now have almost no filters on information, and this requires a much more vigilant, knowledgeable reader. As these authors point out, individuals need to become aware of their own cognitive biases and of how to avoid becoming victims of them. If we want students to learn how to critically evaluate the information and claims they will encounter in social media outside the classroom, we need to engage them in the practice of argumentation and CT. This raises the question of what type of information is easier or harder for students to assess, especially when they are directly affected by the problem. In this paper we aim to address this issue by exploring students’ arguments while they assess diverse claims on COVID-19. We think that students’ arguments reflect their ability to apply CT in this context, although this does not mean that CT skills always produce a well-reasoned argument ( Halpern, 1998 ). Students should be encouraged to express their own thoughts in SSI instruction, but also to support their views reasonably ( Puig and Ageitos, 2021 ), especially when they must assess the validity of information that affects not only them as individuals but also the whole of society and the environment. CT may equip citizens to discard fake news and to use appropriate criteria to evaluate information. This requires the design and implementation of specific CT tasks, such as the one this study presents.

Argumentation to Enhance Critical Thinking Development in Epistemic Education on SSIs

While the concept of CT has a long tradition and educators agree on its importance, there is a lack of agreement on what this notion involves ( Thomas and Lok, 2015 ). CT has been used with a wide range of meanings in the theoretical literature ( Facione, 1990 ; Ennis, 2018 ). In 1990, the American Philosophical Association convened an authoritative panel of forty-six noted experts on CT to produce a definitive account of the concept, which was published in the Delphi Report ( Facione, 1990 ). The Delphi definition provides a list of skills and dispositions that can be useful and can guide CT instruction. However, as Davies and Barnett (2015) point out, this Delphi definition does not include the phenomenon of action. We concur with these authors that CT education should involve students in “CT for action,” since decision making – a way of deciding on a course of action – is based on judgments derived from argumentation using CT. Drawing from Halpern (1998) , we also think that CT requires awareness of one’s own knowledge. CT requires, for instance, insight into what one knows, and into the extent and importance of what one does not know, in order to assess socio-scientific news and its implications ( Puig and Ageitos, 2021 ).

Critical thinking and argumentation share core elements like rationality and reflection ( Andrews, 2015 ). Some researchers suggest that understanding CT as a dialogic practice ( Kuhn, 2019 ) has implications for CT instruction and development. Argumentation on SSIs, particularly on health controversies, is receiving increasing attention in science education in the post-truth era, as the coronavirus pandemic and the denial movements related to its origin, prevention, and treatment show. Science education should involve the creation of a dialogic discourse among the members of a class that enables them to develop CT. One of the central features of argumentation is the development of epistemic criteria for knowledge evaluation ( Jiménez Aleixandre and Erduran, 2008 ), which is a necessary skill for being a critical thinker. We see the practice of CT as the articulation of cognitive skills through the practice of argumentation ( Giri and Paily, 2020 ).

This article argues that science education needs to explore learning experiences and ways of instruction that support CT by engaging learners in argumentation on SSIs. Despite CT being considered a seminal goal in education, with a large body of research on CT supporting this ( Dominguez, 2018 ), debates still persist about the manner in which CT skills can be achieved through education ( Abrami et al., 2008 ). Niu et al. (2013) remark that educators have made a striking effort to foster CT among students, showing that the belief that CT can be taught and learned has spread and gained support. Therefore, CT has slowly made its way into general school education and specific instructional interventions. Problem-based learning is one of the most widely used learning approaches in CT instruction nowadays ( Dominguez, 2018 ) because it is motivating, challenging, and enjoyable ( Pithers and Soden, 2000 ; Niu et al., 2013 ). We see active learning methodologies and real-world problems such as SSIs as appropriate contexts for CT development.

The view that CT can be developed by engagement in argumentation practices plays a central role in this study, as Kuhn (2019) suggested. However, the post-truth condition poses some challenges to the evaluation of sources of information and scientific evidence disseminated by social media. According to Sinatra and Lombardi (2020) , the post-truth context raises the need for critical evaluation of online information about SSIs. Students need to be better prepared to assess the science information they can easily find online from a variety of sources. Previous studies described by these authors emphasized the importance of source evaluation instruction to equip students toward this goal ( Bråten et al., 2019 ); however, this is not sufficient. Sinatra and Lombardi (2020) note that students should learn how to evaluate the connections between sources of information and knowledge claims. This requires, in our view, engaging students in CT and epistemic performance. If we want students to learn to think critically about the claims they will encounter on social media, they need to practice argumentation as critical evaluation.

We draw on research on epistemic education ( Chinn et al., 2018 ), which considers that learning science entails students’ participation in the epistemic goals of science ( Kelly and Licona, 2018 ); in other words, placing scientific practices at the center of SSI instruction. Our study is framed in a broader research project that aims to embed CT in epistemic design and performance. In Chinn et al.’s (2018) AIR model, epistemic cognition has three core elements that correspond to the three letters of the acronym: epistemic Aims, goals related to inquiry; epistemic Ideals, standards and criteria used to evaluate epistemic products, such as explanations or arguments; and Reliable processes for attaining epistemic achievements. Of particular interest for our focus on CT is that the AIR model also proposes that epistemic cognition has a social nature and is situated. The purpose of epistemic education ( Barzilai and Chinn, 2017 ) should be to enable students to succeed in epistemic activities ( apt epistemic performance ), such as constructing and evaluating arguments, and to assess through meta-competence when success can be achieved. This paper attends to one aspect of epistemic performance proposed by Barzilai and Chinn (2017) , namely cognitive engagement in epistemic assessment. In our study, epistemic assessment encompasses the evaluation of the content of claims disseminated by the media. In line with these authors, we understand that this process requires cognitive and metacognitive competences. Thus, epistemic assessment needs adequate disciplinary knowledge, but also meta-cognitive competence for recognizing unsupported beliefs.

Goal and Research Questions

This paper examines students’ competence to engage in argumentation and CT in an online task that requires them to critically assess diverse information presented in media headlines on COVID-19. Competence in general can be defined as “a disposition to succeed with a certain aim” ( Sosa, 2015 , p. 43), and epistemic competence, as a special case of competence, is at its core a dispositional ability to discern the true from the false in a certain domain. For the purposes of this paper, the attention is on epistemic competence, with the following research questions driving the analysis:

1. What is the competence of students to assess the credibility of COVID-19 information appearing in news headlines?

2. What level of epistemic assessment is shown in students’ arguments, according to the criteria they appeal to while assessing COVID-19 news headlines?

Materials and Methods

Context, Participants, and Design

A teaching sequence about COVID-19 was designed at the beginning of the lockdown in Spain (Mid-March 2020) in response to the rise of misinformation about coronavirus on the internet and social media. The design process involved collaboration between the first and second author (researchers in science education) and the third author (a biology teacher in secondary education).

The participants are a group of twenty secondary students (14–15 years old), eleven of them girls, from a state public school located in a well-known seaside village in Galicia (Spain). They were mostly from middle-class families and within an average range of ability and academic achievement.

Students were from the same classroom and participated in previous online activities as part of their biology classes, taught by their biology teacher, who collaborated on previous studies on CT and learning science through epistemic practices on health controversies.

The activities were integrated in their biology curriculum and carried out when participants received instruction on the topics of health, infectious diseases, and the immune system.

Google Forms was used for the design and implementation of all activities included in the sequence. Google Forms was selected because it is free and a well-known tool for online surveys. In addition, all students were familiar with its use before the lockdown, and the teacher valued its usefulness for engaging them in online debates and in their own evaluation processes. This online resource provides anonymous results and statistics that the teacher could share with the students for debates. It should be highlighted that during the lockdown students did not have the same working conditions; in particular, the quality and availability of internet access differed among them. Thus, all activities were asynchronous. Students had 1 week to complete each task, and the teacher could be consulted at any time if they had difficulties or any questions regarding the activities.

The design was inspired by a previous one carried out by the authors when the first case of Ebola disease appeared in Spain ( Puig et al., 2016 ), and follows a constructivist, science-based approach. The sequence began with an initial task in which students were required to express their own views and knowledge on COVID-19 and the health notions related to it, before being progressively involved in the application of knowledge through the practices of modeling and argumentation. The third activity engaged them in the critical evaluation of COVID-19 information. A more detailed description of the activities carried out in the different steps of the sequence is provided below.

Stage 1: General Knowledge on Health Notions Related to COVID-19

An individual Google Forms survey on notions and health concepts that appeared in social media during the lockdown, such as “pandemic,” “virus,” etc.

Stage 2: Previous Knowledge on Coronavirus Disease

This stage consisted of three parts: (2.1) an individual online survey on infectious diseases; (2.2) an introduction to knowledge about infectious diseases provided on the e-bug project website 1 and its activities, together with a virtual visit to the exhibition “Outbreaks: epidemics in a connected world” available on the Natural History Museum website (blinded for review); (2.3) building a poster with the chain of infection of the COVID-19 disease and some relevant information to consider in order to stop the spread of the disease.

Stage 3: COVID-19, Sources of Information

This stage consisted first of a virtual forum in which students shared their own habits when consulting scientific information, particularly coronavirus-related information, and debated the main media sources they consulted for this purpose. Secondly, students had to analyze ten news headlines on COVID-19 disseminated by social media during the outbreak; six corresponded to fake news and four were true. They were asked to critically assess them and distinguish which they thought were true, providing their arguments. Media sources were not provided until the end of the task, since asking for the source was considered part of the data analysis (see Table 1 ). The second part of this stage is the focus of our analysis.


Table 1. COVID-19 news headlines provided to students.

Stage 4: Act and Raise Awareness on COVID-19

The sequence ended with the creation of a short video in which the students had to provide some tips to avoid the transmission of the virus. The information provided in the video had to be supported by established scientific knowledge.

Data Corpus and Analysis

Data collection includes all individual surveys and activities developed in Google Forms. We analyzed students’ individual responses ( N = 28) presented in Stage 3. The research is designed as a qualitative study that uses the methods of discourse analysis, in accordance with the data and the purpose of the study. Discourse analysis allows the analysis of the content (implicit or explicit) of the written arguments produced by students, and thus the examination of the research questions. Our analysis focuses on the arguments and criteria students used to assess the credibility of the COVID-19 headlines (ten headlines in total). It was carried out through an iterative process in which students’ responses were read and revised several times in order to develop an open-coded scheme that captures the arguments provided. To ensure the internal reliability of our codes, each student response was examined by the first and second authors separately and then contrasted and discussed until 100% agreement was achieved. The codes obtained were established according to the criteria summarized in Table 2 .


Table 2. Code scheme for research questions 1 and 2.

For Research Question 1, we distributed the arguments into two main categories: (1) Arguments that question the credibility of the information ; (2) Arguments that do not question the credibility of the information.

For Research Question 2, we classified the arguments that question the credibility of a headline into three levels of epistemic assessment (see Table 2 ). The level of epistemic assessment (basic, medium, or high) was established by the authors based on the criteria that students applied and expressed, explicitly or implicitly, in their arguments. These criteria emerged from the data, so the categories were not pre-established; they were coded by the authors as follows: content (using the knowledge that each student has about the topic), source (questioning the origin of the information), evidence (appealing to empirical evidence, such as real-life situations that students experienced), authority (justifying according to who supports or is behind the claim), and scientific procedure (drawing on the evolution of scientific knowledge).
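The mapping from criteria to levels described above can be sketched as a small classification rule. This is purely illustrative of the coding logic, not an instrument used in the study: the function name and criterion labels are ours, and the rule that combined criteria count as high assessment follows the example given for that category in the results.

```python
def epistemic_level(criteria):
    """Map the set of criteria identified in one student argument to a
    level of epistemic assessment, following the coding scheme above."""
    criteria = set(criteria)
    if not criteria:
        # No criterion included: Non-Epistemic Assessment.
        return "non-epistemic"
    if "scientific procedure" in criteria or len(criteria) > 1:
        # Appeals to the scientific procedure, or combines criteria: High.
        return "high"
    if criteria & {"source", "authority"}:
        # Questions the source or authorship of the claim: Medium.
        return "medium"
    # Assesses only the content or the empirical evidence: Basic.
    return "basic"
```

For example, an argument appealing only to content would be coded as basic, while one combining content with the scientific procedure would be coded as high.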

Students’ Competence to Critically Assess the Credibility of COVID-19 Claims

In general, most students were able to distinguish fallacious from true headlines, which was an important step in assessing their credibility. For the false headlines, students were able to question their credibility, providing arguments against them. Conversely, for the true news headlines, as expected, most participants developed arguments supporting them and thus did not question their content. In both cases, the arguments elaborated by students appealed to different criteria, discussed in the next section of the results.

As shown in Table 3 , students elaborated 147 arguments to question the false headlines, while they created just 22 arguments questioning the true ones. This finding was expected by the authors, as questioning or critical arguments appear more frequently when the information presented differs from students’ opinions.


Table 3. Number of students who questioned or not each news headline on COVID-19.

Students showed a higher capacity for questioning those claims they considered false or fake news , which can be related to the need to justify properly why they considered them false and/or what should be said to counter them.

The headlines that were most controversial, meaning they created diverse positions among students, were these three: “The COVID-19 virus can be transmitted in areas with hot and humid climates,” “Skin manifestations (urticaria, chilblains, rashes…) could be among the mild symptoms of coronavirus” and “Antibiotics are effective in preventing and treating coronavirus infection.”

The first two were questioned by 11 students out of 28, despite being real headlines. According to students’ answers, they were not familiar with this information, e.g., “I think the heat is not good for the virus.” In contrast, 17 students did not question these headlines, arguing, for instance, as this student did: “because it was shown that both in hot climates and in cold climates it is contagious in the same way.”

A similar situation occurred with the third headline, which is false. Some students (9 out of 28) accepted that antibiotics could help to treat COVID-19, showing in their answers some misunderstanding regarding the use of antibiotics and the diseases they can treat. The rest of the participants (19 out of 28) questioned this headline, affirming that it was false “because antibiotics are used to treat bacterial infections and coronavirus is a virus,” among other justifications.

Levels of Epistemic Assessment in Students’ Arguments on COVID-19 News Headlines

To analyze the level of epistemic assessment shown in students’ arguments when dealing with each headline, attention was focused on the criteria students applied (see Table 2 ). As Table 4 summarizes, almost all arguments included only one criterion (139 out of 169), and 28 out of 169 did not incorporate any criterion. These types of arguments can be interpreted as showing low epistemic assessment, or even no epistemic assessment when no criterion is included.


Table 4. Arguments used by students to assess the credibility of each COVID-19 headline.

In the category of Basic Epistemic Assessment , we include all students’ arguments that included one criterion: Content or Empirical Evidence. Students assessed the content of the claim by appealing to their own knowledge about that piece of information or to empirical evidence, without posing critical questions for assessing the credibility of the source of information. These two criteria, content and evidence, were included in students’ arguments with a frequency of 86 and 23, respectively, making this the most common category (109 out of 169) when questioning false and true headlines. In the case of true headlines, arguments in this category were identified in relation to headlines 2 and 4, whose credibility was questioned by appealing to the content, such as: “those are not the symptoms (skin manifestations) ” . Examples of arguments assessing the content of false headlines are provided below:

“Because the virus is inside the body, and even if you injected alcohol into the body it would only cause intoxication”

This student rejects headline 5, appealing to the fact that alcohol causes intoxication rather than the elimination of coronavirus.

“I know a person who had coronavirus and they only gave him paracetamol”

In this example, the student rejects headline 6 and appeals to his/her own experience during the pandemic, particularly a close person who had coronavirus, as evidence against the use of antibiotics for coronavirus disease treatment.

The category Medium Epistemic Assessment gathers arguments that pose critical questions, particularly those asking for information about the authority or the source of the information. For us, these criteria reflect a higher level of epistemic performance, since they imply questioning not only the veracity of the headline itself but also its sources and authorship. There are 20 out of 169 arguments coded within this category.

The assessment of true headlines includes arguments that question the authority and source, e.g., “because they said it on the news” (headline 2), “that news does not seem very reliable to me” (headline 4). It is also a common category in the questioning of false headlines, since students appealed to the source (16), “because in the news they clarified that it was fake news and because it is not credible either” (headline 10), or to the authority (4), “because the professionals said they were more vulnerable (people over 70 years old) but not that it only affected them” (headline 7).

For the highest category, High Epistemic Assessment , we consider those arguments (12 out of 169) in which students appealed to the scientific procedure (11) to justify why the headline is false, which manifests students’ reliance on epistemic processes, e.g., “because treatments that protect against coronavirus are still being investigated” (headline 9). Under this category we also include arguments that combined more than one criterion, such as content and scientific procedure: “Because antibiotics don’t treat those kinds of infections. In addition, no medication has yet been discovered that can prevent the coronavirus” (headline 6). All of the students’ arguments included in this category were elaborated to assess false headlines.

Lastly, special mention goes to those arguments that did not include any criteria (28), which are contained in the category Non-Epistemic Assessment. These appear most frequently in students’ answers to headlines 8 and 10, as these examples show: “I don’t think it’s true because it doesn’t make much sense to me” (headline 8) or “I never heard it and I doubt it’s true” (regarding drinking alcohol, headline 10).
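The counts reported for these categories can be cross-checked arithmetically. The figures below are taken directly from the results above; the percentage shares are our own computation and do not appear in the original tables.

```python
# Arguments per level of epistemic assessment, as reported in the results.
levels = {
    "basic": 86 + 23,       # content (86) + empirical evidence (23) = 109
    "medium": 16 + 4,       # source (16) + authority (4) = 20
    "high": 12,             # scientific procedure (11) plus combined criteria
    "non-epistemic": 28,    # arguments including no criterion
}

total = sum(levels.values())
assert total == 169  # consistent with the 147 + 22 arguments reported earlier

# Percentage share of each level, rounded to one decimal place.
shares = {level: round(100 * n / total, 1) for level, n in levels.items()}
print(shares)
```

Roughly two thirds of all arguments (109 of 169, about 64%) fall in the basic category, consistent with the observation that a large majority of arguments showed only basic epistemic assessment.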

The findings of our study indicate that students were able to deal with fake news , identifying it as such. They showed the capacity to critically assess the content of these news headlines, considering their inconsistencies in relation to their prior knowledge ( Britt et al., 2019 ). As Evagorou (2020) pointed out, SSIs are appropriate contexts for CT development and for valuing the relevance of science in our lives.

The examination of RQ1 shows that a proportion of students were able to perceive the lack of evidence behind the false claims, or even identified that those statements contradict what science presents. This is a remarkable finding and an important skill in the fight against attempts to diminish trust in science produced by the post-truth condition ( Dillon and Avraamidou, 2020 ). CT and argumentation are closely allied ( Andrews, 2015 ), but as the results show, domain knowledge seems to play an important role in assessing SSI news and its implications. Specific CT requires some of the same skills as generalizable CT, but it is highly contextual and requires particular knowledge ( Jones, 2015 ).

Students’ prior knowledge influenced the critical evaluation of some of the COVID-19 headlines provided in the activity. This is particularly relevant in responses to headline 6 (false) “Antibiotics are effective in preventing and treating coronavirus infection.” A previous study on the interactions between the CT and knowledge domain on vaccination ( Ageitos and Puig, 2021 ) showed that there is a correspondence between them. This points to the importance of health literacy for CT development, although it would not be sufficient to provide students with adequate knowledge only, as judgment skills, in this case regarding the proper use of antibiotics, are also required.

We found that the level of epistemic assessment (RQ2) linked to students’ CT capacity is low. A large majority of arguments were situated at a basic epistemic assessment level, and just a few at a higher level. One reason that might explain these results could be related to the task design and format, in which students worked autonomously in a virtual environment. As CT studies in e-learning environments have shown ( Niu et al., 2013 ), cooperative or collaborative learning favors CT skills, particularly when students have to discuss and justify their arguments on real-life problems. The circumstances in which students had to work during the outbreak did not allow them to work together, since internet connections were not good for all of them, so synchronous activities were not possible. This aspect is a limitation of this research.

There were differences in the use of criteria, and thus in the level of epistemic assessment, when students dealt with true and false headlines. This could be related to diverse factors, such as language: the claims are shaped by their wording and are formulated in different ways. In particular, the language is quite nuanced in the true statements, while it is certain and resolute in the false headlines. The practice of CT requires an understanding of the language and of the content under evaluation, as well as other cognitive skills ( Andrews, 2015 ).

In the case of false headlines, most arguments appealed to their content and fewer to the criteria of source, authority, and scientific procedure, whereas in the case of true headlines most of them appealed to the authority and/or the source. According to the AIR model ( Chinn and Rinehart, 2016 ), epistemic ideals are the criteria used to evaluate epistemic products, such as claims. In the case of COVID-19 claims, students need to hold an ideal of high source credibility ( Duncan et al., 2021 ). This means that students acknowledge that information should be gathered from reliable news media that themselves obtained the information from reliable experts.

Only a few students used the criterion of scientific procedure when assessing false headlines, which shows a high level of epistemic assessment. Promoting this type of assessment is important, since online discourse in the post-truth era is affected by misinformation and by appeals to emotions and ideology.

Conclusion and Implications

This research was conducted at a moment when people’s lives were paralyzed and citizens were forced to stay at home to stop the spread of the coronavirus disease. During the lockdown, and even after it, apart from these containment measures, citizens in Spain and in many other countries had to deal with a huge amount of information about the coronavirus disease, some of it false. The outbreak of COVID-19 has been accompanied by the dissemination of inaccurate information spread at high speed, making it more difficult for the public to identify verified facts and advice from trusted sources [ World Health Organization (WHO), 2020 ]. As the world responds to the COVID-19 pandemic, many studies have been carried out to analyze the impact of the pandemic on children’s lives from diverse perspectives ( Cachón-Zagalaz et al., 2020 ), but not from the perspective of exploring students’ ability to engage in the epistemic assessment of information and disinformation on COVID-19 under a situation of social isolation. This is an unprecedented context in many respects, in which online learning replaced in-person teaching and science uncertainties were more visible than ever.

Participants engaged in the epistemic assessment of coronavirus headlines and were able to put their CT into practice, arguing why they considered them true or false by appealing to different criteria. We are aware that our results have limitations. One such limitation is that students performed the activity independently, without creating a collaborative virtual environment, understood by the authors as one of the e-learning strategies that better promotes CT ( Puig et al., 2020 ). Furthermore, despite the fact that the teacher was available to resolve any questions regarding the task, the remote and asynchronous process did not allow the activity to be guided in a way that helped the students carry out a deeper analysis. CT development and epistemic cognition depend on many factors, and teachers have an important role in achieving these goals ( Greene and Yu, 2016 ; Chinn et al., 2020 ).

The analysis of arguments allows us to identify some factors that are crucial and directly affect the critical evaluation of headlines. Some of the students did not question the use of antibiotics for coronavirus disease. This result highlights the importance of health literacy and its interdependency with CT development, as previous studies on vaccine controversies and CT show ( Puig and Ageitos, 2021 ). Although it is not the focus of this paper, the results point to the importance of making students aware of their knowledge limitations for critical assessment. A key instructional implication of this work is making e-learning activities more cooperative, as we have noted, and epistemically guided. Moreover, CT dimensions could be made explicit in instructional materials and assessments. If we want to prepare students to develop CT in order to face real/false news spread by social media, we need to engage them in deep epistemic assessment, namely in the critical analysis of the content, the source, the procedures, and the evidence behind claims, apart from other tasks. Promoting students’ awareness and vigilance regarding misinformation and disinformation online may also promote more careful and attentive information use ( Barzilai and Chinn, 2020 ); thus, activities oriented toward these goals are necessary.

Our study reinforces the need to design more CT activities that guide students in the critical assessment of the diverse aspects behind controversial news, as a way to fight the rise of disinformation and to develop sound knowledge when dealing with SSIs. Students’ epistemological views can influence their performance in argumentation; thus, if the uncertainty of knowledge is explicitly addressed in SSI instruction and epistemic activities, students’ epistemological views may develop, and such development may in turn influence their argumentation competence and, consequently, their performance in CT.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Written informed consent was obtained from the participants’ legal guardian/next of kin to participate in this study in accordance with the National Legislation and the Institutional Requirements.

Author Contributions

BP developed the conceptual framework and designed the research study. PB-A conducted the data analysis and collaborated in manuscript preparation. JP-M implemented the didactical proposal and collected the data. All authors contributed to the article and approved the submitted version.

Funding

This work was supported by the project ESPIGA, funded by the Spanish Ministry of Science, Education and Universities, partly funded by the European Regional Development Fund (ERDF). Grant code: PGC2018-096581-B-C22.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Acknowledgments

This study was carried out within the RODA research group during the lockdown in Spain due to the COVID-19 pandemic. We gratefully acknowledge all the participants for their involvement, despite such difficult circumstances.

1. https://www.e-bug.eu

Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev. Educ. Res. 78, 1102–1134. doi: 10.3102/0034654308326084

CrossRef Full Text | Google Scholar

Ageitos, N., and Puig, B. (2021). “Critical thinking to decide what to believe and what to do regarding vaccination in schools. a case study with primary pre-service teachers,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Google Scholar

Aikenhead, G. S. (1985). Collective decision making in the social context of science. Sci. Educ . 69, 453–475. doi: 10.1002/sce.3730690403

Andrews, R. (2015). “Critical thinking and/or argumentation in highr education,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies, R. Barnett, et al. (New York, NY: Palgrave Macmillan), 93–105.

Barzilai, S., and Chinn, C. A. (2017). On the goals of epistemic education: promoting apt epistemic performance. J. Learn. Sci . 27, 353–389. doi: 10.1080/10508406.2017.1392968

Barzilai, S., and Chinn, C. A. (2020). A review of educational responses to the “post-truth” condition: four lenses on “post-truth” problems. Educ. Psychol. 55, 107–119. doi: 10.1080/00461520.2020.1786388

Bencze, L., Halwany, S., and Zouda, M. (2020). “Critical and active public engagement in addressing socioscientific problems through science teacher education,” in Science Teacher Education for Responsible Citizenship , eds M. Evagorou, J. A. Nielsen, and J. Dillon (Berlin: Springer), 63–83. doi: 10.1007/978-3-030-40229-7_5

Bråten, I., Brante, E. W., and Strømsø, H. I. (2019). Teaching sourcing in upper secondary school: a comprehensive sourcing intervention with follow-up data. Read. Res. Q . 54, 481–505. doi: 10.1002/rrq.253

Britt, M. A., Rouet, J. F., Blaum, D., and Millis, K. K. (2019). A reasoned approach to dealing with fake news. Policy Insights Behav. Brain Sci . 6, 94–101. doi: 10.1177/2372732218814855

Cachón-Zagalaz, J., Sánchez-Zafra, M., Sanabrias-Moreno, D., González-Valero, G., Lara-Sánchez, A. J., and Zagalaz-Sánchez, M. L. (2020). Systematic review of the literature about the effects of the COVID-19 pandemic on the lives of school children. Front. Psychol . 11:569348. doi: 10.3389/fpsyg.2020.569348

PubMed Abstract | CrossRef Full Text | Google Scholar

Chakraborty, I., and Maity, P. (2020). COVID-19 outbreak: migration, effects on society, global environment and prevention. Sci. Total Environ . 728:138882. doi: 10.1016/j.scitotenv.2020.138882

Chinn, C., and Rinehart, R. W. (2016). “Epistemic cognition and philosophy: developing a new framework for epistemic cognition,” in Handbook of Epistemic Cognition , eds J. A. Greene, W. A. Sandoval, and I. Braten (New York, NY: Routledge), 460–478.

Chinn, C. A., Barzilai, S., and Duncan, R. G. (2020). Disagreeing about how to know. the instructional value of explorations into knowing. Educ. Psychol . 55, 167–180. doi: 10.1080/00461520.2020.1786387

Chinn, C. A., Duncan, R. G., and Rinehart, R. (2018). “Epistemic design: design to promote transferable epistemic growth in the PRACCIS project,” in Promoting Spontaneous Use of Learning and Reasoning Strategies. Theory, Research and Practice for Effective Transfer , eds E. Manalo, Y. Uesaka, and C. A. Chinn (Abingdon: Routledge), 243–259.

Davies, M., and Barnett, R. (2015). The Palgrave Handbook of Critical Thinking in Higher Education . London: Palgrave MacMillan. doi: 10.1057/9781137378057

Dillon, J., and Avraamidou, L. (2020). Towards a viable response to COVID-19 from the science education community. J. Activist Sci. Technol. Educ . 11, 1–6. doi: 10.33137/jaste.v11i2.34531

Dominguez, C. (2018). A European Review on Critical Thinking Educational Practices in Higher Education Institutions. Vila Real: UTAD. Available online at: https://www.researchgate.net/publication/322725947_A_European_review_on_Critical_Thinking_educational_practices_in_Higher_Education_Institutions

Duncan, R. G., Caver, V. L., and Chinn, C. A. (2021). “The role of evidence evaluation in critical thinking,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Duschl, R. (2020). Practical reasoning and decision making in science: struggles for truth. Educ. Psychol . 3, 187–192. doi: 10.1080/00461520.2020.1784735

Eastwood, J. L., Sadler, T. D., Zeidler, D. L., Lewis, A., Amiri, L., and Applebaum, S. (2012). Contextualizing nature of science instruction in socioscientific issues. Int. J. Sci. Educ . 34, 2289–2315. doi: 10.1080/09500693.2012.667582

Ennis, R. (2018). Critical thinking across the curriculum. Topoi 37, 165–184. doi: 10.1007/s11245-016-9401-4

European Council of the European Union (2020). Fighting Disinformation . Available online at: https://www.consilium.europa.eu/en/policies/coronavirus/fighting-disinformation/

Evagorou, M. (2020). “Introduction: socio-scientific issues as promoting responsible citizenship and the relevance of science,” in Science Teacher Education for Responsible Citizenship , eds M. Evagorou, J. A. Nielsen, and J. Dillon (Berlin: Springer), 1–11. doi: 10.1007/978-3-030-40229-7_1

Evagorou, M., Jimenez-Aleixandre, M. P., and Osborne, J. (2012). ‘Should we kill the grey squirrels?’ a study exploring students’ justifications and decision-making. Int. J. Sci. Educ . 34, 401–428. doi: 10.1080/09500693.2011.619211

Facione, P. A. (1990). Critical Thinking: a Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Fullerton, CA: California State University.

Feinstein, W. N., and Waddington, D. I. (2020). Individual truth judgments or purposeful, collective sensemaking? rethinking science education’s response to the post-truth era. Educ. Psychol . 55, 155–166. doi: 10.1080/00461520.2020.1780130

Giri, V., and Paily, M. U. (2020). Effect of scientific argumentation on the development of critical thinking. Sci. Educ . 29, 673–690. doi: 10.1007/s11191-020-00120-y

Greene, J. A., and Yu, S. B. (2016). Educating critical thinkers: the role of epistemic cognition. Policy Insights Behav. Brain Sci . 3, 45–53. doi: 10.1177/2372732215622223

Guterres, A (2020). Secretary-General Remarks on COVID-19: A Call for Solidarity . Available at: https://www.un.org/sites/un2.un.org/files/sg_remarks_on_covid-19_english_19_march_2020.pdf (accessed March 19, 2020).

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains. dispositions, skills, structure training, and metacognitive monitoring. Am. Psychol . 53, 449–455. doi: 10.1037/0003-066x.53.4.449

Jiménez Aleixandre, M. P., and Erduran, S. (2008). “Argumentation in science education: an overview,” in Argumentation in Science Education: Perspectives from Classroom-Based Research , eds S. Erduran and M. P. Jiménez Aleixandre (Dordrecht: Springer), 3–27. doi: 10.1007/978-1-4020-6670-2_1

Jiménez-Aleixandre, M. P., and Puig, B. (2021). “Educating critical citizens to face post-truth: the time is now,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Jones, A. (2015). “A disciplined approach to CT,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies, R. Barnett, et al. (New York, NY: Palgrave Macmillan), 93–105.

Kelly, G. J., and Licona, P. (2018). “Epistemic practices and science education,” in History, Philosophy and Science Teaching , ed. M. R. Matthews (Dordrecht: Springer), 139–165. doi: 10.1007/978-3-319-62616-1_5

Kuhn, D. (2019). Critical thinking as discourse. Hum. Dev . 62, 146–164. doi: 10.1159/000500171

Niu, L., Behar-Horenstein, L. S., and Garvan, C. W. (2013). Do instructional interventions influence college students’ critical thinking skills? a meta-analysis. Educ. Res. Rev . 9, 114–128. doi: 10.1016/j.edurev.2012.12.002

Pithers, R. T., and Soden, R. (2000). Critical thinking in education: a review. Educ. Res . 42, 237–249.

Puig, B., and Ageitos, N. (2021). “Critical thinking to decide what to believe and what to do regarding vaccination in schools. a case study with primary pre-service teachers,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Puig, B., Blanco Anaya, P., and Bargiela, I. M. (2020). “A systematic review on e-learning environments for promoting critical thinking in higher education,” in Handbook of Research in Educational Communications and Technology , eds M. J. Bishop, E. Boling, J. Elen, and V. Svihla (Cham: Springer), 345–362. doi: 10.1007/978-3-030-36119-8_15

Puig, B., Blanco Anaya, P., Crujeiras Pérez, B., and Pérez Maceira, J. (2016). Ideas, emociones y argumentos del profesorado en formación acerca del virus del Ébola. Indagatio Didactica 8, 764–776.

Sinatra, G. M., and Lombardi, D. (2020). Evaluating sources of scientific evidence and claims in the post-truth era may require reappraising plausibility judgments. Educ. Psychol . 55, 120–131. doi: 10.1080/00461520.2020.1730181

Sosa, E. (2015). Judgment and Agency. Oxford: Oxford University Press.

Sperber, D., Clement, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic vigilance. Mind Lang . 25, 359–393. doi: 10.1111/j.1468-0017.2010.01394.x

Thomas, K., and Lok, B. (2015). “Teaching critical thinking: an operational framework,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies, R. Barnett, et al. (New York, NY: Palgrave Macmillan), 93–105. doi: 10.1057/9781137378057_6

van Gelder, T. (2005). Teaching critical thinking. some lessons from cognitive science. Coll. Teach . 53, 41–48. doi: 10.3200/CTCH.53.1.41-48

Vincent-Lacrin, S., González-Sancho, C., Bouckaert, M., de Luca, F., Fernández-Barrera, M., Jacotin, G., et al. (2019). Fostering Students’ Creativity and Critical Thinking: What it Means in School, Educational Research and Innovation. Paris: OED Publishing.

World Health Organization (WHO) (2020). Managing the COVID-19 Infodemic: Promoting Healthy Behaviours and Mitigating the Harm from Misinformation and Disinformation. Available online at: https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation

Keywords : critical thinking, argumentation, socio-scientific issues, COVID-19 disease, fake news, epistemic assessment, secondary education

Citation: Puig B, Blanco-Anaya P and Pérez-Maceira JJ (2021) “Fake News” or Real Science? Critical Thinking to Assess Information on COVID-19. Front. Educ. 6:646909. doi: 10.3389/feduc.2021.646909

Received: 28 December 2020; Accepted: 09 March 2021; Published: 03 May 2021.


Copyright © 2021 Puig, Blanco-Anaya and Pérez-Maceira. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Blanca Puig, [email protected]

This article is part of the Research Topic

Science Education for Citizenship through Socio-Scientific issues

The UNESCO Courier

Developing a critical mind against fake news

By Divina Frau-Meigs

Media and Information Literacy (MIL) is often called to the rescue these days, as the media is threatened on all sides, in totalitarian and democratic regimes alike. The alert was sounded in France on 7 January 2015, when the French satirical magazine, Charlie Hebdo , was attacked. It was an attack on one of the oldest forms of media in the world – caricature.

At the time, I was Director of the Centre pour l’éducation aux médias et à l’information (CLEMI), France’s centre for media and information literacy. We had to prepare students for their return to the classroom the day after the attack, and meet the needs of teachers and parents. We proceeded as we did after all major catastrophes – we searched our archives for educational fact sheets on caricature and propaganda, and posted media resources online (reference websites, a press review, a series of headlines). We also released a previously unpublished interview of Charb, which CLEMI had conducted in 2013, titled “Can we laugh at everything?”. The cartoonist and journalist, whose real name was Stéphane Charbonnier, was murdered in the attack.

This crisis situation showed the strengths of MIL, but also its limitations. We were well-prepared to respond in terms of resources, but we did not anticipate the impact of social media.

Like pre-digital media, MIL must take a leap forward and include in its concerns what data does to the media – it pushes information to the fore through the tuning of algorithms linked to people’s search histories. It can enclose people in a “filter bubble”, reinforcing the confirmation biases that support preconceived ideas, and reduce the diversity and pluralism of ideas by monetizing content (clicks and views). It invades privacy and threatens fundamental freedoms by using digital footprints for purposes beyond the user’s control.

The latest crisis stemming from fake news – a blend of rumour, propaganda and conspiracy theory – has shaken up MIL. Fake news is even stronger than disinformation, which is a toxic but generally discernible mixture of truth and lies. Fake news is a phenomenon that falls into the category of disinformation, but its malicious intent is unprecedented, because information technology makes it trans-border and trans-media, and therefore viral.

Media and Information Literacy must imperatively take into account the digital transformation, which has moved from the “blue continent” to the “dark continent”. In other words, it has gone from surfing, babbling and chatting on platforms controlled by the GAFAM (an acronym for Google, Apple, Facebook, Amazon, Microsoft), to noxious data mining for the purpose of massive manipulation and destabilization.

It is in this respect that the decoding of online propaganda is complex, because it is a question of deciphering a form of disruptive ideology, which is technologically innovative, but paradoxically represents a conservative global revolution − designed to create chaos in existing political systems rather than proposing a system of progressive political thought.

The return of gossip

This is why MIL is obliged to rethink the media and the political and ethical foundations that legitimize it. The role of social media needs to be revisited, as do the exchanges that take place on it. The growth of digital media, which transforms old audiences into new communities of sharing and interpretation, also needs to be taken into account. The renewed tendency to gossip manifested by social media is not insignificant and should not be treated with contempt. A conversation in undertones that conveys a jumble of rumours, half-truths and hearsay, gossip makes what is private, public. It places authenticity above a truth that is perceived as fabricated by elites, far from daily and local concerns.

Social media, then, conveys news where truth is uncertain, and where falsehoods may be used to arrive at the truth, or to show that the truth is not all that clear-cut. Hence the temptation to categorize social media as “post-truth”. But this stance reduces its scope and refuses to see in it the quest for a different truth, when the supposedly gold-standard systems of information go bankrupt. Social media centres once again on the eternal journalistic battle between objective facts and opinion-based commentary, which is played out in these models of influence.

In the information-communication sciences, gossip falls within the category of social bonding. It fulfills essential cognitive functions: monitoring the environment, providing help in decision-making by sharing news, aligning a given situation with the values of the group, etc. These functions have traditionally legitimized the importance of the media. But the media is now perceived as deficient and biased – this is symptomized by the reliance on online gossip, relayed by social media. The blame falls less on social media than on those who are responsible for public debate in real life.

In destabilized political situations all around the world, social media is restoring meaning to the regulatory role of social narrative. It highlights the violations of social norms, especially when political institutions boast of transparency, because secrets are no longer safe. Set against newspapers that toe party lines, social media is disrupting the norms of objectivity, which has become fossilized by requiring the presentation of one opinion for and one against. The public shows distrust of the “veracity” of this polarized discourse and is seduced by the strategy of authenticity. It establishes a close relationship of trust with the community of members that now constitutes the audience, and aims to involve them in debates, while basing itself on the principle of transparency. Thus social media pits the ethics of authenticity against the ethics of objectivity.

Explorer, analyst and creator

Social media and fake news consequently make up a textbook case for MIL, which calls upon its fundamental competence − critical thinking. But this critical thinking must have an understanding of the added value of the digital: participation, contribution, transparency and accountability, of course, but also disinformation and the interplay of influence. The critical mind can be exercised and trained, and can also act as a form of resistance to propaganda and conspiracy theories. Young people must be put in a position of responsibility while being protected by the adults around them: they can be prompted to call into question their use of social media and to take into account criticism of the consequences of their practices. We must also trust their sense of ethics, once it is called upon. In my Massive Open Online Course on Media Education – the MOOC DIY MIL, which received the 2016 UNESCO Global MIL Award – I offer students three critical roles: explorer, analyst and creator. The explorer gets to know the media and data; the analyst applies concepts such as source verification, fact-checking and respect for privacy; the creator tries his/her hand at producing his/her own content, sees the consequences of his/her choices and makes decisions about distribution.

The MOOC has given birth to projects such as “ Citoyen journaliste sur Twitter ” (citizen journalist on Twitter) and “ HoaxBuster ”, which counters conspiracy theories. In all cases, the point is to ensure that young people acquire the critical thinking reflexes of MIL, so that they can avoid the traps of hate speech, involuntary internet traces and fake news. Other initiatives exist, including some led by UNESCO, which has founded the Global Alliance of Partners on MIL (GAPMIL) −  MIL CLICKS is a recent project to take ownership of MIL via social media.


Scaling up MIL

It is also important that MIL exercises critical thinking against the media itself. It turns out that the top press organizations are among the biggest influencers and the ones who tend to push rumours, on Twitter for example, before they are confirmed. The fake news that circulates on Facebook, the first of the social media to spread it, draws its grain of truth from the fact that news professionals are overly responsive to the pressure of the scoop, transmitted before it is checked, in the same manner as the amateurs. And the denials do not generate as much buzz as the rumours!

It is clear that challenges still exist to significantly scaling up MIL. Decision makers need convincing that trainers must be trained, teachers and journalists alike. My research at the Université Sorbonne Nouvelle, within the framework of the TRANSLIT project of the Agence Nationale de la Recherche and the UNESCO Chair in “Savoir-devenir in sustainable digital development”, consists of comparing public policies in Europe. It shows that many resources and training opportunities exist on the ground, provided by organizations or teachers on their own initiative rather than sponsored by universities. It points, however, to a lag at the public policy level, despite the inclusion of MIL in many national educational programmes. There are few interministerial mechanisms, little or no co-regulation, and little or no multi-stakeholder coordination. The governance of MIL emerges as composite, with three models existing in different countries: development, delegation, or… disengagement (D. Frau-Meigs et al., 2017).

An ethical leap

The good news is that journalists are becoming increasingly aware, revising their ethics and realizing the value of MIL. Their ethical leap can help teachers to reposition MIL and provide valid resources to bolster resistance in favour of the integrity of data and media. Actions that are re-establishing the value of in-depth investigation are already taking shape − using data journalism, which reveals information that cannot be obtained otherwise.

Scandals such as the colossal leak of confidential documents known as the Panama Papers have helped moralize political life and restore confidence in the press. Other actions are aimed specifically at fighting fake news using digital means. These include AFP Correspondent, the Agence France-Presse blog (which reveals what happens backstage at a large news network); Décodex, featured in the French newspaper Le Monde (which lists sites according to their unreliability); Google's RevEye (which checks whether an image is genuine in three clicks); and Conspi Hunter on Spicee, the online TV reports and documentaries platform (which debunks conspiracy theories).

In order to be deployed fully and to create an educated citizenship, MIL’s critical thinking must also be applied to the geo-economy of social media. The GAFAM digital platforms, all under California law, have long refused to be classified as media companies, to avoid all social responsibility and to evade any related public-service obligations. But algorithmic monitoring has revealed the ability of GAFAM to exercise editorial control over content that is worth monetizing. In doing so, these organizations define the truth, whether real or ethical.

The GAFAM mega-media have so far played the card of self-regulation: they make their own rules, they decide to remove sites or accounts suspected of conveying fake news, with no accountability for themselves. But they cannot resist the need for a responsible model for long – it will probably be a hybrid between a “common carrier” and “public trustee”, if they want to preserve the trust of their online communities. The communities could also organize themselves, and even circumvent them, to co-regulate the news with journalists, as is the case with Décodex. The option of co-designing an algorithm that would have journalistic ethics and fundamental freedoms built into its DNA is undoubtedly one of the alternatives to come, according to digital logic!


Divina Frau-Meigs (France) is a professor of information and communication sciences at the Université Sorbonne Nouvelle, and holder of the UNESCO Chair “Savoir-devenir in sustainable digital development”. The author of several books, she has just published  Public Policies in Media and Information Literacy in Europe: Cross-Country Comparisons , which she has edited along with I. Velez and J. Flores Michel (London, Routledge, 2017).




  • Review Article
  • Open access
  • Published: 12 June 2023

Distractions, analytical thinking and falling for fake news: A survey of psychological factors

  • Adrian Kwek (ORCID: orcid.org/0000-0002-9405-0601),
  • Luke Peh,
  • Josef Tan &
  • Jin Xing Lee

Humanities and Social Sciences Communications, volume 10, Article number: 319 (2023)


  • Cultural and media studies
  • Science, technology and society

Analytical thinking safeguards us against believing or spreading fake news. In various forms, this common assumption has been reported, investigated, or implemented in fake news education programs. Some have associated this assumption with the inverse claim, that distractions from analytical thinking may render us vulnerable to believing or spreading fake news. This paper surveys the research done between 2016 and 2022 on psychological factors influencing one’s susceptibility to believing or spreading fake news, considers which of the psychological factors are plausible distractors to one’s exercise of analytical thinking, and discusses some implications of considering them as distractors to analytical thinking. From these, the paper draws five conclusions: (1) It is not analytical thinking per se, but analytical thinking directed to evaluating the truth that safeguards us from believing or spreading fake news. (2) Psychological factors can distract us from exercising analytical thinking, and they can also distract us in exercising analytical thinking. (3) Whether a psychological factor functions as a distractor from analytical thinking or in analytical thinking may depend on contextual factors. (4) Measurements of analytical thinking may not indicate vulnerability to believing or spreading fake news. (5) The relevance of motivated reasoning to our tendency to believe fake news should not yet be dismissed. These findings may be useful to guide future research in the intersection of analytical thinking and susceptibility to believing or spreading fake news.

Similar content being viewed by others

critical thinking in fake news

Persistent interaction patterns across social media platforms and over time

Michele Avalle, Niccolò Di Marco, … Walter Quattrociocchi

critical thinking in fake news

Adults who microdose psychedelics report health related motivations and lower levels of anxiety and depression compared to non-microdosers

Joseph M. Rootman, Pamela Kryskow, … Zach Walsh

critical thinking in fake news

Negativity drives online news consumption

Claire E. Robertson, Nicolas Pröllochs, … Stefan Feuerriegel


Fake news has deleterious effects on society. The effects range from destabilizing society by cleaving racial and religious fault lines (Grambo, 2019), to influencing national elections (Allcott and Gentzkow, 2017), to derailing public health policy implementations (Waszak et al., 2018). Since the gradual appearance and entrenchment of the term “fake news” in the popular lexicon in 2016, much research has been done on the psychological and environmental determinants of people’s responses to fake news, as well as the effectiveness of interventions designed to reduce the deleterious effects of fake news on society. This review begins with Pennycook and Rand’s (2019) observation that people fall for fake news because they are distracted from thinking analytically. After elaborating on Pennycook and Rand’s observation and orienting it to the sharing of fake news, we conduct a survey of the relevant literature from 2016 to 2022 about possible psychological distractors that cause us to fall for fake news.

Disinformation is information that is (i) false, (ii) communicated as true, and (iii) intentionally communicated as true in order to influence people’s beliefs or behavior (see Duffy et al., 2020 ; McGonagle, 2017 ). Item (iii) distinguishes disinformation as a type of misinformation. Misinformation is information that is false and communicated as true but without the intention for the impression of its truth to influence people’s beliefs or behavior (see Ireton and Posetti, 2018 ; Wardle and Derakhshan, 2017 ). False information that is published as news due to sloppy journalism is an example of misinformation. Disinformation includes intentionally false propaganda, intentionally false advertising, and fake news. Fake news is distinguished from other types of disinformation by being designed to mislead people that the content is news, for example, by mimicking the visual format of reputable news websites.

There are at least three senses of the term “falling for fake news”. When one falls for fake news, one believes the disinformation, cannot distinguish it from genuine news, and/or retransmits it. We take Pennycook and Rand’s ( 2019 ) observation that people fall for fake news because they are distracted from thinking analytically as our starting point. Being saved from falling for misinformation of any kind by thinking carefully about the information is a common experience that many people have. This experience lends intuitive support to the assumption that we tend to fall for misinformation when we do not think carefully enough about the information.

For many studies, one’s propensity and competency in “thinking carefully” about information is indicated by the completion of analytical thinking scales. Pennycook and Rand ( 2018 , 2019 ), Nurse et al. ( 2022 ), and Pehlivanoglu et al. ( 2021 , 2022 ) use the cognitive reflection test (CRT). The CRT measures one’s propensity to engage in slow and careful thinking. It presents the subject with a question that admits to a quick but wrong answer and a correct answer that requires slow and careful thinking. Other tests indicating one’s propensity to think carefully about a subject matter include tests of conscientiousness (Buchanan, 2020 ; Lawson and Kakkar, 2022 ; Li et al., 2022 ) and tests of argumentative ability (Lantian et al., 2021 ).

While there have been literature reviews about psychological factors that render us susceptible to fake news, we have not found any that focus on the relation between the factors and analytical thinking directly. Due to the large number of disinformation-susceptibility studies investigating psychological factors and analytical thinking, an analytical review of such studies can help us to understand the extent to which analytical thinking can “save” us from falling for fake news.

Research question

What psychological distractions to analytical thinking mediate the belief in and/or retransmission of fake news?


Taking Pennycook and Rand’s ( 2019 ) paper as the seed article for our project, we identified key terms from it and its reference list by relevance to our research question, frequency of usage in journal article titles and abstracts, and terms with similar meanings to these terms that also frequently appear in titles and abstracts. We conducted a critical review (Grant et al., 2009 ) by searching for relevant literature on our university e-journal database and Google Scholar from 2016 onwards with the following 9 search strings:

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*) AND (reflect* OR think* OR reflex* OR cogni* OR reason* OR motivated OR bullshit* OR profound)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (reflect* OR think* OR reflex* OR cogni* OR reason* OR motivated OR bullshit* OR profound)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*) AND (heuristic* OR familiar* OR source OR credib* OR “confirmation bias”)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (heuristic* OR familiar* OR source OR credib* OR “confirmation bias”)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (signal* OR reputation OR identity OR virtue OR overclaim*) AND (prosocial OR pro-social OR moral* OR outrage* OR punish* OR interesting* OR entertain* OR gossip* OR rumor)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (signal* OR reputation OR identity OR virtue OR overclaim*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (prosocial OR pro-social OR moral* OR outrage* OR punish* OR interesting* OR entertain* OR gossip* OR rumor)
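The nine strings share a common core and differ only in which optional term groups are conjoined. A sketch of how they can be generated programmatically (the group names are ours, not the authors’; curly quotes are normalized to straight quotes):

```python
# Term groups underlying the nine search strings (group names are ours).
groups = {
    "spread":    ["spread", "disseminat*"],
    "fakenews":  ['"fake news"', "misinform*", "disinform*"],
    "discern":   ["belie*", "identif*", "discern*", "assess*", "rat*", "evaluat*"],
    "analytic":  ["reflect*", "think*", "reflex*", "cogni*", "reason*",
                  "motivated", "bullshit*", "profound"],
    "heuristic": ["heuristic*", "familiar*", "source", "credib*",
                  '"confirmation bias"'],
    "signal":    ["signal*", "reputation", "identity", "virtue", "overclaim*"],
    "prosocial": ["prosocial", "pro-social", "moral*", "outrage*", "punish*",
                  "interesting*", "entertain*", "gossip*", "rumor"],
}

def clause(name: str) -> str:
    """One parenthesized OR-clause for a term group."""
    return "(" + " OR ".join(groups[name]) + ")"

def query(*names: str) -> str:
    """Every search string starts from the two core clauses."""
    return " AND ".join(clause(n) for n in ("spread", "fakenews") + names)

# The nine strings, in the order listed above.
queries = [
    query("discern"), query(),
    query("discern", "analytic"), query("analytic"),
    query("discern", "heuristic"), query("heuristic"),
    query("signal", "prosocial"), query("signal"), query("prosocial"),
]
```

This makes the design of the search visible: one core (spread terms AND fake-news terms) crossed with the optional discernment, analytic-thinking, heuristic, signaling and prosocial groups.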

We omitted duplicate search results. To focus on psychological distractors mediating the belief in and retransmission of disinformation, we further filtered the search results with the following 5 exclusion criteria and respective rationales:

In this section, we summarize the contents of empirical studies that relate to our research question. By searching Google Scholar and the journal databases that our university library has access to, and by manually filtering based on the above five criteria, we arrived at 87 relevant articles. We present our curation of these articles according to the psychological factors covered by research in the 2016–2022 period.

Emotions can influence one’s responses to fake news. Martel et al. ( 2020 ) suggest the following influences on finding fake news persuasive. First, moods associated with happiness or higher motivation generally correlate with a tendency to believe false information and correlate negatively with the capacity to tell when one is being deceived; conversely, moods associated with sadness or lower motivation generally correlate with doubt and disbelief. Second, when one is angry, one is more likely to depend on heuristic cues in deciding whether to believe information, and less likely when one is sad. Third, anxiety tends to render one more willing to entertain perspectives contrary to one’s own, while anger tends to decrease that willingness. Maldonado ( 2019 ) suggests that emotion is inextricably intertwined with reasoning, and that confirmation bias may elicit positive emotions with a physiological basis in dopamine production. Thus, certain emotions can not only render us more likely to believe fake news but also make it pleasurable to do so.

In their own research, Martel et al. ( 2020 ) found that relying on one’s emotions when processing false news information made one more likely to believe it. Horner et al. ( 2021 ) showed that heightened emotions correlate positively with finding belief-consistent fake news persuasive, with disseminating disinformation, and with withholding information incompatible with existing beliefs. The emotions of anxiety and anger play active causal roles: they cause one to fall for fake news in order to act on the information. Sadness, in contrast, is passive in the sense that it causes one to miss signs that the information is fake news and to proceed as though it were genuine.

Weimann-Saks et al. ( 2022 ) were unable to detect a mediating role played by emotions between one’s cognition of the information and one’s subsequent actions based on that cognition. The information in question was rumors, which share with fake news the quality of being unverified information communicated as true. This finding does not detract from the general tenor that dependence on emotions renders one more susceptible to believing fake news, as the subsequent actions in question pertain to spreading rather than believing the information. It does, however, indicate that the role of emotions in believing fake news and their role in spreading it should be evaluated separately.

The higher the stakes involved and the greater the perceived risk, the greater the anxiety. Anxiety, in turn, causes one to err on the side of treating information as true and disseminating it as such, even when one is not sure. This phenomenon was widespread during the pandemic in the spreading of Covid-19 disinformation about ways to detect, prevent or treat the disease by well-meaning family and friends for whom the stakes could be the death of a loved one. Oh and Lee ( 2019 ) did a relatively early study linking health anxiety, together with health literacy and one’s perception of the significance of the information, to checking the veracity of the information and spreading it. Laato et al. ( 2020 ) showed that one’s perception of the danger posed by Covid-19, together with one’s assessment of susceptibility, contributed to one’s propensity to share information without verification. These studies suggest that the anxiety caused by the fear of death can lead to the spreading of fake news. Sun et al. ( 2020 ) showed that the elderly are more apt to retransmit unverified information the more they believe it and the more anxious they are about it. Su ( 2021 ) showed that, when anxiety was taken into account as a mediator, the positive relationship between social media use and misinformation beliefs increased in significance. Kim and Kim ( 2020 ) showed that perceived risk and stigma increase belief in false information about Covid-19. Li et al. ( 2022 ) showed that such propensities are compounded by personality traits like extroversion, emotional instability and conscientiousness. Li et al. ( 2022 ) also showed that people were more persuaded by Covid-19 misinformation and more likely to share it when their perception of risk from Covid-19 deaths increased and they experienced more intense negative emotions. Ahmed ( 2022 ) presents a different type of anxiety, the fear of missing out (FOMO), showing that FOMO correlates positively with the tendency to retransmit deepfakes, a tendency enhanced by low cognitive ability or increased social media use.

Chuai and Zhao ( 2022 ) showed that increased anger associated with a piece of fake news correlates positively with its virality, and hypothesize that the anger incentivizes retransmission of the information. Deng and Chau ( 2021 ) showed that consumers of online news become skeptical of content laced with angry expressions. The locus of anger therefore matters: to spread, the information should evoke anger in the reader rather than contain angry language. Han et al. ( 2020 ) showed that anger increased one’s propensity to disseminate disinformation by causing one to find the disinformation scientifically persuasive. Bago et al. ( 2022 ) showed that anger correlates with decreased truth discernment of headlines, and that truth discernment improved when subjects actively dampened their emotions. Moral outrage and the anger it generates have also been shown to make people believe disinformation. Ali et al. ( 2022a ) studied both fear and anger: they found that fear tended to make people who are skeptical of vaccines share false anti-vaccine information, while anger tended to make people who neither reject nor endorse vaccines share false anti-vaccine information.

Repeated exposure

Repeated exposure to fake news may cause us to believe or retransmit the information without thinking analytically about its truth. Pennycook et al. ( 2018 ) found that even one encounter with disinformation can cause subjects to believe that the information is accurate; they hypothesize that this is due to the fluency of subsequent processing after the initial exposure. The effect persists even when the information is flagged as controversial by debunking services. In a study of how visual images affect people’s accuracy judgments, Smelter and Calvillo ( 2020 ) corroborated Pennycook et al.’s finding that repeated exposure increases people’s propensity to judge false information as accurate. Repeated exposure to disinformation can also increase people’s propensity to retransmit it. Effron and Raj ( 2020 ) found that subjects considered it less ethically wrong to retransmit headlines labeled as false when they had been exposed to them up to four times, even when they did not believe the headlines. In Nadarevic et al.’s ( 2020 ) investigation of the joint effect of several factors on accuracy evaluations, source credibility together with repeated exposure was found to have a robust effect. Swire et al. ( 2017 ) found that the familiarity created by repeated exposure contributes to the continued influence effect, whereby subjects believe disinformation even after being informed of its correction. They found the elderly to be more susceptible, and found that explanations and attention to the facts work to dispel the effect.

Altruism is the desire to benefit one’s community, which can be as wide as humanity or as parochial as one’s in-group, without seeking recompense for oneself. It has been identified as one of the most important reasons for retransmitting fake news (Balakrishnan et al., 2021 ; Apuke and Omar, 2021 ). Anxiety can be associated with altruism when one’s desire to benefit others stems from concern about their wellbeing. Studies relevant to altruism as a contributor to susceptibility to disinformation are of two types. The first type studies the desire to benefit others as a factor. Balakrishnan et al. ( 2021 ) found, in a Malaysian sample, that altruism was a significant factor in accounting for the retransmission of disinformation. The second type studies collectivism as a factor. Duffy and Tan ( 2022 ) found that, when unsure whether news information is true or false, subjects would share it because sharing contributes to the cohesiveness of the group, just as rumor does. Sun et al. ( 2020 ) and Weimann-Saks et al. ( 2022 ) also found similarities between the retransmission of disinformation and of rumors. Comparing Chinese with American collectivism scores, Lin et al. ( 2022 ) found that Chinese subjects with higher collectivism scores tended to interpret arbitrarily produced, unclear messages as communicating significant information, and that even fleeting experiences of collectivism caused subjects to find meaning in such messages.

Identity protection or enhancement

Factors pertaining to the protection or display of one’s identity can contribute to susceptibility to disinformation. Pennycook and Rand ( 2020 ) found that people who “overclaim”, that is, who signal that they know more than they actually do, are more susceptible to believing disinformation. Islam et al. ( 2020 ) found, among other factors, that self-promotion correlates positively with retransmitting information without checking its veracity. Littrell et al. ( 2021 ) found that people who indulge in “persuasive bullshitting”, that is, who produce untruths aiming to impress or convince an audience, are more susceptible to believing disinformation. Pereira et al. ( 2021 ) studied the effect of subjects’ interest in protecting their political identity on their credence in news information and their tendency to share it, and found a positive correlation. This study enables us to link confirmation bias or motivated reasoning with identity as a source of susceptibility to disinformation. Druckman et al. ( 2021 ) found that subjects who belong to minority ethnic groups, who are fervently religious, or who have strong political allegiances are more likely to hold false beliefs. However, Islam et al. ( 2020 ) had earlier found that being religious is associated with a decrease in the retransmission of unverified content.

Confirmation bias

There are numerous studies on the effect of confirmation bias on people’s propensity to fall for fake news. In a study of confirmation bias in finding information about climate change persuasive and effective, Zhou and Shen ( 2022 ) established strong positive correlations. Bauer and Clemm von Hohenberg ( 2021 ) found that subjects who were supplied with false information that aligned with their beliefs were more likely to trust subsequent news information from the same source. Horner et al. ( 2021 ), in a study on emotions driving credence in false headlines and their dissemination, found that subjects had a greater tendency to believe information that supports beliefs they already hold. Tandoc et al. ( 2021 ) discovered an association between people’s credence in disinformation and their political affiliation. The link between political affiliation and falling for fake news was corroborated by Pereira et al. ( 2021 ), who narrowed it to confirmation bias specifically, defined as a preference for news that dovetails with prior stereotypical beliefs. Michael and Breaux ( 2021 ) explored the converse relation, between political affiliation and skepticism: people tend to be skeptical of, and to regard as “fake news”, information that is incompatible with their political stances. This finding is corroborated by Baxter et al. ( 2019 ) in Scotland, Bozdağ and Koçer ( 2022 ) in Turkey, Hameleers and Brosius ( 2022 ) in the Netherlands, and Tsang ( 2021 ) in Hong Kong. Michael and Sanson ( 2021 ) studied confirmation bias as a joint effect of political affiliation and choice of news sources, which in turn affect people’s ability to distinguish real from fake news. In a similar vein, Traberg and van der Linden ( 2022 ) found that political affiliation, mediated by perceptions of source credibility, influences people to judge information aligned with their political orientations as more accurate. Comparing conspiracy mindset with political affiliation, Faragó et al. ( 2020 ) found that the effect of political affiliation, mediated by source credibility, in leading people to believe “wishful thinking” political disinformation was greater than that of conspiracy mindset. Regarding one’s propensity to believe and to share disinformation specifically, Turel and Osatuyi ( 2021 ) found that political affiliation correlates positively with both, mediated by factors like the political affiliation of one’s social network. Pearson and Knobloch-Westerwick ( 2019 ) found less confirmation bias in the consumption of printed than of online news. Some studies cast doubt on the strong association between confirmation bias and falling for fake news. Baptista et al. ( 2021 ) found that participants with conservative views were more likely to believe and spread disinformation regardless of its political orientation. Corroborating this finding, Calvillo et al. ( 2020 ) found that subjects with a conservative political orientation tend to be less accurate in distinguishing real from fake news.

Source credibility

The perceived authoritativeness pertaining to source credibility has received much attention (see Buchanan and Benson, 2019 ; Buchanan, 2020 ; Pehlivanoglu et al., 2021 ; Folkvord et al., 2022 ). Sterrett et al. ( 2019 ) distinguish between two factors that lead subjects to find social media news credible: the credibility of the individual who retransmits the information and the credibility of the news platform. Nadarevic et al. ( 2020 ) investigated the interaction between perceived source credibility and other factors in truth discrimination. Faragó et al. ( 2020 ) found that perceived source credibility is an important mediator between political affiliation and belief in disinformation. The type of information source that one holds to be authoritative can indicate one’s propensity to fall for fake news. Bonafé-Pontes et al. ( 2021 ) found, for Brazilian participants, that those who trusted the credibility of social media news tended to be worse at truth discrimination, while those who trusted the WHO and traditional media (newspapers, radio and television) tended to be better at it. However, Tsang ( 2021 ) did not find any association between types of sources and the perception that a piece of “news” is false. Furthermore, Hameleers et al. ( 2022 ) found that the people who tend to identify fake news are those who tend to distrust mainstream news platforms and to engage more with non-mainstream ones. Zimmermann and Kohring ( 2020 ) bridge the gap between distrust in news and belief in disinformation. Xiao et al. ( 2021 ) found that trust in social media news is an important mediator between the consumption of social media news and conspiracy ideation. De Coninck et al.’s ( 2021 ) and Melki et al.’s ( 2021 ) results corroborate others’ findings as follows: engagement with traditional media correlates negatively with belief in disinformation and conspiracy theories, while engagement with political personalities, social media and personal social networks correlates positively with both. However, De Coninck et al. ( 2021 ) also found that engagement with healthcare specialists correlates negatively with belief in conspiracy theories only, and does not reduce credence in disinformation. In a similar vein, Lobato et al. ( 2020 ) found that low trust in mainstream medicine was among the factors associated with vulnerability to health disinformation. Hopp et al. ( 2020 ) found that people who have extreme beliefs, high distrust of mainstream media and high social distrust are also more likely to retransmit fake news. Finally, Laato et al. ( 2020 ) found that subjects’ propensity to retransmit information without checking its veracity is associated with their faith in information on the internet, coupled with having to cope with information overload.

Social endorsement

The perception of social endorsement constitutes a second basis for people’s trust in the veracity of a message or willingness to share it. Studies of perceived social endorsement can be subdivided into those where trust is based on some perceived reputational aspect of the source that is unrelated to the veracity of the message, and those where trust is based on the perceived popular reception of the message. An example of the former is a celebrity, as opposed to a scientific, source of a message about climate change. Sterrett et al. ( 2019 ) showed that social media news coming from societal elites is perceived as more credible than news coming from an established news outlet. An example of trust based on perceived popular reception is the number of “likes” or comments that a message receives. Keselman et al. ( 2021 ) found that recommendations by friends, good reviews and favorable but uncited research increase sharing proclivities. However, Buchanan ( 2020 ) found that neither source authority nor evidence of popular engagement with the message influenced sharing proclivities, and Harff et al. ( 2022 ) found that relationships with online influencers did not make subjects more believing of the influencers’ claims. Avram et al. ( 2020 ) found that subjects’ awareness of the quantity of sharing and liking that a message of dubious veracity receives contributes to the risk of believing and spreading it. Ali et al. ( 2022b ) showed that a large number of “likes” tends to increase the believability of a message; they theorized that a desire to embellish the information explains why people also tend to share such messages. Mena et al. ( 2020 ) found that the perceived trustworthiness of a source contributes to the credibility of a false message. Ren et al. ( 2021 ) offer a novel perspective: they showed that people may share messages they are skeptical about because their decision balances message veracity against message engagement by others. The prospect of getting many likes and comments factors into the decision to share a message of dubious veracity.

Conspiracy thinking

Anthony and Moulding ( 2019 ) found that a conspiratorial worldview conduces to credence in fake news and that other factors, like normlessness, relate to credence in fake news through their influence on conspiratorial thinking. In a large-scale study ( N  = 38,113), Kantorowicz-Reznichenko et al. ( 2022 ) found that subjects who rely on conspiratorial thinking tend not to change their behavior to align with public health messaging during the Covid-19 pandemic, and that people who think deliberately tend not to rely on conspiratorial thinking. The latter finding corroborates Lantian et al.’s ( 2021 ) finding that analytical thinking competency in the argumentative context correlates negatively with a conspiratorial worldview. Calvillo et al. ( 2021 ) found that, among other factors, subjects with conspiratorial worldviews also tended to believe false political headlines. Why might subjects who rely on conspiratorial thinking resist changing their behavior for public health promotion? Hughes et al. ( 2022 ) found a negative correlation between conspiracy ideation and obedience to public health instructions, because the subjects perceived a low health risk coupled with a high risk to livelihood and liberty. Lobato et al. ( 2020 ) found that people who score high on social dominance orientation tend to retransmit disinformation, especially conspiratorial beliefs. Landrum and Olshansky ( 2019 ) found that, while conspiratorial and scientific orientations both contribute significantly to credence in scientific disinformation, the contribution of a conspiratorial worldview to resisting scientific knowledge is inconclusive. Miller et al. ( 2016 ) found that certain attributes of persons better predict belief in conspiracy theories, namely being well-informed about politics and exhibiting a high level of distrust.

Motivated reasoning

“Motivated reasoning” refers to the possibly unconscious exercise of one’s reasoning capacity ostensibly to arrive at a true conclusion, but actually to serve some interest other than arriving at the truth. It has been posited as an important factor in believing or endorsing disinformation. A commonly studied form of motivated reasoning is reasoning in support of one’s political affiliations. Pennycook and Rand ( 2019 ) found that subjects’ proclivity to engage in analytical thinking explains their competence at distinguishing true from false headlines, even when the false headlines accord with their political affiliations. They conclude that it is inattentive thinking rather than motivated reasoning that contributes to vulnerability to disinformation. Ross et al. ( 2021 ) corroborated Pennycook and Rand’s results with a large sample ( N  = 1973) and also found no significant correlation between analytic thinking and a greater inclination to retransmit messages aligned with one’s political affiliations. Also corroborating Pennycook and Rand, Baptista et al. ( 2021 ) found, for a sample of Portuguese participants, that general political affiliation correlated with susceptibility to disinformation: right-wing participants tended to be more susceptible than left-wing participants, even to disinformation discordant with their political affiliation.

However, Calvillo and Smelter ( 2020 ) arrived at the opposite result. They found that subjects tended to judge headlines that conflict with their political affiliations as less accurate than headlines that align with them, and that subjects who exhibited more analytic thinking were also more biased in their accuracy judgments. Others also lend support to the role of motivated reasoning in susceptibility to fake news. Tsang ( 2021 ) found that subjects judged identical news information to be false to different degrees based on their prior positions on the content, and took this to support the view that motivated reasoning contributes to the retransmission of disinformation. Michael and Sanson ( 2021 ) showed that people rely on heuristics that take less effort than analytic thinking in judging the veracity of news: the bias in distinguishing true from false headlines is due to the bias in the credence placed on politically concordant sources rather than in the information itself. This finding is corroborated by Traberg and van der Linden ( 2022 ), and goes against earlier findings (Clayton et al., 2019 ).

Stanley et al.’s ( 2022 ) discussion gives some ways in which the cognitive capacities involved in assessing truth can also contribute to belief in disinformation, and can shed light on the mechanism linking motivated reasoning to belief in fake news. Vegetti and Mancosu ( 2020 ) found that, although people were inclined to perceive politically concordant messages as more believable, people who were well-informed politically could better distinguish true from fake news. With respect to sharing fake news, Osmundsen et al. ( 2021 ) found that political affiliation engenders emotional factors like hatred, which fuel the retransmission of disinformation. Wischnewski et al. ( 2021 ) found partial evidence that subjects are more likely to retransmit messages that accord with their own political beliefs, and attributed this to motivated reasoning.

Influential findings

From the 87 curated articles, we looked for those with the most influential findings, taking the number of citations as a proxy for influence. Our inclusion threshold is the citation count at the 3rd quartile among the curated articles, computed per publication year, because the citation count of an influential paper published recently can be expected to be lower than that of one that has been in the literature longer. On this basis, we included papers with more than 104 citations for 2019, more than 124 citations for 2020, more than 36 citations for 2021, and more than 14 citations for 2022. The increase in the threshold from 2019 to 2020 could be due to interest in pandemic-related disinformation. Our curation contained 1 paper each for 2016, 2017 and 2018, with citation counts of 493, 250 and 1001, respectively. As quartile computation was not possible within these years, we approximated an increase of 1.5 times per year over the combined 3rd-quartile threshold of 2019 and 2020, yielding thresholds of 167, 250 and 375 for 2018, 2017 and 2016, respectively. Because they met these thresholds, the 3 papers were included, giving a total of 24 papers. Appendix 2 consists of 2 tables: the first lists the papers and summarizes their methodology and findings; the second presents the papers and the factors they investigate, to display relations between factors.
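The per-year threshold rule can be sketched as follows. The citation counts here are hypothetical, for illustration only; the 3rd-quartile computation and the backwards 1.5× extrapolation for 2016–2018 mirror the procedure described above, though the authors’ exact quantile method may differ.

```python
import statistics

def yearly_thresholds(citations_by_year: dict[int, list[int]]) -> dict[int, float]:
    """Inclusion threshold per year: the 3rd quartile of citation
    counts among curated papers published that year."""
    return {
        year: statistics.quantiles(counts, n=4)[2]  # index 2 is Q3
        for year, counts in citations_by_year.items()
        if len(counts) >= 2  # quartiles need more than one paper
    }

# Hypothetical citation counts for curated papers, by year.
sample = {
    2019: [10, 40, 80, 104, 300],
    2020: [5, 60, 124, 500],
}
thresholds = yearly_thresholds(sample)

# Years with a single curated paper (2016-2018): extrapolate backwards
# by 1.5x per year from a combined 2019-2020 baseline, as described above.
baseline = statistics.quantiles(sample[2019] + sample[2020], n=4)[2]
extrapolated = {year: baseline * 1.5 ** (2019 - year) for year in (2018, 2017, 2016)}
```

A paper is then included if its citation count exceeds the threshold for its publication year.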

In this section, we identify five themes from the findings in the previous section and discuss them.

Anxiety vs. anger

According to Bodenhausen et al. ( 1994 ), feeling angry causes subjects to depend more on heuristic cues to arrive at judgments in an argumentative context, while sadness causes subjects to depend less on the same cues. Forgas and East ( 2008 ) found that a bad mood makes one more generally disbelieving, while a good mood makes one more vulnerable to deception. MacKuen et al. ( 2010 ) found that anger encourages heuristic as opposed to systematic reasoning, quickly relying on intuitive cues rather than following a logical order of thinking, while anxious information processing can cause one to entertain contrary perspectives (Martel et al., 2020 ). In general, the background literature aligns with the studies linking anger with increased susceptibility to disinformation but appears to oppose those linking anxiety with increased susceptibility. This may be because anxiety influences our susceptibility to disinformation jointly with other emotions, beliefs and personality traits, while anger affects our motivation to rely on heuristic cues in judging the truth of information more directly. This possibility is supported by studies of anxiety among a raft of other factors, like risk perception and extraversion.

While the connection between anxiety or anger and cognitive processing such as believing has gained wide attention, the connection between these emotions and the other dimension of susceptibility to fake news, retransmitting it, is underexplored. Existing studies bear on retransmission indirectly, via the fact that we are more likely to retransmit what we believe to be true (Buchanan, 2020 ; Altay et al., 2022 ), including when we are motivated by altruism to help others with what we believe to be true (Apuke and Omar, 2021 ), or believe on the basis of trust (Sterrett et al., 2019 ; Laato et al., 2020 ; Melki et al., 2021 ). However, mediators between anxiety or anger and the retransmission of fake news appear inadequately covered. Solovev and Pröllochs ( 2022 ) found that disinformation spreads more quickly when it contains numerous instances of vocabulary condemning others. This suggests that people may retransmit what they do not believe to be true in order to express their anger or to punish the perceived perpetrators (see Hartung et al., 2019 ). Osmundsen et al. ( 2021 ) found that hatred may contribute to retransmitting disinformation, which also suggests punishment as a motivation that may mediate between anger and retransmission.

Identity-protective mechanisms

The anxiety and anger that interfere with judgment may be connected to identity protection. In our literature review of identity, many of the factors studied like political affiliation, bullshitting and overclaiming are indicators of susceptibility to fake news—they correlate with one’s propensity to believe or retransmit disinformation. However, it is unclear what the mechanism linking these factors to identity is. One possibility is that, since these factors relate to one’s promotion and protection of one’s identity, some familiar emotions like anxiety (Wischnewski and Krämer, 2021 ) or anger mediate between one’s identity and one’s susceptibility to disinformation. Kahan et al. ( 2017 ) discuss identity protection. Affinity groups are groups consisting of people who share allegiance to a set of fundamental values. For Kahan, people invest a lot in preserving their positions in affinity groups and the reputation of affinity groups to which they belong. Wischnewski and Krämer ( 2021 ) linked Kahan’s thesis to anger and anxiety by postulating that, in encountering information that goes against one’s identity, people will feel angry or anxious, and the emotions will, in turn, cause them to resist the information even if it is true. Van Bavel and Pereira ( 2018 ) found that political affiliation can affect recall, tacit assessments and how we perceive. However, identity-protective mechanisms may not exhaust the space of identity as a factor influencing one’s susceptibility to fake news. One might, instead of protecting one’s identity, intend to signal some aspect of one’s identity in conveying information. This hypothesis is compatible with a range of factors influencing one’s susceptibility to spreading disinformation, for example, to signal that one is altruistic. We did not find any research on signaling as a factor in the retransmission of fake news in our curation. 
The closest was self-promotion (Islam et al., 2020), which can encompass signaling but may also stem from an intention to elicit others' approval directly, without the audience recognizing some valued attribute that the retransmission signals.

Altruism, communitarianism and emotions

Altruism is one of the factors that has received attention in studies (e.g. Apuke and Omar, 2021). Altruism is a construct consisting of the motivation to benefit others without the expectation of personal recompense. Plume and Slade (2018) document many studies that find altruism to be a motivation for disseminating content on social media. It is difficult, however, to tease apart the motivation of pure altruism, communitarian beliefs (Lin et al., 2022) and associated emotions. For example, a subject may report sharing COVID-19 disinformation on social media out of altruism when the motivator could be a communitarian mindset coupled with anxiety, or a subject could share COVID-19 disinformation not for an articulable reason to help others without seeking recompense, but in pursuit of the good feelings that come from perceiving that one has helped others. Studies on communitarian mindsets and susceptibility to the retransmission of disinformation suggest that we should be looking at community-oriented goals and emotions generated in context rather than at the general attribute of altruism per se. Given that even momentary priming of subjects to be community-oriented increased their propensity to find meaning in unclear messaging, and may increase their propensity to share such messages if the messages trigger emotions like anxiety (Lin et al., 2022), focusing on altruism as a stable trait may obscure the context-sensitive nature of the attribute.

Bias triggers vs. emotions

The studies linking greater susceptibility to disinformation with repeated exposure to the disinformation suggest that there can be non-emotional distractors from thinking analytically. The link between repeated exposure and belief is well documented in studies of the illusory truth effect: the phenomenon of finding information truer with repeated exposure to it. The phenomenon was first identified in Hasher et al.'s (1977) study, in which students were more likely to rate sentences as "true" after repeated exposure to them. Hassan and Barber (2021) survey this research in the context of disinformation. Studies on the continued influence effect show that the influence of false information persists even after it has been corrected. A recent overview of research on the effect can be found in Kan et al. (2021). The illusory truth effect and the continued influence effect could account for "falling for fake news" in the sense of explaining why subjects believe fake news. However, Swire et al. (2017) showed that repeating a myth within a correction only served to slow down belief in the correct information; they did not find evidence of myth-repeating corrections backfiring to promote belief in the myth. While the bulk of the literature pertains to believing disinformation, Effron and Raj's (2020) study suggests that repeated exposure can also affect one's motivation to retransmit disinformation, in their case by decreasing the perceived ethicality of retransmitting disinformation headlines. Stepping back, we observe that repeated exposures differ from emotions in that they may render us susceptible to disinformation without involving emotions, yet they also differ from the factors that operate within reasoning in that they do not appear to feature in reasoning about the truth. They may simply be tendencies of ours that are triggered by exposure to the world.

The “saving” function of analytical thinking

The literature on the significance of analytical thinking for susceptibility to fake news overwhelmingly relates analytical thinking to truth discernment (e.g. Martel et al., 2020; Pennycook and Rand, 2019, 2020; Lantian et al., 2021; Ross et al., 2021). From here, any link to the spreading of fake news conceivably runs through belief: we find the information true and want to inform others of the truth, or we are not stopped from spreading the information by finding it false. This suggests the assumption that analytical thinking has a "saving" function: the exercise of analytical thinking facilitates distinguishing truth from falsity and "saves" us from believing and/or spreading false information. Associated with this assumption is the idea that the better one is at analytical thinking, or the more one is disposed to employ it, the better one is at using the right heuristics to defend against the threat of fake news, for example, to correctly rely on scientific sources of information, to appropriately assess the relevance of social endorsement, and to accurately fact-check.

Pitted against these ideas is the view that analytical thinking is a double-edged sword: it may facilitate credence in and the spread of disinformation as much as it may prevent them. In this section, we consider three points that mitigate the significance of analytical thinking as a contributing factor in truth discernment and the retransmission of fake news. The first is that biases can operate within the analytical thinking of someone who does well on analytical thinking measures. The second is that someone who is good at analytical thinking, or disposed to think analytically, may also be good at or disposed to exercise motivated reasoning. The third is that good reasoning need not be directed at seeking the truth. Before that, let us take a closer look at how these factors serve as distractions in analytical thinking and at the implications of such roles.

Biases that operate in analytical thinking

Confirmation bias has been proposed to make us more effective persuaders by helping us recruit information relevant to supporting our standpoints (Mercier and Sperber, 2011), to contribute to the rigor and comprehensiveness of a group's deliberation when its members exert themselves in defending their standpoints (Smith and Wald, 2019), to facilitate group coordination by reducing intentions of group members that are incompatible with the rest of the group (Norman, 2016), and to effect social change so that the world comes to match our beliefs (Peters, 2022). Having a bias towards treating as true information that derives from the right types of sources (peer-reviewed scientific journals, mainstream news platforms, domain-relevant experts) can save us the time and effort of fact-checking every piece of information that comes our way. Similarly, ideally, user ratings or opinions that are based on a large and varied enough sample and are not affected by other biases of the raters (see de Langhe et al., 2016; Godden, 2008) should enable consumers to judge the goodness of a product or the truth of information without having to do further research. Conspiracist thinking allows us to detect true conspiracies and defend ourselves against them (van Prooijen and van Vugt, 2018).

Ideal analytical thinking proceeds by way of careful reasoning, whether conducted between interlocutors or with oneself in analytical reflection, and is aimed at uncovering the truth. Reasoning consists of giving reasons for claims, that is, offering premises in support of conclusions. However, making and evaluating arguments is cognitively demanding and time-consuming. To make thinking more efficient, we rely on mental heuristics. While they may usually serve us well, mental heuristics can also cause us to believe disinformation (see Ali and Zain-ul-abdin, 2021; Brashier and Marsh, 2020). Confirmation bias, source credibility, social endorsement and conspiracist thinking can interfere with the course of ideal analytical thinking. They can cause us to accept information as true, or to judge information as true on their basis and to use that information as reasons for our claims. They can cause us to skip steps in evaluating the reasons for a claim or to accept claims uncritically. Because these biases operate within the exercise of analytical thinking, tests for a general propensity or competence to think analytically may not detect their operation. One may be predisposed to engage in analytical thinking, or be good at the tested type of analytical thinking (for example, the traditional CRT's numerical analytical thinking), and still have one's reasons and claims subject to interference by biases. Biases that operate within analytical thinking, then, can undermine hypotheses that analytical thinking guards us against falling for fake news. Owing to the operation of biases in analytical thinking—in the selection of reasons for claims, the weight accorded to reasons and the rigor demanded of reasoning—one may fall for fake news even when one is predisposed to thinking analytically.

Inconclusiveness about motivated reasoning’s role

Pennycook and Rand (2019) argued that it is being distracted from thinking analytically, rather than motivated reasoning, that causes us to fall for fake news. Their claim is based on their finding that, regardless of the content's alignment with subjects' beliefs, subjects' competence in analytical thinking as measured by the CRT correlated positively with truth discernment. The motivated reasoning thesis would instead predict a positive correlation between high analytical thinking competency scores and truth discernment accuracy for content that is aligned with subjects' beliefs, and a negative correlation for content that is not thus aligned.

There is substantial work on analytical thinking and its effect on believing fake news that aligns with Pennycook and Rand's (2019) conclusions in undermining the hypothesis that our believing fake news is due to motivated reasoning (e.g. Pennycook and Rand, 2020; Clayton et al., 2019; Martel et al., 2020; Wischnewski and Krämer, 2021). Nevertheless, earlier work on motivated reasoning in topics like public discourse (Kahan et al., 2017) and misinformation (Kahan, 2017), together with suggestive work on confirmation bias (Zhou and Shen, 2022) and possibly opposing conclusions from recent work (Calvillo and Smelter, 2020; Tsang, 2021; Michael and Sanson, 2021; Traberg and van der Linden, 2022), urges caution in jettisoning motivated reasoning research on the believability of fake news.

There are two points standing in the way of downplaying the causal relevance of motivated reasoning to believing disinformation. First, subjects who are good at analytical thinking may engage in motivated reasoning only when identity or reputational stakes are high. This would occur in a setting where one has to defend one's position to others, rather than answer analytical questions in private and anonymously. Pennycook and Rand admit that their study could have had different results if the questions were less factual and more personal (Pennycook and Rand, 2019). Second, the CRT items, analytical questions of the "brain teaser" variety, may prime subjects to focus on exercising their analytical thinking abilities to the exclusion of the factors that typically trigger motivated reasoning.

Non-truth-directed reasoning to spread fake news

Non-truth-directed reasoning may cause individuals to retransmit fake news despite scoring well on analytical thinking measures. Reasoning need not aim at evaluating truth. Some researchers distinguish between "accuracy-oriented" and "goal-oriented" reasoning (for example, Osmundsen et al., 2021), where the former aims at the truth while the latter does not. Goal-oriented or means-ends reasoning can lead one to accept the truth of content that one is unsure of. This may have occurred in the minds of anxious individuals debating whether to believe in some potentially life-saving remedy amid the panic and confusion at the beginning of the Covid-19 pandemic. Or they may reason to the conclusion that they should retransmit the information despite being uncertain of its truth. Similar means-ends reasoning can be conducted rigorously in deciding to retransmit unverified information for the sake of eliciting social engagement (likes, comments, ratings), signaling political affiliation, conveying moral outrage, inflicting social punishment, and/or protecting one's identity or enhancing one's reputation in some other way. More perniciously, quasi-"truth-directed" reasoning can cause people to classify information that they are unsure about, or that they believe is false, as "interesting if true" (Altay et al., 2022) or "likely to become true" (Helgason and Effron, 2022). As research has shown, such classifications suffice to motivate the retransmission of the relevant information.


Conclusion

To our research question, "What psychological distractions to analytical thinking mediate the belief in and/or retransmission of fake news?", we have identified the following distractions by reviewing the research literature from 2016 to 2022: emotions (especially anxiety and anger), repeated exposure, altruism, identity protection or enhancement, confirmation bias, source credibility, social endorsement, conspiracist thinking and possibly motivated reasoning. We think that Pennycook and Rand (2019) are generally correct when they say that people fall for fake news because they are distracted from thinking. However, their view can be qualified by the following five conclusions from our discussion.

First, it is not analytical thinking per se, but analytical thinking directed at evaluating the truth, that safeguards us from believing or spreading fake news. Someone may be good at analytical thinking yet employ it in means-ends reasoning to decide to accept information that they are uncertain about as true, or to spread information that they know is false, in order to achieve their purposes. Some means-ends reasoning masquerades as truth evaluation: quasi-"truth-directed" reasoning like "interesting if true" or "likely to become true" may inform one's decision to believe the information.

Second, psychological factors can distract us from exercising analytical thinking, and they can also distract us in exercising analytical thinking. They distract us from analytical thinking when, for example, we find a statement true because we have been exposed to it many times. They distract us in analytical thinking when we favor certain premises in our reasoning because they support what we already believe. When we believe fake news that aligns with our political orientation, do we believe as a result of reasoning to confirm what we already believe, or as a result of being triggered by prior exposure? The former is a bias that distracts within analytical thinking, while the latter is a bias that distracts from analytical thinking.

Third, whether a psychological factor functions as a distractor within analytical thinking or from analytical thinking may depend on contextual factors. For example, under what conditions do we deliberate about source credibility in order to make an accuracy judgment and under what conditions do we make a snap accuracy judgment based on source credibility? The same heuristics that facilitate reasoning or are adaptive in some way can cause us to believe or retransmit disinformation. This suggests that investigating contextual factors influencing the choice of heuristics or how a heuristic is deployed is a potentially fruitful line of research.

Fourth, due to a possible mismatch between the type of analytical thinking measured by the tests that many studies on analytical thinking and disinformation employ and the type of analytical thinking that informs one's decision to believe or spread information, measurements of analytical thinking may not indicate vulnerability to believing or spreading fake news. Many disinformation studies that seek to measure analytical thinking ability rely on the CRT. Might relevant dimensions of analytical thinking ability be missed by the CRT? Dimensions pertaining to argumentation, such as assessing reasons for claims, evidence for reasons, and the effectiveness of objections and counterexamples, seem relevant to the persuasiveness of fake news. These dimensions are not well served by the CRT, and only one study (Lantian et al., 2021) in our curation investigated them.

Fifth, despite the influential Pennycook and Rand (2019) study and substantial other research aligned with its skepticism about the role of motivated reasoning in susceptibility to fake news, the relevance of motivated reasoning to our tendency to believe fake news should not yet be dismissed. The conditions under which people undertake motivated reasoning (for example, only when faced with an imminent threat to one's identity or reputation) and the priming effects of analytical tests on accuracy assessments of true and false information have not been adequately investigated. Regarding the latter, not only the CRT but any analytical test could prime subjects to treat tasks in the immediate future as analytic ones, requiring objective and careful reasoning. This could preclude the operation of possible emotional triggers of motivated reasoning like anger or anxiety.

What is the outlook for research on psychological factors affecting reasoning and disinformation? With the advent of disinformation fabricated by artificial intelligence, complete with proper-looking citations of fabricated sources, the arms race against disinformation is intensifying. The identification of psychological factors that can distract us away from reasoning or distract us within reasoning is not an end in itself. It serves the design of education and public messaging that can deflect at least the most pernicious disinformation influences. The five conclusions above correspondingly suggest the following topics of research that can contribute to such education and public messaging initiatives:

Truth-oriented values (e.g. intellectual integrity, open-mindedness, intellectual humility) and how they might serve as constraints on goal-oriented reasoning, as opposed to accuracy-oriented reasoning

Distractors away from reasoning and distractors within reasoning from among the extant psychological factors that influence our vulnerability to disinformation (especially in untangling the potentially conflicting roles of anxiety and worry)

Conditions under which a psychological factor distracts us away from reasoning and conditions under which it distracts us within reasoning

Measures of other dimensions of analytical thinking (especially argumentative reasoning) to gauge the analytical thinking factor influencing our vulnerability to disinformation

Topics concerning motivated reasoning (especially high-stakes motivated reasoning as a factor influencing our vulnerability to disinformation, the role of emotion in motivated reasoning causing us to believe/spread fake news, and conceptual work distinguishing confirmation bias from motivated reasoning)

Finally, while it is not within the ambit of our research question, we noticed that little work has been done on the relation between believing and spreading fake news. The studies that we curated understand "falling for fake news" or "vulnerability to fake news" in terms of believing it, spreading it, or both. The propensity to believe and the propensity to spread fake news are usually treated as dependent on other factors but not on one another. Yet these two propensities practically exhaust our understanding of what it is to be vulnerable to fake news. It is therefore important, if we are to tame fake news, to learn how belief in it interacts with its retransmission.

References

Ahmed S (2022) Disinformation sharing thrives with fear of missing out among low cognitive news users: a cross-national examination of intentional sharing of deep fakes. J Broadcast Electron Media 66(1):Article 1. https://doi.org/10.1080/08838151.2022.2034826


Allcott H, Gentzkow M (2017) Social media and fake news in the 2016 election. J Econ Perspect 31(2):211–236. https://doi.org/10.1257/jep.31.2.211


Ali K, Zain-ul-abdin K (2021) Post-truth propaganda: heuristic processing of political fake news on Facebook during the 2016 U.S. Presidential election. J Appl Commun Res 49(1):Article 1. https://doi.org/10.1080/00909882.2020.1847311

Ali K, Li C, Zain-ul-abdin K, Zaffar MA (2022b) Fake news on Facebook: examining the impact of heuristic cues on perceived credibility and sharing intention. Internet Res 32(1):Article 1. https://doi.org/10.1108/INTR-10-2019-0442

Ali K, Li C, Zain-ul-abdin K, Muqtadir SA (2022a) The effects of emotions, individual attitudes towards vaccination, and social endorsements on perceived fake news credibility and sharing motivations. Comput Hum Behav 134:107307. https://doi.org/10.1016/j.chb.2022.107307

Altay S, de Araujo E, Mercier H (2022) “If this account is true, it is most enormously wonderful”: interestingness-if-true and the sharing of true and false news. Digit Journalism 10(3):Article 3. https://doi.org/10.1080/21670811.2021.1941163

Anthony A, Moulding R (2019) Breaking the news: belief in fake news and conspiracist beliefs. Aust J Psychol 71(2):Article 2. https://doi.org/10.1111/ajpy.12233

Apuke OD, Omar B (2021) Fake news and covid-19: modelling the predictors of fake news sharing among social media users. Telematics Inform 56:101475. https://doi.org/10.1016/j.tele.2020.101475

Arias Maldonado M (2019) Understanding fake news: technology, affects, and the politics of the untruth. Hist Comun Soc 24(2):Article 2. https://doi.org/10.5209/hics.66298

Avram M, Micallef N, Patil S, Menczer F (2020) Exposure to social engagement metrics increases vulnerability to misinformation. Harv Kennedy Sch Misinf Rev. https://doi.org/10.37016/mr-2020-033

Bago B, Rosenzweig LR, Berinsky AJ, Rand DG (2022) Emotion may predict susceptibility to fake news but emotion regulation does not seem to help. Cogn Emotion 36(6):Article 6. https://doi.org/10.1080/02699931.2022.2090318

Balakrishnan V, Ng KS, Rahim HA (2021) To share or not to share—the underlying motives of sharing fake news amidst the Covid-19 pandemic in Malaysia. Technol Soc 66:101676. https://doi.org/10.1016/j.techsoc.2021.101676


Baptista JP, Correia E, Gradim A, Piñeiro-Naval V (2021) The influence of political ideology on fake news belief: the Portuguese case. Publications 9(2):Article 2. https://doi.org/10.3390/publications9020023

Bauer PC, Clemm von Hohenberg B (2021) Believing and sharing information by fake sources: an experiment. Political Commun 38(6):Article 6. https://doi.org/10.1080/10584609.2020.1840462

Van Bavel JJ, Pereira A (2018) The Partisan Brain: an identity-based model of political belief. Trends Cogn Sci 22(3):213–224. https://doi.org/10.1016/j.tics.2018.01.004


Baxter G, Marcella R, Walicka A (2019) Scottish citizens’ perceptions of the credibility of online political “facts” in the “fake news” era: an exploratory study. J Doc 75(5):Article 5. https://doi.org/10.1108/JD-10-2018-0161

Bodenhausen GV, Kramer GP, Süsser K (1994) Happiness and stereotypic thinking in social judgment. J Person Soc Psychol 66(4):621–632. https://doi.org/10.1037/0022-3514.66.4.621

Bonafé-Pontes A, Couto C, Kakinohana R, Travain M, Schimidt L, Pilati R (2021) Covid-19 as Infodemic: the impact of political orientation and open-mindedness on the discernment of misinformation in Whatsapp. Judgm Decision Mak 16(6):Article 6


Bozdağ Ç, Koçer S (2022) Skeptical inertia in the face of polarization: news consumption and misinformation in Turkey. Media Commun 10(2):Article 2. https://doi.org/10.17645/mac.v10i2.5057

Brashier NM, Marsh EJ (2020) Judging truth. Annu Rev Psychol 71(1):499–515. https://doi.org/10.1146/annurev-psych-010419-050807

Buchanan T (2020) Why do people spread false information online? The effects of message and viewer characteristics on self-reported likelihood of sharing social media disinformation. PLoS ONE 15(10):Article 10. https://doi.org/10.1371/journal.pone.0239666


Buchanan T, Benson V (2019) Spreading disinformation on facebook: do trust in message source, risk propensity, or personality affect the organic reach of “fake news”? Soc Media+Soc 5(4):Article 4. https://doi.org/10.1177/2056305119888654

Calvillo DP, Smelter TJ (2020) An initial accuracy focus reduces the effect of prior exposure on perceived accuracy of news headlines. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00257-y

Calvillo DP, Rutchick AM, Garcia RJB (2021) Individual differences in belief in fake news about election fraud after the 2020 U.S. election. Behav Sci 11(12):Article 12. https://doi.org/10.3390/bs11120175

Calvillo DP, Ross BJ, Garcia RJB, Smelter TJ, Rutchick AM (2020) Political ideology predicts perceptions of the threat of COVID-19 (and susceptibility to fake news about it). Soc Psychol Personal Sci 11(8):Article 8. https://doi.org/10.1177/1948550620940539

Chuai Y, Zhao J (2022) Anger can make fake news viral online. Front Phys. https://doi.org/10.3389/fphy.2022.970174

Clayton K, Davis J, Hinckley K, Horiuchi Y (2019) Partisan motivated reasoning and misinformation in the media: is news from ideologically uncongenial sources more suspicious. Jpn J Political Sci 20(3):Article 3. https://doi.org/10.1017/S1468109919000082

De Coninck D, Frissen T, Matthijs K, d’Haenens L, Lits G, Champagne-Poirier O, Carignan M-E, David MD, Pignard-Cheynel N, Salerno S, Généreux M (2021) Beliefs in conspiracy theories and misinformation about COVID-19: comparative perspectives on the role of anxiety, depression and exposure to and trust in information sources. Front Psychol 12:646394. https://doi.org/10.3389/fpsyg.2021.646394

Deng B, Chau M (2021) The effect of the expressed anger and sadness on online news believability. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990607

Druckman JN, Ognyanova K, Baum MA, Lazer D, Perlis RH, Volpe JD, Santillana M, Chwe H, Quintana A, Simonson M (2021) The role of race, religion, and partisanship in misperceptions about Covid-19. Group Process Intergroup Relat 24(4):Article 4. https://doi.org/10.1177/1368430220985912

Duffy A, Tan NN (2022) Dubious news: the social processing of uncertain facts in uncertain times. Digit Journalism 10(3):Article 3. https://doi.org/10.1080/21670811.2021.1953390

Duffy A, Tandoc E, Ling R (2020) Too good to be true, too good not to share: the social utility of fake news. Inf Commun Soc 23(13):1965–1979. https://doi.org/10.1080/1369118X.2019.1623904

Effron DA, Raj M (2020) Misinformation and morality: encountering fake-news headlines makes them seem less unethical to publish and share. Psychol Sci 31(1):Article 1. https://doi.org/10.1177/0956797619887896

Faragó L, Kende A, Krekó P (2020) We only believe in news that we doctored ourselves: the connection between partisanship and political fake news. Soc Psychol 51(2):Article 2. https://doi.org/10.1027/1864-9335/a000391

Folkvord F, Snelting F, Anschutz D, Hartmann T, Theben A, Gunderson L, Vermeulen I, Lupiáñez-Villanueva F (2022) Effect of source type and protective message on the critical evaluation of news messages on facebook: randomized controlled trial in the Netherlands. J Med Internet Res 24(3):Article 3. https://doi.org/10.2196/27945

Forgas JP, East R (2008) On being happy and gullible: mood effects on skepticism and the detection of deception. J Exp Soc Psychol 44(5):1362–1367. https://doi.org/10.1016/j.jesp.2008.04.010

Godden DM (2008) On common knowledge and ad populum: acceptance as grounds for acceptability. Philos Rhetoric 41(2):101–129

Grambo K (2019) Fake news and racial, ethnic and religious minorities: A precarious quest for truth. U Pa J Const L 1299

Grant MJ, Booth A (2009) A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J 26(2):91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x

Hameleers M, Brosius A (2022) You are wrong because I am right! The perceived causes and ideological biases of misinformation beliefs. Int J Public Opin Res 34(1):Article 1. https://doi.org/10.1093/ijpor/edab028

Hameleers M, Brosius A, de Vreese CH (2022) Whom to trust? Media exposure patterns of citizens with perceptions of misinformation and disinformation related to the news media. Eur J Commun 37(3):Article 3. https://doi.org/10.1177/02673231211072667

Han J, Cha M, Lee W (2020) Anger contributes to the spread of Covid-19 misinformation. Harv Kennedy School Misinf Rev. https://doi.org/10.37016/mr-2020-39

Harff D, Bollen C, Schmuck D (2022) Responses to social media influencers’ misinformation about COVID-19: a pre-registered multiple-exposure experiment. Media Psychol 25(6):Article 6. https://doi.org/10.1080/15213269.2022.2080711

Hartung F-M, Krohn C, Pirschtat M (2019) Better than its reputation? Gossip and the reasons why we and individuals with “dark” personalities talk about others. Front Psychol 10:1162. https://doi.org/10.3389/fpsyg.2019.01162

Hasher L, Goldstein D, Toppino T (1977) Frequency and the conference of referential validity. J Verbal Learn Verbal Behav 16:107–112

Hassan A, Barber SJ (2021) The effects of repetition frequency on the illusory truth effect. Cogn Res 6(1):38. https://doi.org/10.1186/s41235-021-00301-5

Helgason BA, Effron DA (2022) It might become true: how prefactual thinking licenses dishonesty. J Person Soc Psychol 123(5):Article 5. https://doi.org/10.1037/pspa0000308

Hopp T, Ferrucci P, Vargo CJ (2020) Why do people share ideologically extreme, false, and misleading content on social media? A self-report and trace data–based analysis of countermedia content dissemination on Facebook and Twitter. Hum Commun Res 46(4):Article 4. https://doi.org/10.1093/hcr/hqz022

Horner CG, Galletta D, Crawford J, Shirsat A (2021) Emotions: the unexplored fuel of fake news on social media. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990610

Hughes JP, Efstratiou A, Komer SR, Baxter LA, Vasiljevic M, Leite AC (2022) The impact of risk perceptions and belief in conspiracy theories on covid-19 pandemic-related behaviours. PLOS ONE 17(2):e0263716. https://doi.org/10.1371/journal.pone.0263716


Ireton C, Posetti J (2018) Journalism, fake news & disinformation: handbook for journalism education and training. UNESCO

Islam AKMN, Laato S, Talukder S, Sutinen E (2020) Misinformation sharing and social media fatigue during Covid-19: an affordance and cognitive load perspective. Technol Forecast Soc Change 159:120201. https://doi.org/10.1016/j.techfore.2020.120201

Kahan DM, Peters E, Dawson EC, Slovic P (2017) Motivated numeracy and enlightened self-government. Behav Public Policy 1(1):54–86. https://doi.org/10.1017/bpp.2016.2

Kahan DM (2017) Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN Electron J. https://doi.org/10.2139/ssrn.2973067

Kan IP, Pizzonia KL, Drummey AB, Mikkelsen EJV (2021) Exploring factors that mitigate the continued influence of misinformation. Cogn Res 6(1):76. https://doi.org/10.1186/s41235-021-00335-9

Kantorowicz-Reznichenko E, Folmer CR, Kantorowicz J (2022) Don’t believe it! A global perspective on cognitive reflection and conspiracy theories about Covid-19 pandemic. Person Individ Diff 194:111666. https://doi.org/10.1016/j.paid.2022.111666

Keselman A, Arnott Smith C, Leroy G, Kaufman DR (2021) Factors influencing willingness to share health misinformation videos on the internet: web-based survey. J Med Internet Res 23(12):Article 12. https://doi.org/10.2196/30323

Kim S, Kim S (2020) The crisis of public health and infodemic: analyzing belief structure of fake news about COVID-19 pandemic. Sustainability 12(23):Article 23. https://doi.org/10.3390/su12239904

Laato S, Islam AKMN, Islam MN, Whelan E (2020) What drives unverified information sharing and cyberchondria during the Covid-19 pandemic. Eur J Inf Syst 29(3):Article 3. https://doi.org/10.1080/0960085X.2020.1770632

Landrum AR, Olshansky A (2019) The role of conspiracy mentality in denial of science and susceptibility to viral deception about science. Politics Life Sci 38(2):Article 2. https://doi.org/10.1017/pls.2019.9

de Langhe B, Fernbach PM, Lichtenstein DR (2016) Navigating by the stars: investigating the actual and perceived validity of online user ratings. J Consum Res 42(6):817–833. https://doi.org/10.1093/jcr/ucv047

Lantian A, Bagneux V, Delouvée S, Gauvrit N (2021) Maybe a free thinker but not a critical one: high conspiracy belief is associated with low critical thinking ability. Appl Cogn Psychol 35(3):Article 3. https://doi.org/10.1002/acp.3790

Lawson MA, Kakkar H (2022) Of pandemics, politics, and personality: the role of conscientiousness and political ideology in the sharing of fake news. J Exp Psychol Gen 151(5):Article 5. https://doi.org/10.1037/xge0001120

Li K, Li J, Zhou F (2022) The effects of personality traits on online rumor sharing: the mediating role of fear of COVID-19. Int J Environ Res Public Health 19(10):Article 10. https://doi.org/10.3390/ijerph19106157

Lin Y, Zhang YC, Oyserman D (2022) Seeing meaning even when none may exist: collectivism increases belief in empty claims. J Pers Soc Psychol 122(3):351–366. https://doi.org/10.1037/pspa0000280

Littrell S, Risko EF, Fugelsang JA (2021) ‘You can’t bullshit a bullshitter’ (or can you?): bullshitting frequency predicts receptivity to various types of misleading information. Br J Soc Psychol 60(4):Article 4. https://doi.org/10.1111/bjso.12447

Lobato EJC, Powell M, Padilla LMK, Holbrook C (2020) Factors predicting willingness to share COVID-19 misinformation. Front Psychol 11:566108. https://doi.org/10.3389/fpsyg.2020.566108

MacKuen M, Wolak J, Keele L, Marcus GE (2010) Civic engagements: resolute partisanship or reflective deliberation. Am J Political Sci 54(2):440–458. https://doi.org/10.1111/j.1540-5907.2010.00440.x

Martel C, Pennycook G, Rand DG (2020) Reliance on emotion promotes belief in fake news. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00252-3

McGonagle T (2017) “Fake news”: false fears or real concerns? Neth Q Hum Rights 35(4):203–209. https://doi.org/10.1177/0924051917738685

Melki J, Tamim H, Hadid D, Makki M, El Amine J, Hitti E (2021) Mitigating infodemics: the relationship between news exposure and trust and belief in Covid-19 fake news and social media spreading. PLoS ONE 16(6):Article 6. https://doi.org/10.1371/journal.pone.0252830

Mena P, Barbe D, Chan-Olmsted S (2020) Misinformation on Instagram: the impact of trusted endorsements on message credibility. Soc Media+Soc 6(2):Article 2. https://doi.org/10.1177/2056305120935102

Mercier H, Sperber D (2011) Why do humans reason? Arguments for an argumentative theory. Behav Brain Sci 34(2):57–74. https://doi.org/10.1017/S0140525X10000968

Michael RB, Breaux BO (2021) The relationship between political affiliation and beliefs about sources of “fake news”. Cogn Res 6(1):Article 1. https://doi.org/10.1186/s41235-021-00278-1

Michael RB, Sanson M (2021) Source information affects interpretations of the news across multiple age groups in the United States. Societies 11(4):Article 4. https://doi.org/10.3390/soc11040119

Miller JM, Saunders KL, Farhart CE (2016) Conspiracy endorsement as motivated reasoning: the moderating roles of political knowledge and trust. Am J Political Sci 60(4):Article 4. https://doi.org/10.1111/ajps.12234

Nadarevic L, Reber R, Helmecke AJ, Köse D (2020) Perceived Truth of statements and simulated social media postings: an experimental investigation of source credibility, repeated exposure, and presentation format. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00251-4

Norman A (2016) Why we reason: intention-alignment and the genesis of human rationality. Biol Philos 31(5):685–704. https://doi.org/10.1007/s10539-016-9532-4

Nurse MS, Ross RM, Isler O, Van Rooy D (2022) Analytic thinking predicts accuracy ratings and willingness to share Covid-19 misinformation in Australia. Mem Cogn 50(2):Article 2. https://doi.org/10.3758/s13421-021-01219-5

Oh HJ, Lee H (2019) When do people verify and share health rumors on social media? The effects of message importance, health anxiety, and health literacy. J Health Commun 24(11):Article 11. https://doi.org/10.1080/10810730.2019.1677824

Osmundsen M, Bor A, Vahlstrup PB, Bechmann A, Petersen MB (2021) Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter. Am Political Sci Rev 115(3):Article 3. https://doi.org/10.1017/S0003055421000290

Pearson GDH, Knobloch-Westerwick S (2019) Is the confirmation bias bubble larger online? Pre-election confirmation bias in selective exposure to online versus print political information. Mass Commun Soc 22(4):Article 4. https://doi.org/10.1080/15205436.2019.1599956

Pehlivanoglu D, Lin T, Deceus F, Heemskerk A, Ebner NC, Cahill BS (2021) The role of analytical reasoning and source credibility on the evaluation of real and fake full-length news articles. Cogn Res 6(1):Article 1. https://doi.org/10.1186/s41235-021-00292-3

Pehlivanoglu D, Lighthall NR, Lin T, Chi KJ, Polk R, Perez E, Cahill BS, Ebner NC (2022) Aging in an “Infodemic”: the role of analytical reasoning, affect, and news consumption frequency on news veracity detection. J Exp Psychol Appl 28(3):Article 3. https://doi.org/10.1037/xap0000426

Pennycook G, Cannon TD, Rand DG (2018) Prior exposure increases perceived accuracy of fake news. J Exp Psychol Gen 147(12):1865–1880. https://doi.org/10.1037/xge0000465

Pennycook G, Rand DG (2019) Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188:39–50. https://doi.org/10.1016/j.cognition.2018.06.011

Pennycook G, Rand DG (2020) Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. J Pers 88(2):185–200. https://doi.org/10.1111/jopy.12476

Pereira A, Harris E, Van Bavel JJ (2021) Identity concerns drive belief: the impact of partisan identity on the belief and dissemination of true and false news. Group Process Intergroup Relat. https://doi.org/10.1177/13684302211030004

Peters U (2022) What is the function of confirmation bias. Erkenntnis 87(3):1351–1376. https://doi.org/10.1007/s10670-020-00252-1

Plume CJ, Slade EL (2018) Sharing of sponsored advertisements on social media: a uses and gratifications perspective. Inf Syst Front 20(3):471–483. https://doi.org/10.1007/s10796-017-9821-8

van Prooijen J-W, van Vugt M (2018) Conspiracy theories: evolved functions and psychological mechanisms. Perspect Psychol Sci 13(6):770–788. https://doi.org/10.1177/1745691618774270

Ren Z (Bella), Dimant E, Schweitzer ME (2021) Social motives for sharing conspiracy theories. SSRN Electron J. https://doi.org/10.2139/ssrn.3919364

Ross RM, Rand DG, Pennycook G (2021) Beyond “fake news”: analytic thinking and the detection of false and hyperpartisan news headlines. Judgm Decis Mak 16(2):484–504. https://doi.org/10.1017/S1930297500008640

Smelter TJ, Calvillo DP (2020) Pictures and repeated exposure increase perceived accuracy of news headlines. Appl Cogn Psychol 34(5):Article 5. https://doi.org/10.1002/acp.3684

Smith JJ, Wald B (2019) Collectivized intellectualism. Res Philos 96(2):199–227. https://doi.org/10.11612/resphil.1766

Solovev K, Pröllochs N (2022) Moral emotions shape the virality of COVID-19 misinformation on social media. In: Proceedings of the ACM web conference. pp. 3706–3717. https://doi.org/10.1145/3485447.3512266

Stanley ML, Whitehead PS, Marsh EJ (2022) The cognitive processes underlying false beliefs. J Consum Psychol 32(2):359–369. https://doi.org/10.1002/jcpy.1289

Sterrett D, Malato D, Benz J, Kantor L, Tompson T, Rosenstiel T, Sonderman J, Loker K (2019) Who shared it?: Deciding what news to trust on social media. Digit Journalism 7(6):Article 6. https://doi.org/10.1080/21670811.2019.1623702

Su Y (2021) It doesn’t take a village to fall for misinformation: social media use, discussion heterogeneity preference, worry of the virus, faith in scientists, and Covid-19-related misinformation beliefs. Telemat Inform 58:101547. https://doi.org/10.1016/j.tele.2020.101547

Sun Z, Cheng X, Zhang R, Yang B (2020) Factors influencing rumour re-spreading in a public health crisis by the middle-aged and elderly populations. Int J Environ Res Public Health 17(18):Article 18. https://doi.org/10.3390/ijerph17186542

Swire B, Ecker UKH, Lewandowsky S (2017) The role of familiarity in correcting inaccurate information. J Exp Psychol Learn Mem Cogn 43(12):1948–1961. https://doi.org/10.1037/xlm0000422

Tandoc EC, Lee J, Chew M, Tan FX, Goh ZH (2021) Falling for fake news: the role of political bias and cognitive ability. Asian J Commun 31(4):Article 4. https://doi.org/10.1080/01292986.2021.1941149

Traberg CS, van der Linden S (2022) Birds of a feather are persuaded together: perceived source credibility mediates the effect of political bias on misinformation susceptibility. Pers Individ Differ 185:111269. https://doi.org/10.1016/j.paid.2021.111269

Tsang SJ (2021) Motivated fake news perception: the impact of news sources and policy support on audiences’ assessment of news fakeness. Journalism Mass Commun Q 98(4):Article 4. https://doi.org/10.1177/1077699020952129

Turel O, Osatuyi B (2021) Biased credibility and sharing of fake news on social media: considering peer context and self-objectivity state. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990614

Vegetti F, Mancosu M (2020) The impact of political sophistication and motivated reasoning on misinformation. Political Commun 37(5):Article 5. https://doi.org/10.1080/10584609.2020.1744778

Wardle C, Derakhshan H (2017) Information disorder: toward an interdisciplinary framework for research and policy making. Counc Eur. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

Waszak PM, Kasprzycka-Waszak W, Kubanek A (2018) The spread of medical fake news in social media - The pilot quantitative study. Health Policy Technol 7(2):115–118. https://doi.org/10.1016/j.hlpt.2018.03.002

Weimann-Saks D, Elshar-Malka V, Ariel Y, Weimann G (2022) Spreading online rumours during the Covid-19 pandemic: the role of users’ knowledge, trust and emotions as predictors of the spreading patterns. J Int Commun 28(2):Article 2. https://doi.org/10.1080/13216597.2022.2099443

Wischnewski M, Bruns A, Keller T (2021) Shareworthiness and motivated reasoning in hyper-partisan news sharing behavior on Twitter. Digit Journalism 9(5):Article 5. https://doi.org/10.1080/21670811.2021.1903960

Wischnewski M, Krämer N (2021) The role of emotions and identity-protection cognition when processing (mis)information. Technol Mind Behav 2(1). https://doi.org/10.1037/tmb0000029

Xiao X, Borah P, Su Y (2021) The dangers of blind trust: examining the interplay among social media news use, misinformation identification, and news trust on conspiracy beliefs. Public Underst Sci 30(8):Article 8. https://doi.org/10.1177/0963662521998025

Zhou Y, Shen L (2022) Confirmation bias and the persistence of misinformation on climate change. Commun Res 49(4):Article 4. https://doi.org/10.1177/00936502211028049

Zimmermann F, Kohring M (2020) Mistrust, disinforming news, and vote choice: a panel survey on the origins and consequences of believing disinformation in the 2017 German Parliamentary election. Political Commun 37(2):Article 2. https://doi.org/10.1080/10584609.2019.1686095

Funding

This study was funded by Singapore Ministry of Education (MOE) under the Education Research Funding Programme (DEV 03/19 AK) and administered by National Institute of Education (NIE), Nanyang Technological University, Singapore. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the Singapore MOE and NIE.

Author information

Authors and Affiliations

College of Interdisciplinary and Experiential Learning, Singapore University of Social Sciences, Singapore, Singapore

Adrian Kwek

School of Science and Technology, Singapore University of Social Sciences, Singapore, Singapore

Curriculum Planning and Development Division, Ministry of Education, Singapore, Singapore

School of Computing, National University of Singapore, Singapore, Singapore

Jin Xing Lee


The authors confirm their contribution to the paper as follows: study conception and design: AK; data collection: AK, JXL, LP, JT; analysis and interpretation of results: AK, LP, JT; draft manuscript preparation: AK, LP, JT. All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Adrian Kwek .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Distractions, analytical thinking and falling for fake news: a survey of psychological factors

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Kwek, A., Peh, L., Tan, J. et al. Distractions, analytical thinking and falling for fake news: A survey of psychological factors. Humanit Soc Sci Commun 10, 319 (2023). https://doi.org/10.1057/s41599-023-01813-9

Received : 02 January 2023

Accepted : 30 May 2023

Published : 12 June 2023

DOI : https://doi.org/10.1057/s41599-023-01813-9


