
Springer Nature - PMC COVID-19 Collection
The Use of Critical Thinking to Identify Fake News: A Systematic Literature Review

Paul Machete

Department of Informatics, University of Pretoria, Pretoria, 0001 South Africa

Marita Turpin

Abstract

With the large amount of news currently being published online, the ability to evaluate the credibility of online news has become essential. While there are many studies involving fake news and tools on how to detect it, there is a limited amount of work that focuses on the use of information literacy to assist people to critically assess online information and news. Critical thinking, as a form of information literacy, provides a means to critically engage with online content, for example by looking for evidence to support claims and by evaluating the plausibility of arguments. The purpose of this study is to investigate the current state of knowledge on the use of critical thinking to identify fake news. A systematic literature review (SLR) has been performed to identify previous studies on evaluating the credibility of news, and in particular to see what has been done in terms of the use of critical thinking to evaluate online news. During the SLR’s sifting process, 22 relevant studies were identified. Although some of these studies referred to information literacy, only three explicitly dealt with critical thinking as a means to identify fake news. These studies identified critical thinking as an essential skill for identifying fake news, and recommended that information literacy be taught at academic institutions, specifically to encourage critical thinking.

Introduction

The information age has brought a significant increase in available sources of information; this is in line with the unparalleled increase in internet availability and connection, in addition to the accessibility of technological devices [ 1 ]. People no longer rely on television and print media alone for obtaining news, but increasingly make use of social media and news apps. The variety of information sources that we have today has contributed to the spread of alternative facts [ 1 ]. With over 1.8 billion active users per month in 2016 [ 2 ], Facebook accounted for 20% of total traffic to reliable websites and up to 50% of all the traffic to fake news sites [ 3 ]. Twitter comes second to Facebook, with over 400 million active users per month [ 2 ]. Posts on social media platforms such as Facebook and Twitter spread rapidly due to how they attempt to grab the readers’ attention as quickly as possible, with little substantive information provided, and thus create a breeding ground for the dissemination of fake news [ 4 ].

While social media is a convenient way of accessing news and staying connected to friends and family, it is not easy to distinguish real news from fake news on social media [ 5 ]. Social media continues to contribute to the increasing distribution of user-generated information; this includes hoaxes, false claims, fabricated news and conspiracy theories, with primary sources being social media platforms such as Facebook and Twitter [ 6 ]. This means that any person in possession of an internet-capable device is potentially a consumer or distributor of fake news. While social media platforms and search engines do not encourage people to believe the information being circulated, they are complicit in people’s propensity to believe the information they come across on these platforms without determining its validity [ 6 ]. The spread of fake news can cause manifold damage to its subject, ranging from harming an individual’s reputation to affecting the perceived value of a company [ 7 ].

The purpose of this study is to investigate the use of critical thinking methods to detect news stories that are untrue or otherwise help to develop a critical attitude to online news. This work was performed by means of a systematic literature review (SLR). The paper is presented as follows. The next section provides background information on fake news, its importance in the day-to-day lives of social media users and how information literacy and critical thinking can be used to identify fake news. Thereafter, the SLR research approach is discussed. Following this, the findings of the review are reported, first in terms of descriptive statistics and then in terms of a thematic analysis of the identified studies. The paper ends with the Conclusion and recommendations.

Background: Fake News, Information Literacy and Critical Thinking

This section discusses the history of fake news, the fake news that we know today, and how information literacy can be used to help with the identification of fake news. It also provides a brief definition of critical thinking.

The History of Fake News

Although fake news has received increased attention recently, the term has been used by scholars for many years [ 4 ]. Fake news emerged from the tradition of yellow journalism of the 1890s, which can be described as a reliance on the familiar aspects of sensationalism—crime news, scandal and gossip, divorces and sex, and stress upon the reporting of disasters, sports sensationalism as well as possibly satirical news [ 5 ]. The emergence of online news in the early 2000s raised concerns, among them being that people who share similar ideologies may form “echo chambers” where they can filter out alternative ideas [ 2 ]. This emergence came about as news media transformed from one that was dominated by newspapers printed by authentic and trusted journalists to one where online news from an untrusted source is believed by many [ 5 ]. The term later grew to describe “satirical news shows”, “parody news shows” or “fake-news comedy shows” where a television show, or segment on a television show was dedicated to political satire [ 4 ]. Some of these include popular television shows such as The Daily Show (now with Trevor Noah), Saturday Night Live ’s “The Weekend Update” segment, and other similar shows such as Last Week Tonight with John Oliver and The Colbert Report with Stephen Colbert [ 4 ]. News stories in these shows were labelled “fake” not because of their content, but for parodying network news for the use of sarcasm, and using comedy as a tool to engage real public issues [ 4 ]. The term “Fake News” further became prominent during the course of the 2016 US presidential elections, as members of the opposing parties would post incorrect news headlines in order to sway the decision of voters [ 6 ].

Fake News Today

The term fake news has a more literal meaning today [ 4 ]. The Macquarie Dictionary named fake news the word of the year for 2016 [ 8 ]. In this dictionary, fake news is described as a word that captures a fascinating evolution in the creation of deceiving content, also allowing people to believe what they see fit. There are many definitions for the phrase; however, a concise description of the term can be found in Paskin [ 4 ], who states that certain news articles originating from either social media or mainstream (online or offline) platforms, that are not factual, but are presented as such and are not satirical, are considered fake news. In some instances, editorials, reports, and exposés may knowingly disseminate information with intent to deceive for the purposes of monetary or political benefit [ 4 ].

A distinction amongst three types of fake news can be made on a conceptual level, namely: serious fabrications, hoaxes and satire [ 3 ]. Serious fabrications are explained as news items based on false information, including celebrity gossip. Hoaxes refer to false information provided via social media, aiming to be syndicated by traditional news platforms. Lastly, satire refers to the use of humour in the news to imitate real news, but through irony and absurdity. Some examples of famous satirical news platforms in circulation in the modern day are The Onion and The Beaverton , when contrasted with real news publishers such as The New York Times [ 3 ].

Although there are many studies involving fake news and tools on how to detect it, there is a limited amount of academic work that focuses on the need to encourage information literacy so that people are able to critically assess the information they have been presented with, in order to make better informed decisions [ 9 ].

Stein-Smith [ 5 ] argues that information/media literacy has become a more critical skill since the notion of fake news entered public conversation. Information literacy is no longer a nice-to-have proficiency but a requirement for interpreting news headlines and participating in public discussions. It is essential for academic institutions of higher learning to present information literacy courses that will empower students and staff members with the prerequisite tools to identify, select, understand and use trustworthy information [ 1 ]. Outside of its academic uses, information literacy is also a lifelong skill with multiple applications in everyday life [ 5 ]. The choices people make in their lives, and the opinions they form, need to be informed by the appropriate interpretation of correct, opportune, and significant information [ 5 ].

Critical Thinking

Critical thinking covers a broad range of skills that includes the following: verbal reasoning skills; argument analysis; thinking as hypothesis testing; dealing with likelihood and uncertainties; and decision making and problem solving skills [ 10 ]. For the purpose of this study, where we are concerned with the evaluation of the credibility of online news, the following definition will be used: critical thinking is “the ability to analyse and evaluate arguments according to their soundness and credibility, respond to arguments and reach conclusions through deduction from given information” [ 11 ]. In this study, we want to investigate how the skills mentioned by [ 11 ] can be used as part of information literacy, to better identify fake news.

The next section presents the research approach that was followed to perform the SLR.

Research Method

This section addresses the research question, the search terms that were applied to a database in relation to the research question, as well as the search criteria used on the search results. The following research question was addressed in this SLR:

  • What is the role of critical thinking in identifying fake news, according to previous studies?

The research question was identified in accordance with the research topic. The intention of the research question is to determine whether the identified studies in this review provide insights into the use of critical thinking to evaluate the credibility of online news, and in particular to identify fake news.

Delimitations.

In the construction of this SLR, the following categories of content related to fake news have been excluded, following the suggestion of [ 2 ]:

  • Unintentional reporting mistakes;
  • Rumours that do not originate from a particular news article;
  • Conspiracy theories;
  • Satire that is unlikely to be misconstrued as factual;
  • False statements by politicians; and
  • Reports that are slanted or misleading, but not outright false.

Search Terms.

The database tool used to extract sources to conduct the SLR was Google Scholar ( https://scholar.google.com ). The process for extracting the sources involved executing the search string on Google Scholar and the retrieval of the articles and their meta-data into a tool called Mendeley, which was used for reference management.

The search string used to retrieve the sources is shown below:

(“critical think*” OR “critically (NEAR/2) reason*” OR “critical (NEAR/2) thought*” OR “critical (NEAR/2) judge*”) AND “fake news” AND (identify* OR analyse* OR find* OR describe* OR review)

To construct the search criteria, the following factors were taken into consideration: the research topic guided the search string, with its key words forming the base search criteria. The second step was to adapt the search string to the search syntax requirements of Google Scholar.
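As an illustration only (this code is not part of the original study), the wildcard terms in the search string above can be approximated with regular expressions to test whether a given title would match; the NEAR/2 proximity operator is simplified here to direct adjacency:

```python
import re

# Hypothetical sketch: approximate the study's wildcard query terms with
# regular expressions. NEAR/2 proximity is simplified to adjacency.
CRITICAL = re.compile(r"critical(ly)?\s+(think|reason|thought|judge)\w*", re.IGNORECASE)
FAKE_NEWS = re.compile(r"fake news", re.IGNORECASE)
ACTION = re.compile(r"\b(identif|analys|find|describ|review)\w*", re.IGNORECASE)

def matches(title: str) -> bool:
    # All three clause groups must match, mirroring the AND structure of
    # the search string; each group is an OR over its alternatives.
    return bool(CRITICAL.search(title) and FAKE_NEWS.search(title) and ACTION.search(title))

print(matches("Critical thinking as a tool to identify fake news"))  # True
print(matches("A survey of fake news detection algorithms"))         # False
```

This is only a rough approximation; Google Scholar's actual matching and stemming behaviour differs.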

Selection Criteria.

The selection criteria outlined the rules applied in the SLR to identify sources, narrow down the search criteria and focus the study on a specific topic. The inclusion and exclusion criteria are outlined in Table  1 to show which filters were applied to remove irrelevant sources.

Table 1.

Inclusion and exclusion criteria for paper selection

Source Selection.

The search criteria were applied on the online database and 91 papers were retrieved. The criteria in Table  1 were used on the search results in order to narrow down the results to appropriate papers only.

PRISMA Flowchart.

The selection criteria included four stages of filtering, as depicted in Fig.  1 . In the Identification stage, the 91 search results from Google Scholar were returned and three further sources were derived from those already identified, making a total of 94 available sources. In the Screening stage, no duplicates were identified. After a thorough screening of the search results, which included checking the availability of each article (free to use), 39 records in total remained available, with 55 articles excluded. In the Eligibility stage, nine of the 39 articles were excluded because their titles and abstracts were irrelevant to the topic. A final list of 22 articles was included as part of this SLR. As preparation for the data analysis, a data extraction table was made that classified each article according to the following: article author; article title; theme (a short summary of the article); year; country; and type of publication. The data extraction table assisted in the analysis of findings as presented in the next section.


PRISMA flowchart

Analysis of Findings

Descriptive Statistics.

Due to the limited number of relevant studies, the information search did not have a specified start date. Articles were included up to 31 August 2019. The majority of the papers found were published in 2017 (8 papers) and 2018 (9 papers). This is in line with the term “fake news” being named word of the year for 2016 [ 8 ].

The selected papers were classified into themes. Figure  2 is a Venn diagram that represents the overlap of articles by themes across the review. Articles that fall under the “fake news” theme had the highest number of occurrences, with 11 in total. Three articles focused mainly on “Critical Thinking”, and “Information Literacy” was the main focus of four articles. Two articles combined all three topics of critical thinking, information literacy, and fake news.


Venn diagram depicting the overlap of articles by main focus

An analysis of the number of articles published per country indicates that the US dominated publication on this topic, with a total of 17 articles - this represents 74% of the selected articles in this review. The remaining countries where articles were published are Australia, Germany, Ireland, Lebanon, Saudi Arabia, and Sweden - with each having one article published.

In terms of publication type, 15 of the articles were journal articles, four were reports, one was a thesis, one was a magazine article and one, a web page.

Discussion of Themes

The following emerged from a thematic analysis of the articles.

Fake News and Accountability.

With the influence that social media has on the spread of fake news [ 2 ], who then becomes responsible for the dissemination and consumption of fake news by the general population? The immediate assumption is that in the digital age, social media platforms like Facebook and Twitter should be able to curate information, or do some form of fact-checking when posts are uploaded onto their platforms [ 12 ], but that comes close to infringing on freedom of speech. While different authors agree that there need to be measures in place to minimise the spread of fake news [ 12 , 13 ], where that accountability lies differs between the authors. Metaxas and Mustafaraj [ 13 ] aimed to develop algorithms and plug-ins that can assist in establishing trust, and postulated that consumers should be able to identify misinformation, thus making an informed decision on whether to share that information or not. Lazer et al. [ 12 ], on the other hand, believe the onus should be on the platform owners to put restrictions on the kind of data distributed. Considering that the work by Metaxas and Mustafaraj [ 13 ] was done seven years ago, one can conclude that the use of fact-checking algorithms/plug-ins has not been successful in curbing the spread of fake news.

Fake News and Student Research.

There were a total of four articles that focused on student research in relation to fake news. Harris, Paskin and Stein-Smith [ 4 , 5 , 14 ] all agree that students do not have the ability to discern between real and fake news. A Stanford History Education Group study reveals that students are not equipped to distinguish real from fake news [ 4 ]. Most students are able to perform a simple Google search for information; however, they are unable to identify the author of an online source, or tell whether the information is misleading [ 14 ]. Furthermore, students are not aware of how learning information literacy in school can equip them with the skills required to accurately identify fake news [ 5 ]. At the Metropolitan Campus of Fairleigh Dickinson University, librarians have undertaken the role of providing training on information literacy skills for identifying fake news [ 5 ].

Fake News and Social Media.

A number of authors [ 6 , 15 ] agree that social media, the leading source of news, is the biggest driving force behind fake news. It provides a substantial advantage for broadcasting manipulated information, being an open platform without editorial filtering and open to contributions from all. According to Nielsen and Graves as well as Janetzko [ 6 , 15 ], people are unable to identify fake news correctly; they are more likely to associate fake news with low-quality journalism than with false information designed to mislead. Two articles, [ 15 ] and [ 6 ], discussed the role of critical thinking when interacting on social media. Social media presents information to us that has been filtered according to what we already consume, thereby making it a challenge for consumers to think critically. The study by Nielsen and Graves [ 6 ] confirms that students’ failure to verify incorrect online sources requires urgent attention, as this could indicate that students are an easy target for manipulated information.

Fake News That Drives Politics.

Two studies mention the effect of social media on the spread of fake news, and how it may have propelled Donald Trump to win the US election in 2016 [ 2 , 16 ]. Also, [ 8 ] and [ 2 ] mention how a story about the Pope supporting Trump in his presidential campaign was widely shared (more than a million times) on Facebook in 2016. These articles also point out how, in the information age, fact-checking has become relatively easy, but people are more likely to trust their intuition on news stories they consume, rather than checking the reliability of a story. The use of paid trolls and Russian bots to populate social media feeds with misinformation, in an effort to swing the US presidential election in Donald Trump’s favour, is highlighted in [ 16 ]. The creation of fake news with alarmist headlines (“clickbait”) generates huge traffic to the originating websites, which drives up advertising revenue [ 2 ]. This means content creators are incentivised to create fake news to drive ad revenue on their websites - even though they may not believe in the fake news themselves [ 2 ].

Information Literacy.

Information literacy is the ability to access information, process the parts one needs, and devise ways in which to best use that information [ 1 ]. Teaching students information literacy skills is key, not only for identifying fake news but also for navigating aspects of life that require managing and scrutinising information, as discussed by [ 1 , 17 ], and [ 9 ]. Courtney [ 17 ] highlights how journalism students, more than students from other disciplines, may need to have some form of information literacy incorporated into their syllabi to increase their awareness of fake news stories, supporting a narrative of being objective and reliable news creators. Courtney assessed different universities that teach journalism and media-related studies, and established that students generally lack awareness of how useful library services are in offering support related to information literacy. Courtney [ 17 ] and Rose-Wiles [ 9 ] discuss how the use of library resources should be normalised for students. With millennials and generation Z having social media as their first point of contact, Rose-Wiles [ 9 ] urges universities, colleges and other academic research institutes to promote the use of library resources over those from the open internet, to encourage students to lean on reliable sources. This may prove difficult overall, so Rose-Wiles [ 9 ] proposes that, by being taught information literacy skills and critical thinking, students can apply these skills to any situation or information source.

In a phenomenon referred to as “truth decay”, people have reached a point where they no longer need to agree with facts [ 18 ]. Due to political polarisation, members of the general public may see themselves as part of an oppressed group, and will therefore believe a political leader who appeals to that narrative [ 18 ]. Tangible action needs to be put into driving civil engagement, to encourage people to think critically, analyse information and not believe everything they read.

Critical Thinking.

Only three of the articles had critical thinking as a main theme. Bronstein et al. [ 19 ] discuss how certain dogmatic and religious beliefs create a tendency in individuals to believe any information given, without feeling a need to interrogate the information further and decide on its veracity. The article further elaborates that these individuals are also more likely to engage in conspiracy theories, and tend to rationalise absurd events. Bronstein et al.’s [ 19 ] study concludes that dogmatism and religious fundamentalism correlate highly with a belief in fake news. Their study [ 19 ] suggests the use of interventions that aim to increase open-minded and analytical thinking as a way to help religious individuals curb their belief in fake news. Howlett [ 20 ] describes critical thinking as evidence-based practice, which is taking the theories of the skills and concepts of critical thinking and converting those for use in everyday applications. Jackson [ 21 ] explains how the internet purposely prides itself on being a platform for “unreviewed content”, on the premise that people may not see said content again; it therefore needs to be attention-grabbing in the moment, and not necessarily accurate. Jackson [ 21 ] expands that social media has affected critical thinking by changing how people view published information in what are now seen as old forms of information media. This presents a challenge to critical thinking, in that a large portion of information found on the internet is not only unreliable, it may also be false. Jackson [ 21 ] posits that one of the biggest dangers to critical thinking may be that people have a sense of perceived power from being able to find the answers they seek with a simple web search. People are no longer interested in evaluating the credibility of the information they receive and share, which leads to the propagation of fake news [ 21 ].

Discussion of Findings

The aggregated data in this review has provided insight into how fake news is perceived, the level of attention it is receiving and the shortcomings of people when identifying fake news. Since the increase in awareness of fake news in 2016, there has been an increase in academic focus on the subject, with most of the articles published between 2017 and 2018. Fifty percent of the articles released focused on the subject of fake news, with 18% reflecting on information literacy, and only 13% on critical thinking.
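The percentages quoted above can be reproduced from the theme counts reported earlier (11 fake news, 4 information literacy and 3 critical thinking articles, out of 22 included studies); a quick sketch, assuming the review truncates rather than rounds:

```python
# Recompute the reported theme percentages from the article counts.
# Truncating with int() rather than rounding reproduces the reported
# 13% figure for critical thinking (3/22 = 13.6%).
theme_counts = {"fake news": 11, "information literacy": 4, "critical thinking": 3}
TOTAL = 22  # articles included after the PRISMA sifting process

percentages = {theme: int(100 * n / TOTAL) for theme, n in theme_counts.items()}
print(percentages)  # {'fake news': 50, 'information literacy': 18, 'critical thinking': 13}
```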

The thematic discussion grouped and synthesised the articles in this review according to the main themes of fake news, information literacy and critical thinking. The Fake news and accountability discussion raised the question of who becomes accountable for the spreading of fake news: the social media platform or the user. The articles presented the conclusion that fact-checking algorithms have not been successful in reducing the dissemination of fake news. The discussion also included a focus on fake news and student research , whereby a Stanford History Education Group study revealed that students are not well educated in thinking critically and distinguishing real from fake news [ 4 ]. The Fake news and social media discussion provided insight into how social media is the leading source of news as well as a contributor to fake news. It presents a challenge for consumers who are not able to think critically about online news, or who lack the basic information literacy skills that can aid in identifying fake news. Fake news that drives politics highlighted fake news’ role in politics, particularly the 2016 US presidential elections and the influence it had on voters [ 22 ].

Information literacy related publications highlighted the need for educating the public on being able to identify fake news, as well as the benefits of having information literacy as a life skill [ 1 , 9 , 17 ]. It was shown that students are often misinformed about the potential benefits of library services. The authors suggested that university libraries should become more recognised and involved as role-players in providing and assisting with information literacy skills.

The articles that focused on critical thinking pointed out two areas where a lack of critical thinking prevented readers from discerning between accurate and false information. In the one case, it was shown that people’s confidence in their ability to find information online made them overly confident about the accuracy of that information [ 21 ]. In the other case, it was shown that dogmatism and religious fundamentalism, which led people to believe certain fake news, were associated with a lack of critical thinking and a questioning mind-set [ 19 ].

The articles that focused on information literacy and critical thinking were in agreement on the value of promoting and teaching these skills, in particular to the university students who were often the subjects of the studies performed.

Conclusion

This review identified 22 articles that were synthesised and used as evidence to determine the role of critical thinking in identifying fake news. The articles were classified according to year of publication, country of publication, type of publication and theme. Based on the descriptive statistics, fake news has been a growing trend in recent years, predominantly in the US since the presidential election in 2016. The research presented in most of the articles was aimed at the assessment of students’ ability to identify fake news. The various studies were consistent in their findings of research subjects’ lack of ability to distinguish between true and fake news.

Information literacy emerged as a new theme from the studies, with Rose-Wiles [ 9 ] advising academic institutions to teach information literacy and encourage students to think critically when accessing online news. The potential role of university libraries in not only teaching information literacy, but also assisting students to evaluate the credibility of online information, was highlighted. The three articles that explicitly dealt with critical thinking all found critical thinking to be lacking among their research subjects. They further indicated how this lack of critical thinking could be linked to people’s inability to identify fake news.

This review has pointed out people’s general inability to identify fake news. It highlighted the importance of information literacy as well as critical thinking, as essential skills to evaluate the credibility of online information.

The limitations in this review include the use of students as the main participants in most of the research - this would indicate a need to shift the academic focus towards having the general public as participants. This is imperative because anyone who possesses a mobile device is potentially a contributor or distributor of fake news.

For future research, it is suggested that the value of the formal teaching of information literacy at universities be further investigated, as a means to assist students in assessing the credibility of online news. Given the very limited number of studies on the role of critical thinking to identify fake news, this is also an important area for further research.

Contributor Information

Marié Hattingh

Machdel Matthee

Hanlie Smuts

Ilias Pappas

Yogesh K. Dwivedi

Matti Mäntymäki


Don’t Go Down the Rabbit Hole

Critical thinking, as we’re taught to do it, isn’t helping in the fight against misinformation.


By Charlie Warzel

Mr. Warzel is an Opinion writer at large.

For an academic, Michael Caulfield has an odd request: Stop overthinking what you see online.

Mr. Caulfield, a digital literacy expert at Washington State University Vancouver, knows all too well that at this very moment, more people are fighting for the opportunity to lie to you than at perhaps any other point in human history.

Misinformation rides the greased algorithmic rails of powerful social media platforms and travels at velocities and in volumes that make it nearly impossible to stop. That alone makes information warfare an unfair fight for the average internet user. But Mr. Caulfield argues that the deck is stacked even further against us. That the way we’re taught from a young age to evaluate and think critically about information is fundamentally flawed and out of step with the chaos of the current internet.

“We’re taught that, in order to protect ourselves from bad information, we need to deeply engage with the stuff that washes up in front of us,” Mr. Caulfield told me recently. He suggested that the dominant mode of media literacy (if kids get taught any at all) is that “you’ll get imperfect information and then use reasoning to fix that somehow. But in reality, that strategy can completely backfire.”

In other words: Resist the lure of rabbit holes, in part, by reimagining media literacy for the internet hellscape we occupy.

It’s often counterproductive to engage directly with content from an unknown source, and people can be led astray by false information. Influenced by the research of Sam Wineburg, a professor at Stanford, and Sarah McGrew, an assistant professor at the University of Maryland, Mr. Caulfield argued that the best way to learn about a source of information is to leave it and look elsewhere, a concept called lateral reading.

For instance, imagine you were to visit Stormfront, a white supremacist message board, to try to understand racist claims in order to debunk them. “Even if you see through the horrible rhetoric, at the end of the day you gave that place however many minutes of your time,” Mr. Caulfield said. “Even with good intentions, you run the risk of misunderstanding something, because Stormfront users are way better at propaganda than you. You won’t get less racist reading Stormfront critically, but you might be overloaded by information and overwhelmed.”

Our current information crisis, Mr. Caulfield argues, is an attention crisis.

“The goal of disinformation is to capture attention, and critical thinking is deep attention,” he wrote in 2018. People learn to think critically by focusing on something and contemplating it deeply: following the information’s logic and its inconsistencies.

That natural human mind-set is a liability in an attention economy. It allows grifters, conspiracy theorists, trolls and savvy attention hijackers to take advantage of us and steal our focus. “Whenever you give your attention to a bad actor, you allow them to steal your attention from better treatments of an issue, and give them the opportunity to warp your perspective,” Mr. Caulfield wrote.

One way to combat this dynamic is to change how we teach media literacy: Internet users need to learn that our attention is a scarce commodity that is to be spent wisely.

In 2016, Mr. Caulfield met Mr. Wineburg, who suggested modeling the process after the way professional fact checkers assess information. Mr. Caulfield refined the practice into four simple principles:

1. Stop.

2. Investigate the source.

3. Find better coverage.

4. Trace claims, quotes and media to the original context.

Otherwise known as SIFT.

Mr. Caulfield walked me through the process using an Instagram post from Robert F. Kennedy Jr., a prominent anti-vaccine activist, falsely alleging a link between the human papillomavirus vaccine and cancer. “If this is not a claim where I have a depth of understanding, then I want to stop for a second and, before going further, just investigate the source,” Mr. Caulfield said. He copied Mr. Kennedy’s name in the Instagram post and popped it into Google. “Look how fast this is,” he told me as he counted the seconds out loud. In 15 seconds, he navigated to Wikipedia and scrolled through the introductory section of the page, highlighting with his cursor the last sentence, which reads that Mr. Kennedy is an anti-vaccine activist and a conspiracy theorist.

“Is Robert F. Kennedy Jr. the best, unbiased source on information about a vaccine? I’d argue no. And that’s good enough to know we should probably just move on,” he said.

He probed deeper into the method to find better coverage by copying the main claim in Mr. Kennedy’s post and pasting that into a Google search. The first two results came from Agence France-Presse’s fact-check website and the National Institutes of Health. His quick searches showed a pattern: Mr. Kennedy’s claims were outside the consensus — a sign they were motivated by something other than science.

The SIFT method and the instructional teaching unit (about six hours of class work) that accompanies it have been picked up by dozens of universities across the country and in some Canadian high schools. What is potentially revolutionary about SIFT is that it focuses on making quick judgments. A SIFT fact check can and should take just 30, 60 or 90 seconds to evaluate a piece of content.

The four steps are based on the premise that you often make a better decision with less information than you do with more. Also, spending 15 minutes to determine a single fact in order to decipher a tweet or a piece of news coming from a source you’ve never seen before will often leave you more confused than you were before. “The question we want students asking is: Is this a good source for this purpose, or could I find something better relatively quickly?” Mr. Caulfield said. “I’ve seen in the classroom where a student finds a great answer in three minutes but then keeps going and ends up won over by bad information.”

SIFT has its limits. It’s designed for casual news consumers, not experts or those attempting to do deep research. A reporter working on an investigative story or trying to synthesize complex information will have to go deep. But for someone just trying to figure out a basic fact, it’s helpful not to get bogged down. “We’ve been trained to think that Googling or just checking one resource we trust is almost like cheating,” he said. “But when people search Google, the best results may not always be first, but the good information is usually near the top. Often you see a pattern in the links of a consensus that’s been formed. But deeper into the process, it often gets weirder. It’s important to know when to stop.”

Christina Ladam, an assistant political science professor at the University of Nevada, Reno, has seen the damage firsthand. While teaching an introductory class as a Ph.D. student in 2015, she noticed her students had trouble vetting sources and distinguishing credible news from untrustworthy information. During one research assignment on the 2016 presidential race, multiple students cited a debunked claim from a satirical website claiming that Ben Carson, a candidate that year, had been endorsed by the Ku Klux Klan. “Some of these students had never had somebody even talk to them about checking sources or looking for fake news,” she told me. “It was just uncritical acceptance if it fit with the narrative in their head or complete rejection if it didn’t.”

Ms. Ladam started teaching a SIFT-based media literacy unit in her political science classes because of the method’s practical application. The unit is short, only two weeks long. Her students latched onto quick tricks like how to hover over a Twitter handle and see if the account looks legitimate or is a parody account or impersonation. They learned how to reverse image search using Google to check if a photo had been doctored or if similar photos had been published by trusted news outlets. Students were taught to identify claims in Facebook or Instagram posts and, with a few searches, decide — even if they’re unsure of the veracity — whether the account seems to be a trustworthy guide or if they should look elsewhere.

The goal isn’t to make political judgments or to talk students out of a particular point of view, but to try to get them to understand the context of a source of information and make decisions about its credibility. The course is not precious about overly academic sources, either.

“The students are confused when I tell them to try and trace something down with a quick Wikipedia search, because they’ve been told not to do it,” she said. “Not for research papers, but if you’re trying to find out if a site is legitimate or if somebody has a history as a conspiracy theorist and you show them how to follow the page’s citation, it’s quick and effective, which means it’s more likely to be used.”

As a journalist who can be a bit of a snob about research methods, I get anxious typing this advice. Use Wikipedia for quick guidance! Spend less time torturing yourself with complex primary sources! A part of my brain hears this and reflexively worries these methods could be exploited by conspiracy theorists. But listening to Ms. Ladam and Mr. Caulfield describe disinformation dynamics, it seems that snobs like me have it backward.

Think about YouTube conspiracy theorists or many QAnon or anti-vaccine influencers. Their tactic, as Mr. Caulfield noted, is to flatter viewers while overloading them with three-hour videos laced with debunked claims and pseudoscience, as well as legitimate information. “The internet offers this illusion of explanatory depth,” he said. “Until 20 seconds ago, you’d never thought about, say, race and IQ, but now, suddenly, somebody is treating you like an expert. It’s flattering your intellect, and so you engage, but you don’t really stand a chance.”

What he described is a kind of informational hubris we have that is quite difficult to fight. But what SIFT and Mr. Caulfield’s lessons seem to do is flatter their students in a different way: by reminding us our attention is precious.

The goal of SIFT isn’t to be the arbiter of truth but to instill a reflex that asks if something is worth one’s time and attention and to turn away if not. Because the method is less interested in political judgments, Mr. Caulfield and Ms. Ladam noticed, students across the political spectrum are more likely to embrace it. By the end of the two-week course, Ms. Ladam said, students are better at finding primary sources for research papers. In discussions they’re less likely to fall back on motivated reasoning. Students tend to be less defensive when confronted with a piece of information they disagree with. Even if their opinions on a broader issue don’t change, a window is open that makes conversation possible. Perhaps most promising, she has seen her students share the methods with family members who post dubious news stories online. “It sounds so simple, but I think that teaching people how to check their news source by even a quick Wikipedia can have profound effects,” she said.

SIFT is not an antidote to misinformation. Poor media literacy is just one component of a broader problem that includes more culpable actors like politicians, platforms and conspiracy peddlers. If powerful, influential people with the ability to command vast quantities of attention use that power to warp reality and platforms don’t intervene, no mnemonic device can stop them. But SIFT may add a bit of friction into the system. Most important, it urges us to take the attention we save with SIFT and apply it to issues that matter to us.

“Right now we are taking the scarcest, most valuable resource we have — our attention — and we’re using it to try to repair the horribly broken information ecosystem,” Mr. Caulfield said. “We’re throwing good money after bad.”

Our focus isn’t free, and yet we’re giving it away with every glance at a screen. But it doesn’t have to be that way. In fact, the economics are in our favor. Demand for our attention is at an all-time high, and we control supply. It’s time we increased our price.


An earlier version of this article misattributed a quotation about determining the reliability of a news source. It was Michael Caulfield — not Robert F. Kennedy Jr. — who said, “The question we want students asking is: Is this a good source for this purpose, or could I find something better relatively quickly?”


Charlie Warzel, a New York Times Opinion writer at large, covers technology, media, politics and online extremism. He welcomes your tips and feedback: [email protected] | @cwarzel

  • Review Article
  • Open access
  • Published: 12 June 2023

Distractions, analytical thinking and falling for fake news: A survey of psychological factors

  • Adrian Kwek ORCID: orcid.org/0000-0002-9405-0601
  • Luke Peh
  • Josef Tan
  • Jin Xing Lee

Humanities and Social Sciences Communications volume 10, Article number: 319 (2023)


  • Cultural and media studies
  • Science, technology and society

Analytical thinking safeguards us against believing or spreading fake news. In various forms, this common assumption has been reported, investigated, or implemented in fake news education programs. Some have associated this assumption with the inverse claim, that distractions from analytical thinking may render us vulnerable to believing or spreading fake news. This paper surveys the research done between 2016 and 2022 on psychological factors influencing one’s susceptibility to believing or spreading fake news, considers which of the psychological factors are plausible distractors to one’s exercise of analytical thinking, and discusses some implications of considering them as distractors to analytical thinking. From these, the paper draws five conclusions: (1) It is not analytical thinking per se, but analytical thinking directed at evaluating the truth, that safeguards us from believing or spreading fake news. (2) Psychological factors can distract us not only from exercising analytical thinking but also in exercising analytical thinking. (3) Whether a psychological factor functions as a distractor from analytical thinking or in analytical thinking may depend on contextual factors. (4) Measurements of analytical thinking may not indicate vulnerability to believing or spreading fake news. (5) The relevance of motivated reasoning to our tendency to believe fake news should not yet be dismissed. These findings may be useful in guiding future research at the intersection of analytical thinking and susceptibility to believing or spreading fake news.


Introduction

Fake news has deleterious effects on society. The effects range from destabilizing society by cleaving racial and religious fault lines (Grambo, 2019), to influencing national elections (Allcott and Gentzkow, 2017), to derailing public health policy implementations (Waszak et al., 2018). Since the gradual appearance and entrenchment of the term “fake news” in the popular lexicon in 2016, much research has been done on the psychological and environmental determinants of people’s responses to fake news, as well as the effectiveness of interventions designed to reduce the deleterious effects of fake news on society. This review begins with Pennycook and Rand’s (2019) observation that people fall for fake news because they are distracted from thinking analytically. After elaborating on Pennycook and Rand’s observation and orienting it to the sharing of fake news, we conduct a survey of the relevant literature from 2016 to 2022 about possible psychological distractors that cause us to fall for fake news.

Disinformation is information that is (i) false, (ii) communicated as true, and (iii) intentionally communicated as true in order to influence people’s beliefs or behavior (see Duffy et al., 2020 ; McGonagle, 2017 ). Item (iii) distinguishes disinformation as a type of misinformation. Misinformation is information that is false and communicated as true but without the intention for the impression of its truth to influence people’s beliefs or behavior (see Ireton and Posetti, 2018 ; Wardle and Derakhshan, 2017 ). False information that is published as news due to sloppy journalism is an example of misinformation. Disinformation includes intentionally false propaganda, intentionally false advertising, and fake news. Fake news is distinguished from other types of disinformation by being designed to mislead people that the content is news, for example, by mimicking the visual format of reputable news websites.

There are at least three senses of the term “falling for fake news”. When one falls for fake news, one believes the disinformation, cannot distinguish it from genuine news, and/or retransmits it. We take Pennycook and Rand’s ( 2019 ) observation that people fall for fake news because they are distracted from thinking analytically as our starting point. Being saved from falling for misinformation of any kind by thinking carefully about the information is a common experience that many people have. This experience lends intuitive support to the assumption that we tend to fall for misinformation when we do not think carefully enough about the information.

For many studies, one’s propensity and competency in “thinking carefully” about information is indicated by performance on analytical thinking scales. Pennycook and Rand (2018, 2019), Nurse et al. (2022), and Pehlivanoglu et al. (2021, 2022) use the cognitive reflection test (CRT). The CRT measures one’s propensity to engage in slow and careful thinking. It presents the subject with a question that admits a quick but wrong answer and a correct answer that requires slow and careful thinking. (A classic item: a bat and a ball cost $1.10 in total, and the bat costs $1.00 more than the ball; the intuitive answer for the ball’s price is 10 cents, but the correct answer is 5 cents.) Other tests indicating one’s propensity to think carefully about a subject matter include tests of conscientiousness (Buchanan, 2020; Lawson and Kakkar, 2022; Li et al., 2022) and tests of argumentative ability (Lantian et al., 2021).

While there have been literature reviews about psychological factors that render us susceptible to fake news, we have not found any that focus on the relation between the factors and analytical thinking directly. Due to the large number of disinformation-susceptibility studies investigating psychological factors and analytical thinking, an analytical review of such studies can help us to understand the extent to which analytical thinking can “save” us from falling for fake news.

Research question

What psychological distractions to analytical thinking mediate the belief in and/or retransmission of fake news?

Methodology

Taking Pennycook and Rand’s (2019) paper as the seed article for our project, we identified key terms from it and its reference list by relevance to our research question, by frequency of usage in journal article titles and abstracts, and by terms with similar meanings that also frequently appear in titles and abstracts. We conducted a critical review (Grant et al., 2009) by searching for relevant literature on our university e-journal database and Google Scholar from 2016 onwards with the following 9 search strings:

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*) AND (reflect* OR think* OR reflex* OR cogni* OR reason* OR motivated OR bullshit* OR profound)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (reflect* OR think* OR reflex* OR cogni* OR reason* OR motivated OR bullshit* OR profound)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (belie* OR identif* OR discern* OR assess* OR rat* OR evaluat*) AND (heuristic* OR familiar* OR source OR credib* OR “confirmation bias”)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (heuristic* OR familiar* OR source OR credib* OR “confirmation bias”)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (signal* OR reputation OR identity OR virtue OR overclaim*) AND (prosocial OR pro-social OR moral* OR outrage* OR punish* OR interesting* OR entertain* OR gossip* OR rumor)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (signal* OR reputation OR identity OR virtue OR overclaim*)

(spread OR disseminat*) AND (“fake news” OR misinform* OR disinform*) AND (prosocial OR pro-social OR moral* OR outrage* OR punish* OR interesting* OR entertain* OR gossip* OR rumor)
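The wildcard and boolean syntax above is database-specific, but the same filters can be reproduced locally, for instance to screen an exported list of titles and abstracts. The sketch below is ours, not the authors' tooling; it assumes each query is a list of AND-ed groups of OR-ed terms, and treats a trailing `*` as a word-character wildcard:

```python
import re

def term_to_regex(term: str) -> re.Pattern:
    r"""Convert a database-style search term into a compiled regex.

    Quoted phrases lose their quotes; a trailing '*' becomes a
    word-character wildcard, e.g. 'disseminat*' -> r'\bdisseminat\w*\b'.
    """
    term = term.strip('"')
    pattern = re.escape(term).replace(r'\*', r'\w*')
    return re.compile(r'\b' + pattern + r'\b', re.IGNORECASE)

def matches(text: str, query: list[list[str]]) -> bool:
    """True if every AND-group contains at least one OR-term found in text."""
    return all(any(term_to_regex(t).search(text) for t in group)
               for group in query)

# Search string 2: (spread OR disseminat*) AND ("fake news" OR misinform* OR disinform*)
query = [
    ["spread", "disseminat*"],
    ["fake news", "misinform*", "disinform*"],
]

print(matches("How misinformation can spread on social media", query))  # True
print(matches("A survey of climate change attitudes", query))           # False
```

Note that `\b...\b` makes unstarred terms exact-word matches, which mirrors how databases treat unstarred keywords; only starred terms like `disseminat*` match their inflected forms.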

For repeated search results, we omitted the redundant items. In order to focus on psychological distractors mediating belief and retransmission of disinformation, we further filtered the search results with the following 5 exclusion criteria and respective rationales:

In this section, we summarize the contents of empirical studies that relate to our research questions. By searching Google Scholar and the journal databases that our university library has access to, and by manual filtering based on the above five criteria, we arrived at 87 relevant articles. In this section, we present our curation of articles according to psychological factors that are covered by research in the 2016–2022 time period.

Emotions

Emotions can influence one’s responses to fake news. Martel et al. (2020) suggest the following influences on finding fake news persuasive. First, certain moods associated with happiness or higher motivation generally correlate with one’s tendency to believe false information and negatively correlate with one’s capacity to tell when one is being deceived; conversely, certain moods associated with sadness or lower motivation generally correlate with doubting and disbelief. Second, when one is angry, one is more likely to depend on heuristic cues in deciding whether to believe information, and less likely when one is sad. Third, anxiety tends to render one more willing to entertain perspectives contrary to one’s own, while anger tends to decrease that propensity. Maldonado (2019) suggests that emotion is inextricably intertwined with reasoning, and that confirmation bias may elicit positive emotions that have a physiological basis in dopamine production. Thus, certain emotions can not only render us more likely to believe fake news but also make it pleasurable to do so.

In their own research, Martel et al. (2020) found that depending on one’s emotions in processing false news information made one more likely to believe it. Horner et al. (2021) have shown heightened emotions to positively correlate with finding fake news that aligns with one’s existing beliefs persuasive, with disseminating disinformation, and with withholding information incompatible with existing beliefs. The emotions of anxiety and anger play active causal roles: they cause one to fall for fake news in order to act on the information. In contrast, sadness is passive in the sense that it causes one to miss signs that the information is fake news and to proceed as though it were genuine news.

Weimann-Saks et al. (2022) were unable to detect a mediating role played by emotions between one’s cognition of the information and one’s subsequent actions based on that cognition. The information in question is rumors, which share with fake news the quality of being information that is unverified but communicated as true. This finding does not detract from the general tenor that dependence on emotions renders one more susceptible to believing fake news, as the subsequent actions in question pertain to spreading rather than believing the information. It does, however, indicate that one should separately evaluate the role of emotions in believing fake news and the role of emotions in spreading fake news.

The higher the stakes involved and the greater the perceived risk, the greater the anxiety. Anxiety, in turn, causes one to err on the side of treating information as true and disseminating it as such, even when one is not sure. This phenomenon was widespread during the Covid-19 pandemic, when well-meaning family and friends, for whom the stakes could be the death of a loved one, spread disinformation about ways to detect, prevent or treat the disease. Oh and Lee (2019) did a relatively early study linking health anxiety, together with health literacy and one’s perception of the significance of the information, to checking the veracity of the information and spreading it. Laato et al. (2020) showed that one’s perception of the danger posed by Covid-19, together with one’s assessment of one’s susceptibility, contributed to one’s propensity to share information without verification. These studies suggest that the anxiety caused by the fear of death can lead to the spreading of fake news. Sun et al. (2020) showed that the elderly are more apt to retransmit unverified information the more they believe it and the more anxious they are about it. Su (2021) showed that, when anxiety was taken into consideration as a mediator, the positive relationship between social media use and misinformation beliefs increased in significance. Kim and Kim (2020) showed that perceived risk and stigma increase belief in false information about Covid-19. Li et al. (2022) showed that such propensities are compounded by personality traits like extroversion, emotional instability and conscientiousness, and that people were more persuaded by Covid-19 misinformation and more likely to share it when their perception of risk from Covid-19 deaths increased and they experienced more intense negative emotions. Ahmed (2022) presents a different type of anxiety: the fear of missing out (FOMO). Ahmed shows FOMO positively correlating with the tendency to retransmit deepfakes, a tendency enhanced by low cognitive ability or increased social media use.

Chuai and Zhao (2022) showed that increased anger associated with a piece of fake news positively correlates with its increased virality, and hypothesize that the anger incentivizes the retransmission of the information. Deng and Chau (2021) showed that consumers of online news become skeptical of content when the content is laced with angry expressions. This shows that the locus of anger is significant: to be persuasive, the information should evoke anger rather than contain angry language. Han et al. (2020) showed that anger increased one’s propensity to disseminate disinformation by causing one to find disinformation scientifically persuasive. Bago et al. (2022) showed that anger correlates with decreased truth discernment of headlines, and also that truth discernment improved when subjects actively reduced their emotions. Moral outrage and the anger that it generates have also been shown to make people believe disinformation. Ali et al. (2022a) studied both fear and anger. They found that fear tended to make people who are skeptical of vaccines share false anti-vaccine information, while anger tended to make people who neither are skeptical of nor endorse vaccines share false anti-vaccine information.

Repeated exposure

Repeated exposure to fake news may cause us to believe or retransmit the information without thinking analytically about its truth. Pennycook et al. (2018) found that even one encounter with disinformation can cause subjects to believe that the information is accurate. They hypothesize that this is due to the fluency of subsequent processing after the initial exposure. The effect persists even when the information is flagged as controversial by debunking services. In a different study, about how visual images affect people’s accuracy judgments, Smelter and Calvillo (2020) corroborated Pennycook et al.’s findings that repeated exposure increases people’s propensity to judge false information as accurate. Repeated exposure to disinformation can also increase people’s propensity to retransmit false information. Effron and Raj (2020) found that subjects considered it less ethically wrong to retransmit headlines that are labeled false when they had been exposed to them up to four times. This was so even when subjects did not believe the headlines. In Nadarevic et al.’s (2020) investigation of the joint effect of several factors on accuracy evaluations, source credibility together with repeated exposure was found to have a robust effect. Swire et al. (2017) found that familiarity caused by repeated exposure contributes to the continued influence effect, where subjects believe disinformation even after being informed of its correction. They found the elderly to be more susceptible, and found that explanations and attention to the facts work to dispel the continued influence effect.

Altruism

Altruism is the desire to benefit one’s community without seeking recompense for oneself; the community can be as wide as humanity or as parochial as one’s in-group. Altruism has been identified as one of the most important reasons for retransmitting fake news (Balakrishnan et al., 2021; Apuke and Omar, 2021). Anxiety can be associated with altruism when one’s desire to benefit others stems from one’s concern about their wellbeing. The studies relevant to altruism as a contributing factor to one’s susceptibility to disinformation are of two types. The first type studies the desire to benefit others as a factor. Balakrishnan et al. (2021) found, among a Malaysian sample, that altruism was a significant factor in accounting for the retransmission of disinformation. The second type studies collectivism as a factor. Duffy and Tan (2022) found that, when unsure whether news information is true or false, subjects would share it because sharing contributes in part to the cohesiveness of the group, just as rumor does. Sun et al. (2020) and Weimann-Saks et al. (2022) also found similarities between the retransmission of disinformation and rumors. Comparing Chinese with American collectivism scores, Lin et al. (2022) found that Chinese subjects with higher collectivism scores tended to interpret arbitrarily produced and unclear messages as communicating significant information. They found that even fleeting experiences of collectivism caused subjects to find meaning in such information.

Identity protection or enhancement

Factors pertaining to the protection or display of one’s identity can contribute to one’s susceptibility to disinformation. Pennycook and Rand (2020) found that people who “overclaim”, that is, who signal that they know more than they actually do, are more susceptible to believing disinformation. Islam et al. (2020) found, among other factors, that self-promotion is positively correlated with the retransmission of information without checking its veracity. Littrell et al. (2021) found that people who indulge in “persuasive bullshitting”, that is, who produce untruths aiming to impress or convince an audience, are more susceptible to believing disinformation. Pereira et al. (2021) studied the effect of subjects’ interest in protecting their political identity on their credence in news information and their tendency to share it, and found a positive correlation. This study enables us to link confirmation bias or motivated reasoning with identity in susceptibility to disinformation. Druckman et al. (2021) found that subjects belonging to ethnic groups that are in the minority in society, who are fervently religious, or who have robust political allegiances are more likely to hold false beliefs. However, Islam et al. (2020) had earlier found that being religious is associated with a decrease in the retransmission of unconfirmed content.

Confirmation bias

There are numerous studies on the effect of confirmation bias and people’s propensity to fall for fake news. In a study about confirmation bias in finding information about climate change persuasive and effective, Zhou and Shen ( 2022 ) established strong positive correlations. Bauer and Clemm von Hohenberg ( 2021 ) found that subjects who were supplied with false information that aligned with their beliefs were more likely to trust news information from the same source. Horner et al. ( 2021 ), in a study on emotions driving credence in false headlines and their dissemination, found that subjects had a greater tendency to believe information that supports beliefs they already possess. Tandoc et al. ( 2021 ) discovered an association between people’s credence in disinformation and political affiliation. The link between political affiliation and falling for fake news was corroborated by Pereira et al. ( 2021 ), who narrowed it to confirmation bias specifically, defined as a preference for news that dovetail with their prior stereotypical beliefs. Michael and Breaux ( 2021 ) explored the converse relation, between political affiliation and skepticism. They found that there is a tendency for people to be skeptical about news information -- i.e., regarded as “fake news”-- information that is incompatible with their political stances. This finding is corroborated by Baxter et al. ( 2019 ) in Scotland, Bozdağ and Koçer ( 2022 ) in Turkey, Hameleers and Brosius ( 2022 ) in the Netherlands, and Tsang ( 2021 ) in Hong Kong. Michael and Sanson ( 2021 ) studied confirmation bias as an effect of the joint influence of political affiliation and choice of news sources, which in turn affect people’s ability to distinguish real from fake news. In a similar vein, Traberg and van der Linden ( 2022 ) found that political affiliation mediated by perceptions of source credibility influences people to judge information aligned with their political orientations to be more accurate. 
Comparing conspiracy mindset and political affiliation, Faragó et al. ( 2020 ) found that the effect of political affiliation, mediated by source credibility, in influencing people to believe “wishful thinking” political disinformation was greater than that of conspiracy mindset. Pertaining specifically to one’s propensity to believe and one’s propensity to share disinformation, Turel and Osatuyi ( 2021 ) found that political affiliation is positively correlated with both, mediated by factors like the political affiliation of one’s social network. Pearson and Knobloch-Westerwick ( 2019 ) found that there is less confirmation bias in consuming printed than online news. Some studies cast doubt on the strong association between confirmation bias and falling for fake news. Baptista et al. ( 2021 ) found that participants with conservative views were more likely to believe and spread disinformation, regardless of the political orientation of the content. Corroborating this finding, Calvillo et al. ( 2020 ) found that subjects with a conservative political orientation tend to be less accurate in distinguishing real from fake news.

Source credibility

The perceived authoritativeness pertaining to source credibility has received much attention (see Buchanan and Benson, 2019 ; Buchanan, 2020 ; Pehlivanoglu et al., 2021 ; Folkvord et al., 2022 ). Sterrett et al. ( 2019 ) distinguish between two factors that lead subjects to find social media news credible—the credibility of the individual who retransmits the information and the credibility of the news platform. Nadarevic et al. ( 2020 ) investigated the interaction between the perceived credibility of the source and other factors in truth discrimination. Faragó et al. ( 2020 ) found that perceived source credibility is an important mediator between political affiliation and belief in disinformation. The type of information sources that one holds to be authoritative can indicate one’s propensity to fall for fake news. Bonafé-Pontes et al. ( 2021 ) found, for Brazilian participants, that those who trusted social media news tended to be worse at truth discrimination, while those who trusted the WHO and traditional media (newspapers, radio and television) tended to be better at it. However, Tsang ( 2021 ) did not find any association between types of sources and the perception that a piece of “news” is false. Furthermore, Hameleers et al. ( 2022 ) found that people who tend to identify fake news are those who tend to distrust mainstream news platforms and who tend to engage more with non-mainstream news platforms. Zimmermann and Kohring ( 2020 ) link distrust in news to belief in disinformation. Xiao et al. ( 2021 ) found that trust in social media news is an important mediator between the consumption of social media news and conspiracy ideation. De Coninck et al. ( 2021 )’s and Melki et al.
( 2021 )’s results corroborate others’ findings as follows: engagement with traditional media correlates negatively with belief in disinformation and conspiracy theories, while engagement with political personalities, social media and personal social networks correlates positively with both. However, de Coninck et al. ( 2021 ) also found that engagement with healthcare specialists correlates negatively with belief in conspiracy theories only, and does not reduce credence in disinformation. In a similar vein, Lobato et al. ( 2020 ) found that low trust in mainstream medicine was among the factors associated with vulnerability to health disinformation. Hopp et al. ( 2020 ) found that people who have extreme beliefs, high distrust of mainstream media and high social distrust are also more likely to retransmit fake news. Finally, Laato et al. ( 2020 ) found that subjects’ propensity to retransmit information without checking its veracity is associated with their faith in information on the internet coupled with information overload.

Social endorsement

The perception of social endorsement constitutes a second basis for people’s trust in the veracity of a message or willingness to share it. Studies about the perception of social endorsement can be subdivided into those where trust is based on some perceived reputational aspect of the source that is unrelated to the veracity of the message, and those where trust is based on the perceived popular reception of the message. An example of trust based on some perceived reputational aspect of the source is a celebrity source of a message about climate change as opposed to a scientific source. Sterrett et al. ( 2019 ) showed that social media news coming from societal elites is perceived as more credible than news coming from an established news outlet. An example of trust based on perceived popular reception of the message is the number of “likes” or comments that a message receives. Keselman et al. ( 2021 ) found that recommendations by friends, good reviews and favorable but uncited research increase sharing proclivities. However, Buchanan ( 2020 ) found that neither source authority nor evidence of popular engagement with the message influenced sharing proclivities. Harff et al. ( 2022 ) found that relationships with online influencers did not make subjects more believing of the influencers’ claims. Avram et al. ( 2020 ) found that subjects’ awareness of the quantities of sharing and liking that a message of dubious veracity receives contributes to the risk of believing and spreading it. Ali et al. ( 2022b ) showed that a large number of “likes” tends to increase the believability of a message. They theorized that a desire to embellish the information explains why people also tend to share such messages. Mena et al. ( 2020 ) found that the perceived trustworthiness of a source contributes to the credibility of a false message. Ren et al. ( 2021 ) offer a novel perspective.
They showed that people can share messages that they are skeptical about because their decision is based on balancing message veracity against message engagement by others. The prospect of getting many likes and comments factors into their decision to share a message of dubious veracity.

Conspiracy thinking

Anthony and Moulding ( 2019 ) found that a conspiratorial worldview conduces to someone’s credence in fake news and that other factors like normlessness relate to credence in fake news through their influence on conspiratorial thinking. In a large-scale study (N = 38,113), Kantorowicz-Reznichenko et al. ( 2022 ) found that subjects who tend to rely on conspiratorial thinking tend not to change their behavior to align with public health messaging during the Covid-19 pandemic, and that people who think deliberately tend not to rely on conspiratorial thinking. The latter finding corroborates Lantian et al.’s ( 2021 ) finding that analytical thinking competency in the argumentative context correlates negatively with a conspiratorial worldview. Calvillo et al. ( 2021 ) found that, among other factors, subjects who tended to have conspiratorial worldviews tended also to believe false political headlines. Why might subjects who rely on conspiratorial thinking resist changing their behavior for public health promotion? Hughes et al. ( 2022 ) found a negative correlation between credence in conspiracy ideation and obedience to public health instructions because the subjects perceived low health risk coupled with high risk to livelihood and liberty. Lobato et al. ( 2020 ) found that people who scored high on social dominance orientation tend to retransmit disinformation, especially conspiratorial beliefs. Landrum and Olshansky ( 2019 ) found that, while conspiratorial and scientific orientations both contribute significantly to credence in scientific disinformation, the contribution of a conspiratorial worldview to resisting scientific knowledge is inconclusive. Miller et al. ( 2016 ) found that certain attributes of persons better predict belief in conspiracy theories, namely being well-informed about politics and exhibiting a high level of distrust.

Motivated reasoning

“Motivated reasoning” refers to the possibly unconscious exercise of one’s reasoning capacity ostensibly to arrive at a true conclusion, but actually to serve some interest other than arriving at the truth. Motivated reasoning has been posited to be an important factor in one’s believing in or endorsing disinformation. A commonly studied form of motivated reasoning is reasoning in order to support one’s political affiliations. Pennycook and Rand ( 2019 ) found that subjects’ proclivity to engage in analytical thinking explains their competence at distinguishing true from false headlines even when the false headlines accord with their political affiliations. They conclude that it is inattentive thinking rather than motivated reasoning that contributes to one’s vulnerability to disinformation. Ross et al. ( 2021 ) corroborated Pennycook and Rand’s results with a large sample (N = 1973) and also found no significant correlation between analytic thinking and a greater inclination to retransmit messages that aligned with one’s political affiliations. Also corroborating Pennycook and Rand, Baptista et al. ( 2021 ) found, for a sample of Portuguese participants, that general political affiliation correlated with susceptibility to disinformation—right-wing participants tended to be more susceptible than left-wing participants, even to disinformation that is discordant with their political affiliation.

However, Calvillo and Smelter ( 2020 ) arrived at the opposite result. They found that subjects tended to judge headlines that conflict with their political affiliations as less accurate than headlines that align with their political affiliations, and that subjects who exhibited more analytic thinking were also more biased in their accuracy judgments of the headlines. Others also lend support to the role of motivated reasoning in susceptibility to fake news. Tsang ( 2021 ) found that subjects judged identical news information to be false to different degrees based on their prior positions on the content, and took this to support the view that motivated reasoning contributes to the retransmission of disinformation. Michael and Sanson ( 2021 ) showed that people rely on heuristics that take less effort than analytic thinking in judging the veracity of news. The bias in their distinguishing true from false headlines is due to the bias in the credence that they place on politically concordant sources rather than the information itself. This finding is corroborated by the findings of Traberg and van der Linden ( 2022 ), and goes against earlier findings (Clayton et al., 2019 ).

Stanley et al.’s ( 2022 ) discussion describes some ways in which cognitive capacities involved in assessing truth can also contribute to people’s belief in disinformation, and can shed light on the mechanism linking motivated reasoning to belief in fake news. Vegetti and Mancosu ( 2020 ) found that, although people were inclined to perceive politically concordant messages as more believable, people who were well-informed politically could better distinguish true from fake news. With respect to the sharing of fake news, Osmundsen et al. ( 2021 ) found that political affiliation engenders emotional factors like hatred, which fuel the retransmission of disinformation. Wischnewski et al. ( 2021 ) found partial evidence that subjects are more likely to retransmit messages that accord with their own political beliefs and attributed this to motivated reasoning.

Influential findings

From the 87 curated articles, we looked for those with the most influential findings, taking the number of citations as a proxy for a paper’s influence. Our inclusion threshold is the third-quartile citation count among the articles we curated. Because a paper with influential findings published in more recent years can be expected to have accumulated fewer citations than one that has been in the literature longer, we calculated the threshold on a yearly basis. Based on this method, we included papers with more than 104 citations for 2019, more than 124 citations for 2020, more than 36 citations for 2021, and more than 14 citations for 2022. The increase in the threshold from 2019 to 2020 could be due to interest in pandemic-related disinformation. Our curation contained one paper each from 2016, 2017 and 2018, with citation counts of 493, 250 and 1001, respectively. As quartile computation was not possible within these years, we approximated an increase of 1.5 times yearly over the combined third-quartile threshold of 2019 and 2020, yielding thresholds of 167, 250 and 375 for 2018, 2017 and 2016, respectively. Because they met these thresholds, the three papers were included, giving a total of 24 papers. Appendix 2 consists of two tables: the first lists the papers and summarizes their methodology and findings; the second maps each paper to the factors it investigates, displaying the relations between factors.
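As a minimal sketch of this selection procedure (the citation counts below are invented placeholders, not the actual counts from the curated corpus, and the helper names are ours):

```python
import statistics

# Hypothetical citation counts per publication year; placeholders only,
# not the actual counts from the 87 curated articles.
citations_by_year = {
    2019: [12, 30, 55, 80, 104, 150, 210, 400],
    2020: [9, 40, 77, 124, 160, 300, 520],
    2021: [3, 10, 22, 36, 60, 90],
    2022: [1, 4, 9, 14, 25, 40],
}

def yearly_thresholds(data):
    """Third-quartile citation count per year (the inclusion cutoff)."""
    return {
        year: statistics.quantiles(counts, n=4)[2]  # 3rd of the 3 cut points
        for year, counts in data.items()
    }

def is_influential(citations, year, thresholds):
    """A paper qualifies if its citations exceed its year's threshold."""
    return citations > thresholds[year]

thresholds = yearly_thresholds(citations_by_year)
# For 2016-2018 only one paper per year exists, so no quartile can be
# computed; the review instead extrapolates roughly 1.5x per year over
# a combined 2019/2020 base, giving the fixed cutoffs below.
thresholds.update({2018: 167, 2017: 250, 2016: 375})
```

With the real per-year citation lists in place of the placeholders, `is_influential` reproduces the inclusion rule described above.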

In this section, we identify five themes from the findings in the previous section and discuss them.

Anxiety vs. anger

According to Bodenhausen et al. ( 1994 ), feeling angry causes subjects to depend more on heuristic cues to arrive at a judgment in an argumentative context, whereas sadness causes subjects to depend less on the same cues. Forgas and East ( 2008 ) found that being in a bad mood makes one more generally disbelieving, while being in a good mood makes one more vulnerable to being deceived. MacKuen et al. ( 2010 ) found that anger encourages heuristic as opposed to systematic reasoning, relying quickly on intuitive cues rather than following a logical order in thinking, while anxious information processing could cause one to entertain contrary perspectives (Martel et al., 2020 ). In general, the background literature aligns with the studies linking anger with an increased susceptibility to disinformation but appears to oppose the studies linking anxiety with an increased susceptibility to disinformation. This may be because anxiety influences our susceptibility to disinformation jointly with other emotions, beliefs and personality traits, while anger affects our motivation to rely on heuristic cues in judging the truth of information more directly. This possibility is supported by studies that examine anxiety alongside a raft of other factors, like risk perception and extraversion.

While the connection between anxiety or anger and cognitive processing like believing has gained wide attention, the connection between anxiety or anger and the other dimension of susceptibility to fake news, retransmitting it, is underexplored. Existing studies offer indirect support for such a connection: we are more likely to retransmit what we believe to be true (Buchanan, 2020 ; Altay et al., 2022 ), including when we are motivated by altruism (Apuke and Omar, 2021 ) to help others with what we believe to be true, or when our belief rests on trust (Sterrett et al., 2019 ; Laato et al., 2020 ; Melki et al., 2021 ). However, there appears to be inadequate coverage of mediators between anxiety or anger and the retransmission of fake news. Solovev and Pröllochs ( 2022 ) found that disinformation spreads more quickly when it contains numerous instances of other-condemning vocabulary. This suggests that people may retransmit what they do not believe to be true in order to express their anger or to punish the perceived perpetrators (see Hartung et al., 2019 ). Osmundsen et al. ( 2021 ) found that hatred may contribute to retransmitting disinformation, which also suggests that punishment is a motivation for retransmission and may mediate between anger and retransmission.

Identity-protective mechanisms

The anxiety and anger that interfere with judgment may be connected to identity protection. In our literature review of identity, many of the factors studied, like political affiliation, bullshitting and overclaiming, are indicators of susceptibility to fake news: they correlate with one’s propensity to believe or retransmit disinformation. However, it is unclear what mechanism links these factors to identity. One possibility is that, since these factors relate to the promotion and protection of one’s identity, familiar emotions like anxiety (Wischnewski and Krämer, 2021 ) or anger mediate between one’s identity and one’s susceptibility to disinformation. Kahan et al. ( 2017 ) discuss identity protection in terms of affinity groups, groups consisting of people who share allegiance to a set of fundamental values. For Kahan, people invest a lot in preserving their positions in affinity groups and the reputation of the affinity groups to which they belong. Wischnewski and Krämer ( 2021 ) linked Kahan’s thesis to anger and anxiety by postulating that, on encountering information that goes against their identity, people will feel angry or anxious, and these emotions will, in turn, cause them to resist the information even if it is true. Van Bavel and Pereira ( 2018 ) found that political affiliation can affect recall, tacit assessments and how we perceive. However, identity-protective mechanisms may not exhaust the space of identity as a factor influencing one’s susceptibility to fake news. One might, instead of protecting one’s identity, intend to signal some aspect of one’s identity in conveying information. This hypothesis is compatible with a range of factors influencing one’s susceptibility to spreading disinformation, for example, to signal that one is altruistic. We did not find any research on signaling as a factor in the retransmission of fake news in our curation.
The closest was self-promotion (Islam et al., 2020 ), which can encompass signaling but can also stem from intentions to elicit others’ approval directly, without the mediating recognition that one possesses some valued attribute signaled through the retransmission.

Altruism, communitarianism and emotions

Altruism is one of the factors that receive attention in studies (e.g. Apuke and Omar, 2021 ). Altruism is a construct that consists of the motivation to benefit others without the expectation of personal recompense. Plume and Slade ( 2018 ) document many studies finding altruism to be a motivation to disseminate content on social media. It is difficult to tease apart the motivation of pure altruism, communitarian beliefs (Lin et al., 2022 ) and associated emotions. For example, a subject may report sharing COVID-19 disinformation on social media out of altruism where the motivator could be a communitarian mindset coupled with anxiety, or a subject could have a desire to share COVID-19 disinformation not from an articulable reason to help others without seeking recompense, but in pursuit of the good feelings that come from perceiving that one has helped others. Studies on communitarian mindsets and susceptibility to the retransmission of disinformation suggest that we should be looking at community-oriented goals and emotions generated in context rather than the general attribute of altruism per se. Given that even momentary priming of subjects to be community-oriented increased their propensity to find meaning in unclear messaging and may increase their propensity to share such messages if they trigger emotions like anxiety (Lin et al., 2022 ), focusing on altruism as a stable trait may obscure the context-sensitive nature of the attribute.

Bias triggers vs. emotions

The studies linking greater susceptibility to disinformation with repeated exposure to the disinformation suggest that there can be non-emotional distractors from thinking analytically. The link between repeated exposure and belief is well documented in studies of the illusory truth effect, the phenomenon of finding information truer with repeated exposure to it. The phenomenon was first identified in Hasher et al.’s ( 1977 ) study, where students were more likely to rate sentences as “true” after repeated exposure to them. Hassan and Barber ( 2021 ) survey the research in the context of disinformation. Studies on the continued influence effect show that the effect persists even in the face of correction of false information. A recent overview of research on the effect can be found in Kan et al. ( 2021 ). The illusory truth effect and the continued influence effect could account for “falling for fake news” in the sense of explaining why subjects believe fake news. However, Swire et al. ( 2017 ) showed that repeated exposure to a myth in corrections only served to slow down belief in the correct information; they did not find evidence of the myth-repeating correction backfiring to promote belief in the myth. While the bulk of the literature pertains to believing disinformation, Effron and Raj’s ( 2020 ) study suggests that repeated exposure can also affect one’s motivation to retransmit disinformation, in their case, by decreasing the perceived unethicality of retransmitting disinformation headlines. Stepping back, we observe that repeated exposures differ from emotions in that they may render us susceptible to disinformation without involving emotions, yet also differ from those factors that operate within reasoning in that they do not appear to feature in reasoning about the truth. They may simply be tendencies of ours that are triggered by our exposure to the world.

The “saving” function of analytical thinking

The literature about the significance of analytical thinking to susceptibility to fake news overwhelmingly relates analytical thinking to truth discernment (e.g. Martel et al., 2020 ; Pennycook and Rand, 2019 , 2020 ; Lantian et al., 2021 ; Ross et al., 2021 ). From here, if there is a link to the spreading of fake news, it is conceivably by way of the fact that we spread information because we find it true and want to inform others, or refrain because we find it false. This suggests the assumption that analytical thinking has a “saving” function: the exercise of analytical thinking facilitates one’s distinguishing truth from falsity and “saves” us from believing and/or spreading false information. Associated with this assumption is the idea that the better one is at analytical thinking, or the more one is disposed to employ it, the better one is at using the right heuristics to defend against the threat of fake news, for example, to correctly rely on scientific sources of information, to appropriately assess whether social endorsement is relevant, and to accurately fact-check.

Pitted against these ideas is the view that analytical thinking is a double-edged sword: it may facilitate credence in and the spread of disinformation as much as it may prevent them. In this section, we consider three points that mitigate the significance of analytical thinking as a contributing factor in truth discernment and the retransmission of fake news. The first is that biases can operate in the analytical thinking of someone who does well on analytical thinking measures. The second is that someone who is good at analytical thinking, or disposed to think analytically, may also be good at or disposed to exercise motivated reasoning. The third is that good reasoning need not be directed at seeking the truth. Before that, let us take a closer look at how factors serve as distractions in analytical thinking and the implications of such roles.

Biases that operate in analytical thinking

Confirmation bias has been proposed to make us more effective persuaders by helping us recruit information relevant to supporting our standpoints (Mercier and Sperber, 2011 ), to contribute to the rigor and comprehensiveness of a group’s deliberation when its members exert themselves in defending their standpoints (Smith and Wald, 2019 ), to facilitate group coordination by reducing intentions of group members that are incompatible with the rest of the group (Norman, 2016 ), and to effect social change so that social reality comes to match our beliefs (Peters, 2022 ). Having a bias towards treating as true information that derives from the right types of sources (peer-reviewed scientific journals, mainstream news platforms, domain-relevant experts) can save us the time and effort of fact-checking every piece of information that comes our way. Similarly, ideally, user ratings or opinions that are based on a large and varied enough sample and not affected by other biases of the raters (see de Langhe et al., 2016 ; Godden, 2008 ) should enable consumers to judge the goodness of a product or the truth of information without having to do further research. Conspiracist thinking allows us to detect true conspiracies and defend ourselves against them (van Prooijen and van Vugt, 2018 ).

Ideal analytical thinking proceeds by way of careful reasoning—whether conducted between interlocutors or with oneself in analytical reflection—and is aimed at uncovering the truth. Reasoning consists of giving reasons for claims, that is, premises for conclusions. However, making and evaluating arguments is cognitively demanding and time-consuming. To make thinking more efficient, we rely on mental heuristics. While they may usually serve us well, mental heuristics can also cause us to believe disinformation (see Ali and Zain-ul-abdin, 2021 ; Brashier and Marsh, 2020 ). Confirmation bias, source credibility, social endorsement and conspiracist thinking can interfere with the course of ideal analytical thinking. They can cause us to accept information as true, or to judge information as true on their bases and to use that information as reasons for our claims. They can cause us to skip steps in evaluating the reasons for a claim or to accept claims uncritically. Because these biases operate within the exercise of analytical thinking, tests for a general propensity or competence to think analytically may not detect their operation. One may be predisposed to engage in analytical thinking, or be good at the tested type of analytical thinking (for example, the numerical analytical thinking measured by the traditional Cognitive Reflection Test, or CRT), and yet have one’s reasons and claims subject to interference by biases. Biases that operate within analytical thinking, then, can undermine hypotheses that analytical thinking guards us against falling for fake news. Due to the operation of biases in analytical thinking—in the selection of reasons for claims, the weight accorded to reasons and the required rigor of reasoning—one may fall for fake news even when one is predisposed to thinking analytically.

Inconclusiveness about motivated reasoning’s role

Pennycook and Rand ( 2019 ) argued that it is being distracted from thinking analytically, rather than motivated reasoning, that causes us to fall for fake news. Their claim is based on their finding that, regardless of the content’s alignment with subjects’ beliefs, subjects’ competence in analytical thinking as measured by the CRT correlated positively with truth discernment. The motivated reasoning thesis would instead predict a positive correlation between analytical thinking competency scores and truth discernment accuracy for content that is aligned with subjects’ beliefs, and a negative correlation for content that is not thus aligned.

There is substantial work on analytic thinking and its effect on believing fake news that aligns with Pennycook and Rand’s ( 2019 ) conclusions in undermining the hypothesis that our believing fake news is due to motivated reasoning (e.g. Pennycook and Rand, 2020 ; Clayton et al., 2019 ; Martel et al., 2020 ; Wischnewski and Krämer, 2021 ). Nevertheless, earlier work on motivated reasoning in topics like public discourse (Kahan et al., 2017 ) and misinformation (Kahan, 2017 ), together with suggestive work on confirmation bias (Zhou and Shen, 2022 ) and possibly opposing conclusions from recent work (Calvillo and Smelter, 2020 ; Tsang, 2021 ; Michael and Sanson, 2021 ; Traberg and van der Linden, 2022 ), urges caution in jettisoning motivated reasoning as an explanation of fake news believability.

There are two points standing in the way of downplaying the causal relevance of motivated reasoning to believing disinformation. First, subjects who are good at analytical thinking may engage in motivated reasoning only when identity or reputational stakes are high. This would occur in a setting where one has to defend one’s position to others, rather than answer analytical questions in private and anonymously. Pennycook and Rand admit that their study could have yielded different results if the questions were less factual and more personal (Pennycook and Rand, 2019 ). Second, the CRT items—analytical questions of the “brain teaser” variety—may prime subjects to focus on exercising their analytical thinking abilities to the exclusion of the factors that typically trigger motivated reasoning.

Non-truth-directed reasoning to spread fake news

Non-truth-directed reasoning may cause individuals to retransmit fake news despite scoring well on analytical thinking measures. Reasoning need not aim at evaluating truth. Some researchers distinguish between “accuracy-oriented” and “goal-oriented” reasoning (for example, Osmundsen et al., 2021 ), where the former aims at the truth while the latter does not. Goal-oriented or means-ends reasoning can lead one to accept the truth of content that one is unsure of. This may occur in the minds of anxious individuals debating whether to believe in some potentially life-saving remedy amid the panic and confusion at the beginning of the Covid-19 pandemic. Or they may reason to the conclusion that they should retransmit the information despite being uncertain about its truth. Similar means-ends reasoning can be conducted rigorously in deciding to retransmit unverified information for the sake of eliciting social engagement (likes, comments, ratings), to signal political affiliation, convey moral outrage, inflict social punishment, and/or to protect one’s identity or enhance one’s reputation in some other way. More perniciously, quasi-truth-directed reasoning can cause people to classify information that they are unsure about, or that they believe is false, as “interesting if true” (Altay et al., 2022 ) or “likely to become true” (Helgason and Effron, 2022 ). As research has shown, such classifications suffice to motivate the retransmission of the relevant information.

Conclusions

In answer to our research question, “What psychological distractions to analytical thinking mediate the belief in and/or retransmission of fake news?”, our review of the research literature from 2016 to 2022 identified the following distractions: emotions (especially anxiety and anger), repeated exposure, altruism, identity protection or enhancement, confirmation bias, source credibility, social endorsement, conspiracy thinking and possibly motivated reasoning. We think that Pennycook and Rand ( 2019 ) are generally correct when they say that people fall for fake news because they are distracted from thinking. However, their view can be qualified by the following conclusions from our discussion.

First, it is not analytical thinking per se, but analytical thinking directed at evaluating the truth, that safeguards us from believing or spreading fake news. Someone may be good at analytical thinking and employ it in means-ends reasoning to decide to accept as true information that they are uncertain about, or to spread information that they know is false, in order to achieve their purposes. Some means-ends reasoning masquerades as truth evaluation: quasi-truth-directed reasoning like “interesting if true” or “likely to become true” may inform one’s decision to believe the information.

Second, psychological factors can distract us from exercising analytical thinking, and they can also distract us in exercising analytical thinking. They distract us from analytical thinking when, for example, we judge a statement true because we have been exposed to it many times. They distract us in analytical thinking when we favor certain premises in our reasoning because they support what we already believe. When we believe fake news that aligns with our political orientation, do we believe as a result of reasoning to confirm what we already believe, or as a result of being triggered by prior exposure? The former is a bias that distracts within analytical thinking, while the latter is a bias that distracts from analytical thinking.

Third, whether a psychological factor functions as a distractor within analytical thinking or from analytical thinking may depend on contextual factors. For example, under what conditions do we deliberate about source credibility in order to make an accuracy judgment and under what conditions do we make a snap accuracy judgment based on source credibility? The same heuristics that facilitate reasoning or are adaptive in some way can cause us to believe or retransmit disinformation. This suggests that investigating contextual factors influencing the choice of heuristics or how a heuristic is deployed is a potentially fruitful line of research.

Fourth, there may be a mismatch between the type of analytical thinking captured by the tests that many studies on analytical thinking and disinformation employ and the type of analytical thinking that informs one’s decision to believe or spread information; if so, measurements of analytical thinking may not indicate vulnerability to believing or spreading fake news. Many disinformation studies that seek to measure analytical thinking ability rely on the Cognitive Reflection Test (CRT). Might relevant dimensions of analytical thinking ability be missed by the CRT? Dimensions pertaining to argumentation—assessing reasons for claims, evidence for reasons, and the effectiveness of objections and counterexamples—seem relevant to the persuasiveness of fake news. These dimensions are not well served by the CRT, and only one study in our curation (Lantian et al., 2021) investigated them.

Fifth, despite the influential Pennycook and Rand (2019) study and the substantial body of research aligned with its skepticism about the role of motivated reasoning in fake news susceptibility, the relevance of motivated reasoning to our tendency to believe fake news should not yet be dismissed. The conditions under which people undertake motivated reasoning (for example, only when faced with an imminent threat to one’s identity or reputation) and the priming effects of analytical tests on accuracy assessments of true and false information have not been adequately investigated. Regarding the latter, not only the CRT but any analytical test could prime subjects to treat tasks in the immediate future as analytic ones, requiring objective and careful reasoning. This could preclude the operation of possible emotional triggers of motivated reasoning such as anger or anxiety.

What is the outlook for research on psychological factors affecting reasoning and disinformation? With the advent of disinformation fabricated by artificial intelligence, complete with proper-looking citations of fabricated sources, the arms race against disinformation is intensifying. The identification of psychological factors that can distract us away from reasoning or distract us within reasoning is not an end in itself. It serves the purpose of education and public messaging that can deflect at least the most pernicious disinformation influences. The five conclusions above correspondingly suggest the following topics of research that can contribute to the design of education and public messaging initiatives:

  • Truth-oriented values (e.g., intellectual integrity, open-mindedness, intellectual humility) and how they might serve as constraints on goal-oriented, as opposed to accuracy-oriented, reasoning

  • Distractors away from reasoning and distractors within reasoning among the extant psychological factors that influence our vulnerability to disinformation (especially untangling the potentially conflicting roles of anxiety and worry)

  • Conditions under which a psychological factor distracts us away from reasoning and conditions under which it distracts us within reasoning

  • Measures of other dimensions of analytical thinking (especially argumentative reasoning) to gauge the analytical thinking factors influencing our vulnerability to disinformation

  • Topics concerning motivated reasoning (especially high-stakes motivated reasoning as a factor influencing our vulnerability to disinformation, the role of emotion in motivated reasoning that leads us to believe or spread fake news, and conceptual work distinguishing confirmation bias from motivated reasoning)

Finally, while it is not within the ambit of our research question, we noticed that there is little work done on the relation between believing and spreading fake news. The studies that we curated understand “falling for fake news” or “vulnerability to fake news” in terms of believing it, spreading it, or both. The propensity to believe and the propensity to spread fake news are usually taken as dependent variables on other factors but not on one another. Yet, these two propensities practically exhaust our understanding of what it is to be vulnerable to fake news. It is therefore important for our taming of fake news to learn how belief in it interacts with its retransmission.

Ahmed S (2022) Disinformation sharing thrives with fear of missing out among low cognitive news users: a cross-national examination of intentional sharing of deep fakes. J Broadcast Electron Media 66(1):Article 1. https://doi.org/10.1080/08838151.2022.2034826


Allcott H, Gentzkow M (2017) Social media and fake news in the 2016 election. J Econ Perspect 31(2):211–236. https://doi.org/10.1257/jep.31.2.211


Ali K, Zain-ul-abdin K (2021) Post-truth propaganda: heuristic processing of political fake news on Facebook during the 2016 U.S. Presidential election. J Appl Commun Res 49(1):Article 1. https://doi.org/10.1080/00909882.2020.1847311

Ali K, Li C, Zain-ul-abdin K, Zaffar MA (2022b) Fake news on Facebook: examining the impact of heuristic cues on perceived credibility and sharing intention. Internet Res 32(1):Article 1. https://doi.org/10.1108/INTR-10-2019-0442

Ali K, Li C, Zain-ul-abdin K, Muqtadir SA (2022a) The effects of emotions, individual attitudes towards vaccination, and social endorsements on perceived fake news credibility and sharing motivations. Comput Hum Behav 134:107307. https://doi.org/10.1016/j.chb.2022.107307

Altay S, de Araujo E, Mercier H (2022) “If this account is true, it is most enormously wonderful”: interestingness-if-true and the sharing of true and false news. Digit Journalism 10(3):Article 3. https://doi.org/10.1080/21670811.2021.1941163

Anthony A, Moulding R (2019) Breaking the news: belief in fake news and conspiracist beliefs. Aust J Psychol 71(2):Article 2. https://doi.org/10.1111/ajpy.12233

Apuke OD, Omar B (2021) Fake news and covid-19: modelling the predictors of fake news sharing among social media users. Telematics Inform 56:101475. https://doi.org/10.1016/j.tele.2020.101475

Arias Maldonado M (2019) Understanding fake news: technology, affects, and the politics of the untruth. Hist Comun Soc 24(2):Article 2. https://doi.org/10.5209/hics.66298

Avram M, Micallef N, Patil S, Menczer F (2020) Exposure to social engagement metrics increases vulnerability to misinformation. Harv Kennedy Sch Misinf Rev. https://doi.org/10.37016/mr-2020-033

Bago B, Rosenzweig LR, Berinsky AJ, Rand DG (2022) Emotion may predict susceptibility to fake news but emotion regulation does not seem to help. Cogn Emotion 36(6):Article 6. https://doi.org/10.1080/02699931.2022.2090318

Balakrishnan V, Ng KS, Rahim HA (2021) To share or not to share—the underlying motives of sharing fake news amidst the Covid-19 pandemic in Malaysia. Technol Soc 66:101676. https://doi.org/10.1016/j.techsoc.2021.101676


Baptista JP, Correia E, Gradim A, Piñeiro-Naval V (2021) The influence of political ideology on fake news belief: the Portuguese case. Publications 9(2):Article 2. https://doi.org/10.3390/publications9020023

Bauer PC, Clemm von Hohenberg B (2021) Believing and sharing information by fake sources: an experiment. Political Commun 38(6):Article 6. https://doi.org/10.1080/10584609.2020.1840462

Van Bavel JJ, Pereira A (2018) The partisan brain: an identity-based model of political belief. Trends Cogn Sci 22(3):213–224. https://doi.org/10.1016/j.tics.2018.01.004


Baxter G, Marcella R, Walicka A (2019) Scottish citizens’ perceptions of the credibility of online political “facts” in the “fake news” era: an exploratory study. J Doc 75(5):Article 5. https://doi.org/10.1108/JD-10-2018-0161

Bodenhausen GV, Kramer GP, Süsser K (1994) Happiness and stereotypic thinking in social judgment. J Person Soc Psychol 66(4):621–632. https://doi.org/10.1037/0022-3514.66.4.621

Bonafé-Pontes A, Couto C, Kakinohana R, Travain M, Schimidt L, Pilati R (2021) Covid-19 as infodemic: the impact of political orientation and open-mindedness on the discernment of misinformation in WhatsApp. Judgm Decis Mak 16(6):Article 6


Bozdağ Ç, Koçer S (2022) Skeptical inertia in the face of polarization: news consumption and misinformation in Turkey. Media Commun 10(2):Article 2. https://doi.org/10.17645/mac.v10i2.5057

Brashier NM, Marsh EJ (2020) Judging truth. Annu Rev Psychol 71(1):499–515. https://doi.org/10.1146/annurev-psych-010419-050807

Buchanan T (2020) Why do people spread false information online? The effects of message and viewer characteristics on self-reported likelihood of sharing social media disinformation. PLoS ONE 15(10):Article 10. https://doi.org/10.1371/journal.pone.0239666


Buchanan T, Benson V (2019) Spreading disinformation on facebook: do trust in message source, risk propensity, or personality affect the organic reach of “fake news”? Soc Media+Soc 5(4):Article 4. https://doi.org/10.1177/2056305119888654

Calvillo DP, Smelter TJ (2020) An initial accuracy focus reduces the effect of prior exposure on perceived accuracy of news headlines. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00257-y

Calvillo DP, Rutchick AM, Garcia RJB (2021) Individual differences in belief in fake news about election fraud after the 2020 U.S. election. Behav Sci 11(12):Article 12. https://doi.org/10.3390/bs11120175

Calvillo DP, Ross BJ, Garcia RJB, Smelter TJ, Rutchick AM (2020) Political ideology predicts perceptions of the threat of COVID-19 (and susceptibility to fake news about it). Soc Psychol Personal Sci 11(8):Article 8. https://doi.org/10.1177/1948550620940539

Chuai Y, Zhao J (2022) Anger can make fake news viral online. Front Phys. https://doi.org/10.3389/fphy.2022.970174

Clayton K, Davis J, Hinckley K, Horiuchi Y (2019) Partisan motivated reasoning and misinformation in the media: is news from ideologically uncongenial sources more suspicious. Jpn J Political Sci 20(3):Article 3. https://doi.org/10.1017/S1468109919000082

De Coninck D, Frissen T, Matthijs K, d’Haenens L, Lits G, Champagne-Poirier O, Carignan M-E, David MD, Pignard-Cheynel N, Salerno S, Généreux M (2021) Beliefs in conspiracy theories and misinformation about COVID-19: comparative perspectives on the role of anxiety, depression and exposure to and trust in information sources. Front Psychol 12:646394. https://doi.org/10.3389/fpsyg.2021.646394

Deng B, Chau M (2021) The effect of the expressed anger and sadness on online news believability. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990607

Druckman JN, Ognyanova K, Baum MA, Lazer D, Perlis RH, Volpe JD, Santillana M, Chwe H, Quintana A, Simonson M (2021) The role of race, religion, and partisanship in misperceptions about Covid-19. Group Process Intergroup Relat 24(4):Article 4. https://doi.org/10.1177/1368430220985912

Duffy A, Tan NN (2022) Dubious news: the social processing of uncertain facts in uncertain times. Digit Journalism 10(3):Article 3. https://doi.org/10.1080/21670811.2021.1953390

Duffy A, Tandoc E, Ling R (2020) Too good to be true, too good not to share: the social utility of fake news. Inf Commun Soc 23(13):1965–1979. https://doi.org/10.1080/1369118X.2019.1623904

Effron DA, Raj M (2020) Misinformation and morality: encountering fake-news headlines makes them seem less unethical to publish and share. Psychol Sci 31(1):Article 1. https://doi.org/10.1177/0956797619887896

Faragó L, Kende A, Krekó P (2020) We only believe in news that we doctored ourselves: the connection between partisanship and political fake news. Soc Psychol 51(2):Article 2. https://doi.org/10.1027/1864-9335/a000391

Folkvord F, Snelting F, Anschutz D, Hartmann T, Theben A, Gunderson L, Vermeulen I, Lupiáñez-Villanueva F (2022) Effect of source type and protective message on the critical evaluation of news messages on facebook: randomized controlled trial in the Netherlands. J Med Internet Res 24(3):Article 3. https://doi.org/10.2196/27945

Forgas JP, East R (2008) On being happy and gullible: mood effects on skepticism and the detection of deception. J Exp Soc Psychol 44(5):1362–1367. https://doi.org/10.1016/j.jesp.2008.04.010

Godden DM (2008) On common knowledge and ad populum: acceptance as grounds for acceptability. Philos Rhetoric 41(2):101–129

Grambo K (2019) Fake news and racial, ethnic and religious minorities: A precarious quest for truth. U Pa J Const L 1299

Grant MJ, Booth A (2009) A typology of reviews: an analysis of 14 review types and associated methodologies. Health Inf Libr J 26(2):91–108. https://doi.org/10.1111/j.1471-1842.2009.00848.x

Hameleers M, Brosius A (2022) You are wrong because I am right! The perceived causes and ideological biases of misinformation beliefs. Int J Public Opin Res 34(1):Article 1. https://doi.org/10.1093/ijpor/edab028

Hameleers M, Brosius A, de Vreese CH (2022) Whom to trust? Media exposure patterns of citizens with perceptions of misinformation and disinformation related to the news media. Eur J Commun 37(3):Article 3. https://doi.org/10.1177/02673231211072667

Han J, Cha M, Lee W (2020) Anger contributes to the spread of Covid-19 misinformation. Harv Kennedy Sch Misinf Rev. https://doi.org/10.37016/mr-2020-39

Harff D, Bollen C, Schmuck D (2022) Responses to social media influencers’ misinformation about COVID-19: a pre-registered multiple-exposure experiment. Media Psychol 25(6):Article 6. https://doi.org/10.1080/15213269.2022.2080711

Hartung F-M, Krohn C, Pirschtat M (2019) Better than its reputation? Gossip and the reasons why we and individuals with “dark” personalities talk about others. Front Psychol 10:1162. https://doi.org/10.3389/fpsyg.2019.01162

Hasher L, Goldstein D, Toppino T (1977) Frequency and the conference of referential validity. J Verbal Learn Verbal Behav 16:107–112

Hassan A, Barber SJ (2021) The effects of repetition frequency on the illusory truth effect. Cogn Res 6(1):38. https://doi.org/10.1186/s41235-021-00301-5

Helgason BA, Effron DA (2022) It might become true: how prefactual thinking licenses dishonesty. J Person Soc Psychol 123(5):Article 5. https://doi.org/10.1037/pspa0000308

Hopp T, Ferrucci P, Vargo CJ (2020) Why do people share ideologically extreme, false, and misleading content on social media? A self-report and trace data–based analysis of countermedia content dissemination on Facebook and Twitter. Hum Commun Res 46(4):Article 4. https://doi.org/10.1093/hcr/hqz022

Horner CG, Galletta D, Crawford J, Shirsat A (2021) Emotions: the unexplored fuel of fake news on social media. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990610

Hughes JP, Efstratiou A, Komer SR, Baxter LA, Vasiljevic M, Leite AC (2022) The impact of risk perceptions and belief in conspiracy theories on covid-19 pandemic-related behaviours. PLOS ONE 17(2):e0263716. https://doi.org/10.1371/journal.pone.0263716


Ireton C, Posetti J (2018) Journalism, fake news & disinformation: handbook for journalism education and training. UNESCO

Islam AKMN, Laato S, Talukder S, Sutinen E (2020) Misinformation sharing and social media fatigue during Covid-19: an affordance and cognitive load perspective. Technol Forecast Soc Change 159:120201. https://doi.org/10.1016/j.techfore.2020.120201

Kahan DM, Peters E, Dawson EC, Slovic P (2017) Motivated numeracy and enlightened self-government. Behav Public Policy 1(1):54–86. https://doi.org/10.1017/bpp.2016.2

Kahan DM (2017) Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN Electron J. https://doi.org/10.2139/ssrn.2973067

Kan IP, Pizzonia KL, Drummey AB, Mikkelsen EJV (2021) Exploring factors that mitigate the continued influence of misinformation. Cogn Res 6(1):76. https://doi.org/10.1186/s41235-021-00335-9

Kantorowicz-Reznichenko E, Folmer CR, Kantorowicz J (2022) Don’t believe it! A global perspective on cognitive reflection and conspiracy theories about Covid-19 pandemic. Person Individ Diff 194:111666. https://doi.org/10.1016/j.paid.2022.111666

Keselman A, Arnott Smith C, Leroy G, Kaufman DR (2021) Factors influencing willingness to share health misinformation videos on the internet: web-based survey. J Med Internet Res 23(12):Article 12. https://doi.org/10.2196/30323

Kim S, Kim S (2020) The crisis of public health and infodemic: analyzing belief structure of fake news about COVID-19 pandemic. Sustainability 12(23):Article 23. https://doi.org/10.3390/su12239904

Laato S, Islam AKMN, Islam MN, Whelan E (2020) What drives unverified information sharing and cyberchondria during the Covid-19 pandemic. Eur J Inf Syst 29(3):Article 3. https://doi.org/10.1080/0960085X.2020.1770632

Landrum AR, Olshansky A (2019) The role of conspiracy mentality in denial of science and susceptibility to viral deception about science. Politics Life Sci 38(2):Article 2. https://doi.org/10.1017/pls.2019.9

de Langhe B, Fernbach PM, Lichtenstein DR (2016) Navigating by the stars: investigating the actual and perceived validity of online user ratings. J Consum Res 42(6):817–833. https://doi.org/10.1093/jcr/ucv047

Lantian A, Bagneux V, Delouvée S, Gauvrit N (2021) Maybe a free thinker but not a critical one: high conspiracy belief is associated with low critical thinking ability. Appl Cogn Psychol 35(3):Article 3. https://doi.org/10.1002/acp.3790

Lawson MA, Kakkar H (2022) Of pandemics, politics, and personality: the role of conscientiousness and political ideology in the sharing of fake news. J Exp Psychol 151(5):Article 5. https://doi.org/10.1037/xge0001120

Li K, Li J, Zhou F (2022) The effects of personality traits on online rumor sharing: the mediating role of fear of COVID-19. Int J Environ Res Public Health 19(10):Article 10. https://doi.org/10.3390/ijerph19106157

Lin Y, Zhang YC, Oyserman D (2022) Seeing meaning even when none may exist: collectivism increases belief in empty claims. J Person Soc Psychol 122(3):351–366. https://doi.org/10.1037/pspa0000280

Littrell S, Risko EF, Fugelsang JA (2021) ‘You can’t bullshit a bullshitter’ (or can you?): bullshitting frequency predicts receptivity to various types of misleading information. Br J Soc Psychol 60(4):Article 4. https://doi.org/10.1111/bjso.12447

Lobato EJC, Powell M, Padilla LMK, Holbrook C (2020) Factors predicting willingness to share COVID-19 misinformation. Front Psychol 11:566108. https://doi.org/10.3389/fpsyg.2020.566108

MacKuen M, Wolak J, Keele L, Marcus GE (2010) Civic engagements: resolute partisanship or reflective deliberation. Am J Political Sci 54(2):440–458. https://doi.org/10.1111/j.1540-5907.2010.00440.x

Martel C, Pennycook G, Rand DG (2020) Reliance on emotion promotes belief in fake news. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00252-3

McGonagle T (2017) “Fake news”: false fears or real concerns? Neth Q Hum Rights 35(4):203–209. https://doi.org/10.1177/0924051917738685

Melki J, Tamim H, Hadid D, Makki M, El Amine J, Hitti E (2021) Mitigating infodemics: the relationship between news exposure and trust and belief in Covid-19 fake news and social media spreading. PLoS ONE 16(6):Article 6. https://doi.org/10.1371/journal.pone.0252830

Mena P, Barbe D, Chan-Olmsted S (2020) Misinformation on Instagram: the impact of trusted endorsements on message credibility. Soc Media+Soc 6(2):Article 2. https://doi.org/10.1177/2056305120935102

Mercier H, Sperber D (2011) Why do humans reason? Arguments for an argumentative theory. Behav Brain Sci 34(2):57–74. https://doi.org/10.1017/S0140525X10000968

Michael RB, Breaux BO (2021) The relationship between political affiliation and beliefs about sources of “fake news”. Cogn Res 6(1):Article 1. https://doi.org/10.1186/s41235-021-00278-1

Michael RB, Sanson M (2021) Source information affects interpretations of the news across multiple age groups in the United States. Societies 11(4):Article 4. https://doi.org/10.3390/soc11040119

Miller JM, Saunders KL, Farhart CE (2016) Conspiracy endorsement as motivated reasoning: the moderating roles of political knowledge and trust. Am J Political Sci 60(4):Article 4. https://doi.org/10.1111/ajps.12234

Nadarevic L, Reber R, Helmecke AJ, Köse D (2020) Perceived Truth of statements and simulated social media postings: an experimental investigation of source credibility, repeated exposure, and presentation format. Cogn Res 5(1):Article 1. https://doi.org/10.1186/s41235-020-00251-4

Norman A (2016) Why we reason: intention-alignment and the genesis of human rationality. Biol Philos 31(5):685–704. https://doi.org/10.1007/s10539-016-9532-4

Nurse MS, Ross RM, Isler O, Van Rooy D (2022) Analytic thinking predicts accuracy ratings and willingness to share Covid-19 misinformation in Australia. Mem Cogn 50(2):Article 2. https://doi.org/10.3758/s13421-021-01219-5

Oh HJ, Lee H (2019) When do people verify and share health rumors on social media? The effects of message importance, health anxiety, and health literacy. J Health Commun 24(11):Article 11. https://doi.org/10.1080/10810730.2019.1677824

Osmundsen M, Bor A, Vahlstrup PB, Bechmann A, Petersen MB (2021) Partisan polarization is the primary psychological motivation behind political fake news sharing on Twitter. Am Political Sci Rev 115(3):Article 3. https://doi.org/10.1017/S0003055421000290

Pearson GDH, Knobloch-Westerwick S (2019) Is the confirmation bias bubble larger online? Pre-election confirmation bias in selective exposure to online versus print political information. Mass Commun Soc 22(4):Article 4. https://doi.org/10.1080/15205436.2019.1599956

Pehlivanoglu D, Lin T, Deceus F, Heemskerk A, Ebner NC, Cahill BS (2021) The role of analytical reasoning and source credibility on the evaluation of real and fake full-length news articles. Cogn Res 6(1):Article 1. https://doi.org/10.1186/s41235-021-00292-3

Pehlivanoglu D, Lighthall NR, Lin T, Chi KJ, Polk R, Perez E, Cahill BS, Ebner NC (2022) Aging in an “Infodemic”: the role of analytical reasoning, affect, and news consumption frequency on news veracity detection. J Exp Psychol 28(3):Article 3. https://doi.org/10.1037/xap0000426

Pennycook G, Cannon TD, Rand DG (2018) Prior exposure increases perceived accuracy of fake news. J Exp Psychol 147(12):1865–1880. https://doi.org/10.1037/xge0000465

Pennycook G, Rand DG (2019) Lazy, not biased: susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition 188:39–50. https://doi.org/10.1016/j.cognition.2018.06.011

Pennycook G, Rand DG (2020) Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. J Person 88(2):185–200. https://doi.org/10.1111/jopy.12476

Pereira A, Harris E, Van Bavel JJ (2021) Identity concerns drive belief: the impact of partisan identity on the belief and dissemination of true and false news. Group Process Intergroup Relat. https://doi.org/10.1177/13684302211030004

Peters U (2022) What is the function of confirmation bias. Erkenntnis 87(3):1351–1376. https://doi.org/10.1007/s10670-020-00252-1

Plume CJ, Slade EL (2018) Sharing of sponsored advertisements on social media: a uses and gratifications perspective. Inf Syst Front 20(3):471–483. https://doi.org/10.1007/s10796-017-9821-8

van Prooijen J-W, van Vugt M (2018) Conspiracy theories: evolved functions and psychological mechanisms. Perspect Psychol Sci 13(6):770–788. https://doi.org/10.1177/1745691618774270

Ren Z, Dimant E, Schweitzer ME (2021) Social motives for sharing conspiracy theories. SSRN Electron J. https://doi.org/10.2139/ssrn.3919364

Ross RM, Rand DG, Pennycook G (2021) Beyond “fake news”: analytic thinking and the detection of false and hyperpartisan news headlines. Judgm Decis Mak 16(2):484–504. https://doi.org/10.1017/S1930297500008640

Smelter TJ, Calvillo DP (2020) Pictures and repeated exposure increase perceived accuracy of news headlines. Appl Cogn Psychol 34(5):Article 5. https://doi.org/10.1002/acp.3684

Smith JJ, Wald B (2019) Collectivized intellectualism. Res Philos 96(2):199–227. https://doi.org/10.11612/resphil.1766

Solovev K, Pröllochs N (2022) Moral emotions shape the virality of COVID-19 misinformation on social media. In: Proceedings of the ACM web conference. pp. 3706–3717. https://doi.org/10.1145/3485447.3512266

Stanley ML, Whitehead PS, Marsh EJ (2022) The cognitive processes underlying false beliefs. J Consum Psychol 32(2):359–369. https://doi.org/10.1002/jcpy.1289

Sterrett D, Malato D, Benz J, Kantor L, Tompson T, Rosenstiel T, Sonderman J, Loker K (2019) Who shared it?: Deciding what news to trust on social media. Digit Journalism 7(6):Article 6. https://doi.org/10.1080/21670811.2019.1623702

Su Y (2021) It doesn’t take a village to fall for misinformation: social media use, discussion heterogeneity preference, worry of the virus, faith in scientists, and Covid-19-related misinformation beliefs. Telemat Inform 58:101547. https://doi.org/10.1016/j.tele.2020.101547

Sun Z, Cheng X, Zhang R, Yang B (2020) Factors influencing rumour re-spreading in a public health crisis by the middle-aged and elderly populations. Int J Environ Res Public Health 17(18):Article 18. https://doi.org/10.3390/ijerph17186542

Swire B, Ecker UKH, Lewandowsky S (2017) The role of familiarity in correcting inaccurate information. J Exp Psychol 43(12):1948–1961. https://doi.org/10.1037/xlm0000422

Tandoc EC, Lee J, Chew M, Tan FX, Goh ZH (2021) Falling for fake news: the role of political bias and cognitive ability. Asian J Commun 31(4):Article 4. https://doi.org/10.1080/01292986.2021.1941149

Traberg CS, van der Linden S (2022) Birds of a feather are persuaded together: perceived source credibility mediates the effect of political bias on misinformation susceptibility. Personal Individ Differ 185:111269. https://doi.org/10.1016/j.paid.2021.111269

Tsang SJ (2021) Motivated fake news perception: the impact of news sources and policy support on audiences’ assessment of news fakeness. Journalism Mass Commun Q 98(4):Article 4. https://doi.org/10.1177/1077699020952129

Turel O, Osatuyi B (2021) Biased credibility and sharing of fake news on social media: considering peer context and self-objectivity state. J Manag Inf Syst 38(4):Article 4. https://doi.org/10.1080/07421222.2021.1990614

Vegetti F, Mancosu M (2020) The impact of political sophistication and motivated reasoning on misinformation. Political Commun 37(5):Article 5. https://doi.org/10.1080/10584609.2020.1744778

Wardle C, Derakhshan H (2017) Information disorder: toward an interdisciplinary framework for research and policy making. Counc Eur. https://rm.coe.int/information-disorder-toward-an-interdisciplinary-framework-for-researc/168076277c

Waszak PM, Kasprzycka-Waszak W, Kubanek A (2018) The spread of medical fake news in social media - The pilot quantitative study. Health Policy Technol 7(2):115–118. https://doi.org/10.1016/j.hlpt.2018.03.002

Weimann-Saks D, Elshar-Malka V, Ariel Y, Weimann G (2022) Spreading online rumours during the Covid-19 pandemic: the role of users’ knowledge, trust and emotions as predictors of the spreading patterns. J Int Commun 28(2):Article 2. https://doi.org/10.1080/13216597.2022.2099443

Wischnewski M, Bruns A, Keller T (2021) Shareworthiness and motivated reasoning in hyper-partisan news sharing behavior on Twitter. Digit Journalism 9(5):Article 5. https://doi.org/10.1080/21670811.2021.1903960

Wischnewski M, Krämer N (2021) The role of emotions and identity-protection cognition when processing (mis)information. Technol Mind Behav 2(1). https://doi.org/10.1037/tmb0000029

Xiao X, Borah P, Su Y (2021) The dangers of blind trust: examining the interplay among social media news use, misinformation identification, and news trust on conspiracy beliefs. Public Underst Sci 30(8):Article 8. https://doi.org/10.1177/0963662521998025

Zhou Y, Shen L (2022) Confirmation bias and the persistence of misinformation on climate change. Commun Res 49(4):Article 4. https://doi.org/10.1177/00936502211028049

Zimmermann F, Kohring M (2020) Mistrust, disinforming news, and vote choice: a panel survey on the origins and consequences of believing disinformation in the 2017 German Parliamentary election. Political Commun 37(2):Article 2. https://doi.org/10.1080/10584609.2019.1686095


Acknowledgements

This study was funded by Singapore Ministry of Education (MOE) under the Education Research Funding Programme (DEV 03/19 AK) and administered by National Institute of Education (NIE), Nanyang Technological University, Singapore. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the Singapore MOE and NIE.

Author information

Authors and affiliations

College of Interdisciplinary and Experiential Learning, Singapore University of Social Sciences, Singapore, Singapore

Adrian Kwek

School of Science and Technology, Singapore University of Social Sciences, Singapore, Singapore

Curriculum Planning and Development Division, Ministry of Education, Singapore, Singapore

School of Computing, National University of Singapore, Singapore, Singapore

Jin Xing Lee


Contributions

The authors confirm their contribution to the paper as follows: study conception and design: AK; data collection: AK, JXL, LP, JT; analysis and interpretation of results: AK, LP, JT; draft manuscript preparation: AK, LP, JT. All authors reviewed the results and approved the final version of the manuscript.

Corresponding author

Correspondence to Adrian Kwek .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Distractions, analytical thinking and falling for fake news: a survey of psychological factors

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ .

Reprints and permissions

About this article

Cite this article.

Kwek, A., Peh, L., Tan, J. et al. Distractions, analytical thinking and falling for fake news: A survey of psychological factors. Humanit Soc Sci Commun 10 , 319 (2023). https://doi.org/10.1057/s41599-023-01813-9


Received : 02 January 2023

Accepted : 30 May 2023

Published : 12 June 2023

DOI : https://doi.org/10.1057/s41599-023-01813-9



Original Research Article

“Fake News” or Real Science? Critical Thinking to Assess Information on COVID-19


  • 1 Department of Applied Didactics, Universidade de Santiago de Compostela (USC), Santiago de Compostela, Spain
  • 2 IES Ramón Cabanillas, Xunta de Galicia, Cambados, Spain

Few people question the important role of critical thinking in students becoming active citizens; however, the way science is taught in schools continues to be oriented more toward “what to think” than “how to think.” Researchers understand critical thinking as a tool and a higher-order thinking skill necessary for being an active citizen when dealing with socio-scientific information and making decisions that affect human life, for which the COVID-19 pandemic provides many opportunities. The outbreak of COVID-19 has been accompanied by what the World Health Organization (WHO) has described as a “massive infodemic.” Fake news covering all aspects of the pandemic spread rapidly through social media, creating confusion and disinformation. This paper reports on an empirical study carried out during the lockdown in Spain (March–May 2020) with a group of secondary students (N = 20) engaged in diverse online activities that required them to practice critical thinking and argumentation when dealing with coronavirus information and disinformation. The main goal is to examine students’ competence at engaging in argumentation as critical assessment in this context. Discourse analysis allows for the exploration of the arguments and criteria applied by students to assess COVID-19 news headlines. The results show that participants were capable of identifying true and false headlines and assessing the credibility of headlines by appealing to different criteria, although most arguments were coded at only a basic level of epistemic assessment, and only a few appealed to the criterion of scientific procedure when assessing the headlines.

Introduction: Critical Thinking for Social Responsibility – An Urgent Need in the COVID-19 Pandemic

The COVID-19 pandemic is a global phenomenon that affects almost all spheres of our life, aside from its obvious direct impacts on human health and well-being. As mentioned by the UN Secretary General, in his call for solidarity, “We are facing a global health crisis unlike any in the 75-year history of the United Nations — one that is spreading human suffering, infecting the global economy and upending people’s lives.” (19 March 2020, Guterres, 2020 ). COVID-19 has revealed the vulnerability of the global systems meant to protect the environment, health and the economy, making it urgent to provide a responsible response that involves collaboration between diverse social actors. For science education the pandemic has raised new and unthinkable challenges ( Dillon and Avraamidou, 2020 ; Jiménez-Aleixandre and Puig, 2021 ), which highlight the importance of critical thinking (CT) development in promoting responsible actions and responses to the coronavirus disease, which is the focus of this paper. Despite the general public’s respect for science and scientific advances, denial movements – such as the ones that reject the use of vaccines and advocate for alternative health therapies – are increasing during this period ( Dillon and Avraamidou, 2020 ). The rapid global spread of the coronavirus disease has been accompanied by what the World Health Organization (WHO) has described as the COVID-19 social media infodemic. The term infodemic refers to an overabundance of information (real or not) associated with a specific topic, whose growth can occur exponentially in a short period of time [ World Health Organization (WHO), 2020 ]. The case of the COVID-19 pandemic shows the crucial importance of socio-scientific instruction toward students’ development of critical thinking (CT) for citizenship.

Critical thinking is embedded within the framework of “21st century skills” and is considered one of the goals of education ( van Gelder, 2005 ). Despite its importance, there is not a clear consensus on how to better promote CT in science instruction, and teachers often find it unclear what CT means and requires from them in their teaching practice ( Vincent-Lacrin et al., 2019 ). CT is understood in this study as a set of skills and dispositions that enable students and people to take critical actions based on reasons and values, but also as independent thinking ( Jiménez-Aleixandre and Puig, 2021 ). It is also considered as a dialogic practice that students can enact and thereby become predisposed to practice ( Kuhn, 2019 ). We consider that CT has two fundamental roles in SSI instruction: one role linked to the promotion of rational arguments, cognitive skills and dispositions; and the other related to the idea of critical action and social activism, which is consistent with the characterization of CT provided by Jiménez-Aleixandre and Puig (2021) . Although research on SSIs has provided us with empirical evidence supporting the benefits of SSI instruction, particularly argumentation and students’ motivation toward learning science, there is still scarce knowledge on how CT is articulated in these contexts. One challenge with promoting CT, especially in SSIs, is linked to new forms of communication that generate a rapid increase of information and easy access to it ( Puig et al., 2020 ).

The study was developed in an unprecedented scenario, during the lockdown in Spain (March–May 2020), which forced the change of face-to-face teaching to virtual teaching, involving students in online activities that embraced the application of scientific notions related to COVID-19 and CT for assessing claims published in news headlines related to it. Previous studies have pointed out the benefits of virtual environments to foster CT among students, particularly asynchronous discussions that minimize social presence and favor all students expressing their own opinion ( Puig et al., 2020 ).

In this research, we aim to explore students’ ability to critically engage in the assessment of the credibility of COVID-19 claims during a moment in which fake news disseminated by social media was shared by the general public and disinformation on the virus was easier to access than real news.

Theoretical Framework

We will first discuss the crucial role of CT to address controversial issues and to fight against the rise of misinformation on COVID-19; and then turn attention to the role of argumentation in students’ development of CT in SSI instruction in epistemic education.

Critical Thinking on Socio-Scientific Instruction to Face the Rise of Disinformation

SSIs are compelling issues for the application of knowledge and processes contributing to the development of CT. They are multifaceted problems, as is the case of COVID-19, that involve informal reasoning and elements of critique, where decisions have direct consequences for the well-being of human society and the environment ( Jiménez-Aleixandre and Puig, 2021 ). People need to balance subject matter knowledge, personal values, and societal norms when making decisions on SSIs ( Aikenhead, 1985 ), but they also have to be critical of the discourses that shape their own beliefs and practices in order to act responsibly ( Bencze et al., 2020 ). According to Duschl (2020) , science education should involve the creation of a dialogic discourse among members of a class that focuses on the teaching and learning of “how did we come to know?” and “why do we accept that knowledge over alternatives?” Studies on SSIs during the last decades have pointed out students’ difficulties in building arguments and making critical choices based on evidence ( Evagorou et al., 2012 ). However, the literature also indicates that students find SSIs motivating for learning and that SSIs increase their community involvement ( Eastwood et al., 2012 ; Evagorou, 2020 ), thus they are appropriate contexts for CT development. While research on content knowledge and different modes of reasoning on SSIs is extensive, the practice of CT is understudied in science instruction. Of particular interest in science education are SSIs that involve health controversies, since they include some of the challenges posed by the post-truth era, as the health crisis produced by coronavirus shows. The COVID-19 pandemic is affecting most countries and territories around the world, which is why it is considered the greatest challenge that humankind has faced since the Second World War ( Chakraborty and Maity, 2020 ).
Issues like COVID-19 that affect society in multiple ways require literate citizens who are capable of making critical decisions and taking actions based on reasons. As the world responds to the COVID-19 pandemic, we face the challenge of an overabundance of information related to the virus. Some of this information may be false and potentially harmful [ World Health Organization (WHO), 2020 ]. In the context of growing disinformation related to the COVID-19 outbreak, EU institutions have worked to raise awareness of the dangers of disinformation and promoted the use of authoritative sources ( European Council of the European Union, 2020 ). Educators and science educators have been increasingly concerned with what can be done in science instruction to face the spread of misinformation and denial of well-established claims; helping students to identify what is true can be a hard task ( Barzilai and Chinn, 2020 ). As these authors suggest, diverse factors may shape what people perceive as true, such as the socio-cultural context in which people live, their personal experiences and their own judgments, that could be biased. We concur with these authors and Feinstein and Waddington (2020) , who argue that science education should not focus on achieving the knowledge, but rather on gaining appropriate scientific knowledge and skills, which in our view involves CT development. Furthermore, according to Sperber et al. (2010) , there are factors that affect the acceptance or rejection of a piece of information. These factors have to do either with the source of the information – “who to believe” – or with its content – “what to believe.” The pursuit of truth when dealing with SSIs can be facilitated by the social practices used to develop knowledge ( Duschl, 2020 ), such as argumentation understood as the evaluation of claims based on evidence, which is part of CT development.

We consider CT and argumentation as overlapping competencies in their contexts of practice; for instance, when assessing claims on COVID-19, as in this study. According to Sperber et al. (2010) , we now have almost no filters on information, and this requires a much more vigilant, knowledgeable reader. As these authors point out, individuals need to become aware of their own cognitive biases and of how to avoid falling victim to them. If we want students to learn how to critically evaluate the information and claims they will encounter in social media outside the classroom, we need to engage them in the practice of argumentation and CT. This raises the question of what type of information is easier or harder for students to assess, especially when they are directly affected by the problem. In this paper we aim to explore this issue by examining students’ arguments while assessing diverse claims on COVID-19. We think that students’ arguments reflect their ability to apply CT in this context, although this does not mean that CT skills always produce a well-reasoned argument ( Halpern, 1998 ). Students should be encouraged to express their own thoughts in SSI instruction, but also to support their views reasonably ( Puig and Ageitos, 2021 ), specifically when they must assess the validity of information that affects not only them as individuals but also the whole society and environment. CT may equip citizens to discard fake news and to use appropriate criteria to evaluate information. This requires the design and implementation of specific CT tasks, as this study presents.

Argumentation to Enhance Critical Thinking Development in Epistemic Education on SSIs

While the concept of CT has a long tradition and educators agree on its importance, there is a lack of agreement on what this notion involves ( Thomas and Lok, 2015 ). CT has been used with a wide range of meanings in theoretical literature ( Facione, 1990 ; Ennis, 2018 ). In 1990, The American Philosophical Association convened an authoritative panel of forty-six noted experts on CT to produce a definitive account of the concept, which was published in the Delphi Report ( Facione, 1990 ). The Delphi definition provides a list of skills and dispositions that can be useful and guide CT instruction. However, as Davies and Barnett (2015) point out, this Delphi definition does not include the phenomenon of action. We concur with these authors that CT education should involve students in “CT for action,” since decision making – a way of deciding on a course of action – is based on judgments derived from argumentation using CT. Drawing from Halpern (1998) , we also think that CT requires awareness of one’s own knowledge. CT requires, for instance, insight into what one knows and the extent and importance of what one does not know in order to assess socio-scientific news and its implications ( Puig and Ageitos, 2021 ).

Critical thinking and argumentation share core elements like rationality and reflection ( Andrews, 2015 ). Some researchers suggest that understanding CT as a dialogic practice ( Kuhn, 2019 ) has implications for CT instruction and development. Argumentation on SSIs, particularly on health controversies, is receiving increasing attention in science education in the post-truth era, as the coronavirus pandemic and the denial movements related to its origin, prevention, and treatment show. Science education should involve the creation of a dialogic discourse among members of a class that enables them to develop CT. One of the central features of argumentation is the development of epistemic criteria for knowledge evaluation ( Jiménez Aleixandre and Erduran, 2008 ), which is a necessary skill for being a critical thinker. We see the practice of CT as the articulation of cognitive skills through the practice of argumentation ( Giri and Paily, 2020 ).

This article argues that science education needs to explore learning experiences and ways of instruction that support CT by engaging learners in argumentation on SSIs. Despite CT being considered a seminal goal in education and the large body of research on CT supporting this ( Dominguez, 2018 ), debates still persist about the manner in which CT skills can be achieved through education ( Abrami et al., 2008 ). Niu et al. (2013) remark that educators have made a striking effort to foster CT among students, showing that the belief that CT can be taught and learned has spread and gained support. Therefore, CT has slowly made its way into general school education and specific instructional interventions. Problem-based learning is one of the most widely used learning approaches nowadays in CT instruction ( Dominguez, 2018 ) because it is motivating, challenging, and enjoyable ( Pithers and Soden, 2000 ; Niu et al., 2013 ). We see active learning methodologies and real-world problems such as SSIs as appropriate contexts for CT development.

The view that CT can be developed by engagement in argumentation practices plays a central role in this study, as Kuhn (2019) suggested. However, the post-truth condition poses some challenges to the evaluation of sources of information and scientific evidence disseminated by social media. According to Sinatra and Lombardi (2020) , the post-truth context raises the need for critical evaluation of online information about SSIs. Students need to be better prepared to assess science information they can easily find online from a variety of sources. Previous studies described by these authors emphasized the importance of source evaluation instruction to equip students toward this goal ( Bråten et al., 2019 ); however, this is not sufficient. Sinatra and Lombardi (2020) note that students should learn how to evaluate the connections between sources of information and knowledge claims. This requires, in our view, engaging students in CT and epistemic performance. If we want students to learn to think critically about the claims they will encounter on social media, they need to practice argumentation as critical evaluation.

We draw on research on epistemic education ( Chinn et al., 2018 ), which considers that learning science entails students’ participation in science’s epistemic goals ( Kelly and Licona, 2018 ); in other words, placing scientific practices at the center of SSI instruction. Our study is framed in a broader research project that aims to embed CT in epistemic design and performance. In Chinn et al.’s (2018) AIR model, epistemic cognition has three core elements that correspond to the three letters of the acronym: epistemic Aims, goals related to inquiry; epistemic Ideals, standards and criteria used to evaluate epistemic products, such as explanations or arguments; and Reliable processes for attaining epistemic achievements. Of particular interest for our focus on CT is that the AIR model also proposes that epistemic cognition has a social nature and is situated. The purpose of epistemic education ( Barzilai and Chinn, 2017 ) should be to enable students to succeed in epistemic activities (apt epistemic performance), such as constructing and evaluating arguments, and to assess through meta-competence when success can be achieved. This paper attends to one aspect of epistemic performance proposed by Barzilai and Chinn (2017) , which is cognitive engagement in epistemic assessment. In our study, epistemic assessment encompasses the evaluation of the content of claims disseminated by media. Aligned with these authors, we understand that this process requires cognitive and metacognitive competences. Thus, epistemic assessment needs adequate disciplinary knowledge, but also meta-cognitive competence for recognizing unsupported beliefs.

Goal and Research Questions

This paper examines students’ competence to engage in argumentation and CT in an online task that requires them to critically assess diverse information presented in media headlines on COVID-19. Competence in general can be defined as “a disposition to succeed with a certain aim” ( Sosa, 2015 , p. 43), and epistemic competence, as a special case of competence, is at its core a dispositional ability to discern the true from the false in a certain domain. For the purposes of this paper, the attention is on epistemic competence, and the research questions that drive the analysis are the following:

1. What is the competence of students to assess the credibility of COVID-19 information appearing in news headlines?

2. What level of epistemic assessment is shown in students’ arguments, according to the criteria appealed to while assessing COVID-19 news headlines?

Materials and Methods

Context, Participants, and Design

A teaching sequence about COVID-19 was designed at the beginning of the lockdown in Spain (mid-March 2020) in response to the rise of misinformation about coronavirus on the internet and social media. The design process involved collaboration between the first and second authors (researchers in science education) and the third author (a biology teacher in secondary education).

The participants are a group of twenty secondary students (14–15 years old), eleven of them girls, from a state public school located in a well-known seaside village in Galicia (Spain). They were mostly from middle-class families and within an average range of ability and academic achievement.

Students were from the same classroom and participated in previous online activities as part of their biology classes, taught by their biology teacher, who collaborated on previous studies on CT and learning science through epistemic practices on health controversies.

The activities were integrated in their biology curriculum and carried out when participants received instruction on the topics of health, infectious diseases, and the immune system.

Google Forms was used for the design and implementation of all activities included in the sequence. Google Forms was selected because it is free and a well-known tool for online surveys. In addition, all students were familiar with its use before the lockdown, and the teacher valued its usefulness for engaging them in online debates and in their own evaluation processes. This online resource provides anonymous results and statistics that the teacher could share with the students for debates. It needs to be highlighted that during the lockdown students did not have the same working conditions; in particular, the quality and availability of internet access differed among them. Thus, all activities were asynchronous. Students had 1 week to complete each task, and the teacher could be consulted at any time if they had difficulties or any questions regarding the activities.

The design was inspired by a previous one carried out by the authors when the first case of Ebola disease was detected in Spain ( Puig et al., 2016 ), and follows a constructivist, science-based approach. The sequence began with an initial task, in which students were required to express their own views and knowledge on COVID-19 and health notions related to it, before then being progressively involved in the application of knowledge through the practice of modeling and argumentation. The third activity engaged them in critical evaluation of COVID-19 information. A more detailed description of the activities carried out in the different steps of the sequence is provided below.

Stage 1: General Knowledge on Health Notions Related to COVID-19

An individual Google Forms survey on notions and health concepts that appeared in social media during the lockdown, such as “pandemic,” “virus,” etc.

Stage 2: Previous Knowledge on Coronavirus Disease

This stage consisted of three parts: (2.1) Individual online survey on infectious diseases; (2.2) Introduction of knowledge about infectious diseases provided in the e-bugs project website 1 and activities; virtual visit to the exhibition “Outbreaks: epidemics in a connected world” available in the Natural History Museum website (blinded for review); (2.3) Building a poster with the chain of infection of the COVID-19 disease and some relevant information to consider in order to stop the spread of the disease.

Stage 3: COVID-19, Sources of Information

This stage consisted first of a virtual forum in which students shared their own habits when consulting scientific information, particularly coronavirus-related information, and debated the main media sources they consulted for this purpose. Second, students had to analyze ten news headlines on COVID-19 disseminated by social media during the outbreaks; six corresponded to fake news and four were true. They were asked to critically assess them and distinguish which they thought were true, providing their arguments. Media sources were not provided until the end of the task, since the act of asking for the source was considered part of the data analysis (see Table 1 ). The second part of this stage is the focus of our analysis.


Table 1. COVID-19 News Headlines provided to students.

Stage 4: Act and Raise Awareness on COVID-19

The sequence ended with the creation of a short video in which the students had to provide some tips to avoid the transmission of the virus. The information provided in the video must be supported and based on established scientific knowledge.

Data Corpus and Analysis

Data collection includes all individual surveys and activities developed in Google Forms. We analyzed students’ individual responses (N = 28) presented in Stage 3. The research is designed as a qualitative study that utilizes the methods of discourse analysis, in accordance with the data and the purpose of the study. Discourse analysis allows the analysis of the content (implicit or explicit) of written arguments produced by students, and thus the examination of the research questions. Our analysis focuses on students’ arguments and the criteria used to assess the credibility of COVID-19 headlines (ten headlines in total). It was carried out through an iterative process in which students’ responses were read and revised several times in order to develop an open-coded scheme that captures the arguments provided. To ensure the internal reliability of our codes, each student response was examined by the first and second authors separately and then contrasted and discussed until 100% agreement was achieved. The codes obtained were established according to the criteria summarized in Table 2 .


Table 2. Code scheme for research questions 1 and 2.

For Research Question 1, we distributed the arguments into two main categories: (1) arguments that question the credibility of the information; (2) arguments that do not question the credibility of the information.

For Research Question 2, we classified the arguments that question the credibility of the headline into three levels of epistemic assessment (see Table 2 ). The level of epistemic assessment (basic, medium, and high) was established by the authors based on the criteria that students applied, expressed explicitly or implicitly in their arguments. These criteria emerged from the data, so the categories were not pre-established; they were coded by the authors as follows: content (using the knowledge that each student has about the topic), source (questioning the origin of the information), evidence (appealing to empirical evidence, such as real-life situations that students experienced), authority (justifying according to who supports or is behind the claim) and scientific procedure (drawing on the evolution of scientific knowledge).

Students’ Competence to Critically Assess the Credibility of COVID-19 Claims

In general, most students were able to distinguish fallacious from true headlines, which was an important step in assessing their credibility. For the false headlines, students were able to question their credibility, providing arguments against them. Conversely, for the true news headlines, as was expected, most participants developed arguments supporting them and thus did not question their content. In both cases, the arguments elaborated by students appealed to different criteria, discussed in the next section of the results.

As shown in Table 3 , students elaborated 147 arguments to question the false headlines, but created just 22 arguments to assess the true ones. This finding was expected by the authors, as arguments expressing questioning or criticality appear more frequently when the information presented differs from students’ opinions.


Table 3. Number of students who did or did not question each news headline on COVID-19.

Students showed a higher capacity for questioning those claims they considered false or fake news, which can be related to the need to justify properly why they considered them false and/or what should be said to counter them.

The headlines that were most controversial, meaning they created diverse positions among students, were these three: “The COVID-19 virus can be transmitted in areas with hot and humid climates,” “Skin manifestations (urticaria, chilblains, rashes…) could be among the mild symptoms of coronavirus” and “Antibiotics are effective in preventing and treating coronavirus infection.”

The first two were questioned by 11 students out of 28, despite being real headlines. According to students’ answers, they were not familiar with this information, e.g., “I think the heat is not good for the virus.” In contrast, 17 students did not question these headlines, arguing, for instance, as this student did: “because it was shown that both in hot climates and in cold climates it is contagious in the same way.”

A similar situation happened with the third headline, which is false. A proportion of students (9 out of 28) accepted that antibiotics could help to treat COVID-19, showing in their answers some misunderstanding regarding the use of antibiotics and the diseases they could treat. The rest of the participants (19 out of 28) questioned this headline, affirming that “because antibiotics are used to treat bacterial infections and coronavirus is a virus,” among other justifications for why it was false.

Levels of Epistemic Assessment in Students’ Arguments on COVID-19 News Headlines

To analyze the level of epistemic assessment shown in students’ arguments when dealing with each headline, attention was focused on the criteria students applied (see Table 2 ). As Table 4 summarizes, almost all arguments included only one criterion (139 out of 169), and 28 out of 169 did not incorporate any criterion. These arguments can be interpreted as reflecting low epistemic assessment, or no epistemic assessment when no criterion is included.


Table 4. Arguments used by students to assess the credibility of each COVID-19 headline.

In the category of Basic Epistemic Assessment , we include all students’ arguments that included one criterion: Content or Empirical Evidence. Students assessed the content of the claim by appealing to their own knowledge about that piece of information or to empirical evidence, without posing critical questions to assess the credibility of the source of information. These two criteria, content and evidence, were included in students’ arguments with a frequency of 86 and 23, respectively, making this category the most common (109 out of 169) when questioning false and true headlines. In the case of true headlines, arguments under this category were identified in relation to headlines 2 and 4, whose credibility was questioned by appealing to the content, such as: “those are not the symptoms (skin manifestations)”. Examples of arguments assessing the content of false headlines are provided below:

“Because the virus is inside the body, and even if you injected alcohol into the body it would only cause intoxication”

This student rejects headline 5, appealing to the fact that alcohol causes intoxication rather than the elimination of coronavirus.

“I know a person who had coronavirus and they only gave him paracetamol”

In this example, the student rejects headline 6 and appeals to his/her own experience during the pandemic, particularly a close person who had coronavirus, as evidence against the use of antibiotics for coronavirus disease treatment.

The category Medium Epistemic Assessment gathers arguments that pose critical questions, particularly those asking for information about the authority or the source of information. For us, these criteria reflect a higher level of epistemic performance, since they imply questioning beyond the veracity of the headline itself to its sources and authorship. There are 20 out of 169 arguments coded within this category.

The assessment of true headlines includes arguments that question the authority and source, e.g., “because they said it on the news” (headline 2) and “that news does not seem very reliable to me” (headline 4). This category was also common in questioning false headlines, since students appealed to the source (16), “because in the news they clarified that it was a fake news and because it is not credible either” (headline 10), or to the authority (4), “because the professionals said they were more vulnerable (people over 70 years old) but not that it only affected them” (headline 7).

For the highest category, High Epistemic Assessment, we consider those arguments (12 out of 169) in which students appealed to the scientific procedure (11) to justify why the headline is false, which reflects students’ reliance on epistemic processes, e.g., “because treatments that protect against coronavirus are still being investigated” (headline 9). Under this category we also include arguments that combined more than one criterion, content and scientific procedure: “Because antibiotics don’t treat those kinds of infections. In addition, no medication has yet been discovered that can prevent the coronavirus” (headline 6). All students’ arguments in this category were elaborated to assess false headlines.

Lastly, special mention goes to those arguments that did not include any criteria (28), which fall under the category Non-Epistemic Assessment. This category appears most frequently in students’ answers to headlines 8 and 10, as these examples show: “I don’t think it’s true because it doesn’t make much sense to me” (headline 8) and “I never heard it and I doubt it’s true” (regarding drinking alcohol, headline 10).
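The hierarchy of categories described above can be sketched as a small classifier. The criteria names and the mapping below are our simplified illustration of the coding scheme, not the authors' exact procedure:

```python
# Simplified sketch of the coding scheme (illustrative, not the authors' exact
# procedure): the criteria present in an argument determine its level.

def assessment_level(criteria: set[str]) -> str:
    """Map the set of criteria in one argument to an epistemic-assessment level."""
    if not criteria:
        return "Non-Epistemic"
    if "scientific procedure" in criteria:
        return "High"
    if criteria & {"source", "authority"}:
        return "Medium"
    return "Basic"  # content and/or empirical evidence only

print(assessment_level({"content"}))                          # Basic
print(assessment_level({"source"}))                           # Medium
print(assessment_level({"content", "scientific procedure"}))  # High
print(assessment_level(set()))                                # Non-Epistemic
```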

The findings of our study indicate that students were able to deal with fake news, identifying it as such. They showed the capacity to critically assess the content of these news headlines, considering their inconsistencies in relation to prior knowledge ( Britt et al., 2019 ). As Evagorou (2020) pointed out, SSIs are appropriate contexts for CT development and for valuing the relevance of science in our lives.

The examination of RQ1 shows that a proportion of students were able to perceive the lack of evidence behind the false headlines, or even identified that those statements contradict what science presents. This is a remarkable finding and an important skill for countering attempts to diminish trust in science under the post-truth condition ( Dillon and Avraamidou, 2020 ). CT and argumentation are closely allied ( Andrews, 2015 ), but as the results show, knowledge domain seems to play an important role in assessing SSI news and its implications. Specific CT requires some of the same skills as generalizable CT, but it is highly contextual and requires particular knowledge ( Jones, 2015 ).

Students’ prior knowledge influenced the critical evaluation of some of the COVID-19 headlines provided in the activity. This is particularly relevant in responses to headline 6 (false), “Antibiotics are effective in preventing and treating coronavirus infection.” A previous study on the interactions between CT and knowledge domain on vaccination ( Ageitos and Puig, 2021 ) showed that there is a correspondence between them. This points to the importance of health literacy for CT development, although providing students with adequate knowledge alone would not be sufficient, as judgment skills, in this case regarding the proper use of antibiotics, are also required.

We found that the level of epistemic assessment (RQ2) linked to students’ CT capacity is low. A large majority of arguments were situated at a basic epistemic assessment level, and just a few at a higher one. One reason that might explain these results is the task design and format, in which students worked autonomously in a virtual environment. As studies of CT in e-learning environments have shown ( Niu et al., 2013 ), cooperative or collaborative learning favors CT skills, particularly when students have to discuss and justify their arguments on real-life problems. The circumstances in which students had to work during the outbreak did not allow them to work together: not all of them had good internet connections, so synchronous activities were not possible. This aspect is a limitation of this research.

There were differences in the use of criteria, and thus in the level of epistemic assessment, when students dealt with true and false headlines. This could be related to several factors, such as language. The claims are shaped by their wording and are formulated in different ways: true statements tend to be quite nuanced, while false headlines are certain and resolute. The practice of CT requires an understanding of the language, the content under evaluation, and other cognitive skills ( Andrews, 2015 ).

In the case of false headlines, most arguments appealed to their content and fewer to the criteria of source, authority, and scientific procedure, whereas in the case of true headlines most appealed to the authority and/or source. According to the AIR model ( Chinn and Rinehart, 2016 ), epistemic ideals are the criteria used to evaluate epistemic products, such as claims. In the case of COVID-19 claims, students need to hold an ideal of high source credibility ( Duncan et al., 2021 ). This means that students acknowledge that information should be gathered from reliable news media that themselves obtained information from reliable experts.

Only a few students used the criterion of scientific procedure when assessing false headlines, which shows a high level of epistemic assessment. Promoting this type of assessment is important, since online discourse in the post-truth era is affected by misinformation and by appeals to emotions and ideology.

Conclusion and Implications

This research was conducted at a moment in which the lives of people were paralyzed, and citizens were forced to stay at home to stop the spread of the coronavirus disease. During the lockdown and even after, apart from these containment measures, citizens in Spain and in many countries had to deal with a huge amount of information about the coronavirus disease, some of it false. The outbreak of COVID-19 has been accompanied by the dissemination of inaccurate information spread at high speed, making it more difficult for the public to identify verified facts and advice from trusted sources ( World Health Organization (WHO), 2020 ). As the world responds to the COVID-19 pandemic, many studies have been carried out to analyze the impact of the pandemic on the lives of children from diverse perspectives ( Cachón-Zagalaz et al., 2020 ), but not from the perspective of exploring students’ ability to engage in the epistemic assessment of information and disinformation on COVID-19 under a situation of social isolation. This is an unprecedented context in many respects, in which online learning replaced in-person teaching and scientific uncertainties were more visible than ever.

Participants engaged in the epistemic assessment of coronavirus headlines and were able to put their CT into practice, arguing why they considered the headlines true or false by appealing to different criteria. We are aware that our results have limitations. One such limitation is that students performed the activity independently, without a collaborative virtual environment, understood by the authors as one of the e-learning strategies that best promotes CT ( Puig et al., 2020 ). Furthermore, although teachers were available to resolve students’ questions about the task, the remote and asynchronous process did not allow them to guide the activity in a way that would have helped students carry out a deeper analysis. CT development and epistemic cognition depend on many factors, and teachers have an important role in achieving these goals ( Greene and Yu, 2016 ; Chinn et al., 2020 ).

The analysis of arguments allows us to identify some factors that are crucial and directly affect the critical evaluation of headlines. Some of the students did not question the use of antibiotics for coronavirus disease. This result highlights the importance of health literacy and its interdependency with CT development, as previous studies on vaccine controversies and CT show ( Puig and Ageitos, 2021 ). Although it is not the focus of this paper, the results point to the importance of making students aware of their knowledge limitations for critical assessment. A key instructional implication of this work is making e-learning activities more cooperative, as we have noted, and epistemically guided. Moreover, CT dimensions could be made explicit in instructional materials and assessments. If we want to prepare students to develop CT in order to face real/false news spread by social media, we need to engage them in deep epistemic assessment, namely the critical analysis of the content, the source, the procedures, and the evidence behind claims, among other tasks. Promoting students’ awareness and vigilance regarding misinformation and disinformation online may also promote more careful and attentive information use ( Barzilai and Chinn, 2020 ); thus, activities oriented toward these goals are necessary.

Our study reinforces the need to design more CT activities that guide students in the critical assessment of the diverse aspects behind controversial news, as a way to fight the rise of disinformation and to develop sound knowledge when dealing with SSIs. Students’ epistemological views can influence their performance in argumentation; thus, if the uncertainty of knowledge is explicitly addressed in SSI instruction and epistemic activities, students’ epistemological views may develop, and such development may in turn influence their argumentation competence and, consequently, their performance in CT.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.

Ethics Statement

Written informed consent was obtained from the participants’ legal guardian/next of kin to participate in this study in accordance with the National Legislation and the Institutional Requirements.

Author Contributions

BP developed the conceptual framework and designed the research study. PB-A conducted the data analysis and collaborated in manuscript preparation. JP-M implemented the didactic proposal and collected the data. All authors contributed to the article and approved the submitted version.

Funding

This work was supported by the project ESPIGA, funded by the Spanish Ministry of Science, Education and Universities and partly funded by the European Regional Development Fund (ERDF), Grant code: PGC2018-096581-B-C22.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

This study was carried out within the RODA research group during the lockdown in Spain due to the COVID-19 pandemic. We gratefully acknowledge all the participants for their involvement, despite such difficult circumstances.


Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Ramim, R., et al. (2008). Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev. Educ. Res . 78, 1102–1134. doi: 10.3102/0034654308326084


Ageitos, N., and Puig, B. (2021). “Critical thinking to decide what to believe and what to do regarding vaccination in schools. a case study with primary pre-service teachers,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).


Aikenhead, G. S. (1985). Collective decision making in the social context of science. Sci. Educ . 69, 453–475. doi: 10.1002/sce.3730690403

Andrews, R. (2015). “Critical thinking and/or argumentation in higher education,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies and R. Barnett (New York, NY: Palgrave Macmillan), 93–105.

Barzilai, S., and Chinn, C. A. (2017). On the goals of epistemic education: promoting apt epistemic performance. J. Learn. Sci . 27, 353–389. doi: 10.1080/10508406.2017.1392968

Barzilai, S., and Chinn, C. A. (2020). A review of educational responses to the “post-truth” condition: four lenses on “post-truth” problems. Educ. Psychol. 55, 107–119. doi: 10.1080/00461520.2020.1786388

Bencze, L., Halwany, S., and Zouda, M. (2020). “Critical and active public engagement in addressing socioscientific problems through science teacher education,” in Science Teacher Education for Responsible Citizenship , eds M. Evagorou, J. A. Nielsen, and J. Dillon (Berlin: Springer), 63–83. doi: 10.1007/978-3-030-40229-7_5

Bråten, I., Brante, E. W., and Strømsø, H. I. (2019). Teaching sourcing in upper secondary school: a comprehensive sourcing intervention with follow-up data. Read. Res. Q . 54, 481–505. doi: 10.1002/rrq.253

Britt, M. A., Rouet, J. F., Blaum, D., and Millis, K. K. (2019). A reasoned approach to dealing with fake news. Policy Insights Behav. Brain Sci . 6, 94–101. doi: 10.1177/2372732218814855

Cachón-Zagalaz, J., Sánchez-Zafra, M., Sanabrias-Moreno, D., González-Valero, G., Lara-Sánchez, A. J., and Zagalaz-Sánchez, M. L. (2020). Systematic review of the literature about the effects of the COVID-19 pandemic on the lives of school children. Front. Psychol . 11:569348. doi: 10.3389/fpsyg.2020.569348


Chakraborty, I., and Maity, P. (2020). COVID-19 outbreak: migration, effects on society, global environment and prevention. Sci. Total Environ . 728:138882. doi: 10.1016/j.scitotenv.2020.138882

Chinn, C., and Rinehart, R. W. (2016). “Epistemic cognition and philosophy: developing a new framework for epistemic cognition,” in Handbook of Epistemic Cognition , eds J. A. Greene, W. A. Sandoval, and I. Braten (New York, NY: Routledge), 460–478.

Chinn, C. A., Barzilai, S., and Duncan, R. G. (2020). Disagreeing about how to know. the instructional value of explorations into knowing. Educ. Psychol . 55, 167–180. doi: 10.1080/00461520.2020.1786387

Chinn, C. A., Duncan, R. G., and Rinehart, R. (2018). “Epistemic design: design to promote transferable epistemic growth in the PRACCIS project,” in Promoting Spontaneous Use of Learning and Reasoning Strategies. Theory, Research and Practice for Effective Transfer , eds E. Manalo, Y. Uesaka, and C. A. Chinn (Abingdon: Routledge), 243–259.

Davies, M., and Barnett, R. (2015). The Palgrave Handbook of Critical Thinking in Higher Education . London: Palgrave MacMillan. doi: 10.1057/9781137378057

Dillon, J., and Avraamidou, L. (2020). Towards a viable response to COVID-19 from the science education community. J. Activist Sci. Technol. Educ . 11, 1–6. doi: 10.33137/jaste.v11i2.34531

Dominguez, C. (2018). A European Review on Critical Thinking Educational Practices in Higher Education Institutions. Vila Real: UTAD. Available online at: https://www.researchgate.net/publication/322725947_A_European_review_on_Critical_Thinking_educational_practices_in_Higher_Education_Institutions

Duncan, R. G., Caver, V. L., and Chinn, C. A. (2021). “The role of evidence evaluation in critical thinking,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Duschl, R. (2020). Practical reasoning and decision making in science: struggles for truth. Educ. Psychol . 3, 187–192. doi: 10.1080/00461520.2020.1784735

Eastwood, J. L., Sadler, T. D., Zeidler, D. L., Lewis, A., Amiri, L., and Applebaum, S. (2012). Contextualizing nature of science instruction in socioscientific issues. Int. J. Sci. Educ . 34, 2289–2315. doi: 10.1080/09500693.2012.667582

Ennis, R. (2018). Critical thinking across the curriculum. Topoi 37, 165–184. doi: 10.1007/s11245-016-9401-4

European Council of the European Union (2020). Fighting Disinformation . Available online at: https://www.consilium.europa.eu/en/policies/coronavirus/fighting-disinformation/

Evagorou, M. (2020). “Introduction: socio-scientific issues as promoting responsible citizenship and the relevance of science,” in Science Teacher Education for Responsible Citizenship , eds M. Evagorou, J. A. Nielsen, and J. Dillon (Berlin: Springer), 1–11. doi: 10.1007/978-3-030-40229-7_1

Evagorou, M., Jimenez-Aleixandre, M. P., and Osborne, J. (2012). ‘Should we kill the grey squirrels?’ a study exploring students’ justifications and decision-making. Int. J. Sci. Educ . 34, 401–428. doi: 10.1080/09500693.2011.619211

Facione, P. A. (1990). Critical Thinking: a Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. Fullerton, CA: California State University.

Feinstein, W. N., and Waddington, D. I. (2020). Individual truth judgments or purposeful, collective sensemaking? rethinking science education’s response to the post-truth era. Educ. Psychol . 55, 155–166. doi: 10.1080/00461520.2020.1780130

Giri, V., and Paily, M. U. (2020). Effect of scientific argumentation on the development of critical thinking. Sci. Educ . 29, 673–690. doi: 10.1007/s11191-020-00120-y

Greene, J. A., and Yu, S. B. (2016). Educating critical thinkers: the role of epistemic cognition. Policy Insights Behav. Brain Sci . 3, 45–53. doi: 10.1177/2372732215622223

Guterres, A (2020). Secretary-General Remarks on COVID-19: A Call for Solidarity . Available at: https://www.un.org/sites/un2.un.org/files/sg_remarks_on_covid-19_english_19_march_2020.pdf (accessed March 19, 2020).

Halpern, D. F. (1998). Teaching critical thinking for transfer across domains. dispositions, skills, structure training, and metacognitive monitoring. Am. Psychol . 53, 449–455. doi: 10.1037/0003-066x.53.4.449

Jiménez Aleixandre, M. P., and Erduran, S. (2008). “Argumentation in science education: an overview,” in Argumentation in Science Education: Perspectives from Classroom-Based Research , eds S. Erduran and M. P. Jiménez Aleixandre (Dordrecht: Springer), 3–27. doi: 10.1007/978-1-4020-6670-2_1

Jiménez-Aleixandre, M. P., and Puig, B. (2021). “Educating critical citizens to face post-truth: the time is now,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Jones, A. (2015). “A disciplined approach to CT,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies, R. Barnett, et al. (New York, NY: Palgrave Macmillan), 93–105.

Kelly, G. J., and Licona, P. (2018). “Epistemic practices and science education,” in History, Philosophy and Science Teaching , ed. M. R. Matthews (Dordrecht: Springer), 139–165. doi: 10.1007/978-3-319-62616-1_5

Kuhn, D. (2019). Critical thinking as discourse. Hum. Dev . 62, 146–164. doi: 10.1159/000500171

Niu, L., Behar-Horenstein, L. S., and Garvan, C. W. (2013). Do instructional interventions influence college students’ critical thinking skills? a meta-analysis. Educ. Res. Rev . 9, 114–128. doi: 10.1016/j.edurev.2012.12.002

Pithers, R. T., and Soden, R. (2000). Critical thinking in education: a review. Educ. Res . 42, 237–249.

Puig, B., and Ageitos, N. (2021). “Critical thinking to decide what to believe and what to do regarding vaccination in schools. a case study with primary pre-service teachers,” in Critical Thinking in Biology and Environmental Education. Facing Challenges in a Post-Truth World , eds B. Puig and M. P. Jiménez-Aleixandre (Berlin: Springer).

Puig, B., Blanco Anaya, P., and Bargiela, I. M. (2020). “A systematic review on e-learning environments for promoting critical thinking in higher education,” in Handbook of Research in Educational Communications and Technology , eds M. J. Bishop, E. Boling, J. Elen, and V. Svihla (Cham: Springer), 345–362. doi: 10.1007/978-3-030-36119-8_15

Puig, B., Blanco Anaya, P., Crujeiras Pérez, B., and Pérez Maceira, J. (2016). Ideas, emociones y argumentos del profesorado en formación acerca del virus del Ébola. Indagatio Didactica 8, 764–776.

Sinatra, G. M., and Lombardi, D. (2020). Evaluating sources of scientific evidence and claims in the post-truth era may require reappraising plausibility judgments. Educ. Psychol . 55, 120–131. doi: 10.1080/00461520.2020.1730181

Sosa, E. (2015). Judgment and Agency. Oxford: Oxford University Press.

Sperber, D., Clement, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic vigilance. Mind Lang . 25, 359–393. doi: 10.1111/j.1468-0017.2010.01394.x

Thomas, K., and Lok, B. (2015). “Teaching critical thinking: an operational framework,” in The Palgrave Handbook of Critical Thinking in Higher Education , eds M. Davies, R. Barnett, et al. (New York, NY: Palgrave Macmillan), 93–105. doi: 10.1057/9781137378057_6

van Gelder, T. (2005). Teaching critical thinking. some lessons from cognitive science. Coll. Teach . 53, 41–48. doi: 10.3200/CTCH.53.1.41-48

Vincent-Lancrin, S., González-Sancho, C., Bouckaert, M., de Luca, F., Fernández-Barrera, M., Jacotin, G., et al. (2019). Fostering Students’ Creativity and Critical Thinking: What it Means in School. Educational Research and Innovation. Paris: OECD Publishing.

World Health Organization (WHO) (2020). Managing the COVID-19 Infodemic: Promoting Healthy Behaviours and Mitigating the Harm from Misinformation and Disinformation. Available online at: https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation

Keywords : critical thinking, argumentation, socio-scientific issues, COVID-19 disease, fake news, epistemic assessment, secondary education

Citation: Puig B, Blanco-Anaya P and Pérez-Maceira JJ (2021) “Fake News” or Real Science? Critical Thinking to Assess Information on COVID-19. Front. Educ. 6:646909. doi: 10.3389/feduc.2021.646909

Received: 28 December 2020; Accepted: 09 March 2021; Published: 03 May 2021.


Copyright © 2021 Puig, Blanco-Anaya and Pérez-Maceira. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Blanca Puig, [email protected]

This article is part of the Research Topic

Science Education for Citizenship through Socio-Scientific issues

SkillsYouNeed


Critical Thinking and Fake News


Since the 2016 US presidential election, the phrase ‘ fake news ’ has become standard currency. But what does the term actually mean, and how can you distinguish fake from ‘real’ news?

The bad news is that ‘fake news’ is often very believable, and it is extremely easy to get caught out.

This page explains how you can apply critical thinking techniques to news stories to reduce the chances of believing fake news, or at least starting to understand that ‘not everything you read is true’.

What is ‘ Fake News ’?

‘Fake news’ means news stories that are either completely untrue, or that do not contain the whole truth, written with a view to deliberately misleading readers.

Fake news became prominent during the US election, with supporters of both sides tweeting false information in the hope of influencing voters. But it is nothing new.

In May 1897, Mark Twain, the American author, was in London. Rumours reached the US that he was very ill and, later, that he had died. In a letter to Frank Marshall White, a journalist who inquired after his health as a result, Mark Twain suggested that the rumours had started because his cousin, who shared his surname, had been ill a few weeks before. He noted dryly to White,

“The report of my death was an exaggeration”.

It had, nonetheless, been widely reported in the US, with one newspaper even printing an obituary.

Fake news is not:

Articles on satirical or humorous websites, or related publications, that comment on the news by satirising it, because these are intended to inform and amuse, not misinform;

Anything obvious that ‘everyone already knows’ (often described using the caption ‘that’s not news’); or

An article whose content you disagree with.

The deliberate intention of fake news to mislead is crucial.

Why is Fake News a Problem?

If fake news has been around for so long, why is it suddenly a problem?

The answer is that social media means that credible fake news stories can spread very quickly.

In the worst cases, they can have major effects. There are suggestions that fake news influenced the 2016 US election. In another case, a gunman opened fire at a pizzeria that had been falsely but widely reported as being the centre of a paedophile ring involving prominent politicians. In less critical cases, fake news reports can result in distress or reputational damage for the people or organisations mentioned in the articles.

It is, therefore, important to be alert to the potential for reports to be fake, and to ensure that you are not party to their spread.

Spotting fake news

Unfortunately, it is not always easy to spot false news.

Sometimes, a story may be obviously false – for example, it may contain typos or spelling mistakes, or formatting errors. Like phishing emails, however, some fake news stories are a lot more subtle than that.

Facebook famously issued a guide to spotting fake news in May 2017. Its advice ranges from the obvious to the much less intuitive. Useful tips include:

Investigate the source

Be wary of stories written by unknown sources, and check their website for more information. Stories from reliable news sources, such as national newspapers or broadcasters, are more likely to have been checked and verified. It is also worth looking at the URL, to make sure it is a genuine news organisation.
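The URL check suggested above can be approximated in code. This is a minimal sketch under the assumption that the reader maintains their own list of trusted domains; the domain names and the function name are placeholders, not a recommendation:

```python
from urllib.parse import urlparse

# Hypothetical allow-list of domains the reader already trusts.
TRUSTED_DOMAINS = {"bbc.co.uk", "theguardian.com", "reuters.com"}

def domain_looks_genuine(url: str) -> bool:
    """Return True only if the URL's hostname is a trusted domain or one of
    its subdomains."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in TRUSTED_DOMAINS)

print(domain_looks_genuine("https://www.bbc.co.uk/news/some-story"))   # True
print(domain_looks_genuine("https://bbc.co.uk.news-updates.info/x"))   # False
```

A lookalike address such as `bbc.co.uk.news-updates.info` fails the check because only the registered domain at the end of the hostname counts, not a familiar name embedded earlier in it.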

Look at the evidence on which the article bases its claims, and check whether they seem credible. If there are no sources given, or the source is an unknown ‘expert’ or ‘friend’ of someone concerned, be sceptical.

Check whether other, reliable news sources are carrying the story

Sometimes, even otherwise reliable news sources get carried away and forget to do all the necessary checks. But one very good check is to ask whether other reliable sources are also carrying the story. If yes, it is likely to be correct. If not, you should at least be doubtful.
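As a rough illustration of this replication check, one could count how many independent outlets carry a matching story. The outlet names and the simple substring matching rule below are illustrative assumptions only:

```python
# Illustrative sketch: score a headline by how many independent outlets
# (placeholder names) carry a matching story.

def replication_count(headline: str, outlet_headlines: dict[str, list[str]]) -> int:
    """Count outlets whose coverage contains the headline (case-insensitive)."""
    needle = headline.lower()
    return sum(
        any(needle in h.lower() for h in headlines)
        for headlines in outlet_headlines.values()
    )

coverage = {
    "Outlet A": ["Central bank raises interest rates", "Storm hits coast"],
    "Outlet B": ["Storm hits coast tonight"],
    "Outlet C": ["Celebrity spotted on Mars"],
}
print(replication_count("storm hits coast", coverage))  # 2
```

A count of zero would correspond to the "only source" case above: grounds for doubt rather than proof of fakery.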

Facebook’s advice boils down to reading news stories critically.

That does not mean looking for their flaws, or criticising them, although this can be part of critical reading and thinking. Instead, it means applying logic and reason to your thinking and reading, so that you make a sensible judgement about what you are reading.

In practice, this means being alert to why the article has been written, and what the author wants you to feel, think or even do as a result of reading it. Even accurate stories may have been written in a way that is designed to steer you towards a particular point of view or action.

For more about this see our pages on Critical Thinking and Critical Reading .

A word about bias

It is worth remembering that everyone has opinions, and therefore potential sources of bias in what they write. These may be conscious or unconscious. News organisations tend to have an organisational ‘view’ or political slant. For example, the UK’s Guardian is broadly left-wing, while most of the UK tabloids are right-wing, and this affects both what they report and how they report it.

As a reader, you also have biases, both conscious and unconscious, and these affect the stories you choose to read, and the sources you use. It is therefore possible to self-select only stories that confirm your own view of the world, and social media is very good at helping with this.

To overcome this, it is important to use more than one source of information, and try to ensure that they have at least small differences in their political views.

A final thought

Fake news spreads so fast because we all like the idea of telling people something that they did not already know, something exclusive, and because we want to share our view of the world. It’s a bit like gossip.

But like false gossip, fake news can harm. Next time, before you click on ‘share’ or ‘retweet’, just take a moment to think about whether the story that you are spreading is likely to be true or not. Even if you think it is true, consider the possible effect of spreading it. Is it going to hurt anyone if it turns out to be false?

If so, don’t go there, think before you share.



Christopher Dwyer Ph.D.


10 Ways to Spot Fake News

Evaluating ‘news’ online through critical thinking.

Posted October 4, 2019 | Reviewed by Jessica Schrader

Over the summer, I wrote a couple of pieces about how we can infuse critical thinking into our writing and avoid presenting fake news. During that writing, it dawned on me that it’s arguably easier to avoid presenting fake news than it is to identify it. Indeed, research indicates both that students struggle to evaluate the credibility of information online (Wineburg et al., 2016) and that approximately 2% of children have the critical literacy skills necessary to identify whether a news story is fake (Commission on Fake News and the Teaching of Critical Literacy in Schools, 2018).

‘Fake news’ is not a new concept, though it has grown in influence since the dawn of social media, which facilitates easier transmission of such ‘stories.’ The way information exchange has evolved over the past 15 years provides us with more and more examples of fake news, for example through clickbait, biased reporting, propaganda, and bad journalism. Fake news can be exchanged accidentally or deliberately, which the parliamentary Committee on Digital, Culture, Media and Sport in the UK refers to as misinformation and disinformation, respectively.

Indeed, perhaps one of the best tips for avoiding the presentation of fake news is to advance one’s ability to identify it in the first place. As a result, I present 10 Ways to Spot Fake News. Before I begin, it’s worth noting that a variety of models for spotting fake news already exist, many of which are helpful. However, what is often glossed over is that what they are really describing, in watered-down form, is the importance of critical evaluation. Thus, I present these tips in the context of evaluation, the critical thinking skill used to assess propositions and the conclusions they infer with respect to their credibility, relevance, logical strength, and balance in the argument, thereby deciding the overall strength or weakness of the argument (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014; Facione, 1990).

Credibility

Evaluating the credibility of claims and arguments involves progressing beyond merely identifying the source of propositions in an argument, to actually examining the credibility of those identified sources (e.g., personal experiences, common beliefs, opinions, expert/authority opinion, statistics, and research evidence). So…

1. Don’t just read the headline — dig deeper. Read the full article and assess the sources of the claims.

2. Look for evidence, not opinion (unless it’s a relevant expert’s opinion—remember, they’re an expert in a particular field for a reason). Personal experiences and common beliefs are not credible sources.

3. Look for replication —has the same story been published elsewhere? If multiple sources are covering it, it’s more likely legitimate than if this is the only source. If not, it could be unreliable. What website are you reading it from? Does the address look dodgy or untrustworthy? If so, you should…

4. Read about the site, the author, or publisher. Knowing more about these will help inform your evaluation of balance as well (see below). What do you do if the evidence they present isn’t language-based … what if it’s an image? A picture is worth a thousand words, right? Not necessarily—pictures can lie as well. Perhaps it might be worth conducting a reverse image search to see if it is a fake image?

Evaluation also implies deep consideration of the relevance of claims within an argument, which is accomplished through assessing the contextual pertinence or applicability of one proposition to another. So…

5. Ask yourself, are all the reasons presented to you for believing something actually relevant to the central claim? For example, suppose that in the heat of debate on the biological basis of aggression , a person says to you:

“Well, you mentioned that men and women have different levels of testosterone — men have more testosterone and this is one reason why men are more aggressive. But, did you know that testosterone has also been implicated in the structural brain differences that underpin gender differences in language ability and spatial ability?”


Though related, is this argument regarding gender differences in language and spatial ability truly relevant to the claim about aggression? I’ve seen and heard many arguments go off on tangents because they trail off on a path of what’s related , but not what’s relevant . If evidence and logic cease to be relevant to the central claim, it could be a case of sloppy writing or, more insidiously, a crafty means of biasing the reader—either way, there’s a good chance that this might be fake news.

Logical Strength

An argument is not just a heated debate—every piece of text you read that contains the words but or because , however , yet , therefore , thus , etc., is an argument. Evaluating the logical strength of an argument is accomplished by monitoring both the logical relationships among propositions and the claims they infer. The overall structure of an argument needs to be logical if the argument is to be considered strong.

6. If the structure lacks logic or what the writer deems to be logic is weak, this may be a sign that you’re dealing with fake news.

7. Logic is objective; so, look out for dramatic punctuation (!) and sensationalist language. Yes, real news often presents its headlines in a sensationalist manner, but fake news can go overboard with this. With that, a good rule of thumb is to evaluate any sensationalist report with extra care, regardless of the source.

8. Construction of logic requires care, so look out for careless presentation. Imagine a piece seems logical—OK, fair enough. Did you notice spelling mistakes or any concerning issues with the manner in which the piece was presented? These may also be signs of fake news.

The final feature of evaluation is the assessment of the extent to which there is a balance of evidence in an argument structure.

9. Count the reasons and objections (i.e., reasons for and against). If there’s a relatively large difference between these counts, then we can consider the argument imbalanced, which may imply that the argument’s author is in some way biased. However, it may also mean that there is an imbalanced amount of evidence available for evaluation (and thus not necessarily biased writing; rather, research that is as complete as it can be); so, be extra careful in your assessment (and don't be overly skeptical ).
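As a rough illustration of this counting heuristic, the check can be sketched in a few lines of code. This is purely a sketch: the tallies are supplied by the reader, and the 3:1 flagging threshold is an arbitrary assumption for the demo, not part of any formal method.

```python
# Sketch of tip 9: tally supporting reasons and objections in an argument
# and flag a large imbalance as a *possible* sign of bias.
# The threshold (a 3:1 ratio) is an invented assumption for illustration.

def imbalance_ratio(reasons_for: int, reasons_against: int) -> float:
    """Ratio of the larger tally to the smaller (smaller floored at 1)."""
    high = max(reasons_for, reasons_against)
    low = min(reasons_for, reasons_against)
    return high / max(low, 1)

def looks_imbalanced(reasons_for: int, reasons_against: int,
                     threshold: float = 3.0) -> bool:
    # A high ratio does not prove bias -- the available evidence itself
    # may be one-sided -- it only signals that closer scrutiny is needed.
    return imbalance_ratio(reasons_for, reasons_against) >= threshold

print(looks_imbalanced(6, 1))  # six supports vs. one objection -> True
print(looks_imbalanced(4, 3))  # roughly balanced -> False
```

As the comments note, the flag is a prompt for closer reading, not a verdict: an imbalanced count can reflect imbalanced evidence rather than a biased author.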

Furthermore, an argument may be biased in the sense that a person has a belief or prejudgment that makes them focus only on reasoning that supports their belief (e.g., confirmation bias ). There are two extremes of bias and many shades of difference between these two extremes. The first extreme is where a person wholeheartedly agrees with a claim and offers only supporting arguments (i.e., omitting objections). The second extreme is where a person vehemently opposes a claim and offers only objections (i.e., omitting supports). In both cases, the person may be overlooking some important arguments; and in both cases, we need to…

10. Question the intentions of the author and ask, what is the purpose of this news story? I find that a useful way of assessing bias is to ask whether the piece or 'story' made me feel something. Remember, news stories are supposed to be objective. If what was presented to you evokes some kind of emotion , then either the author is biased or you are. Be honest with yourself about this and assess both possibilities. Notably, this ‘emotion evoking’ is commonplace in pieces like editorials or opinion-based articles; so, it’s not always necessarily fake news . With that said, you must be aware of the piece’s format, its role, and its purpose.

Similarly, arguments can be biased as a result of deliberately pitting weak propositions (e.g., with regard to relevance or credibility) on one side against strong propositions on the other. For example, by placing a string of three anecdotes on one side of a debate against one good quality piece of research on the other side, we may well wonder if the author is not deliberately pitting the strong against the weak in order to make us rethink our overall conclusion. Furthermore, the author might be presenting weak statements against one strong proposition in order to feign a balanced argument, when, in reality, the argument is both imbalanced and biased. Evaluating the potential for omission, bias, and imbalance in an argument allows us to identify and address an argument’s underlying motives.

Commission on Fake News and the Teaching of Critical Literacy in Schools (2018). Fake news and critical literacy: Final report. National Literacy Trust: UK.

Dwyer, C. P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press.

Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.

Facione, P. A. (1990). The Delphi report: Committee on pre-college philosophy. Millbrae, CA: California Academic Press.

Wineburg, S., McGrew, S., Breakstone, J., & Ortega, T. (2016). Evaluating information: The cornerstone of civic online reasoning. Stanford Digital Repository. Available at: http://purl.stanford.edu/fv751yt5934

Christopher Dwyer Ph.D.

Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.


Planet Money

How do you counter misinformation? Critical thinking is step one


Greg Rosalsky


Late last year, in the days before the Slovakian parliamentary elections, two viral audio clips threatened to derail the campaign of a pro-Western, liberal party leader named Michal Šimečka. The first was a clip of Šimečka announcing he wanted to double the price of beer, which, in a nation known for its love of lagers and pilsners, is not exactly a popular policy position.

In a second clip, Šimečka can be heard telling a journalist about his intentions to commit fraud and rig the election. Talk about career suicide, especially for someone known as a champion of liberal democracy.

There was, however, just one issue with these audio clips: They were completely fake.

The International Press Institute has called this episode in Slovakia the first time that AI deepfakes — fake audio clips, images, or videos generated by artificial intelligence — have played a prominent role in a national election. While it's unclear whether these bogus audio clips were decisive in Slovakia's electoral contest, the fact is Šimečka's party lost the election, and a pro-Kremlin populist now leads Slovakia.

In January, a report from the World Economic Forum found that over 1,400 security experts consider misinformation and disinformation (misinformation created with the intention to mislead) the biggest global risk in the next two years — more dangerous than war, extreme weather events, inflation, and everything else that's scary. There is a bevy of new books and a constant stream of articles wrestling with this issue. Now even economists are working to figure out how to fight misinformation.

In a new study , "Toward an Understanding of the Economics of Misinformation: Evidence from a Demand Side Field Experiment on Critical Thinking," economists John A. List, Lina M. Ramírez, Julia Seither, Jaime Unda and Beatriz Vallejo conduct a real-world experiment to see whether simple, low-cost nudges can be effective in helping consumers to reject misinformation. (Side note: List is a groundbreaking empirical economist at the University of Chicago, and he's a longtime friend of the show and this newsletter ).

While most studies have focused on the supply side of misinformation — social media platforms, nefarious suppliers of lies and hoaxes, and so on — these authors say much less attention has been paid to the demand side: increasing our capacity, as individuals, to identify and think critically about the bogus information that we may encounter in our daily lives.

A Real-Life Experiment To Fight Misinformation

The economists conducted their field experiment in the run-up to the 2022 presidential election in Colombia. Like the United States, Colombia is grappling with political polarization. Within a context of extreme tribalism, the authors suggest, truth becomes more disposable and the demand for misinformation rises. People become willing to believe and share anything in their quest for their political tribe to win.

To figure out effective ways to lower the demand for misinformation, the economists recruited over 2,000 Colombians to participate in an online experiment. These participants were randomly distributed into four different groups.

One group was shown a video demonstrating "how automatic thinking and misperceptions can affect our everyday lives." The video shows an interaction between two people from politically antagonistic social groups who, before interacting, express negative stereotypes about the other's group. The video shows a convincing journey of these two people overcoming their differences. Ultimately, they express regret over unthinkingly using stereotypes to dehumanize one another. The video ends by encouraging viewers to question their own biases by "slowing down" their thinking and thinking more critically.

Another group completed "a personality test that shows them their cognitive traits and how this makes them prone to behavioral biases." The basic idea is that they see their biases in action and become more self-aware and critical of them, thereby decreasing their demand for misinformation.

A third group both watched the video and took the personality test.

Finally, there was a control group, which neither watched the video nor took the personality test.

To gauge whether these nudges got participants to be more critical of misinformation, each group was shown a series of headlines, some completely fake and some real. Some of these headlines leaned left, others leaned right, and some were politically neutral. The participants were then asked to determine whether these headlines were fake. In addition, the participants were shown two untrue tweets, one political and one not. They were asked whether the tweets were truthful and whether they would report either one to social media moderators as misinformation.
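The design described above — random assignment into four video/test conditions, then scoring judgments of headlines as fake or real — can be sketched as follows. This is a hypothetical illustration with invented names and data, not the authors' actual code or materials.

```python
import random

# Sketch of the study's 2x2 design: participants are randomly assigned to
# (video?, test?) conditions, then their fake/real judgments on headlines
# are scored for accuracy. All identifiers and data here are invented.

CONDITIONS = [
    {"video": True,  "test": False},   # video only
    {"video": False, "test": True},    # personality test only
    {"video": True,  "test": True},    # both
    {"video": False, "test": False},   # control
]

def assign(participants, rng):
    """Randomly place each participant into one of the four conditions."""
    return {p: rng.choice(CONDITIONS) for p in participants}

def accuracy(judgments, truth):
    """Share of headlines a participant correctly labeled fake or real."""
    correct = sum(judgments[h] == truth[h] for h in truth)
    return correct / len(truth)

rng = random.Random(0)
groups = assign([f"p{i}" for i in range(8)], rng)
truth = {"h1": "fake", "h2": "real", "h3": "fake"}
print(accuracy({"h1": "fake", "h2": "real", "h3": "real"}, truth))  # 2 of 3 correct
```

Comparing mean accuracy across the four condition groups is what lets the researchers attribute differences in misinformation detection to the video, the test, or their combination.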

What They Found

The economists find that the simple intervention of showing a short video of people from politically antagonistic backgrounds getting along inspires viewers to be more skeptical of and less susceptible to misinformation. They find that participants who watch the video are over 30 percent less likely to "consider fake news reliable." At the same time, the video did little to encourage viewers to report fake tweets as misinformation.

Meanwhile, the researchers find that the personality test, which forces participants to confront their own biases, has little or no effect on their propensity to believe or reject fake news. It turns out being called out on our lizard brain tribalism and other biases doesn't necessarily improve our thinking.

In a concerning twist, the economists found that participants who both took the test and watched the video became so skeptical that they were about 31 percent less likely to view true headlines as reliable. In other words, they became so distrustful that even the truth became suspect. As has become increasingly clear, this is a danger in the new world of deepfakes: not only do they make people believe untrue things, they also may make people so disoriented that they don't believe true things.

As for why the videos are successful in helping to fight misinformation, the researchers suggest that it's because they encourage people to stop dehumanizing their political opponents, think more critically, and be less willing to accept bogus narratives even when it bolsters their political beliefs or goals. Often — in a sort of kumbaya way — centrist political leaders encourage us to recognize our commonalities as fellow countrymen and work together across partisan lines. It turns out that may also help us sharpen our thinking skills and improve our ability to recognize and reject misinformation.

Critical Thinking In The Age Of AI

Of course, this study was conducted back in 2022. Back then, misinformation, for the most part, was pretty low-tech. Misinformation may now be getting turbocharged with the rapid proliferation and advancement of artificial intelligence.

List and his colleagues are far from the first scholars to suggest that helping us become more critical thinkers is an effective way to combat misinformation. University of Cambridge psychologist Sander van der Linden has done a lot of work in the realm of what's known as "psychological inoculation," basically getting people to recognize how and why we're susceptible to misinformation as a way to make us less likely to believe it when we encounter it. He's the author of a new book called Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity. Drawing an analogy to how vaccinations work, van der Linden advocates exposing people to misinformation and showing how it's false as a way to help them spot and reject misinformation in the wild. He calls it "prebunking" (as in debunking something before it happens).

Of course, especially with the advent of AI deepfakes, misinformation cannot be combated on the demand side alone. Social media platforms, AI companies, and the government will all likely have to play an important role. There's clearly a long way to go in overcoming this problem, but we have recently seen some progress. For example, OpenAI recently began "watermarking" AI-generated images that its software produces to help people spot pictures that aren't real. And the federal government recently encouraged four companies to create new technologies to help people distinguish between authentic human speech and AI deepfakes.

This new world where the truth is harder to believe may be pretty scary. But, as this new study suggests, nudges and incentives to get us to slow our thinking, think more critically, and be less tribal could be an important part of the solution.

  • Open access
  • Published: 26 April 2023

Fake news detection on social media: the predictive role of university students’ critical thinking dispositions and new media literacy

Ali Orhan (ORCID: orcid.org/0000-0003-1234-3919)

Smart Learning Environments, volume 10, Article number: 29 (2023)


This study aimed to investigate the predictive role of critical thinking dispositions and new media literacies on the ability to detect fake news on social media. The sample group of the study consisted of 157 university students. The Sosu Critical Thinking Dispositions Scale, the New Media Literacy Scale, and a fake news detection task were employed to gather the data. It was found that university students possess high critical thinking dispositions and new media literacies as well as high fake news detection abilities, and that there is a positive and moderate relationship among these variables. This study also revealed that critical thinking dispositions and new media literacies significantly predicted university students’ abilities to detect fake news on social media, together explaining 18% of the total variance in fake news detection. Moreover, university students’ critical thinking dispositions had a larger effect on their abilities to detect fake news than their new media literacies.
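The kind of analysis summarized in the abstract — two predictors jointly explaining a share of the variance in fake news detection — can be illustrated with ordinary least squares on synthetic data. The score distributions and coefficients below are fabricated for the demo and do not reproduce the study's data or results; only the sample size (157) is taken from the abstract.

```python
import numpy as np

# Sketch of a two-predictor regression like the one in the abstract:
# fake-news detection scores regressed on critical thinking (CT)
# dispositions and new media literacy (NML). All data are synthetic.

rng = np.random.default_rng(42)
n = 157  # sample size matching the study; scores are randomly generated
ct = rng.normal(4.0, 0.5, n)     # hypothetical CT disposition scores
nml = rng.normal(3.8, 0.6, n)    # hypothetical NML scores
# Simulated detection ability depends moderately on both, plus noise.
detection = 0.5 * ct + 0.3 * nml + rng.normal(0, 0.7, n)

X = np.column_stack([np.ones(n), ct, nml])   # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, detection, rcond=None)

pred = X @ beta
r2 = 1 - np.sum((detection - pred) ** 2) / np.sum((detection - detection.mean()) ** 2)
print(f"coefficients: {beta.round(2)}, R^2 = {r2:.2f}")
```

Here R² plays the role of the "explained variance" figure reported in the abstract, and the relative size of the two slope coefficients corresponds to comparing the effects of CT and NML on detection ability.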

Introduction

With the great enhancement of the internet, social media (SM) has become one of the most widely used sources of information today, and a great number of people use SM platforms to learn the news (Aldwairi & Alwahedi, 2018 ). A vast amount of information exists on the internet, and it can be disseminated very easily and quickly on SM. We can learn breaking news more quickly on SM than through any other conventional means of communication. However, there is a serious problem here. Although SM provides a space for news to spread at an impressive rate, it can also become a hotbed of misinformation (Gaozhao, 2021 ; Shu et al., 2017 ). Instead of reaching true and unbiased news, people are bombarded with a great deal of fake news (FN) on SM. As a recent example, there has been a rash of digital FN about COVID-19 on SM, resulting in undesired health, social, and cultural consequences (Kouzy et al., 2020 ). FN also had an undeniable role in the Brexit Referendum and in the USA Presidential Election in 2016 (Allcott & Gentzkow, 2017 ; Bastos & Mercea, 2019 ).

FN can be briefly defined as inaccurate or fictitious content which is released or disseminated as real information although it is not (Gaozhao, 2021 ). FN—in other words, fabricated news—can be easily disseminated in the form of real news either on SM or through other conventional means of communication (Molina et al., 2021 ), and it does not have any objective evidence to show the authenticity of the information it conveys (Pennycook & Rand, 2021 ). FN attracts far more attention and spreads more quickly than real news, and it mostly includes emotionally charged language (Vosoughi et al., 2018 ). Although the FN phenomenon has hundreds of years of history (Tandoc et al., 2018 ) and the ability to detect it has been prized for a long time (Beiler & Kiesler, 2018 ; Burkhardt, 2017 ), its reach and largely deleterious effects have grown significantly today because of the wide range of SM platforms (Allcott & Gentzkow, 2017 ). FN is undesirable because it can influence people psychologically and socially by distorting their beliefs, resulting in misinformed and poor decisions (Zimmermann & Kohring, 2020 ). Moreover, FN does not only negatively affect individuals; it is also harmful to society in many ways. It can have many political, economic, social, and cultural consequences. People can lose their trust in the media (Vaccari & Chadwick, 2020 ), and the balance of the news ecosystem can be ruined by the increasing spread of FN (Shu et al., 2017 ).

Therefore, although it is vital to ascertain news veracity on SM, where information is easily accessible, the nature of SM makes this difficult (Hernon, 1995 ). SM users can have broad access to produce information without any filtering or editorial judgement (Allcott & Gentzkow, 2017 ), and most of the content on SM is spontaneous and unprofessional (Robinson & DeShano, 2011 ). Besides, the great volume of information on SM makes it impossible to check the authenticity of all the news on SM (Pennycook & Rand, 2019 ; Zhang & Ghorbani, 2020 ). Also, people tend to share information easily and repeatedly with others when they believe FN is true, which accelerates the spread of FN (Oh et al., 2018 ), and they can unwittingly contribute to the dissemination of FN produced by others on SM.

Although producing and consuming FN are clearly two different behaviors, the difference between them has become blurred because of the characteristics of the SM platforms. Individuals can easily create, share, and consume information on SM within a few seconds from anywhere and at any time. Therefore, individuals can easily change their roles from FN producers to consumers, or vice versa (Kim et al., 2021 ). Also, they can do this intentionally or unintentionally. Therefore, SM is a great place for FN to become extremely influential and spread extremely fast. As SM provides a space for the proliferation of FN and a lot of people rely on SM platforms as the main source of information (Lazer et al., 2018 ), educating people to fight against FN and equipping them with the necessary tools to identify FN are crucial (Zhang & Ghorbani, 2020 ). Previous literature indicates that critical thinking (CT) and new media literacy (NML) are two of the most essential tools that can be used by individuals to protect themselves against FN on SM.

Critical thinking and fake news detection

CT is a functional, reflective, and logical thinking process employed by individuals before deciding what to do or what to believe (Ennis, 2000 ). In other words, CT can help individuals to make true and reasonable decisions about their actions or about the accuracy of information. CT leads to more accurate and systematic processing of ideas, arguments, and information (Ruggerio, 1988 ), in which their quality and accuracy are examined and evaluated (Lewis & Smith, 1993 ); after this careful and logical examination process, individuals decide whether to believe or support them. Therefore, it can be said that CT works as armor that protects individuals against fake information (Epstein & Kernberger, 2012 ). An adequate critical thinker is “habitually inquisitive, well-informed, trustful of reason, open-minded, fair-minded in evaluation, prudent in making judgments, willing to reconsider, diligent in seeking relevant information, reasonable in the selection of criteria, and focused in inquiry” (Facione, 1990 , p. 2). For adequate critical thinkers, all assumptions are questionable and divergent ideas are always welcomed. They are always willing to inquire, and this inquiry is not affected by their emotions, heuristics, or prejudices, and is not biased in favor of a particular outcome (Kurfiss, 1988 ).

Therefore, CT can be seen as an effective weapon to combat FN (Bronstein et al., 2019 ; Wilson, 2018 ). CT is a thinking process employed by individuals to perceive whether information is real or fake (Paul, 1990 ). Good critical thinkers—in other words, people who have both high CT skills and high CT dispositions—tend to examine and evaluate the news they encounter on SM to see if it is accurate and real (Lewis & Smith, 1993 ). They evaluate the sensibility and accuracy of given news, examine its source, and look for sound evidence before trusting in its accuracy (Mason, 2008 ). Based on the results of this careful examination process, they can decide whether or not to share this news with others. Therefore, CT can also be seen as an important barrier against the proliferation of FN on SM. While individuals lacking CT tend to share information easily and repeatedly with others without checking its accuracy, resulting in the quick spread of FN (Oh et al., 2018 ), individuals with high CT skills and dispositions tend to evaluate the content and its source before sharing. After this careful examination, based on sound evidence instead of heuristics and emotions (Kahneman, 2011 ), they can choose not to share the news with others if they judge it to be fake or misleading, or if it lacks strong arguments; hence, they can break the chain and decelerate the dissemination of FN on SM. Therefore, it can be said that adequate critical thinkers not only avoid falling into the traps of FN but also do not contribute to the dissemination of FN produced by others on SM.

Previous literature has reported some empirical evidence indicating that CT has a positive effect on detecting FN on SM. In their study with 1129 participants, Escola-Gascon et al. ( 2021 ) found that CT dispositions significantly predicted the detection of FN. In their experimental study examining whether adding CT recommendations to SM posts can help people better discriminate true news from FN, Kruijt et al. ( 2022 ) concluded that participants who were exposed to CT recommendations performed better at detecting FN. Lutzke et al. ( 2019 ) found that participants exposed to guidelines priming CT performed better at detecting FN in their experimental study, which was carried out to investigate the effectiveness of guidelines priming CT on willingness to trust, like, and share FN.

New media literacy and fake news detection

NML can be seen as a broader term which involves different kinds of literacy, such as classic (e.g., reading and writing), audiovisual (electronic media), digital, and information literacies. It includes some important process skills like access, analysis, evaluation, critique, production, and participation in media content (Hobbs & Jensen, 2009 ; Lee et al., 2015 ; Zhang et al., 2014 ). The interaction between individuals and media content can be divided into two categories, namely, consuming and prosuming (Toffler, 1981 ). Also, the term literacy can be divided into two categories: functional literacies and critical literacies (Buckingham, 2003 ). While functional literacies, which are related to skills and knowledge, refer to individuals’ capability of knowing how, critical literacies are about individuals’ capability of meaning-making and of evaluating the credibility, accuracy, and usefulness of the message (Buckingham, 2003 ). Based on these two categorizations, a conceptual framework for NML was proposed by Chen et al. ( 2011 ). This framework proposes that NML includes functional consuming, functional prosuming, critical consuming, and critical prosuming literacies. While functional consuming literacy is about individuals’ capability of gaining access to created new media content and understanding the message it conveys, critical consuming literacy refers to individuals’ capability of investigating the media content from different perspectives, such as cultural, political, social, and economic ones (Chen et al., 2011 ). Also, while functional prosuming literacy refers to the ability to create media content, critical prosuming literacy involves the contextual interpretation of media content by individuals during their activities on media (Chen et al., 2011 ).

Individuals with high NML are aware of the way messages are created, disseminated, and commercialized all over the world (Thoman & Jolls, 2004 ) and can use different media platforms consciously, distinguish and evaluate different media content, critically investigate media types, their effects, and the messages they convey, and (re)produce new media content (Kellner & Share, 2007 ). In other words, new media literate individuals can critically access, decode, understand, and analyze the messages which various kinds of media content convey (Leaning, 2017 ; Potter, 2010 ), and they can make independent judgements about the veracity of media content (Buckingham, 2015 ; Leaning, 2017 ). Therefore, high NML provides the necessary skills for individuals to actively investigate, evaluate, and analyze media content and its underlying messages, instead of passively consuming that content and accepting the veracity of its messages, which may include misinformation and disinformation; hence, it can protect individuals from the negative effects of new media platforms (Hobbs, 2017 ). New media literate individuals are capable of critically investigating and evaluating the credibility of information or news, verifying its authenticity, and using it ethically. In short, NML increases the possibility that individuals take a critical standpoint toward FN (Kim et al., 2021 ), and it has an important effect on the degree of consumption and dissemination of FN on SM (Staksrud et al., 2013 ). Therefore, new media literate individuals can not only protect themselves against consuming FN, but they are also likely to be unwilling to share news without being sure of its accuracy, and hence to play a proactive role in stopping the spread of FN (Parikh & Atrey, 2018 ).

Previous research has provided some empirical evidence indicating that NML has a positive effect on the ability to detect FN on SM. In their study investigating the relation between students’ level of new media literacy and their ability to discern FN, Luo et al. ( 2022 ) concluded that NML and FN detection performance are significantly related to each other. In her experimental study investigating the effectiveness of media and information literacy on FN detection, Adjin-Tettey ( 2022 ) found that participants trained in media and information literacy were more likely to detect FN and less likely to share it. Similarly, Al Zou’bi ( 2022 ) carried out an experimental study to investigate the effectiveness of media and information literacy on FN detection and concluded that media and information literate students presented better abilities to detect FN. Moore and Hancock ( 2022 ) also reported similar results in their experimental study. Besides, Guess et al. ( 2020 ) concluded that participants who received media literacy education were more successful at FN detection. Also, Lee et al. ( 2022 ) concluded that NML plays an important role in mitigating the FN problem in their study, which was carried out to investigate the effectiveness of NML on perceptions of FN, media trust, and fact-checking motivation.

The current study

FN is an important problem: it can pollute the public sphere and harm democracy, journalism, and freedom of expression (Pogue, 2017 ). Indeed, it is listed as one of the most important threats to society by the World Economic Forum (Del Vicario et al., 2016 ). Therefore, equipping individuals with the tools needed to identify FN is crucial (Zhang & Ghorbani, 2020 ). However, the use of these tools, especially cognitive ones, to combat FN has not been investigated sufficiently (Machete & Turpin, 2020 ; Wu et al., 2022 ), and there is a clear need for further studies investigating what can be done and how these tools can be used against FN (Au et al., 2021 ). Previous literature has shown that CT and NML, which can be seen as a survival kit for this century, are two of the most essential cognitive tools individuals can use to protect themselves against FN on SM. Although there is a well-established theoretical base for the positive effect of CT and NML on FN detection, there is not enough empirical evidence for their positive role in fighting FN (Xiao et al., 2021 ; Zanuddin & Shin, 2020 ). Moreover, most previous research on this effect consists of correlational and experimental studies; studies examining the predictive role of CT and NML on FN detection are scarce, so empirical evidence regarding the predictive power of these variables is limited. Examining the predictive power of CT and NML on FN detection is therefore a promising line of research that may shed light on the extent to which these two variables affect FN detection on SM. This dearth of research provides sufficient reason for the present study, which examines the effectiveness of CT and NML on FN detection.
Accordingly, this study aimed to examine the predictive role of university students' CT dispositions and NML on their ability to detect FN on SM. To this end, the following research questions were addressed:

What are university students’ levels of CT dispositions and NML?

Are university students’ Sosu Critical Thinking Dispositions Scale (CTDS) and New Media Literacy Scale (NMLS) scores significant predictors of their ability to detect FN?

In this non-experimental quantitative study, a cross-sectional survey design was used. University students' ability to detect FN was the dependent variable, and their scores on the CTDS and NMLS were the predictor variables.

Study group

This study was conducted with 157 university students (66 females, 91 males) studying at a state university in Turkey in the 2022–2023 academic year. The students were recruited on a voluntary basis. Their mean age was 18.96 (SD = 1.00), ranging between 17 and 24. All students were in the one-year English preparatory program that is compulsory before they begin their education in their departments; during this year they study only English. Most of the students' mothers had graduated from high school (31.8%) or primary school (29.3%), while most of their fathers were high school (38.2%) or university (25.5%) graduates. An a-priori power analysis for linear multiple regression (alpha = 0.05; power = 0.95; two predictors), conducted with G*Power 3 (Faul et al., 2007 ), showed that a minimum sample size of 107 was needed to detect a medium effect size (f 2  = 0.15), so the sample size of 157 was adequate.
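The a-priori calculation reported above can be cross-checked without G*Power: for the fixed-model "R² deviation from zero" test, power follows from the noncentral F distribution, assuming the standard noncentrality parameterization λ = f²·N. A minimal SciPy sketch (function names are illustrative):

```python
from scipy.stats import f as f_dist, ncf

def regression_power(n, n_predictors, f2, alpha=0.05):
    """Power of the overall F test in fixed-model multiple regression,
    assuming noncentrality lambda = f2 * n."""
    df1 = n_predictors
    df2 = n - n_predictors - 1
    f_crit = f_dist.ppf(1 - alpha, df1, df2)
    return 1 - ncf.cdf(f_crit, df1, df2, f2 * n)

def required_n(n_predictors, f2, alpha=0.05, target_power=0.95):
    """Smallest sample size reaching the target power."""
    n = n_predictors + 2  # smallest n with a positive error df
    while regression_power(n, n_predictors, f2, alpha) < target_power:
        n += 1
    return n

n_min = required_n(n_predictors=2, f2=0.15)  # medium effect, alpha = .05, power = .95
```

With these settings the computed minimum should agree closely with the value reported above.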

Data collection tools

Sosu Critical Thinking Dispositions Scale (CTDS)

The CTDS, developed by Sosu ( 2013 ) and adapted into Turkish by Orhan ( 2023 ), was used to measure the university students' CT dispositions. The CTDS has 11 items and two sub-dimensions: critical openness (7 items) and reflective skepticism (4 items). The Turkish adaptation study, conducted with two independent samples, indicated that the Turkish version of the CTDS has the same factor structure as the original. In the adaptation study, the reliability coefficient of the CTDS was 0.92 for sample 1 and 0.94 for sample 2. In this study, the reliability coefficient for the total scale was 0.80.

New Media Literacy Scale (NMLS)

The NMLS, developed by Koç and Barut ( 2016 ), was used to determine students' new media literacies. The NMLS has 35 items and four sub-dimensions: functional consumption (7 items), critical consumption (11 items), functional prosumption (7 items), and critical prosumption (10 items). In the development study, the reliability coefficients of the sub-dimensions ranged between 0.85 and 0.93, and the coefficient for the total scale was 0.95. In this study, the reliability coefficient for the total scale was 0.92.
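The reliability coefficients reported for these scales are presumably Cronbach's alpha, the usual internal-consistency index for Likert scales. For reference, a minimal sketch of the computation (the function name is illustrative):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) matrix of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

For perfectly parallel items alpha equals 1; it decreases as item-specific variance grows relative to the shared variance.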

Ability to detect fake news

The university students' ability to detect FN on SM was measured with an FN detection task created by the researcher based on a previous study by Preston et al. ( 2021 ). The task includes six news items: three present real news content and three present FN content. The FN items concern claims that the Red Cross has ceased its activities in Ukraine (fake), that NASA has stopped its research on oceans (fake), and that Starbucks no longer accepts cash payments (fake). The first real news item reports that Dwayne Johnson has become the highest-paid actor for the second consecutive year. The second reports that an electric bus produced by KARSAN (a Turkish company) has started to provide public transportation services in Norway. The third reports that, according to 2020 data from the Food and Agriculture Organization of the United Nations, Turkey became the leading fig producer in 2020, with 320 thousand tons of production. Two independent and impartial fact-checking websites ( www.dogrula.org and www.teyit.org ), both widely used in Turkey and free to access, were consulted for information on the fake and real news items.

Four main components, namely news sharing source, original news item source, content level, and author argument, were considered while developing the mock Facebook post items, so that students could evaluate their levels of objectivity, professionalism, argument strength, and trustworthiness. The news items look like a typical Facebook news post, including likes, comments, and shares, in which an article is shared by an organization related to its content (see Fig.  1 ). For example, the Facebook page named "Sinema & Sinema" shares an article from "boxofficeturkiye.com". The numbers of comments, likes, and shares are similar between the fake and real news item groups so as not to affect students' choices.

Figure 1. Examples of FN items (left) and real news items (right)

For the first component, more objective content names (e.g. Sinema & Sinema) were used for the real news items to create impressions of professionalism and objectivity, while more subjective-sounding content names (e.g. bilim günlüğü) were chosen for FN. For the second component, website names suggesting more objective content (e.g. boxofficeturkiye.com) were chosen for the real news items, while FN items used website names suggestive of more subjective content (e.g. savunmatr.com). For the third component, FN items presented short information written in a subjective style without citing any credible source, suggesting low trustworthiness. For the fourth component, FN items included author arguments written in emotive language and without references to reliable sources, suggesting subjectivity and low levels of argument strength, professionalism, and trustworthiness. The opposite strategies, such as references to reliable sources and non-emotive language, were employed for the real news items.

After the six news items were prepared, students were asked to critically analyze each of them and answer four questions prefaced with the text "to what extent do you agree with the following statement". The first question, "the author and shared article are objective", evaluates objectivity. The second, "the article seems to be produced by a professional", evaluates professionalism. The third, "the article presents a strong argument", evaluates argument strength. The last, "this source of information is credible and trustworthy", evaluates trustworthiness. Students answer each question on a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Responses to the FN items are reverse-coded. Scores on the FN detection task can range from 24 to 120, with a midpoint of 72. Higher scores indicate a stronger ability to detect FN, and lower scores a weaker one. The reliability coefficient of the FN detection task in this study was 0.84.
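Given the scoring rule described above (6 items × 4 questions = 24 Likert ratings, with ratings for the three fake items reverse-coded as 6 − rating), the total score can be sketched as follows. The function name and the assumption that the fake items come first in the response grid are illustrative, not taken from the original task:

```python
def fn_detection_score(responses, fake_items=(0, 1, 2)):
    """Total FN-detection score from six 4-element lists of 1-5 Likert ratings.

    Ratings for the fake items are reverse-coded (6 - rating) so that higher
    totals always mean better detection. Possible range: 24-120.
    """
    total = 0
    for item_idx, ratings in enumerate(responses):
        for r in ratings:
            total += (6 - r) if item_idx in fake_items else r
    return total
```

A perfect detector (all 1s on fake items, all 5s on real items) scores 120; uniformly neutral ratings of 3 land exactly on the scale midpoint of 72.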

Data collection

Following ethics committee approval from ZBEU, the data were gathered in the fall term of the 2021–2022 academic year. The aim of the study and the privacy and confidentiality arrangements were explained to all students, who were informed of their right to withdraw from the study at any time. Students completed the instruments in about 30 min.

Data analysis

First, all variables were checked for missing data; none were found. Next, skewness and kurtosis values indicated that the data were normally distributed for each variable (see Table 1 ). Possible multivariate outliers and outliers per variable were then checked with Mahalanobis distance scores and z-transformed values; these revealed no influential outliers requiring exclusion. Pearson correlations, CI, VIF, and tolerance values were examined for high correlations among the variables, and none were found. Descriptive statistics, Pearson correlations, and multiple linear regression (enter method) were computed with SPSS 20.
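The screening steps above (z-transformed scores for univariate outliers, Mahalanobis distances for multivariate outliers, VIF for collinearity) were run in SPSS; for readers without SPSS, an equivalent NumPy sketch might look like this (the function name is illustrative, and cutoff decisions are left to the analyst):

```python
import numpy as np

def screen(X):
    """Outlier and collinearity screening for an (n, p) data matrix:
    returns z-scores, squared Mahalanobis distances, and VIF per variable."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    # Univariate outliers: standardized (z-transformed) scores
    z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    # Multivariate outliers: squared Mahalanobis distance from the centroid
    centered = X - X.mean(axis=0)
    inv_cov = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', centered, inv_cov, centered)
    # Collinearity: VIF_j = 1 / (1 - R^2 of X_j regressed on the rest)
    vif = np.empty(p)
    for j in range(p):
        y, others = X[:, j], np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])
        resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
        vif[j] = 1.0 / (resid.var() / y.var())
    return z, d2, vif
```

A useful sanity check: with the sample mean and covariance (ddof = 1), the squared Mahalanobis distances always sum to (n − 1) · p.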

As shown in Table 1 , university students presented high CT dispositions (M = 3.89) and new media literacies (M = 3.83). Their mean score on the FN detection task was 76.84. As FN detection task scores can range between 24 and 120, with a midpoint of 72, the students scored above the midpoint, indicating a high ability to detect FN.

As shown in Table 2 , university students' CT dispositions ( r  = 0.399) and new media literacies ( r  = 0.303) had moderate, positive relationships with their ability to detect FN. There was also a positive, moderate relationship between university students' CT dispositions and new media literacies ( r  = 0.417).

As shown in Table 3 , multiple linear regression analysis indicated that university students' CT dispositions (β = 0.329, t(157) = 4.106, p < 0.05) and new media literacies (β = 0.165, t(157) = 2.062, p < 0.05) significantly predicted their ability to detect FN on SM (R = 0.426, R 2  = 0.181, p < 0.01). The ANOVA F test showed that the regression model was significant (F (2,156)  = 17.065, p < 0.01). Students' CT dispositions and new media literacies together explained 18% of the total variance in their ability to detect FN on SM. Moreover, CT dispositions (β = 0.329) had a larger effect on the ability to detect FN than new media literacies (β = 0.165).
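Because the raw data are not public, the reported model cannot be re-run exactly, but the analysis itself (OLS with standardized coefficients) is easy to reproduce in form. The sketch below uses synthetic stand-in data; the coefficients and seed are illustrative, not the study's:

```python
import numpy as np

def standardized_ols(X, y):
    """OLS on z-scored variables; returns standardized betas and R^2."""
    Xz = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
    yz = (y - y.mean()) / y.std(ddof=1)
    A = np.column_stack([np.ones(len(yz)), Xz])
    coef, *_ = np.linalg.lstsq(A, yz, rcond=None)
    resid = yz - A @ coef
    r2 = 1.0 - (resid @ resid) / (yz @ yz)
    return coef[1:], r2

# Illustrative stand-in data (not the study's): an FN score driven more
# strongly by CT than by NML, plus noise; the predictors correlate,
# as in Table 2.
rng = np.random.default_rng(42)
ct = rng.normal(size=157)
nml = 0.4 * ct + rng.normal(size=157)
fn = 0.35 * ct + 0.15 * nml + rng.normal(size=157)
betas, r2 = standardized_ols(np.column_stack([ct, nml]), fn)
```

Standardized betas make the two predictors directly comparable, which is what licenses the paper's statement that CT dispositions (β = 0.329) had the larger effect.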

This study investigated the predictive role of university students' CT dispositions and NML on their ability to detect FN on SM. University students presented high CT dispositions and NML, as well as a high ability to detect FN. CT dispositions and NML were positively and moderately related to the ability to detect FN, and a positive, moderate relationship between CT dispositions and NML was also found.

This study also revealed that university students' CT dispositions and new media literacies significantly predicted their ability to detect FN on SM, together explaining 18% of the total variance. Previous literature has reported similar results regarding the positive effect of CT dispositions (Escola-Gascon et al., 2021 ; Kruijt et al., 2022 ; Lutzke et al., 2019 ) and NML (Adjin-Tettey, 2022 ; Al Zou'bi, 2022 ; Guess et al., 2020 ; Lee et al., 2022 ; Luo et al., 2022 ; Moore & Hancock, 2022 ) on FN detection on SM, confirming the results of this study.

Individuals with high CT skills and dispositions do not make instant decisions about their behavior or about the accuracy of information; instead, they employ a systematic and logical thinking process in which they examine and evaluate the quality and accuracy of ideas, arguments, and information (Lewis & Smith, 1993 ; Ruggerio, 1988 ). Thanks to CT, they acquire the most accurate information about their environment and can make the best decisions about their actions. Individuals wear CT as armor that protects them against fake information (Epstein & Kernberger, 2012 ). CT is therefore a vital skill in daily life, and it is even more important in the time people spend on SM, where they are bombarded with a great deal of FN. An adequate critical thinker tends to examine and evaluate the accuracy of the news they encounter on SM (Lewis & Smith, 1993 ) and is unwilling to share that news with others until they are sure of its sensibility and accuracy (Mason, 2008 ). CT thus not only helps individuals avoid the traps of FN but also acts as a barrier against the dissemination of FN produced by others on SM, making CT an important and effective weapon against FN (Bronstein et al., 2019 ; Wilson, 2018 ). Indeed, in their systematic review of the literature on the use of CT to detect FN, Machete and Turpin ( 2020 ) concluded that the relevant literature identifies CT as an important skill for identifying FN.

Individuals with high NML possess the ability to access, decode, understand, and analyze the messages that different kinds of media content convey (Leaning, 2017 ; Potter, 2010 ). They can also make independent judgements about the veracity of this media content (Buckingham, 2015 ; Leaning, 2017 ). New media literate individuals tend to engage in an active process of investigating, evaluating, and analyzing media content and its underlying messages, rather than passively consuming it, because they are aware of how media content is created, commercialized, and disseminated around the world (Thoman & Jolls, 2004 ) and know that it may contain misinformation and disinformation. NML therefore equips individuals with the skills to investigate and evaluate the credibility of information or news and its source, verify its authenticity, and use it ethically. Thanks to NML, individuals take a critical standpoint toward FN (Kim et al., 2021 ), and it decreases the likelihood of consuming and disseminating FN on SM (Staksrud et al., 2013 ).

Therefore, the results of this study indicating that CT dispositions and NML were significant predictors of university students' ability to detect FN are in line with the theoretical background and with previous research. In addition, university students' CT dispositions had a larger effect on their ability to detect FN than new media literacies, suggesting that CT dispositions are a more powerful weapon against FN than NML. CT dispositions equip individuals with the skills to examine and evaluate the quality and accuracy of ideas, claims, and judgments, as well as of their sources. Individuals with high CT dispositions are open to new ideas and willing to modify their own ideas and arguments when convincing evidence appears; they are habitually skeptical of any information they encounter in daily life and tend to evaluate the veracity of both the information itself and its source. NML, on the other hand, concerns not only consuming but also producing media content. New media literate individuals can consume media content critically: they have the skills to actively evaluate its credibility, accuracy, and usefulness and to investigate it from different perspectives, such as cultural, political, social, and economic. But NML also covers gaining access to newly created media content, understanding the messages it conveys, (re)producing new media content, using different media platforms consciously, distinguishing different media content, and investigating media types. Accordingly, the consuming literacies are directly related to FN detection, while the producing skills are related to it only indirectly, whereas CT dispositions are directly related to the ability to detect FN. This may explain why CT dispositions were more effective against FN in this study.

In short, this study showed that CT dispositions and NML were significant predictors of university students' ability to detect FN. This result, which is confirmed by previous research, shows the important role of CT dispositions and NML in combating FN, which is one of the most important threats to society in today's world (Del Vicario et al., 2016 ) and can damage democracy, journalism, and freedom of expression (Pogue, 2017 ). Individuals with high CT dispositions and NML are more competent in combating FN: they can protect themselves against FN on SM and, by choosing not to share it, they avoid contributing to its dissemination. Consequently, CT dispositions and NML should be deployed in the fight against FN on SM. CT dispositions (Kennedy et al., 1991 ; Lewis & Smith, 1993 ) and NML (Buckingham, 2003 ; Hobbs & Jensen, 2009 ) are teachable skills, and enhancing them should be among the most important aims of educational systems; they not only contribute positively to individuals' daily and school lives (Halpern, 2003 ; Orhan, 2022a , 2022b ; Paul & Elder, 2001 ) but also act as important barriers against the consumption and dissemination of FN on SM. Enhancing individuals' CT dispositions and NML would thus equip them with skills that are useful in the fight against FN on SM. This study contributes to the literature by presenting additional evidence on the predictive role of CT dispositions and NML on the ability to detect FN on SM, an area in which evidence has been lacking (Xiao et al., 2021 ; Zanuddin & Shin, 2020 ). It also differs from previous studies in that it investigated the predictive roles of CT dispositions and NML on FN detection comparatively, showing that CT dispositions are a more powerful weapon against FN than NML.

Limitations and implications for further research

Although this study provides important results regarding the predictive role of CT dispositions and NML on FN detection, it has several limitations. First, the study group consisted only of university students. Similar studies could therefore be conducted with samples from other educational levels, especially high school students, who spend much of their time on SM. Second, only self-report quantitative tools, which can be influenced by social desirability, were employed to gather the data. Further studies using qualitative or mixed methods could provide a better understanding of the predictive role of CT dispositions and NML on FN detection. Third, several other factors, such as students' level of media knowledge, may affect their ability to detect FN on SM; these factors were not taken into account in this study.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

CT: Critical thinking

NML: New media literacy

SM: Social media

CTDS: Critical Thinking Dispositions Scale

NMLS: New Media Literacy Scale

Adjin-Tettey, T. D. (2022). Combating fake news, disinformation, and misinformation: Experimental evidence for media literacy education. Cogent Arts & Humanities, 9 (1), 1–17.

Al Zou’bi, R. M. (2022). The impact of media and information literacy on students’ acquisition of the skills needed to detect fake news. Journal of Media Literacy Education, 14 (2), 58–71.

Aldwairi, M., & Alwahedi, A. (2018). Detecting fake news in social media networks. Procedia Computer Science, 141 , 215–222.

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31 (2), 211–236.

Au, C. H., Ho, K. K. W., & Chiu, D. K. W. (2021). The role of online misinformation and fake news in ideological polarization: Barriers, catalysts, and implications. Information Systems Frontiers, 2021 , 1–24.

Bastos, M. T., & Mercea, D. (2019). The Brexit botnet and user-generated hyperpartisan news. Social Science Computer Review, 37 (1), 38–54.

Beiler, M., & Kiesler, J. (2018). "Lügenpresse! Lying press!" Is the press lying? In K. Otto & A. Köhler (Eds.), Trust in media and journalism: Empirical perspectives on ethics, norms, impacts and populism in Europe (pp. 155–179). Springer.

Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. Journal of Applied Research in Memory and Cognition, 8 (1), 108–117.

Buckingham, D. (2003). Media education: Literacy, learning and contemporary culture . Polity Press.

Buckingham, D. (2015). Defining digital literacy: What do young people need to know about digital media? Nordic Journal of Digital Literacy, 10 , 21–35.

Burkhardt, J. M. (2017). History of fake news. Library Technology Reports, 53 (8), 5–9.

Chen, D. T., Wu, J., & Wang, Y. M. (2011). Unpacking new media literacy. Journal on Systemics, Cybernetics and Informatics, 9 (2), 84–88.

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences of the United States of America, 113 (3), 554–559.

Ennis, R. H. (2000). Goals for a critical thinking curriculum and its assessment. In A. L. Costa (Ed.), Developing minds: A resource book for teaching thinking (pp. 44–46). ASCD.

Epstein, R. L., & Kernberger, C. (2012). Critical thinking . Advanced Reasoning Forum.

Escola-Gascon, A., Dagnall, N., & Gallifa, J. (2021). Critical thinking predicts reductions in Spanish physicians’ stress levels and promotes fake news detection. Thinking Skills and Creativity, 42 (100934), 1–10.

Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction - executive summary - the Delphi report . The California Academic Press.

Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G* power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39 (2), 175–191.

Gaozhao, D. (2021). Flagging fake news on social media: An experimental study of media consumers’ identification of fake news. Government Information Quarterly, 38 (101591), 1–13.

Guess, A. M., Lerner, M., Lyons, B., Montgomery, J. M., Nyhan, B., Reifler, J., & Sircar, N. (2020). A digital media literacy intervention increases discernment between mainstream and false news in the United States and India. Proceedings of the National Academy of Sciences, 17 (27), 15536–15545.

Halpern, D. F. (2003). Thought & knowledge: An introduction to critical thinking . Lawrence Erlbaum Associates Publishers.

Hernon, P. (1995). Disinformation and misinformation through the internet: Findings of an exploratory study. Government Information Quarterly, 12 (2), 133–139.

Hobbs, R. (2017). Teaching and learning in a post-truth world. Educational Leadership, 75 (3), 26–31.

Hobbs, R., & Jensen, A. (2009). The past, present and future of media literacy education. The Journal of Media Literacy Education, 1 (1), 1–11.

Kahneman, D. (2011). Thinking, fast and slow . Farrar.

Kellner, D., & Share, J. (2007). Critical media literacy, democracy, and the reconstruction of education. In D. Macedo & S. R. Steinberg (Eds.), Media literacy: A reader (pp. 3–23). Peter Lang Publishing.

Kennedy, M., Fisher, M. B., & Ennis, R. H. (1991). Critical thinking: Literature review and needed research. In L. Idol & B. Fly Jones (Eds.), Educational values and cognitive instruction: Implications for reform (pp. 11–40). Lawrence Erlbaum.

Kim, B., Xiong, A., Lee, D., & Han, K. (2021). A systematic review on fake news research through the lens of news creation and consumption: Research efforts, challenges, and future directions. PLoS ONE, 16 (12), 1–28.

Koç, M., & Barut, E. (2016). Development and validation of New Media Literacy Scale (NMLS) for university students. Computers in Human Behavior, 63 , 834–843.

Kouzy, R., Abi Jaoude, J., Kraitem, A., El Alam, M. B., Karam, B., Adib, E., Zarka, J., Traboulsi, C., Akl, E. W., & Baddour, K. (2020). Coronavirus goes viral: Quantifying the COVID-19 misinformation epidemic on Twitter. Cureus, 12 (3), 1–9.

Kruijt, J., Meppelink, C. S., & Vandeberg, L. (2022). Stop and think! Exploring the role of news truth discernment, information literacy, and impulsivity in the effect of critical thinking recommendations on trust in fake Covid-19 news. European Journal of Health Communication, 3 (2), 40–63.

Kurfiss, J. G. (1988). Critical thinking: Theory, research, practice, and possibilities . ASHE-ERIC Higher Education Reports.

Lazer, D. M. J., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., Metzger, M. J., Nyhan, B., Pennycook, G., Rothschild, D., Schudson, M., Sloman, S. A., Sunstein, C. R., Thorson, E. A., Watts, D. J., & Zittrain, J. L. (2018). The science of fake news. Science, 359 (6380), 1094–1096.

Leaning, M. (2017). Media and information literacy: An integrated approach for the 21st century . Elsevier Science & Technology.

Lee, E. H., Lee, T. D., & Lee, B. K. (2022). Understanding the role of new media literacy in the diffusion of unverified information during the COVID-19 pandemic. New Media & Society, 2022 , 1–24.

Lee, L., Chen, D. T., Li, J. Y., & Lin, T. B. (2015). Understanding new media literacy: The development of a measuring instrument. Computers & Education, 85 , 84–93.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32 (3), 131–137.

Luo, Y. F., Yang, S. C., & Kang, S. (2022). New media literacy and news trustworthiness: An application of importance–performance analysis. Computers & Education, 185 (104529), 1–15.

Lutzke, L., Drummond, C., Slovic, P., & Árvai, J. (2019). Priming critical thinking: Simple interventions limit the influence of fake news about climate change on Facebook. Global Environmental Change, 58 (101964), 1–8.

Machete, P., & Turpin, M. (2020). The use of critical thinking to identify fake news: A systematic literature review. In M. Hattingh, M. Matthee, H. Smuts, I. Pappas, Y. K. Dwivedi, & M. Mäntymäki (Eds.), Responsible design, implementation and use of information and communication technology (pp. 235–246). Springer.

Mason, M. (2008). Critical thinking and learning . Blackwell Publishing.

Molina, M. D., Sundar, S. S., Le, T., & Lee, D. (2021). “Fake news” is not simply false information: A concept explication and taxonomy of online content. American Behavioral Scientist, 65 (2), 180–212.

Moore, R. C., & Hancock, J. T. (2022). A digital media literacy intervention for older adults improves resilience to fake news. Scientific Reports, 12 (6008), 1–9.

Oh, O., Gupta, P., Agrawal, M., & Raghav Rao, H. (2018). ICT mediated rumor beliefs and resulting user actions during a community crisis. Government Information Quarterly, 35 (2), 243–258.

Orhan, A. (2022a). Critical thinking dispositions and decision making as predictors of high school students’ perceived problem solving skills. The Journal of Educational Research, 115 (4), 235–245.

Orhan, A. (2022b). The relationship between critical thinking and academic achievement: A meta-analysis study. Psycho-Educational Research Reviews, 11 (1), 283–299.

Orhan, A. (2023). Investigating psychometric properties of the Turkish version of Sosu Critical Thinking Disposition Scale: Evidence from two independent Samples. International Journal of Psychology and Educational Studies, 10 (2), 348–359. https://doi.org/10.52380/ijpes.2023.10.2.1017

Parikh, S., & Atrey, P. (2018, April). Media-rich fake news detection: A survey [Paper presentation]. IEEE Conference on Multimedia Information Processing and Retrieval, Miami, Florida, USA.

Paul, R. (1990). Critical thinking: What every person needs to survive in a rapidly changing world . Center for Critical Thinking and Moral Critique.

Paul, R., & Elder, L. (2001). Critical thinking: Tools for taking charge of your learning and your life . Prentice Hall.

Pennycook, G., & Rand, D. G. (2019). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188 , 39–50.

Pennycook, G., & Rand, D. G. (2021). The psychology of fake news. Trends in Cognitive Sciences, 25 (5), 388–402.

Pogue, D. (2017). How to stamp out fake news. Scientific American, 316 (2), 24–24.

Potter, W. J. (2010). The state of media literacy. Journal of Broadcasting & Electronic Media, 54 (4), 675–696.

Preston, S., Anderson, A., Robertson, D. J., Shephard, M. P., & Huhe, N. (2021). Detecting fake news on Facebook: The role of emotional intelligence. PLoS ONE, 16 (3), 1–13.

Robinson, S., & DeShano, C. (2011). ‘Anyone can know’: Citizen journalism and the interpretive community of the mainstream press. Journalism: Theory, Practice & Criticism, 12 (8), 963–982.

Ruggerio, V. R. (1988). Teaching thinking across the curriculum . Harper & Row.

Shu, K., Sliva, A., Wang, S., Tang, J., & Liu, H. (2017). Fake news detection on social media: A data mining perspective. Explorations Newsletter, 19 (1), 22–36.

Sosu, E. M. (2013). The development and psychometric validation of a Critical Thinking Disposition Scale. Thinking Skills and Creativity, 9 , 107–119.

Staksrud, E., Olafsson, K., & Livingstone, S. (2013). Does the use of social networking sites increase children’s risk of harm? Computers in Human Behavior, 29 (1), 40–50. https://doi.org/10.1016/j.chb.2012.05.026

Tandoc, E. C., Lim, W. Z., & Ling, R. (2018). Defining “fake news”: A typology of scholarly definitions. Digital Journalism, 6 (2), 137–153.

Thoman, E., & Jolls, T. (2004). Media literacy: A national priority for a changing world. American Behavioral Scientist, 48 (1), 18–29.

Toffler, A. (1981). The third wave . Morrow.

Vaccari, C., & Chadwick, A. (2020). Deepfakes and disinformation: Exploring the impact of synthetic political video on deception, uncertainty, and trust in news. Social Media + Society, 6 (1), 1–13.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359 (6380), 1146–1151.

Wilson, J. A. (2018). Reducing pseudoscientific and paranormal beliefs in university students through a course in science and critical thinking. Science & Education, 27 , 183–210.

Wu, Y., Ngai, E. W. T., Wu, P., & Wu, C. (2022). Fake news on the internet: A literature review, synthesis and directions for future research. Internet Research, 32 (5), 1662–1699.

Xiao, X., Su, Y., & Lee, D. K. L. (2021). Who consumes new media content more wisely? Examining personality factors, SNS use, and new media literacy in the era of misinformation. Social Media + Society, 7 (1), 1–2.

Zanuddin, H., & Shin, C. Y. (2020). Relationship between new media literacy and motivation in solving fake news problem. International Transaction Journal of Engineering, Management, & Applied Sciences & Technologies, 11 (8), 1–10.

Zhang, H., Zhu, C., & Sang, G. (2014). Teachers’ stages of concern for media literacy education and the integration of MLE in Chinese primary schools. Asia Pacific Education Review, 15 (3), 459–471.

Zhang, X., & Ghorbani, A. A. (2020). An overview of online fake news: Characterization, detection, and discussion. Information Processing and Management, 57 (2), Article 102025.

Zimmermann, F., & Kohring, M. (2020). Mistrust, disinforming news, and vote choice: A panel survey on the origins and consequences of believing disinformation in the 2017 German parliamentary election. Political Communication, 37 (2), 215–237.



The author(s) received no financial support for the research, authorship, and/or publication of this article.

Author information

Authors and affiliations.

School of Foreign Languages, Zonguldak Bülent Ecevit University, İncirharmanı Campus, Kozlu, Zonguldak, Turkey


Contributions

The author has read and approved the final version of the manuscript.

Corresponding author

Correspondence to Ali Orhan .

Ethics declarations

Competing interests.

The author(s) declared no potential competing interests with respect to the research, authorship, and/or publication of this article.



Orhan, A. Fake news detection on social media: the predictive role of university students’ critical thinking dispositions and new media literacy. Smart Learn. Environ. 10 , 29 (2023). https://doi.org/10.1186/s40561-023-00248-8


Received : 18 January 2023

Accepted : 17 April 2023

Published : 26 April 2023

DOI : https://doi.org/10.1186/s40561-023-00248-8


  • Original article
  • Open access
  • Published: 07 October 2020

Reliance on emotion promotes belief in fake news

  • Cameron Martel   ORCID: orcid.org/0000-0003-3181-4309 1 ,
  • Gordon Pennycook 2 &
  • David G. Rand 1 , 3  

Cognitive Research: Principles and Implications, volume 5, Article number: 47 (2020)


Abstract

What is the role of emotion in susceptibility to believing fake news? Prior work on the psychology of misinformation has focused primarily on the extent to which reason and deliberation hinder versus help the formation of accurate beliefs. Several studies have suggested that people who engage in more reasoning are less likely to fall for fake news. However, the role of reliance on emotion in belief in fake news remains unclear. To shed light on this issue, we explored the relationship between experiencing specific emotions and believing fake news (Study 1; N  = 409). We found that across a wide range of specific emotions, heightened emotionality at the outset of the study was predictive of greater belief in fake (but not real) news posts. Then, in Study 2, we measured and manipulated reliance on emotion versus reason across four experiments (total N  = 3884). We found both correlational and causal evidence that reliance on emotion increases belief in fake news: self-reported use of emotion was positively associated with belief in fake (but not real) news, and inducing reliance on emotion resulted in greater belief in fake (but not real) news stories compared to a control or to inducing reliance on reason. These results shed light on the unique role that emotional processing may play in susceptibility to fake news.

Introduction

The 2016 US presidential election and UK Brexit vote focused attention on the spread of “fake news” (“fabricated information that mimics news media content in form but not in organizational process or intent”; Lazer et al. 2018 , p. 1094) via social media. Although the fabrication of ostensible news events has been around in media such as tabloid magazines since the early twentieth century (Lazer et al. 2018 ), technological advances and the rise of social media provide opportunity for anyone to create a website and publish fake news that might be seen by many thousands (or even millions) of people.

The threat of misinformation is perhaps most prevalent and salient within the domain of politics. For example, within the 3 months prior to the US election, estimates indicate that fake news stories favoring Trump were shared approximately 30 million times on Facebook, while those favoring Clinton were shared approximately 8 million times (Allcott and Gentzkow 2017 ). Furthermore, a recent analysis suggests that, among news stories fact-checked by independent fact-checking organizations, false stories spread farther, faster, and more broadly on Twitter than true stories, with false political stories reaching more people in a shorter period of time than all other types of false stories (Vosoughi et al. 2018 ). These fake news stories are not only spread, but are also often believed to be true (Silverman and Singer-Vine 2016 ). And, in fact, merely being exposed to a fake news headline increases later belief in that headline (Pennycook et al. 2018 ).

Some recent studies have, in contrast, suggested that fears over widespread exposure to and consumption of fake news may be overstated, as fake news accounts for less than half a percent of Americans’ daily media diet (Allen et al. 2020 ). However, while similar findings have supported the conclusion that fake news websites make up a small proportion of media diets overall, these studies have also shown that fake news is disproportionately visited by specific groups of people (e.g., supporters of Donald Trump; Guess et al. 2020 ; social media users over the age of 65; Guess et al. 2019 ). Thus, regardless of the impact of fake news on the average Americans’ overall media consumption, fake news may still impact the belief in and spread of news in key political and demographic communities.

Here, we explore the psychology underlying belief in blatantly false (and implausible) news stories. In particular, we focus on the role of emotional processing in such (mis)belief.

Motivated cognition versus classical reasoning

From a theoretical perspective, what role might we expect emotion to play? One popular perspective on belief in misinformation, which we will call the motivated cognition account , argues that analytic thinking—rather than emotional responses—are primarily to blame (Kahan 2017 ). By this account, people reason like lawyers rather than scientists, using their reasoning abilities to protect their identities and ideological commitments rather than to uncover the truth (Kahan 2013 ). Thus, our reasoning abilities are hijacked by partisanship, and therefore those who rely more on reasoning are better able to convince themselves of the truth of false stories that align with their ideology. This account is supported by evidence that people who engage in more analytic thinking show more political polarization regarding climate change (Kahan et al. 2012 ; see also Drummond and Fischhoff 2017 ), gun control (Kahan et al. 2017 ; see also Ballarini and Sloman 2017 ; Kahan and Peters 2017 ), and selective exposure to political information (Knobloch-Westerwick et al. 2017 ).

An alternative perspective, which we will call the classical reasoning account , argues that reasoning and analytic thinking do typically help uncover the truth of news content (Pennycook and Rand 2019a ). By this account, individuals engaging in reasoning and reflection are less likely to mistake fake news as accurate. And, by extension, misinformation often succeeds when individuals fail to utilize reason and analytic thinking. The classical reasoning account fits within the tradition of dual-process theories of judgment, in which analytic thinking (rather than relying on “gut feelings”) is thought to often (but not always) support sound judgment (Evans 2003 ; Stanovich 2005 ). Recent research supports this account as it relates to fake news by linking the propensity to engage in analytic thinking with skepticism about epistemically suspect beliefs (Pennycook et al. 2015a , b ; however, this association may be specific to Western individuals and moderated as a function of culture; see Majima et al. 2020 ; also see Bahçekapılı and Yilmaz 2017 ), such as paranormal and superstitious beliefs (Pennycook et al. 2012 ), conspiracy beliefs (Swami et al. 2014 ), delusions (Bronstein et al. 2019 ), and pseudo-profound bullshit (Pennycook et al. 2015a , b ). Of most direct relevance, people who were more willing to think analytically when given a set of reasoning problems were less likely to erroneously believe fake news articles regardless of their partisan alignment (Pennycook and Rand 2019a ), and experimental manipulations of deliberation yield similar results (Bagò et al. 2020 ). Moreover, analytic thinking is associated with lower trust in fake news sources (Pennycook and Rand 2019b ) and less sharing of links to low quality sources on Twitter (Mosleh et al. 2020 ). Belief in fake news has also been associated with dogmatism, religious fundamentalism, and reflexive (rather than active/reflective) open-minded thinking (Bronstein et al. 2019 ; Pennycook and Rand 2019c ). A recent experiment has even shown that encouraging people to think deliberately, rather than intuitively, decreased self-reported likelihood of “liking” or sharing fake news on social media (Effron and Raj 2020 ), as did asking people to judge the accuracy of every headline prior to making a sharing decision (Fazio 2020 ) or simply asking for a single accuracy judgment at the outset of the study (Pennycook et al. 2019 , 2020 ). Indeed, encouraging individuals to think deliberately and focus on retrieving accurate information has also been shown to reduce the influence of misinformation in contexts beyond fake news—for instance, when encouraged to deliberate, fact check, and edit fictional texts with inaccurate assertions, people are less influenced by the inaccurate claims they encounter (Rapp et al. 2014 ).

Emotion and engagement with fake news

Prior research has also focused in part on the roles of individuals’ emotional experiences, rather than on the use of deliberation and reason, when engaging in accuracy judgments. Different emotions have been suggested to differentially impact judgment in general, as well as perceptions of political fake news in particular. An extensive literature assesses the differential impact of specific emotions on cognition and decision-making (e.g., Appraisal-Tendency Framework; Lerner and Keltner 2001 ; Feelings-as-information theory; Schwarz 2011 ). For instance, Bodenhausen et al. ( 1994 ) found that anger elicits greater reliance upon heuristic cues in a persuasion paradigm, whereas sadness promotes an opposite, decreased reliance on heuristic cues. Literature on the relationship between emotion and gullibility has found that a negative mood state generally increases skepticism, whereas a positive mood state increases gullibility and decreases the ability to detect deception (Forgas and East 2008 ; Forgas 2019 ). Affective feelings have also been found to demonstrate a flexible influence on cognition; that is, both positive and negative emotions may improve cognitive performance, depending on the nature of the task (e.g., creative versus analytic) and processing styles available (e.g., heuristic versus systematic; see Huntsinger and Ray 2016 ).

More specifically within the domain of political fake news, anger has been suggested to promote politically aligned motivated belief in misinformation, whereas anxiety has been posited to increase belief in politically discordant fake news due to increased general feelings of doubt (Weeks 2015 ). In other words, anger may promote biased, intuitive, motivated reasoning, whereas anxiety may encourage individuals to consider opposing viewpoints (MacKuen et al. 2010 ) and perhaps even improve the overall quality of information seeking (Valentino et al. 2008 ). These hypotheses suggest that experience and use of specific emotions may elicit distinct, dissociable effects on news accuracy perception. Furthermore, evidence suggests that the illusory truth effect (i.e., believing fake news content after repeated exposure) is in some part driven by feelings of positivity cueing truth (Unkelbach et al. 2011 ), whereas sadness may reduce the illusory truth effect (Koch and Forgas 2012 ). Related research generally posits that claims are more likely to be judged as “truthful” when individuals are experiencing positive or neutral emotions, whereas negative emotions may encourage people to be more skeptical (see Brashier and Marsh 2020 ; Forgas 2019 ).

These prior assessments of the relationship between specific emotions and forming accuracy judgments are potentially also compatible with the classical reasoning account of why people fall for fake news. For instance, sad individuals may engage in analytic thinking more often and thus are more skeptical of fake news, while the opposite may be true for happy individuals (see Forgas 2019 ).

However, the classical reasoning account has also been conceptualized more commonly within the framework of a dual-process model of cognition, in which emotional “gut feelings” are posited to contribute to less accurate judgments and heightened belief in falsehoods. For instance, faith in intuition and one’s general feelings associated with information processing (e.g., ‘I trust my initial feelings about the facts’) have been found to be associated with belief in conspiracy theories and falsehoods in science and politics (Garrett and Weeks 2017 ). Furthermore, some evidence suggests that even negative emotions, generally thought to promote skepticism (Forgas 2019 ), can also contribute to belief in conspiracy theories, particularly when such emotions are related to the subject of the conspiracy theory (e.g., dejection-agitation; Mashuri et al. 2016 ). Such findings suggest that relying on existing feelings may contribute to inaccurate assessments of truth by directly increasing credulity of typically implausible content, rather than solely by reducing analytic thinking. However, prior work has yet to garner broad consensus as to the effects of experiencing or utilizing emotion per se on fake news.

Current research

We aim to add to the current state of knowledge regarding belief in fake news in three main ways. First, little previous work has looked at the effects of experiencing specific emotions on belief in fake news. Looking at these effects will help us determine whether the potential effect(s) of emotion on fake news belief is isolated to a few specific emotions (presumably for a few idiosyncratic reasons) or whether a broader dual-process framework where emotion and reason are differentially responsible for the broad phenomenon of falling for fake news is more appropriate.

Second, much prior work on fake news has focused almost exclusively on reasoning, rather than investigating the role of emotional processing per se. In other words, prior research has treated the extent of reason and emotion as unidimensional, such that any increase in use of reason necessarily implies a decrease in use of emotion and vice-versa. In contrast, both emotion and reason may complimentarily aid in the formation of beliefs (Mercer 2010 ). The current study addresses this issue by separately modulating the use of reason and use of emotion. This approach, as well as the inclusion of a baseline condition in our experimental design, allows us to ask whether belief in fake news is more likely to be the result of merely failing to engage in reasoning rather than being specifically promoted by reliance on emotion. Furthermore, it allows for differentiable assessments regarding use of reason and use of emotion, rather than treating reason and emotion simply as two directions on the same continuum.

Third, prior work has been almost entirely correlational, comparing people who are predisposed to engage in more versus less reasoning. Therefore, whether a causal impact of reasoning on resistance to fake news—and/or a causal effect of emotion on susceptibility to fake news—exists remains unclear. In the current research, we address this issue by experimentally manipulating reliance on emotion versus reason when judging the veracity of news headlines.

In Study 1, we examine the association between experiencing specific emotions and believing fake news. In this study, we assess emotionality by measuring participants’ current experience of emotion prior to engaging with any news headlines (i.e., participants’ momentary “mood state”; see Rusting 1998 ). We examine whether heightened emotionality is associated with increased belief in fake news and decreased ability to discern between real and fake news. In Study 2, we engage in a large-scale investigation in which we separately manipulate and measure the extent to which participants utilize reason and emotion while evaluating the accuracy of news headlines. Here, we focus directly on manipulating the emotional processing (i.e., “reliance on emotion”) of individuals while judging the accuracy of news headlines (Rusting 1998 ). We examine whether inducing reliance on emotion causally increases belief in fake news and whether inducing reliance on reason decreases it. We also assess whether inducing reliance on emotion or reason affects the ability to discriminate between fake and real news.

Study 1 investigates the association between state-based emotionality and accuracy judgments of real and fake news. In particular, we assess whether increased experience of emotion prior to viewing news headlines is associated with heightened belief in fake news headlines and decreased ability to discern between fake and real news.

Materials and procedure

In this exploratory study, N  = 409 participants (227 female, M age  = 35.18) were recruited via Amazon Mechanical Turk. Footnote 1 We did not have a sense of our expected effect size prior to this study. However, we a priori committed to our sample size (as indicated in our preregistration; https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 ) with the goal of maximizing power within our budgetary constraints. Participants first completed demographics questions, including age, sex, and political preferences. Next, participants completed the 20-item Positive and Negative Affect Schedule scale (PANAS; Watson et al. 1988 ). For each item, participants were asked “To what extent do you feel [item-specific emotion] at this moment?” Likert-scale: 1 =  Very slightly or not at all , 2 =  A little , 3 =  Moderately , 4 =  Quite a bit , 5 =  Extremely . This measure was designed to assess the current mood state of each participant.
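The aggregate mood scores used later in the analysis can be computed as simple subscale means. A minimal Python sketch (the paper's analyses were run in R; the item groupings below are the standard Watson et al. (1988) positive and negative PANAS subscales, and the paper actually derives its aggregates from a two-factor varimax solution, so this fixed grouping is only an approximation):

```python
# Sketch of PANAS scoring. Item groups are the standard PANAS
# positive/negative subscales (Watson et al. 1988); the paper derives
# its aggregates from a two-factor varimax solution instead, so treat
# this fixed grouping as an approximation.
POSITIVE = ["interested", "excited", "strong", "enthusiastic", "proud",
            "alert", "inspired", "determined", "attentive", "active"]
NEGATIVE = ["distressed", "upset", "guilty", "scared", "hostile",
            "irritable", "ashamed", "nervous", "jittery", "afraid"]

def score_panas(ratings):
    """ratings maps each of the 20 items to a 1-5 Likert value."""
    pos = sum(ratings[i] for i in POSITIVE) / len(POSITIVE)
    neg = sum(ratings[i] for i in NEGATIVE) / len(NEGATIVE)
    return pos, neg

# A participant who rates every positive item 2 and every negative item 1:
example = {**{i: 2 for i in POSITIVE}, **{i: 1 for i in NEGATIVE}}
print(score_panas(example))  # (2.0, 1.0)
```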

After completing this measure, participants received a series of 20 actual headlines that appeared on social media, half of which were factually accurate ( real news ) and half of which were entirely untrue ( fake news ). Furthermore, half of the headlines were favorable to the Democratic Party and half were favorable to the Republican Party (based on ratings collected in a pretest, described in Pennycook and Rand 2019a ). Participants in the pretest also rated the headlines on a number of other dimensions (including prior familiarity); however, they were only balanced on partisanship. These headlines were selected randomly from a larger set of 32 possible headlines—again half real and half fake, and half Democrat-favorable and half Republican-favorable. All fake news headlines were taken from Snopes.com, a well-known fact-checking website. Real news headlines were selected from mainstream news sources (e.g., NPR, The Washington Post) and selected to be roughly contemporary to the fake news headlines. The headlines were presented in the format of a Facebook post—namely, with a picture accompanied by a headline, byline, and a source (see Fig.  1 ). For each headline, participants were asked: “To the best of your knowledge, how accurate is the claim in the above headline” using a 4-point Likert-scale: 1 = Not at all accurate, 2 = Not very accurate, 3 = Somewhat accurate, 4 = Very accurate.
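As a concrete illustration of the outcome measures, a participant's mean belief in fake and real headlines, and "truth discernment" as the real-minus-fake difference, can be summarized as below. The ratings and helper function are hypothetical; in the reported models, discernment is estimated as an interaction coefficient rather than this raw difference.

```python
# Hypothetical per-participant summary of the rating task: mean
# perceived accuracy (1-4 scale) for real vs. fake headlines, and
# their difference as a simple discernment measure.
ratings = [  # (headline_type, accuracy rating) for one participant
    ("real", 3), ("real", 4), ("real", 3), ("real", 4),
    ("fake", 1), ("fake", 2), ("fake", 1), ("fake", 2),
]

def summarize(ratings):
    real = [r for t, r in ratings if t == "real"]
    fake = [r for t, r in ratings if t == "fake"]
    mean_real = sum(real) / len(real)
    mean_fake = sum(fake) / len(fake)
    return mean_real, mean_fake, mean_real - mean_fake

print(summarize(ratings))  # (3.5, 1.5, 2.0)
```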

Figure 1. Example article with picture, headline, byline, and source. Our news items are available online ( https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 )

Results and discussion

Across emotions, greater emotionality predicts increased belief in fake news and decreased truth discernment.

In our first analysis, we assessed the relationship between emotionality (i.e., momentary mood state of experiencing a particular emotion) and perceived accuracy of real and fake news. We used the R packages lme4 (Bates et al. 2015 ), lmerTest (Kuznetsova et al. 2017 ), and arm (Gelman and Su 2018 ) to perform linear mixed-effects analyses of the relationship between perceived accuracy, specific emotions measured by the PANAS, and type of news headline (fake, real). A mixed-effects model allows us to account for the interdependency between observations due to by-participant and by-item variation. We entered the PANAS score for the item of interest, type of news headline, and an interaction between the two terms into the model as fixed effects. We had intercepts for headline items and participants, as well as by-item random slopes for the effect of the PANAS emotion-item rating and by-participant random slopes for the effect of type of news headline, for random effects. The reference level for type of news headline was “fake.” Since 20 emotions were assessed by the PANAS, we performed 20 linear mixed-effects analyses. To further demonstrate the generalizability of our results across emotions, we also performed two additional linear mixed-effects analyses with aggregated PANAS scores for negative and positive emotions, which were calculated via a varimax rotation on a two-factor analysis of the 20 PANAS items. The beta coefficients for the interaction between emotion and news type are reported as “Discernment” (i.e., the difference between real and fake news, with a larger coefficient indicating higher overall accuracy in media truth discernment), and the betas for real news were calculated via joint significance tests (i.e., F-tests of overall significance). Our results are summarized in Table 1 . Footnote 2
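The fixed-effect structure of these models can be illustrated compactly. The authors fit crossed random-effects models with lme4 in R; the Python sketch below keeps only the fixed part (PANAS score × news type, with "fake" as the reference level) and fits it by OLS on simulated data, so the by-item and by-participant random effects are deliberately omitted and the simulated effect sizes are illustrative, not the paper's.

```python
# Simplified sketch of the Study 1 model: fixed effects only
# (emotionality x news type), fit by OLS on simulated data. The
# simulation builds in the reported pattern: emotion raises belief
# in fake news while real-news belief stays flat in emotion.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
panas = rng.uniform(1, 5, n)              # emotionality score
news = rng.choice(["fake", "real"], n)    # headline type
is_fake = (news == "fake").astype(float)
accuracy = (1.2 + 0.8 * (1 - is_fake) + 0.3 * panas * is_fake
            + rng.normal(0, 0.3, n))

df = pd.DataFrame({"accuracy": accuracy, "panas": panas, "news": news})
# "fake" as reference level, as in the paper, so the panas main effect
# is the slope for fake headlines and the interaction is the
# discernment term.
model = smf.ols(
    "accuracy ~ panas * C(news, Treatment(reference='fake'))", data=df
).fit()
print(model.params.round(2))
```

Under the assumed simulation, the positive `panas` coefficient and negative interaction mirror the reported pattern; the full analysis would additionally need crossed random effects, roughly `accuracy ~ panas * news + (1 + panas | item) + (1 + news | participant)` in lme4 syntax.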

Overall, our results indicate that, for nearly every emotion evaluated by the PANAS scale, Footnote 3 increased emotionality is associated with increased belief in fake news. Furthermore, we also find that nearly every emotion also has a significant interaction with type of news headline, such that greater emotionality also predicts decreased discernment between real and fake news. Indeed, the only emotions for which we do not see these effects are “interested,” “alert,” “determined,” and “attentive,” which arguably are all more closely associated with analytic thinking rather than emotionality per se; however, although we do not find significant relationships between these emotions and belief in fake news or discernment, we also do not provide evidence that such relationships do not exist. Our results also suggest that the relationship between emotion and news accuracy judgments appear to be specific to fake news; that is, for every emotion except “attentive” and “alert,” no significant relationship exists with real news belief. Our key findings are also robust when controlling for headline familiarity (see Additional file 1 , which contains descriptive statistics and additional analyses).

We not only find statistically significant associations between experiencing emotion and believing fake news but also observe rather substantial effect sizes. Our mixed-effects model indicates that belief in fake news (relative to the scale minimum value of 1) is nearly twice as high for participants with the highest aggregated positive and negative emotion scores (accuracy ratings of 0.96 and 1.45 above scale minimum, respectively) compared to participants with the lowest aggregated positive and negative emotion scores (accuracy ratings of 0.34 and 0.50 above scale minimum, respectively). Therefore, although even participants who experience high emotion are still, on average, able to discern between fake and true news, we observe notable increases in belief in fake news as emotionality increases.

As shown by most of our 20 previous linear mixed-effects models, both positive and negative emotion are associated with higher accuracy ratings for fake headlines (Fig.  2 ), and this relationship does not exist as clearly for real headlines.

Figure 2. Plotting reported news headline accuracy as a function of aggregated positive or negative PANAS score shows a positive relationship between both positive and negative emotion and belief in fake news. This relationship is not as evident for belief in real news. Dot size is proportional to the number of observations (i.e., a specific participant viewing a specific headline). Error bars, mean ± 95% confidence intervals

Interactions with headline political concordance

Some prior work has argued that an interaction may exist between specific types of emotions and political concordance of news when assessing belief in fake news (e.g., Weeks 2015 ). Therefore, we next performed multiple linear mixed-effects analyses of the relationship between specific emotions, type of news headline, participant’s partisanship (z-scored; continuous Democrat vs. Republican), and headline political concordance (z-scored; concordant (participant and headline partisanship align), discordant (participant and headline partisanship oppose)), allowing for interactions between all items. Our maximal linear mixed model failed to converge, so we followed the guidelines for how to achieve convergence in Brauer and Curtin ( 2018 ), and removed the by-unit random slopes for within-unit predictors and lower-order interactions, leaving the by-unit random slopes for the highest order interactions (see also: Barr 2013 ). This left us with by-item random slopes for the interaction between PANAS emotion, concordance, and political party and by-participant random slopes for the interaction between type of headline and concordance. We again assessed how each emotion was associated with belief in fake news and real news, as well as the interaction between news type and emotion. Furthermore, we also assessed the interaction between emotion and concordance for fake news, as well as the three-way interaction among news type, emotion, and political concordance (reported as “Discernment × Concordant”). Our key results are summarized in Table 2 .

As with our prior models, we again find that for nearly all of the emotions assessed by the PANAS, greater emotionality is associated with heightened belief in fake news and decreased discernment between real and fake news. Emotion also appears to selectively affect fake news judgment and is unrelated to belief in real news. Looking at the interaction between emotion and concordance, our results are less consistent: some emotions significantly interact with concordance, though these coefficients are relatively small compared to the interaction with type of news. Our results also suggest that a significant interaction exists between negative emotion and concordance but not between positive emotion and concordance, indicating some specificity of effects of emotion on belief in fake news. However, no differences are observed between emotions hypothesized to have differentiable effects on belief in fake news. For example, emotions such as “hostile” and “nervous” similarly interact with political concordance. This finding is in contrast with those of Weeks ( 2015 ), who suggests that anger selectively heightens belief in politically concordant fake news, while anxiety increases belief in politically discordant fake news. Rather, our results instead tentatively suggest that emotion in general heightens belief in fake news and that different emotions do not necessarily interact with political concordance in a meaningful way. Furthermore, across all emotions, no significant three-way interactions were observed among news type, emotion, and political concordance, and therefore, we do not find evidence suggesting that political concordance interacts with the relationship between emotion and discernment.

A potential limitation of Study 1 is that our results could be partly driven by floor effects, as most participants self-reported experiencing a relatively low level of emotion. However, the average mean score across all twenty individual emotions ( M  = 2.19) and the average median score across all twenty emotions ( M  = 1.95) were relatively similar, and both were still well above the lowest end of the PANAS scale. To verify that our results are not being driven primarily by floor effects, we also analyzed the relationships between aggregated positive and negative emotion and news accuracy ratings while only including participants who had above-median scores for positive and negative emotion, respectively. Looking at the relationship between aggregated positive emotion and belief in news headlines for only participants with above-median positive emotion, we still find that greater positive emotion relates to increased belief in fake headlines ( b  = 0.23, SE = 0.06, t (135.44) = 3.93, p  < 0.001), and that greater positive emotion results in decreased discernment between real and fake news ( b  = − 0.17, SE = 0.07, t (111.60) = − 2.34, p  = 0.021). We again do not find that greater positive emotion relates to increased belief in real headlines ( p  = 0.239). Similarly, looking at the relationship between aggregated negative emotion and belief in news headlines for participants with above-median negative emotion, we again find that greater negative emotion relates to increased belief in fake headlines ( b  = 0.19, SE = 0.03, t (117.46) = 5.60, p  < 0.001), and that greater negative emotion results in decreased discernment between real and fake news ( b  = − 0.20, SE = 0.05, t (105.60) = − 4.24, p  < 0.001). We once again do not find that greater negative emotion relates to increased belief in real headlines ( p  = 0.887).
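The floor-effect check above amounts to a median split on the aggregated emotion score followed by refitting the models on the upper half. A small pandas sketch with illustrative column names and values (not the study's data):

```python
# Sketch of the floor-effect robustness check: keep only participants
# above the median aggregated (here, negative) emotion score; the
# emotion-belief models are then refit on this subsample.
import pandas as pd

df = pd.DataFrame({
    "participant": [1, 2, 3, 4],
    "neg_emotion": [1.2, 1.8, 2.6, 3.4],   # aggregated negative PANAS
})

median_neg = df["neg_emotion"].median()        # 2.2
above = df[df["neg_emotion"] > median_neg]     # above-median subsample
print(above["participant"].tolist())  # [3, 4]
```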

Another potential concern with Study 1 is that participants with higher PANAS scores are simply less attentive, and that these inattentive participants are those performing worse at discriminating between real and fake news. However, this alternative explanation does not account for our finding that certain emotions (e.g., interested, alert, attentive) are not associated with decreased discernment between real and fake news, which demonstrates that our correlational findings are specific to a distinct set of emotions assessed by the PANAS and alleviates the concern that inattentiveness drives our results.

Taken together, the results from Study 1 suggest that emotion in general, regardless of the specific type of emotion, predicts increased belief in fake news. Furthermore, nearly every type of emotion measured by the PANAS also appears to have a significant interaction with type of news, indicating an effect of emotion on differentiating real from fake news. Therefore, in Study 2, we causally assess the role of emotion in fake news perception using a dual-process framework—in which reliance on emotion in general is contrasted with reliance on reason—rather than by differentially assessing various roles of experiencing specific emotions.

Study 2 expands on the findings of Study 1 in several ways. First, Study 1 found that experienced emotion, regardless of the specific type of emotion, was associated with increased belief in fake news, as well as decreased ability to differentiate between real and fake news. To explain this association, we hypothesized that individuals who experienced greater emotionality also relied on emotion to a greater extent when making accuracy judgments of news headlines (otherwise, why increased emotionality should impact decision-making is not clear). Therefore, in Study 2, we directly manipulate the way that individuals engage in emotional processing while evaluating the veracity of news headlines. We manipulate the extent to which individuals rely on emotion (in general Footnote 4 ) or reason when judging the accuracy of news headlines. We investigate whether reliance on emotion versus reason causally affects judgments of fake news, as well as the ability to discern between real and fake news.

Our results from Study 1 suggest that heightened emotion in general is predictive of increased belief in fake news. To further assess the relationship between emotion and fake news belief, Study 2 analyzes a total of four experiments that shared a virtually identical experimental design in which reliance on reason versus emotion was experimentally manipulated using an induction prompt from Levine et al. ( 2018 ). The general procedure across all four experiments was as follows. Participants were randomly assigned to one of three conditions: a reason induction (“Many people believe that reason leads to good decision-making. When we use logic, rather than feelings, we make rationally satisfying decisions. Please assess the news headlines by relying on reason, rather than emotion.”), an emotion induction (“Many people believe that emotion leads to good decision-making. When we use feelings, rather than logic, we make emotionally satisfying decisions. Please assess the news headlines by relying on emotion, rather than reason.”), or a control induction; the exception was experiment 1, which had no control condition (see Table 3 ). Participants in all three conditions first read: “You will be presented with a series of actual news headlines from 2017–2018. We are interested in your opinion about whether the headlines are accurate or not.” After reading the induction prompt, participants received a series of actual headlines that appeared on social media, some of which were factually accurate ( real news ), some of which were entirely untrue ( fake news ), some of which were favorable to the Democratic party, and some of which were favorable to the Republican party (based on ratings collected in a pretest, described in Pennycook and Rand 2019a ). Fake and real news headlines were selected via a process identical to that described in Study 1. Our news items are available online ( https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 ).
For each headline, real or fake, perceived accuracy was assessed. Participants were asked: “How accurate is the claim in the above headline?” Likert-scale: 1 =  Definitely false, 2 =  Probably false, 3 =  Possibly false , 4 =  Possibly true, 5 =  Probably true , 6 =  Definitely true . The specific number of fake, real, pro-Democrat, and pro-Republican headlines each participant viewed varied by experiment (see News headlines section of Table 3 ).
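As a concrete illustration of the discernment concept referenced throughout, the quantity being captured is the gap between a participant's ratings of real and fake headlines on the 6-point scale above. The ratings below are hypothetical, and this subject-level difference score is only a simplified sketch; the paper's actual analyses estimate discernment at the rating level via mixed-effects interaction terms.

```python
# Hypothetical ratings on the 6-point scale (1 = Definitely false ... 6 = Definitely true)
real_ratings = [5, 4, 5, 6]  # one participant's ratings of real headlines
fake_ratings = [2, 3, 2, 1]  # the same participant's ratings of fake headlines

def mean(values):
    return sum(values) / len(values)

# Truth discernment: how much more accurate real news is rated than fake news
discernment = mean(real_ratings) - mean(fake_ratings)
print(discernment)  # 3.0
```

A discernment of 0 would mean the participant rated real and fake headlines as equally accurate; larger positive values indicate better differentiation.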

After rating the headlines, participants completed various post-experimental questionnaires. Most relevant for the current paper, participants were asked if they preferred that Donald Trump or Hillary Clinton was the President of the United States. Footnote 5 Pro-Democratic headlines rated by Clinton supporters and Pro-Republican headlines rated by Trump supporters were classified as politically concordant headlines, whereas Pro-Republican headlines rated by Clinton supporters and Pro-democratic headlines rated by Trump supporters were classified as politically discordant headlines.
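The concordance coding described above amounts to a simple rule, sketched here with a hypothetical helper (the function name and party labels are invented for illustration):

```python
# Party each candidate's supporters are assumed to favor
PREFERRED_PARTY = {"Clinton": "Democrat", "Trump": "Republican"}

def classify(headline_party: str, supporter: str) -> str:
    """Label a headline as politically concordant or discordant for a rater."""
    if PREFERRED_PARTY[supporter] == headline_party:
        return "concordant"
    return "discordant"

print(classify("Democrat", "Clinton"))   # concordant
print(classify("Democrat", "Trump"))     # discordant
```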

Participants also completed a free-response manipulation check in which they were asked the question “At the beginning of the survey, you were asked to respond using your__” with words related to “emotion” or “intuition” being scored as accurate for the emotion induction condition and words relating to “reason” or “logic” being scored as accurate for the reason induction condition. Participants were also asked “At the beginning of the survey, you were asked to respond using your:” 1 =  Emotion , 2 =  Reason .

Participants in experiments 2 through 4 further completed several questions asking about the extent to which they used reason or emotion. Participants were directed to “Please indicate the extent to which you used emotion/feelings when judging the accuracy of the news headlines” and “Please indicate the extent to which you used reason/logic when judging the accuracy of the news headlines” according to the following Likert scale: 1 =  None at all , 2 =  A little , 3 =  A moderate amount , 4 =  A lot , 5 =  A great deal .

Participants also completed several other measures (a shortened version of the actively open-minded thinking scale; Stanovich and West 2007 ; a reworded version of the original Cognitive Reflection Test, a measure of analytic thinking; CRT; Frederick 2005 ; Shenhav et al. 2012 ; and a four-item non-numeric CRT; Thomson and Oppenheimer 2016 ) and standard demographics (e.g., age, sex, education). These further measures were included for exploratory purposes and are not analyzed or discussed here. However, all measures are included in our openly available aggregated data (see https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 ). See Table 3 for further details on each experiment’s participants, design, and procedures.

We completed preregistrations of sample size, experimental design, and analyses for each experiment (available online https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 ). Note that, across all four preregistrations, we predicted that analytic thinking should improve discernment between real and fake news.

We again did not have a sense of our expected effect sizes prior to running these studies. However, we a priori committed to our sample size (as indicated in our preregistrations) with the goal of maximizing power within our budgetary constraints. Additionally, our sample sizes are quite large relative to typical sample sizes in this field.

We soon recognized that the subject-level analysis approach proposed in all the preregistrations—calculating each subject’s average accuracy rating for each type of headline and performing an ANOVA predicting these subject-level averages based on condition and headline type—is problematic and may introduce bias (Judd et al. 2012 ). Thus, we do not follow our preregistered analyses and instead follow the guidelines of Judd et al. by conducting rating-level analyses using linear mixed-effects models with crossed random effects for subject and headline.

Furthermore, since all four experiments had essentially identical designs (in particular, each manipulated reliance on emotion versus reason and asked for judgments of headline accuracy), we aggregate the data from each experiment and nest subject within experiment in our random effects. Thus, none of the analyses reported in this paper were preregistered; however, we note that our decision to aggregate the four studies was made after we decided that we would not run any additional studies, and thus our stopping criterion was not based on the outcome of the aggregate analysis. We aggregated our data across all four studies for several reasons. First, this substantially improved our statistical power for assessing the relative roles of relying on emotion and relying on reason in the formation of news headline accuracy judgments. Second, by combining across multiple studies, we could examine whether the effects of reliance on emotion or reliance on reason on media truth judgments were present and consistent across a range of slightly different assessments, or whether such relationships only appeared in particular individual experiments.

Correlational results

Greater reliance on reason relative to emotion predicts greater truth discernment.

Before assessing the results of our causal manipulation, we examined the correlational relationship between self-reported use of reason, use of emotion, and headline accuracy ratings from the control conditions across experiments 2 through 4 ( N  = 1089). We start by investigating the relative use of reason versus emotion, and then (as argued above), we treat reason and emotion as separate continua and investigate their unique roles in fake/real news belief.

We first calculated relative use of reason as a difference score of self-reported use of reason minus self-reported use of emotion. We then performed a linear mixed-effects analysis of the relationship between perceived accuracy, relative use of reason versus emotion, and type of news headline (fake, real). Experiment (i.e., “study”) was also included in the model as a categorical covariate. We entered relative use of reason, type of news headline, an interaction between the two terms, and study into the model as fixed effects. As random effects, we included intercepts for headline items and participants nested by study, as well as by-item random slopes for the effect of relative use of reason and by-nested-participant random slopes for the effect of type of news headline. The reference level for type of news headline was “fake.” Consistent with the classical account, we found that participants who self-reported greater relative use of reason rated fake news as less accurate, b  = − 0.17, SE = 0.02, t (67.14) = − 7.34, p  < 0.001. A significant interaction existed between relative use of reason and type of news headline, b  = 0.20, SE = 0.03, t (48.66) = 6.65, p  < 0.001, such that no effect of relative use of reason on perception of real headlines was observed, b  = 0.02, F (1, 52.94) = 1.29, p  = 0.260. Thus, participants who self-reported greater relative use of reason exhibited better discernment between news types. All study dummies were nonsignificant ( p  > 0.05). These findings are robust to controlling for headline familiarity (see Additional file 1 ).
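The specification above can be sketched in Python. This is a hedged illustration on synthetic data, not the authors' analysis: the dataset, effect sizes, and variable names (`rel_reason`, `rating`) are invented for the example, and statsmodels' `MixedLM` supports only a random intercept per subject rather than the fully crossed subject-and-item random effects used in the paper.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for subj in range(80):
    # Difference score: self-reported use of reason minus use of emotion
    rel_reason = rng.normal()
    for item in range(10):
        fake = item < 5
        # Simulated pattern mirroring the reported result: greater relative
        # use of reason lowers belief in fake headlines, no effect on real ones
        mu = 3.0 + (-0.17 * rel_reason if fake else 0.0)
        rows.append({"subject": subj, "item": item, "real": int(not fake),
                     "rel_reason": rel_reason,
                     "rating": mu + rng.normal(scale=0.5)})
df = pd.DataFrame(rows)

# Random intercept for subject only (a simplification of the paper's
# crossed random-effects structure, which statsmodels cannot fit)
model = smf.mixedlm("rating ~ rel_reason * real", df, groups=df["subject"]).fit()

# With "fake" as the implicit reference level (real = 0), the rel_reason
# coefficient is its effect on fake-headline ratings (negative here), and
# the interaction term captures the improvement in discernment (positive here)
print(model.params[["rel_reason", "rel_reason:real"]])
```

A fuller reproduction of the crossed-random-effects design would require a package such as lme4 in R or pymer4 in Python.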

Unique relationships with use of emotion versus reason

We next ran a linear mixed-effects analysis similar to the aforementioned model, except replacing relative use of reason with either self-reported use of emotion or self-reported use of reason. When we considered use of emotion, we found that participants who reported greater use of emotion rated fake news headlines as more accurate, b  = 0.26, SE = 0.03, t (48.14) = 8.08, p  < 0.001. We also found a significant interaction between use of emotion and type of news headline, b  = − 0.22, SE = 0.04, t (38.33) = − 5.24, p  < 0.001, such that there was no effect of use of emotion on perceptions of real headlines, b  = 0.04, F (1, 40.39) = 2.29, p  = 0.138. Study dummies were again nonsignificant ( p  > 0.05).

Conversely, when we considered use of reason, we found no significant relationship between use of reason and accuracy ratings of fake news, p  > 0.05. However, a significant interaction was observed between use of reason and type of news, b  = 0.17, SE = 0.04, t (78.82) = 4.27, p  < 0.001, because use of reason was positively associated with perceived accuracy of real headlines, b  = 0.22, F (1, 77.23) = 20.94, p  < 0.001. Study dummies were again nonsignificant ( p  > 0.05). This evidence suggests that use of emotion may be uniquely linked to belief in false content, whereas use of reason is uniquely linked to belief in true content. Figure  3 visually summarizes the results of our analyses: use of emotion is positively associated with belief in fake news but not real news, and use of reason is positively associated with belief in real news but is unrelated to belief in fake news. Both the use-of-reason and use-of-emotion findings remain largely consistent when controlling for headline familiarity (see Additional file 1 ).

figure 3

Plotting reported news headline accuracy as a function of use of emotion or use of reason shows a positive relationship between emotion and belief in fake news, and a positive association between reason and belief in real news. Dot size is proportional to the number of observations (i.e., a specific participant viewing a specific headline). Error bars, mean ± 95% confidence intervals

Interactions with participant partisanship and headline political concordance

We then performed a linear mixed-effects analysis of the relationship between relative use of reason, type of news headline, participant’s partisanship (Clinton supporter, Trump supporter), and headline political concordance (concordant, discordant), allowing for interactions between all terms. Study was added as a covariate, without interactions. Our maximal linear mixed model failed to converge, so we followed the guidelines for how to achieve convergence in Brauer and Curtin ( 2018 ) and removed the by-unit random slopes for within-unit predictors and lower-order interactions, while leaving the by-unit random slopes for the highest order interactions (also see Barr 2013 ). As a result, our random effects included intercepts for headline items and participants nested by study; by-item random slopes for the three-way interaction among relative use of reason, concordance, and partisanship; and by-nested participant random slopes for the interaction between type of headline and concordance. The reference levels were “fake” for news type, “Clinton” for partisanship, and “discordant” for concordance. As in our model without partisanship and concordance, we found that relative use of reason was negatively associated with perceived accuracy of fake stories ( p  < 0.001) and had a significant interaction with type of headline ( p  < 0.001), such that no relationship was observed between relative use of reason and real news perception, b  = 0.01, F (1, 114.61) = 0.12, p  = 0.730. We found no effect of study ( p  > 0.05).

Our model also suggested a significant interaction between relative use of reason and concordance, b  = 0.11, SE = 0.02, t (10,240) = 4.41, p  < 0.001. The motivated account of fake news would predict that higher relative reasoners perceive concordant fake news as more accurate as compared to lower relative reasoners. However, we found the opposite: for concordant fake news headlines, relative use of reason was associated with decreased accuracy ratings, b  = − 0.09, F (1, 609.63) = 9.72, p  = 0.002. Both accounts would predict higher relative reasoners to perceive concordant real news as more accurate. We found that relative use of reason was nominally positively associated with accuracy ratings of concordant real news headlines, b  = 0.05, F (1, 600.57) = 3.08, p  = 0.080, though this relationship was not statistically significant.

Our model also revealed a three-way interaction among relative use of reason, type of news, and partisanship, b  = − 0.04, SE = 0.02, t (5,200) = − 2.58, p  = 0.010. For both Clinton and Trump supporters, relative use of reason was negatively associated with perceived accuracy of fake headlines ( b  = − 0.20 for both). The relationship between relative use of reason and perceived accuracy of real headlines, however, differed slightly based on partisanship: for Clinton supporters, the relationship was (barely) positive, b  = 0.01, whereas for Trump supporters the relationship was somewhat negative, b  = − 0.04. However, neither of the latter two effects were themselves significant ( p  > 0.1 for both); thus, we do not think that this three-way interaction is particularly meaningful.

Experimental manipulation results

Manipulation check of causal manipulation.

A brief manipulation check reveals that, across all four experiments, participants reported the greatest use of emotion in the emotion condition ( M  = 3.47), followed by the control condition ( M  = 2.50) and the reason condition ( M  = 2.06), F (2, 3386) = 479.80, p  < 0.001. Similarly, participants reported the greatest use of reason in the reason condition ( M  = 4.14), followed by the control condition ( M  = 3.90) and the emotion condition ( M  = 2.91), F (2, 3395) = 479.20, p  < 0.001. Follow-up pairwise Tukey tests revealed significant differences between all conditions for both use of emotion and use of reason, p  < 0.001.

Participants also reported greatest relative use of reason in the reason condition ( M  = 2.08), followed by the control condition ( M  = 1.41), and finally the emotion condition ( M  = − 0.56), F (2, 3372) = 748.60, p  < 0.001. These results suggest that (1) participants used relatively more emotion than reason in the emotion condition, (2) participants used relatively more reason than emotion in the reason and control conditions (based on self-report), and (3) the self-reported relative use of reason in the control condition was more similar to that of the reason condition than the emotion condition—suggesting that the manipulation was more successful at shifting people who typically rely on reason towards emotion than vice versa.
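The omnibus test and follow-up comparisons used for these manipulation checks can be sketched as follows. The data are simulated to roughly match the reported condition means; the group sizes and standard deviation are invented for illustration.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Simulated self-reported use of emotion (1-5 scale) per condition, with
# group means roughly matching those reported above (hypothetical data)
emotion_cond = rng.normal(3.47, 0.8, 300)
control_cond = rng.normal(2.50, 0.8, 300)
reason_cond = rng.normal(2.06, 0.8, 300)

# Omnibus one-way ANOVA across the three conditions
F, p = f_oneway(emotion_cond, control_cond, reason_cond)
print(F, p)

# Follow-up pairwise Tukey HSD comparisons
scores = np.concatenate([emotion_cond, control_cond, reason_cond])
labels = ["emotion"] * 300 + ["control"] * 300 + ["reason"] * 300
print(pairwise_tukeyhsd(scores, labels))
```

With well-separated group means, both the omnibus F test and all three pairwise comparisons come out significant, matching the pattern reported in the text.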

We also assessed how adherence to our manipulations was associated with headline accuracy ratings across conditions (see Additional file 1 ).

Manipulation effect on news accuracy perceptions

We next examined whether there was a condition effect on the perceived accuracy of fake and real news across all four experiments. We performed a linear mixed-effects analysis of the relationship between perceived news accuracy, experimental condition (emotion, control, reason), and type of news headline. We entered condition and type of news headline as fixed effects, with an interaction term. We also added study as a covariate. We included intercepts for headline items and participants nested by study, as well as by-item random slopes for condition and by-nested-participant random slopes for type of news headline, as random effects. The reference level for condition was “emotion” and the reference level for type of news headline was “fake.” The results of this analysis are shown in Table 4 Footnote 6 (“study” dummy variables are omitted from the table; no effect of study was observed, all p  > 0.05).

A joint significance test revealed a significant effect of condition on fake news accuracy judgments, F (2, 186.54) = 4.72, p  = 0.010. Footnote 7 From our model, we see that fake news headlines were reported as significantly more accurate in the emotion condition as compared to the control condition ( p  = 0.003) and the reason condition ( p  = 0.028), respectively.

With respect to the magnitude of our condition effect on belief in fake news, we observe approximately a 10% increase in belief from our control condition (1.20 above scale minimum) to our emotion condition (1.32 above scale minimum) according to our mixed-effects model. While participants are still largely able to discern between real and fake news even in our emotion condition, this effect size suggests that belief in fake news was still meaningfully increased by the emotion induction.
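The 10% figure can be verified directly from the two reported means, both expressed as distance above the scale minimum:

```python
control = 1.20  # mean belief in fake news above scale minimum, control condition
emotion = 1.32  # same quantity, emotion condition

increase = (emotion - control) / control
print(f"{increase:.0%}")  # 10%
```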

Figure  4 shows that participants in the emotion condition more frequently assigned higher accuracy ratings to fake stories, whereas participants in the control and reason conditions more frequently assigned low accuracy ratings to fake stories.

figure 4

Higher accuracy ratings were more frequently given to fake news headlines in the emotion condition compared to the control and reason conditions

In contrast, a joint significance test of condition on real news accuracy perception did not show a significant effect, F (2, 114.42) = 1.18, p  = 0.312. That is, no effect of thinking mode on real news accuracy perception was observed (see Fig.  5 ).

figure 5

All three conditions produce similar accuracy ratings of real news stories

We next performed a joint significance test of the interaction between condition and news type. This revealed a marginally significant interaction, F (2, 112.60) = 2.75, p  = 0.069. The coefficients of our model show that media truth discernment, as indicated by the interaction between condition and news type, is significantly greater in the control condition than in the emotion condition ( p  = 0.048) and also significantly greater in the reason condition than in the emotion condition ( p  = 0.031), but does not significantly differ between the reason condition and the control condition ( p  = 0.821); hence the larger p value for the joint significance test. Therefore, only a marginal overall effect of condition on media truth discernment was noted, such that discernment is worst in the emotion condition and comparatively better in both the control and reason conditions. Given that discernment is greater in the control condition than in the emotion condition, as well as greater in the reason condition than in the emotion condition, our results tentatively suggest that emotional thinking may hinder the ability to discern fake from real news. However, the overall condition effect on truth discernment was not statistically significant, suggesting that manipulating emotion versus reason may not influence discernment overall compared to a control condition.

Interactions with participant partisanship and headline concordance

We next performed a linear mixed-effects analysis including partisanship and political concordance. Our maximal linear mixed model failed to converge, so we followed the guidelines for how to achieve convergence in Brauer and Curtin ( 2018 ). Ultimately, the only model that would converge was a model with random intercepts but without random slopes, which does inflate Type I error rate (Barr 2013 ). Our fixed effects included condition, real, concordance, and partisanship, allowing for all interactions. Study was included as a covariate without interactions. Our random effects included intercepts for headline items and participants nested by study. The reference levels were “fake” for news type, “Clinton” for partisanship, and “discordant” for concordance.

According to the motivated account, an interaction should exist between condition and concordance, such that fake concordant headlines have higher perceived accuracy in the reason condition than in the emotion condition, and fake discordant headlines have lower perceived accuracy in the reason condition than in the emotion condition. However, a joint significance test of the interaction between condition and concordance revealed a nonsignificant interaction, F (2, 39,081.07) = 1.09, p  = 0.335. A joint significance test of the three-way interaction among condition, concordance, and type of news headline also yielded nonsignificant results, F (2, 36,302.32) = 0.45, p  = 0.636.

However, joint significance was observed for the three-way interaction among condition, type of news, and partisanship, F (2, 36,946.68) = 4.24, p  = 0.014. Yet, follow-up analyses did not yield any significant differences in discernment across conditions for Clinton supporters or Trump supporters. For Clinton supporters, discernment in the emotion condition was nominally (though nonsignificantly) lower ( M  = 1.73) than discernment in either the control condition ( M  = 1.86) or reason condition ( M  = 1.81). Interestingly, for Trump supporters, discernment scores in the emotion ( M  = 1.11) and control ( M  = 1.12) conditions were nominally lower than in the reason condition ( M  = 1.26). Notably, none of these differences were statistically significant, perhaps due to the reduction in sample size—and thus power—arising from sub-setting for partisanship. Nonetheless, we found it potentially interesting that in the control condition, Clinton supporters exhibit media truth discernment capabilities more similar to the reason condition, whereas Trump supporters exhibit media truth discernment more similar to the emotion condition.

A joint significance test also revealed a significant three-way interaction among condition, concordance, and partisanship, F (2, 39,042.94) = 5.52, p  = 0.004. This three-way interaction was such that Clinton supporters nominally, though not significantly, perceived concordant fake headlines as most accurate in the emotion condition ( M  = 2.88) and as less accurate in both the control and reason conditions ( M ’s = 2.76), while Trump supporters perceived concordant fake headlines as nominally most accurate in both the emotion ( M  = 3.16) and reason ( M  = 3.15) conditions, and as least accurate in the control condition ( M  = 3.05). Interestingly, this pattern also emerged in Clinton supporters’ perceptions of discordant fake headlines, with higher accuracy perceptions in the emotion and reason conditions ( M ’s = 2.21) than in the control condition ( M  = 2.03). However, Trump supporters perceived discordant fake headlines as least accurate in the reason condition ( M  = 2.37) and as more accurate in the control ( M  = 2.44) and emotion ( M  = 2.54) conditions. Although these differences between conditions within partisan groups were not themselves significant, they suggest a potential interplay between thinking mode, partisanship, and political concordance. Notably, no evidence exists of either Clinton or Trump supporters perceiving concordant fake headlines as more accurate in the reason condition than in the emotion condition, which is unexpected under the motivated reasoning account.

Some evidence of interaction between condition, type of news, and study

To account for variation between experiments in our analyses, we fit a linear mixed model with condition, type of news, and study as fixed effects, allowing for all interactions. Experiment 2 served as our reference level for study. We included random intercepts by item and by participant nested by study as random effects. We were unable to include random slopes, as no random slopes model was able to converge. We found a joint significant interaction between condition, type of news, and study, F (4, 37,541.93) = 3.00, p  = 0.017. This joint significant interaction appeared to be driven by the interaction between the reason condition, type of news, and experiment 4 ( p  = 0.001). Since experiment 4 utilized a different online platform (Lucid) than the other three experiments (MTurk), we fit a model replacing study with platform as a fixed effect. MTurk was the reference level platform. In this model, we were able to include random slopes by item for the interaction between condition and platform, as well as random slopes for type of news for participants nested by studies. With random slopes, we did not find a significant joint interaction between platform, condition, and type of news, F (2, 35.65) = 2.32, p  = 0.113. The interaction between the reason condition, type of news, and platform was only marginally significant ( p  = 0.050). Taken together, these analyses suggest some evidence of a three-way interaction among study, type of news, and condition. As a result, we performed two separate versions of our main linear mixed-effects analysis looking at the relationship between accuracy, condition, and type of news: one with only our data from experiments 1 through 3 (MTurk) and one with the data from experiment 4 (Lucid). 
We found that the MTurk-specific results are similar to the results from our aggregated analyses, except the effects are even stronger: a significant effect of condition on fake news, F (2, 88.12) = 5.62, p  = 0.005, and a significant interaction between condition and type of news, F (2, 66.37) = 4.83, p  = 0.011, were observed. Conversely, our results from only the Lucid experiment were essentially null, with no condition effects. The results of these analyses are presented in Additional file 1 , which also includes analyses assessing differences in adherence to our causal manipulations across experiments; there, we find adherence to be significantly lower in experiment 4 (Lucid) than in experiments 2 and 3 (MTurk). These results provide tentative evidence that lower adherence to our manipulations on Lucid may explain the null effects of experiment 4.

General discussion

Our results suggest several conclusions about the roles of emotion and reason in fake news perception. First, our findings from Study 1 indicate that momentary emotion, regardless of the specific type or valence of emotion, is predictive of increased belief in fake news and decreased discernment between real and fake news. Our results also suggest that emotion is specifically associated with belief in fake news. Therefore, rather than beginning with how specific emotions impact perceptions of fake news, it may be best to first assess how emotion in general impacts belief in misinformation.

Second, our results from Study 2 further suggest clear correlational and experimental evidence that reliance on emotion increases belief in fake news. We found a positive association between self-reported use of emotion and belief in fake news, and that the more participants relied on emotion over reason, the more they perceived fake stories as accurate. Our manipulation also revealed causal evidence showing that inducing reliance on emotion results in greater belief in fake news compared to both a control and a condition where we induced analytic, logical thinking. Indeed, perhaps this study’s most notable finding is that reliance on emotion increases accuracy ratings of fake news relative to reliance on reason and relative to a control.

Our findings also provide some tentative evidence that the effect of emotion on perceptions of accuracy is specific to fake news. We found a significant correlational interaction between self-reported use of emotion and type of news headline (fake, real), suggesting that heightened reliance on emotion decreases people’s ability to discern between real and fake news. Our correlational analyses also showed that use of emotion was unrelated to real news accuracy perceptions. Additionally, we found no experimental effect of thinking mode on real news accuracy ratings. Although we only found a marginal overall interaction between condition and type of news headline, the interactions with type of news were significant when comparing emotion vs. control and emotion vs. reason, and the overall interaction was significant when considering only the MTurk experiments (no manipulation effects at all were observed on Lucid). This tentatively suggests that inducing emotional thinking using a simple induction manipulation may impair the ability to distinguish fake news from real news, although further work is required.

Furthermore, the current studies suggest that belief in fake news is driven notably by over-reliance on emotion, relative to a simple lack of analytic reasoning. Use of reason was unrelated to fake news accuracy perceptions, and no difference was observed in accuracy perception between our experimental reason condition and the control condition. Therefore, emotion may be actively and uniquely promoting heightened belief in fake news relative to a baseline condition, and heightened reliance on emotion appears to be underlying susceptibility to fake news above and beyond a simple lack of reasoning.

Our evidence builds on prior work using the Cognitive Reflection Test (i.e., a measure assessing the propensity to engage in analytic, deliberative thinking; CRT; Frederick 2005 ), demonstrating a negative correlational relationship between CRT performance and perceived accuracy of fake news and a positive correlational relationship between CRT performance and the ability to discern fake news from real news (Pennycook and Rand 2019a ). Beyond these correlational results, the current studies provide causal evidence that inducing heightened reliance on emotion increases susceptibility to believing fake news and tentatively suggest that increasing emotional thinking hinders media truth discernment.

Furthermore, our findings provide further evidence against the motivated account of fake news perception. Whereas the motivated account would predict analytic reasoning to increase ideologically motivated belief of politically concordant fake news (see Kahan 2017), our results show no interaction between condition and concordance. We find no evidence suggesting that people utilize ideologically motivated reasoning to justify believing in fake news; rather, people appear to believe fake news if they rely too heavily on intuitive, emotional thinking. The motivated account would also predict analytic thinking to justify greater belief in concordant real news. However, we do not find a statistically significant association between relative use of reason and perceived accuracy of concordant real news.

Our findings support the classical account of fake news perception, which posits that a failure to identify fake news stems from some combination of a lack of analytic, deliberative thinking and heightened reliance on emotion. Therefore, the mechanism by which individuals fall prey to fake news stories closely resembles how people make mistakes on questions such as the bat-and-ball problem from the CRT; that is, people mistakenly “go with their gut” when it would be prudent to stop and think more reflectively. Just as the bat-and-ball problem has an intuitive, albeit wrong, answer, evidence suggests that people have an intuitive truth bias (see Bond and DePaulo 2006), and thus, analytic reasoning aids in overcoming such intuitions in some contexts. Indeed, an abundance of evidence suggests that individuals assume they are being informed of the truth and are bad at identifying lies and misinformation (e.g., Bond and DePaulo 2006; Levine et al. 1999). This suggests that an over-reliance on intuition—and, specifically, having a reflexively open-minded thinking style (Pennycook and Rand 2019c)—is likely to result in people being more susceptible to believing fake news. As we find, inducing emotional, intuitive reasoning does in fact increase the propensity to believe fake news stories.
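The bat-and-ball problem mentioned above illustrates this pull of intuition concretely: a bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. The intuitive answer ("the ball costs 10 cents") is wrong; solving the two constraints gives 5 cents. A quick check:

```python
def bat_and_ball(total=1.10, difference=1.00):
    """Solve bat + ball = total and bat = ball + difference.
    Substituting the second equation into the first gives
    2 * ball + difference = total."""
    ball = (total - difference) / 2
    bat = ball + difference
    return bat, ball

bat, ball = bat_and_ball()
# Correct answer: ball = $0.05 (not the intuitive $0.10), bat = $1.05
```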

Our findings have important practical implications. If emotional, nondeliberative thinking results in heightened belief of fake news, then the extent to which social media platforms bias people to think with emotion over reason may contribute to the viral success of fake news. Indeed, sentiment analysis of fake news articles reveals that fake news tends to contain increased negative emotional language (Zollo et al. 2015; Horne and Adali 2017). Even true yet emotionally stimulating content may result in people being biased to think with emotion instead of reason. Further applied research into how social media platforms may separately display non-news-related, yet emotionally provocative, content and news articles may provide insight into how to prevent inducing emotional thinking in individuals online, thereby potentially decreasing general susceptibility to fake news.
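The lexicon-based sentiment analysis cited above can be illustrated with a toy word count. The word list here is hypothetical and far cruder than the dictionaries used by Zollo et al. (2015) or Horne and Adali (2017); it simply scores the fraction of tokens matching a negative-emotion lexicon:

```python
# Toy negative-emotion lexicon, for illustration only
NEGATIVE_WORDS = {"outrage", "fear", "disaster", "scandal", "shocking", "crisis"}

def negative_emotion_rate(text):
    """Fraction of tokens in `text` that match the negative lexicon."""
    tokens = [t.strip(".,!?\"'").lower() for t in text.split()]
    if not tokens:
        return 0.0
    return sum(t in NEGATIVE_WORDS for t in tokens) / len(tokens)

print(negative_emotion_rate("Shocking scandal sparks outrage"))  # 0.75
```

Real sentiment pipelines use validated dictionaries or trained classifiers, but the per-document score is often this kind of normalized lexicon count.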

Limitations

Several potential limitations have been identified in the current research. First, the induction manipulation used across all four experiments was somewhat heavy-handed, and therefore, experimenter demand effects may be present. Future work should investigate whether similar patterns hold with alternative manipulations.

Second, although we find that reliance on emotion increases overall accuracy ratings of fake news, most individuals still consider fake news stories overall as more likely to be false than true. Thus, although reliance on emotion promotes belief in fake news overall, for a large proportion of participants, such reliance did not promote belief to the extent that participants found fake news stories to be more likely true than false. However, even incremental increases in belief (or reductions in disbelief) may contribute to greater long term belief (e.g., through repeated exposure; Pennycook et al. 2018 ).

Third, the classical account purports that analytic reasoning aids in overcoming intuitions such as automatic belief in false headlines. However, in the current research, we did not find evidence that inducing reason improves perceived accuracy of fake news or discernment between real and fake news relative to the control. Rather, we found that inducing intuitive, emotional thinking increased perceived accuracy of fake news. Therefore, susceptibility to fake news appears to be more about increased reliance on emotion than about decreased analytic thinking. One potential explanation for why our induction of analytic thinking did not improve perceptions of fake news or discernment between real and fake news relative to the control is that participants in the control condition already may have been relying generally more on reason than emotion. This is supported by our manipulation check data, which suggests that people in the emotion condition used emotion relatively more than reason, whereas people in the control and reason conditions used reason relatively more than emotion. Such findings are also consistent with literature suggesting that, on average, fake news does not make up a large proportion of people’s media diets but rather is particularly consumed and shared by specific political and demographic groups (Guess et al. 2019, 2020). Our results are largely consistent with the general idea that fake news belief and consumption may be driven by a small share of individuals sharing specific traits—one of which may be extremely heightened reliance on emotion. Therefore, one potential avenue for future research may be investigating manipulations aimed at reducing reliance on emotion while consuming news, specifically for individuals with heightened susceptibility to fake news.

Fourth, fake news is often aimed at eliciting high emotionality (Bakir and McStay 2018; Horne and Adali 2017) and specific emotions such as moral outrage (e.g., Crockett 2017). However, our current work does not specifically assess the relative emotionality of fake news and real news in the context of accuracy assessments. Indeed, a key feature of fake news may be that it is more emotionally provocative than real news. Therefore, our current research does not control for the arousal or valence of headlines across real and fake stimuli. Instead, the current studies focus on the individual’s experience of and reliance on emotion while making media accuracy judgments. Whether heightened reliance on emotion promotes belief in fake news because fake headlines are themselves more emotional, or because reliance on emotion increases gullibility to inaccurate information regardless of a headline’s intrinsic arousal or valence, is beyond the scope of this study. To reiterate, it remains unclear whether similar results would be found if fake news stimuli were adjusted to have the same emotional content as our real news stimuli. An interesting and important future research direction would be to assess the interaction between emotional processing and the emotional content of fake and real news. Nonetheless, our results from Study 2 still suggest that increased reliance on emotion in particular increases belief in fake news headlines as they would appear in a real-world setting, such as on social media.

Fifth, our assessment of the relationship between emotion and news accuracy judgments does not consider the precise mechanisms by which specific emotions may influence ratings of news accuracy. Although we find in Study 1 that most emotions measured by the PANAS are associated with increased belief in fake news and decreased ability to discern between real and fake news, we cannot speak to whether the mechanisms behind these relationships are uniform or vary between emotions. A number of studies detail how different emotions are associated with different processing patterns; for instance, positive emotions may facilitate assimilative processing (i.e., changing external information to fit internal representations), whereas negative emotions may be associated with accommodative processing (i.e., changing internal representations to fit external information; see Fiedler and Beier 2014 ; Bohn-Gettler 2019 ). However, other models of emotional processing posit that both positive and negative emotions may place limitations on cognitive resources if experiencing such emotions is part of a semantic network (Meinhardt and Pekrun 2003 ). Furthermore, even more complex relationships between emotion and cognition may exist and explain our results; for instance, the same emotion may promote different judgments depending on the appraisal of that emotion (e.g., pleasantness/unpleasantness of confidence/doubt appraisal; see Briñol et al. 2018 ). Although we find that both positive and negative emotions are associated with greater belief in fake news, whether uniform or distinct emotional information processes and appraisals drive these results is unclear.

Sixth, our analyses do not examine the role of trait-based emotion in news accuracy judgments and belief in fake news. Emotions and affective responses have been found to be relatively stable over time (Diener and Larsen 1984 ), and these stable emotional states thus may reflect general affective personality traits. In our current work, we assess the role of momentary mood states (Study 1) and emotional processing (Study 2) on belief in fake news. However, we do not measure or manipulate trait-based emotions. Future research may examine how trait-based emotions may impact who falls for fake news.

Seventh, our analyses rely primarily on a convenience sample of online Mechanical Turk workers (experiments 1–3). Although previous work has shown that Amazon Mechanical Turk is a reasonably reliable resource for research on political ideology (Coppock 2019 ; Krupnikov and Levine 2014 ; Mullinix et al. 2015 ), our samples were not nationally representative and our political ideology comparisons should be interpreted with this in mind. However, when assessing the causal role of reason and emotion in perceiving fake news accuracy, obtaining a nationally representative population may not be as important as sampling from groups of people who are frequent internet and social media users and therefore likely encounter fake news stories more regularly. Thus, Mechanical Turk may be an even more appropriate resource than a nationally representative sample. Nevertheless, how our findings may generalize to different populations is unclear. In experiment 4, which utilized a more nationally representative sample via Lucid, we found no effect of condition on fake news perception or on media truth discernment. However, this was not a precisely estimated null, as it was also not significantly different from the overall estimate. Additionally, the null effect may have been caused by Lucid participants being less attentive than MTurkers, rather than due to their differential demographic characteristics, as Lucid participants are perhaps less professionalized than the MTurk population (Coppock and McClellan 2019 ). Indeed, we find that adherence to our emotion and reason manipulations is significantly lower in study 4 (Lucid) than in studies 2 or 3 (MTurk). However, whether the manipulation used in our study is effective across samples from different online recruitment platforms remains unclear. Future work should identify whether the effects we found in our MTurk data generalize to other platforms.

Finally, our experiments used only a small subset of all contemporary fake and real news headlines. Although these headlines were selected to be representative of fake and real news headlines in general, further research is required to ascertain how our findings would generalize to different headlines or to different displays of headlines other than the Facebook news article format.

Dictionary.com recently named "misinformation" its 2018 word of the year and defined it as "false information that is spread, regardless of whether there is intent to mislead." The online dissemination of misinformation and fake news is a troubling consequence of our digital age, and psychologists urgently need to understand the cognitive mechanisms behind why people fall for the misinformation and fake stories so commonly viewed online. The current results show that emotion plays a causal role in people's susceptibility to incorrectly perceiving fake news as accurate. Contrary to the popular motivated cognition account, our findings indicate that people fall for fake news, in part, because they rely too heavily on emotion, not because they think in a motivated or identity-protective way. This suggests that interventions directed at making the public less emotional consumers of news media may have promise in reducing belief in fake news.

Availability of data and materials

All data and materials are available online at https://osf.io/gm4dp/?view_only=3b3754d7086d469cb421beb4c6659556 .

Here we conduct an exploratory analysis of data from a study originally designed to investigate the effects of political echo chambers on belief in fake news. For simplicity, we focus on the results of participants who were randomly assigned to the control condition of this study in which participants saw a politically balanced set of headlines (although the results are virtually identical when including subjects from the other conditions, in which most headlines were either favorable to the Democrats or the Republicans).

See Additional file 1: Table S1 for relevant descriptive statistics.

Our PANAS scale internal reliabilities for positive and negative emotion were both acceptably high and in line with prior findings (e.g., Watson et al. 1988); Cronbach’s α for positive emotion = 0.916 and Cronbach’s α for negative emotion = 0.906.
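Cronbach’s α as reported here follows directly from its definition; a minimal sketch (the 3 × 2 score matrix below is hypothetical, not PANAS data):

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    computed with population variances."""
    k = len(scores[0])  # number of items
    item_vars = [pvariance([row[i] for row in scores]) for i in range(k)]
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Hypothetical data: 3 respondents x 2 items
print(cronbach_alpha([[1, 2], [2, 2], [3, 4]]))  # approximately 0.923
```

Values near 0.9, as for both PANAS subscales here, indicate that the items covary closely enough to be averaged into a single scale score.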

This model may also be compatible with the circumplex model of affect, which posits that all affective states arise from common neurophysiological systems (Posner et al. 2005 ). In particular, while different affective processes and emotions may vary by valence and arousal, a common cognitive system underlying all emotional states may yet uniformly impact emotional information processing relevant to forming accuracy judgments of fake news.

We used Clinton versus Trump because the first experiment was completed in April 2017, shortly after the inauguration. This question was then used in all subsequent experiments to retain consistency.

See Additional file 1: Table S2 for descriptive statistics of relevant measures and variables.

Degrees of freedom calculated via joint significance tests within the lmer R package are computed using the Kenward–Roger degrees of freedom approximation; hence, the denominator degrees of freedom in our joint significance tests tend not to be integers.

Allcott, H., & Gentzkow, M. (2017). Social media and fake news in the 2016 election. Journal of Economic Perspectives, 31 , 211–236.

Allen, J., Howland, B., Mobius, M., Rothschild, D., & Watts, D. J. (2020). Evaluating the fake news problem at the scale of the information ecosystem. Science Advances, 6, eaay3539.

Bago, B., Rand, D. G., & Pennycook, G. (2020). Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines. Journal of Experimental Psychology: General. https://doi.org/10.1037/xge0000729 .

Bakir, V., & McStay, A. (2018). Fake news and the economy of emotions: Problems, causes, solutions. Digital Journalism, 6 , 154–175.

Ballarini, C., & Sloman, S. A. (2017). Reasons and the “Motivated numeracy effect”. In Proceedings of the 39th annual meeting of the cognitive science society (pp. 1580–1585).

Barr, D. J. (2013). Random effects structure for testing interactions in linear mixed-effects models. Frontiers in Psychology, 4 , 328.

Bates, D., Mächler, M., Bolker, B., & Walker, S. (2015). Fitting linear mixed-effects models using lme4. Journal of Statistical Software, 67 , 1–48.

Bodenhausen, G. V., Sheppard, L. A., & Kramer, G. P. (1994). Negative affect and social judgment: The differential impact of anger and sadness. European Journal of Social Psychology, 24 , 45–62.

Bond, C. F., Jr., & DePaulo, B. M. (2006). Accuracy of deception judgments. Personality and Social Psychology Review, 10 , 214–234.

Bohn-Gettler, C. M. (2019). Getting a grip: the PET framework for studying how reader emotions influence comprehension. Discourse Processes, 56 , 386–401.

Bahçekapılı, H. G., & Yılmaz, O. (2017). The relation between different types of religiosity and analytic cognitive style. Personality and Individual Differences, 117 , 267–272.

Brashier, N. M., & Marsh, E. J. (2020). Judging truth. Annual Review of Psychology, 71 , 499–515.

Brauer, M., & Curtin, J. J. (2018). Linear mixed-effects models and the analysis of nonindependent data: A unified framework to analyze categorical and continuous independent variables that vary within-subjects and/or within-items. Psychological Methods, 23 , 389–411.

Briñol, P., Petty, R. E., Stavraki, M., Lamprinakos, G., Wagner, B., & Díaz, D. (2018). Affective and cognitive validation of thoughts: An appraisal perspective on anger, disgust, surprise, and awe. Journal of Personality and Social Psychology, 114 , 693–718.

Bronstein, M. V., Pennycook, G., Bear, A., Rand, D. G., & Cannon, T. D. (2019). Belief in fake news is associated with delusionality, dogmatism, religious fundamentalism, and reduced analytic thinking. Journal of Applied Research in Memory and Cognition, 8 , 108–117.

Coppock, A. (2019). Generalizing from survey experiments conducted on Mechanical Turk: A replication approach. Political Science Research and Methods, 7 , 613–628.

Coppock, A., & McClellan, O. A. (2019). Validating the demographic, political, psychological, and experimental results obtained from a new source of online survey respondents. Research and Politics, 6 , 2053168018822174.

Crockett, M. J. (2017). Moral outrage in the digital age. Nature Human Behaviour, 1 , 769–771.

Diener, E., & Larsen, R. J. (1984). Temporal stability and cross-situational consistency of affective, behavioral, and cognitive responses. Journal of Personality and Social Psychology, 47 , 871–883.

Drummond, C., & Fischhoff, B. (2017). Individuals with greater science literacy and education have more polarized beliefs on controversial science topics. Proceedings of the National Academy of Sciences, 114 , 9587–9592.

Effron, D. A., & Raj, M. (2020). Misinformation and morality: encountering fake-news headlines makes them seem less unethical to publish and share. Psychological Science, 31 , 75–87.

Evans, J. S. B. (2003). In two minds: Dual-process accounts of reasoning. Trends in Cognitive Sciences, 7 , 454–459.

Fazio, L. (2020). Pausing to consider why a headline is true or false can help reduce the sharing of false news. Misinformation Review . https://doi.org/10.37016/mr-2020-009 .

Fiedler, K., & Beier, S. (2014). Affect and cognitive processing in educational contexts. In R. Pekrun & L. Linnenbrink-Garcia (Eds.), International handbook of emotions in education (pp. 36–55). London: Taylor & Francis.

Forgas, J. P. (2019). Happy believers and sad skeptics? Affective influences on gullibility. Current Directions in Psychological Science, 28 , 306–313.

Forgas, J. P., & East, R. (2008). On being happy and gullible: Mood effects on skepticism and the detection of deception. Journal of Experimental Social Psychology, 44 , 1362–1367.

Frederick, S. (2005). Cognitive reflection and decision making. Journal of Economic Perspectives, 19 , 25–42.

Garrett, R. K., & Weeks, B. E. (2017). Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation. PLoS ONE, 12 , e0184733.

Gelman, A., & Su, Y. (2018). Arm: Data analysis using regression and multilevel/hierarchical models: R package version 1.10-1 . Retrieved from: https://cran.r-project.org/web/packages/arm/index.html .

Guess, A. M., Nagler, J., & Tucker, J. (2019). Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances, 5, eaau4586.

Guess, A. M., Nyhan, B., & Reifler, J. (2020). Exposure to untrustworthy websites in the 2016 US election. Nature Human Behaviour, 4 , 472–480.

Horne, B. D., & Adali, S. (2017, May). This just in: Fake news packs a lot in title, uses simpler, repetitive content in text body, more similar to satire than real news. Paper presented at the 11th international AAAI conference on web and social media . Montreal, QC.

Huntsinger, J. R., & Ray, C. (2016). A flexible influence of affective feelings on creative and analytic performance. Emotion, 16 , 826–837.

Judd, C. M., Westfall, J., & Kenny, D. A. (2012). Treating stimuli as a random factor in social psychology: A new and comprehensive solution to a pervasive but largely ignored problem. Journal of Personality and Social Psychology, 103 , 54–69.

Kahan, D. M. (2013). Ideology, motivated reasoning, and cognitive reflection. Judgment and Decision Making, 8 , 407–424.

Kahan, D. M. (2017). Misconceptions, misinformation, and the logic of identity-protective cognition. SSRN Electronic Journal, 85 , 808–822.

Kahan, D. M., & Peters, E. (2017). Rumors of the ‘Nonreplication’ of the ‘Motivated Numeracy Effect’ are greatly exaggerated. SSRN Electronic Journal . https://doi.org/10.2139/ssrn.3026941 .

Kahan, D. M., Peters, E., Dawson, E. C., & Slovic, P. (2017). Motivated numeracy and enlightened self-government. Behavioural Public Policy, 1 , 54–86.

Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., et al. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2 , 732–735.

Knobloch-Westerwick, S., Mothes, C., & Polavin, N. (2017). Confirmation bias, ingroup bias, and negativity bias in selective exposure to political information. Communication Research, 47 , 104–124.

Koch, A. S., & Forgas, J. P. (2012). Feeling good and feeling truth: The interactive effects of mood and processing fluency on truth judgments. Journal of Experimental Social Psychology, 48 , 481–485.

Krupnikov, Y., & Levine, A. (2014). Cross-sample comparisons and external validity. Journal of Experimental Political Science, 1 , 59–80.

Kuznetsova, A., Brockhoff, P. B., & Christensen, R. H. B. (2017). lmerTest package: Tests in linear mixed-effects models. Journal of Statistical Software, 82 , 1–26.

Lazer, D. M., Baum, M. A., Benkler, Y., Berinsky, A. J., Greenhill, K. M., Menczer, F., et al. (2018). The science of fake news. Science, 359 , 1094–1096.

Lerner, J. S., & Keltner, D. (2001). Fear, anger, and risk. Journal of Personality and Social Psychology, 81 , 146–159.

Levine, E. E., Barasch, A., Rand, D., Berman, J. Z., & Small, D. A. (2018). Signaling emotion and reason in cooperation. Journal of Experimental Psychology: General, 147 , 702–719.

Levine, T. R., Park, H. S., & McCornack, S. A. (1999). Accuracy in detecting truths and lies: Documenting the “veracity effect”. Communications Monographs, 66 , 125–144.

MacKuen, M., Wolak, J., Keele, L., & Marcus, G. E. (2010). Civic engagements: Resolute partisanship or reflective deliberation. American Journal of Political Science, 54 , 440–458.

Majima, Y., Walker, A. C., Turpin, M. H., & Fugelsang, J. A. (2020). Culture and epistemically suspect beliefs . PsyArXiv. Preprint. https://psyarxiv.com/qmtn6/ .

Mashuri, A., Zaduqisti, E., Sukmawati, F., Sakdiah, H., & Suharini, N. (2016). The role of identity subversion in structuring the effects of intergroup threats and negative emotions on belief in anti-west conspiracy theories in Indonesia. Psychology and Developing Societies, 28 , 1–28.

Meinhardt, J., & Pekrun, R. (2003). Attentional resource allocation to emotional events: An ERP study. Cognition and Emotion, 17 , 477–500.

Mercer, J. (2010). Emotional beliefs. International Organization, 64 , 1–31.

Mosleh, M., Arechar, A. A., Pennycook, G., & Rand, D. G. (2020). Twitter data reveal digital fingerprints of cognitive reflection. Nature Communications . https://doi.org/10.31234/osf.io/qaswn .

Mullinix, K., Leeper, T., Druckman, J., & Freese, J. (2015). The generalizability of survey experiments. Journal of Experimental Political Science, 2 , 109–138.

Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147 , 1865–1880.

Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015a). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10 , 549–563.

Pennycook, G., Cheyne, J. A., Seli, P., Koehler, D. J., & Fugelsang, J. A. (2012). Analytic cognitive style predicts religious and paranormal belief. Cognition, 123 , 335–346.

Pennycook, G., Epstein, Z., Mosleh, M., Arechar, A. A., Eckles, D., & Rand, D. G. (2019). Understanding and reducing the spread of misinformation online . https://psyarxiv.com/3n9u8 .

Pennycook, G., Fugelsang, J. A., & Koehler, D. J. (2015b). What makes us think? A three-stage dual-process model of analytic engagement. Cognitive Psychology, 80 , 34–72.

Pennycook, G., McPhetres, J., Zhang, Y., Lu, J., & Rand, D. G. (2020). Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy nudge intervention. Psychological Science , 31 , 770–780.

Pennycook, G., & Rand, D. G. (2019a). Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning. Cognition, 188 , 39–50.

Pennycook, G., & Rand, D. G. (2019b). Fighting misinformation on social media using crowdsourced judgments of news source quality. Proceedings of the National Academy of Sciences, 116 , 2521–2526.

Pennycook, G., & Rand, D. G. (2019c). Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking. Journal of Personality . https://doi.org/10.1111/jopy.12476 .

Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology, 17 , 715–734.

Rapp, D. N., Hinze, S. R., Kohlhepp, K., & Ryskin, R. A. (2014). Reducing reliance on inaccurate information. Memory and Cognition, 42 , 11–26.

Rusting, C. L. (1998). Personality, mood, and cognitive processing of emotional information: three conceptual frameworks. Psychological Bulletin, 124 , 165–196.

Schwarz, N. (2011). Feelings-as-information theory. Handbook of Theories of Social Psychology, 1 , 289–308.

Shenhav, A., Rand, D. G., & Greene, J. D. (2012). Divine intuition: Cognitive style influences belief in God. Journal of Experimental Psychology: General, 141 , 423–428.

Silverman, C., & Singer-Vine, J. (2016). Most Americans who see fake news believe it, new survey says. BuzzFeed News . https://doi.org/10.7910/DVN/TRR0DK .

Stanovich, K. E. (2005). The robot's rebellion: Finding meaning in the age of Darwin . Chicago, IL: University of Chicago Press.

Stanovich, K. E., & West, R. F. (2007). Natural myside bias is independent of cognitive ability. Thinking and Reasoning, 13 , 225–247.

Swami, V., Voracek, M., Stieger, S., Tran, U. S., & Furnham, A. (2014). Analytic thinking reduces belief in conspiracy theories. Cognition, 133 , 572–585.

Thomson, K. S., & Oppenheimer, D. M. (2016). Investigating an alternate form of the cognitive reflection test. Judgment and Decision Making, 11 , 99–113.

Unkelbach, C., Bayer, M., Alves, H., Koch, A., & Stahl, C. (2011). Fluency and positivity as possible causes of the truth effect. Consciousness and Cognition, 20 , 594–602.

Valentino, N. A., Hutchings, V. L., Banks, A. J., & Davis, A. K. (2008). Is a worried citizen a good citizen? Emotions, political information seeking, and learning via the internet. Political Psychology, 29 , 247–273.

Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359 , 1146–1151.

Watson, D., Clark, L. A., & Tellegen, A. (1988). Development and validation of brief measures of positive and negative affect: the PANAS scales. Journal of Personality and Social Psychology, 54 , 1063–1070.

Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65 , 699–719.

Zollo, F., Novak, P. K., Del Vicario, M., Bessi, A., Mozetič, I., Scala, A., et al. (2015). Emotional dynamics in the age of misinformation. PLoS ONE, 10 , e0138740.

Acknowledgements

We would like to thank Antonio A. Arechar for assistance executing the experiments. We would also like to thank Clara Colombatto for assistance designing and executing Study 1.

We also gratefully acknowledge funding from the Ethics and Governance of Artificial Intelligence Initiative of the Miami Foundation, the William and Flora Hewlett Foundation, the Reset project of the Omidyar Network, the John Templeton Foundation, the Canadian Institute of Health Research, and the Social Sciences and Humanities Research Council of Canada. Furthermore, this material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. 174530. Funding for open access publication provided by MIT Libraries.

Author information

Authors and Affiliations

Sloan School of Management, Massachusetts Institute of Technology, Cambridge, USA

Cameron Martel & David G. Rand

Hill/Levene Schools of Business, University of Regina, Regina, Canada

Gordon Pennycook

Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, USA

David G. Rand

Contributions

CM, GP, and DGR contributed to the design and implementation of the research and to the analysis of the results. CM contributed to the writing of the manuscript, with invaluable input from GP and DGR. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cameron Martel .

Ethics declarations

Ethics approval and consent to participate

The current studies were approved by the Yale University Institutional Review Boards, and consent was obtained from all participants.

Consent for publication

The authors provide consent for the publication of their work.

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Additional file 1.

Additional file contains descriptive statistics and additional analyses

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article

Martel, C., Pennycook, G. & Rand, D.G. Reliance on emotion promotes belief in fake news. Cogn. Research 5 , 47 (2020). https://doi.org/10.1186/s41235-020-00252-3

Received : 09 June 2020

Accepted : 22 September 2020

Published : 07 October 2020

DOI : https://doi.org/10.1186/s41235-020-00252-3

  • Misinformation
  • Dual-process theory

Shenandoah University

Fake News & Critical Thinking: So What?


Why Should We Care?

Now, more than ever, all responsible citizens need the skills to identify accurate information and real facts. We should be able to think critically and determine which facts and information are accurate, and which might be Fake News, or:

  • Misinformation :  "False or inaccurate information, especially that which is deliberately intended to deceive." See the SU book [login required]  Web of deceit: misinformation and manipulation in the age of social media.
  • Alternative Fact: "A statement intended to contradict another more verifiable, but less palatable, statement." This term received heightened attention because of a statement made by Counselor to President Trump Kellyanne Conway during a January 22, 2017 interview with Meet the Press.
  • Post-Truth: "After much discussion, debate, and research, the Oxford Dictionaries Word of the Year 2016 is post-truth – an adjective defined as ‘relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.' "

[Definitions retrieved from Oxford Living Dictionaries and  Collins dictionary .  See this  Glossary from Miami Dade College Library for more terms and definitions.]

How We Used to Get the News

[Image by  Tobias Rose-Stockwell  retrieved from  How we broke democracy . Data from Pew Research Center,  The modern news consumer. ]

How We Get the News Now

[Statista chart: Most popular platforms for daily news consumption in the United States as of August 2022, by age group.]

  • Last Updated: Apr 2, 2024 6:52 PM
  • URL: https://libguides.su.edu/fake_news

StarsInsider

Ways to improve your critical thinking

Posted: March 26, 2024 | Last updated: May 16, 2024

Critical thinking is an essential skill for anyone who wishes to be successful in business. It is what allows us to analyze information properly to find appropriate solutions to problems. But it is also important to think critically in everyday life; it helps us to filter out fake news, for example.

While most of us have a certain level of critical thinking capacity, there is often room for improvement. Check out this gallery for some tips on how to improve your critical thinking.


Understand the concept of critical thinking

Before you set about trying to build those critical thinking skills, it is important to first understand what exactly critical thinking is. Put simply, it is the ability to think about ideas and concepts in a critical way.

It is the difference between accepting what you're told at face value and asking questions such as why you're being told that and what is the motivation of the speaker.


Ask questions

It follows, then, that when learning to think critically it is important to ask questions. When you next read a report or listen to a presentation, try and ask as many questions as you can.

Although you run the risk of winding up the presenter, asking questions is in everyone's interest because it can help to expose weaknesses in logic and pave the way for a better solution to a problem.


Question yourself

In addition to asking questions about the information in front of you, it is important also to question your own thoughts and actions on a regular basis.

Questioning yourself will help you identify behaviors that are unhelpful or self-defeating. All too often we continue with a certain behavior because it seems right, when in fact it is making things worse.


Pay attention to all incoming information

It is paramount that you pay attention to all information coming your way, whether or not it comes from a source or person you agree with.

People without critical thinking skills tend to tune out information that they don't want to hear, when in fact people we don't like nearly always have something useful to say.


Develop foresight

Good critical thinking always involves an element of foresight. Successful critical thinkers are able to use the information available to them to predict what will happen in the future.

However, foresight is not about clairvoyants and tarot cards. Instead it is about carefully considering all the possible consequences of a certain action.


Reduce time-wasting

Critical thinking, like anything else, takes practice. It is therefore a good idea to rid your life of time-wasting activities, such as Netflix bingeing, so you have more time to practice.

That does not mean to say you shouldn't relax, however. In fact, the brain needs downtime in order to develop. Try and go for something more stimulating, though, like reading a book.


Plan your day

The more you practice critical thinking, the more easily it will come. In the beginning, however, it takes time. It is therefore important to maximize your time by planning carefully.

Prioritize your tasks and don't bite off more than you can chew. Make sure that you are allowing yourself enough time to really focus on each of your projects and consider them critically.


Practice critical thinking in your daily life

Do not limit your critical thinking practice to office hours. While being able to think critically is a must if you want to be successful in business, it is also an important life skill in everyday life.

Next time you are choosing a book to read or watching the news, ask yourself what you want to gain from the book, or why that newsreader is emphasizing a particular story.


Keep a thought journal

Try to keep a record of difficult situations that arise and how you handle them. Writing down your thoughts on such situations will help you to reflect better on your own actions.

It may not be easy at first, but laying bare your reactions to a difficult situation will help you to identify and eliminate destructive behaviors and therefore solve problems more efficiently.


Check your ego

Having a big head can inhibit critical thinking since it makes it difficult to be objective when assessing a situation. However, being too altruistic doesn't help either.

Try to assign the same level of importance to both your needs and the needs of others. When analyzing a situation, try to focus on people's motivations; why do they want a certain outcome?


Practice active listening

Active listening involves truly paying attention while someone else is talking, and not letting your eyes glaze over and your mind run off elsewhere.

Not only is it rude not to listen properly when someone is presenting, but you will miss important information and/or ideas that should be submitted to your own mental analysis.


Evaluate existing evidence

If you have a business problem to solve, the likelihood is that someone before you has solved a very similar if not identical issue. Make the most of past learnings to help you in the present.

Ask yourself whether you have encountered the issue before and, if not, speak to others. Use all the information available to you to find a successful solution.


Engage a mentor

Like many other things in life, critical thinking can be taught. If the tips in this gallery aren't enough, it may be an idea to find a mentor who can help you on your way to becoming a critical thinking expert.

A mentor may be able to frame critical thinking in such a way that it becomes more accessible and natural to you, and they may have resources for you to practice with.


Participate in team-building activities

Many team-building activities put on by companies have the aim of improving the critical thinking skills of employees.

Try not to let the thought of your next team-building session fill you with dread. Instead, see it as an opportunity to hone those critical thinking skills and give you a competitive advantage.


Take on a leadership role

If you're feeling confident, why not throw yourself in the deep end and volunteer to lead a project? Leaders are required to constantly think critically, meaning you'll have loads of practice.

And as we all know, practice makes perfect. So next time your boss asks for a volunteer to head a new initiative, why not take the plunge?

Sources: (Indeed) (Small Businessify)

Teacher sacked after encouraging student to use critical thinking

Senior Sky News Australia reporter Caroline Marcus discusses the firing of a teacher who questioned a student's claim that author J.K. Rowling was a transphobe and bigot.

A teacher was fired after his video went viral on X, where he gently challenged a student's stance on whether Harry Potter author J.K. Rowling is a transphobe and bigot.

"As far as I can see, there was nothing offensive in that video," Ms Marcus told Sky News host Rita Panahi.

"In fact, I would welcome him to any school here in Australia."

"We need more of those kinds of teachers helping our children and university students develop their critical thinking skills, which are so lacking these days."

Critical Tinyproxy Flaw Opens Over 50,000 Hosts to Remote Code Execution

More than 50% of the 90,310 hosts found exposing a Tinyproxy service on the internet are vulnerable to a critical unpatched security flaw in the HTTP/HTTPS proxy tool.

The issue, tracked as CVE-2023-49606 , carries a CVSS score of 9.8 out of a maximum of 10, per Cisco Talos, which described it as a use-after-free bug impacting versions 1.10.0 and 1.11.1, the latter of which is the latest version.

"A specially crafted HTTP header can trigger reuse of previously freed memory, which leads to memory corruption and could lead to remote code execution," Talos said in an advisory last week. "An attacker needs to make an unauthenticated HTTP request to trigger this vulnerability."

In other words, an unauthenticated threat actor could send a specially crafted HTTP Connection header to trigger memory corruption that can result in remote code execution.

According to data shared by attack surface management company Censys, of the 90,310 hosts exposing a Tinyproxy service to the public internet as of May 3, 2024, 52,000 (~57%) of them are running a vulnerable version of Tinyproxy.

A majority of the publicly-accessible hosts are located in the U.S. (32,846), South Korea (18,358), China (7,808), France (5,208), and Germany (3,680).

Talos, which reported the issue on December 22, 2023, has also released a proof-of-concept (PoC) for the flaw, describing how the issue with parsing HTTP Connection headers could be weaponized to trigger a crash and, in some cases, code execution.

The maintainers of Tinyproxy, in a set of commits made over the weekend, called out Talos for sending the report to a likely "outdated email address," adding they were made aware by a Debian Tinyproxy package maintainer on May 5, 2024.

"No GitHub issue was filed, and nobody mentioned a vulnerability on the mentioned IRC chat," rofl0r said in a commit. "If the issue had been reported on Github or IRC, the bug would have been fixed within a day."

Update: Fix Available

Users are advised to pull the latest master branch from git or manually apply the aforementioned commit as a patch on version 1.11.1 until Tinyproxy 1.11.2 is made available. It's also recommended that the Tinyproxy service is not exposed to the public internet.

Tinyproxy version 1.11.2 now available

Tinyproxy maintainers have officially released version 1.11.2 to fix the critical use-after-free bug that could enable an unauthenticated attacker to achieve remote code execution.
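The remediation guidance above reduces to a simple version comparison: anything below 1.11.2 should be patched (or the service kept off the public internet). A minimal sketch of that check, assuming plain dotted version strings; the helper functions here are our own illustration, not part of Tinyproxy or Talos tooling:

```python
# Rough check for CVE-2023-49606 (Tinyproxy use-after-free).
# The advisory names 1.10.0 and 1.11.1 as impacted and 1.11.2 as the fixed
# release; this sketch conservatively treats anything below 1.11.2 as
# unpatched.

def parse_version(version: str) -> tuple:
    """Turn a dotted version string like '1.11.1' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def is_vulnerable(version: str) -> bool:
    """True if this Tinyproxy version predates the 1.11.2 fix."""
    return parse_version(version) < parse_version("1.11.2")

print(is_vulnerable("1.11.1"))  # True  (latest release at disclosure time)
print(is_vulnerable("1.10.0"))  # True
print(is_vulnerable("1.11.2"))  # False (contains the fix)
```

Comparing integer tuples rather than raw strings matters here: as strings, "1.9.0" would sort after "1.11.2", but as tuples (1, 9, 0) < (1, 11, 2) compares correctly.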

COMMENTS

  1. The Use of Critical Thinking to Identify Fake News: A Systematic Literature Review

    The thematic discussion grouped and synthesised the articles in this review according to the main themes of fake news, information literacy and critical thinking. The Fake news and accountability discussion raised the question of who becomes accountable for the spreading of fake news between social media and the user. The articles presented a ...

  2. Opinion

    "The goal of disinformation is to capture attention, and critical thinking is deep attention," he wrote in 2018. People learn to think critically by focusing on something and contemplating it ...

  3. Distractions, analytical thinking and falling for fake news: A survey

    Analytical thinking safeguards us against believing or spreading fake news. In various forms, this common assumption has been reported, investigated, or implemented in fake news education programs.

  6. Fake News & Critical Thinking: Critical Thinking

    What is Critical Thinking? "Critical thinking is the disciplined art of ensuring that you use the best thinking you are capable of in any set of circumstances." Think critically about what you read, hear, see, and about how you absorb information. It's a good way to begin figuring out and interpreting fake news, alternative facts, post-truth ...

  7. Critical Thinking in the Age of Fake News: Developing Fairmindedness

    DiMatteo, T.(2019). Critical Thinking in the Age of Fake News: Developing Fairmindedness and Metacognition among Gifted High School Learners. (Doctoral dissertation). Retrieved from https://scholarcommons.sc.edu/etd/5640 This Open Access Dissertation is brought to you by Scholar Commons. It has been accepted for inclusion in

  8. Frontiers

    Researchers understand critical thinking as a tool and a higher-order thinking skill necessary for being an active citizen when dealing with socio-scientific information and making decisions that affect human life, which the pandemic of COVID-19 provides many opportunities for. ... Fake news covering all aspects of the pandemic spread rapidly ...

  9. Fighting Fake News!: Teaching Critical Thinking and Media Literacy in a

    Fighting Fake News! focuses on applying critical thinking skills in digital environments while also helping students and teachers to avoid information overload. According to a 2016 Pew Research report, we are now living in a world where 62% of people report that they get their "news" from social media.

  11. Critical Thinking and Fake News

    The bad news is that 'fake news' is often very believable, and it is extremely easy to get caught out. This page explains how you can apply critical thinking techniques to news stories to reduce the chances of believing fake news, or at least starting to understand that 'not everything you read is true'.

  12. 10 Ways to Spot Fake News

    Commission on Fake News and the Teaching of Critical Literacy in Schools (2018). Fake news and critical literacy: Final report. National Literacy Trust: UK. Dwyer, C. P. (2017). Critical thinking ...

  13. Who falls for fake news? Psychological and clinical profiling evidence

    Psychological interventions based on critical thinking may be useful to prevent fake news consumption. Abstract: Awareness of the potential psychological significance of false news increased during the coronavirus pandemic; however, its impact on psychopathology and individual differences remains unclear.

  14. How do you counter misinformation? Critical thinking is step one

    Critical thinking is step one: Planet Money. An economic perspective on ... They find that participants who watch the video are over 30 percent less likely to "consider fake news reliable." At the ...

  15. Full article: Combating fake news, disinformation, and misinformation

    Existing understandings and prior experiences, when appropriately accessed, benefit critical evaluation. By consulting valid understandings, people can interrogate incoming discourse to filter out misinformation and disinformation. ... Adriani (2019) argues that although public opinion is used to thinking about fake news as a tool used ...

  16. Critical thinking and information fluency: Fake news in the classroom

    As social media drives information dissemination based on popularity rather than accuracy, "fake news" is seemingly everywhere. Political fake stories get more press, but science fake stories are also proliferating. Not all scientific misinformation is fake, strictly defined (Oremus, 2016).

  17. Fake news detection on social media: the predictive role of university

    Critical thinking and fake news detection. CT is a functional, reflective, and logical thinking process employed by individuals before deciding what to do or what to believe (Ennis, 2000). In other words, CT can help individuals to make true and reasonable decisions about their actions or on the accuracy of information.

  18. Why we fall for fake news: Hijacked thinking or laziness?

    But research by Rand and colleagues challenges the idea that it's our reasoning that is biased. "The dominant explanation for why people believe fake news has been that their reasoning is held captive by partisan biases—their thinking gets hijacked," Rand says. His studies paint an alternate picture: "People who believe false things ...

  19. Reliance on emotion promotes belief in fake news

    The 2016 US presidential election and UK Brexit vote focused attention on the spread of "fake news" ("fabricated information that mimics news media content in form but not in organizational process or intent"; Lazer et al. 2018, p. 1094) via social media. Although the fabrication of ostensible news events has been around in media such as tabloid magazines since the early twentieth ...

  20. So What?

    Now, more than ever, all responsible citizens need the skills to identify accurate information and real facts. We should be able to think critically and determine which facts and information are accurate, and which might be Fake News or Misinformation: "False or inaccurate information, especially that which is deliberately intended to deceive."

  21. Come elections, critical thinking skills are key to fighting fake news

    In essence, critical thinking revolves around analysis and evaluation, and helps us to judge, understand, reason and, ultimately, determine fact from fiction. A very valuable tool as we head to ...

  22. PDF The Use of Critical Thinking to Identify Fake News: A Systematic

    Critical thinking, as a form of information literacy, provides a means to critically engage with online content, for example by looking for evidence to support claims and by evaluating the plausibility of arguments. The purpose of this study is to investigate the current state of knowledge on the use of critical thinking to identify fake news.

  23. Ways to improve your critical thinking

    Prioritize your tasks and don't bite off more than you can chew. Make sure that you are allowing yourself enough time to really focus on each of your projects and consider them critically. You may ...

  24. Teacher sacked after encouraging student to use critical thinking

    Teacher sacked after encouraging student to use critical thinking. May 14, 2024 - 5:17PM. Senior Sky News Australia reporter Caroline Marcus discusses the firing of a teacher who questioned a ...

  25. The Use of Critical Thinking to Identify Fake News: A Systematic

    The thematic discussion grouped and synthesised the articles in this review according to the main themes of fake news, information literacy and critical thinking. The Fake news and accountability discussion raised the question of who becomes accountable for the spreading of fake news between social media and the user. The articles presented a ...

  26. PDF The Use of Critical Thinking to Identify Fake News: A Systematic

    Critical Thinking, Fake News, Information Literacy, Systematic Literature Review. 1 Introduction. The information age has brought a significant increase in available sources of information ...