Here we've listed some common tactics used by disinformers to influence, confuse, and do harm. Knowing about these tactics can help you recognize them in information you consume on social media or elsewhere.
The content / claim comes from a person or group with inadequate or unrelated credentials.
For example, Dr. Ben Tapper (Facebook) is a chiropractor from Nebraska who has repeatedly spread disinformation on social media about vaccines, despite having no relevant training or expertise (Sources: 1, 2).
Cherry picking is choosing only data (or research) that supports your argument while neglecting or ignoring other data or research.
“For example, from 1991 to 2015, of the 54,195 peer-reviewed publications on anthropogenic global warming (AGW), 31 were at odds with AGW. An example of cherry-picking is focusing on any or all of the 31 dissenting articles in the face of the 99.94% of publications supporting AGW” (Source: 3, p. 3).
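The 99.94% figure follows directly from the counts quoted above; a quick sketch of the arithmetic:

```python
# Counts quoted above: peer-reviewed AGW publications, 1991-2015.
total_publications = 54195
dissenting = 31  # publications at odds with AGW

supporting_share = (total_publications - dissenting) / total_publications
print(f"{supporting_share:.2%}")  # → 99.94%
```

Focusing only on the 31 dissenting papers hides the other 54,164, which is exactly what makes cherry-picking misleading.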
This type of misrepresentation involves taking data, statistics or ideas and applying them to unrelated contexts or making inappropriate connections.
A common example is the use of “zombie statistics” which are “numbers that are cited badly out of context, are sorely outdated, or were entirely made up in the first place” (Source: 4, p. 101). Zombie statistics are used to make an argument appear reliable or legitimate.
For more, check out this post on data misrepresentation (lack of context) by ScienceUpFirst.
Fake Experts - People will claim they have credentials (e.g. degrees) or experience they do not have, in order to appear to be legitimate experts.
Fake Identity - Impersonation of real experts or social media accounts: often by using similar URLs or account names, hijacking hashtags, or copying the appearance of the real account.
Fake Data - Making up data to “prove” a point or argument. This could include images, research data, and more (e.g., Andrew Wakefield et al. deliberately falsified data in the now-retracted 1998 paper that wrongly connected vaccines to autism).
Creating Fake Controversy (also called Doubt Mongering) - Fabricating doubt or uncertainty so that you think more information or research is needed before you can make a decision, even when significant evidence is already available.
Creating Fake Support - the more times you see a piece of information (regardless of whether it’s true or not), the more likely you are to believe it or think it’s legitimate (Source: 8, 9). So, disinformers will spread their fake / misleading content everywhere, often creating multiple websites / accounts, assuming fake identities, etc. For more, check out these posts on firehosing and astroturfing from ScienceUpFirst.
Disinformation often surrounds topics that challenge people's beliefs and serve as emotional triggers. Sources of disinformation often appeal to emotions through storytelling and / or the use of language, images, or video that incite fear, shock, anger, outrage, disgust or other strong emotions. These strong emotions may promote biased reasoning (Source: 5,6).
For example, there was a 2019 story circulating about how a child was abducted from a bathroom at Canada’s Wonderland, and was (allegedly) found later wearing different clothing. The story was completely fabricated, but spread widely because of the emotional context of the story (Source: 7).
If a scientist / source is going straight to the public with their claims before first going through peer review, then we need to be skeptical about the information being presented.
The peer review process is an important part of how scientific research is evaluated for quality, and skipping peer review is a huge red flag.
Logical fallacies are examples of poor or flawed reasoning. There are many different types of logical fallacies; a few common examples seen in disinformation are listed here:
ad hominem: attacking scientists or individuals, not their arguments.
causal fallacy: assuming correlation means causation.
red herring: a misdirection aimed to lead you away from the true issue.
false choice (false dichotomy): making it seem as if there are only two possible options, in order to polarize and oversimplify a complex issue.
single cause: assuming that something is the result of a single cause, when there might be multiple causes or reasons.
A common disinformation technique is holding scientific research or findings to impossibly high standards while failing to recognize that scientific knowledge is durable but open to revision.
For example, expecting near 100% effectiveness of a treatment / medication or expecting that there should be no reported side effects.
Claims of conspiracy are another common disinformation tactic.
Some may claim the scientific community is conspiring to withhold ‘the truth’ (i.e., the disinformer’s point of view); these claims “ignore the size and diversity of the scientific community” (Source: 3, p. 4), whose focus is on understanding the world and conducting high-quality research according to the standards within their discipline.
Sources:
Associated Press. (2021, October 9). Anti-vaccine chiropractors capitalizing on Covid and sowing misinformation. The Guardian. https://www.theguardian.com/us-news/2021/oct/09/anti-vaccine-chiropractors-covid-sow-misinformation
Center for Countering Digital Hate. (2021). The disinformation dozen: Why platforms must act on twelve leading online anti-vaxxers. https://counterhate.com/wp-content/uploads/2022/05/210324-The-Disinformation-Dozen.pdf
Clough, M. P., Herman, B.C., and Taylor, J. (2023). Features of science misinformation and disinformation efforts: Understand how to detect false information. Story Behind The Science. https://www.storybehindthescience.org/_files/ugd/790356_8b7e11c68d744df2b6ec39603e44963f.pdf
Bergstrom, C. T., & West, J. D. (2021). Calling bullshit: The art of skepticism in a data-driven world. Random House Trade Paperbacks.
Martel, C., Pennycook, G., & Rand, D. G. (2020). Reliance on emotion promotes belief in fake news. Cognitive Research: Principles and Implications, 5, 1-20.
Weeks, B. E. (2015). Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. Journal of Communication, 65(4), 699-719.
Pennycook, G., Martel, C., and Rand, D. (2019, September 12). Knowing how fake news preys on your emotions can help you spot it. CBC. https://www.cbc.ca/news/canada/saskatchewan/analysis-fake-news-appeals-to-emotion-1.5274207
De Keersmaecker, J., Dunning, D., Pennycook, G., Rand, D. G., Sanchez, C., Unkelbach, C., & Roets, A. (2020). Investigating the robustness of the illusory truth effect across individual differences in cognitive ability, need for cognitive closure, and cognitive style. Personality and Social Psychology Bulletin, 46(2), 204-215.
Flanagin, A. J., & Metzger, M. J. (2013). Trusting expert-versus user-generated ratings online: The role of information volume, valence, and consumer characteristics. Computers in Human Behavior, 29(4), 1626-1634.