Business Book Blog: Foolproof
I've set myself a goal to blog about the books I read. I will extract wisdom from the best business and personal development books I've read and share it with you.
This time around, I’m reading “Foolproof: Why we fall for misinformation and how to build immunity” by Sander van der Linden.
About the book
Finding myself increasingly disturbed by the amount of misinformation I see online (and the number of people I see fall for it), I was intrigued to read this book. Can anything be done? How do I stop myself from falling for misinformation?
Foolproof by Sander van der Linden offers timely insights into why we’re so prone to believing falsehoods and what we can do to protect ourselves. The book delves into the psychological mechanisms—like cognitive biases and social influences—that make misinformation so persuasive. Van der Linden not only explains why we fall for these traps but also presents strategies to build resilience against them. From the importance of critical thinking and media literacy to innovative concepts like “prebunking,” Foolproof promises to equip readers with practical tools to navigate today’s complex information landscape. It also addresses the broader responsibility of media and social platforms in curbing misinformation.
Can any of this make a difference? Here’s what I learnt.
What did I learn?
The Motivated Brain
The book starts here because this is why misinformation can be so stubbornly persistent in our minds. The motivated brain refers to the idea that our cognitive processes are not always neutral; instead, they are often influenced by our desires, emotions, and pre-existing beliefs. When we encounter information, our brains are more likely to accept what aligns with our worldview while dismissing or forgetting what challenges it.
This selective processing makes misinformation particularly hard to dislodge. Even when confronted with factual corrections, the original false belief can persist.
Van der Linden likens misinformation to a virus: once it has taken hold in the mind, it’s difficult to eradicate. The brain’s motivation to cling to what we know and avoid cognitive dissonance means that people often hold onto false beliefs even after they have been corrected.
How Misinformation Spreads
Van der Linden extends the virus analogy to how misinformation spreads through digital platforms. Just as a virus exploits weaknesses in the body, misinformation exploits cognitive biases and emotions, making people more susceptible to believing and sharing it. Social media amplifies this spread, with algorithms favouring sensational or emotionally charged content, creating an environment where false information can easily go viral.
‘Echo chambers’ and ‘filter bubbles’ further accelerate this process by isolating individuals within networks that reinforce their beliefs, making misinformation harder to shake off.
Echo chambers refer to environments, often within social media, where individuals are primarily exposed to information that reinforces their existing beliefs. In these spaces, differing viewpoints are either absent or actively suppressed, leading to a feedback loop that amplifies and entrenches misinformation. People within echo chambers become more convinced of the false information they encounter because it’s constantly echoed back to them by like-minded individuals.
Filter bubbles are a related phenomenon where algorithms, designed to personalise online experiences, selectively present content that aligns with a user’s previous behaviour and preferences. This creates a bubble where people are less likely to encounter information that challenges their views, further isolating them from alternative perspectives. As a result, misinformation thrives, as people are continually exposed to skewed or false narratives without the opportunity for critical examination or rebuttal.
Six Degrees of Manipulation (DEPICT)
These are strategies that are used to fool people into believing the unbelievable.
Discrediting: This strategy involves discrediting factual information by diverting attention away from the truth or outright denying established facts. It seeks to confuse the audience and create doubt about what is real.
Emotion: Manipulators exaggerate or fabricate risks and dangers to provoke emotional responses rather than rational thought. For example, inflating the dangers of rare side effects from vaccines creates fear and resistance.
Polarisation: This tactic deepens divisions between groups, often on issues with strong ideological divides like abortion or immigration. By exacerbating these differences, manipulators weaken social cohesion and entrench opposing views.
Conspiracy: Seeding conspiracy theories aims to cast doubt on mainstream explanations for events, leading people to question legitimate sources of information and trust in institutions.
Impersonation: False credibility is given to misinformation by impersonating reputable individuals or organisations. This manipulation lends an appearance of authority to falsehoods, making them more convincing.
Trolling: This strategy involves provoking or baiting individuals, often through inflammatory or controversial statements, to exploit emotional reactions. Trolling can be used to stir up conflict or discredit people, especially in high-profile debates like Brexit.
Does this actually affect our behaviour?
For example, could this influence the way we vote?
For me, this was the most eye-opening part of the book.
Van der Linden reviews research on this subject, much of which suggests that no, misinformation online doesn’t influence the outcome of elections, because not enough people are exposed to enough of it to sway enough voters (not least because about 80% of the misinformation studied was shared by the same 1% of people, or ‘superspreaders’).
Van der Linden is sceptical of these claims, pointing out some factors missing from the research. The most significant is that the research did not consider the influence of microtargeting. It seems you can do a pretty good job of influencing people if you tailor your adverts to their personality type. The challenge is that you can’t target an advert by personality type directly. However, you can target someone on Facebook according to their likes, and their likes can tell you a lot about their personality type.
There is a lot of information in this chapter about Cambridge Analytica, Facebook, Steve Bannon and those dodgy personality tests, but the key detail for me is that combining scraped data with Facebook likes enables accurate microtargeting. With this data, a computer can predict your personality better than one of your colleagues can from just 10 of your likes. It needs just 70 likes to outperform your friends, 150 to outperform your family and 300 to outperform your spouse.
So, can all this swing an election? We don’t know. It can influence your views and shape your opinion of the candidates, which in an election with fine margins could make a difference, but there isn’t enough data to determine whether this could swing an entire election.
Prebunking
How do you combat this misinformation? Van der Linden suggests “Prebunking” is the answer.
Unlike traditional debunking, which involves correcting false information after it has spread, prebunking aims to inoculate people against misinformation before they encounter it. This strategy works by exposing individuals to a weakened version of the misleading tactic or false narrative in advance, along with an explanation of why it is misleading. By doing so, it equips them with the cognitive tools to recognise and resist misinformation when they encounter it in the future. Van der Linden likens this to a psychological vaccine, where forewarning individuals makes them more resilient to the “virus” of misinformation.
Prebunking is particularly effective because it addresses the root causes of susceptibility to misinformation, such as cognitive biases and emotional triggers, helping people develop a critical mindset that can filter out falsehoods before they take hold. Therefore, you can apply it to multiple types of misinformation.
You might be wondering how you deploy this method. An interesting output of Van der Linden’s work is an online game called ‘Bad News’. In the game, you are encouraged to become a spreader of misinformation. At each step, you are encouraged to deploy one of the six degrees of manipulation. The idea is that it helps you spot the manipulation techniques in action. Afterwards, you are tested on your ability to spot these techniques in realistic headlines. Having been exposed to the prebunking methods in the game, you should be fairly good at this. Van der Linden shares his experiences of testing different versions of the prebunking effect. It would seem that this approach works in the short term and, to a degree, in the long term too.
I played the game and I was OK at it. This is a good thing, I guess, because I wouldn’t take any pride in being a master manipulator. I got 5 of the 6 questions right at the end. In the real-world experiment, participants were assessed before and after playing; I don’t know how different my end score would be from my starting score.
You can play the game for yourself here: https://www.getbadnews.com/en
I also played a version of this game created for the UK government during COVID. You can play it here: https://www.goviralgame.com/en/
This one was shorter, and I wondered if it would be effective. I actually thought it was better. It doesn’t cover as many of the six degrees of manipulation as Bad News but being applied to a specific subject really helped the lessons hit home.
Herd Immunity
The good news is that, like a vaccine, ‘psychological inoculation’ also provides ‘herd immunity’. You don’t need everyone on board for it to be effective. If enough people are vaccinated, others in the community will be protected too. For this to work, the ‘vaccine’ needs to be able to scale and reach people around the world. This doesn’t have to be in the form of a game; it can also take the form of a video, a message or a conversation.
You can help achieve this by influencing friends and family. Van der Linden suggests opting for a ‘technique-based inoculation’ rather than a ‘fact-based inoculation’. Fact-based inoculation relies on refuting specific misinformation using facts and evidence. But someone who has already bought into the misinformation might not be receptive to this: it can start a confrontation (no one likes to be told that they are wrong) and you can easily get sidetracked. Instead, try ‘technique-based inoculation’, which tackles the overarching tactics used to spread misinformation. This also helps when tackling a range of falsehoods and reduces the chance of resistance. The effects are enhanced when you take an ‘active’ approach. An example of this is challenging someone to come up with their own counterarguments (something you can do for yourself). And like a medical vaccine, ‘booster shots’ can be administered over time to help maintain immunity.
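The herd-immunity idea is easy to see in a toy simulation. This is my own sketch, not something from the book, and the numbers (population size, contact rate, spread probability) are made up purely for illustration: misinformation spreads through random contacts, ‘inoculated’ individuals resist it and never pass it on, and inoculating a majority dramatically shrinks the outbreak even though a large minority remains unprotected.

```python
# Toy illustration of the virus/vaccine analogy (my sketch, not from the book).
# States: 'S' = susceptible, 'I' = believes and spreads the misinformation,
# 'V' = psychologically inoculated (resists it and never spreads it).
import random

def simulate(pop_size, inoculated_frac, contacts_per_round, rounds, spread_prob, seed=0):
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    people = ['V' if i < int(pop_size * inoculated_frac) else 'S'
              for i in range(pop_size)]
    for i in range(pop_size - 5, pop_size):
        people[i] = 'I'  # seed five initial spreaders
    for _ in range(rounds):
        spreaders = [i for i, state in enumerate(people) if state == 'I']
        for _ in spreaders:
            # each spreader makes a few random contacts per round
            for _ in range(contacts_per_round):
                j = rng.randrange(pop_size)
                if people[j] == 'S' and rng.random() < spread_prob:
                    people[j] = 'I'  # a susceptible contact is taken in
    return sum(state == 'I' for state in people)

no_inoculation = simulate(1000, 0.0, 3, 10, 0.3)
majority_inoculated = simulate(1000, 0.6, 3, 10, 0.3)
print(no_inoculation, majority_inoculated)
```

With nobody inoculated, the falsehood saturates most of the population; with 60% inoculated, the remaining 40% see far fewer ‘infections’ than 40% of the first run, because the inoculated majority breaks the chains of transmission. That is the herd-immunity effect in miniature.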
What did I think of the book?
This is one of the most interesting books I have read in a long time. It’s a well-laid-out description of why and how we are susceptible to misinformation and what we can do to inoculate ourselves. It covers a wide range of studies and shares the research that went into creating and testing the ideas in the book. For this reason, I found reading it hard going. That’s not a reflection of the writing style, which is friendly and in plain English despite covering so much academic research. It is more a reflection of the sheer amount of information presented.
I freely admit that I found the virus/vaccine analogy overplayed to begin with, but as I got further into the book I was sold. It turns out this wasn’t a gimmick to make the book accessible to us ‘plebs’; it is a genuinely relevant analogy that helps us remember and apply the ideas of psychological inoculation.
This is an interesting book on an important subject. I would recommend you read this and then share it with your friends and family.