How cognitive biases shape our choices by affecting judgement and persuasion - wisdom for mental health.
- May 29, 2023
- 11 min read
It's not what happens to you, but how you react to it that matters.
Epictetus
Many of our decisions are made hastily or poorly due to cognitive biases. This blog explains several that affect our judgement and persuasion, all of which are well worth knowing. So, without further ado...
Cognitive biases (judgement)
One of the best-known biases is the sunk cost fallacy, where people overlook poor results from investments and tend to continue an endeavor once an investment of money, effort or time has been made. In other words, they avoid 'cutting their losses', since turning a defeat into a victory seems more glamorous and worthy of risk than accepting a smaller defeat. It can also be summed up as 'throwing good money after bad', and it especially affects those prone to gambling addiction and the most stubborn and/or optimistic-hearted.
An equally well-known bias, popularised by Kahneman, Lovallo and Tversky, is the planning fallacy, defined as 'the tendency to underestimate the time, costs, and risks of future actions and at the same time overestimate the benefits of the same actions.' It is a stereotype that builders hardly ever complete works according to their planned construction times, but the planning fallacy has been shown to apply to all people doing pretty much all types of tasks. Canadian researchers showed in 1994 studies that (unsurprisingly) 'over 70% of students finished their assignment later than they had predicted they would, with the average time taken being over 55 days compared to the average prediction of 34 days.' The researchers then found 'a similar number of people underestimated the time it would take to complete a task, regardless of whether it was an academic piece of work or an everyday activity, such as cleaning their apartment or fixing their bike.' Again unsurprisingly, people were more likely to be better planners when there were deadlines and consequences for failure. Most interestingly though, showing that the planning fallacy is generally provoked by a lack of realism rather than laziness, the Canadian researchers found that '60% of students who spent time actively recalling similar previous tasks were more likely to accurately predict how long the next task would take. Students who did not use this technique were only right 29% of the time.'
The anchoring effect is another cognitive bias that unfortunately messes with our lives and decisions. Essentially, people tend to rely too heavily on the first piece of information offered (the 'anchor') when making decisions. We are first told x is good for whatever reason (y). Later, we find out that x is also good for z but bad for q and w, and yet we fail to properly weigh up the reasons to buy or not to buy x, leaving us slow to react to the reality of the situation and likely to overbuy x. To compound matters, when people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when those arguments are weak.
The availability heuristic is a mental shortcut that involves estimating the probability or risk of something based on how easily examples come to mind and on familiarity. In the 1990s, US media coverage of drug use was extremely high, as emphasised by Bill Clinton's 1994 crime bill, which saw a massive increase in incarceration time for minor drug use. Russell Eisenman's 1993 study on the availability heuristic shows how the media had affected the beliefs of average college students, who were asked to predict whether drug use had been rising or falling over the previous seven years. Given the chance to say that they weren't sure, 70% stated that drug use had risen, some that it was skyrocketing, when in fact it had been decreasing (according to reputable survey data from thousands of respondents to the National Household Survey on Drug Abuse at the time). It's worth bearing in mind that the Center for Media and Public Affairs revealed that crime coverage from 1990 to 1998 was the number-one topic on nightly news, and whilst homicide rates halved nationwide, stories about them on the three most popular news networks rose almost fourfold - almost certainly provoking false paranoia in many US citizens.
Similarly, the Mere Exposure Effect explains how, if something sounds familiar, it is more likely to be accepted as truth, and how the human psyche tends on average to favor the familiar when making choices, making repetition particularly manipulative. This applies even when the experience is negative or neutral: the more a person has an experience, however negative it was at first, the more likely they are to perceive it positively and prefer to repeat it at the cost of other experiences. For instance, several studies have shown that the more people are exposed to unfamiliar music, the more positively they tend to rate it (Bornstein and Lemly, 2017), and equally that the more a picture of a person is shown to someone, the more likely that person is to rate them positively or say they are suitable for a position of power (Zajonc, 1968).
Our preference for AI on the grounds of reliability is often a huge mistake, as anything designed by humans, such as supervised machine learning, is susceptible to human error too. Although the algorithms themselves may be objective, the information these computations are based on is anything but. AI is trained to detect patterns that exist only within the datasets that humans, prone to cognitive biases, provide it with. Worse, machine learning amplifies those cognitive biases even more than usual, because people mistakenly prefer it, assuming that its algorithms are resistant to prejudiced predictions, which makes us more prone to wrongly accepting AI answers as the absolute truth.
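To make that point concrete, here is a minimal sketch, using entirely made-up 'hiring' data and a deliberately simple stand-in for a learner (not any real system, dataset or study), of how a supervised model simply reproduces whatever prejudice its training labels contain:

```python
# A toy illustration (hypothetical data): the bias lives in the labels, not the code.
import random

random.seed(0)

# Historical "hiring" records: equally qualified candidates in both groups, but
# past reviewers approved group B far less often.
def make_record(group):
    qualified = random.random() < 0.7             # same qualification rate for both groups
    approve_rate = 0.9 if group == "A" else 0.4   # biased historical decisions
    approved = qualified and random.random() < approve_rate
    return {"group": group, "qualified": qualified, "approved": approved}

history = [make_record(g) for g in ("A", "B") for _ in range(1000)]

# "Training": estimate the approval probability per group, standing in for any
# learner that picks up group membership as a predictive feature.
def approval_rate(group):
    rows = [r for r in history if r["group"] == group]
    return sum(r["approved"] for r in rows) / len(rows)

model = {g: approval_rate(g) for g in ("A", "B")}
print("Learned approval rates:", model)
# Typical output: group A around 0.63, group B around 0.28 - the model has
# 'objectively' learned the human prejudice and will apply it to every future candidate.
```

The code and the 'learning' step are perfectly neutral; the skew comes entirely from the historical decisions it was fed, yet its output tends to be trusted precisely because it came from a machine.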
The Einstellung Effect was coined by Abraham Luchins in 1942 and showed how the mind loses insight, creativity and vision after practising one technique to solve a problem, meaning it will use the same technique to solve a separate problem even when there is an easier and more obvious way to solve it (a way adopted straight away by people who had never learnt any previous technique for solving similar problems). In short, the Einstellung Effect demonstrates how past experience may hinder our ability to find more efficient solutions to problems. In his experiment, Luchins compared how an experimental group of children (aged 9 to 12), given practice problems followed by critical test problems, performed in comparison with a control group (of the same age) who were never given the practice problems. The problems centred on measuring out units of water using different-sized jugs, but the trick was that the critical test problems could be worked out in an easier way than the practice problems. Although over half the experimental group did use the easier method once they were given the warning 'Don't be blind', without that prompting they largely stuck to the less practical, harder method they had used in the practice problems. By contrast, the control group used the easier method without prompting. Luchins found that stressful speed testing increased rigidity and that children of all IQs could fall victim to or resist the Einstellung bias (though those with higher IQ were very slightly more cognitively flexible). Later experiments (by Luchins and by others) demonstrated conclusively that older adults were far more susceptible to this cognitive bias than younger adults, and correlations have been reported suggesting that women and Westerners are more vulnerable than men and non-Westerners (for instance, in Luchins' original test, girls were less influenced by the warning 'Don't be blind' and continued pursuing the harder, tried-and-tested method).
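For a feel of the task itself, here is a minimal sketch of Luchins-style water-jar problems. The jug sizes are illustrative examples of the pattern rather than a claim about his exact materials, and the 'solver' is just the two arithmetic recipes, not a model of the children:

```python
# Practice problems are only solvable with the long recipe B - A - 2C;
# the critical problem also yields to the much shorter A - C.
practice_problems = [
    {"A": 21, "B": 127, "C": 3, "target": 100},
    {"A": 14, "B": 163, "C": 25, "target": 99},
]
critical_problem = {"A": 23, "B": 49, "C": 3, "target": 20}

def long_method(p):   # the practised, 'Einstellung' routine
    return p["B"] - p["A"] - 2 * p["C"]

def short_method(p):  # the simpler route the unpractised control group spots
    return p["A"] - p["C"]

for p in practice_problems + [critical_problem]:
    print("target", p["target"],
          "| long recipe works:", long_method(p) == p["target"],
          "| short recipe works:", short_method(p) == p["target"])
```

Both recipes solve the critical problem, yet the practised group kept reaching for the longer one; that rigidity, rather than any lack of ability, is the Einstellung Effect.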
Similarly, Functional Fixedness was coined by Karl Duncker in 1945 and describes how an individual who has previously used an object in one way suffers an impaired ability to discover a new use for that object in another context. In essence, the individual is often left using an object only in the way they have traditionally used it, even when it has multiple uses that other individuals can see more easily. Duncker explained that the phenomenon occurs not only with physical objects, but also with mental objects or concepts (a point which lends itself nicely to the Einstellung Effect).
Lastly, Francis Bacon summed up the popular and widely demonstrated Confirmation Bias nearly perfectly all the way back in 1620, when he wrote in a book on philosophy and scientific method that 'The human understanding when it has once adopted an opinion (either as being the received opinion or as being agreeable to itself) draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusions may remain inviolate.' Numerous studies have shown that people tend to search for information that supports their existing beliefs rather than information that would question them, and that they then weigh information unequally, giving more importance to points that defend their original beliefs even when those points are more superficial. Most alarmingly, studies have shown that the scientific community, which arguably depends most on being free from such bias, is in fact particularly vulnerable to it, helping to explain the sheer volume of conflicting research and flawed peer reviews.
The Confirmation Bias, Einstellung Effect and Functional Fixedness all stress the importance of open-mindedness and of taking great care with our beliefs and methods; they show that we do better at reasoning tasks when the logical conclusion is consistent with our beliefs about the world, and that when it is not, we make more mistakes or are less innovative as a direct consequence. Once this is understood, Ralph Waldo Emerson sounds less mad for having said 'to a sound judgment, the most abstract truth is the most practical'.
Cognitive biases (persuasion)
Likewise, it has been shown that persuasion also depends on the ease with which the mind can assimilate the information provided, rather than on the actual quality, truthfulness or depth of the content itself. Rather like how people are so easily won over or put off by first impressions when meeting someone, if we read a point of view, our mind may decide whether it is credible or intelligent based on factors such as its legibility and presentation, which colours were used, whether it was rhythmically expressed, whether over-complex words were used, or whether its sources had names that were difficult to pronounce; all rather than the point (and its references) itself.
This also explains why narratives are so persuasive, and therefore often so effective at getting someone to do something they were reasonably reluctant to do, much more so than merely requesting that they do it. Narratives are easier to absorb, but above all they are usually emotive. Passion, interest and care are vital for a person to be affected at all by any argument, and even to be capable of discerning stronger arguments from weaker ones. We are fairer when we are emotional. Petty and Cacioppo's 1979 study demonstrated that the more relevant an issue was and the more 'personally' involved a subject was with it, the more the subject was affected by arguments relating to it - and in the way you would hope: weak arguments against their initial position made people more entrenched, but strong arguments resulted in a shift of position. By contrast, both strong and weak arguments had no effect on subjects who weren't emotionally connected with the issue they were presented with. Dispassionate judgement is a severe bias, as it is notoriously more stubborn, poorer and less empathetic.
Just as with emotional and dispassionate judgement, it has been shown that people need time to reflect on strong arguments, otherwise they will simply reject them as if they were weak arguments. In essence, people are more persuadable if they are given Deliberation Time. Harvard and Stanford psychologists Unger, Paxton and Greene asked subjects to consider a scenario involving incest between consenting adult siblings, a scenario 'known for eliciting emotionally driven condemnation that resists reasoned persuasion'. They demonstrated that people could soften their condemnation, but that it depended on both deliberation time and argument strength, not just argument strength. They concluded from their experiment (supported by related separate ones they conducted at the same time) that 'A strong argument defending the incestuous behavior was more persuasive than a weak argument, but only when increased deliberation time encouraged subjects to reflect'. In the short run, our minds are made up and arguments are futile (hence the expression 'only time can heal'); we are biased to react poorly with snap decisions, even to persuasive arguments, if not given actual time to reflect, for our first instincts are always rigid.
Purposeful Priming can seriously affect our personality. For instance, a 2006 study by Vohs showed that participants who had been repeatedly reminded of money through deliberate money cues were less helpful towards others, and also tended to ask for help less, than participants who had not been primed. They tended to prefer 'to play alone, work alone, and put more physical distance between themselves and a new acquaintance', in the words of Vohs. Moreover, socially, it was found that exposing people to the idea of money resulted in them expressing less disapproval of social inequality and endorsing more unregulated free-market systems.
The Barnum effect was named after the great American huckster P. T. Barnum, who revealed that when trying to deceive someone, he could normally count on the other person to meet him halfway. He found that the person would often join in the effort to deceive themselves, because the information they were presented with fitted neatly with their preconceived opinions or because it was flattering. For example, he considered that if a fortune teller correctly tells you a positive story about a beloved relative, you would be much more likely to believe that you need to buy, from that same fortune teller, a special candle to protect the relative from a foreseen bad spirit, than if the fortune teller had simply told you about the bad spirit.
Stanley Milgram's famous electric shock experiment, and its 17 other similar variations, unanimously showed what he called the Perils of Obedience. He demonstrated how ordinary civilians can commit sadistic and torturous acts out of sheer obedience, simply by following an authority figure's orders. The experiment, carried out at Yale, was inspired by the Nuremberg war criminal trials and the defence most commonly made by ordinary civilians turned Nazis: that they were just following the orders of their superiors.
The original experiment was fascinatingly thought out: the subject performed the role of a teacher to a learner, who was in reality an actor who feigned pain at being electrocuted and who made deliberate mistakes when the subject tested their verbal memory (pairing words) and offered multiple-choice options to aid recall. The subject/teacher was told to administer an electric shock every time the learner made a mistake, increasing the level of shock each time by 15 volts, up to 30 times. On the shock generator these 30 switches were marked from 15 volts (slight shock) to 450 volts (danger - severe shock). However, whenever a subject showed defiance in performing his teacher role, he was ordered by the experimenter to continue, sequentially with four prods:
Prod 1: Please continue.
Prod 2: The experiment requires you to continue.
Prod 3: It is absolutely essential that you continue.
Prod 4: You have no other choice but to continue.
Two further prods could also be used if and when thought necessary:
Extra Prod 1: Although the shocks may be painful, there is no permanent tissue damage, so please go on.
Extra Prod 2: Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on.
The results were astounding and conclusive. In the first experiment, two-thirds of the teachers were completely obedient and continued to the highest level of 450 volts (marked 'danger - severe shock') despite the cries of the learner, whilst all of the subjects continued to at least 300 volts. The Perils of Obedience demonstrate how obedience to authority leads us to ignore our compassionate human side and disregard the suffering that the orders can cause, even when it is in plain sight.
References
(1) Thinking, Fast and Slow, Daniel Kahneman, 2012
