When we describe ourselves, we often claim to be rational, objective and Cartesian. Yet, to our surprise, simple tests prove the contrary: our choices, observations and reactions are, most of the time, distorted by “cognitive biases”, which, as their name suggests, skew our reasoning.
What is a cognitive bias?
A cognitive bias is an information-processing mechanism of the human mind that deviates from rational, logical thinking: it relies on prejudices, beliefs, past references and common opinions, and it distorts judgment.
The concept of “cognitive bias” was introduced in the 1970s by two psychologists, Daniel Kahneman (winner of the Nobel Prize in economics in 2002) and Amos Tversky, to explain the human tendency to make irrational economic decisions.
Since then, hundreds of cognitive biases have been identified in multiple areas.
These mental shortcuts are used systematically and unconsciously by our brain to quickly process the information available at a given moment, so that it fits our vision of the world (a vision which is, most often, widely shared: humans do not like to feel alone…).
According to researchers, some of these shortcuts proved useful for human survival in the wild, allowing people to band together and to order the world through beliefs, superstitions and generalities. Besides being reassuring, these shortcuts saved prehistoric humans energy, being less laborious than analytical reasoning.
But these simplifications prove ill-suited as the world grows more complex and a massive flow of superficial information and opinions of all kinds overwhelms us.
While novel situations call for carefully reasoned responses, it has been shown that recourse to unprovable arguments increases, particularly in times of crisis. This phenomenon can be explained in part by the feeling that our thought patterns, beliefs and ready-made ideas, of which our identities are largely constituted, are under threat.
While it is impossible to avoid all cognitive biases, knowing some of them can free us from their grip (family, educational, political, commercial, religious loyalties…). The less we resort to them, that is, the more time we take to reflect and observe, the more our capacity for discernment and our intelligence quotient (IQ) increase…
The blind spot (or blind-spot bias)
We might as well start with this one, the most difficult to fight, and for good reason! In anatomy, the blind spot is the small gap in the visual field where the optic nerve leaves the retina, a spot where the eye simply cannot see. In cognitive science, it refers to seeing cognitive biases in others but not in ourselves, and deducing that we are less subject to them.
“Why do you see the speck that is in your brother’s eye, but fail to notice the beam that is in your own?” In other words, we assess ourselves much more positively than we assess others. Researchers Pronin and Kugler called this phenomenon the introspection illusion.
The Dunning-Kruger effect!
The Dunning-Kruger effect, named after the psychologists who observed it, is the result of cognitive biases that lead, in many domains, the least competent individuals to overestimate their skills and the most competent to underestimate theirs. In other words, the less competent we are, the less capable we are of noticing it, which the two psychologists confirmed in their study.
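The pattern described above can be sketched with a toy simulation (my own illustration, not the psychologists’ data): if self-estimates are noisy and pulled toward the group average, the least skilled end up overestimating themselves and the most skilled underestimating themselves.

```python
import random

random.seed(0)

def self_estimate(skill):
    """Toy model (an assumption for illustration): people judge their
    own skill (on a 0-100 scale) with noise, and the judgment is pulled
    toward the group average of 50."""
    return 0.4 * skill + 0.6 * 50 + random.gauss(0, 5)

skills = [random.uniform(0, 100) for _ in range(1000)]

# Gap between self-estimate and actual skill, by quartile.
low  = [self_estimate(s) - s for s in skills if s < 25]
high = [self_estimate(s) - s for s in skills if s > 75]

print(round(sum(low) / len(low), 1))    # positive: the least skilled overestimate
print(round(sum(high) / len(high), 1))  # negative: the most skilled underestimate
```

The model is deliberately crude, but it reproduces the signature of the effect: the sign of the estimation error flips between the bottom and top of the skill range.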
This may explain why it is not always the most competent who lead us: they do not know how to sell themselves, which is no problem for the mediocre. As Michel Audiard put it: “Fools dare everything; that is even how you recognize them.” But beware: no generalizations!
Confirmation bias!
This bias consists in favoring information that resonates with our assumptions and preconceived ideas, without worrying about its logic or veracity. It often makes debates sterile: each participant seizes on whatever confirms their point of view (political, social, emotional, etc.), despite evidence to the contrary, which can even end up reinforcing each party’s beliefs.
This bias is also at work when we read a statement and interpret it according to what we “believe” we hear or “think” we will find there, rather than according to the information it actually contains. A very common mechanism: according to intelligence-quotient tests, it is at work in 90% of individuals.
The mere-exposure bias!
Exploited by advertising, media, fashion, propaganda and marketing, this bias results from repeated exposure to a thing or a person. The more we see a given object, the more we hear a given claim, the more we internalize it and the more our positive feeling toward it grows.
Conversely, something new, or a line of reasoning heard for the first time, will not be easily integrated. The brain does not like novelty, which requires more effort; it is more comfortable with automatic thought processes acquired through repetition.
The conformism (or conformity) bias!
Conformity bias refers to the tendency to rally to the opinion of the majority (regardless of its merits) to the detriment of our own private opinion. Experiments have shown that, faced with the same problem, an individual will make absurd decisions when aware of the majority opinion, and respond correctly when isolated.
According to studies from Princeton University, a fold in the cerebral cortex called the insula centralizes emotional information and is activated when an individual fears being marginalized by their group. This is why we sometimes have to wait for information to become commonplace before assimilating it.
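The experiments described above can be caricatured in a few lines of code (a hypothetical model of my own, not the original protocols): an individual who knows the right answer gives it when alone, but defers to a visible majority with some probability.

```python
import random

random.seed(1)

CORRECT = "A"

def answer(sees_majority, majority_answer):
    """Toy sketch (assumed 70% conformity rate, not a measured figure):
    the individual knows the correct answer but tends to follow the
    group whenever a majority opinion is visible."""
    if sees_majority and random.random() < 0.7:
        return majority_answer
    return CORRECT

# Same individual, same problem, with and without a (wrong) majority.
alone    = [answer(False, "B") for _ in range(1000)]
in_group = [answer(True,  "B") for _ in range(1000)]

print(alone.count(CORRECT) / 1000)     # always correct when isolated
print(in_group.count(CORRECT) / 1000)  # mostly follows the wrong majority
```

The only thing that changes between the two runs is whether the majority opinion is visible; the drop in correct answers comes entirely from conformity, not from ignorance.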
Authority bias!
It is the tendency to give near-absolute weight to the opinion of an authority figure: experts, parents, superiors, gurus, even when they contradict our own experience or make questionable, shocking or arbitrary decisions.
We all know the obedience experiment carried out by the psychologist Milgram in 1961: under medical authority, volunteers took part in a supposed scientific experiment that consisted of administering increasingly strong electric shocks, after each wrong answer, to an individual answering a questionnaire while connected to electrodes. Despite the victim’s screams of pain, two-thirds of the participants continued to inflict this torture.
In reality, the victim and the doctor were actors, the shocks were fake, and the aim of the experiment was to test individuals’ obedience when the request comes from an authority figure. It also illustrates the proverb “no one is a prophet in his own country”: the same information will be treated with suspicion if it comes from an “unofficial” source, but “swallowed” as truth if it comes from an official one.
The illusion of knowing!
This bias consists in referring to a known situation in order to grasp a reality that resembles it, without seeking to observe and analyze the differences and new elements that could contradict our beliefs and enrich our perception. The required analytical effort is abandoned in favor of habit, which seems more comfortable a priori, even if it leads to errors and catastrophes…
These few biases are among the most obvious, and we can learn to spot them in our own functioning. For if our brain is easily manipulated by them, it works much better when we protect it from their effects, as magnetic resonance imaging shows today.
Reference: https://www.boardofinnovation.com/blog/16-cognitive-biases-that-kill-innovative-thinking/