Fallacies and biases

Dear fellow thinkers,

It is my pleasure to welcome you to the second Thinking Toolkit module, entitled ‘Fallacies and biases’. This letter is intended to help you understand the logic behind it, and to guide you through the assigned exercise. Please read it carefully, as it contains some hints on how to make the best possible use of the time you spend exploring the module.

First and foremost: if you have studied the syllabus, you may have noticed that the module title stands out from the others. While the Thinking Toolkit module entitled ‘Systems thinking’ can reasonably be expected to contain tools that assist thinking in a more systemic way, you are probably not looking forward to learning how to commit more thinking fallacies. How, then, is exploring faulty thinking going to contribute to your toolbox? Well, I’m glad you asked! Here are at least three important ways.

  • Understanding faulty thinking helps us become aware of how our thinking actually works. Like it or not, our brains and minds do not seem to be the Universal Rational Thinking Machines we sometimes fantasize them to be. Objective, infallible and reliable reasoning does not seem to be their default output. If we permit ourselves to speak of defaults at all, they may be more related to survival, comfort and pragmatic efficiency than to certainty, logic or truth. As a result, whenever we try to do some thinking, we are likely to deviate from the cast-iron ideals of rationality. More interestingly, this does not happen at random – it is not a case of ‘true signal + accidental distortions’. Our thinking often goes wrong in a systematic, predictable way, so by observing our tendency to make certain mistakes we can infer a lot about the processes that produced them. We can then ask: what does this mistake reveal about how I habitually think? Is this a generally useful strategy that has merely misfired in unfamiliar conditions? What if it has actually worked, but its purpose is something other than being ‘logical’? This is useful, because it allows us to notice that we do not need any formal training to be constantly using thinking tools – and that those tools are not universally effective.
  • Understanding fallacies and biases helps us challenge our conclusions. This body of knowledge provides the background for more powerful critical thinking. Once you have abandoned the assumption that things which seem clear and obvious to you must necessarily be true, and identified some of them as examples of poor reasoning, you become somewhat more resistant to them next time. Make no mistake – your thinking will continue to be biased in many ways. It’s just that when you decide to examine it closely, you’re more likely to find and correct certain shortcomings.
  • Understanding fallacies and biases helps us design our environment. Knowing that people – including ourselves – are prone to certain mistakes makes it possible to design smarter environments for reasoning and decision-making. Instead of noticing and correcting those mistakes after the fact, we can sometimes count on them being made and adjust the conditions accordingly. To give a simple example: if you know that people tend to make estimates by relating them to a salient ‘baseline’, you can make things seem bigger or smaller just by changing the reference point. In other words, if you want to eat less, consider buying smaller plates. This last line of reasoning gives rise to some interesting ethical questions – when exactly is it right to influence other people’s thinking, and consequently their decisions and actions? Still, here’s your third ‘toolbox’ perspective: using knowledge of fallacies and biases to design environments that promote certain thinking outcomes.

All these benefits of exploring faulty thinking depend on its link to our everyday life. People are capable of making unique mistakes for a multitude of reasons, but it’s not the bizarre and spectacular examples we’re looking for. We should be asking: how does thinking usually go wrong? What are examples of errors commonly made in meaningful areas of life by smart, well-educated people such as ourselves? The catch, obviously, is that this question is very hard to answer. One problem with biased thinking is that we’re usually not aware of being biased. A useful workaround, then, would be to start by noticing biases in other people, then to assume we must have similar biases, and finally to recognize and correct them where necessary.

This reasoning lies at the foundation of your pre-workshop assignment. Your task will be to find, describe and attempt to understand an example of faulty thinking relevant to your environment. To the extent that it’s possible, choose something that is:

  • related to faulty thinking, not lack of knowledge (i.e. how we process information, not what information we have);
  • typical (i.e. it tends to recur rather than having happened just once);
  • important (i.e. being right or wrong here can have some real-life consequences);
  • surprising (i.e. otherwise rational people tend to act in a seemingly irrational way).

Don’t start by reading about biases or fallacies and then looking for real-life examples. Instead, think about what puzzles you in other people’s thinking and work from there. If you prefer to use yourself as an example, you are welcome to do so.

Either way, I’m sending you a list of questions (click and download here) to assist your reflection on what you have noticed and to make it more systematic. It is put together in a one-page document, which will make it easier to share and exchange observations during the workshop – and for me to react and refer to it after I’ve read your contribution. Please fill it out and send it by e-mail to maciej.swiezy@wszechnica.uj.pl no later than Sunday, October 20th, so I can read it before we meet. See you soon!