
Fallacies and biases – feedback letter


Dear fellow thinkers,

It is already a week since we explored fallacies and biases together. As you head off to the next steps of the journey, let me share a summary of what we discussed outside of the pre-determined agenda, along with a few thoughts to expand upon it.

First, I’ve typed up a transcript of your own categorization of thinking biases, which was created during the workshop based on the assignments. It is certainly incomplete, but it has the benefit of reflecting what you actually notice in your everyday environment. I took the liberty of giving the impromptu clusters the category names that seemed most appropriate, even if they are not always the official names of cognitive biases. The outcome is in the table.

Second, during the workshop you compiled a list of Dark Side beliefs. (On a side note, the notion of Dark Side Epistemology is borrowed from the compiled essays of Eliezer Yudkowsky, who does a good job of infusing rational thinking with some much-needed mythology. Viewed from his position, it is not this boring, unsexy thing for philosophers with too much time on their hands. When you struggle to improve your thinking, you are Walking the Path, or Fighting the Good Fight, or resisting the Dark Side.) Anyway, here is the full list from last Thursday:

#1 Evidence is there to confirm your predictions.
#2 What you see is all there is.
#3 Change is loss and loss is bad.
#4 Arguments are soldiers.
#5 Everything was better in the past.
#6 There is only one religion.
#7 Body = mind.
#8 Pain needs to be avoided at all cost.
#9 Shades of grey are too hard.

As you remember, the idea behind defining and discussing Dark Side beliefs was to create a memory device – something that could help you identify a pattern in your own or somebody else’s thinking that tends to distort the understanding of a given problem. So, next time you feel reluctant to even consider the positive qualities of something you strongly disagree with, you can mutter to yourself “ah, here it is again, arguments are soldiers” and do something different than usual. For that purpose you don’t really need to remember all nine of them – especially since there seems to be some overlap (e.g. #3 seems related to #8, #6 to #9). Maybe you’ll want to go as far as to compile your own, shorter list. Have fun!

Since this led to an interesting conversation, I want to focus for a while on #7: Body = mind. The logic for making this a Dark Side belief was, if I recall correctly, that our physical and emotional state (e.g. tiredness, physical pain, hunger, stress) tends to affect the quality of our thinking – first by draining away the resources required to make subtler distinctions and deal with uncertainty, and then by having us confuse signals from the body with valid intuitions about the problem we are trying to solve. Case in point: the study reported by Kahneman in which parole judges’ ratio of favourable decisions averaged around 65% right after a meal and then slowly dropped towards 0% just before the next break. Yet our body should not be seen as merely a source of disruptive interference to an otherwise rational mind. If you’re interested, check Antonio Damasio’s research around the concept of a somatic marker*, or explore the notion of embodied cognition, likely to be mentioned by Francis Heylighen in one of the upcoming lectures.

Finally, we’ve been toying with the idea of doing something about the fact that our minds tend to fall into biases. The most obvious way of dealing with this is to try and review our reasoning with common pitfalls in mind, and perhaps to replace intuitive judgments with more systematic analysis that makes use of logic, probability theory and the scientific method. This is what I referred to as the Path of the Guardian, though it might equally well have been called rational scepticism or (one flavour of) critical thinking. Another idea – exploring many possible points of view, remembering that we can only ever have knowledge from a certain perspective – was emphasized during the second day by Karin Verelst, and is likely to come back during the workshop on Lateral Thinking with Edward Nęcka.

And finally, what I referred to as the Path of the Trickster (or Hacker) is the strategy of designing decision-making environments in a way that makes use of, or even exploits, the known biases. This is the idea behind the program of “nudging” on a more global level, or the various “life hacks” offered as ways to change habits or make everyday decisions more rational. There is an obvious ethical question here (who decides what is rational?), but the topic does seem interesting and, to some extent, unavoidable in modern society. To give an obvious example: how is it possible to make more people care about the climate crisis? Is it ethical to ‘manipulate’ people into consuming less? If so, how can it be done without passing strict laws to limit consumption? Or, on a more personal note, should you ‘trick yourself’ into exercising more or saving more money for your future? If so, how? If this sounds like something you want to explore further, check out the book by Thaler and Sunstein** (for the global context) or the one by Baumeister and Tierney*** (for the personal, less thinking-oriented angle).

Thank you again for the time spent together – and till next time.

Maciej

 

References

* Damasio, A. (1995/2005). Descartes’ Error: Emotion, Reason, and the Human Brain. Revised Penguin edition, 2005.

** Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving Decisions About Health, Wealth, and Happiness.

*** Baumeister, R. F., & Tierney, J. (2012). Willpower: Rediscovering the Greatest Human Strength.