Cognitive Biases and Irrationality


Stone age brains in a silicon age

We are in possession of the most complex structure known in the universe, the human brain. This jelly-like mass contains a mind-boggling 86 billion or so neurons and produces our every thought, action, memory, feeling, and experience of the world. It allows us to recognize and organize complex patterns, to plan for the future, to interact with others, to communicate via language, to acquire and transmit knowledge, and to think abstractly beyond our concrete circumstances. In short, it allows us to perform those tasks uniquely associated with human ingenuity.

But the human brain is also a liar that relentlessly sabotages our success. For all its complex brilliance, the brain stumbles into a number of cognitive traps with astonishing regularity. The more often it does so, the less often we are able to achieve what we truly want.

Most of the time we are not even aware of falling into these traps, as it is almost impossible to view the way we think with any objective distance. Consider optical illusions. We perceive squares A and B on the checkerboard below to be different shades of grey. Even after referring to the diagram on the right, which demonstrates that they are identical shades, it is very difficult to overcome the initial perception of difference. Intellectually, we can accept that the squares are identical, but even after much reassurance, it remains hard to internalize that knowledge. In the same way that our brain falls prey to optical illusions, it can equally mislead us via ‘thinking’ illusions, which are as difficult to notice, acknowledge, and avoid as their visual counterparts. Such ‘errors’ in thinking are called cognitive biases.

[Figure: the checker-shadow illusion, alongside a proof that squares A and B are the same shade of grey]
Cognitive biases are systematic deviations from the standard of rationality — they are patterns of thinking that systematically lead us to acquire inaccurate beliefs and make inadequate decisions. They are the gaps between the descriptive model of how we think and decide and the normative model of how we ideally should think and decide. Therefore, if we want to become better at achieving our goals, we should try to overcome cognitive biases.
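To make the gap between the descriptive and the normative model concrete, here is a toy sketch of my own (not from the article, and with purely illustrative numbers): a normative agent updates its belief about a coin with Bayes’ rule, while a confirmation-biased agent partly ignores any evidence that would lower its current belief. Fed perfectly balanced evidence, the normative agent correctly comes to favour the fair-coin hypothesis, while the biased agent drifts towards the biased-coin hypothesis.

```python
# Toy illustration (assumption: deciding whether a coin is heads-biased,
# with P(heads) = 0.7, versus fair, with P(heads) = 0.5).

def update(belief, toss, discount=1.0):
    """One belief update. discount < 1 models confirmation bias:
    evidence that would lower the current belief is partly ignored."""
    p_biased = 0.7 if toss == "H" else 0.3  # P(toss | heads-biased coin)
    p_fair = 0.5                            # P(toss | fair coin)
    posterior = belief * p_biased / (belief * p_biased + (1 - belief) * p_fair)
    if posterior < belief:  # disconfirming evidence for the current view
        posterior = discount * posterior + (1 - discount) * belief
    return posterior

def final_belief(tosses, discount=1.0):
    belief = 0.5  # start undecided between the two hypotheses
    for toss in tosses:
        belief = update(belief, toss, discount)
    return belief

tosses = "HTHTHTHTHT"  # perfectly balanced evidence
print(final_belief(tosses, discount=1.0))  # normative agent: ends below 0.5
print(final_belief(tosses, discount=0.5))  # biased agent: ends above 0.5
```

Note the systematic character of the error: the deviation is not random noise but a predictable drift in one direction, which is exactly what distinguishes a bias from a mere mistake.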

There are hundreds of cognitive biases that have already been discovered and examined by psychologists and behavioral economists. We value things more highly simply because we own them (endowment effect); we can’t quit after investing a lot (sunk cost fallacy); we maintain the current state of affairs even when change would drastically improve our lives (status quo bias); we are overconfident with regard to our own abilities (overconfidence bias); and we have the tendency to search only for evidence that confirms — instead of falsifies — our hypothesis (confirmation bias), to name only a few biases.
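The sunk cost fallacy in particular lends itself to a worked example. The following sketch (my own illustration; the project names and numbers are invented) shows the normative rule: a rational choice depends only on future costs and benefits, because money already spent shifts every option by the same constant and so can never change the ranking.

```python
# Toy sketch: a rational agent compares only *future* costs and benefits.

def best_option(options, sunk_cost=0.0):
    """Pick the option with the highest expected future value.

    sunk_cost is accepted only to show that it never changes the
    answer: the ranking of options is independent of it."""
    def future_value(opt):
        # expected payoff minus the cost still to be paid
        return opt["payoff"] * opt["p_success"] - opt["remaining_cost"]
    return max(options, key=future_value)

options = [
    {"name": "finish doomed project", "payoff": 100, "p_success": 0.1, "remaining_cost": 50},
    {"name": "start fresh project",   "payoff": 100, "p_success": 0.6, "remaining_cost": 40},
]
# Even after sinking 90 into the doomed project, the rational choice
# depends only on what lies ahead:
print(best_option(options, sunk_cost=90)["name"])  # -> start fresh project
```

The pull of the fallacy is precisely that the 90 already spent *feels* like a reason to continue, even though it appears nowhere in the calculation that determines the better outcome.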

These examples may seem obvious, even trivial, yet the global implications of compounded errors in thinking are staggering. Billions of people grant more authority to the doctrines associated with an anthropomorphized god than to the assessments of their rational and reflective minds. We live in societies where the production, consumption, and ownership of material goods dominate the assessment of our happiness. We pour our mental and physical resources into projects we know are doomed to failure and beliefs our inner selves suspect are untrue, and we actively strive to prevent the very changes that would liberate us from this condition. We judge our success relative to that of others rather than to our own goals, and thereby, to paraphrase the movie Fight Club, waste our lives working to buy things we don’t need, with money we don’t have, to impress people we don’t like. We fail to have a rational and honest public discourse about where we as a society want to go, and we are terrible at identifying what really matters. And, of course, in all of this, it is our own minds that provide the impulse towards irrationality.

So how can our brain be both an organ of complex brilliance and a saboteur of our best-laid plans? The answer lies in a mismatch between what our brains were optimized for and the situation we now find ourselves in. Evolution optimized our brains to maximize the proliferation of our ancestors’ genes in the environment they lived in. Cultural evolution has progressed so quickly over the last ten thousand years that biological evolution has not been able to keep up. As a result, we are living in a silicon age with Stone Age brains: we are gene-replicating machines pursuing individual goals, often highly abstract ones such as impartial moral goals, that are utterly different from merely spreading our genes. In addition, we live in a highly technological society in which millions of people interact with one another, a situation very different from the hunter-gatherer environment in which our brains evolved. It is therefore no surprise that our thinking is often suboptimal for the modern reality we find ourselves in; indeed, it would be a surprise if this were not the case.

But all is not lost. The more we are able to identify the sources of our errors in our thinking, the better equipped we will be to tackle them — and more attention than ever is being paid to understanding and combatting cognitive bias. The old adage “know thine enemy” remains as relevant as ever, and from this first crucial step of recognition, there are a variety of steps one can take to counteract bias.

First, recognize that thinking is difficult, and that without concentrated effort, human beings rarely think optimally. Second, train yourself. In the same way that we can use disciplined study to learn how to play the piano or solve sudoku, we can train our brains to become better at thinking, and thereby better at helping us achieve our goals. We can also improve decision-making via ‘nudging’, i.e. arranging the environment in such a way that people will tend to choose the option in their best interest. Nudging can be used personally, in small groups, and also in society as a whole.
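The leverage of nudging can be shown with a minimal model (my own sketch; the `inertia` and preference parameters are made-up assumptions, not empirical estimates): suppose some fraction of people simply keep whatever default they are given, while the rest actively choose according to their preferences. Then merely flipping the default, without restricting anyone’s options, changes the aggregate outcome dramatically.

```python
# Toy model of a default-option nudge (e.g. pension enrollment).
# Assumption: a fraction `inertia` of people keep the default they are
# given; the remainder actively choose what they prefer.

def enrollment_rate(default_enrolled, inertia=0.7, prefer_enroll=0.5):
    """Fraction of people who end up enrolled under a given default."""
    active = 1 - inertia  # people who actually make a choice
    if default_enrolled:
        return inertia + active * prefer_enroll  # opt-out scheme
    return active * prefer_enroll                # opt-in scheme

# Same people, same preferences, same options; only the default differs:
print(enrollment_rate(True))   # opt-out default: most end up enrolled
print(enrollment_rate(False))  # opt-in default: few end up enrolled
```

This is the sense in which arranging the environment can do work that willpower alone does not: the nudge recruits our inertia instead of fighting it.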

Whatever values we hold or whatever goals we wish to achieve, there is an action whose outcome will best satisfy our values or lead to the attainment of our goals. Being rational is difficult, but it is simply about maximizing one’s chances of success, and avoiding the many common pitfalls that forestall it. We all have the ability to improve ourselves and become better at thinking and decision-making, and though we can never achieve perfect rationality, improvement is always possible and desirable.

References

Baron, J. (2000). Thinking and deciding. Cambridge University Press.

Kahneman, D., Knetsch, J. L., & Thaler, R. H. (1991). Anomalies: The endowment effect, loss aversion, and status quo bias. Journal of Economic Perspectives, 5(1), 193-206.

Larrick, R. P. (2004). Debiasing. Blackwell handbook of judgment and decision making, 316.

Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7-59.

Stanovich, K. E. (2005). The robot’s rebellion: Finding meaning in the age of Darwin. University of Chicago Press.

Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. Yale University Press.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.

This article has 7 comments

  1. Great post and blog. How do I follow?

  2. Interesting article. I’d love to read some more about how to defend yourself against these biases. How exactly can I “train myself”?

  3. Tobi,
    I think understanding your biology is the best way to defend yourself! :p

  4. Also worth checking out:

    Cohen (1981) “Can Human Irrationality Be Experimentally Demonstrated?”

    Stich (1990) The Fragmentation of Reason, Chapters 1, 4, and 6.

    Bishop and Trout (2005) Epistemology and the Psychology of Human Judgment, Chapters 1, 2, and 8.

    Gigerenzer & Brighton (2009) “Homo Heuristicus: Why Biased Minds Make Better Inferences”

    Samuels, Stich, and Bishop (2002) “Ending the Rationality Wars: How to Make Disputes About Human Rationality Disappear”

    Also, I am very interested in this topic, so I am very open to discussing this further, either here or elsewhere.

    Thank you for posting about this!

    Nick

    • Nick,
      I am searching for a copy of the L. J. Cohen article. I too share your interest in this topic. Have you been able to obtain a free copy of it, or do you have access to it through an academic institution? I’m currently trying to compare Gigerenzer’s work with that of Nick Chater and Mike Oaksford. I have to confess to not having checked out Stich. Any advice would be appreciated. Unfortunately, I am constrained by my ignorance of statistics.