Anti-Epistemology


The dark side of belief-revision

Introduction: The Phenomenon

“But even as light is opposed by darkness, science and reason have their enemies.”
Quine and Ullian, The Web of Belief, 1978

Sometimes “Anti-Epistemology” is understood broadly as the practice of covering up and obscuring knowledge. We mean something more specific: an interesting phenomenon in the area of belief-acquisition and revision. Sometimes people are very fond of certain beliefs. Prime examples are religious beliefs, but people may also cling to more mundane beliefs, such as the belief that they are morally decent persons or good drivers. Like most beliefs, these beliefs are linked to various other beliefs – other beliefs either confirm or disconfirm the original belief, support it or call it into question. This can create a tension between the cherished belief and other beliefs, and the person may begin to experience cognitive dissonance.

There are numerous ways to resolve this tension. In the case of cherished beliefs it is sometimes tempting to revise many other beliefs instead of abandoning the one cherished belief – including beliefs about rules of reasoning that may be crucial for a truth-tracking cognitive system. If the cherished belief turns out to be false, its falsehood now spreads through the person’s whole web of beliefs: it has become contagious. We will call this process of shielding cherished beliefs in bad ways, at the cost of one’s truth-tracking cognitive system, “Anti-Epistemology”. Given how tempting it is to protect our cherished beliefs, we should expect there to be a lot of Anti-Epistemology out there, and correspondingly a lot of bad advice on how to reason and revise beliefs.

The idea of Anti-Epistemology revolves around three basic elements: the distinction between cherished beliefs and ordinary beliefs, the claim that beliefs are not isolated but stand in confirmation and disconfirmation relations to each other, and the idea that there are bad ways of resolving tension between beliefs and restoring rational equilibrium. Let us have a look at these elements in turn.

Ordinary Beliefs and Cherished Beliefs

Not all beliefs are equally important to us. Someone may have a firm belief about the average distance between the Moon and Earth, but she’d be happy to revise it in the face of new evidence. There is no particular emotional attachment to this belief. Other beliefs we hold very dear; it can be very painful to abandon them. These are our cherished beliefs. There are various reasons for this special emotional status of some of our beliefs, and the following catalogue captures some of them:

Identity-Imbuing Beliefs: Some beliefs are strongly related to how we see ourselves and to our sense of identity. Someone might for example see himself as a good driver and make this part of his identity. Such a person might hold onto this belief even in the face of evidence to the contrary.

Meaning-Endowing Beliefs: Some beliefs provide meaning for our lives, and this makes them valuable to people. The belief in a God who has a plan for our lives is precious to many people for this reason.

High-Stakes Beliefs: Sometimes a belief is important to us because much is at stake. Libertarian free will might be important because much depends on this view – other views on free will might yield radical changes to our views on ultimate responsibility and punishment.

Peer-Relevant Beliefs: Some people treat certain beliefs as entry tickets into their social group. Rejecting these beliefs leads to exclusion from this group, which can make the belief in question emotionally charged.

In all these cases the belief in question may become emotionally valuable to its holder, which makes it emotionally taxing to give it up, revise it, or doubt it, even in the face of strong evidence against the belief. Since there are so many cherished beliefs we should expect there to be numerous instances of Anti-Epistemology.

The Web of Beliefs

Many beliefs are not isolated – they support or call into question other beliefs, and are supported or called into question by other beliefs in turn. If I believe I have just observed a phenomenon predicted by a theory I believe, together with some auxiliary hypotheses I believe, this should increase my confidence in both the theory and the auxiliary hypotheses – the belief about my sensory experience confirms the other beliefs. These beliefs themselves confirm or disconfirm further beliefs, and so on. Collectively our beliefs constitute a complicated web, in which individual beliefs are connected to each other by confirmation and disconfirmation relations. In this web of belief even a small belief revision may have important ripple effects; it can increase or decrease the confidence we have in many other beliefs.
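To make these confirmation relations concrete, here is a minimal sketch in Python (all priors and likelihoods are illustrative assumptions, not figures from this essay) of how a single observation can raise confidence in both a theory and an auxiliary hypothesis via Bayes’ theorem:

```python
# Minimal sketch of Bayes' theorem: P(H|E) = P(E|H) * P(H) / P(E).
# All priors and likelihoods below are illustrative assumptions.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Posterior probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

p_theory = 0.6  # prior confidence in the theory
p_aux = 0.8     # prior confidence in the auxiliary hypothesis

# The predicted phenomenon is likely if the hypothesis holds (0.9),
# unlikely otherwise (0.2).
print(posterior(p_theory, 0.9, 0.2))  # ~0.87: confidence in the theory rises
print(posterior(p_aux, 0.9, 0.2))     # ~0.95: so does confidence in the auxiliary
```

In a richer model the observation would confirm the conjunction of theory and auxiliary hypotheses; updating each marginally, as above, is a deliberate simplification.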

One implication of this structure is that holding beliefs we are reluctant to revise for emotional reasons alone is potentially very harmful – if such a belief turns out to be false, it can malignantly influence our cognitive system far beyond itself. How so? Imagine you have one false but emotionally valuable belief with a low prior probability, i.e. a low probability of being true before any evidence is taken into account. It is strongly disconfirmed by many of your other beliefs, e.g. beliefs about empirical evidence, and it is not confirmed by any of them. Some of your other beliefs may even be flat-out logically inconsistent with it. To avoid outright logical inconsistency, or at least probabilistic incoherence, you may have to revise many other beliefs in order to maintain the emotionally important belief. The cherished belief has become contagious. Some of the beliefs revised in order to maintain it might be crucial for a truth-tracking cognitive system, e.g. the belief that beliefs in general require evidence to be justified, or that contradictions can’t be true. In that case the cherished belief is not just contagious but malignant: it sabotages an originally truth-tracking cognitive system by introducing elements of Anti-Epistemology.
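A toy calculation (with assumed numbers) illustrates the pressure: given moderately disconfirming evidence, a probabilistically coherent agent must either lower her credence in the cherished belief or explain the evidence away by revising beliefs about its sources:

```python
# Toy model with assumed numbers: a cherished belief C held at credence
# 0.95 meets evidence E that is ten times likelier if C is false, i.e.
# likelihood ratio P(E|C) / P(E|not-C) = 0.1.

def update(prior, likelihood_ratio):
    """Odds-form Bayes: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

print(update(0.95, 0.1))  # ~0.66: the honest posterior drops sharply

# To keep P(C) at 0.95 instead, the agent must neutralize the evidence,
# pushing the effective likelihood ratio back toward 1 by revising other
# beliefs ("the witnesses misperceived", "the data were forged"); this
# is exactly the contagion described above.
```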

Anti-Epistemology: Dangerous Ways of Restoring Rational Equilibrium

Let us call a situation in which a person’s web of belief is logically and probabilistically coherent “rational equilibrium”. Suppose a person holds logically or probabilistically inconsistent beliefs because he is not willing to give up a cherished belief, even though many other robust beliefs disconfirm it and Bayes’ theorem would dictate a very low posterior probability. There are various more or less dangerous ways in which he can restore rational equilibrium, and the following catalogue discusses some of them. The categories are probably neither exhaustive nor mutually exclusive. They are ordered by the potential harm they can do to a truth-tracking cognitive system, starting with the least damaging option:

Changing First-Order Beliefs: Suppose your best friend tells you he can’t make it to the cinema tonight because he is not feeling well. Later, several other friends see him having drinks at a bar; they tell you about it, and you believe them. But you also refuse to believe that your best friend would lie to you, and hence you hold inconsistent beliefs. To restore rational equilibrium you decide to believe that all of your friends were somehow mistaken in their perception – perhaps they saw someone with similar facial features. You have revised a number of first-order beliefs. This method is somewhat damaging because it will likely increase the ratio of false to true beliefs, but not massively so: a sufficient amount of new evidence will presumably still cause you to eventually revise the belief that your best friend would not lie to you, and with it the belief that your other friends cannot tell him apart from someone who merely looks like him. Anti-Epistemology of this type is a trait often found in conspiracy theorists.

Changing Beliefs About Laws of Nature: Suppose you believe Earth came into existence 5000 years ago, give or take some years, and was created within a week. But you are also familiar with basic physics and geology and know that this creation scenario is incompatible with many of your beliefs concerning these disciplines. One way to restore rational equilibrium is the method above: revise many of your first-order beliefs. You could, for example, revise your beliefs about the honesty of scientists and adopt the view that all scientists are involved in a global conspiracy to hide the evidence supporting Young Earth creationism. But there is another way: you could revise your beliefs about the laws of nature themselves. A young-earth creationist textbook adopts this strategy: “During Creation week, these processes were not strictly the same as analogous processes today. For instance, gravity must have been in operation, but when the waters of Day 2 drained off the rising continents on Day 3, they were able to move faster and farther than waters can be moved today. Modern natural laws, operating at rates we recognize today were evidently not fully instituted until Creation was completed on Day 6.” Revising beliefs about laws is potentially much more harmful, since laws are the basis for connecting numerous other beliefs to each other. The impact of such a revision is much more severe, and the ratio of false to true beliefs may increase much more dramatically than in the previous case.

Changing Beliefs About Logic and Belief-Acquisition: Another way of restoring rational equilibrium is to arbitrarily privilege certain beliefs and exclude them from the list of beliefs that need to be supported by evidence to be rational. The philosopher Alvin Plantinga has suggested that belief in God is properly basic and can be rational “without any evidence or argument at all” (Plantinga 1983, 17). He only requires that the belief be directly, non-inferentially evident in the believer’s experience. Kierkegaard and Tillich suggest that “faith involves full commitment, in the face of the recognition that this is not ‘objectively’ justified on the evidence” (Bishop 2010). On some (but not all) versions of this view the beliefs in question are effectively immunized against revision. This has effects similar to changing first-order beliefs, but is worse because it may preclude belief revision even in the face of overwhelming evidence. Another particularly damaging move is to abandon basic laws of logic such as the law of non-contradiction.

Outlawing Cognitive Processes: An especially drastic response to loss of rational equilibrium is to completely outlaw certain cognitive processes which are responsible for revising and acquiring beliefs. “Don’t try to argue with the devil – he has more experience at it than you” is a Christian proverb which encourages people to stop “arguing with the devil”, which presumably means to stop considering and factoring in evidence for and against religious beliefs. Following this rule amounts to outlawing a cognitive process which is essential to a truth-tracking cognitive system, namely the process of updating credences based on evidence. Such a modification can cripple a person’s rationality.

All four ways of restoring rational equilibrium are potentially harmful and can lead to an impaired cognitive system. Therefore all of them are instances of Anti-Epistemology. But there is an interesting distinction between the first two and the last two methods. The first two methods are concerned with first-order beliefs about the world. They can be damaging, but not extremely so. Enough evidence to the contrary often causes a person to revise their beliefs eventually. It seems appropriate to call them Soft Anti-Epistemology.

The last two, on the other hand, are concerned with beliefs about beliefs, i.e. second-order beliefs: they deal with general rules of reasoning. These cases seem to be much more damaging – once adopted, they are very difficult to get rid of. We may dub these ways of restoring rational equilibrium Hard Anti-Epistemology.

A Remaining Question

Are there cases where it would be justified and reasonable to revise large parts of our cognitive systems due to a single cherished belief – which is not itself confirmed by any of our other beliefs – by changing beliefs about laws of nature, logic, and evidence, or by outlawing cognitive processes? Or to put it differently: are there cases which show all the marks of Anti-Epistemology but are just massive but justified revisions of our cognitive system? In Bayesian terms these would be beliefs with an extremely high prior, such that they survive (almost) any amount of disconfirmation, and in turn influence other beliefs massively. This question is a tricky one, both within and outside of the Bayesian framework, and it is beyond the scope of this essay.
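As a purely numerical illustration of that last point – with a prior and likelihood ratios chosen merely for the example – here is how an extreme prior can survive many independent pieces of disconfirming evidence:

```python
# Assumed numbers: a belief with prior 1 - 1e-12 faces ten independent
# pieces of evidence, each ten times likelier if the belief is false
# (likelihood ratio 0.1 per piece).

def update(prior, likelihood_ratio):
    """Odds-form Bayes: posterior odds = prior odds * likelihood ratio."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

p = 1 - 1e-12
for _ in range(10):
    p = update(p, 0.1)
print(p)  # ~0.99: still near-certain after massive disconfirmation
```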

References

Bishop, J. (2010). Faith. The Stanford Encyclopedia of Philosophy (Fall 2010 Edition).

Plantinga, A. (1983). Reason and Belief in God. In A. Plantinga & N. Wolterstorff (Eds.), Faith and Rationality: Reason and Belief in God. University of Notre Dame Press.

Steup, M. (2014). Epistemology. The Stanford Encyclopedia of Philosophy (Spring 2014 Edition).

Yudkowsky, E. (2010). Anti-epistemology. Less Wrong Wiki.
