
Posted by on May 17, 2010 in intellectual black holes | 0 comments

Introduction – first draft for comments please

Here’s most of the introduction to the new book on Intellectual Black Holes. Comments please.


Intellectual black holes

Wacky and ridiculous belief systems abound. One cult promises members a ride to heaven on board a UFO. Another insists the Earth is ruled by lizard-like aliens. Even mainstream religions have people believing absurdities. Preachers have promised 46 heavenly virgins to suicide bombers. Others insist the entire universe is just 6,000 years old (extraordinarily, polls consistently indicate this belief is currently held by about 45% of US citizens – that’s around 130 million individuals). And of course it’s not only cults and religions that promote bizarre beliefs. Significant numbers of people believe in astrology, the amazing powers of TV psychics, crystal divination, the healing powers of magnets, the prophecies of Nostradamus, and that the World Trade Centre was brought down by the US Government. There is even a handful who continue to believe that the Earth is flat.

How do such ridiculous views succeed in entrenching themselves in people’s minds? How are such wacky belief systems able to take sane, intelligent, college-educated people and turn them into the willing slaves of claptrap? How, in particular, do the true believers manage to convince themselves that they are the rational, reasonable ones and that everyone else is deluded?

This book identifies eight key mechanisms that can transform a set of ideas into a psychological fly trap – a bubble of belief that, while seductively easy to enter, can be almost impossible to reason your way out of again.

Cosmologists talk about black holes, objects so gravitationally powerful that nothing, not even light, can escape from them. Unwary space travellers passing too close to a black hole will find themselves inexorably sucked in. Ever more powerful engines are required to resist its pull, until eventually one passes the “event horizon” – the point of no return – and escape is impossible. My suggestion is that our contemporary cultural landscape contains, if you like, numerous intellectual black holes – belief systems constructed in such a way that unwary passers-by can similarly find themselves drawn in, often never to escape. While those of us lacking robust intellectual and other psychological defences will be most easily trapped, even the most intelligent and educated of us are potentially vulnerable. Some of the world’s greatest thinkers have fallen victim. If you find yourself encountering a belief system in which several of these eight mechanisms feature prominently, you should be wary. Alarm bells should be going off and warning lights flashing. For you are now approaching the intellectual equivalent of a black hole.

As the centuries roll by, such self-sealing bubbles of belief appear and disappear. Sometimes, many may fizz into existence in one place, because the conditions are just right (an age of superstition). Occasionally, one of these little bubbles may grow huge, perhaps encompassing an entire civilization, before dividing or deflating or popping or being subsumed by another bubble. The greatest threat such bubbles of irrational belief face, perhaps, is the flourishing of a rigorous, sceptical culture that encourages free thought and in which all beliefs are subjected to close critical scrutiny – an “Enlightened” society. However, some bubbles are able to flourish even within such a society.

Aim of this book

The central aim of this book is to help immunize readers against the wiles of cultists, political zealots and other purveyors of intellectual snake oil by, as it were, clearly setting out the tricks of the trade by which such self-sealing bubbles of belief are created and maintained. It reveals how an intellectually impregnable fortress can be constructed around even a set of patently ridiculous beliefs, providing them with a veneer of “reasonableness” and rendering them immune to rational criticism.

Most of us will at some point have experienced the frustration of trying to change the convictions of someone powerfully committed to a ridiculous belief, and will have come up against many of these strategies. My aim here is to provide an overview of eight key strategies, which I call:

1. Playing the mystery card
2. “But it fits!”
3. “Moving the goal posts”
4. Going nuclear
5. “I just know!”
6. Pseudo-profundity
7. The Amazingly Persuasive Power of Ramified Anecdote (APPRA)
8. Pressing your buttons

In each case I (i) explain the strategy, (ii) diagnose exactly what is wrong with it, and (iii) provide illustrations of how it is applied.

It is worth clarifying seven things at the outset:

1. This book focuses particularly, though by no means exclusively, on religious examples of intellectual black holes. Why, given there are many non-religious examples from which to choose? My main reason is that while many other belief systems – political philosophies such as Marxism, New Age philosophies, belief systems involving dubious or bogus medical treatments, and belief systems centred on grand political conspiracies (such as those involving 9/11) – also employ various combinations of these eight mechanisms to ensnare minds, religions typically employ a wider range. Historically, the established religions have had a great deal of time and huge intellectual and other resources to deploy in refining their own particular versions of these strategies. They have, as a result, produced some of the most powerful and seductive intellectual black holes. They therefore provide some of the best illustrations.

2. I also want to stress that this book certainly does not argue that all religious belief systems are essentially irrational. Several recent books have done that, of course. The aim of this book is different. It is not the content of religious belief systems that is attacked here, but the manner in which they are often bolstered and defended. It’s important to realize that any belief system, including perfectly sensible ones, can be bolstered and defended by means of the same eight mechanisms. To point out that a belief system is both propped up and defended against intellectual threats by means of bullshit strategies is not yet to show that the content of that belief system is itself bullshit. It’s worth remembering that many of the same strategies can be, and have been, employed to defend atheistic belief systems (I’m thinking, in particular, of certain totalitarian atheist regimes). I am not, here, suggesting that atheism is intrinsically any more or less sensible than theism. However, if a belief system is fairly rational, its proponents won’t need to rely on the kind of dubious strategies outlined here in order to bolster and defend it. The fact that many religious people rely pretty heavily on many – in some cases all – of these eight strategies in order to generate the impression that their particular belief system is, at the very least, not unreasonable, would of course be neatly explained by the fact that their particular religious system of belief is, in fact, pretty unreasonable.

3. Third, not only are some atheists guilty of using such strategies to bolster and defend their atheism, some religious people are largely innocent. My aim is not to tar all religious people with the same brush. To say that religion may have produced many of the most dramatic and powerful intellectual black holes is one thing. To insist that every religious person is a victim is quite another. I’m certainly not suggesting that.

4. Fourth, we should acknowledge that those who fall victim to intellectual black holes need be neither dim nor foolish. The sophistication of some of the strategies examined in this book demonstrates that those who develop and use them are often highly intelligent. They are, in many cases, clever strategies – sometimes very clever indeed. Nor need those who fall foul of intellectual black holes be generally gullible. Victims may, in other areas of their lives, be models of cautious acceptance, subjecting claims to close critical scrutiny, weighing evidence scrupulously, and generally tailoring their beliefs according to robust rational standards. If, after reading this book, you begin to suspect that you may yourself have fallen victim to an intellectual black hole, there’s no need to feel particularly foolish. People far wiser and cleverer than either you or me have also become trapped. Neither need those who create or work to sustain intellectual black holes be particularly bad or deliberately deceitful people. Those who work hardest to sustain intellectual black holes are typically victims themselves. Yes, some intellectual black holes are deliberately fashioned by frauds and con artists. But in most cases, such bubbles of belief are a product of the ingenuity of honest and sincere people genuinely committed to the belief system at its core.

5. Fifth, notice that I am not suggesting that every intellectual black hole will exhibit all eight of the mechanisms outlined in this book. Some exhibit some mechanisms, and others others. In chapter XX, I illustrate how different belief systems employ different combinations of the eight mechanisms, or place different emphasis on them. Belief in a dubious alternative medicine, for example, can be turned into something approaching an intellectual black hole by heavy reliance on just two mechanisms in particular: APPRA and playing the mystery card. On the other hand, many religious belief systems employ many of the eight mechanisms – in some cases, all of them. Also note that intellectual black holes tend to be dynamic – the mechanisms used to sustain them are likely to shift and develop in response to new and differing rational threats to the belief systems at their cores.

6. It is worth emphasizing that intellectual black holes lie at one end of a sliding scale. The fact is, almost all of us engage in these eight strategies to some extent, particularly when beliefs to which we are strongly committed are faced with a rational threat. And in fact, under certain circumstances, there is little wrong in using at least some of them in moderation (as I will explain). But that is not to say that every belief system is, then, an intellectual black hole (in fact, defending a belief system against the charge that it is an intellectual black hole by maintaining that all belief systems are intellectual black holes, and thus no less reasonable or unreasonable, is itself a warning sign that one is dealing with an intellectual black hole – it is an example of the strategy I call “Going Nuclear”). What transforms a belief system into an intellectual black hole is the extent to which such mechanisms are relied upon in dealing with rational threats and generating an appearance of “reasonableness”. The more we rely on these kinds of strategy to prop up and defend our belief system, the more black-hole-like that belief system becomes, until a black hole is clearly what we have got. However, even if we have not fallen victim to an intellectual black hole, some of our belief systems may still exhibit an unhealthy reliance on the same strategies.

7. Lastly, like any analogy, the black hole analogy breaks down if pushed too far. There are differences between physical black holes and their intellectual equivalents. Here’s an obvious illustration: a physical black hole is something from which you can never escape, but people can and do sometimes escape from intellectual black holes. Individuals do occasionally find the resources to think themselves clear of even some of the most seductive and powerful examples.

Other explanations for why we believe

This book examines eight key mechanisms by which belief systems can be transformed into intellectual black holes. It doesn’t attempt to explain why we are drawn to particular belief systems in the first place. Why, for example, is belief in a god or gods, and in other supernatural beings, such as ghosts, angels, dead ancestors, and so on, so widespread? These kinds of belief appear to be universal, and there is some evidence that a propensity or disposition towards beliefs of this kind may actually be innate – part of our natural, evolutionary heritage. The psychologist Justin Barrett (REF XX), for example, has suggested that the prevalence of beliefs of this kind may in part be explained by our possessing a Hyper-Active Agent Detection Device, or H.A.D.D.

The H.A.D.D. Hypothesis

Human beings explain features of the world around them in two very different ways. For example, we sometimes appeal to natural causes or laws in order to account for an event. Why did that apple fall from the tree? Because the wind blew and shook the branch, causing the apple to fall. Why did the water freeze in the pipes last night? Because the temperature of the water fell below zero, and it is a law that water freezes below zero.

However, we also explain by appealing to agents – beings who act on the basis of their beliefs and desires in a more or less rational way. Why did the apple fall from the tree? Because Ted wanted to eat it, believed that shaking the tree would make it fall, and so shook the tree. Why are Mary’s car keys on the mantelpiece? Because Mary wanted to remind herself not to forget them, so put them where she thought she would spot them.

Barrett suggests that we have evolved to be overly sensitive to agency. We evolved in an environment containing many agents – family members, friends, rivals, predators, prey, and so on. Spotting and understanding other agents helps us survive and reproduce. So we evolved to be very sensitive to them – overly sensitive in fact. Hear a rustle in the bushes behind you and you instinctively spin round, looking for an agent. Most times, there’s no agent there – just the wind in the leaves. But, in the environment in which we evolved, on those few occasions when there was an agent present, detecting it may well have saved your life. Far better to avoid several imaginary predators than be eaten by a real one. Thus evolution will select for an inheritable tendency to not just detect – but over-detect – agency. We evolved to have (or, perhaps more plausibly, to be) hyper-active agency detectors.

If we do have an H.A.D.D., that would at least partly explain the human tendency to feel there is “someone there” even when no one is observed, and so may at least partly explain our tendency to believe in the existence of invisible agents – in spirits, ghosts, angels or gods.

Now I am not here endorsing this particular explanation for widespread belief in such invisible agents (though I suspect there is some truth to it). The fact is that, even if we do possess an H.A.D.D. that would at best only explain the attractiveness of the content of some of the belief systems we will be examining. Many wacky belief systems, such as crystal healing or palmistry or numerology, involve no hidden agents at all. I mention the H.A.D.D. hypothesis only to illustrate the point that the eight mechanisms identified in this book for turning a belief system into an intellectual black hole are not intended to rival such psychological and evolutionary explanations for why we believe what we do. My claim is that once we find ourselves drawn to a belief system, for whatever reason, then these eight mechanisms may come into play to bolster and defend it.

Note that the H.A.D.D. hypothesis does not say that there are no invisible agents. Perhaps at least some of the invisible agents people suppose exist are real. Perhaps there really are ghosts, or spirits, or gods. However, if the H.A.D.D. hypothesis does correctly explain why we suppose that such invisible agents exist, then the fact that large numbers of us believe in the existence of such invisible agents supplies no evidence that any such agents exist. It will no longer do to say “Surely not all these people can be so very deluded? Surely there must be some truth to these beliefs, otherwise they would not be so widespread?” The fact is, if the H.A.D.D. hypothesis is correct, we are likely to believe in the existence of such invisible agents anyway, whether or not they exist. So the fact that such beliefs are widespread is no evidence that the agents are real. Of course, there was already good reason to reject such appeals when it comes to beliefs of a religious, supernatural or paranormal character… we know already that
{{NOT FINISHED THIS BIT YET: After all, 130 million citizens of one of the richest and best-educated populations on the planet believe the entire universe is six thousand years old. If the H.A.D.D. hypothesis is correct, then it adds a further nail to the coffin of that kind of justification for belief in invisible agents.}}

Theory of Cognitive Dissonance
Another psychological theory that may play some role in explaining why we are drawn to the kind of strategies described in this book is the theory of cognitive dissonance. Cognitive dissonance is the psychological discomfort we feel when we hold beliefs or attitudes that conflict. The theory of cognitive dissonance says that we are motivated to reduce such dissonance by either adjusting our beliefs and attitudes or rationalizing them.

Aesop’s story of The Fox and The Grapes is often used as an illustration. The fox desires those juicy-looking grapes, but then, when he realizes he will never attain them, he adjusts his belief accordingly to make himself feel better – he supposes the grapes are sour.

How might the theory of cognitive dissonance play a role in explaining why we are drawn to using the kind of belief-immunizing strategies described in this book? Here’s an example. Suppose, for the sake of argument, that our evolutionary history has made us innately predisposed both towards a belief in supernatural agents and towards forming beliefs that are, broadly speaking, rational, or at the very least not downright irrational. That might put us in a psychological bind. On the one hand, we may find ourselves unwilling or even unable to give up our belief in certain invisible agents. On the other hand, we may find ourselves confronted by overwhelming evidence that what we believe is pretty silly. Under these circumstances, strategies promising to disarm rational threats to that belief and give it at least the illusion of reasonableness are likely to seem increasingly attractive. Such strategies can provide us with a way of dealing with the intellectual tension and discomfort such innate tendencies might otherwise produce. They allow true believers to reassure themselves that they are not being nearly as irrational as reason might otherwise suggest – to convince themselves and others that their belief in ghosts or spirits or whatever is, even if not well-confirmed, at the very least not contrary to reason.

So we can speculate about why certain belief systems are attractive, and why such strategies are employed to immunize them against rational criticism and give them a veneer of “reasonableness”. Both the H.A.D.D. hypothesis and the theory of cognitive dissonance may have a role to play. The extent to which they do have a role to play would be the subject of a rather different sort of book.

Two threats to the rationality of theism

Our discussion will include several examples of how our eight mechanisms are used to deal with intellectual challenges to theism – to belief in God. I will focus on two challenges in particular: (i) the evidential problem of evil, and (ii) the problem of non-temporal agency. Because it is easy to underestimate the power of these objections, it’s worth clarifying them at the outset.
