religious schools and brainwashing
In the last couple of posts I’ve been exploring two ways in which we might explain, or try to shape, someone’s beliefs – by giving reasons, or by applying purely causal mechanisms.
One of the most obvious ways of engaging in purely causal manipulation of what people believe is, of course, brainwashing. What is brainwashing, exactly?
Kathleen Taylor, a research scientist in physiology at the University of Oxford who has published a study of brainwashing, writes that five core techniques consistently show up:
One striking fact about brainwashing is its consistency. Whether the context is a prisoner of war camp, a cult’s headquarters or a radical mosque, five core techniques keep cropping up: isolation, control, uncertainty, repetition and emotional manipulation.
The isolation may involve physical isolation or social separation. Control covers restricting the information and range of views people have access to, and includes censorship. Repetition is self-explanatory: cults tend endlessly to repeat their beliefs to potential converts, for example through very regular communal chanting or singing. Under uncertainty, Taylor discusses the discomfort we feel when presented with uncertainty: by providing a simple set of certainties that cover and explain everything, and by constantly reminding people of the vagaries and chaos of what lies outside the belief system, cultists can make their system seem increasingly attractive. Emotional manipulation can take many forms – most obviously, associating positive feelings and images (e.g. uplifting music or serenely smiling icons) with the belief system, and fear and uncertainty with the alternatives.
Of course, the extent to which these techniques are applied varies from cult to cult. Clearly, they can also be applied by non-religious cults and regimes. A school in Mao’s China or under the present regime in North Korea would almost certainly check all five boxes.
That these and other purely causal mechanisms are effective at influencing belief even outside a cult’s headquarters or a prisoner of war camp is surely undeniable. We are all very heavily influenced by them. The success of the advertising industry is testimony to their effectiveness. Indeed, many advertising campaigns check many, if not all, of Taylor’s five boxes for brainwashing.
When challenged on this, the industry typically insists that it is merely “informing” the public – providing good reasons and evidence on which consumers can base a rational, informed choice. Nevertheless, the main tools of the advertising trade are for the most part purely causal. An advertisement for soap powder, lipstick, a car or a loan typically contains very little factual information or argument. The power of these adverts to shape our thinking and behaviour is mostly purely causal – they play on our uncertainties and rely very heavily on repetition and emotional manipulation.
I note (though Taylor doesn’t), simply as a point of fact, that religious schools of the sort that tended to predominate in this country up until the 1960s also very clearly check all five boxes.