As Seen On TV: The Science Of Brainwashing, Big And Small
Twenty-three years before terrorists sent two commercial airplanes ripping through the 77th and 93rd floors of the World Trade Center, 909 Americans swallowed a mixture of Valium, cyanide, and grape Kool-Aid in northwestern Guyana in what was, until that day, the largest single deliberate loss of American civilian life. To this day the Jonestown Massacre swirls with controversy. Was it a suicide, or should it rightfully be called murder?
The question matters. When Reverend Jim Jones launched the Peoples Temple in Indianapolis, roughly two decades prior to the devastation, he had visions of a flourishing socialist new religious movement. But in making a home in Jonestown, the 1,000 or so inhabitants quickly fell into what many now call a cult, and the incident still has scientists and historians studying its catastrophic complexities.
Within weeks of settlement, Jones began indoctrinating his members. People worked for eight hours a day and afterward they studied for eight more. Often, this included heated, heavily vetted lessons from Jones on Marxist and Maoist propaganda, including books and film screenings. He broadcast anti-American news reports over the Jonestown tower speakers throughout the day, forced non-compliers to suffer through beatings and time spent in a slim plywood box called a “torture hole,” and, several months into settlement, was regularly referred to as “Dad” by both the settlement’s children and other adults. According to surviving journal entries, he routinely performed “miracles.”
But as the few Temple defectors in the U.S. gained ground in their fight to dismantle Jonestown, pressure mounted. Branding the mounting U.S. scrutiny a “grand conspiracy,” Jones, now in declining health, quickly and fearfully informed his residents that the end was near. He gave them the option to flee to the Soviet Union, escape into the Guyanese jungle, or commit “revolutionary suicide.” The group decided on suicide.
And that’s how crewmen found the bodies, limp and together. The group’s leader lay next to two other bodies, his head resting on a pillow, blood pouring from a gunshot wound to his temple. On Nov. 18, 1978, he, like the others, had finally “stepped over.”
What Brainwashing Is
The Jonestown Massacre is a famous example of brainwashing in psychology circles because it seems to capture the power of thought control in its most extreme, well-executed form. It offers psychologists a window into the human mind that, for obvious ethical reasons, can’t be recreated in a lab. For Kathleen Taylor, University of Oxford psychologist and author of Brainwashing: The Science of Thought Control, Jonestown offers startling insights into how easily our brains can change, and how those changes actually happen on a daily basis.
“What I think’s going on,” Taylor said, “is that people are using techniques of social psychology that we use all the time, but they are applied in very extreme circumstances.”
These techniques are so commonplace they might be invisible. Advertisements and salespeople like to distract customers so they can’t scrutinize the message, repeat certain phrases over and over, and, perhaps most compelling of all, fill customers with doubt about their past choices. The principles hold whether the product is dish soap or religious fundamentalism.
“People tend not to think about things they believe in very much,” Taylor said. By exploiting that lack of analysis, someone interested in reshaping a belief could instill so much self-doubt in a person that eventually the new idea seems plausible, and even true. People’s uncertainty is seized, recast as new beliefs, and behavior follows.
Brainwashing sits at the far end of this manipulative spectrum, Taylor argues, but it relies on the same principles. Its aim is ultimately to control what information enters the brain, so that the one doing the brainwashing can delete old associations and, indeed, form brand-new neural pathways that cement the replacements.
“Because the brain is so malleable, [the information] reshapes what’s going on inside the brain, thereby affecting what behavior comes out the other side.”
That change in behavior seems to come in two distinct forms. The first is brainwashing by force, the kind popularized by prison camps, in which people are tortured and starved and practically destroyed until the “new” reality replaces the old one. The person doing the brainwashing has complete control of the victim’s psyche, stripping victims of what they thought they knew and offering redemption through a new, seemingly better, alternative.
“But obviously advertisers can’t do that,” Taylor said, “so what they do instead is use, what I call, brainwashing by stealth.”
In this case, coercion involves changing the emotional associations people make without them noticing a change is taking place. It doesn’t have to be as sinister as subliminal messaging. For decades, advertisers have known that customers respond to emotional connections on an unconscious level. And that unawareness is important, Taylor says, because of a psychological phenomenon known as “reactance,” which holds that when people know they’re being emotionally manipulated, the manipulation tends not to work. So advertisers have a vested interest in being as discreet as possible.
“You’re not really concentrating on it,” Taylor said, “but you’re left with the impression that there is a positive emotion associated with this particular thing they want you to buy.”
Finding The Line
Consumption isn’t the only motivation that compels people to be psychologically coercive. Sometimes it’s spite. Dr. William Bernet, professor emeritus of psychiatry at Vanderbilt University, has seen firsthand the destructive lengths parents will go to when divorce leaves children caught in the middle of an ugly war for the upper hand. Confused and scared about whom to trust, and sometimes plagued with fears of physical abuse, kids reject one parent in favor of the alienating one, exhibiting what some psychologists call parental alienation syndrome (PAS).
“I would say that PAS is caused by brainwashing or indoctrination of the child,” said Bernet.
Wracked with resentment, the alienating parent tells the child how awful the other parent is, openly insulting them, barring the other parent from visits, and sometimes fabricating claims of abuse, all in pursuit of full custody of the child. The result for the child is a warped view of the other parent, one that can last well into adulthood if contact is never reestablished.
PAS can combine both forms of brainwashing Taylor mentions, which makes it decidedly harder to pinpoint when alienation takes place, and therefore harder to measure when a child exhibits signs of the syndrome. What’s more, the line between emotional parenting and criminal activity is whisper-thin, and judges can’t monitor parents’ activity round-the-clock. The best evidence psychologists have to establish a child’s PAS is his or her own testimony, but before they can gather it, they face the difficult task of noticing a problem at all.
In this, Taylor points out a challenge inherent in studying brainwashing: The phenomenon is so insidious because it builds a web of false beliefs in which there is no one unscrupulous act. As a result, manipulating a person’s psychology is a serious gray area in medical ethics. At what point is the brainwasher morally, and legally, culpable? Brainwashing carves new neural pathways in the victim’s brain, but when those pathways form, and which are most responsible for destructive behavior, are mysteries science remains helpless to solve.
In looking at cases of PAS, in which kids can’t reasonably think for themselves, the distinction is clearer. But in adults, whose brains are fully formed, emotional manipulation can proceed in silence, and so can the victims’ dissent.
Staying Resolute
Day to day, our fragile psyches might never face the threat of brainwashing on the scale of parental coercion or Kool-Aid-drinking cults, but it may still be in our interest to stay keenly aware of when we’re being influenced to feel or think a certain way.
To counter these forces, people need to expose themselves to diverse ideas from disparate sources, Taylor says. She advocates a measure of self-assuredness, reflected in beliefs that are well-founded and defensible. With confidence in decision making — be it in buying a ring or forming an opinion — people can stay immune to outside deceit.
But sometimes that decision making is exhausting, if not downright paralyzing. With hundreds of salad dressings and countless world views to choose from, how can we possibly expend the energy to be rational consumers at all times? Living in the Information Age means wrestling with this dilemma, whether we recognize it or not, and accepting the consequences if we choose to submit.
“All of these things — exhaustion, distraction, fatigue, time, pressure — you can resist them if it matters enough to you,” Taylor said. “But for most people most of the time, it doesn’t. So we don’t.”