
The Lucifer Effect

The 1971 Stanford Prison Experiment, conducted with 24 college students in the basement of Stanford University's psychology building, proved to be a turning point in understanding how perceived power and the situation could turn an average, healthy person into a perpetrator of evil.

Zimbardo randomly assigned these college students to the roles of guards and prisoners. The experiment was discontinued after only six days (eight days early) because the experimenters did not expect that level of psychological torture to be inflicted by college boys paid to act as guards. They made the "prisoners" perform humiliating, menial tasks and even subjected them to sexual degradation.

Philip Zimbardo, the lead experimenter, compared this personality transformation to that of Dr Jekyll and Mr Hyde. Here, instead of a chemical, it was social psychology that effected the transformation.

Dr Jekyll and Mr Hyde are the two faces of a single character in Robert Louis Stevenson's 1886 novella, which investigates the two-sided personality of Dr Henry Jekyll: one side exemplary, the other evil.

Zimbardo aimed to show how readily a psychologically healthy person could become an agent of evil, given the right circumstances and environment. This discussion arises from the iconic images of evil shot in the Abu Ghraib prison and revealed to the public in 2006, three years after the horror took place.

These "trophy" photos came from the digital cameras of the guards, which "showed Army reservist guards torturing and humiliating Iraqi prisoners — naked prisoners stacked in pyramids or crawling on the floor with leashes." It also included photographs of "a prisoner wearing a black hood with electrodes on his fingertips; naked, terrified prisoners being threatened with attack dogs or having guns pointed at their genitals by hideously masked guards, and worse."

Although it is easy to feel repulsed by the actions of these guards and label them "bad apples"—a temperamental account that blames the individual for their wrongdoings—it is not as easy to explain why these regular, healthy guards would perpetrate such behaviour when put into situations like Abu Ghraib.

The explanation lies in the word "situation". The perpetrators of abuse didn't go into Abu Ghraib with sadistic tendencies, nor was it a part of their lifestyle. These people weren't serial torturers or murderers. Instead, they were transformed into agents of evil by their situation, the "bad barrel" of war.

Adolf Eichmann was a German official hanged by the state of Israel for his part in the Holocaust, the Nazi extermination of the Jews in World War II. Hannah Arendt, reporting on his trial, famously described him as embodying "the banality of evil". Even though he had evil deeds to his name, Eichmann was not inherently immoral. Researchers like Stanley Milgram were riveted by how little it took, just a legitimate authority figure giving orders, to turn an ordinary person into a torturer.

Eichmann was a mentally sound, healthy person who'd drifted into the Nazi regime, looking for a sense of purpose and direction. His brutality towards the Jews did not stem from a deep-rooted psychological belief. Instead, he "was just following orders".

Most Nazi soldiers committed their crimes because they did not hold themselves personally accountable. This is because they diffused the responsibility for their actions. Diffusion of responsibility was described by John Darley and Bibb Latané in their famous smoke-filled-room study. The term describes how people are more likely to commit morally questionable deeds when they commit them as part of a larger group. For example, "when you ask a classroom of students who would be willing to pull the trigger to execute a condemned traitor; no one will raise their hands. Alter the conditions such that one would be part of a large firing squad in which there is only one real bullet, no one knowing who had fired the fatal shot, resistance to committing the deed lessens."

This theory rests on the need not to feel personally accountable for your actions. If you pass someone lying on the side of the road, you want to believe that someone else will help them and that the responsibility is not yours alone. We do this to avoid guilt. It allowed many Nazi soldiers, including Adolf Eichmann, to feel no remorse: each absolved the other, and guilt never surfaced.

Another factor that played into the Lucifer effect was the authority figure. Stanley Milgram's classic 1961 study of obedience to authority showed that people would deliver what they believed were lethal shocks to another person if told to do so by an authority figure. In reality, the person receiving the shocks was an actor feigning pain.

In a subsequent 1972 study by Charles Sheridan and Richard King, the actor was swapped for a "cute, fluffy" puppy. The twist was that the puppy really was receiving painful but harmless shocks. They found that 54% of the male subjects and all of the females obeyed throughout, even under what appeared to be great emotional stress; some openly wept. They all believed they would receive a failing grade if they failed to condition the puppy.

This shows that people may feel a sense of diminished responsibility in the presence of an authority figure.

Some of the factors that influence obedience to authority figures are:

- A legitimate backstory, like a memory experiment or the sake of "national security".

- A legitimate-looking authority figure (a lab coat helps).

- Rules that are vague enough to be hard to understand or remember.

- Depersonalization of the enemy.

- How difficult it is to exit.

Cults like the People's Temple in Guyana have exploited the last factor. "They create a barrier to leaving [by saying] 'If you exit, you're going to end up mentally impaired.'" Many people remain in cults because they do not know how to leave.

Another constituent of the Lucifer effect is deindividuation: the loss of self-awareness in groups. Disguises such as military uniforms and hoods encourage deindividuation and help people overcome the moral barriers to inflicting pain on another person.

The Ku Klux Klan, dubbed the first American terrorists, donned hoods to deindividuate themselves and partake in racist activities. They fought post-Civil War reforms and terrorized freed African Americans in the former Confederacy. Earlier anthropological studies showed that warriors in cultures that donned masks before battle were significantly more likely to torture, mutilate or harm than warriors in cultures that did not practise self-disguise.

Depersonalizing the enemy, divesting them of human characteristics or individuality, further helps perpetrators do harm. From his studies of moral disengagement, Albert Bandura theorized that people who regularly engage in hurtful and harmful conduct cloak their behaviour in legitimate-seeming rationalizations, which renders self-sanctioning ineffective and elevates the behaviour into something seemingly noble.
For example, in one of Bandura's experiments, students delivered much stronger electric shocks to participants from another college merely because they had overheard those students described as "animals."

Chip Fredrick, the reservist sergeant for whom Zimbardo served as a defence witness in the Abu Ghraib abuse case, was, before his fateful tour of duty, an average, healthy person. This was the same man who came up with the infamous idea of attaching electrodes to an Iraqi prisoner's fingers, making him stand on a box, and telling him he would be electrocuted if he fell off. Upon interviewing Fredrick, Zimbardo concluded that he was normal, with no evidence of psychopathology or sadistic tendencies. Fredrick was, however, somewhat obsessive about orderliness, neatness, discipline and personal appearance, all of which were absent in Abu Ghraib.

The inhuman conditions in Abu Ghraib were enough to effect an evil transformation in Fredrick and his colleagues' personalities. The guards were forced to work 12-hour shifts, seven nights a week for 40 days straight, in filthy conditions (no toilets/running water) and under constant enemy bombardment.

The American military leadership at Abu Ghraib under the Bush administration condoned prisoner torture, and the American guards were under direct orders to torture these prisoners for information. This is a classic case of bad barrels turning good apples bad: the corrupt system of torture corrupted these once-good American soldiers.

The Banality of Evil describes how a person who does terrible things need not be inherently evil. The Banality of Heroism, as Zimbardo called it, is the heroic tendency inherent in all of us, waiting for the right moment to show itself. This debunks the notion of a "heroic elect", which ascribes superhuman qualities to those who help, when in fact we all carry these qualities, both the good and the evil ones.
If situations and circumstances can drive a person to evil, they can also drive them to goodness.