Milgram's Obedience Study
- Eniokos

- Oct 1, 2024
Updated: Feb 11
The 1961 Yale "Memory Project" Wasn't About Memory. It Was a Map of the Human Shadow
What would you do if a stranger in a lab coat told you to hurt someone? Most of us answer with a confident, moral certainty. We imagine our "true self" standing firm, refusing the order, and walking out of the room. But in 1961–62, a series of experiments at Yale University that were publicly framed as the "Memory Project" shook the foundations of that self-perception.
In May 1962, the elegant interaction laboratory at Yale University became the stage for one of the most unsettling dramas in the history of psychology. Imagine walking into that room: the air is clinical, the prestige of the institution is palpable, and the experimenter stands before you in a stern gray lab coat. You are one of 40 men—a group ranging from Good Humor men and plumbers to corporation presidents and PhDs—who have responded to a newspaper ad for a "memory project."
The laboratory stage was set with a chilling simplicity. You watch as another participant, a pleasant 50-year-old man, is strapped into a chair. You see his arms secured to avoid "excessive movement" and watch the experimenter apply "electrode paste" to his skin to ensure a perfect, blister-free contact for the shocks to come. To ensure you understand the gravity of the task, you are given a sample shock of 45 volts. Because of the paste, it feels much stronger than you expected. You are the "Teacher"; he is the "Learner." Your job is to punish his mistakes with increasing voltages from a generator featuring 30 switches, ranging from 15 volts to a final row marked "Danger: Severe Shock" and, finally, a haunting "XXX."
As we look back at these studies today, we have to look past the famous shocking headlines to understand the true psychology at play. To navigate the results, we must distinguish between three versions of ourselves: our Innate Self (our sympathetic, internal nature), our Social Self (the polite version of us that seeks to avoid social friction), and the most dangerous one of all: the Institutional Self. This is the version of us that enters a public organization and adopts its mindset entirely, often at the expense of our own conscience. While the baseline results of Stanley Milgram’s work are grim, his variations reveal a much more nuanced, and occasionally hopeful, map of the human heart.
The "Expert Fail": We Are Far More Obedient Than We Think
Before the experiments began, Milgram asked forty psychiatrists and psychologists to predict the outcome. They reached a near-unanimous consensus: they believed "only a little over one-tenth of one percent" of the population (the statistical fringe they identified as sociopaths) would go all the way to the maximum 450-volt shock.
They were wrong. In the baseline study (Variation 5), 65% of participants administered the maximum shock, even as the "learner" in the other room screamed about a heart condition before falling into a terrifying silence.
Why were the experts so off the mark? They underestimated the power of the Social Self to defer to authority. When we enter a room with a "man in a white lab coat," our internal moral compass is often overridden by a terrifyingly strong desire to be "good subjects" or "polite people." We don't want to cause trouble for the experimenter. As the transcript illustrates, the pressure of a simple command can silence our natural empathy:
"The experiment requires that you continue, teacher. Please continue."
Variation 11: Proof That We Aren’t Naturally Cruel
If the 65% statistic makes human nature look bleak, Variation 11 offers a vital glimmer of hope. In this version, the "teacher" was given freedom of choice. The experimenter told the participant they could decide the maximum shock level themselves.
The results were a complete reversal: the vast majority of people stayed well below the point of pain. Over half didn't even pass the 60-volt threshold—hardly higher than the 45-volt sample shock they received themselves. Only 3% went all the way to the end.
This 3% might match the statistical expectation for sociopaths. It is important to distinguish them from psychopaths; while a psychopath might enjoy inflicting pain, a sociopath simply lacks the "guilt" mechanism that the rest of us possess. They often don't realize they are different until they are placed in extreme situations—like war or these experiments—and realize they feel fine while everyone else is breaking down.
For the other 97% of us, this variation suggests that our Innate Self is sympathetic and avoids harm. When the "field of force" of authority is removed and we are given a choice, we default to kindness. Cruelty isn't our nature; it is a product of social and institutional hacking.
The "Remote Control" Conscience: Distance as a Moral Buffer
Milgram found that proximity acts as a psychological "field of force." As the distance between the teacher, the victim, and the authority figure shifted, so did the participants' morality:
Same Room (Variation 3): When the learner was visible, obedience dropped to 40%.
Physical Touch (Variation 4): When the teacher had to physically force the learner's hand onto a shock plate, obedience dropped to 30%.
Remote Authority (Variation 7): When the experimenter left the room and gave orders via telephone, obedience plummeted to 21%.
The most striking insight here is that distance to the authority figure matters more than distance to the victim. When the lab coat isn't there to "remind" us to be obedient, we start thinking for ourselves. We even see the Institutional Self override personal expertise. In Variation 4, an electrical engineer, a man who professionally understood the dangers of the shock, continued into the high 200-volt range. When asked why, he admitted he assumed the experimenter "knew more about these issues" than he did. He shut off his own brain to stay in his role.
The Peer Pressure Multiplier: The Power of the Group
We are social creatures who look to others to define "normal." Milgram’s variations on group dynamics show that our individual conscience is a fragile thing when placed in a team.
The Rebels (Variation 17): When two other "teachers" (actors) refused to continue, only 10% of participants obeyed to the end. Seeing someone else disobey provides the "social cue" for courage; we realize the authority won't kill us for saying no.
The Conformists (Variation 18): When the other teachers obeyed and the participant only performed a minor subsidiary task, obedience spiked to a staggering 93%.
This is perhaps the most chilling finding of all. As the researcher noted: "Ninety-three percent of you would have killed this person... just because you're obeying Authority and other people are obeying Authority."
The "Puppy" Anomaly: Gender and the Limits of Empathy
When Milgram tested female teachers, he found they were identical to men: 65% obedience. While the women expressed more emotional distress and spoke more about their guilt, their behavior did not change. It suggests that human nature is consistent across genders, even if our social expressions of that nature vary.
However, a later study (Charles Sheridan and Richard King) involving shocks to a puppy (which was actually visible and crying) produced a macabre anomaly. Men refused to shock the puppy at higher rates, but in that specific study, every single woman shocked the puppy to the maximum level. The researchers admitted they had "no idea why" this result flipped. It remains a haunting mystery: why did the gender that expressed more empathy toward the human learner become the most compliant when a different kind of victim was involved?
The Seven-Step Recipe for "Institutional Evil"
Milgram identified seven conditions that allow an institution to "hack" the human brain, effectively replacing the Innate Self with a compliant Institutional Self:
Technical focus
Shift of responsibility
Hierarchy
Prestige of institution
Devaluing the victim
Voluntary participation
Gradual escalation
Let us take these up one by one.
Technical Task Absorption: Give people a specific task to focus on so the victim fades into the background. In philosophy, Heidegger called this the "ready-to-hand" state—your brain is so locked into the "tool" and the "task" that the moral reality of the situation disappears.
Shift of Responsibility: Provide an authority figure who explicitly states, "The responsibility is mine," allowing the subject to feel like a mere instrument.
Hierarchy: Create an atmosphere where the moral sense shifts toward being "polite" and "obedient" rather than "kind."
Justifying Institution: Use the prestige of a name like "Yale" or the goal of "Science" to rationalize the harm as part of a noble cause.
Devaluing the Victim: Leave room for the participant to rationalize that the victim "deserved it" for being slow or getting answers wrong.
Voluntary Start: Ensure the subject enters the situation of their own free will. If they feel they chose to be there, they feel a social obligation to finish the "job."
Slow Escalation: Start with a tiny 15-volt shock. By the time the harm is severe, the brain has already rationalized the previous step, making a refusal now feel like a shameful admission of previous guilt.
Conclusion: The Question of Everyday Compliance
The conditions Milgram identified—hierarchies, technical tasks, and justifying institutions—are not confined to a laboratory. They define our classrooms, our corporate offices, our sports teams, and our governments.
This leads to a haunting question: If the recipe for institutional cruelty is so common, why don't we see more of it in our daily lives? Perhaps it is because our innate self, the sympathetic core that appeared so clearly in Variation 11, is quietly holding the line more often than we realize. We are all capable of becoming the person who flips the switch, but we are also the person who, when left to our own devices, chooses not to. The real test is whether we can trust that innate self to stay awake the next time a "lab coat" tells us that the experiment requires us to continue.
The 1962 Milgram experiment serves as a stark warning about the vulnerability of the human conscience. Stanley Milgram’s synthesis was sobering: human nature cannot be fully counted on to insulate us from participating in brutality. A substantial proportion of people will do what they are told, regardless of the content of the act or the protests of the victim, as long as the command originates from a perceived legitimate authority.
We must reflect on the scale of this power. If an anonymous researcher in a temporary laboratory could command everyday citizens to ignore a man's heart condition and administer severe shocks, we are left with a haunting question: What could a government, with its vastly greater prestige, authority, and resources, command of its subjects today?
References
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378.
Milgram, S. (1974). Obedience to Authority: An Experimental View. Harper & Row.
Perry, G. (2012). Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. The New Press.
Haslam, S. A., & Reicher, S. D. (2017). 50 years of "obedience to authority": From blind conformity to engaged followership. Annual Review of Law and Social Science, 13, 59–78.