Milgram’s Infamous Shock Studies Still Hold Lessons for Confronting Authoritarianism

Why ordinary people will follow orders to the point of hurting others remains a critical question for scientists—though some answers have emerged


Fifty years ago Stanley Milgram published his book Obedience to Authority, which described what have arguably become the most famous experiments in psychology. As the book detailed, an experimenter informed participants—called “teachers” in the study—that they would be administering progressively stronger shocks to people called “learners.” The shocks were not real, but participants thought they were.

The majority of participants proceeded to shock learners despite hearing screams of apparent agony. Under the experimenter’s direction, two thirds of participants in the study’s canonical version continued to the highest voltage level, even when the learner complained of heart problems and later stopped screaming, presumably having lost consciousness.

Today Milgram’s “shock experiments” appear in psychology textbooks, movies, TV shows, blog posts and podcasts. When a point must be made about humans’ vulnerability to pernicious authority figures, Milgram’s findings are there. But this work has never sat comfortably with scientists or the public. It was, and still is, upsetting to know that participants did what they did. People think, “Surely I wouldn’t do that.” Philosophers and scientists are similarly surprised that so many of the supposed teachers “followed orders” all the way to the maximum shock voltage.


Researchers have taken a critical lens to Milgram’s work time and again. Well-founded concerns about the ethics of putting participants in such distressing circumstances have led to much stricter research standards. Further, an important reform movement in psychology has prompted reexamination of many classic experiments. One critique suggests that Milgram may have misinterpreted his findings: perhaps the participants did not actually believe the “learners” were being shocked.

By reexamining the data from Milgram’s experiments and considering the outcomes of several conceptual replications (more recent studies that used different approaches to probe people’s susceptibility to authority figures), we determined that, in fact, Milgram’s work and conclusions still stand. That finding has several important implications, particularly for confronting the knotty question of how people might overcome the tendency to submit to malevolent authority.

First, we should note that Milgram’s experimental paradigm is robustly replicable. Milgram himself closely replicated the findings of the canonical version of his experiment at least three times. In addition, we’ve identified 20 replications from around the world with varying degrees of fidelity to the original study. One variation involved participants carrying out orders to torment “job applicants” by making negative comments until the applicants failed their qualification exams and lost their chance at employment. Another used a game-show scenario in which participants questioned and shocked fellow contestants in front of a studio audience. These efforts show that many people follow the instructions of various kinds of authority figures—even to the point of causing others extreme distress.

But do participants believe these setups are real? When we reanalyzed data from Milgram’s original experiments, we found that the evidence is strongly against the notion that people followed orders because they didn’t believe in the experimental scenario. When the experiment was over, Milgram told participants that the learner was not really being shocked and asked them if they had believed the shocks were real. The participants overwhelmingly affirmed belief in the experimental protocol, Milgram’s data show. Indeed, videos of these experiments (both Milgram’s own and others’) are disturbing in part because of the participants’ acute discomfort, anxiety and stress. Why would they be upset if they knew the shocks were fake? Furthermore, when we look at Milgram’s data, we find that obedient and disobedient participants reported very similar levels of belief in the experiment.

These studies reveal that we are motivated to carry out an authority figure’s requests. The question is whether we can guard against that tendency. In his theory of moral disengagement, the late social psychologist Albert Bandura describes blame-shifting as one powerful mechanism that allows people who carry out immoral orders to “disengage” from their moral compass. For example, by claiming “I was just following orders,” people move culpability onto the person who issued the command, avoiding self-condemnation. Milgram’s experiments provide dramatic evidence of a kind of blame-shifting called victim-blaming. One of his subjects reported being “disgusted” when the learner wouldn’t cooperate, stating, “You better answer and get it over with. We can’t stay here all night.”

One of us (Niemi) has studied when and why people blame victims for their own suffering and has found that the more people express strong support for moral values centered on authority and traditional hierarchies, the more likely they are to agree that victims deserve their misfortune. Fortunately, the findings also suggest that the more that people express support for moral values centered on care and fairness, the more sensitive they are to victims’ suffering. Such values can be consciously cultivated and are highly prized by many different communities. These findings apply across different political groups, genders and religious beliefs.

There are also slivers of hope within Milgram’s original experiment and variations. For instance, when participants chose the voltage themselves, very few doled out maximum punishment to the “learners.” Most people—far from being naturally sadistic—were averse to inflicting painful shocks. Strikingly, people overwhelmingly resisted the experimenter’s directions when they were joined by two “defiant peers” who refused to follow orders. Imagine the power for good each of us could have if we were to join together against authoritarian influence.

When we zoom out to the big picture, we can see that Milgram’s work also points to the seriousness of selecting appropriate leadership—whether in the boardroom or for political office. Ultimately, the people in charge can influence many others to follow their direction. That’s just as important to understand now as it was half a century ago.

The authoritarianism that was the impetus for Milgram’s work remains on the rise worldwide, and with it has come the normalization of violating core democratic values: impartiality, transparency, openness, protection from harm and recusal from conflicts of interest. In response, research programs investigating democratic backsliding and the rise of totalitarian governance are expanding, not only in psychology but also in adjacent fields such as public policy, political science, sociology and philosophy. It’s therefore critical to correct misinterpretations of Milgram’s work.


This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.


