How far will we go to conform when it means disregarding our own beliefs and sensory observations? In the 1950s, an experimenter by the name of Solomon Asch sought to answer that question. In a similar, albeit more ethical, vein as the Milgram studies and the Stanford Prison Experiment, Asch began investigating the lengths people will go to in order to conform. He attacked this question by gathering students at Swarthmore College and asking them a simple question: “Which line on card 2 most closely matches the length of the line on card 1?”
Imagine you’re in this study, and you are seated at a table with five other participants. Everyone is expected to give an answer out loud. The first person says, “B.” The next agrees, and the next. You’re the last to respond. What do you say?
32% of the time, people in this position agreed with the crowd and answered B (C is correct). Fortunately for your ego, this experiment is based on deception, and you are the only real participant in the room. Everyone else was instructed to give an incorrect answer to see how you would respond.
How confident do you think you would be in your answer if you changed it to match everyone else’s? After the experiment, Asch asked participants individually how confident they felt in their answers (without giving away the nature of the experiment), and, shockingly, most felt quite confident in their responses regardless of whether they were correct. They trusted the crowd more than they trusted their own eyes.
If this line of reasoning sounds familiar, that’s because we’ve seen it time and time again throughout history. Researchers like Milgram and Zimbardo showed that people will follow a group or an authority figure even when doing so defies their senses or values, and the consequences can be ghastly. The complicity of German citizens in the Holocaust and the inhumane treatment of prisoners at Abu Ghraib are real-life replications of their findings. Asch, Milgram, and Zimbardo tapped into a human mechanism that is almost too macabre and unethical to study safely. These experiments, along with their real-world parallels, have shown that when our social cues conflict with our personal values, we will override our senses in favor of social acceptance.
As a student-researcher, my mind immediately jumps to exploration. What exactly is this “moral switch” that can be flipped without our conscious awareness, allowing us to violate our own values and even commit atrocities without regret? How can we safely study this phenomenon and intervene in a way that prevents such blind following in the face of social pressure? Furthermore, could we prevent events as extreme as genocide by targeting this subconscious mechanism?
As science moves forward, more questions are asked than can be safely answered. Obviously, we cannot keep experimentally deceiving people to the point of emotional stress; in the aforementioned Milgram and Zimbardo studies, it was precisely the unethical conditions that produced that stress in participants. There are ways of tapping this construct ethically, and Asch was on to something, but this area of research remains unstable ground. Knowing this, I think you should ask yourself, “Will I be willing to stand up when I know everyone else is wrong?”