
What You Think You Know About Social Psychology Is Wrong

Revisiting truisms within the field.

This semester, I have found myself in the position of myth-busting in both my undergraduate and graduate courses. So much of what we know or believe to be true gets taken for granted over time, and we don’t stop to question whether what we think we know holds up to further scrutiny.

Certainty gives us a sense of security, especially during times of threat, which we have all been living under since the start of the global pandemic. However, certainty also blinds us to the possibility that what we know may only be part of a larger story, or, gasp, that what we believe to be true is just flat-out wrong.

In the spirit of revisiting what we have banked as set knowledge or truth, I would like to identify some of the most common myths regarding prominent studies and theories in social psychology based on my experience teaching it in higher education:

Myth 1: Solomon Asch’s “conformity” study

It is embedded right there in the name: the series of studies Solomon Asch performed in the 1950s to test the extent to which individuals maintain their independence, or free thinking, in the presence of others became known over time as the classic “conformity” study. It is one of the most widely cited studies in the social-psychological canon, and college students are routinely introduced to the experiment Asch conducted, in which a naïve participant is seated in a room surrounded by confederates. A confederate is someone who appears to be a fellow participant in a study but who is actually part of the research team. The actual participant is given a fairly simple task: identifying which of three lines of varying lengths on one poster most closely matches the length of a single line on another. Unknown to the naïve participant, the confederates sitting around the table have been instructed, beginning after the first two trials, to unanimously give the wrong answer as they share their responses in turn. The crux of Asch’s question was this: would the real participant bow to group pressure, or doubt their own perception of the correct answer, after everyone before them had named a different line as the match?

Over 70% of the participants always or mostly always dissented, continuing to voice the correct answer even though it defied the response of every confederate who spoke before them. In fact, each real participant went through 12 such trials. Based on participant responses, Asch “concluded that he had convincingly demonstrated powers of independence under certain highly demanding conditions” (Friend et al., 1990, p. 30). Moreover, Asch’s studies identified which variables made participants more or less likely to maintain the right answer or, conversely, to conform to the wrong one. For instance, participants in a variation where at least one confederate before them also stated the correct response were far less likely to yield to group pressure; in that case, only 5% conformed.

However, over time, rather than focusing on the complexities of independence versus conformity that the research demonstrated, it was essentially reduced to the “conformity study.” The 37% of trials in which participants yielded to a clearly wrong answer became the focus of dialogue surrounding the study, driven largely by how social psychology textbooks presented the research.

Friend et al. (1990) investigated how the study was presented in textbooks from the publication of Asch’s work in the 1950s through the 1980s and identified significant distortions of the original findings. The more recent the textbook, the more likely it was to focus on the conformity aspect of the study rather than on what it demonstrated about asserting one’s independence. The takeaway: over time, the canon within the field, as reflected in these texts, has “increasingly accentuated the role of conformity and underestimated that of independence” (Friend et al., 1990, p. 29).

Such has become the norm within social psychology today, with generations of students being exposed to Asch’s work as the ultimate manifestation of conformity.

Myth 2: Milgram’s shock study revealed “blind” obedience

Perhaps not so shocking to hear, the shock heard round academia, Milgram’s Yale obedience study, would be similarly distorted over time. Milgram worked with Asch while he was a student at Harvard, and one can see how the design of Asch’s study influenced Milgram’s own. In the early 1960s, Milgram set out to test whether the pressure to obey an authority figure could compel a person to harm a seemingly innocent other.

This study is perhaps better known, and even more infamous, than Asch’s. Milgram created an experimental design with a “cover story”: when participants volunteered for his research at Yale University, they thought they were joining a study testing the effects of punishment on memory. What the participants didn’t know is that Milgram was actually testing whether they could be compelled to shock another apparent participant (in reality a confederate; no one was actually harmed) each time that person gave a wrong answer on a memory task. The confederate sat in another room, communicating with the naïve participant through a microphone; whenever the confederate answered incorrectly, the naïve participant was to administer a shock (the confederate was not actually being shocked, and the responses from their end were prerecorded). The participant sat at the shock machine alongside a head researcher in a white lab coat who was clearly in charge of the study. This authority figure continued to urge the naïve participant to administer shocks for the purposes of the research whenever they appeared to waver. The shock machine started at 15 volts and escalated all the way up to 450, with labels on the machine so that the participant knew how dangerous the shock they were about to deliver might be.

The question was: how many participants would go all the way up the voltage scale to XXX, the label that represented 450 volts, even as the confederate screamed in agony and begged for the study to stop? The rest of this story may sound familiar. The results that everyone talks about come from a sample of 40 male participants, 65% of whom went all the way up to 450 volts (of those who disobeyed, reportedly none stopped before 300 volts). And there you have it: Milgram demonstrated our proclivity to blindly obey an authority figure.

Or did he? In fact, there was no evidence that the obedience was “blind.” Participants stuttered, laughed hysterically, and looked back at the authority figure as the shocks escalated, demonstrating clear signs of agitation, stress, and nervousness. Moreover, the research didn’t start and end with this one condition.

Milgram conducted some 20 experimental variations, and the extent to which participants obeyed varied across them. The most commonly reported results represent a condition that actually yielded one of the highest rates of obedience. For instance, in a Bring-a-Friend condition that wasn’t published until after Milgram’s death (Rochat & Blass, 2014), disobedience was the norm when participants were tasked with shocking their friends, and the earlier it occurred, the less likely participants were to follow the orders of the authority. Taken together, rather than demonstrating blind obedience to authority, Milgram was quantifying which situational factors would render a person more or less likely to obey.

The nuances of obedience that Milgram identified across this series of studies are critical to understand, because crimes of obedience are complex as well and rarely, if ever, reflect merely blind obedience to authority. Reducing the scope of his work to the most controversial or “shocking” results from a much larger pool of data is not only unfortunate but also misleading when it comes to deconstructing the role obedience plays in destructive behaviors such as mass violence and genocide.

Myth 3: Mass suicide at Jonestown and mind control

From faulty declarations of “blind” obedience to authority to notions of brainwashing or mind control: in this case, social psychologists themselves have played a central role in perpetuating the mythology that the 1978 massacre at Jonestown, in a remote jungle in South America, was a mass suicide of more than 900 people spellbound by Jim Jones.

If you have ever heard the expression “drinking the Kool-Aid,” you know the story (or think you know it): Jim Jones was the charismatic leader of a cult known as the Peoples Temple, who moved his congregation from California to a remote jungle in Guyana. At his behest, over 900 of his followers blindly heeded his call to mass suicide and drank cups of Kool-Aid laced with cyanide (while also willingly facilitating the murder-suicides of the 300 or so children under 17 who were there).

Philip Zimbardo, a prominent social psychologist best known for his own infamous Stanford Prison Experiment, wrote an article for the APA identifying the “mind control” techniques Jones used to essentially compel his followers to willingly kill themselves and their loved ones. It sounds like compelling stuff, and indeed, what social psychologist wouldn’t like to elevate the significance of our work by offering legitimate theories to explain such horrors? Except this version of the story isn’t exactly true.

Jones lured his followers to this remote jungle under false pretenses: they thought they were joining an agricultural utopia where they would work the land and cultivate community with their fellow members. But there wasn’t enough food to feed everyone, and people both young and old were forced into difficult manual labor many weren’t suited for, for hours on end, in sweltering heat, all while Jones’ voice blasted over loudspeakers, ranting in increasingly paranoid rhetoric about their imagined enemies and the evils of the government. Jones was not only clearly deranged and becoming more and more unhinged, to the alarm of his followers; he was often high on amphetamines and engaged in brutal tactics against his own people. And once you got there, the jungle compound was remote and surrounded by armed guards, and his inner circle of true believers confiscated members’ passports. These followers weren’t blindly following anyone; they were sleep-deprived, hungry, and trapped.

A congressman from California, Leo Ryan, was alerted by family members in America who suspected that their loved ones in Guyana were being held against their will. These family members also suspected their loved ones had been abused and exploited by Jones. The congressman came to the remote jungle with a small contingent of investigators, and over a visit that spanned three days, a number of defectors passed notes to journalists saying that they wanted out. When the group ended the visit, they took these members with them, presumably to return them to safety in America.

Jones directed his own group of henchmen to follow the congressman and the departing members to the airstrip where they were heading; follow they did, in a truck loaded with guns, which they used to try to kill everyone on the airstrip who was attempting to escape. Back at the compound, Jones announced via loudspeaker that the day of reckoning was upon them, as fruit drinks (it wasn’t actually Kool-Aid but a similar product, Flavor Aid) laced with cyanide, sedatives, and tranquilizers were prepared.

This is where the story gets murky, however. Those who survived insist that the members did not willingly kill themselves. Moreover, there is evidence that Jones purposely had toxins squirted into the mouths of the babies and children without the consent of their parents, compelling the parents to follow suit. There were also troves of syringes found at the scene, suggesting the possibility of something other than suicide.

Former members recall that Jones constantly played mind games with his followers: distributing food or drink, declaring after it had been consumed that he had poisoned everyone, and then announcing in the next breath that this was actually just a loyalty test to gauge whether members would be willing to die for their cause. He was a perennial showman, giving sermons while still based in the United States in which he purportedly cured people of their physical ailments, or staging “shootings” in which he would drop to the ground screaming that he had been shot, only to emerge fully intact. It is entirely plausible that members became desensitized to such antics and did not take them literally or seriously, given Jones’ proclivity for spectacle.

In the aftermath of the Jonestown massacre, scholars began to refer to such antics as “suicide drills,” as if they reflected his followers’ intention to rehearse ending their lives, without digging deeper into how seriously such measures were actually taken. Is it really a “suicide drill” if you are simply eating or drinking within your community, only to be told after you have consumed your food that it has been poisoned?

While the debate over whether the massacre reflected suicide or murder has not been resolved, this murkiness should be acknowledged in social psychology and included in any examination of the events. Former members and those who survived in the jungle that day insist that members did not willingly end their own lives or those of their children. All told, more than 900 women, children, and men perished at Jonestown, an event that stood as the largest single-day loss of American civilian life in a deliberate act until the terrorist attacks of 9/11.

To learn more about this atrocity, I recommend the documentary Truth & Lies: Jonestown, Paradise Lost (2018), available on Hulu. You’re Wrong About is also a compelling podcast series that dedicates an episode to Jonestown.

References

Friend, R., Rafferty, Y., & Bramel, D. (1990). A Puzzling Misinterpretation of the Asch ‘Conformity’ Study. European Journal of Social Psychology, 20, 29-44.

Rochat, F., & Blass, T. (2014). Milgram’s Unpublished Obedience Variation and Its Historical Relevance. Journal of Social Issues, 70(3), 456-472.
