Motivated Reasoning Plus Ideology Can Make You Foolish

People talk themselves into foolish ideas when they’re blinded by ideology.

Key points

  • Motivated reasoning is a fundamental human tendency to reason one's way into a conclusion that prioritizes one goal or value ahead of others.
  • Ideology can cause analytical thinkers to reason their way into acceptance of dubious, ideologically driven decisions.
  • Learning to avoid ideological capture can reduce the likelihood of falling prey to such decisions.

In the past, I wrote about motivated reasoning as arising in situations where goals or values conflict. To some extent, most reasoning is motivated. We have goals we try to achieve and values that drive how we select among various choice options. Sometimes these goals/values are well aligned, and sometimes they conflict. When they conflict, we often attempt to reason our way into a conclusion that prioritizes one goal or value ahead of others. There’s nothing inherently wrong with doing so. It is a fundamental human tendency; values vary in the degree to which they take precedence in each situation; and there’s often a degree of subjectivity that determines whether a choice is reasonable.

Think about any purchase we might make. Reasonable choices are often a function of the criteria we use to select the product, and those criteria are an outgrowth of our goals/values (i.e., what is important to us in this purchase). We might prioritize cost, reliability, functionality, or some other criterion. We might attempt to balance multiple criteria and select a product that is good enough on each of them. Regardless of the criteria, a reasonable or good-enough choice is a function of how well it fits those criteria. If we change the criteria, and the corresponding prioritized values, we change which choices are reasonable.

For example, when setting out to buy a new car, a person might establish a set of criteria, evaluate options based on those criteria, and then make a choice based on those evaluations. In this instance, the criteria relate directly to the choice made.

But now let’s say there’s a conflict between two of our criteria: reliability and fuel efficiency. Making the choice using reliability as the priority would result in choosing Car A, but making the choice using fuel efficiency as the priority would result in choosing Car B. We really, really want Car B because of the way it looks, so we introduce a new criterion (which one looks better) and use it to justify purchasing Car B (even if Car A might have been a slightly better overall purchase).

In this instance, we engaged in motivated reasoning to resolve a conflict, and the end result may well be one that is perfectly justifiable. Motivated reasoning affected our final choice but did not fully circumvent the criteria-based reasoning we had established beforehand. We relied on our emotion to help us choose among final alternatives, rather than letting emotion be the predominant reason for the decision. There are other times, though, when motivated reasoning is the primary driver of our decision-making, leading us to accept some dubious conclusions.

Reasoning Into the Desired Outcome

Rather than using a forward, logic-driven process to reach a conclusion, where emotion may simply aid in making a final choice, we often start in reverse. We decide which option we want to choose and then attempt to reason our way into that choice. This is especially likely when there’s a strongly held goal, value, or belief causing us to want to make that choice. In such instances, our emotional connection to the preferred outcome motivates us to derive a reason to make that choice. In other words, we let our emotion supersede other evidence. Almost every impulsive decision we make, and many decisions we come to regret, result from this reverse engineering.

The key to this type of motivated reasoning is that, generally, we must be able to construct sufficient justification to lead us to accept our desired conclusion. If we cannot do so, we’re much less likely to act on it. We must believe the conclusion or decision to be justified, even if it means externalizing responsibility for our choice to someone or something else (e.g., my boss made me do it).

In the car example, we decide right away, before considering any other evidence, that we really want Car B. If we can reason our way into making that purchase, regardless of how Car B may have fared in a deliberate, evidence-based process, we’re susceptible to doing so. But if our self-justification falls apart (e.g., we can’t convince ourselves it’s a choice that makes sense), we’re likely to reject Car B as an option.

Ideology Reinforces Motivated Reasoning

And that brings me to a recent argument made by Gurwinder (2022), who provided an explanation for why smart people believe stupid things[1]. He cited and linked to prior research demonstrating that people with stronger analytical thinking, more basic knowledge of a topic, greater facility with statistical information, or more education were more susceptible to motivated reasoning when they also held strong beliefs related to the decision at hand. In other words, the people who should have been least susceptible to belief-driven bias (educated, analytical thinkers) were more motivated to reason their way into conclusions consistent with their beliefs.

But there’s more. The biases people demonstrated tended to conform to their ideology. This ties into a previous post I wrote on ideological thinking. Recall that ideological thinking has two components: the doctrinal component and the relational component. The doctrinal component provides the ideology’s beliefs and rules, and the relational component concerns attitudes about in-group and out-group members. So, when we strongly internalize the doctrinal and relational elements of ideological thinking (i.e., become true believers or zealots), it creates a situation in which those doctrinal beliefs are rarely challenged.

What this also means is that once educated, intelligent, or analytical people adopt an ideological position, they tend to be much more susceptible to motivated reasoning. Whereas less educated or less analytical thinkers may be more influenced by others (which is itself a problem), more analytical ideological thinkers require no such external influence: They are perfectly capable of reasoning themselves into conclusions consistent with their ideological views. That means the ideology becomes self-reinforcing.

This doesn’t explain how more analytical thinkers end up enthralled by ideology in the first place. After all, there must be a reason for them to adopt the views initially. Though I don’t have a complete answer here, one likely motivation is a desire to fit in, to be part of a group. Bonhoeffer’s Law of Stupidity holds that our social needs are a primary driver of our willingness to suspend critical thinking when doing so helps us belong.

The need for acceptance and group identity can be a strong motivator. If it is strong enough, it can override other, less strongly activated values, causing an individual to entertain, and even accept, some of the ideology’s doctrinal beliefs (Brandt, 2022). From there, more educated, analytical thinkers can simply do the rest, reasoning their way deeper into ideological thinking.

Key Takeaways

True belief in an ideology can lead to the endorsement of ideas that flow from that ideology, regardless of how questionable, dubious, or even foolish they may be. And true believers are not always, or even often, those with less intelligence, education, or analytical thinking. In fact, it is often quite the opposite. Whereas those with less intelligence or education often require ongoing reinforcement to maintain their ideological beliefs, those with more intelligence or education can more easily self-reinforce those beliefs via their own motivated reasoning.

I offer a twofold suggestion for avoiding ideologically driven foolish ideas:

  • Apply some of Gurwinder’s ideas: A healthy dose of both curiosity (a desire to learn more) and humility (learning to keep one’s own ego in check) can decrease ideological enchantment.
  • Learn from Bonhoeffer’s writing: Avoiding environments where a single perspective is reinforced and encouraged (i.e., an echo chamber) can reduce the likelihood that foolish ideas will be accepted with no critical thought.

Applying these two ideas is no guarantee against ideological thinking or the embrace of foolish ideas, but it should help to reduce the likelihood of both.

Footnotes

[1] While he pointed to intelligence, almost none of the studies measured intelligence. They measured knowledge, education, reflective thinking, and/or statistical competence.
