This month's
presidential election was between two fairly centrist candidates.
And yet political discourse between ordinary Republicans and Democrats is more
contentious and hostile than it's been
in decades. I bet you strongly agree with one of these statements:
- If you're a Democrat: The Obama reelection campaign was largely based on
telling the truth. The Romney campaign was largely based on lies.
- If you're a Republican: All political campaigns
stretch the facts from time to time to make a point. Romney and Obama both did.
I'd like to
suggest that both these statements are false.
Let's first take
on the claim that the Obama reelection team did not lie. During the campaign
President Obama said, directly and through campaign advertising, that Romney opposed gay
adoption, that he opposed abortion even in cases of rape or incest, and that his plan could
take away middle-class tax deductions. He claimed that during his first term we doubled our
use of renewable energy and doubled exports, and that 30 million Americans are going to get
health care next year because of Obamacare. And that's before we even get to how the
campaign twisted the facts about when Romney left Bain Capital to make him look bad.
So, "everyone
does it"? Not so fast, Republicans. A quick dip into the evidence makes it
clear that Mitt Romney's lies were of a scope, magnitude, and brazenness
unmatched in modern political history. His first TV ad, back in November
2011, featured audio of Obama saying, "If we keep talking about the
economy, we're going to lose." That sounds terrible -- except it's from
2008, and Obama actually said, "Senator McCain's campaign actually said,
and I quote, 'If we keep talking about the economy, we're going to lose.'"
Romney said that Obama began his presidency with an apology tour, that he
wants to end Medicare as we know it, and that he didn't mention the deficit or
debt in the 2012 State of the Union (he mentioned them six times). Then there
were Romney's misleading statements
about the Cairo embassy situation, his running
mate's untruth-riddled
convention speech, and the campaign's blatant admission, "we're not
going to let our campaign be dictated by fact-checkers."
What's happening
here? We weigh facts and lines of reasoning far more strongly when they favor
our own side, and we minimize the importance and validity of the opposition's
arguments. That may be appropriate behavior in a formal debate, or when we're
trying to sway the opinion of a third party. But to the extent that we
internalize these tendencies, they injure our ability to think and see clearly.
And if we bring them into the sort of open and honest one-on-one political
debates that we'd like to think Americans have with each other, we strain our
own credibility and undermine the possibility of reaching an understanding.
A defense attorney presents the best case for his client's innocence in court,
but he's honest with himself about where the truth of the matter lies. Too often in
political arguments we have drunk our own Kool-Aid. Take, for example, the national
debate we had about torture during the Bush Administration. Almost universally,
people who thought torture was unjustified also believed it was ineffective,
and people who thought it was justified believed it was effective. There are
two questions: Is torture effective for increasing national security? (A matter
of fact.) If it is, is it justified? (A matter of judgment.) The pursuit of
the answer to the factual question was anemic because most partisans appeared
to assume that the answer supporting their moral judgment was correct.
A recent report on three
psychological studies by professors from the University of California,
Irvine confirms this bias, and points out that it's pervasive across a wide
range of human situations. Where our moral judgments come into conflict with
evidence, we look for ways to dismiss and minimize the evidence:
> For example, many political conservatives believe that promoting condom use to teenagers is inherently wrong. This deontological intuition conflicts with consequentialist sensibilities, however, if one also believes that condoms are effective at preventing pregnancy and sexually transmitted diseases (STDs). Individuals can resolve this conflict by becoming unskeptical consumers of information that disparages the benefits of condom use (e.g., their prophylactic effectiveness) or enhances its costs (e.g., encouragement of promiscuous sex). [...] Analogously, liberals who feel moral disgust toward the death penalty should be prone to believe information emphasizing its ineffectiveness at deterring future crime or the risks of wrongful execution[.]
It's
hard not to read that and think of pundits from opposing political teams
arguing on cable television for low-information voters. But cherry-picking
facts while trying to persuade someone to take our side or while engaging in a
debate is one thing. It's something else to do it while reasoning for
ourselves. And yet it seems that our brains are wired to do so; it's not a
phenomenon brought on by soundbite culture.
> While individuals can and do appeal to principle in some cases to support their moral positions, we argue that this is a difficult stance psychologically because it conflicts with well-rehearsed economic intuitions urging that the most rational course of action is the one that produces the most favorable cost-benefit ratio. Our research suggests that people resolve such dilemmas by bringing cost-benefit beliefs into line with moral evaluations, such that the right course of action morally becomes the right course of action practically as well. Study 3 provides experimental confirmation of a pattern implied by both our own and others' correlational research (e.g., Kahan, 2010): People shape their descriptive understanding of the world to fit their prescriptive understanding of it. Our findings contribute to a growing body of research demonstrating that moral evaluations affect non-moral judgments such as assessments of cause (Alicke, 2000; Cushman & Young, 2011), intention (Knobe, 2003, 2010), and control (Young & Phillips, 2011). At the broadest level, all these examples represent a tendency, long noted by philosophers, for people to have trouble maintaining clear conceptual boundaries between what is and what ought to be (Davis, 1978; Hume, 1740/1985).
The
studies further show that this effect is stronger in well-informed, politically
engaged individuals. The more information we have, the higher our propensity to
cheat with it. I've been talking to a lot of people on both sides of the
election, and the thing I'm often struck by is an inability to find any
validity in the opposing side's arguments. By blocking our ability to have
meaningful conversations, this effect is actually harming political discourse.
Luckily, it's not impossible to overcome. Like other cognitive biases, this
effect can be taken into account once we become aware of it: we can actively
work to hedge against it by checking our beliefs against the facts. Instead of latching
onto the weakest arguments of those we disagree with, we can look at their
strong arguments and try to see where they may be right. Embarking on
conversations this way pays off in better discourse, and a better chance of
someone changing their mind. Of course we can't do anything about this effect
in actual policy-makers, who are prone to basing political decisions on
prescriptive thinking and selective facts. All we can hope is that good
intellectual habits trickle up.