Starting next week and through October, President Barack Obama and Gov. Mitt Romney will face off in a series of four televised debates, designed to clarify the candidates’ positions on the most pressing public policy issues confronting the nation today. In place of the ideals and elegant rhetoric of the campaign trail, the leaders of the two major parties will have an opportunity to describe the nitty-gritty of governing—how they will deal with complex matters like affordable health care, foreign policy in the Middle East, job creation, equitable taxation, and more.
But the unfortunate reality is that Americans won’t get much in the way of detail and explanation. If history is any guide, the debate moderators will not press very hard for nuts and bolts, instead allowing the candidates to evade and attack and talk in unhelpful generalities. They will preach in pre-tested catch phrases to the already converted, rather than really explaining the difficult day-to-day realities of decision making in a democracy.
Cynics will say that it doesn’t matter, that voters’ minds are made up anyway. But if national debates aren’t the venue for challenging citizens’ thinking, then where? Voters need to understand the prosaic details of complex policies. Most have staked out positions on these issues, but they are not often reasoned positions, which take hard intellectual work. Most citizens opt instead for simplistic explanations, assuming wrongly that they comprehend the nuances of issues.
Psychological scientists have a name for this easy, automatic, simplistic thinking: the illusion of explanatory depth. We strongly believe that we understand complex matters, when in fact we are clueless, and these false and extreme beliefs shape our preferences, judgments and actions—including our votes.
Is it possible to shake such deep-rooted convictions? That’s the question that Philip Fernbach, a psychological scientist at the University of Colorado’s Leeds School of Business, wanted to explore. Fernbach and his colleagues wondered if forcing people to explain complex policies in detail—not cheerleading for a position but really considering the mechanics of implementation—might force them to confront their ignorance and thus weaken their extremist stands on issues. They ran a series of lab experiments to test this idea.
They started by recruiting a group of volunteers in their 30s—Democrats, Republicans and Independents—and asking them to state their positions on a variety of issues, from a national flat tax to a cap-and-trade system for carbon emissions. They indicated how strongly they felt about each issue, and also rated their own understanding of the issues. Then the volunteers were instructed to write elaborate explanations of two issues. If the issue was cap-and-trade, for example, they would first explain precisely what cap-and-trade means, how it is implemented, who it benefits and who it could hurt, the sources of carbon emissions, and so forth. They were not asked for value judgments about the policy or about the environment or business, but only for a highly detailed description of the mechanics of the policy in action.
Let’s be honest. Most of us never do this. Fernbach’s idea was that such an exercise would force many to realize just how little they really know about cap-and-trade, and that, confronted with their own ignorance, they would dampen their own enthusiasm. They would be humbled, and as a result take less extreme positions. And that’s just what happened. Trying—and failing—to explain complex policies undermined the extremists’ illusions about being well-informed. They became more moderate in their views as a result.
Being forced to articulate the nuts and bolts of a policy is not the same as trying to sell that policy. In fact, talking about one’s views can often strengthen them. Fernbach believes it’s the slow, cognitive work—the deliberate analysis—that changes people’s judgments, but he wanted to check this in another experiment. This one was very similar to the first, but some volunteers, instead of explaining a policy, merely listed reasons for liking it. Consider universal health care, for example: It’s highly complex and challenging to explain, but much easier to label it “compassionate” or, alternatively, “European” or “socialist.” So some volunteers were assigned to do the hard explaining and others the simplistic labeling.
The results were clear. As described in a forthcoming issue of the journal Psychological Science, those who simply listed reasons for their positions—articulating their values—were less shaken in their views. They continued to think they understood the policies in all their complexity—and, notably, they remained extreme in their passion for their positions. In a final version of the study, volunteers who were forced to confront their inadequate knowledge gave less money to a group advocating for their position, suggesting that, with their extremism attenuated, they actually acted more moderately as well.
Americans in 2012 are about as polarized and partisan as they’ve ever been, and such polarization tends to reinforce itself. People are unaware of their own ignorance, and they seek out information that bolsters their views, often without knowing it. They also process new information in biased ways, and they hang out with people like themselves. All of these psychological forces increase political extremism, and no simple measure will change that. But forcing the candidates to provide concrete and elaborate plans might be a start—it gives citizens a starting place. As former presidential hopeful Ross Perot famously stated: “The devil is in the details.”
Excerpts from Wray Herbert’s two blogs—“Full Frontal Psychology” and “We’re Only Human”—appear in The Huffington Post and Scientific American Mind.