I once worked for an organization that required a fair amount of business travel. I had a colleague who, for medical reasons, had to eat frequent, small meals. But when he submitted his expense report itemizing these meals, the company returned it to him, disapproved. He was told that the company would only reimburse for three meals per day.
My colleague re-submitted his expense report a few days later, claiming only three meals a day. But the report also specified various incidental expenses that had not appeared before—gratuities, bus fares, pens and notebooks. The bottom line was unchanged. What changed was the integrity of the expense report.
There is no question that my colleague cheated on his expenses. But he rationalized his strategy in ways that made it okay in his mind. After all, the company rules were unreasonable and inflexible, and invited cheating. What’s more, others were expensing lavish meals with expensive wines, so why should he have to pay out of pocket for his meager meals in the coffee shop? So what if he bent the rules a bit? At the very least, this was one of those morally ambiguous situations, a gray area.
Was it? Or is this just a self-serving rationalization on the part of my colleague? The fact is, people act dishonestly, in small ways and large, every day—and not just on expense reports. People lie to friends and family, and justify these as harmless, “white” lies. People cheat on their taxes, on their spouses, in sports and cards. They commit petty theft and other immoral acts when nobody is looking.
We only hear about the sensational cases of lying and cheating, but these small moral transgressions add up. So it’s worth trying to understand why people do wrong, and how they live with their everyday immorality. Intriguingly, people don’t lie and cheat indiscriminately—simply because they can get away with it. Even when there is no chance of being found out, people show some level of aversion to acting unethically. They want it both ways: to profit by dishonesty, but also to preserve some sense of themselves as moral beings.
This is the moral dynamic that interests psychological scientist Shaul Shalvi of Ben Gurion University of the Negev, Israel, who with his colleagues has been analyzing the many ways in which we rationalize bad acts and still manage to feel good about ourselves. In a forthcoming issue of the journal Current Directions in Psychological Science, the scientists summarize their work and the work of others on the rationalization of everyday dishonesty.
The scientists have found that we use different forms of rationalization, depending on whether our wrongdoing lies in the past or the future. When we are anticipating doing something bad, we tend to frame the situation in a way that seems ambiguous, and therefore avoids any moral dilemma or guilt—like my colleague above. One way to do this is by “shuffling facts,” a strategy less blatant than inventing facts outright. In one experiment, for example, participants rolled a die and claimed money based on the number they reported rolling. Interestingly, when they rolled the die three times, compared to just once, they reported higher numbers (and took more money) for the single roll that counted. Apparently, rolling three times—even though two rolls were meaningless—allowed participants to observe higher numbers more often, so all they had to do was shuffle a high number into the meaningful (and profitable) roll. This kind of cheating is more ambiguous, and thus easier to justify, than making up a false number out of whole cloth.
The scientists have also found that people rationalize wrongdoing when it benefits others in addition to themselves. Such “altruistic cheating” increases—and guilt decreases—as the number of beneficiaries increases. Similarly, people use “moral licensing”—a kind of balance scale—so that their beneficent acts entitle them to act immorally later on. So, paradoxically, doing good keeps us from feeling bad when we actually violate moral rules.
Sometimes, people act unethically without much forethought, and so must do some moral gymnastics to preserve their sense of moral integrity afterward. This, the scientists have discovered, is usually done through some kind of atonement. Literal or symbolic cleansing can alleviate guilt, and mild electrical shocks can bring redemption. Confession also helps moral violators to balance their moral ledger, and even “partial confessions”—admitting to some wrongdoing—seem to allow people to feel moral following a transgression. People also point to others’ immoral deeds to justify their own—as when my colleague chastised our employer’s rigid and unfair rules. Apparently, such “demonizing” of others permits people to see themselves as “ultra-moral” people who merely committed a one-time ethical slip.
Ultimately, behavioral ethicists aim to craft interventions that increase everyday ethical behavior. The science so far suggests that interventions could target these rationalizations, both before and after a moral violation. This might be done by clarifying and accentuating ethical codes, and by minimizing gray areas with concrete examples of misconduct—anything to weaken the impressive power of rationalization.
Follow Wray Herbert’s reporting on psychological science in The Huffington Post and on Twitter at @wrayherbert.