One of the most daunting public health challenges is getting people to take care of themselves in the most basic ways. It’s not that people with cardiac risk don’t know about exercise and its heart benefits. Or that people with diabetes are unaware of insulin treatment. Or that the elderly don’t know about the flu and flu shots. It’s that they don’t take the first steps in helping themselves get and stay healthy, like seeing a physician, having a checkup, or filling a prescription. In this sense, the biggest health risk for many is doing nothing, and the cost of this medical non-compliance could be as high as $100 billion a year in the US alone.
There are lots of complex reasons for this seemingly self-destructive inaction. Many people are intimidated by doctors’ offices. Others simply cannot afford to take care of themselves. But the problem may be rooted in something even more fundamental: a powerful cognitive bias against change.
Stanford University psychological scientist Gaurav Suri calls this the “status quo bias.” This means simply that, given the option, people will continue doing what they’re doing—they won’t proactively choose to make a change if they don’t have to. This bias has been demonstrated in many other contexts, but usually where both options are equally desirable. Suri and his colleagues wondered if it might also be at work in patient inertia, in which the default option is unambiguously more harmful than the alternative. They ran a series of experiments to simulate medical decision making in the laboratory.
They used the threat of electrical shocks to represent an unhealthy and anxious status quo. Most people, given the choice of waiting for a shock or having that shock right away, opt to get it over with. So in the first experiment, half the volunteers were forced to choose between the unpleasant status quo—waiting anxiously—and shortening the waiting time. These volunteers served as a control group: Their choices were presumably the choices we would all make if there were no status quo bias at work. The other half, the experimental group, had the same options—but they could make the choice at any time during the trials, or not at all. Suri and colleagues wanted to see if the status quo bias would prevent them from acting in their own best interest.
And it did, fairly dramatically. Unless they were required to make a choice, the volunteers stuck with the status quo most of the time, even though it was psychologically aversive. This is like fretting over a mysterious, persistent pain—but refusing to get it checked out.
This actually surprised the scientists, but they wanted more evidence. After all, both choices here ended with a shock—it was just a matter of timing—so perhaps that influenced the volunteers’ choices. So in a second study, the scientists made the choice even starker than before, to see if they could eliminate the status quo bias.
In this study, the choice was between almost certain shock—the default position—and a dramatically diminished chance of being shocked. All they had to do was push a button to opt out of the more perilous position, so who wouldn’t opt out? Suri and the others were confident that the status quo bias would lose its force when the stakes were so undeniable.
But they were wrong. The results were the same as before. Unless the volunteers were forced to make a decision, they didn’t make one. They inexplicably stuck with the status quo almost half the time, even though this almost guaranteed they would get the unpleasant shock. And here’s the most striking finding: When they talked with these volunteers afterward, all of them—100 percent—said they thought everyone would opt out of the shock, and they couldn’t explain why they themselves had not. It’s as if an invisible force was pushing them into a choice they knew didn’t make any sense.
This is much like patient inertia. Patients with compelling reasons to change opt instead to stick with a personally harmful default position by doing nothing. The scientists wondered if there might be a way to trump such irrationality, which is what they attempted in a final experiment.
They basically ran a modified version of the earlier studies. In this one, all of the volunteers could opt out of the shocking condition or not. But half of them, during practice trials before the actual study, were instructed to press the button that would protect them from the shock. This was the extent of the intervention, and they only had to do it once. The theory here was that this simple action—pushing the button to opt out of the status quo—would be enough to counter their psychological inertia and weaken the bias against change.
And it apparently was enough. As reported in a forthcoming issue of the journal Psychological Science, those who went through this simple intervention were much more likely, later on in the real trials, to opt out of the unpleasant status quo. They had, in effect, become more rational in their decision making.
It’s not entirely clear what is driving this change of mind, but these findings nevertheless have policy implications. People can’t be forced to get a flu shot or an EKG, but these findings suggest that breaking the ice is psychologically important. Perhaps public health strategies could focus on inducing this kind of initiation—trying something once, for the first time. Going for a single medical checkup, or a single jog in the park, or a one-time flu shot—these steps might overcome patient inertia in a lasting way.
Wray Herbert’s blogs—“Full Frontal Psychology” and “We’re Only Human”—appear regularly in The Huffington Post.