The Wall Street Journal:
The psychologists Daniel Kahneman and Amos Tversky identified certain consistent deviations from rationality that seem to be endemic among human decision-makers; Kahneman won the Nobel Prize in Economics in part for that work.
Susceptibility to “framing” is one such bias. Consider a situation in which a disease puts 600,000 people at risk, and it’s your job to choose the medicine to deploy. One option is presented this way:
If you choose Medicine A, 200,000 people will be saved.
If you choose Medicine B, there is a 33.3% chance that 600,000 people will be saved and a 66.6% chance that no one will be saved.
That’s the “gain-frame” version of the question. And in that framing, people tend to choose the sure thing. The “loss-frame” goes like this: Choose Medicine A and “400,000 will die”; choose Medicine B and there’s a 33.3% chance that “no one will die,” a 66.6% chance that “600,000 people will die.”
If you look at the options closely, you can see that they are logically equivalent. But people tend to take the sure thing when it involves lives saved, and to take a risk when the sure thing involves lives lost.
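That equivalence is easy to verify with a little arithmetic. The sketch below (an illustration, not from the article; the variable names are invented, and the article's "33.3%" is taken as exactly one-third) checks that the gain frame and the loss frame describe identical outcomes, and that both medicines save the same number of people in expectation:

```python
from fractions import Fraction

AT_RISK = 600_000
p_save_all = Fraction(1, 3)  # the article's "33.3% chance", taken as one-third

# Gain frame: expected number of people saved under each medicine
a_saved = 200_000
b_saved = p_save_all * AT_RISK + (1 - p_save_all) * 0

# Loss frame: expected number of deaths under each medicine
a_die = 400_000
b_die = p_save_all * 0 + (1 - p_save_all) * AT_RISK

# The two frames are logically equivalent descriptions of the same outcomes:
assert AT_RISK - a_die == a_saved    # "400,000 die" is "200,000 saved"
assert AT_RISK - b_die == b_saved    # same check for Medicine B
assert a_saved == b_saved            # both save 200,000 lives in expectation
```

In expected lives saved, the two medicines are identical; only the wording differs.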
A second, related bias involves “loss aversion”: given a $10 stake, most people will pass up a 50-50 bet in which they would win $12 or lose the $10; the loss looms larger than the gain. (If you don’t take such bets as a Wall Street trader, however, you’ll soon be out of a job.)
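The arithmetic behind that parenthetical is simple: the bet has a positive expected value, which is why a trader who refuses it is leaving money on the table. A minimal sketch (illustrative only; variable names are invented):

```python
from fractions import Fraction

win, loss = 12, 10      # win $12 or lose the $10 stake
p = Fraction(1, 2)      # a fair 50-50 bet

# Expected value per bet: half the time +$12, half the time -$10
expected_value = p * win - p * loss
assert expected_value == 1  # +$1 per bet, in expectation
```

Loss-averse decision-makers decline the bet anyway, because the prospect of losing $10 weighs more heavily than the prospect of winning $12.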
A new study, however, whose central finding is nearly as mysterious as the biases themselves, finds that these much discussed cognitive quirks disappear when the thought experiments are presented to people in a language that they have studied but not fully mastered.
Read the whole story: The Wall Street Journal