Morality is not a universal constant. Why, for example, would the government of Spain grant human rights to chimpanzees in 2007, while other governments continue to issue licenses for hunting seal pups? In a symposium today, three researchers discussed their efforts to investigate individual variation in morality.
One way to account for differences in morality is genetics. Abigail Marsh, a researcher from Georgetown University, shared her latest results from a study in which she and her team correlated alleles of the serotonin transporter gene SLC6A4 with participants’ responses to moral scenarios. Participants were asked to respond to dilemmas with various outcomes. The results indicated that people who were homozygous for the long allele of the gene found the “foreseen harm” scenarios more acceptable than people who were homozygous for the short allele. In other words, people with the short allele thought it was less acceptable to passively harm an innocent person (e.g., allowing a boulder to roll over someone) even if doing so saves more people.
Another potential source of variation is the mind, or “mind perception,” as Kurt Gray from the University of Maryland puts it.
“Mind is a matter of perception,” says Gray. “Because we can’t directly experience another person’s mind, we have to infer it.”
Mind perception refers to a person’s ability to perceive another person’s mind and to imagine what kind of mind it is. Gray showed results from a study in which people were asked about the “minds” of individuals such as an adult woman, a baby, or Superman, as well as other entities like animals, God, or a dead person. According to Gray, people were more likely to categorize babies and animals as having “less responsible” minds than they did adults.
Brain activity might also be a way to parse out differences in morality. Joshua Greene from Harvard University shared data from an fMRI study in which people were given the opportunity to lie about the results of a coin-flipping test. After sorting the participants into “dishonest,” “honest,” and “ambiguous” categories, Greene showed that the dishonest participants had distinct activation in the control systems of their brains when they chose to lie. Honest people, however, showed no distinct activation pattern when they chose to be honest. These results suggest that honest people do not struggle with temptation when given an opportunity to lie.
Overall, there is no simple answer to account for differences in human morality. There is no single brain area that scientists can point to as the source of a person’s conscience or moral reasoning. Morality is complex, and it’s no surprise that the science behind it is, too.