The term truthiness, coined by Comedy Central’s Stephen Colbert, means “truth that comes from the gut, not books.” It was chosen as the word that best sums up 2006 in an online survey conducted by the dictionary folks at Merriam-Webster.
Merriam-Webster’s president, John Morse, says that many people believe that truth is “up for grabs.” Unlike the doctrine of relativism, though, in which truth is not absolute but varies by culture or social circumstance, truthiness propounds that absolute truth is what we feel.
Colbert’s “truth from the gut” is inimical to good critical thinking. Surprisingly, however, it is very hard to get most of us to think beyond our gut feelings. When we have a visceral reaction to something, we assume that we have discovered a truth. However, as psychologist David Levy of Pepperdine University points out in his book Tools of Critical Thinking, feelings and truth are conceptually unrelated.
In a risk assessment exercise last year, one of my students refused to switch positions even though the class had demonstrated conclusively, through a series of trials, that the new position was twice as favorable as the position he held. The student’s explanation for refusing to switch was that he “was feeling it.” That thinking error is related to belief perseverance, a phenomenon in which we commit to an initial position and stubbornly hold onto our belief despite evidence that suggests the belief is incorrect.

Critical thinking is the opposite of truthiness. According to Levy, critical thinking is a systematic cognitive strategy to examine, evaluate, and understand events. It involves solving problems and making decisions on the basis of sound reasoning and valid evidence.
Levy identifies a variety of attitudes that characterize critical thinkers, including questioning assumptions, discerning hidden biases, avoiding overgeneralizations, developing tolerance for ambiguity, and exploring alternative perspectives.
Critical thinkers learn to digest and use data. They adapt and refine hypotheses in accord with that data rather than attempt to force the data to fit their pre-existing assumptions.
APS Fellow and Charter Member David Myers identifies instances in which intuitions go awry in his book Intuition: Its Powers and Perils. For example, although research has shown that interviewers form impressions of job applicants in seconds and are confident about their judgments, those interviewers commonly overestimate their ability to predict who will become good employees.
We can also misconstrue situations because of our mood. The Talmud says that “we don’t see things as they are, we see things as we are.” Good critical thinking demands that we put aside our feelings and judge situations objectively. We should explore alternative possibilities and remain open to discovering new evidence that might point in other directions. The difficulty is that our initial gut reaction may obscure our ability to pursue new leads.
For example, our initial hunch may cause us to look for evidence confirming what we already believe and ignore evidence to the contrary. This confirmation bias leads the uncritical thinker to find what he or she is looking for and only that.
Poor critical thinkers can be easily manipulated into believing in such things as the existence of weapons of mass destruction in a country and subsequently led into destructive actions because the belief feels right. We can develop into good critical thinkers by initially recognizing the characteristics of good thinking as well as truthiness roadblocks to critical thought.