“Precisely right. No doubt. Trust me.”

As a general rule, we tend to value confidence in other people, especially in the “experts” who help us with important decisions in life. Who wants a financial advisor who hesitates in his judgments, or a physician who waffles on every diagnosis and prescription? I want my lawyer to look me in the eye and speak with certainty about the law, and I look for consistency and self-assurance in politicians and leaders. Our decisions in these realms can have profound consequences, so we don’t want to take our cues from the wishy-washy.

Fortunately, these experts are all people, and people offer us cues. The rhythm of speech, nervous tics, posture—all of these and more can signal confidence or insecurity, and we're pretty good at reading these cues. But what if we cannot see the experts who help shape our decisions? More and more of our communication takes place online these days. We make our judgments and choices based on information that comes without smiles or shrugs or distant gazes. How do we identify self-assured experts in the digital age?

Psychological scientist Daniel Oppenheimer and his colleagues at UCLA believe that the way we use numbers could signal confidence in the absence of face-to-face contact. Specifically, the UCLA scientists suggest that when people use precise numbers rather than rounded ones (3,012 rather than 3,000), listeners take the precision as a sign of confidence, making both the information and its source seem more trustworthy. They tested this idea in a couple of experiments.

In one, volunteers were asked to judge the answers that others had (ostensibly) given to questions about geography. Sometimes the answers were precise to the hundredths place (20.85 miles), and sometimes they were rounded to the closest whole number (21 miles). Volunteers estimated the confidence level of the people who had provided these answers, from complete confidence to complete lack of confidence. The researchers found that the geography "experts" were judged to be more confident if their answers had more digits. In other words, precision was seen as an indicator of self-assurance.

The second study examined the implications of this finding: How do we weigh advice from others, and which experts do we trust, given this numerical cue? Volunteers played a game similar to TV's "The Price Is Right," in which they used suggestions from the audience to help them. Some of the audience suggestions were round numbers ($60), others precise ($63). The volunteers' choices indicated whom they trusted to guide their decisions. And as reported in a forthcoming article in the journal Psychological Science, they clearly preferred the more precise advisors.

So it appears that we infer others' confidence from their numerical precision, prefer their advice, and incorporate their seemingly precise expertise into our own judgments. This may have real-world implications, determining in part which politician's budget analysis to support, which financial analyst's profit forecast to heed, which doctor's view of drug risks to trust.

And of course, this all has to do with perceptions of confidence—not true expertise. Savvy charlatans may intuit this cognitive bias, and use false precision to create an air of confidence—and the illusion of expertise. As Oppenheimer notes, sports pundits are notorious for their excessive precision in reporting information that ranges from meaningless to misleading.

Follow Wray Herbert's reporting on psychological science on Twitter at @wrayherbert.
