If you spent Thanksgiving trying in vain to convince relatives that the Pope didn’t really endorse Donald Trump or that Hillary Clinton didn’t sell weapons to ISIS, fake news has already weaseled its way into your brain.
Those “stories” and other falsified news outperformed much of the real news on Facebook before the 2016 U.S. presidential election. And on Twitter, an analysis by University of Southern California computer scientists found that nearly 20 percent of election-related tweets came from bots, computer programs posing as real people and often spouting biased or fake news.
And if you think only people on the opposite side of the political fence from you will fall for lies, think again. We all do. Plenty of research shows that people are more likely to believe news if it confirms their preexisting political views, says cognitive scientist David Rapp of Northwestern University in Evanston, Ill. More surprising is what Rapp's latest studies, along with other research on learning and memory, reveal: when we read inaccurate information, we often remember it later as being true, even if we initially knew it was wrong. That misinformation can then bias us or affect our decisions.
Read the whole story: ScienceNews