While the past several decades can rightly be called the “information age,” the rise of social media platforms makes it seem like the past few years could be considered the misinformation age. The viral nature of alternative facts, rumors, and planned disinformation campaigns has taken its toll on global politics, the economy, and healthcare. If only we could be told when we were reading “misinformation”; that might just solve our problems! Or perhaps not. Does knowing that a news story is false really inoculate us against its misinformation?
Taking a new look at misinformation is “Max” Bai, a postdoctoral scholar in the Polarization and Social Change Lab, part of the Stanford Impact Labs, who presented new research at the 2022 APS Annual Convention earlier this year.
Charles Blue (00:13)
While the past several decades can rightly be called the information age, the rise of social media platforms makes it seem like the past few years could be considered the misinformation age. The viral nature of alternative facts, rumors, and planned disinformation campaigns has taken its toll on global politics, the economy, and healthcare. If only we could be told when we were reading misinformation; that might solve all our problems. Or perhaps not. Does knowing that a news story is false really inoculate us against misinformation? This is Charles Blue with the Association for Psychological Science. And to explain one piece of this puzzle is Max Bai, a postdoctoral scholar in the Polarization and Social Change Lab, part of the Stanford Impact Labs, who presented new research at the 2022 APS Annual Convention earlier this year. Welcome to Under the Cortex.
Max Bai (01:15)
Thank you very much for the opportunity. Glad to be here.
Charles Blue (01:18)
Can you explain to us in very simple terms what you found about misinformation and why you’re researching it?
Max Bai (01:26)
Yeah. So in current psychological intervention research, a lot of the studies on how we can combat and address the misinformation issue and people’s exposure to it focus on helping people recognize misinformation as misinformation. So telling people, or helping people, to recognize fake news as fake. At some point, I just started to wonder: does this really help? And to what degree can it play a role in our overall strategy for combating misinformation? In particular, how is it related to other alternative approaches, such as content moderation, or using algorithms to rank misinformation lower in people’s environment? In the beginning, as a social psychologist, I tended to use experimental methods and look at how people respond to information like this. In one set of experiments, I asked: by just blatantly telling people and assuring people that what I’m showing them is misinformation, fake news, and things like that, are people still going to be affected by it? Are people going to believe the content related to it? And are people’s attitudes or behavioral intentions going to be shifted around based on what I’m saying in the misinformation, again, the misinformation that people know is not real?
Max Bai (02:45)
And in the end, it sounds like people still are affected: their beliefs are still shifted around, and their political attitudes are still changed. So to me that suggests that just telling people and helping people recognize misinformation as misinformation is not going to be the entire strategy. Sometimes we really need to gravitate toward strategies that focus on the macro-level things, the things about reducing the presence of misinformation in our environment. In the beginning, it seems counterintuitive.
Charles Blue (03:19)
If I were told that I’m just going to be reading essentially fiction, or something that has less-than-factual information, I would disregard it. I might read it for amusement’s sake. But what you’re saying is that even with that information, the people in your study, when asked later, expressed some agreement or some recognition that what they heard was true. Is that what you found?
Max Bai (03:46)
Yeah. So I think a lot of it really depends on what people are going to do with the misinformation once they read about its false nature. If people just stop reading and don’t even look at the content or the headline, they’re not going to be exposed to anything, and therefore probably no change is ever going to happen in their beliefs and attitudes. But a lot of times, when we see something in our social media feed, regardless of whether it’s false, our attention just automatically drifts toward it. And sometimes it can still potentially change our beliefs and attitudes.
Charles Blue (04:24)
Could you explain a little bit about how you actually constructed this research?
Max Bai (04:29)
Yeah, so I did a couple of studies. There are five experiments in total. I had participants read a sentence, something along the lines of: okay, what you’re going to read is entirely false and made up. And to make sure that participants understood that it was not real, they also had to type out their understanding. They couldn’t just copy and paste; they really had to type it out so they understood it. Then they read one of three versions of a false news headline. Again, all of them were entirely made up. One version talked about how eating black pepper can help people prevent getting COVID-19. One talked about how eating black pepper can lead people to get COVID-19. And another was something just not relevant to black pepper; that’s the designated control condition. Then I asked participants how likely they would be to increase or decrease their consumption of black pepper in the future. What the results show is that people still indicated they wanted to eat more black pepper if they were in the condition that said black pepper helps people prevent COVID-19, and people indicated they wanted to eat less black pepper if they were in the condition where the article said that eating black pepper can lead to getting COVID-19, and the participants for sure understood it was not real.
Charles Blue (05:46)
So you found that even knowing it was false, people changed their behavior.
Max Bai (05:50)
Well, in that particular study, it’s what we call behavioral intention. So it is just a question of people indicating how much they wanted to eat more or less in the future. It was not really measuring the exact behavior that people were going to engage in.
Charles Blue (06:05)
So you had a fake news article. Was it a full article or just a headline?
Max Bai (06:11)
Oh, yeah. So in that particular experiment, what I had was just a short, Twitter-like post. So it was a headline with just two or three sentences, or maybe a couple more, with a picture, and that turned out to be by itself sufficient to move people’s attitudes.
Charles Blue (06:28)
Well, that seems to make sense, because if you had them read a full article, there would have to be some explanation of why this is, or how it works, or studies that have been done. But in this case it’s just a very simple statement: black pepper either leads to COVID or helps prevent COVID. Did you have the chance to go back and check this later down the line? So this was just an initial report: after reading this, they responded whether they’d eat more or less. Do we have any information about any more ingrained impact of misinformation?
Max Bai (07:01)
Right. For that particular study, I didn’t do any of the longitudinal components, so I couldn’t really tell whether it actually shaped behavior down the road or how long it lasted. But in a different set of experiments, I actually did look at things longitudinally. So in that experiment, I did the same thing, telling people they were going to read something that’s made up, that’s false, and they read one of two versions of an article. One article claimed that Republicans are more supportive of a smoking ban in public places, and the other version talked about how Democrats are more supportive of doing this than Republicans. And I looked at participants’ own beliefs about how much Republicans versus Democrats support this policy, and that belief changed after the exposure, not just immediately after the exposure, but in a way that was observable two days later and again nine days later. And what I think is even more interesting is that I also asked participants how likely they themselves would be to support banning smoking in public places. And what I discovered is that people’s attitudes toward it diverged based on their own political partisanship.
Max Bai (08:15)
So Republicans became more supportive of it if they read the article that talked about how other Republicans are more supportive of it, whereas the Democrats were on the reverse trajectory: they became more supportive of it if they were in the other condition, which talked about how Democrats are more supportive of this. And this effect was also temporally enduring. It was observable two days later and again nine days later. So that gives us a little more evidence that this is not something that just changes your attitude in the moment, but something that could even lead to a temporally enduring attitude change.
Charles Blue (08:52)
To make sure I understand: this question about whether or not you support banning smoking in indoor spaces or public spaces, was this also something where people were aware that it was fake news, or was this just a general statement they had to process as it related to their politics?
Max Bai (09:13)
Right. So this was something that participants were also thoroughly instructed about. It was something that’s entirely made up. There’s nothing really in there that’s real.

Charles Blue

That’s remarkable: even though people knew that they were reading something that was a fake comment about a political party, depending on their own party, it actually changed, at least for a week or so, what they felt about a particular social issue. Is there anything in this research that you’ve done that actually surprised you? You kind of know people are susceptible, but is there something that you uncovered that was not anticipated?
Max Bai (09:51)
I think that result itself is quite interesting to me, and quite surprising, because in most psychological research you usually don’t see effects that last for a couple of days. If you can get a manipulation to work in the same session, that’s great enough, and this is something that can last for a couple of days. What also surprised me is that, to me, it has a very frightening implication for democracy itself. It essentially suggests that there is a very easy way to create political division where there was none in the beginning. And it’s not something that’s just created in one moment; it is something that can take a while to fade away, potentially. And if this is something that political actors with bad intentions try to do to the electorate, they can very easily create a real division among citizens that just self-perpetuates over time. In that study, I also deliberately chose not to use any real issue that is already circulating in the environment. Some people might say, okay, people probably didn’t have any attitudes or beliefs on this in the beginning, but I think that’s precisely what is so scary about the implications. Because a lot of times, when we’re looking at a novel social issue that is not yet politicized, like COVID-19 in the beginning, just a very minor division among the political elites, plus some repeated misinformation in people’s environment, can by itself perpetuate over time and exacerbate and widen the division between political partisans.
Charles Blue (11:33)
It seems like you could even pick a topic that is completely benign, one people may have absolutely no prior feelings toward and wouldn’t even consider a political issue. But with this approach you could turn it into a wedge issue, pitting two sides against each other, even if they knew that the information was made up. It seems very easy, then, to manipulate people when politics is involved. Would you expect a similar enduring effect if it weren’t political, say, for example, with your black pepper example? Is there something different in the way people process or emotionally connect to an issue that makes it uniquely powerful when it’s a political claim?
Max Bai (12:18)
Right. I think politics in this context is pretty special, because people in America in particular are very motivated by their political identity. It is a combination of group identity motivation and other psychological motivations, including threat perception, epistemic needs, and all of that.
Charles Blue (12:39)
You have a couple of very interesting results: one about people immediately saying whether or not they’ll choose to slightly modify their diet depending on how it impacts COVID, the other about creating a wedge issue where none existed before. Where does this research lead? What’s the next step, so we have a little better understanding of how fake news, how misinformation, can so easily elicit a response even when we know that it’s not true?
Max Bai (13:12)
Yeah. I think generally, in the whole field of social science, more needs to be done about how to reduce people’s exposure to misinformation in the first place, how to reduce its presence in people’s environment. That would be a lot more important, in many ways, than helping people recognize it once it’s already in their environment. Of course, helping people recognize misinformation as fake is better than nothing. But within psychology, I think something can also be done about helping people reduce their sharing of misinformation. A friend of mine was actually recently doing some work on this. I think he found that people engage in this thing called mindless sharing. So they look at their social media feed, they see something, and without knowing or verifying the authenticity of the content, they just go ahead and click share. And that’s one of the most dangerous types of online behavior people can engage in, I think. In the future, psychological researchers could probably engage in more intervention research that reduces that kind of thing, like sharing content that is false or misleading.
Charles Blue (14:19)
So the idea is essentially to nip it in the bud before someone has a chance to share it, because once it’s essentially out of the box, it’s much harder to counteract, even if you are able to flag things as misinformation.
Charles Blue (14:39)
Well, I guess we do have our work cut out for us as we move into the future, particularly with the upcoming elections in the United States. I’m certain we will have our fair share of less-than-accurate details. But if the idea, then, is to identify it, help people identify it, and in the long run reduce exposure to it, that sounds like as good a start as any. This has been Charles Blue with Under the Cortex, and I have been speaking with Max Bai, who is with the Polarization and Social Change Lab at Stanford Impact Labs and who was one of the featured Flash Talk presenters at the 2022 APS Annual Convention earlier this year. Thanks very much for joining me.
Max Bai (15:20)
Thank you very much.