Vaccine misinformation and social media

How do social media platforms affect public opinion and trust in vaccines?
14 June 2021

Interview with Yotam Ophir, University at Buffalo

A smartphone screen displaying social media applications

Although vaccine hesitancy has existed in various forms for centuries, modern technologies like smartphones and social media, which allow uncurated communication, have changed the game on a massive scale, enabling the mass proliferation of what is sometimes highly misleading information. Yotam Ophir of the University at Buffalo studies how these platforms can affect public opinion and trust in science and medicine, and he spoke with Eva Higginbotham...

Yotam - On average, both in the United Kingdom and in the US, trust in science actually remains relatively high compared to other institutions. In fact, in the United Kingdom, surveys indicated some increase in trust in science during COVID. However, and this is important, averages can be misleading. Part of the reason why we don't see changes in trust in science is a movement towards political polarisation in trust. So here in the United States, for example, we see an increase in trust among Democrats, but also a worrisome decrease in trust among Republicans. That's polarisation.

We might pay a heavy price for it in the future when we need to cope with other challenges such as global warming and so on. So even if, on average, trust in science remains relatively high, I believe our focus should be directed to sub-populations, political or otherwise, where misinformation and distrust are actually on the rise.

Eva - And on that note, what can we actually do to try and increase public trust?

Yotam - So my own research suggests that to increase trust in science, we need to educate the public, but by educating, we do not mean teaching people scientific facts, for example, that the world is warming up. Those approaches have consistently been found ineffective in recent years. Instead, what we found is that explaining to people how science works, what the nature and values of science are, can increase trust in science and allow people to better understand why they should trust science and why science is a reliable way of learning about the world.

Eva - I see, so it's not so much about learning this fact, it's about learning how that fact was discovered, how we decided as a community of scientists that this is the truth. You've done some research into the role of the media in particular in creating or damaging public trust in science. What have you found is helpful for the media to do in that scenario?

Yotam - So in my work with Kathleen Hall Jamieson, we found that one problem with how the media discuss science is that they tend to focus too much on individual achievements and failures. So most of the time, media coverage of science focuses on the hero scientist who made a breakthrough or the villain scientist who committed fraud. Now, we believe that a more accurate depiction of science should focus on the scientific community and its values, the consistently sceptical search for truth, and the ability of science, as a community project, to self-correct and identify mistakes when they are made.

A good example of this is the Johnson and Johnson vaccine that was put on hold in the United States after some reports of blood clots among women. This could be depicted by the media as a crisis, as a sign that science doesn't work. But we believe that it's a sign that science is actually healthy, that science is doing what it needs to be doing. Even after you approve something, you keep testing it, you keep being sceptical about it. The Johnson and Johnson vaccine was put on hold, was retested, was found to be safe, and then was redeployed.

Eva - And the thing underlying a lot of this is social media. Lots of people, I mean, everyone's on something, Twitter, Facebook, whatever it is. Why does misinformation about things like vaccines spread so well on social media?

Yotam - Right. So it's easy and tempting to blame us, to blame the people, for spreading misinformation. But in my view, the biggest problem with social media is not the people who use it, but the algorithms working behind the scenes. What we call the 'newsfeed' on Twitter or Facebook is actually programmed to keep users engaged for as long as possible in order to increase those private companies' profits. So basically, social media show us what we want to see. They show us what they believe is engaging content that's going to keep us hooked for as long as possible. That content is often misleading, because the truth is often much more boring than conspiracy theories. And so social media algorithms push misinformation to the top of our feeds to keep us engaged and to increase profits. Now, because of those algorithms, in part, those who distrust science manage to remain a very, very loud minority that can influence others online as well.

Eva - So it seems like a lot of social media is almost bound to perpetuate this negative stuff, this misinformation. Is there any way that we can harness the power of social media, though, to spread helpful stories and narratives and facts about science and about the vaccines?

Yotam - So health organisations and science communicators do their best to harness social media for the benefit of society. But in my opinion, they often do so without following the science of science communication. Again, as I said earlier, just providing facts doesn't work. Science communicators should get better at creating engaging content that matters, content that takes into consideration the values and characteristics of the audience and relies on engaging messages to make the point about, for example, the safety of vaccines. So social media do offer prominence for science communication, but it will require us to get better at working on these platforms.
