Science vs fake news – the fight is on
Call it fake news, false content or propaganda: whatever the name, we are in a new era of misinformation, and it is influencing elections and fuelling extremism around the world. Now researchers are tackling the challenge head on by helping journalists and social media platforms verify video content and understand how 21st-century propaganda works.
We live in the video age where people are more likely to believe what they see – and that’s adding fuel to the fake news fire.
‘Video in many cases reinforces what you read. This is why all the news outlets want to use video content to reinforce what they present,’ said Dr Vasileios Mezaris, a senior researcher in multimedia at the Centre for Research and Technology Hellas in Greece.
Verifying this content is essential for media outlets to prevent the spread of fake news and protect their integrity, but that’s not as easy as it seems.
‘Most online video is not professional footage; it is shot by individuals with mobile phones,’ said Dr Mezaris. ‘This can create many forms of deceit.’
It's a problem faced daily by many journalists.
‘One of the major challenges is definitely the verification of the content, because we have a lot of manipulated content going viral,’ said Julia Bayer, a journalist at the Germany-based news service Deutsche Welle. ‘People are tweeting content that’s not true, and people are sharing it because it is really easy to click on retweet or share.’
For example, someone may shoot video footage in one location but claim that it depicts another. Other forms of manipulation include using digital effects to change the context of a video, editing a clip to create a misleading snapshot of an event, or simply attaching false claims to genuine footage to support a particular agenda.
‘We know that fake news is a big issue these days, in election campaigns also. If a video surfaces that reinforces rumours it could influence elections,’ said Dr Mezaris, who is the project coordinator of InVID, an EU-funded project that is building a platform for journalists to authenticate newsworthy video content.
InVID’s analysis tool identifies different digital characteristics of video content using traceable data from YouTube and Twitter such as location, weather, and time, as well as scanning comments to determine if people have flagged a video as being fake.
It creates a social media timeline of the video and gives an overall estimate of whether it is fake. This provides journalists with the confidence to determine if social media footage is accurate, or has been tampered with.
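The idea of combining many weak signals into one overall estimate can be illustrated with a toy sketch. This is a hypothetical illustration only, not InVID's actual model: the signal names, weights and scoring logic below are all invented for the example.

```python
# Hypothetical sketch of signal aggregation for video verification.
# Every field name and weight here is an illustrative assumption,
# not part of the InVID platform's real implementation.
from dataclasses import dataclass

@dataclass
class VideoSignals:
    location_matches_claim: bool  # claimed location vs. geotag or visible landmarks
    weather_matches_claim: bool   # weather records for the claimed date vs. what is visible
    upload_predates_event: bool   # footage appeared online before the event it supposedly shows
    flagged_comments: int         # comments calling the video fake
    total_comments: int

def fake_likelihood(s: VideoSignals) -> float:
    """Return a 0..1 score; higher means more likely fake."""
    score = 0.0
    if not s.location_matches_claim:
        score += 0.35
    if not s.weather_matches_claim:
        score += 0.15
    if s.upload_predates_event:
        # Old footage recycled for a new event is a classic form of deceit.
        score += 0.35
    if s.total_comments:
        # Weight community flags by their share of all comments.
        score += 0.15 * (s.flagged_comments / s.total_comments)
    return min(score, 1.0)

signals = VideoSignals(False, True, True, 40, 200)
print(fake_likelihood(signals))  # a mismatched location plus recycled footage scores high
```

A real system would of course learn such weights from labelled examples rather than hard-code them; the point is only that no single signal decides the outcome.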
At the moment it is being pilot-tested by three leading news outlets that are members of the InVID consortium: Agence France-Presse (AFP), Germany’s Deutsche Welle (DW) and the Austria Press Agency (APA).
‘They use our tools in different test scenarios to detect if a video is fake. They give very valuable feedback on what works for them and what doesn’t so we know what needs to be changed,’ said Dr Mezaris.
Video verification could also be used to validate content on social media where fake news is often propagated, although social network companies have more on their plates than news agencies due to the size of their audience and the amount of content posted online.
They also face another problem – the social media echo chamber. This is where our natural biases, be they liberal, conservative or somewhere in between, combine with social media bubbles: groups of like-minded members congregate, keep to themselves, inflate one another’s opinions and rarely interact with opposing views.
Dr Majid KhosraviNik, lecturer in media and discourse studies at Newcastle University, UK, warns that these echo chambers could magnify fake news by reinforcing existing views rather than giving access to information that might challenge them.
‘We are prompted to ‘like’ views we agree with, as ‘liking’ is a form of endorsement and helps certain views become more visible,’ he said.
‘This is of course done to maximise profit through targeted advertising and promotions. This overwhelming individualised marketing approach dismantles society as a collective and, when applied to politics, replaces necessary truths with satisfaction (by reinforcing your opinion).’
Dr KhosraviNik works on the EU-funded project MWDIR and over the next two years will investigate how the ideology of Islamic radicalism is legitimised through social media content. He will focus on the terrorist organisation ISIS to determine why its ideas are so persuasive for certain demographics in Western countries and how this helps it recruit from the EU and worldwide.
‘ISIS has been characterised as being very media savvy and specifically very up to date in using new interactive technologies and going around the obstacles of not having access to mainstream mass media,’ said Dr KhosraviNik.
He says that ISIS strategically repeats certain assumptions to a point of saturation where it finally becomes a fact amongst its audience, a propaganda technique similar to those used by the Nazis or in Soviet Russia.
What’s more, he says that in order to increase the effectiveness of its misinformation campaign, ISIS uses an extreme version of what’s known as a ‘self and other’ portrayal of reality, which paints the entire West as radical and ultra-right wing, like the English Defence League or other Islamophobic groups.
‘ISIS adamantly endorses, encourages, and invokes performance of extremism at both ends of this spectrum,’ said Dr KhosraviNik. ‘Right wing populism and ultra-right parties in the West and ISIS both encourage extreme measures which help legitimise each other.’
And the problem isn’t just confined to extremist groups. He says that recent electoral outcomes in the West can also be attributed to what he calls ‘affective politics’ where rationality takes a back seat in favour of populism, often aided by social media.
‘Layers of electorates in the UK and US revolted against the status quo … despite being rational. Social media is both the product and enforcer of such preference.’
By Steve Gillman