Do social media platforms protect users?

There's a link between high social media consumption and poor mental health in teens. What's being done?
26 October 2021

Interview with Gareth Mitchell, Imperial College London

App icons on a smartphone screen.


Last week Facebook came under scrutiny after internal research on children's mental health was leaked to the press. What's the latest on social media?

Gareth - The social media companies are, I suppose, doing their bit, though a lot of people say they're not doing enough. Last year, for instance, Facebook, along with Google, Microsoft and a number of other tech companies, got together to launch what they call Project Protect, a plan to combat online child sexual abuse. Some of its measures are things you'd expect, technical solutions like improving their algorithms, but a lot of the fixes were more about the social side, and about ways the companies could work together: more transparency, more knowledge sharing, and more working with policy makers. So that's what's coming from the social media companies, but they always seem to be under the spotlight. Earlier this week, you'll know, there was a BBC investigation on Panorama, in this case looking at abuse against women, so a different but related issue for the technology companies. And I'm afraid that Facebook and Instagram, which Facebook of course owns, came out worst in terms of pushing violent and misogynistic content to users. Facebook says it's improving its algorithms and trying to sort this out, but on these really important issues of hate speech, and, where we started this conversation, protecting children and combating online child sexual abuse, the technology companies are taking action, yet many people are saying it's not enough and that there's a huge role now for regulation.

Harry - And I guess that's what it comes down to. There are no feet on the ground in this world, because it's across the internet; it's how good your algorithm is at finding these ads or this content and removing it. Is that right?

Gareth - Well, that's partly it. And, to be fair to these companies, they have enormous numbers of users, and these are difficult things to police. Facebook will say it has employed thousands of fact checkers, for instance, and YouTube has thousands of people screening videos for abusive or violent content, or content that breaks its terms and conditions. But the argument would be: well, we just think you should employ more people to do this. For the tech companies that hurts, of course, because it affects their profits. And of course people like us enjoy using these services for, in inverted commas, "free". But they're not free, because we're giving them our data and our attention. It's very complicated, but I think what this points to is the economics of social media: the perception that these services are free, our expectation that we can just get onto them without making a cash transaction, and perhaps realising a little less than we should that it is a transaction. It's just our data and our attention that we're handing over. And I'd be among those who would say that, within reason, that's actually more valuable than cash.

Harry - It's something that always feels quite blurred, and hard to remember when you're just popping onto an app on your phone to have a look down social media: you are giving something back.
