Policing social media: is it possible?
The British government will soon grant its communications watchdog Ofcom sweeping new powers to police social media. The announcement is linked to the death of teenager Molly Russell, who took her own life in 2017, and whose Instagram feed was later found to contain graphic suicide-related content. Ofcom will now target violence, cyber-bullying, and child abuse, but how effective will the new rules actually be? Our tech correspondent Peter Cowley joined Chris Smith and Phil Sansom...
Peter - Before we do that, can we just have a bit of background. So what we have here is a spectrum between clamping down completely, therefore removing the possibility of the tragic case of Molly Russell, and full freedom of speech. Now we know it's not a good idea to clamp down completely, but we know at the other end there are positive benefits to being able to communicate. This has been a problem for some time now, many years, and the government published a white paper called Online Harms last year. They're putting together some rules, and at the moment they're saying things like it's got to be proportionate and risk-based, and it's got to rely on the platforms to self-police, etc.
Chris - How enforceable is this going to be though? Because the issue is that if I'm a tech company and I am hanging out in the back of beyond somewhere, and I've got servers which are not located in the UK, I don't really care what Ofcom say!
Peter - Yeah, well the German system has fines of up to 50 million euros if you get it wrong. I don't think that's been invoked yet, but there's various things they can do. They can switch off the domain at the ISP...
Chris - But can they, Peter? Because the thing is, some of these domains aren't registered in the UK, are they? Some of them are actually hosted on servers in America, so you'd have to have some kind of bilateral relationship with the Americans and say, "I want to turn off Facebook." And then America might have antibodies about that and say, "no, you're not doing that."
Peter - Yeah, that's not quite true because China manages it, doesn't it? With 10,000 plus domains.
Chris - But China has this sort of giant firewall infrastructure, and Russia are talking about having one as well, aren't they? Where basically there's a ring fence on the internet around a territory or geography, and they control and potentially, probably, inspect everything that's going backwards and forwards across that firewall, so they could just turn off a domain. I mean, are we talking about therefore, the internet was this amazing free-for-all for many years... are we now talking about carving it up?
Peter - Exactly. I don't know. I think the most likely thing that's going to have some force is to have a named individual who is responsible within an organization. Who wants to put their hand up within, say, Facebook UK and say, you know, "me"... Because this is an imprisonable offense, effectively. That will then trigger some change internally. Because in the end, switching off the domain, as you say, is too extreme.
Phil - Well, let's say Ofcom can enforce these rules. From the company's end, say Instagram or Facebook, to what extent are they already set up such that, when you look at a certain type of content, the algorithms then give you stuff that's very related to the stuff you're looking at? They say, "oh, you like this? We're going to give you more of that." So if they're trying to regulate certain types of content, to what extent are they fighting against their own internal...?
Peter - Exactly. So I'll give you an example in a minute. But what you've got to look at is the problem that we don't know whether they're content providers or just platforming content from other people. They clearly are editing content coming in. I mean, I don't know if you noticed, but today Mark Zuckerberg in Switzerland actually said something: he said they have 35,000 people working on this problem. It's costing $5 billion a year. They're switching off a million accounts a day at the moment. So they're obviously trying to do something, but clearly, how can they do enough?
Chris - Presumably that's one of the ways in which this problem can be solved, because algorithms can be written to spot the very things that enable these sites to bring people together who have common interests and beliefs. That whole technology could be turned around to find the very things that we decide are not in the interest of an individual or society.
Peter - I'm convinced it's already being done. There's no way that a million accounts can be switched off by 35,000 people. They can't read it all. And even if they do read it, how would you interpret it as a human being?
Chris - The problem is, and I've run into this, where Google has decided that certain pages on the Naked Scientists contain graphic content. And actually when you look at it, because we're a medical site, we've got pictures of bits of the body. Now they're not necessarily naughty bits of the body. There was one page the other day that Google had condemned, and it was a wound being sutured. It was very informative, but they had decided the image there was not in the interest of people to see. And in fact it was fundamental to explaining how the technique works.
Peter - But that's freedom of speech again, isn't it? Where on the spectrum...
Chris - But that's what I'm saying. I think there is a danger we're going to take the wrong things down.
Peter - I'm sure the three of us in this studio all have our own views about this, and all your listeners will have their own views. And who makes that decision? How can we expect the government, in whatever form, to get it right? We can't. But they must do something to prevent the sort of things that are causing harm and suicides.