The power of algorithms

How does the internet know which things to show you?
27 August 2019

Interview with Beth Singler, University of Cambridge




Social media can push us in extreme directions, creating bubbles we can get stuck in. But how? Social media is run by algorithms: programs working in the background to decide what you see online. Adam Murphy spoke with Beth Singler at the University of Cambridge, who specialises in the social side of these algorithms, about what they could mean for us...

Beth - At its simplest, an algorithm is data going into a model, a computation, and then data coming out. And when you're talking about algorithms on the internet or social media, you're talking about people's data going into a system, and preferences reworked from that data coming back out. So you're seeing the same sorts of things again and again when you're expressing your preferences online.
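That data-in, computation, data-out loop can be sketched in a few lines. This is a hypothetical toy example, not any platform's real system: the catalogue, the topic labels, and the idea of ranking by simple topic counts are all invented for illustration.

```python
from collections import Counter

def recommend(watch_history, catalogue, k=3):
    # Data in: the watch history.
    # Computation: count how often each topic appears in that history.
    # Data out: unseen videos, ranked by how well they match those counts.
    preferences = Counter(video["topic"] for video in watch_history)
    unseen = [v for v in catalogue if v not in watch_history]
    return sorted(unseen, key=lambda v: -preferences[v["topic"]])[:k]

history = [{"title": "Dalek retrospective", "topic": "doctor-who"}]
catalogue = history + [
    {"title": "Regeneration scenes ranked", "topic": "doctor-who"},
    {"title": "Bread baking 101", "topic": "cooking"},
]
for video in recommend(history, catalogue):
    print(video["title"])
```

Watch one Doctor Who video and the Doctor Who suggestion floats to the top, exactly the pattern Singler describes below.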

Adam - And then how does that relate to the words a lot of people hear, like “machine learning” and then “AI” on top of that?

Beth - It gets complicated because people use the terms in lots of different ways. Machine learning and reinforcement learning are different systems, new models of thinking about how these levels of data input interact with each other. AI has become a catch-all for those systems, as well as for the more personified versions that we get in our science fiction.

Adam - The algorithm is just doing its job. Some programmer told it to "maximise the time people spend on the website", and that's what it does. It tries things at random first, slowly refining the best approaches, until you have something that works pretty well. Usually. So how do we relate to this algorithm, this bit of code? What can it do to us?
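One standard way to implement "try things at random, then refine" is an epsilon-greedy strategy, and a toy version fits in a few lines. Everything here is made up for illustration: the three candidate videos, their hidden average watch times, and the choice of epsilon are assumptions, not a description of any real platform.

```python
import random

def epsilon_greedy(estimates, epsilon=0.1):
    # Mostly exploit the best-known option; occasionally explore at random.
    if random.random() < epsilon:
        return random.randrange(len(estimates))
    return max(range(len(estimates)), key=lambda i: estimates[i])

# True average watch time (seconds) for three videos, hidden from the learner.
true_watch_time = [30.0, 120.0, 60.0]
estimates = [0.0, 0.0, 0.0]
counts = [0, 0, 0]

random.seed(0)
for _ in range(2000):
    choice = epsilon_greedy(estimates)
    # Each "showing" returns a noisy watch time around the true average.
    reward = random.gauss(true_watch_time[choice], 10.0)
    counts[choice] += 1
    # Incremental average: refine the estimate for the chosen video.
    estimates[choice] += (reward - estimates[choice]) / counts[choice]

print(max(range(3), key=lambda i: estimates[i]))
```

After enough rounds the learner settles on video 1, the one that keeps people watching longest; nothing in the loop knows or cares what the video is about.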

Beth - Well at the very simplest level, you can see it if you pop onto Facebook: the kinds of stories being promoted to you are based on your previous interest in other things. You may have seen stories online about YouTube and how various videos are promoted to you, again based on what you've watched before. In my case, you watch one video with a previous Doctor Who in it and suddenly you've got ten videos with that previous Doctor Who. Maybe you wanted to see something else in your suggestions; then again, it can be useful if you're looking for a particular genre of things.

Adam - If it just shows us what we want, can it push us in different directions?

Beth - Yeah, absolutely. So there's lots of work being done at the moment on identity formation and community formation online, and how these algorithms can push our interests in particular directions. There is a sort of meta level of algorithms going on: the interests of the corporation that is fuelling the content you're seeing. So the YouTube, the Twitter, the Facebook.

They have an aim in getting you to watch more and more of the content available online, and they use the algorithms to direct you based on what you've already shown interest in. It's very easy to get into what they call a "social media bubble" quite quickly, where you're only seeing things that reinforce your existing views, or which can take you down a rabbit hole of slightly more extreme versions of those views.
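The bubble dynamic is essentially a feedback loop: the platform shows you more of whatever your profile already leans toward, and watching it tilts the profile further. The sketch below is an invented illustration with made-up topics and weights, not a real profiling system.

```python
def feed(weights):
    # The platform shows the topic it thinks you like most.
    return max(weights, key=weights.get)

def reinforce(weights, shown, lr=0.2):
    # Watching what you're shown nudges the profile further that way,
    # then the weights are renormalised so they still sum to one.
    weights[shown] += lr
    total = sum(weights.values())
    for topic in weights:
        weights[topic] /= total

# A user with only a slight initial lean...
weights = {"politics": 0.34, "sport": 0.33, "music": 0.33}
for _ in range(20):
    shown = feed(weights)
    reinforce(weights, shown)

print(weights)
```

After twenty rounds the slight lean has snowballed: the "politics" weight ends up above 0.9 and the other topics have nearly vanished from the feed. A one-percentage-point head start is all it took.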

There's lots of work being done on the rise of fascism or populism through social media and the kind of messages that are being promoted to people, when they only show a slight interest in a topic.

Adam - It can be hard sometimes not to feel like the algorithm is thinking, even though it's just doing its job. Even here at The Naked Scientists we worry about how the algorithm will treat our stuff, which apparently isn't unusual.

Beth - This is another strand of my work that I'm really interested in, and I've started to look at some very specific instances of people's responses to the concept of the algorithm, where they personify the algorithm and think about it in terms of what the algorithm wants and how it's treating them. I've noticed people talking about being "blessed by the algorithm" and, personally, I'm quite interested in religious metaphors anyway. But I think this is a really interesting way into the discussion of how much agency, or even super-agency, we give algorithms. Do we decide that they're making actual choices for us? Are they giving us beneficial moments?

The whole concept of “blessed by the algorithm” would be I've put some content online and it's doing really well. I've been blessed by the algorithm. So it's about giving anthropomorphic agency to something that really doesn't make decisions in the same way that we do. It is a reinforcement system based on existing preferences.

Adam - Is there a better way to think about the algorithm?

Beth - I think we've slipped very easily. As I say, there's this slippage into personification of algorithms. But there's also a slippage into thinking of social media platforms as some sort of human right. Because it's public speech, we think it's something we can necessarily do in any way we want, without really remembering that these are private corporations. You sign up to terms and conditions when you get online with Twitter or Facebook and so forth. They own your content in very specific ways, and they have very specific aims for your content as well.

I'm not saying that they're evil by any means, but they have corporate goals and interests. There's some really interesting work by David Runciman, also here at Cambridge, where he talks about the fact that we talk a lot about artificial intelligence but don't always think as much about artificial agents. Those would be things like the corporations, or the states, or the nations, which have their own models they're working with, their own algorithmic systems of thinking. And most of the time, it being a capitalist system, it's about profits.

