Big Data, Big Problems?

Do algorithms run the world? We explore the limits of automation, and whether we're losing our free will...
22 September 2020
Presented by Phil Sansom, Chris Smith
Production by Phil Sansom.

Do algorithms run the world? Nowadays we measure the amount of data we generate in zettabytes - that’s 1 followed by 21 zeroes. This data, in turn, powers algorithms that are getting more and more sophisticated at predicting our behaviour, and are making ever more decisions for us. What does this mean for our society, privacy, and even our inner selves? Plus, in the news, the science - or lack of it - behind the latest COVID “rule of six” guidelines; the Arctic ice shelf that’s lost a Manchester-sized chunk of itself; and the whales that ended up tens of kilometres up an inland river…

In this episode

01:05 - Coronavirus: any science behind rule of six?

New coronavirus restrictions in England include a six-person limit on gatherings. What's the reasoning?

Coronavirus: any science behind rule of six?
Gabriel Scally, University of Bristol

This week the number of COVID cases reported worldwide passed 30 million, with recorded deaths approaching a million and counting. We’ve seen surges in the countries worst hit - that’s India, Brazil, and the USA - but also in Argentina, France, Spain, and many others. Israel was the first country to enter a second national lockdown, ahead of the Jewish “high holy days”. Here in the UK there has been a sharp spike in cases, although without a corresponding increase in deaths, likely because this spike is mostly - at the moment - among younger people. In response to this, on Monday the government introduced the “rule of six” limit on gatherings, and Boris Johnson announced an ambitious plan for wide-scale mass testing called “Operation Moonshot” - but this has proved embarrassing, as testing facilities across the country were overwhelmed. What is the scientific rationale behind these interventions, will they even work, and what are the alternatives? Chris Smith asked the University of Bristol’s Gabriel Scally: first of all, why pick six?

Gabriel - Well, I think it probably comes from that scientific method of wetting a finger and raising it to the wind. There doesn't seem to me to be any science about it. If one was to try and come up with something that had some scientific background, you would look at population density, you would look at household size in particular; and we do know that the places where the virus is at its most prevalent have poor housing and overcrowding, and these tend to be amongst the most deprived areas in the country. So I don't think there is science behind it, but we should be paying attention to household construction, I think that's extremely important: I mean the age and the generations involved in living together in a house.

Chris - They're going for the domestic setting then, because the figures that have been published suggest that's where the bulk of the transmissions occur. Is the rationale that if you limit the number of people getting together in households, you do cut off the virus transmission at its knees, at least a bit?

Gabriel - I don't know if you can cut something off at its knees a bit! But yes, that would be the idea behind it. Now whether it'll work or not is another question, because the virus pops up in various different places, and we've already seen a significant number of workplace outbreaks, often to do with the food industry and often to do with migrant workers. And often it is difficult to discern whether the transmission is in the workplace or in the living conditions, in which migrant workers are often grouped together. So the answer I think is not in the area of trying to make some very precise social restrictions. We need a broader, deeper approach to the virus; and that's just not what we're getting.

Chris - You're in reasonable company, because a number of other academics have come out and said that people just won't tolerate this, and this will actually breed some degree of disrespect for the rules rather than respect for the rules. And that will lead to more flouting of the rules and actually more transmissions, especially with Christmas approaching.

Gabriel - I think there's a lot in the behavioural science behind that. And certainly people's confidence in what they're asked to do is really, really important, and community solidarity is extraordinarily important; and if you lose the confidence of the public, well, then you lose the battle against the virus. One of the anachronisms is talking about policing with marshals and fines and so on, and I don't like the idea that we have to spy on each other and report each other to the police. This is not how you mobilise an entire community to deal with this infectious disease.

Chris - The most recent announcements dwell very heavily on testing - this whole concept of a 'moonshot': talking about, by October, scaling up testing four- or five-fold over what we're seeing at the moment, and then possibly to million-scale testing by next year. Do you think they've lost confidence in the question of a vaccine, and so now they're going down the test route?

Gabriel - Well, it's a very interesting question because on several occasions there have been interventions from the top, which have sought to cast a really, really optimistic picture before us: herd immunity, the promise of an app, huge enthusiasm from the very top about antibody testing, and that was going to be the next great thing. And now we've got the moonshot. It's no substitute for doing the hard public health graft to get this virus under control and a strategy to back that up.

Chris - So what would that hard public health graft be? What would you do differently from the present strategy, or lack thereof, that we see playing out?

Gabriel - I would construct a system which had all the elements to it: find, test, trace, isolate, and support. There are also reports of people not coming forward for testing because they know they'll be told to isolate, and they won't be able to go to work and earn the money they need because they're on a zero-hours contract. So therefore I would take the very large sum of money that is being wasted on an ineffective, non-functioning NHS Test and Trace system and plough it into local areas: building up public health teams, making use of people like health visitors who know their communities, environmental health officers who do contact tracing for infectious disease all the time, and community leaders, particularly from ethnic minority communities, and local councillors as well, to really turn it into a community effort.

06:50 - Trinity Challenge launches to fund pandemic protection

A former Chief Medical Officer for England has kicked off a multi-million pound pandemic-proofing fund...

Trinity Challenge launches to fund pandemic protection
Sally Davies, Former Chief Medical Officer for England

Even as we’re struggling with this pandemic, many scientists are watching warily for the next pandemic - whatever it might be. For this reason, former Chief Medical Officer for England Dame Sally Davies has just launched the Trinity Challenge: a multi-million pound fund to support ideas that might protect us against the next worldwide disease. The money comes from a range of sources including the University of Cambridge, the Gates Foundation, Google, and Facebook. Sally joined Phil Sansom and Chris Smith...

Sally - Well while everyone else faces the here and now, which is terribly important, we've been thinking about how to prevent the next one. Because every five years in this century we've had an outbreak which has become a pandemic, including Ebola, SARS, MERS, Zika, and of course the 2009-10 flu, where we were lucky - it was mild. What are we missing? We need better ways to identify what is out there that might come out and impact on humans. And once we've got it, surely we should be able, if we prepared and thought about it effectively, to respond better. And then finally, the third area is recovery: how to come back together again, not only as a society, but without damaging our economies too much. So we're trying to bring together people to look at these three different areas. And we've got 22 wonderful founder members who are contributing not only funding, but more importantly, data and people. I think you could call it colliding data science with public health - whether it's genomic data, behavioural data, economic mobility, health - in a different way, so that we're better prepared.

Chris - Why has this taken so long to implement though Sally? Because as you highlight in this century, we've seen so many examples of this happening again and again and again, and with increasing frequency; why has it taken this one to teach us a lesson? Why have we not done this before?

Sally - We couldn't have done what we're trying to do five years ago, actually, because it relies on having the data there, with engineers and data architects who know how it works, how it looks. This couldn't have been done five years ago, but it can now; and we have to do it now or we will be remiss.

Chris - How much money have you got? What's your firepower looking like on this?

Sally - We're aiming for a hundred million because we want to do three challenge rounds, one a year for the next three years. This is about making a difference; so people need money for recognition and to help them move it into the next stage, but it's not about getting rich.

Phil - Sally, can you just make it very clear for me: what's the kind of thing that I could come to you and you'd say, “yes, here's the big bucks”?

Sally - Okay, so is there a way that we can pick up a spike of a new virus in the SARS family by looking at sewage, but make it easy? Is there a way that if we looked at all the viruses in bats that pharma companies could make pre-pandemic vaccines that they could tweak so they were ready if that one came out of the box? Or that they could look with AI at all the possible libraries of drugs and see which ones are on the shelf that might work? 

Phil - Is there a risk, Sally, that these partly private enterprises you're funding are maybe not as appropriate as a public one to tackle what is public health?

Sally - I don't think so. I think that public health does not have the data they need, nor do they have the great skills of AI and machine learning. We have to bring in different academics, different data sources, and it's no secret that these platform companies have taken some of the best academics too. We need the best brains wherever they are in the world. Geeks and nerds need to come together and solve these problems.

Phil - That's on the supply end, but once you've got your product, whatever it is out the other end... there's been discussion about this in the context of a vaccine as well. If a vaccine does eventually get created, who's going to get it? Some people think it should be free. When it comes to your sort of projects, who's going to get them?

Sally - It is not our job to put them into practice, but they won't win if they're terrifically expensive solutions that could only be used by the rich nations. We are interested in the global good and global products. We have the support of the World Health Organization as a delivery organisation, and they may well help us with judging. So we're looking for insights and tools that are affordable, that can be used across the world, and then we're going to hand them off to multilaterals, to delivery partners: people who make things happen, but don't do the research and this innovative thinking. We're filling a hole.

12:32 - Phosphine on Venus: a sign of life?

Scientists have detected a gas in Venus' atmosphere that they can't explain...

Phosphine on Venus: a sign of life?
Ben McAllister, University of Western Australia

Venus has a lot in common with the Earth in terms of its size, proximity to the Sun, and possibly its past history. As a result, scientists have wondered for a long time if life could have got started there. But those theories were dealt what seemed like a knockout blow a few decades back when space probes revealed conditions sufficiently extreme to melt metal on the planet’s roasting hot surface. But could life nevertheless eke out an existence in the planet’s upper atmosphere, where the conditions are milder? This week that theory became a lot more plausible when scientists announced they’d picked up the signature of the chemical phosphine, regarded as a sign of life, coming from clouds above Venus. Ben McAllister…

Ben - As far as planets in our solar system where you wouldn’t want to live are concerned, Venus, our nearest neighbour, might just take the cake.

It’s famously inhospitable, with surface temperatures hot enough to melt lead, thick clouds of sulphuric acid, corrosive rain, and crushing surface pressures. For life as we know it, it’s hell.

And yet, despite all of that, earlier this week an international group of astronomers reported something truly remarkable. In Venus’ upper atmosphere, they detected the presence of a chemical compound known as phosphine - a phosphorus atom joined to three hydrogen atoms, which is widely regarded by scientists as a signature of life.

If the hypothesis is correct, it’s difficult to overstate how big of a deal this is. It would be the first ever detection of extraterrestrial life; proof that we truly aren’t alone in the Universe.

Now, don’t get too carried away - nobody is theorising little green men just yet. If we are looking at life here, it is very likely some kind of extreme microbial life, tiny organisms capable of surviving in the harsh, acidic environment of Venus’ upper cloud decks.

So how did the team discover this?

They used a technique originally developed by Robert Bunsen - of “burner” fame - back in the 1800s. His breakthrough was to realise that different chemicals absorb and emit specific colours - or wavelengths - of light, and that a particular combination of colours is unique to a given chemical compound. So by measuring the colours of light that the atmosphere of Venus, or another planet, absorbs and emits, we can identify the cocktail of chemicals that are present without having to actually go there.
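
To make that fingerprint-matching idea concrete, here is a minimal illustrative sketch in Python. It is not the team's actual analysis pipeline: the tolerance is arbitrary and the line list is a stand-in, with the phosphine entry set near the ~266.94 GHz transition reported in the study.

```python
# Illustrative only: identify a molecule by matching an observed absorption
# line against a small table of known transition frequencies (values approximate).
KNOWN_LINES_GHZ = {
    "phosphine (PH3)": 266.944,   # the rotational line reported at Venus
    "water (H2O)":     183.310,   # a well-known atmospheric water line
}

def identify(observed_ghz, tolerance_ghz=0.05):
    """Return molecules whose known line lies within tolerance of the observed dip."""
    return [name for name, freq in KNOWN_LINES_GHZ.items()
            if abs(freq - observed_ghz) <= tolerance_ghz]

# A dip at ~266.94 GHz in the Venus spectrum matches the phosphine entry:
print(identify(266.94))   # -> ['phosphine (PH3)']
```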

The team turned the Hawaii-based James Clerk Maxwell Telescope and the South American ALMA telescope to stare at Venus and record the light coming from the planet’s atmosphere.

What jumped out from the data was a signal consistent with phosphine at a concentration of about 20 parts per billion, indicating possible life in the upper atmosphere.

But phosphine isn’t just produced by life. There are other natural chemical processes that can produce it. And it’s also been found elsewhere in the solar system before - for example, around gas giants like Jupiter.

However, on a rocky planet like Venus, with the kinds of conditions we know exist there, we aren’t aware of any mechanisms for making phosphine other than as a by-product of living organisms.

So, it is possible there is some new chemistry, or some other as-yet-undiscovered mechanism which is responsible for the presence of phosphine in the atmosphere of Venus, and it definitely requires further study to confirm - but for now, it’s arguably the strongest signature of extraterrestrial life we have ever detected.

17:35 - Arctic ice shelf loses enormous chunk

Thanks to hotter summers in the Arctic, a huge calving event has shrunk its largest remaining ice shelf...

Arctic ice shelf loses enormous chunk
Jenny Turton, Friedrich-Alexander University

The Arctic’s largest remaining ice shelf this week lost an enormous chunk into the ocean. The shelf, called 79N, comprises thousands of square kilometres of glacier hanging over the ocean to the northeast of Greenland. Now it’s more than a hundred square kilometres smaller. Glacier ice lost like this isn’t going to be replaced anytime soon, and in combination with similar events last year, as well as wildfires raging in the Amazon and the west of the USA, it paints a bleak picture for the planet - as Phil Sansom heard from Friedrich-Alexander University's Jenny Turton...

Jenny - In 2019 and in 2020, in both years, huge chunks of ice have broken off a glacier in the Northeast of Greenland. In total, over the two years, it's around a hundred square kilometres of ice that's been lost, which is roughly the size of Manchester.

Phil - I mean, that's a huge area.

Jenny - Yes. And way bigger than we've seen previously. Normally it's two to three square kilometres that are lost each year. But for the last two years, we've seen a much bigger amount of ice being lost.

Phil - When you say it's lost, it's not like it mysteriously disappeared, right?

Jenny - No, no. The technical term is calving. This is where a piece of glacier or ice breaks off into a big iceberg or multiple big icebergs and floats away in the ocean.

Phil - So this isn't a bit that was on the glacier in Greenland on the land; this was a bit that was on the sea, and now it's broken off into the sea.

Jenny - Exactly, yeah. So the 79N glacier, partly it's on land and partly it floats on the ocean, which is called an ice shelf. And the part that has broken off was already in the water.

Phil - What is it that causes something like this, and especially causes it to happen two years in a row?

Jenny - There are quite a few reasons responsible for calving events. They're a natural process, normally, when they're much smaller. In the last two years we've had exceptionally warm summers in Greenland; in both 2019 and 2020 we had record-breaking air temperatures. And because it's floating on the ocean, it's also vulnerable to warming oceans as well. When the air temperatures get particularly warm, you get a lot of melting on the surface of the glacier. This water then drains down into cracks and adds additional pressure to the glacier, which widens the cracks. And then this can end up causing a piece of the ice to break off and float away.

Phil - So unquestionably this is a rare event - is that thanks to global warming?

Jenny - It's very difficult to pinpoint particular events to climate change. But when you are having multiple extreme events, year after year, it becomes very hard to say that it's not climate change.

Phil - What do you predict for the future? Can you track how it's melting at the moment?

Jenny - Yeah, we've got quite a lot of observations going on in tracking the speed of the ice, the thickness, how much melting is happening. In terms of predicting, it's quite difficult, but the glacier that sits just south of 79N, in the last decade, lost all of its floating aspect. And now we're starting to wonder - are we going to see the same pattern in the neighbouring 79N glacier?

Phil - What are the consequences of losing these huge areas of ice shelf?

Jenny - I think sea level rise is one of the bigger problems that we face because obviously whilst the ice breaks off locally, the sea level rise will end up being a global phenomenon. Mostly it's through indirect sea level rise. Direct sea level rise is when you lose land ice that goes into the ocean and automatically melts and causes more sea level rise. Because ice shelves are already floating, the mass of them is already taken into account, so when they break off we don't get direct sea level rise; but they allow more of the land glaciers to flow out to the ocean, and so then you get indirect sea level rise perhaps a few years later.

Phil - How much to you is this sort of a bellwether for climate change as a whole?

Jenny - The Arctic is always seen as a very important location for climate change because it often feels the effects earlier than in other places and also to a larger effect. And because it's quite vulnerable to changes in the ice, which then affects how much solar radiation is absorbed, just a small change in the Arctic can cause quite a big impact.

Phil - I asked because it's been quite a hot summer here in the UK. And also obviously recently in the news, there have been these horrible wildfires all down the west coast of America. Are we seeing the same phenomenon?

Jenny - Yeah, it's all related really. In 2019 and 2020 we had particularly warm summers - heat waves in most of Europe, as well as in the Arctic. And this was also responsible for the temperatures in Greenland. And right now we're seeing these record-breaking wildfires on the west coast that are bigger and more fierce than they've ever been before. I saw a good analogy earlier: if you light a match and throw it on a green, lush forest, it's going to do way less damage than if you light a match and throw it on a very dry, dead forest. And because we've had years of droughts on the west coast of America, the forest fires are getting more intense and more frequent.

Phil - Not good, is it.

Jenny - No, and it's not just the northern hemisphere either where we're seeing this. I mean, at the start of the year in January, we also had unbelievable wildfires in Australia. And so it's not just in one particular place, it's happening all across the globe.

23:07 - Humpback whales lost up Australian river

These whales were spotted tens of kilometres inland - how did they get there, and can they make it home?

Humpback whales lost up Australian river

A group of humpback whales were spotted this week tens of kilometres off their beaten track, inland along the East Alligator River in Australia’s Northern Territory. Despite the misleading name - there are no alligators in Australia - the river is nevertheless filled with fearsome saltwater crocodiles. So what were the whales doing there, and what will happen to them? Katie Haylor reports...

Katie - Off-duty marine ecologist Jason Fowler, and colleagues, first spotted a group of humpback whales on a fishing trip in Kakadu's East Alligator River in the Northern Territory; which raised quite a few eyebrows, to say the least, as it's the first known instance of this happening.

Vanessa - We first saw these animals up to as much as 20 kilometres up a river in the Northern Territory in Australia; and just to paint a bit of a picture here, we're talking murky, muddy waters. And so when I first saw the picture of a humpback whale, which is an oceanic species, in this murky water, it was something... it was a phenomenal thing.

Katie - That's Sydney-based marine scientist Vanessa Pirotta. Now two of the whales, it's thought, have since swum back out to sea; but at least one remains in the river. And the worry is that it might get stranded in the shallower water.

Vanessa - Is it going to be able to get out? So the main reason that they're probably in there is - I should point out this has never happened before - but maybe one of the animals took a wrong turn and ended up in this area. The humpback whales are generally in the Kimberley region, which is the North West of Australia, each and every year to breed and have their babies. And then now it's their time to be heading back south, down to Antarctica, where they're going to spend the summer feeding; but let's hope that this one remaining whale has the opportunity to do just that.

Katie - Being in a tidal river is rather different to being in the sea. So how might the whale be doing?

Vanessa - They do use sound to listen and to vocalise, and to talk to each other. Now this whale, because they're non-echolocators, may be reliant on visual cues, so simply having a little look around or trying to see where there's a space to swim in that's safe. So there's a whole number of things that would probably be going through this whale's mind, and without anthropomorphising it or putting a human spin on it, I'm sure that this animal might be doing circles. At least on Friday there'll be a team going up just to have another look at it, just to see what it's doing, and to see if it's made any progression in its movements.

Katie - Stranding is a real risk. And up in the Northern hemisphere, Southampton University's Clive Trueman told me why this is so dangerous for a whale.

Clive - Water would normally be supporting the weight of the organs and the weight of the animal, so when it strands, that can compress the lungs and damage the internal organs. At the same time, if a whale is stuck and the tide is coming in and out, almost paradoxically the whale can drown because it can't lift itself off of the sandbank, and then water can get into the blowhole and drown it.

Katie - So what tools do scientists have available to encourage a 12-plus-metre whale to do anything?

Vanessa - There's a couple examples that I can run through, one being creating a physical barrier with boats so the animal will simply move away, it's hoped; in some cases that hasn't worked in the past where the animal has simply gone onto boats. You could use acoustics such as banging on, physically banging on vessels, which is really not too nice for a whale. Some have suggested using killer whale playback sounds, which is the predator of the humpback whales; but again, a lot of these are potentially going to induce stress, so an expert team will have to weigh up what options are potentially going to be put on the table to see if it's worth inducing these kind of reactions to then have a favourable result, which would be the animal turning directions and heading out to sea.

Katie - As the name of the river suggests, the whale isn't the only thing in the East Alligator River. Clive again.

Clive - Saltwater crocodiles are fantastic animals and extremely intimidating, but probably not a risk to a 16-metre adult humpback whale; unless, again, the whale is stranded. And if the whale is stranded and stuck then you could imagine the crocodiles could pose an additional risk.

Katie - At the time of recording we don't yet know the humpback whale's fate. But could there be a positive here? Could having enough whales to be able to get lost on a migration indicate that population numbers are doing well?

Vanessa - We definitely know this population is doing quite well. In fact, this is one of the largest humpback whale populations in the world. So what I'm trying to say is that the removal of one individual from a very large and growing population is not going to limit the recovery of this species or essentially the population, which is a positive thing.

Clive - Humpback whales are certainly recovering from the effects of whaling faster than many other baleen whales. What would be fascinating to know is whether there are cultural records, indigenous records, tribal stories of humpback whales in rivers from the time before European hunting. And it would be absolutely fascinating if that's true.

Since this story first aired, the remaining whale has safely made it out of the Alligator River and back to the open sea.

30:34 - Algorithms making decisions: the problems

When algorithms take the driver's seat, they can be as biased as humans - a problem when "computer says no"...

Algorithms making decisions: the problems
Karen Yeung, University of Birmingham

Recently in the UK, the government attempted to use an algorithm to generate replacement grades for public exams interrupted by the COVID-19 pandemic. The results were catastrophic: many students claimed they had been treated unfairly and cheated of cherished university places. The government were forced to backtrack. But this is far from the only critical social decision that’s being automated nowadays, as Phil Sansom heard from law and computer science expert Karen Yeung…

Karen - There is a long history of using statistics to inform all sorts of allocation decisions. The problem happened because someone somewhere lost sight of the critical importance of assessing each student on their own merit. And that of course is grossly unfair, and hence we saw the outcry that emerged after those grades were published.

Phil - This idea then, of using what's happened in the past to automate decisions about the future: is this something that happens in other places that I wouldn't expect?

Karen - It happens everywhere. This is one of the really serious problems that we haven't yet been able to find a way to address, because if we take past patterns of shopping behaviour, for example, to build a profile of what it is that you like and what you don't like, then we're going to imagine that you'll like the same kinds of things tomorrow as you did yesterday. Now that's a fairly benign example, but to use one that I find quite powerful to illustrate this problem: a number of years ago in fact, a set of researchers at Carnegie Mellon did an experiment partitioning 1000 simulated users who were asked to search for jobs on the internet. And they allowed them to search away, and then they had them look at the same news sites to see what kinds of personalised ads were served up to them. And what they found is that male users were shown high paying job ads six times more frequently than female users, on the assumption that the historic data showed that women do not acquire high paying jobs; and because they do not click on high paying jobs, the assumption is that women are not interested in high paying jobs, and that they don't have the capacity to meet the criteria of high paying jobs.

Phil - Wow. So you're saying that, not only do you lose the fact that people can be unpredictable, but also you get stereotyped.

Karen - Absolutely, absolutely. There is a basic stereotyping logic when you use historic data as a predictor of the future.
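
As a deliberately simplified sketch of that stereotyping logic - invented numbers, not the Carnegie Mellon experiment or any real ad platform - here is how a system trained on historically skewed click data ends up repeating the skew:

```python
# Toy illustration: a model trained on historically biased data reproduces
# that bias in its future decisions. All numbers here are invented.
import random

random.seed(0)

# Historic log of (group, clicked_high_paying_ad). Past targeting meant women
# were rarely shown such ads, so they rarely appear as clickers in the log.
history = ([("male", random.random() < 0.30) for _ in range(1000)]
           + [("female", random.random() < 0.05) for _ in range(1000)])

def click_rate(group):
    """'Model': estimate a group's click probability from the historic log."""
    clicks = [clicked for g, clicked in history if g == group]
    return sum(clicks) / len(clicks)

# Serving policy: show the high-paying ad in proportion to predicted clicks,
# so yesterday's pattern is simply projected into tomorrow.
for group in ("male", "female"):
    rate = click_rate(group)
    print(f"{group}: predicted click rate {rate:.2f} -> ad shown {rate:.0%} of the time")
```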

Phil - Is that something that you can fix by just doing better at your statistics or your analytics?

Karen - I'm not convinced that we can. How could you eliminate, in a non-arbitrary, non-subjective way, historic bias from your dataset? You would actually be making it up. You would have a vision of your ideal society, and you would try and reflect it by altering your dataset accordingly, but you would effectively be doing that on the basis of arbitrary judgment.

Phil - We talked about getting a job; we talked about crucial exam results; are there other areas where this kind of thing is a problem for crucial life moments or decisions?

Karen - Yeah, so I think one of the things that has emerged in recent years in particular is that public sector decision-making, particularly in relation to eligibility for certain kinds of public service, has increasingly been automated.

Phil - That actually happened in the UK, like automated universal credit or something?

Karen - Automated universal credit is a nice illustration. There's the robodebt fiasco that you may have heard of in Australia, where attempts were made to claim back predicted overpayments, and a number - many, many thousands of people - were deprived of benefit checks. And in fact one young man even committed suicide because he was erroneously charged.

Phil - Is this technology a straight-up bugbear? Because it seems like it can be quite useful for chewing through huge, complicated, tedious tasks.

Karen - It's absolutely true. And computational systems are wonderful at automating very repetitive, straightforward tasks, and there are so many tasks that we should celebrate when they become automated. But I think what we need to attend to is thinking about these technologies as complex sociotechnical systems, particularly when the consequences are concrete for people's lives. And of course the rich are able to escape these kinds of systems and can usually speak to a human if they want to speak to one; and there are many other stories of algorithmic horror shows, if you like, where people have been essentially trapped or find it impossible to challenge the outcomes that are being produced from these systems, because they simply don't have an entry point.

Phil - Do we have a proper plan to train people to be really good at using data and automation like you're talking about, and being trained to figure out when's the right place to use them?

Karen - I don't think we have yet.

36:16 - Privacy, surveillance, and the trade in data

Everything about us is tracked and recorded, to be sold onwards often without our consent. What can we do?

Privacy, surveillance, and the trade in data
Carissa Véliz, University of Oxford

Much of the automation described earlier in the show by Karen Yeung relies on enormous amounts of data collected about, and during, people’s day to day activities. Current estimates suggest that we’ll produce in the region of ten sextillion bits of information in 2020 alone. That’s a mind-boggling amount - if you tried to record each bit by hand, you could easily hit the end of the universe itself before you finish. But with each of us generating so much information about ourselves, our lives, and the world around us, our privacy is shrinking and shrinking. That’s the subject of a new book, Privacy is Power - out this week - from Oxford University’s Carissa Véliz. She explained to Chris Smith…
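
(A rough back-of-the-envelope check of that comparison, assuming one handwritten bit per second - an assumption purely for illustration:)

```python
# Back-of-the-envelope: how long would it take to write 2020's data by hand?
# Assumptions (ours, for illustration): ~1e22 bits, one bit per second, non-stop.
bits = 1e22                                        # ten sextillion bits
seconds_per_year = 60 * 60 * 24 * 365.25
years_to_write = bits / seconds_per_year           # roughly 3e14 years
age_of_universe_years = 1.38e10                    # ~13.8 billion years

print(f"{years_to_write:.1e} years of writing,")
print(f"about {years_to_write / age_of_universe_years:,.0f} times the current age of the universe.")
```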

Carissa - Well, it's a book about the state of privacy today, how the surveillance economy came about, and why we should end the trade in personal data and how to do it.

Chris - What's actually getting documented? I said that we're getting lots of data recorded about ourselves, but what are people actually logging and recording and documenting?

Carissa - Almost everything you do online or while you have your smartphone near you is being recorded. And that includes sensitive information like who your friends and family are, where you live, whom you sleep with, if you're having an affair, where you work, your credit history, your purchasing power, your diseases, your personality traits, your sexual orientation and fantasies, your political tendencies, whether you've had an abortion, whether you do drugs, how well you drive, what you search for, what you buy, what videos you watch, what keeps you up at night, how well you sleep, whether you exercise, and much, much more.

Chris - My goodness, that sounds terribly alarming. How on earth do they know all that? Because I thought this stuff was supposed to be anonymous.

Carissa - It turns out that it's incredibly easy to re-identify data. Even if you only have two data points, say location data about where somebody sleeps and somebody works, that's enough; because usually there's only one person who sleeps and works where you do, and that's you.
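
A tiny sketch of why two location points are so identifying - the dataset below is invented, but the principle that most home/work pairs are unique is exactly what real re-identification exploits:

```python
# Toy "anonymised" dataset of (home area, work area) pairs - no names attached.
# Most pairs occur exactly once, so two data points are enough to single you out.
from collections import Counter

records = [
    ("Cambridge CB1", "Cambridge CB2"),
    ("Cambridge CB1", "London EC1"),
    ("Ely CB7",       "Cambridge CB2"),
    ("Cambridge CB1", "Cambridge CB2"),   # the only pair shared by two people
    ("Royston SG8",   "London EC1"),
]

counts = Counter(records)
unique_pairs = [pair for pair, n in counts.items() if n == 1]
print(f"{len(unique_pairs)} of {len(counts)} distinct home/work pairs occur exactly once")
# If your pair is one of those, the 'anonymous' record is unmistakably yours.
```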

Chris - And so that's how they can figure out if you're sleeping where you shouldn't, so to speak?

Carissa - Exactly, or if two phones are together more than they should be. It's all about inferences.

Chris - At what point did it become a free-for-all, that people could just grab this information, and who has in fact got it? When we say 'they' are collecting this information: who?

Carissa - It started happening in 2001, especially with Google and the development of personalised ads...

Chris - But there's a big jump between personalised ads and where I spend the night.

Carissa - Well, not really. Companies want to know as much as possible about you so they can target you as precisely as possible. So say if you're having an affair, maybe you are interested in certain kinds of apps that allow you to be secret.

Chris - And what's really changed then? If this has been going on since 2001, this is not really a new problem.

Carissa - It becomes riskier and riskier the more data we have, and people are having more bad experiences. In a survey I carried out with Siân Brooke, about 92% of people have had some kind of bad experience with privacy, from data theft to public humiliation. Privacy is important because data is toxic. It's dangerous to have it out there, and there are many ways in which it can be misused.

Chris - At what point did I give permission for all of this data to be collected? Is it just assumed that it's okay to make these sorts of data collections, then?

Carissa - You didn't give meaningful consent. By the time we realised it was happening, the data economy was well developed. And even today, Google sends personal information about you well before you consent to any kind of personalised ads.

Chris - But I thought that's what our friends at the European Union intended with their GDPR - the device designed legally to stop this sort of thing happening. Are you saying it's just not working then?

Carissa - It has helped, but it's not enough: first because even if you consent, it's not informed consent, because you don't know what kind of inferences people can make from your data - not even data scientists can know. And secondly, many times it's not working because companies are not strict enough, and so sometimes your data gets sent before you consent.

Chris - That's sort of what's happened with Uber, isn't it? Because there was a report in the Times newspaper recently showing that Uber, at least in the UK, have agreed to share passenger details with the police!

Carissa - Yes, that is concerning. A lot of questions arise as a result, but one of them is whether it's okay for the government to be encouraging certain services that might be bad for society overall - if you think for example of Uber's problems with safety and with employment - just because it provides the government with a surveillance opportunity.

Chris - So is it a lost battle then? Is it too late for me, I may as well just resign myself to the fact that Google, Facebook, and all the others know more about me than I do; or actually can we start to do something about this now?

Carissa - It's definitely not a lost battle; we're just starting. I think we're going through a process of civilisation and we're turning the internet into a liveable and bearable place. So in some ways we're better now than we were five years ago because there's more regulation. It's not enough, but it matters. And every time you protect your privacy, it matters. You don't know which data point will be the one that causes you harm, so anything you can do helps. And it's also making a statement about what you stand for, and you'd be surprised to what extent governments and companies are sensitive to these expressions of dissatisfaction. They're listening.

42:31 - Free will online: the illusion of choice

You might think you control your online life - but algorithms make most decisions for you...

Free will online: the illusion of choice
Kartik Hosanagar, Wharton School of the University of Pennsylvania

Thanks to the internet, it seems like today we have more choices than ever before: films to watch or books to read, places to go, people to talk to. But there’s so much information online that we can only ever see a fraction of it - and so most tech companies track everything you do on their platform, build a digital doppelganger of who you are, then compare that model of you to others, to try and figure out what you’d like to see best. Kartik Hosanagar researches the consequences of this process at the Wharton School at the University of Pennsylvania, and Chris Smith asked him who’s really in control here...
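
A minimal sketch of that 'compare your model to others' step - user-based collaborative filtering with invented users and ratings; real recommenders are vastly more elaborate, but the core idea is the same:

```python
# Minimal user-based collaborative filtering: recommend what similar users liked.
# All users and ratings below are invented for illustration.
from math import sqrt

ratings = {   # user -> {film: score out of 5}
    "you":   {"Alien": 5, "Arrival": 4, "Notting Hill": 1},
    "dana":  {"Alien": 5, "Arrival": 5, "Blade Runner": 4},
    "errol": {"Notting Hill": 5, "Love Actually": 5, "Alien": 1},
}

def similarity(a, b):
    """Cosine similarity over the films both users have rated."""
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    dot = sum(ratings[a][f] * ratings[b][f] for f in shared)
    na = sqrt(sum(ratings[a][f] ** 2 for f in shared))
    nb = sqrt(sum(ratings[b][f] ** 2 for f in shared))
    return dot / (na * nb)

def recommend(user):
    """Score unseen films by the ratings of similar users."""
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = similarity(user, other)
        for film, score in ratings[other].items():
            if film not in ratings[user]:
                scores[film] = scores.get(film, 0) + sim * score
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("you"))   # 'Blade Runner' ranks ahead of 'Love Actually'
```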

Kartik - We all believe that we're making our own choices, but all of my research shows that's not the reality. For example, 80% of the viewing hours streamed on Netflix originated from automated recommendations. At YouTube, the statistic is very similar. Close to a third of the sales at Amazon originate from automated recommendations; and the vast majority of dating matches on apps like Tinder are initiated by algorithms. Really, if you think about it, there are very few decisions we make these days that aren't touched by algorithms built on top of big data.

Chris - So how are these algorithms actually doing what they do, and what are the risks?

Kartik - First, I want to acknowledge that algorithms create a tremendous amount of value. I mean, which of us wants to go back to whatever the TV network decides we need to watch? But at the same time these algorithms are not necessarily objective, infallible decision makers; they are prone to many of the same biases we associate with humans. For example, in a recent case where algorithms were used in courtrooms in the US to compute a defendant's risk of reoffending, it was shown that these algorithms were biased against black defendants. And to be clear, it's not that there's a human programmer who's programming these biases in; rather, the algorithms are learning from data. So if in the past there's a race bias in the policing system or the criminal sentencing system, the algorithm will learn to assume that a black defendant is more likely to reoffend.

Chris - But also these algorithms can be sussed out by savvy humans, who basically work out how they work.

Kartik - Yeah, I think that's a good way to look at it. If you look at the world of advertising and marketing, even before computers, before digital technologies, most of marketing and advertising was focused on how do we... I don't want to use the word “con” people into making decisions, but certainly how do we persuade people? The new version of it is: how do we persuade algorithms to put us in front of the people? I view this mostly as a positive, in the sense that this exercise helps ensure that the most relevant websites come in front of consumers; but at the same time, there is a dark side of it. Some of these companies are focused on, "how do we fool the algorithm into thinking our page is more relevant than it really is?" There's an element of grey here. And one, again, needs to be cautious about: how is the algorithm making the decision? Why did this particular recommendation get made?

Chris - Is there not a danger that this is narrowing our choices, because it just force feeds us a monotonous diet of what we like, and it makes us less adventurous, less likely to think outside the box; and as a result while our life may seem simpler, it's potentially poorer for it? Are we losing our free will here?

Kartik - Yeah. In fact in my book, A Human's Guide to Machine Intelligence, I argue that most of us really do not have the free will that we think we do. And if you think about who we are, at the end of the day, it's the sum total of the choices we've made - what we read, what we bought - and that has ultimately shaped us into who we are today. To be clear again, I'm not saying we need to become Luddites and go back to a world before algorithms. The analogy I offer is that it's like an early caveman discovering fire and saying, “well, this can be tricky to control - let me stop using fire”. Instead you learn to use fire; you learn to control fire; you learn to have things like the fire department that can extinguish fires; you maybe even keep fire extinguishers at home and so on; and you use fire.

47:43 - The attention hack: how tech is changing us

The companies that trade in manipulating human decisions - thanks to data - are changing our culture...

The attention hack: how tech is changing us
Aza Raskin, Center for Humane Technology

Whenever you’re endlessly scrolling down Facebook - or Twitter, or Instagram, or whatever your poison - these platforms are only showing you the content you’re most likely to engage with. Aza Raskin, from the Center for Humane Technology, is the man who invented that ‘endless scroll’ function; and he says those algorithms have got so good at what they do that they make us addicted to the pseudo-reality the online world shows us. He explained the idea to Phil Sansom...

Aza - I think it's important to step back and say: we call these things social networks, social sites, or just apps - TikTok, Reddit, Facebook - and that actually hides the true nature of what they are. These are immense digital habitats in which, post-COVID, we are living over one half of our waking lives. And the shape of our environments, what we tell the machines they want from us, have profound implications not just for our behavior, but for our values themselves. The story of AI is a very old one, it's the: be careful what you wish for, because whatever you wish for, the AI is going to go off and do, independent of how you wish for it to happen, and it will get your intentions wrong. And what we wished for was: can we grab as much of your attention as possible.

And in fact, how Google, Facebook, Twitter, they make their money is by selling certainty in the ability to get you to take certain actions which were different than the actions you were taking before. So as they've asked us for more and more attention, it's starting to override - because these are digital habitats - who we think we are. And we first felt it as digital overload; then digital addiction, unable to spend time with the people we care about most; to the place where over 52% of kids in the US, when they are asked, "what do you want to be when you grow up," say not astronauts, not scientists, but YouTube influencers. It has changed who we are as a culture.

Phil - There's that phrase that comes up a lot here, which is that if you're not paying for a product, you are the product.

Aza - Yeah, that's exactly right. The question everyone should be asking themselves is: "how much did you pay for Facebook recently?" You're like, “oh yeah, I haven't paid anything”. Why is that? Then the first answer you'll get is, “well, they're trying to sell you ads. They're trying to get you to buy something.” And then the first reaction to that is like, “wow, well, I sort of like those ads!” And that completely misses the point. They're collecting massive amounts of behavioural surplus data to make models of you, like a little voodoo doll, which they can prod and poke to see how you respond, to get you to take specific actions. Sometimes that's click on an ad, but often it's just to get you to do anything, because that puts you in a reactive state. 64% of all QAnon and conspiracy joins on Facebook come from the Facebook recommendation algorithm itself.

Phil - This is a hugely popular conspiracy, right?

Aza - Yeah, it has spread like wildfire through the US. Lots of these conspiracies are taking hold across the internet. And why is that? Well it's because if you shorten the attention spans of the entire world all at once, we stop reading as much, and it creates the conditions in which anger and the worst parts of human nature are reflected back to ourselves. And it's not like technology is an existential threat to humanity, but the worst of society is absolutely an existential threat to humanity. And what technology is doing is showing our worst versions of ourselves back to ourselves.

Phil - Aza, it feels like we've always been worried about technology, ever since there's been technology. You can read little newspaper op-eds from the 1800s or whatever that say, “people are spending too much time reading”! What's different about this?

Aza - Totally. And we should be intellectually honest, because as you say - newspapers, bicycles - we've often had moral panics that come with new technology. And that's because new technology changes things. It makes disruptions, it alters the status quo. And the argument here is not that technology is bad; I've spent my entire lifetime building technology. I still do it. We've always had persuasion; we've always had propaganda; we've always had advertisements. What's new here is that we are now living inside of the advertisement; living inside of the persuasion; living inside of the technology.

And that means its effects on us are exponentially bigger than they were before. When we wake up, we look at our phones; before we go to sleep, we look at our phones. The way we interact with our friends is through our phones and through for-profit companies' decisions. That has never happened before. And on the other side of the screen are a thousand engineers times a massive supercomputer that knows more about you than your lawyer, your priest, and your therapist combined. That kind of asymmetric power is new. To put it another way, as E.O. Wilson said, the problem that we're facing now is that we have paleolithic emotions, medieval institutions, and godlike technology. And we do not have the godlike wisdom to wield this technology yet.

Phil - Aza, if I'm reading you right, then things like the rise of fake news are actually a natural consequence of having these algorithms get better and better at feeding us stuff that makes us click. And actually, as you were talking, I had not one but three notifications buzz on my phone in my pocket, which freaked me out a little bit! We all live in this world... is there any way to not?

Aza - Well, so the first thing to note is there are absolutely things that you can do that can help, but even if you don't use social media, you still live in a world that does. It's sort of like when you're in the middle of a pandemic, you can exercise, you can stay home; there are things you can do to protect your health, but you still live in a world which is affected by the global pandemic. That said, because it's often like gaslighting, it's sort of like saying, “hey, to solve climate change, you should not use straws and fly less,” but it's really not the consumer sector which is driving the most climate change. A few tips: turn your phone on greyscale. Turn off all notifications from anything that is not a human. Always ask: if something feels like it's pushing your emotional buttons, maybe it is. And finally, always wonder: why are we so angry at each other? The other side - how could they possibly believe what they believe, are they seeing what I'm seeing? And the important thing to note is they aren't seeing what you're seeing. They're seeing something different. Swap phones, and look at each other's newsfeeds and see how different their world is.

55:09 - QotW: in space, what units of time work best?

Listener David asks how to measure time during space travel - and how the body clock gets affected...

QotW: in space, what units of time work best?

Time is of the essence, as Eva Higginbotham has been working against the clock to answer this question from listener David...

Eva - Our sense of days, nights, weeks, and months is so ingrained, it's almost hard to imagine anything else. So who better to answer than someone who's experienced space travel firsthand? I put the question to former NASA astronaut Steve Swanson.

Steve - What measurement of time would I use if I was on a long space trip? Psychologically, it's best to keep what we're used to, in that sense. So you can still celebrate birthdays and stuff like that, because we could remember kind of how old we are. I think that would be very important to have along too.

Eva - But what effect might being in space have on your body clock? I asked sleep scientist Cassie Hilditch, who works in collaboration with NASA scientists.

Cassie - First, we need to understand how our body clock or circadian rhythm works. The timings of many of the processes in our bodies are programmed by a cluster of cells in the brain, and one of their main jobs is to coordinate when our bodies should be asleep, and when we should be awake.

Eva - Our internal body clock is set to be about 24 hours, but is usually a little off. This means we need to set our clocks every day to keep in sync with the light-dark cycle of our environment. And our body does that by using sunlight as a time cue. But when our body clock is out of sync, it can affect our ability to sleep, to stay alert, and ultimately lead to long term health consequences.

Cassie - On the International Space Station, or ISS, astronauts and cosmonauts are currently having to deal with this very problem. The ISS orbits the Earth about every 90 minutes. So the crew on board see the sunrise and sunset 16 times per Earth day. This, as you can imagine, sends some pretty confusing time cues to the body clock, and can cause a disruption of the different systems that are usually synchronised in the body.
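
(For the arithmetic: an Earth day is 24 × 60 = 1,440 minutes, and 1,440 ÷ 90 = 16 orbits, hence roughly 16 sunrises and sunsets every day.)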

Eva - On the ISS there are studies going on trialling the use of specialised lighting to help align the body clocks of astronauts to a 24 hour rhythm, by mimicking the light patterns on Earth. If this is successful, Cassie says this technique could possibly be adapted to other spacecraft, including for deep space flight in the future.

Cassie - And we're also starting to think about how we might live on Mars, which has a different day length of 24 hours and 39 minutes. Luckily this is pretty close to 24 hours, and studies suggest that we might be able to entrain to, or synchronise with, this Martian day, but we might still need some special lighting to help us.

Eva - Thanks Cassie and Steve! Next week we'll be taking a cold shower while looking for the answer to this question from Margaret.

Margaret - Why, why, why can I work in the yard and be covered in sweat for hours and only stink a little, but reveal one sensitive personal thing to a group of friends and immediately stink to high heaven?
