Privacy, surveillance, and the trade in data
Much of the automation described earlier in the show by Karen Yeung relies on enormous amounts of data collected about, and during, people's day to day activities. Current estimates suggest that we'll produce in the region of ten sextillion bits of information in 2020 alone. That's a mind-boggling amount - if you tried to record each bit by hand, you could easily hit the end of the universe itself before you finished. But with each of us generating so much information about ourselves, our lives, and the world around us, our privacy is shrinking fast. That's the subject of a new book, Privacy is Power - out this week - from Oxford University's Carissa Véliz. She explained to Chris Smith...
Carissa - Well, it's a book about the state of privacy today, how the surveillance economy came about, and why we should end the trade in personal data and how to do it.
Chris - What's actually getting documented? I said that we're getting lots of data recorded about ourselves, but what are people actually logging and recording and documenting?
Carissa - Almost everything you do online or while you have your smartphone near you is being recorded. And that includes sensitive information like who your friends and family are, where you live, whom you sleep with, if you're having an affair, where you work, your credit history, your purchasing power, your diseases, your personality traits, your sexual orientation and fantasies, your political tendencies, whether you've had an abortion, whether you do drugs, how well you drive, what you search for, what you buy, what videos you watch, what keeps you up at night, how well you sleep, whether you exercise, and much, much more.
Chris - My goodness, that sounds terribly alarming. How on earth do they know all that? Because I thought this stuff was supposed to be anonymous.
Carissa - It turns out that it's incredibly easy to re-identify data. Even if you only have two data points, say location data about where somebody sleeps and somebody works, that's enough; because usually there's only one person who sleeps and works where you do, and that's you.
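Carissa's point about re-identification can be sketched in a few lines of code: given an "anonymised" table of (home, work) location pairs, simply counting how many records share each pair shows how often a pair points to exactly one person. The records and cell names below are hypothetical, purely to illustrate the idea.

```python
from collections import Counter

# Hypothetical "anonymised" records: (home cell, work cell) location pairs.
# No names are attached, yet the pair itself can act as a fingerprint.
records = [
    ("cell_17", "cell_42"),
    ("cell_17", "cell_42"),  # two people happen to share this home/work pair
    ("cell_08", "cell_91"),
    ("cell_23", "cell_05"),
]

counts = Counter(records)

# A record is re-identifiable when its (home, work) pair is unique:
# anyone who knows where you sleep and where you work can single you out.
unique = [pair for pair, n in counts.items() if n == 1]
print(len(unique), "of", len(counts), "distinct pairs identify exactly one person")
# → 2 of 3 distinct pairs identify exactly one person
```

In real mobility datasets the grid cells are far finer and the records far richer, which is why, as Véliz says, two data points are usually enough.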
Chris - And so that's how they can figure out if you're sleeping where you shouldn't, so to speak?
Carissa - Exactly, or if two phones are together more than they should be. It's all about inferences.
Chris - At what point did it become a free-for-all, that people could just grab this information, and who has in fact got it? When we say 'they' are collecting this information: who?
Carissa - It started happening in 2001, especially with Google and the development of personalised ads...
Chris - But there's a big jump between personalised ads and where I spend the night.
Carissa - Well, not really. Companies want to know as much as possible about you so they can target you as precisely as possible. So say if you're having an affair, maybe you're interested in certain kinds of apps that help you keep it secret.
Chris - And what's really changed then? If this has been going on since 2001, this is not really a new problem.
Carissa - It becomes riskier and riskier the more data we have, and people are having more bad experiences. In a survey I carried out with Siân Brooke, about 92% of people have had some kind of bad experience with privacy, from data theft to public humiliation. Privacy is important because data is toxic. It's dangerous to have it out there, and there are many ways in which it can be misused.
Chris - At what point did I give permission for all of this data to be collected? Is it just assumed that it's okay to make these sorts of data collections, then?
Carissa - You didn't give meaningful consent. By the time we realised what was happening, the data economy was already well developed. And even today, Google sends personal information about you before you've consented to any kind of personalised ads.
Chris - But I thought that's what our friends at the European Union designed the GDPR for: a legal device intended to stop this sort of thing happening. Are you saying it's just not working then?
Carissa - It has helped, but it's not enough: first, because even if you consent, it's not informed consent, since you don't know what kind of inferences can be made from your data, and not even data scientists can know. And secondly, it often isn't working because companies are not strict enough, so sometimes your data gets sent before you consent.
Chris - That's sort of what's happened with Uber, isn't it? Because there's a report in the Times newspaper recently showing that Uber, at least in the UK, have agreed to share passenger details with the police!
Carissa - Yes, that is concerning. A lot of questions arise as a result, but one of them is whether it's okay for the government to be encouraging certain services that might be bad for society overall - if you think for example of Uber's problems with safety and with employment - just because it provides the government with a surveillance opportunity.
Chris - So is it a lost battle then? Is it too late for me, I may as well just resign myself to the fact that Google, Facebook, and all the others know more about me than I do; or actually can we start to do something about this now?
Carissa - It's definitely not a lost battle; we're just starting. I think we're going through a process of civilisation and we're turning the internet into a liveable and bearable place. So in some ways we're better now than we were five years ago because there's more regulation. It's not enough, but it matters. And every time you protect your privacy, it matters. You don't know which data point will be the one that causes you harm, so anything you can do helps. And it's also making a statement about what you stand for, and you'd be surprised to what extent governments and companies are sensitive to these expressions of dissatisfaction. They're listening.