Would you succumb to robo peer pressure?
Would you let a robot influence your judgement in the same way you might take advice from your friends and family? It’s a surprisingly important consideration, because robotic devices, including things like virtual assistants, are spreading rapidly in society, so we need to know how they can influence people, and especially children. Katie Haylor heard how the University of Plymouth’s Tony Belpaeme is investigating...
Tony - We’ve been using robots, and specifically social robots, so these are not robots that make you a cup of tea or that would hoover the house, but these are robots that would interact with you on a social level. And we’ve been using these robots in hospitals, in schools, and quite often these robots need to be very convincing there. They need to convince you to stick to a dietary programme or they need to convince you to do your homework. And we were having a chat with some psychologists about how can we measure really how convincing our robots are, and that’s how we arrived at measuring how much peer pressure a robot can exert on people.
Katie - Peer pressure is quite a common concept in terms of human to human peer pressure, right?
Tony - It is, it is. We all know that we follow the herd; we wear what the others are wearing; we say what the others are saying. In the 1950s Solomon Asch devised a very simple and elegant experiment to actually measure how much we succumb to peer pressure. So you’ve got a group of people, they're brought into a room, and all of them except one are accomplices - they’re in on the game.
You let these people do a simple visual task. On the screen you show four lines and they need to say which two lines match in length. And it’s such an easy task that if you were alone in the room you would be near perfect. But what if all the people in the room actually give wrong answers? They consistently say the wrong line, the one that doesn’t match. Will you follow suit; will you succumb to peer pressure or will you stick to the correct answer? That’s the twist in the experiment. Usually it’s been done with people to see how much we follow social pressure from a group of people, and this time we replaced the people with robots.
Katie - Okay. So tell me what you did in this study.
Tony - Because this experiment was done in the 1950s, we first wanted to check if now, in the 21st century, people still succumb to peer pressure in this experiment. So we did the experiment all over again and, much to our surprise really, people still kind of followed suit. They follow what the others are saying even if that answer is wrong. So we knew that the setup works, the experiment works, and then we removed the adults and replaced them with robots. It’s a small group of robots - only three robots - and we checked if the adults would succumb to peer pressure from the robots, which they didn’t.
But then we replaced the adult with a child, so now we had a child with three robots in the room doing that very simple visual test, and what we noticed was that the children followed. They succumbed to peer pressure from the robots, so the robots really do exert social pressure on children to give a wrong answer.
Katie - What would you say is the significance of this finding?
Tony - On the one hand, children succumbing to peer pressure sounds sinister but it doesn’t have to be. It’s very important to us that children actually listen to what robots say and follow what robots suggest. We can make excellent use of that in education, in health care, in therapeutic support, and we’ve been doing that for many years.
But there could be a dark side to this as well. It could, for example, be that if you have one of these robots in the home and these robots suggest that children buy things, or ask for things that they shouldn't be asking for, then we know that these robots are much more convincing than any other form of technology.
We’ve had different types of technology in the home. These social robots are going to be a new form of channel through which information reaches us and it reaches us in a social manner. We now know that these robots will be able to persuade you to do things that could be good or that could be bad.