Are Moral Values Contagious?
Would you kill one person to save 5 others? Does religion shape morality through sins of omission, and how can you tweak people's moral values?
Hannah - First up, let's question our morals. Imagine the scenario - there's a runaway trolley barrelling down the railway tracks. Ahead on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. Unfortunately, you notice that there is one person tied up on the side track. So, you have two options. One, do nothing and the trolley kills the 5 people on the main track, or two, pull the lever, diverting the trolley onto the side track where it will kill just one person. Which is the correct choice? I put this question to Ray Dolan, Professor of Decision Making at University College London.
Ray - What I would do, I could say, "Well, I'm going to do this..." but I might do something very different in the heat of the moment. My guess is that what I would do is probably nothing, and I think that is the case also for the majority. So, 5 people may die and the one person who you could've run over will survive. Whether we act or don't act is very much influenced by the consequence of our action, and if the consequence of our actions, the outcome, is bad, we are more disposed not to act than to act. It's a basic tendency. So, that would tell me that on average, the majority of people would do nothing. Now, I was brought up a Catholic and we learned a distinction between two types of sins - sins of omission and sins of commission. It's an interesting distinction because here, I have a choice between omission - don't do anything, which is a minor sin - and commission, which is a more active sin. So, I think the fact that that's there in Catholic ethical teaching reflects a wider kind of truth: to do something through action usually carries a greater social sanction than to do something through inaction. I think society at the moment thinks you will not be blamed for doing nothing, whereas if you switch the lever and run over somebody, somebody is going to blame you.
Hannah - Do we know much about what's going on in the brain whilst we're making these very difficult moral decisions?
Ray - People are beginning to study it and there are lots of surprising findings. We've been doing work on pain and we've got very counterintuitive findings. Let me give you just a brief summary; this is work done by somebody called Molly Crockett in the lab. She's been looking at the following issue: if I'm given an amount of money, I can use it to buy myself out of pain, or I can use that same amount of money to buy you out of the same pain - pain that hurts you just as much as it hurts me. The question is, will I pay the same amount of money to avoid 1 unit of pain for myself as I would for you? It turns out - and this is a robust finding - that I'm likely to pay twice or three times as much for you to avoid pain as for me to avoid the same level of pain. Now, that's a level of altruism that's totally unpredicted, totally surprising, and this is somebody you'll never ever meet again. So, there's no social consequence, no reputational consequence to yourself. We make that quite explicit.
Hannah - That's quite a surprising finding, as you say, for someone to be that altruistic to a stranger in such a capitalist society as we now live in. Can you explain why people seem to make those decisions in this experimental paradigm, and whether you think it actually translates into real life?
Ray - I mean, there's always a question about whether these things translate outside a laboratory setting - it's very hard to do naturalistic studies. But I think the findings are so robust and so replicable across individuals that there will be a very strong residue. I think you have to invoke almost something like an evolutionary argument here: in fact, there is a great advantage to us, in general or in the long run, in being helpful and good to others. I think a lot of morality arises out of a framing of this truism within a more formal notion of 'you will go to heaven or go to hell', but these truths were around long before religion. If I help you, that's going to be money in the bank, sort of moral money, for when I need help in the future, and I will see that it's to my advantage. I subscribe quite strongly to the view that in every moral act, in every altruistic act, there's also a selfish component. So selfishness is not such a bad thing sometimes, because my selfishness will mean that I'm prepared to invest, and invest quite heavily, in the well-being of others, because their well-being may well be my well-being in the future.
Hannah - So, do we have, ingrained in our neural circuitry, a basic concept of karma - that what goes around comes around - and it may be that we give to a stranger but then another stranger will return to us something that we value?
Ray - These sorts of deep causal explanations are very difficult to prove or disprove, as you can imagine. All you can do is gather residues of evidence that would be supportive of that. I suspect that over the scale of evolution, the types of behaviour that have been selected for will incorporate these types of altruistic behaviours, because they are in our own interest in the long term. So, you have to have a long horizon. I think they are probably wired into our better natures.
Hannah - Is there any evidence that other animals also exhibit this altruistic behaviour in societies?
Ray - There's a very famous, somewhat controversial, experiment involving rats, whereby a rat, given a choice of pushing a lever to get food or pushing a lever to diminish a shock to a confederate it can see in an adjacent cage, will actually choose to push the lever to reduce the pain. There's a lot of anecdotal stuff as well, in relation to dolphins for example - interesting anecdotes of dolphins gathering in a group to keep a person afloat who's in danger of drowning, and many, many stories like that. So, I think it would be peculiar if there were particular evolutionary pressures that were unique to humans. This must be common to all sentient beings.
Hannah - Are there any groups of humans who don't seem to exhibit this altruistic type of behaviour that you've been studying? So for example, maybe particular patient groups or maybe groups of people with particular political views.
Ray - Yeah, so where do we start? The answer is almost certainly yes. For any human trait - be it height, eye colour, skin colour - there's a range, and that applies also to behaviour: the degree to which you startle, for instance, is a distribution. Some people will startle more easily, some people less. So, how would you recognise somebody who didn't give a damn about the pain? Going back to the experiment I described, somebody who wouldn't pay a penny to reduce your pain, but would pay a lot to reduce their own. Well, that might bring up the picture of a sociopath, or people with psychopathic behaviour who can be extremely cruel to other people. I think the interest of these experiments goes very much to the heart of those problems. How can we understand other people's behaviours when they seem so extreme? I've been working on this across a number of different domains and we're now beginning to think about applying it in a more clinical context to try to solve these types of problems. So, if we just take something like psychopathy, where somebody can be very cruel to others with total disregard for their pain: is that due to a biological problem such that they cannot represent the states of others - in other words, their sentient feeling states?
Hannah - Or is it just that they don't actually feel the negative value of pain? Is it that they're just not so receptive to pain?
Ray - They're very receptive to their own pain.
Hannah - But not to other people's.
Ray - So, it's very much to do with representations of others. Something I'm going to talk a little bit about today is another study, where we've found something that suggests that if I interact with you, for example, and I learn about your values, and I have a set of values for that particular context myself, then by interacting with you, my values will shift a little bit towards your values - and this seems to happen implicitly, automatically. In the experiment, I have to make decisions for myself and make decisions for you. I have to learn your values in this context, but in the process of interacting with you, my valuation shifts towards yours. We've seen that behaviourally and we've got a neurobiological account of it. It happens automatically; it seems to happen in everybody. We tend to take a moralising stance towards people who are bad like that, but in fact, it may be that they are victims of a biological development that has gone awry for whatever reason. That may not be just down to their genes - it could be something like their early environment, etc.
Hannah - And so, there may be a point where you can actually diagnose sociopathy more accurately, based on these types of experimental paradigms and also the brain imaging studies that you're doing, to get to grips with these very difficult moral questions.
Ray - And perhaps early on, if you understand its biology, you might be able to come up with interventions that can actually change that, because in the studies I've described, we see what I call 'plasticity' - things change on the basis of experience. You may be able to teach people. If you understand the biological realisation, then you're in a better position to come up with treatments that are rational and that are focused on what the problem is. What we have at the moment is a range of ad hoc treatments that generally don't work.
Hannah - And then I suppose also, if you're surrounded by people that have good - as they're perceived - moral values then, as you were just saying, your own moral values will shift towards theirs. So, that's a form of therapy that might be available to those who maybe experienced an early insult during their childhood, or a trauma, which may have somehow tweaked or shifted their moral values in a direction that isn't quite so socially acceptable.
Ray - Yeah, I think that's right. We learn an awful lot - not just through our own experience; we learn a hell of a lot through observing others, absorbing their behaviour, making inferences about what it is that accounts for their behaviour: "that person doesn't do that because it would hurt somebody." And so, knowing what the correct type of environment is for people to flourish, and to nurture people, becomes very, very important, because we can specify it in a more precise way. And then, if you take that into a therapy, you can set up much better therapeutic environments, because not only do you know what the key ingredient is, you can actually monitor it as it goes along. You can have a measure of whether it is working or not. Right now, we don't have anything like that because we don't even understand what the core problem is in many of these situations.
Hannah - And going back now to the point of religion, as fewer people in Europe, or certainly in the UK, are Catholic or have a particular religion that they really believe in, do you think that's going to cause the moral degradation of society?
Ray - My own view is that religion doesn't insulate you from bad behaviour. In fact, you can make a very powerful argument that religion has been the engine of an awful lot of terrible behaviour, both in the past and today. So, I don't think religion is necessarily a good soil for people to develop moral values, despite the claims of religion; I would say the evidence from history is that it's had a very detrimental effect. I grew up in a very religious family, a Catholic family, and does that make me a better person? I don't know. The one thing I did learn was some very fine-grained moral distinctions. Going back to what you asked me about at the beginning - the trolley problem - I think at the age of 9 or 10, I was very aware of the distinction between sins of omission and sins of commission. So, I learned a very fine-grained way of thinking about the world. But knowing that, I'm not certain it made me a better or worse person necessarily. Fear of hell certainly never stopped me doing some very naughty things.
Hannah - Thanks to Ray Dolan, Professor of Decision Making and Neuroimaging at University College London.