Ears wriggling in response to sound
We’re all familiar with the way our dogs and cats move their ears about to tune in to the source of a sound. They do it using muscles attached to the external ear tissue. Modern humans don’t visibly do this, of course, but we do still have scraps of the same muscles, and if you record the activity of these muscles, you can work out where someone is paying attention, as Eva Higginbotham heard from Daniel Strauss…
Daniel - Our far away ancestors moved their ears a lot to orient their attention. And in the course of primate evolution, we moved from a more nocturnal lifestyle to a more diurnal lifestyle and became more eye-oriented, so evolution just decided that we don't need to swivel our ears around a lot. But still, you know, these muscles are there - maybe they don't serve the original purpose, but they are still there. And so we measured the electrical activity of the muscles and we used the signals to decode attention.
Eva - So you saw that electrical activity in the muscles around the ears could indicate where the person was paying attention?
Daniel - Exactly. And we decoded two types of attention: one type is exogenous attention, which is driven by the physical world outside. And the other is endogenous attention - this is voluntary attention, you know, when you say 'Oh, I want to pay attention to this, or I want to pay attention to that'.
Eva - Did you have to do different experiments then for the exogenous and the endogenous? Or could you do it all together?
Daniel - Exactly. We did two types of experiments. In one, the person was sitting in a chair with a chin rest. They had to read texts in front of them and they were surrounded by loudspeakers, and while they were reading the text suddenly there was a sound coming from front-left, front-right, or from behind. And this was the exogenous experiment, because we wanted to see how this new surprising sound captured their attention. And in the endogenous experiment, we played two stories, either from the two front speakers or from the two speakers behind, and the participants had to focus on one of these stories.
Eva - What did you find in those experiments?
Daniel - We found out that the activity of the muscles we just mentioned really pretty nicely indicates the direction in which a person is paying attention. And this effect was there all the time - the ears trying to move toward that particular direction. So there was this sustained activity over a longer period of time, basically for the entire listening period. And this was a really surprising result for us.
Eva - And does the electrical activity in the muscle mean that the ears were actually moving? Like were the ears twitching trying to get closer to the source of the sound, is that what you saw?
Daniel - Exactly. We had a stereo-vision camera set up - stereo vision means we had two high definition cameras on each side, so each ear was monitored by two cameras. And stereo vision really allows us to see, you know, the movement in three-dimensional space. And indeed we observed these movements in some subjects. We also used a technique called video magnification that allows you to see the movements a little better than in the original video. In some subjects, we really saw pretty large movements, and they correlated well and were consistent with the muscle activity. So you really see a nice correlation between the muscles being active and the ear moving in that direction.
Eva - That's really amazing! So now that we know our ears move in response to where we're paying attention, what can we do with that information?
Daniel - There is an application if you talk about hearing aids. You know, modern hearing aids have these directional microphones, so they should amplify the sounds in a particular direction. If you are in a conversation you look at somebody, but then you listen to something here, or maybe you focus your attention over there, and this can be realised by this technology - these directional microphones really follow the listening intentions of the user. So it's a really friendly human-machine concept. You know, you really have your intentions and the machine does what you want. It's really, I think, a big promise for our findings here.