The Royal Society summer exhibition 2015

The Royal Society in London comes alive with a week-long exhibition showcasing the very latest in cutting-edge science from around the UK.
07 July 2015

Interview with Patrick Naylor, Matthew Morris and Philip Murgatroyd


Every summer, the Royal Society in London comes alive with a week-long exhibition showcasing the very latest in cutting-edge science from around the UK. This year is no exception, with exhibits and events covering everything from nanotechnology to cosmic rays, cancer screening, and fighting the flu. Kat Arney went along to talk to a few of the researchers presenting their work...

Philip - Hi. I'm Philip Murgatroyd. I'm from the University of Birmingham and I'm part of the Stonehenge Hidden Landscapes project.

Kat - There's a wonderful exhibition here. We've got a tray of sand, there's an enormous flat iPad-looking thing, a very bizarre piece of machinery - the gravity imager - and it's all done up as an underground station. What is this about?

Philip - The basic theme of the display is Stonehenge underground, because it's developed from work that we've done using archaeological geophysics in the Stonehenge Landscape, and these modern scientific techniques have allowed us to completely revolutionise how we see the Stonehenge Landscape.

Kat - I have to ask, what on earth is a gravity imager? It looks like a very impressive piece of scientific kit, but what does it do?

Philip - It's part of the future of geophysics. We're working with the GG-TOP project at Birmingham, which is developing a new way to detect gravity using an atom interferometer. Basically, this catches a cloud of atoms in a vacuum, throws them in the air, and uses laser interferometry to see how they go up and come down. Because we're detecting the effects of gravity on very small atoms, we can detect very small influences of gravity, so we can detect smaller objects from their gravity than we've ever been able to before.
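
(For readers curious about the principle: an atom gravimeter infers local gravity from the phase shift that the free-falling atoms pick up between laser pulses, which for the standard pulse sequence is delta_phi = k_eff * g * T^2. The sketch below is a rough illustration of that relation, not anything from the GG-TOP project; the laser wavelength and pulse spacing are assumed values chosen purely for the example.)

```python
# Minimal sketch of the standard atom-gravimeter relation: the phase shift
# accumulated by free-falling atoms is delta_phi = k_eff * g * T**2, so a
# measured phase maps straight back to local g. All numbers are assumptions
# for illustration, not GG-TOP instrument parameters.

import math

WAVELENGTH = 780e-9                      # m, rubidium D2 line (typical choice)
K_EFF = 2 * (2 * math.pi / WAVELENGTH)   # effective two-photon wavevector, 1/m
T = 0.1                                  # s, time between interferometer pulses (assumed)

def gravity_from_phase(delta_phi: float) -> float:
    """Recover local gravitational acceleration from the measured phase shift."""
    return delta_phi / (K_EFF * T ** 2)

# Example: the phase expected for g = 9.81 m/s^2, then inverted back to g.
expected_phase = K_EFF * 9.81 * T ** 2
print(round(gravity_from_phase(expected_phase), 3))  # -> 9.81
```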

Kat - So, that's letting you see what's underground, what's in the ground.

Philip - Yes. The whole reason we use different types of geophysics is that there are different things you can see with each. Currently, gravity detection is quite poor for archaeological deposits, but that's because the sensors aren't fine enough. By developing our own sensor, we hope to be able to introduce gravity into the whole suite of geophysics.

Kat - Tell me one of the most interesting things you found under Stonehenge using this kind of technology.

Philip - Stonehenge has actually been very well surveyed because it's a relatively small area. What we've done, in conjunction with the Ludwig Boltzmann Institute in Austria, is use a suite of technologies they've developed that allows us to collect geophysical data pulled behind quadbikes at 70 miles an hour. This allows us to focus not just on areas where we think there's stuff, but on the whole landscape, with 12 hectares of magnetometry, and that allows us to see everything. We can see the spaces that we assumed were empty spaces between monuments, and we've found that they're not spaces at all. There are monuments there; we just never looked before, because we've never had the technology to do so.

Kat - So, there's a whole world that's underground that's being revealed by new technology.

Philip - Yes, absolutely. It's an entire landscape that we've not been able to look at geophysically before, and now we can, and it's opened up a whole array of different monuments to us.

Matthew - Hi. My name is Matthew Morris. I work for the University of Leicester and we're here explaining how we were able to identify Richard III's remains.

Kat - Now, there's a huge glass case here with a skeleton in it. Is this the man himself?

Matthew - It's a copy of the man himself. So, we reburied the real king in Leicester, and this is an exact 3D-printed copy of his skeleton, made from the CT scans of the real bones. From those scans we created models, which we then printed out using a 3D printer.

Kat - So Richard III was very famously found in this car park in Leicester. What can people see here to help explain and understand how you went about identifying whether it was our missing king or not?

Matthew - So, we've got the skeleton himself, and we're explaining how we were able to put all the evidence together to make the case. It's like a 500-year-old missing persons case - a mystery case - and we're putting it together. We've got activities about statistics and probabilities, we've got an arrow drop, and we've got a medieval knight so we can show people how the injuries were inflicted on the skeleton that we've got.

Kat - There has been some discussion about whether it really is Richard III. How do we know that it definitely is?

Matthew - So that's why we're here: to show people how. You can't prove it's Richard III from one strand of evidence, but when you take all of the strands of evidence together, from all the aspects of the investigation, you come up with a probability of 99.999 per cent that this skeleton was Richard III. And so, that's what we're here hoping to convince everybody of.
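
(The statistics activity on the stand is about exactly this kind of reasoning. As a rough illustration - with made-up numbers, not the Leicester team's published figures - independent strands of evidence can each be expressed as a likelihood ratio, and multiplying the ratios turns even a sceptical prior into overwhelming posterior odds.)

```python
# Rough sketch of combining independent strands of evidence with likelihood
# ratios (how much more probable each observation is if the skeleton is
# Richard III than if it is not). The ratios multiply. All numbers here are
# invented for illustration; they are not the published figures.

prior_odds = 1 / 1000          # assumed sceptical prior odds that it's Richard

likelihood_ratios = {
    "radiocarbon date": 50,    # consistent with death in 1485
    "mitochondrial DNA": 500,  # matches living maternal-line relatives
    "scoliosis": 100,          # matches historical descriptions
    "battle injuries": 10,     # consistent with death at Bosworth
}

posterior_odds = prior_odds
for strand, ratio in likelihood_ratios.items():
    posterior_odds *= ratio

posterior_probability = posterior_odds / (1 + posterior_odds)
print(f"posterior probability: {posterior_probability:.5f}")  # ~0.9996 with these toy numbers
```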

Kat - So, what's this arrow drop? It looks a bit like a guillotine. Can we wind it up and have a go?

Matthew - We can.

Kat - Alright. Let's give it a go.

Patrick - So, I'm Patrick Naylor. I'm at Imperial College London and we're working on 3D sound and soundscapes, particularly focusing on how humans understand sound, and what we can learn from that to help us design machines that understand sound better than they do now. In here, we have the opportunity to talk to our robot. Hello...

Robot - Hello. It's nice to meet you.

Patrick - This is a 50-centimetre-high robot that is running a number of processing algorithms. First of all, it's doing face detection, and it's also working out the direction that sound is coming from. It then follows that with head tracking.

Kat - He's a charming little chap. I feel like I want to wave at him. He's white with some nice orange styling, tipping his head towards me. What's going on here?

Patrick - The sound is being picked up by the microphones on the head, and from these two signals, from the two microphones, the robot works out the direction that the sound is coming from. From the cameras, it works out the positions of faces using face detection algorithms. The robot tries to pay particular attention to directions that have sound coming from them and where a face has also been detected. So, faces that talk are important to the robot, because that's where its instructions will come from.
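
(As a rough sketch of the selection step Patrick describes - not the Imperial team's actual system - the robot can compare the estimated sound direction with the directions of any detected faces and attend to the face that best agrees with the sound. The angles, tolerance and function names below are illustrative assumptions; the underlying sound localisation and face detection are assumed to happen elsewhere.)

```python
# Toy version of audio-visual attention: pick the detected face whose
# direction best matches the direction the sound was localised to.
# Angles are in degrees, 0 = straight ahead of the robot.

from typing import Optional

def choose_attention_target(sound_direction: float,
                            face_directions: list[float],
                            tolerance: float = 20.0) -> Optional[float]:
    """Return the face direction closest to the sound direction, if one is close enough."""
    if not face_directions:
        return None
    best = min(face_directions, key=lambda face: abs(face - sound_direction))
    return best if abs(best - sound_direction) <= tolerance else None

# Example: sound localised at +28 degrees, faces detected at -40 and +25 degrees.
print(choose_attention_target(28.0, [-40.0, 25.0]))  # -> 25.0, the talking face
```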

Kat - What would be a kind of application of this? How could this kind of technology be useful? Hello. It's looking at me while I'm talking.

Patrick - It's all about human-robot interaction. In any robot interaction involving humans, it's likely that speech is going to be important. In real-world environments, robots have to deal with multiple humans in the same room. A typical application that gets quoted is a welcoming robot for a hotel. You walk up to the check-in desk and, instead of having to queue up in line to check into your hotel, the robot might simply welcome you and give you your check-in details and your room key. Now, the important thing there is that there are many other people checking in at the same time. Which person should the robot pay attention to? This - we call it selective attention - is an important capability that robots need to be taught, or we need to develop algorithms that can deal with it. Goodbye!

Robot - It was nice to talk to you. Have fun at the exhibition!
