Inside the mind's fly: AI model predicts insect behaviour
Interview with Ben Cowley, Cold Spring Harbor Laboratory
Scientists in America have developed an AI model of a male fruit fly's visual system that can predict how the insect will behave when it is looking at a female.
The ultimate goal for this branch of computational neuroscience is to understand more complex visual systems, like ours, in detail. A better grasp of our mind’s eye could pave the way for treatments for blindness, and help us build better artificial visual systems for self-driving cars.
But that’s easier said than done. While computer hardware has come on in leaps and bounds, our visual system consists of billions of neurons, more than we can replicate inside a computer.
That’s why Ben Cowley from Cold Spring Harbor Laboratory is focusing on flies instead: because of their streamlined visual system, he can model every brain cell involved in the process which turns visual inputs into social behaviours, and it’s led to some pretty eye-catching findings…
Ben - It turns out that if you actually look at the anatomy of the fruit fly visual system, going from its retina, and you look maybe three or four synapses away, the circuitry looks very similar to what we have in the early visual processing of the human visual system. So if we can understand these principles quite well in the fruit fly, it will apply to understanding the human visual system. But obviously the fruit fly has maybe 800 little photoreceptors tiling the visual field, whereas humans have, in the retina alone, maybe 100 million neurons.
James - To do this, I imagine you need a window into the visual perspective of a fruit fly. How did you get one?
Ben - Well, it's a beautiful setup, but it's very complicated. There's a little ball that the fruit fly rests on, and then there's a projection screen around the ball that we can project visual stimuli onto. The fly itself is head fixed, so that way we know where it's viewing. And then there's a microscope above this little fruit fly head that can peer in and record the neural activity that these neurons fire when particular stimuli are presented. As you can imagine, it's a very unnatural setup for the fruit fly, just as it would be for a human: its head is fixed and it's not really participating in any, let's say, behaviour. So the difficult part is this: we can watch the fruit fly in a natural behaviour - one I'll describe is courtship, where the male chases the female - but we're unable to record from the neurons while the male is chasing the female, for obvious reasons: we can't move the microscope and things like that.
James - And how did you overcome that challenge?
Ben - Imagine all this visual information coming into the optic lobe. It's read out by, for the most part, 50 channels. And luckily for us, each channel is made up of only one cell type, so maybe 100 neurons are projecting to this channel. Why that's really advantageous is that we can make a genetic line that targets that particular cell type and we can just say, with the genes, 'stop making this particular protein.' If it does that, it won't actually transmit any information past its synapse. It's a little convoluted, but it's a way we can ensure that these neurons do not transmit any information - effectively silencing that particular cell type.
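To make that logic concrete, here is a minimal sketch (not the lab's actual code) of the idea: treat the roughly 50 channels as a vector of activities driven by a few visual features, and model genetic silencing as zeroing one channel before anything downstream can read it. The weights, feature names and channel index below are invented purely for illustration.

```python
import numpy as np

# Toy stand-in for the optic-lobe readout described above: ~50 channels,
# each corresponding to one cell type, driven by a handful of visual
# features (female size, position, motion). All weights are made up.
N_CHANNELS, N_FEATURES = 50, 3
rng = np.random.default_rng(0)
W = rng.standard_normal((N_CHANNELS, N_FEATURES))

def channel_responses(visual_features):
    """Activity of every channel for one visual scene."""
    return W @ visual_features

def silence(activity, channel_idx):
    """Genetic silencing of one cell type: that channel transmits nothing downstream."""
    out = activity.copy()
    out[channel_idx] = 0.0
    return out

features = np.array([1.0, 0.2, -0.5])          # hypothetical size / azimuth / motion values
intact = channel_responses(features)
knocked_out = silence(intact, channel_idx=7)   # the silenced channel index is arbitrary here
print(np.count_nonzero(intact != knocked_out)) # -> 1: only the silenced channel changed
```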
James - Okay. And by silencing a particular type of neuron, you're therefore able to see, when you put that fly into an experimental context, how its behaviour changes based on which cells you've switched off inside its brain?
Ben - Exactly. So for example, if we silence a particular cell type, and let's say that cell type, that channel, encodes the female's size, the male no longer has access to how far away the female is. And so its behaviour is going to change, and the hope is that our model can pick out when that behaviour changes. So the dataset looks something like this. We have a bunch of blind dates that we put these males and females through - roughly 500 different dates. In each date there'll be a male that has some silenced cell type, and then we put it with a control female, so nothing's changed with the female. And then we just observe the behaviour: how this male tries to court or attract or pursue the female. Basically, we take away a component, we see a deficit, and we can then infer what the function of that component is.
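One way to picture how a single model could be fitted across all of those blind dates - a hedged sketch of the logic Ben describes, not the published model - is to give every fly the same network from visual features to channels to behaviour, and impose each fly's genetic silencing as a mask that zeroes its knocked-out channel. The dimensions, architecture and training details below are all assumptions made for the illustration.

```python
import torch
import torch.nn as nn

N_CHANNELS = 50   # the ~50 visual projection channels mentioned above

class CourtshipSketch(nn.Module):
    """One shared model fit across every 'blind date':
    visual features -> channels -> behaviour. Sizes are invented."""
    def __init__(self, n_features=8, n_behaviours=2):
        super().__init__()
        self.encoder = nn.Linear(n_features, N_CHANNELS)   # scene -> channel activity
        self.readout = nn.Sequential(nn.ReLU(),
                                     nn.Linear(N_CHANNELS, n_behaviours))  # channels -> behaviour

    def forward(self, visual_features, silenced_mask):
        channels = self.encoder(visual_features)
        channels = channels * silenced_mask   # zero out this fly's genetically silenced channel
        return self.readout(channels)

model = CourtshipSketch()
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# One fabricated trial: a male with channel 12 silenced, chasing a control female.
visual = torch.randn(100, 8)                     # 100 time steps of visual features
mask = torch.ones(N_CHANNELS); mask[12] = 0.0    # this fly's genetic silencing
behaviour = torch.randn(100, 2)                  # observed turning / forward velocity

optimiser.zero_grad()
loss = loss_fn(model(visual, mask), behaviour)
loss.backward()
optimiser.step()   # repeated over all ~500 dates, one readout must explain every deficit
```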
James - It's extraordinary. And did it work?
Ben - The main scientific result is this: coming in, we were thinking probably three or four, maybe five or six, of these channels were going to contribute to courtship. But what we found through our model is that, no, actually almost all of the channels contribute to some type of behaviour in courtship. And that was rather eye-opening - I apologise for the pun - but it was a little surprising to us, because you have these discrete channels, so it makes sense that each one would contribute in a certain way: one is controlling or encoding female size, one is encoding female position, et cetera. But what we found instead is a population code: combinations of these channels are being reused in very different ways to perform complex behaviour.
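A toy numerical contrast between those two pictures - one channel per behaviour ('labelled lines') versus a population code - might look like the sketch below. Every number in it is made up, and it is not a model of the real circuit; it only shows why silencing a single channel disturbs many behaviours under a distributed readout.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_behaviours = 50, 10

# "Labelled lines": each behaviour reads from exactly one channel.
labelled = np.zeros((n_behaviours, n_channels))
labelled[np.arange(n_behaviours), np.arange(n_behaviours)] = 1.0

# Population code: every behaviour reads a weighted mix of many channels.
population = rng.standard_normal((n_behaviours, n_channels)) / np.sqrt(n_channels)

activity = rng.standard_normal(n_channels)   # one moment of channel activity

def behaviours_affected(readout, channel):
    """How many behavioural outputs change when a single channel is silenced?"""
    silenced = activity.copy()
    silenced[channel] = 0.0
    return np.count_nonzero(~np.isclose(readout @ activity, readout @ silenced))

print(behaviours_affected(labelled, channel=3))     # 1: only its own behaviour changes
print(behaviours_affected(population, channel=3))   # typically 10: every behaviour shifts a little
```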
James - Fascinating, isn't it? But with the caveat that what you're saying is that this already complicated system is even more complicated than we really thought. And if the human visual system is orders of magnitude more complex, we're some way off being able to understand it at that granular level of detail.
Ben - I guess in hindsight it shouldn't be unexpected. But yes, we were hoping with the fruit fly that things would be very easy to pick apart - that a lot of these are just reflexes. So for example, if a visual stimulus looms, if a spot becomes very large, it's like a predator approaching, and so there's an escape response. We were hoping most of these behaviours were reflexes, if you will, and each channel is for one reflex. But it turns out - and maybe this is through evolution - that back in the day, millions of years ago, the fruit fly may have been all reflexes. But then, as evolutionary and environmental constraints started to change and alter the behaviour of the male and the female, flies were stuck with the hardware, stuck with these 50 channels, but they could update the software, if you will. They could read out from these channels in very different ways to produce more and more complex behaviour. So that's sort of where we're at right now. One thing we don't know in fruit fly vision is what they see during flight, because you can imagine that's really difficult: we can't have them fly around and record the neural activity at the same time. So our approach is, in my mind, very exciting: we can silence these different neurons, just let them fly around, and see what the behavioural deficits are. That will be really tantalising to try to model.