The future of neuroscience research

How will brain findings shape future societies? Will we all become part machine? Is there a dark side to brain research?
19 November 2014

Interview with Geoff Ling, Defense Advanced Research Projects Agency; George Koob, National Institute on Alcohol Abuse and Alcoholism; and Tom Insel, National Institute of Mental Health.

We explore in this special Neuroethics series...

Hannah - Hello. I'm Hannah Critchlow, reporting from Washington DC for this special Naked Neuroscience podcast, in partnership with the International Neuroethics Society and the Wellcome Trust, where we'll be taking a journey into the future to explore how brain research could shape our future society. In the last two episodes, we welcomed in the era of the brain. We discussed the colossal cash injections, including the BRAIN Initiative and the Human Brain Project, that hope we can peer into the human brain as never before, and we started to discuss how, as a society, we should best use the data that comes out. Plus, we met the robots that may be caring for our increasingly elderly populations in the future. In this episode, we explore how the US defence agency that was involved in creating the internet is also involved in producing these new human brain imaging techniques, and we get their take on how they might be used.

Geoffrey - When any new technology comes about, there will be somebody who's going to think of good things to do with it and there are going to be other folks who think of bad things to do with it. And you can't let that limit you in terms of proceeding forward on the technology because in the advancement of technology, we may in fact find the solutions to those bad uses.

Hannah - And we discuss whether we really have free will, or whether our brains simply force us to act the way that we do.

George - You can equate free will with self-control and that self-control or self-regulation is where our conscious brain can direct our unconscious impulses.

Hannah - Plus, the brain isn't fully developed until the mid-20s, making teenagers more likely to take risks, make poor decisions and be more susceptible to peer pressure. We explore the ramifications of these findings in the courtroom.

Nita - It seems to actually be gaining a lot of traction in the US. It seems that judges and courts, and policy makers, find that to be an appealing reason to try to be less harsh, less long-term in the way we treat juveniles, and to try to have more compassionate decision-making with respect to them - understanding that their brains are likely to get better over time, that they're likely to exercise better judgment over time.

Hannah - All to come! One of the sessions at the International Neuroethics Society meeting was centred on the future of neuroscience research and its ethical implications. I met with the panellists to discuss the issues raised.

Geoffrey - My name is Geoffrey Ling, I'm at the Defense Advanced Research Projects Agency, and I'm the director of the Biological Technologies Office. So, the internet: clearly one of the major advances of the last century. The program was initiated at DARPA. The scientists working there realized the internet had great potential, but as they pursued it forward, I doubt very much that they actually thought of some of the negative consequences, such as that there would be malefactors around using it for child pornography. But even if they had known that, it was not a good reason not to proceed forward with the technology.

Hannah - So even though now we have the dark web, where there's lots of child pornography, and also drugs being sold, and people being trafficked, and even people being murdered, for example - even though there are lots of negatives associated with the internet, there are also lots of positives, and President Obama sees neuroscience as, potentially, a similar scenario.

Geoffrey - I agree. With everything good, there's always the bad. And when any new technology comes about,  there will be somebody who's going to think of good things to do with it and there are going to be other folks who think of bad things to do with it.

And you can't let that limit you in terms of proceeding forward on the technology because, in the advancement of technology, we may in fact find the solutions to those bad uses. So, for example, with the dark web: the internet as most people know it is something we're able to monitor and have some level of control over.

So, there has to be a parallel capability developed for this other one, but then, as technology advances even further, we'll find ways, in fact, to manage that. So, what I'm saying to you is that many times the solutions for some of these problems can be found within the technology itself.

Hannah - What kind of technology and what kind of neuroscience research are DARPA funding?

Geoffrey - So, at DARPA, we're really looking at neuro-technology, much as the President mandated in the BRAIN Initiative, which stands for Brain Research through Advancing Innovative Neurotechnologies. The focus is to build neuro-technologies such that these technologies will, in fact, be enabling to the neuroscientific community at large.

So, a particular technology in the hands of one neuroscientist may have one function; in the hands of another neuroscientist, it may have another. But therein lies the beauty of it. A hammer is a classic example of a very useful, broadly applied technology. A hammer in the hands of a sculptor will create art, in the hands of an orthopaedic surgeon it will repair a hip, and in the hands of a carpenter it will build a house. We view the same thing for neuro-technologies. The ability to look at functional data, and potentially anatomical data, at different scales will, in fact, be transformative as well as informative to the neuroscientific community, and that's where we're focused.

Hannah - And currently, neuroscientists are using quite dated techniques in order to peer into the brain and so that's why President Obama is really interested in investing in new technologies that will revolutionize the way that we can look at the human brain and see how we behave?

Geoffrey - That is indeed true. If you think about it, the ability to then find these breakthroughs that will go after very desperate diseases such as epilepsy, stroke, Parkinson's disease, and other neuro-degenerative disorders such as Alzheimer's - it's going to be the tools that enable the neuroscientists to work with the neuroclinicians to make these advances. But we are working with very old technologies.

MRI has been around clinically since the 80s; EEG has been in clinical use since the 1920s. And so, if you think about that, I mean, they really have not kept pace with the other technologies that these clinicians use - the computer, for example, an iPad, an iPhone, this sort of thing. So, in fact, we really are at a point where we desperately need these new technologies.

Hannah - And you mentioned today the case of a pilot who unfortunately had lost the use of some of her limbs, and she was able to control a plane by thought alone. There was a question raised from this by one of the audience members, which was, "Will humanity somehow lose itself as we become more and more in a relationship with machines and with robots?"

Geoffrey - So, I'll make it clear that that woman was not a pilot. That was the amazing thing: she was just a regular person who had become quadriplegic due to a disease process, and in fact, when we were able to do this interface, she was able to show that she could fly, or control, I should say, a simulated aircraft as well as a seasoned pilot, which is really quite extraordinary. It tells you the opportunity space for some of these neuro-technologies.

As far as losing one's self-identity in these interfaces, we're not there yet and we may be a long, long way away from that, 'cause what these technologies do is measure simple signals and translate them into motor activity. That's a far cry from being integrated with a machine or a substance that actually has a self-awareness and a self-consciousness, and before machines actually get to that point, it's going to take quite a bit of advances. Will it happen one day? It may. I don't know, though. That one is worthy of a very in-depth discussion about what the definition of self-consciousness and self-awareness is, and so I think we're far away from that.

Hannah - And George, you mentioned consciousness, and the big issue of free will, at today's meeting.

George - George Koob and I'm director of the National Institute on Alcohol Abuse and Alcoholism.

George - Well, basically I raised the argument of free will versus 'my brain made me do it', and you can take that argument from philosophers who would argue that there is no such thing as free will, because basically they argue that we make false assumptions: that we could have behaved differently than we did, and that most of our conscious activity is in the present, things that we select. And so, we've learned a lot about how the brain makes selections, and we know that habits can be formed in some basic parts of our brain. We know our stress responses come from some basic parts of our brain that even reptiles have.

But in the end, I kind of fall in the zone of Patricia Churchland, who argues that you can equate free will with self-control, and that self-control or self-regulation is where our conscious brain can direct our unconscious impulses. A key part of that is the frontal cortex, and a key part of that frontal cortex is the ventral part, and then you get into a neuroethical question as to what happens when you have damage to that part of the ventral prefrontal cortex, whether it's through developmental issues or whether it's through excessive alcohol use as an adult. And I think the question that remains out there to be solved in neuroethics is: when is it really true that your brain made you do it?

Hannah - So, as we're finding out more and more through neuroscience about how the brain of an alcoholic, for example, may look, you might say that they have brain damage and that is causing them to impulsively and compulsively go towards alcohol?

George - Exactly right. You can make that argument, but you can also make the converse argument: that that brain has the capability to recover from that dysfunction, and we know that to be the case in alcoholism, because individuals do recover and they use much more diffuse pathways in their brain in that recovery process. So, I think it's kind of a cup half-empty, cup half-full position: while you're engaged in excessive drug-taking behaviour, you may have a dysfunctional brain, but the brain, when it goes abstinent, has the capability to recover normal function. That would be the way I look at it.

Hannah - And talking about treatments: Tom, you made the point that we're at a precursor to one of the largest neuroscience conferences in the world, where over 40,000 neuroscientists are converging this weekend in Washington. And yet you argued that although neuroscience is finding out lots of things about basic brain function, and there are more and more neuroscientists being employed, we haven't actually made a huge leap, or a huge translation, for patients in the clinic or homeless people on the streets, for example.

Tom - I'm Tom Insel, I'm the director of the National Institute of Mental Health. That's the case - the public health data speak for themselves. When you look at measures of morbidity and mortality, what's striking is the lack of change. We have seen remarkable improvements in life expectancy, and we've seen reductions, really profound reductions, in mortality in some areas, whether it's in heart disease or acute lymphoblastic leukaemia, or certain forms of stroke. We've done really very well.

When you look at suicide, it's striking that over 20 years there's really not even a hint of a reduction in the numbers, and the numbers are very unsettling - 39,000 suicides expected this year in the United States, at least based on the historical trends, and that's double the number of homicides. So, we're talking about very large numbers. There was a time in this country when homicides outnumbered suicides; those are down about 50%. Suicide hasn't budged. Suicide, 90% of the time, is related to a mental illness or some form of brain disorder. It's inexcusable that we haven't been able to deliver, to begin to save lives, when the actual numbers demonstrate just how prevalent this is.

Hannah - And so, bearing in mind that we're about to have a big celebration of neuroscience at the conference, how on Earth do we try and translate some of these findings to help patients that have mental health issues?

Tom - Well, there are sort of two sides to it. I think you do need the deep dive on the brain, and you need to understand much, much more than we understand today about how the brain works.

The imaging we're seeing is spectacular, but to be fair, what we can do in mice for imaging, as well as for diagnostics and therapeutics, is way, way beyond where we are for humans. And that's really the challenge for us as a neuroscience community: how do we get to the point where we can take the kind of molecular, cellular, and systems understanding that we have, and some of the tools that seem to be working so well for simple organisms, and use them to study the human brain - developing a human neuroscience - and then use that to actually make a difference for people with brain disorders?

Hannah - And today we were discussing some of the ethical issues raised by how this research - the research coming out of peering into the human brain - might be used. One of the audience members made the point that, as a neuroscience community, maybe we have to see ourselves as part ethicist and start to tackle these issues individually as neuroscientists. Do you agree with that? Or do you think that it's the job of the policy makers and the ethicists? George.

George - Well, when you submit a grant application, or more importantly a training application, in the United States, you are required to take a course in ethics that's administered by your institution. It wouldn't be too much of a stretch for that to also include some neuroethics for people who are involved in training for the neurosciences. So, for example, post docs at the Scripps Research Institute have to take this course - it's required of all post-doctoral fellows - and it has to be put on the application, or the study sections that are reviewing the application will find fault with it. So, that's one parochial take on it, but this could be expanded in such a way, and maybe it will be, based on our discussions today.

Hannah - And Tom.

Tom - I am not sure that I really understand that proposal. I don't know what it would mean to expect all neuroscientists to begin to incorporate neuroethics or ethics. I think I'd have to understand what the neuroethics area is about, and I'd have to understand what that practically means. I have to say that today there are so many requirements for someone who wants to do science; I'm a little bit cautious about increasing the demands or the requirements in a way that will create a speed bump. I'm okay with the guard rails, and I'm okay with the idea that we could use those flashing yellow lights at times, but we know so little and the needs are so urgent. I want to make sure that we're not getting in the way of progress by asking too much of people.

Hannah - And Geoff.

Geoffrey - I've nothing more to add to that.

Hannah - And so, a very final question: what do you think might be the dark sides that may emerge from the BRAIN Initiative and the Human Brain Project, for example? Can you look into the future and say that there might be some dark internet-type analogies that might come out of the data? Starting with you, Tom.

Tom - One of the biggest concerns I have about the BRAIN Initiative is that it won't be able to accomplish the dream, either positive or negative, of what could be done here. There is no question that the challenge is great and the need is even greater. But whether there will be funding, whether there will be the right amount of fundamental knowledge, and whether we'll get lucky with the technology to do what we want to do is a huge, huge question for me still. So, I have to confess that, yes, there are days when I think about what could be the negative consequences of great progress in this area, but at this stage in the process, most of all what I'm worried about is that we won't make the progress we need to have a positive impact.

Hannah - George.

George - I would think that my dark side would be the issue of neuroethics. When we get to the point where you can tell that someone has a small dysfunction in, say, the connection between the insula and the amygdala, does that really mean that they have a mental disorder, or an impulse control disorder, let's say? And then do we pay for treatment of that even though there is no manifestation? There's going to be a huge ethical concern about the cost of doing super-sophisticated diagnosis, and who's going to pay for it, and is it really necessary, and who's going to have access to it. So, all of the questions that we always have about health care are going to be expanded - that's the way I would argue.

Hannah - Particularly relevant for America, where you don't have a National Health Service. And lastly, Geoff.

Geoffrey - Well, I would say that the thing I worry most about on the dark side is in fact what I would call an exaggerated concern, because a lot of these things in the near term - certainly just about all of the things in the near term - are really meant to improve the human condition. And one of the problems you have with any of these things, as you're improving the human condition, is that you do worry very much about the downsides, the negative or untoward consequences as it were. So, to put it very bluntly, I believe there's something called the one-third rule: one-third of the work and the effort goes into the basic research, one-third is just getting through regulation, and one-third is getting through the commercial hurdles, before you actually see any of these things making it out to the patient and to the bedside. So, when you think about that, that means only one-third of it is actually the research; the other two-thirds are hurdles to get through to reach the bedside, and that to me is really the darkest side right now, as far as I can see it.

Hannah - Thanks to Geoff Ling from the US Defense Advanced Research Projects Agency, George Koob from the National Institute on Alcohol Abuse and Alcoholism, and Tom Insel from the National Institute of Mental Health.
