In a study that informs our understanding of how the brain allocates attention to different stimuli, scientists have successfully enabled human volunteers to choose between images on a computer screen just by altering their thoughts.
Writing in Nature, Caltech-based researcher Moran Cerf and his colleagues describe how they recruited 12 epileptic patients who were undergoing brain studies to identify the sources of their seizures.
Arrays of 64 electrodes had been temporarily implanted into the patients' medial temporal lobes to monitor brain activity. The researchers showed the subjects over 100 images of familiar people, places, animals and buildings, including Marilyn Monroe and Bill Clinton. They then identified four electrodes, each of which registered nerve-cell activity only when a subject viewed one specific image.
Next, in a thirty-minute trial, the subjects were shown combinations of the four images, with two of them superimposed semi-transparently on a screen, one in front of the other. A computer was programmed to monitor the output from the electrodes and to increase the visibility (opacity) of an image whenever the electrode linked to it detected more brain activity.
In each trial, the subjects were instructed to make one of the two images more visible and to fade out the other, using whatever thought processes they wished.
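The feedback loop described above can be sketched in a few lines. This is a minimal illustration, not the study's actual decoder: the function name, the gain constant, and the simple rate comparison are all assumptions made for clarity.

```python
# Hypothetical sketch of the closed-loop image control described above.
# The names (update_opacity, GAIN) and the firing rates are illustrative,
# not taken from the study itself.

GAIN = 0.005  # assumed step size: opacity change per Hz of rate difference

def update_opacity(opacity, rate_target, rate_distractor):
    """Nudge the target image's opacity toward 1.0 when the neuron
    selective for it fires faster than the distractor-selective neuron,
    and toward 0.0 otherwise. The result is clamped to [0, 1]."""
    delta = GAIN * (rate_target - rate_distractor)
    return max(0.0, min(1.0, opacity + delta))

# Simulated trial: the target-selective unit fires at 30 Hz while the
# distractor-selective unit fires at 10 Hz, so the target fades in
# until it fully dominates the display.
opacity = 0.5
for _ in range(10):
    opacity = update_opacity(opacity, rate_target=30.0, rate_distractor=10.0)
```

Each update moves the display a small step in the direction of the stronger signal, so the subject sees the consequences of their own neural activity in near real time.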
Intriguingly, the setup also allowed the team to probe the process of attention. They had expected that, when a subject was instructed to focus on, say, Marilyn Monroe, the electrode corresponding to her picture would become more active. Instead, it was the other electrodes that became less active. This suggests the brain focuses attention largely by suppressing distraction, much as a spotlight picks out one part of a drama unfolding on stage by leaving the rest in darkness.
Beyond its academic interest, the new technique may prove useful in practice: it might be possible to design a system that helps paralysed patients regain some control over their environment. "We could actually learn how their brain looks when they think of water, food, or pain," says Cerf. This could allow such patients to communicate with their doctors again.