Brain-monitoring headsets may soon let us operate computers by thoughts alone. Photo: Supplied
It’s an enticing thought. Imagine using a computer interface to read someone else’s mind, or donning a cool-looking headset and wirelessly controlling anything that’s digitally connected just by thinking about it.
Researchers working at the frontiers of brain science believe this will be possible in the foreseeable future. Some interviewed for this story see it as inevitable, but let’s start by looking at the range of brain-reading gadgets already available online.
Companies like NeuroSky and Emotiv sell headsets that feed the brain’s electrical signals into desktop and mobile apps, and the hype surrounding them gives the impression that humanity has already entered a new era of mind control.
Not so. Electroencephalography (EEG) headsets were invented in the 1920s. NASA used them in the 1960s to analyse the effects of space travel on astronauts. They’ve been used in neurotherapy for decades, and for years marketers have been popping them on consumers’ heads to test emotional reactions to new products.
Anyone paying between $100 and $700 for one of the new consumer headsets in the hope of controlling their computer with their thoughts is likely to be disappointed. Even Emotiv’s latest version, due out early next year for US$360, has strict limitations.
Emotiv’s vice president of corporate development, Kim Du, admits an EEG headset can’t compete with the speed and efficiency of a keyboard. “We’re trying to be very upfront in terms of the capabilities,” she told Fairfax Media from her San Francisco headquarters.
“EEG is a new industry. There’s still a lot being done and there are certainly aspects that have been sensationalised. Most people could use their mind to move a virtual or physical object with just one action. But to do three or four actions, that’s a more advanced user. That would require more training.”
To be fair, most people could use an EEG headset to send a range of commands, making a character in a game run or fire a gun, for instance, but they’d need to wink and grimace at their computer screen in a way that might make onlookers doubt their sanity.
Understanding EEG’s limitations, as well as the opportunities it offers to everyday consumers and people with disabilities, requires delving into the intricacies of the brain and why millions of research dollars are being spent to find out how it works.
After all, the ultimate app is inside our heads, and all the wonders of science have yet to beat it. In one test, the world’s fourth-fastest supercomputer took 40 minutes to simulate a single second of activity in a network representing about 1 per cent of the brain’s neurons, in a project involving Japan’s Okinawa Institute of Science and Technology Graduate University.
The brain’s amazing processing power comes from about 100 billion nerve cells called neurons, roughly the same as the number of stars in the Milky Way. EEG detects electrical pulses from masses of neurons firing in different parts of the brain.
These pulses are not thoughts. They are patterns of brain activity, including alpha, beta and theta waves, which can indicate a person’s mood and their levels of relaxation, anxiety or mental alertness.
The accuracy of an EEG headset depends on the number of sensors it places on the scalp, how and where they are placed, and the sophistication of the algorithms that translate brainwaves into digital information a computer can use.
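To make that last step concrete, here is a minimal sketch, not any vendor’s actual algorithm, of how software might turn one channel of raw EEG samples into the band powers described above. The 256 Hz sample rate and the exact band edges are assumptions chosen for the example.

```python
import numpy as np

FS = 256  # assumed sample rate in Hz, typical of consumer EEG headsets

# Frequency bands mentioned in the article (Hz); exact edges vary by source
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(samples, fs=FS):
    """Estimate the power in each EEG band from one channel of raw samples."""
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(samples)) ** 2
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

# A synthetic one-second recording dominated by a 10 Hz (alpha) oscillation
np.random.seed(0)
t = np.arange(FS) / FS
samples = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(FS)

powers = band_powers(samples)
dominant = max(powers, key=powers.get)  # the 10 Hz signal shows up as alpha
```

An app like the relaxation trainers described below would then watch how these band powers shift over time, rather than reading any individual “thought”.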
Electrical activity from muscle movements is also detected, and a wink or a clench of the teeth, once recognised by these algorithms, can easily be mapped to a computer command, but that’s hardly thought control. EEG’s greatest value to consumers is in mind training, and most of the hundreds of apps now available for download operate in that area.
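A muscle artefact like a jaw clench is far easier to detect than a thought because it swamps the recording with high-amplitude activity. This toy sketch shows the basic idea; the 150-microvolt threshold and the “fire” command mapping are invented for illustration, not taken from any real headset’s software.

```python
import numpy as np

CLENCH_THRESHOLD = 150.0  # microvolts; a real device would calibrate per user

def detect_clench(window):
    """A jaw clench produces a burst of muscle (EMG) activity much larger
    than ordinary EEG, so a simple peak-amplitude threshold can catch it."""
    return float(np.max(np.abs(window))) > CLENCH_THRESHOLD

# Quiet EEG-like background noise versus a simulated clench burst
np.random.seed(1)
quiet = np.random.randn(256) * 20.0  # ~20 microvolt background activity
clench = np.concatenate([quiet, np.array([300.0, -280.0, 310.0])])

command = "fire" if detect_clench(clench) else "idle"
```

This is why the game controls described above rely on winks and grimaces: the signals are big and unambiguous, unlike the subtle patterns that genuine thought decoding would require.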
Stuart Johnstone, associate professor of psychology at the University of Wollongong, developed an EEG app designed to improve children’s memory, impulse control and ability to concentrate. It’s called Focus Pocus and is now marketed around the world.
“The EEG appears on screen as interesting games that let a child know whether their level of attention is low or high,” he said. “When they become aware of how being in a focused or unfocused state feels for them, they learn over time how to control that state.
“It works for children with ADHD as well as normal children, training them how to focus in a classroom, for instance, and then relax when they don’t need that level of attention. It can work for adults, too – in fact, anyone.”
EEG headsets have also been used to help a man with paralysed legs learn to walk. After intense training he learned to send signals to electrodes on his knees that stimulated his leg muscles. However, the University of California scientists involved suggested brain implants were more likely to provide a permanent solution.
That’s because of the inability of EEG to focus on the specific neurons involved in thought and word processing. “EEG is essentially recording signals from across the whole brain,” said Professor Geoffrey Goodhill of the Queensland Brain Institute, who is doing advanced research on how brains process information.
The professor points to other technologies, such as the huge functional magnetic resonance imaging machines in hospitals, which record much more localised areas of brain activity, and believes breakthroughs in superconducting materials could allow them to be miniaturised.
“I think it’s very likely that within a few decades there will be wearable devices that can detect complex thoughts,” he said.
That likelihood took a leap forward in September last year when researchers were able to transmit a thought from the brain of a person wearing an EEG headset in India to a person in France wearing a transcranial magnetic stimulation device.
The transmissions were restricted to a single message: “hello”. However, the research team warned the technology would “eventually have a profound impact on the social structure of our civilisation and raise important ethical issues.”
Dr Kristyn Bates, an experimental neuroscientist at the University of Western Australia, says the results are exciting. “They were able to come up with an algorithm that could sort through the brainwaves from an EEG and interpret a particular thought pattern that was sent to someone else in a different country,” she said.
She believes a brain-computer interface that deals fluently with language could definitely be developed within the next 20 years, and that it will be a positive advance for humanity. “I think of myself as a pathological optimist,” she says.
“I hope it would improve empathy between people.”