Brain-computer interfaces can be invasive, using chips, or non-invasive
Scientists at Imperial College London believe non-invasive options are more realistic
In their picturesque garden in a quintessential English village in Oxfordshire, Tom and Ellen Nabarro are making mint tea with leaves cut from a plant nearby.
They’ve just surfaced after a late night in London, which Ellen attended almost straight off a plane, having just returned from vacation. But they are fresh and conversational, with no fretting over obstacles that might stand in their way that day – or, in fact, any day.
It’s easy to overlook that Tom uses a wheelchair. He’s paralyzed from the neck down and has been for over a decade, after a snowboarding accident severed his spine. But he lives the way he wants to.
“I have a very active life,” said Tom Nabarro, 32.
Perhaps it’s the comfort and ease with which he moves around his home, the calm and happy nature both he and his wife exude or the fact that both have professions they enjoy. Whatever the reason, the garden hosts a happily married couple with a constant flurry of friends dropping by.
Disability does not get in their way.
“I can work. I can travel. There’s support from everyone around us,” Tom said, acknowledging that his road to get to this point was far from easy. But now, they are happy. “I think we have probably a more active life than maybe we would have had before.”
Their secret: a firm grasp of technology.
Online and infrared assistance
Tom’s work as a software engineer means he knows and understands the technology that can assist him in his everyday life.
Today, he uses eye gaze technology, head pointers and voice recognition to facilitate his movements. His wheelchair also has environmental controls that use infrared to communicate with devices across his home.
“I control my bed movements, so the bed sits up when I tap a certain button, and I can open the door and watch TV and change channels,” he said. “There are not many things you can’t do.”
Tom inherited his home from his grandfather and it has been adapted for his needs. He moves around it with ease using a switch at the top of his chair, controlled by his head. “I still feel very human, surprisingly,” he said.
He easily gets online and uses voice recognition and dictation to instruct his devices. “I use voice recognition for everything. … It’s incredibly enabling,” Tom said.
This independence, combined with assistance from two regular care workers, helps keep their marriage strong and allows Ellen to live the life she wants while caring for her husband.
But the Nabarros believe that these technologies are just the tip of the iceberg.
“Full automation would be good,” said Ellen, also 32, who works in theater, “meaning Tom can do more for himself and have to rely less on other people.”
Tom has even more ambition and believes that the burgeoning field of brain-computer interface technology – both invasive and not – will soon become mainstream.
Brain meets computer
“I spend a lot of time frustrated that I have to spend three minutes maneuvering somewhere when someone can walk there in seconds,” he said. Brain-computer interfaces may someday enable automated senses, he hopes, and machines could understand human cognition and intention and in turn make decisions for you. Then, you “can get on with daily life,” he said.
As regular – and somewhat blissful – as his life may seem, there is room for improvement, Tom and Ellen agree. “I need something that is as good as a body,” Tom said. “Your body is a physical extension of your brain, and if your brain is still functional, then you can pretty much still do anything.”
What he wishes for is the ability to make turns easily, to drive smoothly on uneven ground and to move close to people or turn to look at them as the situation fits. He believes wheelchairs will continue to be the way forward, rather than more ambitious technologies using exoskeletons. The Nabarros are working with scientists from Imperial College London to help make this happen.
The team is working on a noninvasive brain-computer interface – one that needs no implants – because they believe implanted devices are still decades away from being a reality.
The technology they are optimistic about does not require surgery but instead combines modalities, according to Aldo Faisal, who runs the Brain and Behavior Lab at Imperial College London.
He gives the example of picking up a glass: “We can measure eye movements that tell us you’re looking at the glass. You can have a computer recognize that you’re looking at a glass, and we can then form a brain command and understand the person wants to grab that glass.”
Next would be a robotic hand grabbing that glass and bringing it to your lips.
Faisal points out that this life-changing assistance can be made possible with cameras and sensors that learn to understand what someone intends to do and then initiate actions to make it happen.
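The pipeline Faisal describes – gaze tracking, intent recognition, then action – can be sketched in a few lines of code. Everything here is illustrative: the names (`GazeSample`, `infer_intent`, the object coordinates, the dwell-time thresholds) are assumptions for the sake of the example, not the lab's actual software.

```python
# Hypothetical sketch of a gaze-based intent pipeline:
# gaze samples -> fixation detection -> object lookup -> action command.
from dataclasses import dataclass


@dataclass
class GazeSample:
    x: float  # gaze point coordinates in some shared frame
    y: float


# Known object locations in the same coordinate frame (assumed, for illustration).
OBJECTS = {"glass": (0.50, 0.40), "phone": (0.10, 0.80)}


def infer_intent(samples, radius=0.05, min_fixation=10):
    """Return the object the user is fixating on, or None.

    A fixation is approximated as `min_fixation` consecutive samples
    falling within `radius` of an object's location.
    """
    for name, (ox, oy) in OBJECTS.items():
        streak = 0
        for s in samples:
            if (s.x - ox) ** 2 + (s.y - oy) ** 2 <= radius ** 2:
                streak += 1
                if streak >= min_fixation:
                    return name  # enough dwell time: treat as intent
            else:
                streak = 0
    return None


def command_for(target):
    # A real system would drive a robotic arm here; this just emits
    # a symbolic command.
    return f"GRAB {target}" if target else "IDLE"


# Simulated gaze stream: 12 samples dwelling near the glass.
stream = [GazeSample(0.50 + 0.001 * i, 0.40) for i in range(12)]
print(command_for(infer_intent(stream)))  # GRAB glass
```

The dwell-time heuristic stands in for the machine learning a real system would use, but the shape of the pipeline is the same: sensors estimate where attention is, a model turns that into an inferred goal, and only then does a device act.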
Faisal believes people’s attitudes and acceptance – and affordability – will push this technology and determine the form it takes.
“Society will drive these technologies forward,” he said. “What we’ll be able to see in the next five to 10 years is self-driving wheelchairs.”
‘I want to carry on’
The experiments Nabarro has undertaken with Faisal have involved external technologies, such as skull caps embedded with electrodes that read brain signals to decipher the action someone would like to take. Both are keen on the idea, as it does not involve surgery and therefore there’s no time away from work – or life.
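A decoder for this kind of electrode cap might, in its simplest form, compare signal power over the two motor-cortex electrodes. The sketch below is not the lab's decoder – the channel names, thresholds and simulated data are all assumptions – but it shows the basic idea used in motor-imagery interfaces: imagining a left-hand movement suppresses activity over the opposite (right) hemisphere, and vice versa.

```python
# Illustrative sketch of a minimal motor-imagery decoder: compare
# average signal power over left (C3) and right (C4) motor-cortex
# channels and pick the side whose opposite hemisphere is suppressed.


def band_power(samples):
    """Mean squared amplitude of a window of samples (a crude power proxy)."""
    return sum(s * s for s in samples) / len(samples)


def decode_intent(c3, c4):
    """Lateralized power comparison: lower power = more active hemisphere."""
    p3, p4 = band_power(c3), band_power(c4)
    # Suppressed C4 (right hemisphere) -> left-hand intent, and vice versa.
    return "left" if p4 < p3 else "right"


# Simulated window: strong activity on C3, suppressed activity on C4.
c3 = [0.9, -1.1, 1.0, -0.8, 1.2]
c4 = [0.2, -0.3, 0.25, -0.2, 0.3]
print(decode_intent(c3, c4))  # left
```

Real decoders filter the signal into frequency bands and train classifiers per user, but the principle – reading a pattern from the scalp and mapping it to a discrete command – is the same, and it requires no surgery.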
“I’m not into being a guinea pig,” Tom said. “I want to carry on living my life.”
But many other research teams globally are working with chips implanted in the brain in order to help people with paralysis regain control of their limbs. Closer proximity to brain cells reduces the noise from other brain signals and allows devices to focus on a region to better tap into someone’s thoughts.
This year, a man paralyzed from his shoulders down in the United States regained the use of his right hand with the aid of an experimental prosthetic that replaced lost connections between his brain and the muscles, thanks to a team at Case Western Reserve University.
In 2016, scientists at the University Medical School Utrecht enabled a woman with amyotrophic lateral sclerosis to communicate by reading her thoughts, replacing her use of an eye tracker.
In an editorial published this year, Steve I. Perlmutter, an associate professor at the University of Washington, said that the research with the paralyzed man at Case Western was groundbreaking, but that this form of treatment is not nearly ready for use outside the lab.
“The algorithms for this type of brain-computer interface are very important, but there are many other factors that are also critical, including the ability to measure brain signals reliably for long periods of time,” he told CNN.
The significant challenges that remain, as well as the need for surgery and monitoring, are why Faisal’s team is focusing its attention on less-invasive ways to tap into someone’s cognition.
“If we’re looking at technologies where we want to make sense out of brain recordings directly, I think that’s a lot further away,” Faisal said, possibly 10 to 20 years out and subject to regulatory approval.
He also highlighted affordability. “If you put things into the body, it’s a lot more expensive, because you need to undergo surgery. If you just pop on a pair of glasses, it’s a lot cheaper.”
As Nabarro and Faisal’s partnership grows, they hope it will accelerate the field and bring brain-computer interface technologies to the market.
“I spend a huge amount of my time trying to communicate with a computer,” Tom said. “The closer I can get my brain to a computer, the more efficient I will be.”