When patient T5 suffered a spinal cord injury that left him paralyzed, his dream of flying a drone seemed forever out of reach.
Now, thanks to a brain implant, he’s experienced the thrill in a simulation. By picturing finger movements in his mind, the 69-year-old flew a virtual drone in a video game, dodging obstacles and whizzing through randomly appearing rings in real time.
T5 is a participant in the BrainGate2 Neural Interface System clinical trial, which launched in 2009 to help paralyzed people control computer cursors, robotic arms, and other devices by decoding the electrical activity of their brains. It’s not only for gaming. Being able to move and click a cursor gets them back online. Googling, emailing, streaming shows, scrolling through social media posts—what able-bodied people spend hours on daily—are once again part of their lives.
But cursors can only do so much. Popular gaming consoles—PlayStation, Xbox, Nintendo Switch—require precise, rapid finger movements, especially of the thumbs, in multiple directions.
Current brain implants often take a bird’s-eye view of the whole hand. The new study, published in Nature Medicine, separated the fingers into three groups—thumb; pointer and middle finger; and ring finger and pinky. After training, T5 could move each finger group independently with unprecedented finesse. His brain implant also picked up intentions to stretch, curl, or move his thumb side to side, letting him pilot the drone as if using a video game controller.
Calling his gaming sessions “stick time,” T5 enthusiastically said that piloting the drone allowed him to mentally “stand up” from his bed or chair for the first time since his injury. Like other gamers, he asked the research team to record his best runs and share the videos with friends.
Brain-computer mind-melds are “expanding from functional to recreational applications,” wrote Nick Ramsey and Mariska Vansteensel of the University Medical Center Utrecht, who weren’t involved in the study.
Mind Control
Linking brains to machines has gone from science fiction to reality in the past 20 years, and it’s been life-changing for people paralyzed by spinal cord injuries.
These injuries, whether from accident or degeneration, sever nerve highways between the brain and muscles. Scientists have long sought to restore these connections. Some have worked to regenerate broken nerve endings inside the body, with mixed results. Others are building artificial “bridges” over the gap. These implants, often placed in the spinal cord above the injury site, record signals from the brain, decode the intention to move, and stimulate muscles to contract or relax. Thanks to such systems, paralyzed people have been able to walk again—often with assistance—over long distances and with minimal training.
Other efforts have skipped muscles altogether, instead tapping directly into the brain’s electrical signals to hook the mind to a digital universe. Previous studies have found that watching or imagining movements—say, asking a patient to picture moving a cursor around a browser—generates brain patterns similar to those from physically performing the movements. Recording these “brain signatures” from individual people can then decode their intention to move.
Noland Arbaugh, the first person to receive a brain implant from Elon Musk’s Neuralink, is perhaps the most well-known success. Late last year, the young man livestreamed his life for three days, sharing his view while moving a cursor and playing a video game in bed.
Decoding individual finger movements, however, is a far bigger challenge. Our hands are especially dexterous and versatile, making it easy to type, play musical instruments, grab a cup of coffee, or twiddle our thumbs. Each finger is controlled by intricate networks of brain activity working together under the hood to generate complex movements.
Fingers curl, wiggle, and stretch apart. Deciphering the brain patterns that let them work individually and in concert has stymied researchers. “In humans, finger decoding has only been demonstrated in prediction in offline analyses or classification from recorded neural activity,” wrote the authors. Brain signals haven’t been used to control fingers in real time. Even in monkeys, brain implants have only been able to separate the fingers into two independently moving groups, limiting the paws’ overall flexibility.
A Virtual Flex
In 2016, T5 had two tiny implants inserted into the hand “knob” of his brain—a region that controls hand and finger movements. Each implant, the size of a baby aspirin, had 96 microelectrode channels that quietly captured his brain activity as he went through a series of training tasks. At the time of surgery, T5 could only twitch his hands and feet randomly.
The team first designed a hand avatar. It didn’t fully capture the dexterity of a human hand. The index and middle finger moved together as a group, as did the ring finger and pinky. Meanwhile, the thumb could stretch, curl, and move side to side.
For training, T5 watched the hand avatar move and imagined moving his fingers in sync. Using an artificial neural network that specializes in decoding signals across time, the team built an AI to decipher T5’s brain activity and correlate each pattern with different types of finger movements. This “decoder” was then used to translate his intentions into actual movements of the hand avatar on the computer screen.
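The study doesn’t publish its decoder, but the general idea—a recurrent network turning binned multichannel neural activity into finger-group velocities—can be sketched in a few lines. Everything here (channel counts aside, which follow the two 96-electrode arrays described above; bin size, hidden size, and the random weights standing in for a trained model) is an illustrative assumption, not the study’s actual architecture:

```python
import numpy as np

# Illustrative only: a toy recurrent decoder that maps binned neural
# activity (2 arrays x 96 channels = 192 features per time bin) to
# velocities for the three finger groups plus thumb side-to-side.
rng = np.random.default_rng(0)

N_CHANNELS = 192   # two 96-microelectrode arrays
N_HIDDEN = 64      # assumed hidden-state size
N_OUTPUTS = 4      # thumb curl, thumb side-to-side, index+middle, ring+pinky

# Randomly initialized weights stand in for a trained model.
W_in = rng.normal(0, 0.1, (N_HIDDEN, N_CHANNELS))
W_rec = rng.normal(0, 0.1, (N_HIDDEN, N_HIDDEN))
W_out = rng.normal(0, 0.1, (N_OUTPUTS, N_HIDDEN))

def decode(spike_counts):
    """Run an Elman-style RNN over binned spike counts.

    spike_counts: (T, N_CHANNELS) array, one row per time bin.
    Returns a (T, N_OUTPUTS) array of decoded finger-group velocities.
    """
    h = np.zeros(N_HIDDEN)
    outputs = []
    for x in spike_counts:
        # The hidden state carries context across time bins, which is
        # what lets the decoder exploit temporal structure in the signal.
        h = np.tanh(W_in @ x + W_rec @ h)
        outputs.append(W_out @ h)
    return np.array(outputs)

# One second of simulated activity in fifty 20 ms bins.
sim = rng.poisson(2.0, (50, N_CHANNELS)).astype(float)
velocities = decode(sim)
print(velocities.shape)  # (50, 4)
```

In a real system the weights would be fit on the training data described above (avatar movement paired with recorded brain activity), and the decoded velocities would drive the avatar continuously rather than be printed.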
In an initial test that only allowed the thumb to extend and curl—what the researchers call “2D”—the participant was able to move his finger groups onto virtual targets with over 98 percent accuracy. Each attempt took only a little more than a second.
Adding side-to-side movement of the thumb had a similar success rate but took twice as long (though he got faster as he grew familiar with the task). Overall, T5 could mind-control his virtual hand to reach around 76 targets a minute, far faster than in previous attempts. The training “wasn’t tedious,” he said.
Each finger-group movement was then mapped onto a virtual drone. Like moving joysticks and pressing buttons on a video game controller, the finger movements steered the quadcopter at will. The system kept the virtual hand in a relaxed, neutral pose unless T5 decided to move any of the finger groups.
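The mapping itself can be pictured as a lookup from the four decoded signals to the four axes of a standard two-stick controller. The axis assignments and the deadzone threshold below are assumptions for illustration—the study doesn’t specify them—but the deadzone captures the “relaxed unless moved” behavior described above:

```python
# Illustrative mapping from decoded finger-group velocities to drone
# commands, in the spirit of a two-stick game controller. Which finger
# group drives which axis, and the deadzone value, are assumptions.
DEADZONE = 0.15  # small decoded velocities are ignored: hand stays relaxed

def fingers_to_drone(thumb_curl, thumb_side, index_middle, ring_pinky):
    """Map four finger-group signals (each in [-1, 1]) to drone axes."""
    def gate(v):
        # Zero out sub-threshold signals so idle fingers don't drift the drone.
        return v if abs(v) > DEADZONE else 0.0
    return {
        "throttle": gate(thumb_curl),    # thumb curl/extend -> up/down
        "yaw":      gate(thumb_side),    # thumb side-to-side -> turn
        "pitch":    gate(index_middle),  # index+middle -> forward/back
        "roll":     gate(ring_pinky),    # ring+pinky -> bank left/right
    }

cmd = fingers_to_drone(0.6, 0.05, -0.4, 0.0)
# The weak 0.05 thumb_side signal falls inside the deadzone, so yaw is 0.0.
print(cmd)
```

In the trial this mapping ran continuously, so each decoded time bin updated the quadcopter’s controls, much like holding a joystick partway along an axis.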
Over a day of testing, he flew the drone a dozen times across multiple obstacle courses. Each course required him to use one of the finger-group movements to navigate randomly appearing rings and other hurdles. One challenge, for example, had him fly figure eights through multiple rings without hitting them. The system performed roughly six times better than prior systems.
Although his virtual fingers and their movements were shown on the computer screen while he played, the visuals weren’t essential.
“When the drone is moving and the fingers are moving, it’s easier and faster to just look at the drone,” he said. Piloting it was intuitive, “like riding your bicycle on your way to work, [thinking] ‘what am I going to do at work today,’ and you’re still shifting gears on your bike and moving right along.”
Adapting from simple training exercises to more complicated movements was also straightforward. “It’s like if you’re a clarinet player and you pick up someone else’s clarinet. You know the difference immediately, and there’s a little learning curve involved, but that’s based on you [having] an implied competency with your clarinet,” he said. To control the drone, you just have to “tickle it a direction,” he added.
The system is still far from commercial use, and it will need to be tested in more people. Newer brain implant hardware with more channels could further boost performance. But it’s a first step that opens up multiplayer online gaming—and potentially better control of other computer programs and complex robotic hands—to people with paralysis, enriching their social lives and overall wellbeing.