Scientists are using brain-computer connections to restore a lost sense of touch


The complexity, and importance, of our sense of touch is huge. Just think about reaching for a piece of fruit on the table: you need touch to know when your fingers have reached it, and to adjust your grip so you don't squeeze it too hard, or hold it too loosely and drop it. Touch can tell you whether the fruit is ripe or past its best, and whether it's fridge-cold or has been sitting on the counter. On top of all that, touch underpins movement in general, from fine motor tasks like doing up a button to the continuous feedback from the muscles in your feet that keeps you from stumbling.

When brain-computer interfaces (BCIs) are mentioned, it's often in the context of helping people with paralysed limbs start to move them again. But for those with spinal injuries, it's not just movement that's lost: it's also the sense of touch, as areas of their body are left completely without feeling. Now, BCIs are helping to replace that too.

BCIs work because both computers and human nerves use pulses of electricity to communicate. Think of the nerve pathways between the brain and the peripheries of your body — your skin, your arms and legs — as a series of wires. In a spinal injury, the wire gets cut and the electricity can’t flow properly. Without signals travelling down to the muscles, limbs can’t move; without signals being relayed up from the skin, the brain doesn’t register a sense of touch.

SEE: Building the bionic brain (free PDF) (TechRepublic)

In this context, BCIs work by creating an alternative circuit for the electricity to flow through, restoring communication between the brain and peripheral nerves — down from the brain for movement, up from the peripheries for touch.

Most BCI systems that aim to restore touch start with an electrode array implanted in the brain's outer layer, the cortex. The array records signals from the brain, which are passed via a wired connection to machine-learning software. The software decodes the signals and passes them on to the peripheral nerves, stimulating muscles to move. Touch takes the reverse journey: changes in touch are detected by sensors on either a human or robotic arm and fed back up to the brain as a series of electrical pulses that it understands as different sensations.
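As a purely illustrative sketch of that decode step, a minimal decoder might map per-electrode firing rates to a movement command. Everything here is invented for illustration (channel count, weights, threshold); real systems use far richer machine-learning models trained on recorded brain activity.

```python
# Toy sketch of the decode step: neural activity recorded from an
# electrode array is mapped to a movement command. All names and
# numbers are illustrative, not from any real BCI system.

def decode(firing_rates, weights, threshold=0.5):
    """Linear decoder: weighted sum of channel activity -> grip on/off."""
    activation = sum(r * w for r, w in zip(firing_rates, weights))
    return "grip" if activation > threshold else "rest"

# Hypothetical 4-channel recording and learned weights
rates = [0.9, 0.1, 0.8, 0.2]    # normalised firing rate per electrode
weights = [0.5, 0.1, 0.4, 0.0]  # weights a training step might produce

print(decode(rates, weights))   # strong activity on channels 0 and 2
```

In a real system the decoder's output would drive electrical stimulation of muscles or a robotic limb, and the weights would be learned continuously as the user practises.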

In the US, Battelle has created a BCI system that works by gathering 'residual touch' signals. Someone with a complete spinal cord injury may feel no sense of touch whatsoever below a certain level on their body, but often the faintest wisps of nerve fibre connecting the brain and the periphery still exist. They may not be strong enough for the person to feel a sensation of touch, but they can still provoke electrical activity in the brain. Battelle's system picks up these signals via an array in the motor cortex, decodes and amplifies them, and then passes the sensory feedback on to a separate area of the user's body that still has a working sense of touch.

“When there is residual touch, we artificially feed it back to the participant in the form of haptic feedback. So these are basically small motors, similar to what’s in the cell phone, that gives him sensation artificially that he’s touching something even though he can’t feel it,” Patrick Ganzer, research scientist at Battelle, told a recent Fermilab academic conference.

The participant was an individual who became paralysed after a spinal cord injury in 2010. With the system turned on, they were able to identify whether they were holding something, and what type of object they were holding, 50 percent more often than without it.

“With artificial sensory feedback, he feels more in control of his hand. This is called the sense of agency… there’s a cognitive attribute to this, where your hand might not really feel like it’s yours when you’re using this device. With that artificial sensory feedback, we increase that conscious perception that ‘this is my hand again’,” Ganzer said.

Other systems direct signals between the brain and a robotic arm.

In labs at the University of Utah and the University of Pittsburgh, for example, users can manipulate and receive sensory feedback from robotic arms. The University of Utah uses a robotic prosthesis known as the 'Luke' arm — a nod to the Star Wars character's prosthesis. The sensory feedback has a knock-on effect on how well the user can control the hand's movement, says Robert Gaunt, assistant professor in the Department of Physical Medicine and Rehabilitation at the University of Pittsburgh. "If we can create these sensations, does it help this person control this robotic arm that they're trying to reach out and move things around with? The short answer is, yes, we believe that it does," he says.

Restoring touch can do more than just aid movement — it can even reconfigure the brain in those who have lost a limb.

People who've lost limbs may experience a phenomenon called 'phantom limb pain', in which they feel pain in the limb that's no longer there. With a robotic hand in place, the brain adjusts and the pain can dissipate.

Our body images are not really created by our bodies, says Gregory Clark, associate professor of biomedical engineering at the University of Utah – they're created by our brains.

“The idea here is that if we could restore the sense that a prosthetic hand is not just a tool, but it’s actually part of a person’s own body image, in that case, the phantom hand wouldn’t have a place to live anymore. The phantom hand would be displaced by the now prosthetic hand again, that feels as if it’s the person’s own. And when the phantom limb disappears, so does the phantom pain,” he says.

SEE: What is a brain-computer interface? Everything you need to know about BCIs, neural interfaces and the future of mind-reading computers

Even though we talk about touch as a single discrete entity, it’s actually a whole bundle of things brought together: along with pain, there’s vibration, pressure, fine touch and temperature, which all have their own dedicated nerve fibres. Will BCI systems ever be able to unite and recreate the separate threads that create touch?

“The more accurately we provide information that’s similar to what the intact nervous system provides, then the better the brain will be able to understand it, and the better the brain will be able to use it. The challenge before us is not just to provide a kind of binary on/off sense that the hand has touched something, but to capture some of the richness of that experience,” Utah’s Clark says.

"A lot of where the research is right now is, how do we try to use biomimicry? How do we try and mimic what's going on in the healthy nervous system, and recreate that with an artificial device? But work in that area is really just kind of getting going, and in many cases is actually somewhat limited by the fact that we actually don't know how this all works normally," Pittsburgh's Gaunt adds — in other words, we don't yet know enough about how the human brain processes and understands touch to be able to recreate it with BCIs.

However, the work of BCI researchers is beginning to open up the mysteries of how our brain works, not only bringing BCIs closer to more widespread use, but also enhancing neuroscience as a whole.

Along with an understanding of the brain, researchers agree that some of the challenges currently holding back the development of commercial systems are practical: the systems are too large and not portable enough. Currently, the use of BCIs is almost entirely lab-based. While researchers are looking to start home-based trials soon, it’s likely to be many years before users can access commercial systems for use around the clock.
