Virtual social life is possible with brain-machine interfaces

A prime target in the field of neuroprosthetics is to improve the lives of paralyzed patients by restoring their lost abilities in the real world.

An example is work done in 2012 by neuroscientists Leigh Hochberg and John Donoghue at Brown University. Their team trained two long-term paralyzed subjects — a 58-year-old woman and a 66-year-old man — to use a brain-machine interface (BMI) that decodes signals from the motor cortex to guide a robotic arm to reach for and grasp objects. One participant was able to use the device to pick up a bottle and drink from it.

More recently, in 2017, a French team from the University Hospital of Grenoble surgically implanted a wireless epidural brain-machine interface in a 28-year-old man with quadriplegia. After two years of training, the patient was able to control some functions of an exoskeleton using his brain activity alone.

From advanced robotics to the precise re-innervation of damaged peripheral nerves in patients’ arms and legs, these projects require extraordinary medical and technological breakthroughs. Large-scale development is still required to achieve real-world clinical applications of these methods.

However, perfecting the brain-computer interface itself—the precise translation of a brain signal into intended action—may require a simpler, cheaper, and safer technology: virtual reality. In fact, in many BMI projects, initial training is based on virtual simulation: for example, before trying to control a real robotic arm, subjects first learn to control a virtual arm.

As the gaming world and the metaverse evolve, the next big breakthroughs in BMI applications will reach the virtual world before they materialize in the real world. This has already been shown to be possible by a team of researchers at Johns Hopkins who were able to teach a paralyzed patient to fly a warplane in a computer flight simulation using a BMI. According to their report, “From a subject perspective, this was one of the most interesting and entertaining experiments she had ever conducted.”

In 2023, we will see many BMI applications that allow people with disabilities to fully participate in virtual worlds: at first, by joining simpler interactive communication spaces such as chat rooms; later, by taking full control of their 3D avatars in virtual spaces where they can shop, socialize, or even play games.

This applies to my own work at the University of California, San Francisco, where we build BMIs to restore verbal communication. We can already train patients to communicate via chat and text messaging in real time. Our next goal is real-time speech synthesis. We have previously shown that this is possible offline with good accuracy, but doing it in real time presents a new challenge for paralyzed patients.

We are now expanding our work to include the ability to control facial avatars, which will enrich virtual social interactions. Seeing the mouth and lips move when someone is speaking greatly enhances speech perception and understanding. The areas of the brain that control the vocal tract and mouth also overlap with the areas responsible for non-verbal facial expressions, so facial avatars will be able to convey these as well.

With the convergence of virtual reality and BMIs, it’s no coincidence that tech companies are also developing consumer applications for neural interfaces, both non-invasive and invasive. These developments will have major implications for all of us — not just in how we interact with our computers, but in how we interact with each other.

For paralyzed patients, the implications are more profound: they concern the ability to participate in social life. Social isolation is one of the most devastating aspects of paralysis. However, as human social interactions increasingly rely on digital formats — like text messaging and email — and virtual environments, we now have an opportunity that didn’t exist before. With brain-machine interfaces, we can finally meet this unmet need.
