2017 has been a coming-out year for the Brain-Machine Interface (BMI), a technology that attempts to channel the mysterious contents of the two-and-a-half-pound glop inside our skulls to the machines that are increasingly central to our existence. In April, Elon Musk announced a secretive new brain-interface company called Neuralink. Days later, Facebook CEO Mark Zuckerberg declared that “direct brain interfaces [are] going to, eventually, let you communicate only with your mind”. Facebook is building what it calls “brain-computer speech-to-text interface” technology, which is supposed to translate your thoughts directly from your brain to a computer screen without any need for speech or fingertips. The idea is that non-invasive sensors will read what you intend to say while you are still thinking it to yourself in silence, and turn it into readable text. Similarly, Braintree founder Bryan Johnson is investing $100 million in a neuroprosthesis intended to unlock the power of the human brain and, ultimately, make our neural code programmable.
Scientists and entrepreneurs believe that direct links between brains and computers will advance our efforts to create higher levels of artificial intelligence, but we are still far away from that. Multiple labs have been experimenting with BMIs, and some have impressive results to show. CTRL-Labs recently developed a terrycloth stretch band with microchips and electrodes woven into the fabric that intercepts the signals the brain sends to the fingers when the user mimics typing on an imaginary keyboard. The armband interprets those signals and relays the output to the computer, just as a keyboard would. More importantly, CTRL-Labs cofounder and CEO Thomas Reardon and his colleagues have found that the device can pick up far subtler signals, such as the twitch of a single finger, without the user mimicking actual typing. CTRL-Labs, which comes with both tech bona fides and an all-star neuroscience advisory board, bypasses the incredibly complicated tangle of connections inside the cranium and dispenses with the need to break the skin or the skull to insert a chip, the Big Ask of BMI. Instead, the company concentrates on the rich set of movement-control signals that travel through the spinal column, the nervous system’s low-hanging fruit. A minimal sketch of what such a decoding pipeline might look like follows below.
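To make the idea concrete, here is a minimal sketch of how muscle signals from an electrode armband might be decoded into key presses. It is an illustration of the general surface-EMG decoding idea, not CTRL-Labs’ actual method: the 16-channel layout, 100 ms windows, RMS features, and the logistic-regression classifier are all assumptions made for the example.

```python
# Hypothetical sketch: decoding windows of surface-EMG from an electrode
# armband into key labels. The channel count, window length, features, and
# classifier are illustrative assumptions, not CTRL-Labs' published method.
import numpy as np
from sklearn.linear_model import LogisticRegression

N_CHANNELS = 16          # electrodes around the forearm (assumed)
WINDOW = 200             # samples per window, e.g. 100 ms at 2 kHz (assumed)

def rms_features(window: np.ndarray) -> np.ndarray:
    """Root-mean-square amplitude per channel: a crude proxy for muscle activation."""
    return np.sqrt(np.mean(window ** 2, axis=0))

def make_dataset(recordings, labels):
    """recordings: list of (WINDOW, N_CHANNELS) arrays; labels: key pressed in each window."""
    X = np.stack([rms_features(w) for w in recordings])
    y = np.asarray(labels)
    return X, y

# Toy training run on synthetic data, just to show the pipeline end to end.
rng = np.random.default_rng(0)
fake_windows = [rng.normal(size=(WINDOW, N_CHANNELS)) for _ in range(200)]
fake_keys = rng.choice(list("asdf"), size=200)          # pretend four keys
X, y = make_dataset(fake_windows, fake_keys)

decoder = LogisticRegression(max_iter=1000).fit(X, y)

# At run time, each new window of armband data is turned into a key event.
new_window = rng.normal(size=(WINDOW, N_CHANNELS))
print("decoded key:", decoder.predict(rms_features(new_window)[None, :])[0])
```

The interesting design choice is that nothing in this pipeline needs to touch the brain itself: the decoder only sees the peripheral signals arriving at the arm, which is exactly the shortcut the paragraph above describes.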
Meanwhile, Technology Review reports that Timothy Gentner and his students at the University of California, San Diego, have built a brain-to-tweet interface that figures out the song a bird is going to sing a fraction of a second before it does so, with remarkable accuracy. “We decode realistic synthetic birdsong directly from neural activity,” the scientists announced in a new report published on the website bioRxiv. The team, which includes Argentinian birdsong expert Ezequiel Arneodo, calls the system the first prototype of “a decoder of complex, natural communication signals from neural activity.” A similar approach could fuel advances towards a human thought-to-text interface, the researchers say. Makoto Fukushima, a fellow at the National Institutes of Health who has used brain interfaces to study the simpler grunts made by monkeys, says the richer range of birdsong is why the new results have important implications for applications in human speech.
The team used silicon electrodes in awake birds to measure the electrical chatter of neurons in a specific part of the brain called the sensory-motor nucleus, where “commands that shape the production of learned song” originate. Their main innovation was to simplify the brain-to-tweet translation by incorporating a physical model of how finches make noise. In their report, the researchers say the system can predict what the bird will sing about 30 milliseconds before it does so.
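To illustrate the two-stage idea of decoding into a physical model rather than directly into sound, here is a hedged sketch, not the UCSD team’s code: binned spike counts are regressed onto a couple of synthesizer parameters, which then drive a toy oscillator standing in for the real finch syrinx equations. The 64-unit recording, the ridge regression, and the oscillator are all illustrative assumptions.

```python
# Hedged sketch (not the published method): map binned neural activity to the
# parameters of a simple song synthesizer, then render audio from them.
import numpy as np
from sklearn.linear_model import Ridge

N_UNITS = 64        # recorded neurons (assumed)
FRAME_RATE = 1000   # parameter frames per second (assumed)

# Stage 1: regress synthesizer parameters (e.g. pressure, tension) from spikes.
rng = np.random.default_rng(1)
spikes = rng.poisson(2.0, size=(5000, N_UNITS))     # binned spike counts (toy data)
params = rng.normal(size=(5000, 2))                 # "pressure" and "tension" targets
decoder = Ridge(alpha=1.0).fit(spikes, params)

# Stage 2: a toy "physical model" - an amplitude- and frequency-modulated
# oscillator standing in for the finch syrinx model used in the paper.
def synthesize(pred_params: np.ndarray, sr: int = 22050) -> np.ndarray:
    samples_per_frame = sr // FRAME_RATE
    freq = 1500 + 500 * np.repeat(pred_params[:, 1], samples_per_frame)   # Hz
    amp = np.clip(np.repeat(pred_params[:, 0], samples_per_frame), 0, None)
    phase = 2 * np.pi * np.cumsum(freq) / sr
    return amp * np.sin(phase)

# Decode a short stretch of "neural activity" and render synthetic song.
audio = synthesize(decoder.predict(spikes[:100]))
print(audio.shape)   # roughly 0.1 s of synthetic waveform
```

The point of the intermediate physical model is that the regression only has to predict a handful of slowly varying control parameters instead of a full audio waveform, which is what makes the translation tractable a few tens of milliseconds ahead of the actual song.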
So it seems Brain-Machine Interfaces are closer than we might have thought. But Zuckerberg’s hopes for a sort of telepathy and Elon Musk’s vision of cyborg humans with enhanced memory remain far from reality. The human brain displays creative abilities that are beyond brute-force computation. Our most human traits rely on the faculty of understanding, and understanding is a non-algorithmic, largely socially mediated phenomenon. Computers and AI, one hopes, will never be able to duplicate the human brain. As I have written before: “Surely, enhancing human memory and computing abilities using AI is an attractive field for further research, and cyborgs are an attractive theme for film making. Fortunately for us, if Penrose is right, being a cyborg has nothing to do with real human intelligence. Consciousness remains the last human frontier…”
It is like magic! These people are doing something we don’t yet understand, and I think it is admirable.