Type with your mind: We’ve achieved a first in brain-computer research, says Facebook

Two years ago, Facebook announced plans to create a non-invasive wearable system that could type 100 words per minute by decoding the words and phrases a user merely thinks.

The company has now published an update on its ambitions to build augmented-reality (AR) glasses that would allow people to communicate without using a smartphone.

Facebook thinks a machine that decodes words in the brain’s speech center could offer the speed of voice and the privacy of text.

“The promise of AR lies in its ability to seamlessly connect people to the world that surrounds them – and to each other,” Facebook researchers said in a blog post.

“Rather than looking down at a phone screen or breaking out a laptop, we can maintain eye contact and retrieve useful information and context without ever missing a beat. It’s a tantalizing vision, but one that will require an enterprising spirit, hefty amounts of determination, and an open mind.”

The company believes such a device is viable, assuming major advances can be made in the field of brain-computer interface (BCI) research, which today relies on invasive techniques such as electrodes implanted in the brain.

Facebook Reality Labs is investigating the potential of BCI and has funded researchers at the University of California, San Francisco, who’ve detailed in Nature Communications a BCI that can detect when a person hears or says something and then decode the phrase.

In this experiment, participants listened to questions and responded aloud with answers while the researchers used a system to decode the utterances from the brain in real time.

“After training, participants performed a task in which, during each trial, they listened to a question and responded aloud with an answer of their choice. Using only neural signals, we detect when participants are listening or speaking and predict the identity of each detected utterance using phone-level Viterbi decoding,” the researchers explain.
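To illustrate the gist of that decoding step, here is a minimal sketch of phone-level Viterbi decoding over a closed set of candidate answers. It is not the study’s actual pipeline: the frame-level phone probabilities, the phone inventory, the simple left-to-right alignment model, and helper names such as viterbi_score and decode_answer are all illustrative assumptions.

```python
import numpy as np

def viterbi_score(frame_log_probs, phone_sequence):
    """Best log-probability of aligning a left-to-right phone sequence to the
    frames: each frame either stays on the current phone or advances to the
    next one (no skips)."""
    n_frames = frame_log_probs.shape[0]
    n_phones = len(phone_sequence)
    score = np.full((n_frames, n_phones), -np.inf)
    score[0, 0] = frame_log_probs[0, phone_sequence[0]]
    for t in range(1, n_frames):
        for p in range(n_phones):
            stay = score[t - 1, p]
            advance = score[t - 1, p - 1] if p > 0 else -np.inf
            score[t, p] = max(stay, advance) + frame_log_probs[t, phone_sequence[p]]
    return score[-1, -1]  # the alignment must end on the last phone

def decode_answer(frame_log_probs, candidates):
    """Return the candidate whose phone sequence best explains the frames."""
    return max(candidates, key=lambda name: viterbi_score(frame_log_probs, candidates[name]))

# Toy usage: 4 phone classes, 6 time frames, two candidate answers with
# made-up phone index sequences. A real system would derive the per-frame
# phone probabilities from neural recordings, not random numbers.
rng = np.random.default_rng(0)
frame_log_probs = np.log(rng.dirichlet(np.ones(4), size=6))
candidates = {"hot": [0, 2], "cold": [1, 3, 2]}
print(decode_answer(frame_log_probs, candidates))
```

Restricting the decoder to a small, known set of possible answers is part of what makes real-time prediction tractable in this setting.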

The participants were volunteers with normal speech who were undergoing brain surgery to treat epilepsy. They were asked questions such as “How is your room currently?”, with five valid answers to choose from: “Bright”, “Dark”, “Hot”, “Cold”, and “Fine”.

Facebook notes that the main advancement in this paper is that the researchers were able to “decode a small set of full, spoken words and phrases from brain activity in real time – a first in the field of BCI research.”

The researchers are aiming to decode speech at a rate of 100 words per minute with a 1,000-word vocabulary and a word error rate of less than 17%.
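Word error rate is the standard speech-recognition metric: the number of word substitutions, deletions, and insertions needed to turn the decoded transcript into the reference, divided by the length of the reference, so a 17% rate means roughly one word in six is wrong. A minimal sketch of the calculation follows; the example sentences and the word_error_rate helper are made up for illustration.

```python
def word_error_rate(reference, hypothesis):
    """Word-level edit distance between two transcripts, divided by the
    number of words in the reference."""
    ref, hyp = reference.split(), hypothesis.split()
    # Classic Levenshtein dynamic programming over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("my room is cold", "my room is hot"))  # 0.25
```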

Facebook says it’s still a “long way” from achieving the same results with non-invasive technology, but it believes the work underway will help it develop decoding algorithms and figure out the specifications for a non-invasive wearable.

One non-invasive technique it’s exploring with researchers in another project uses near-infrared light to detect changes in the oxygen levels of neurons in the brain.

“Like other cells in your body, neurons consume oxygen when they’re active. So if we can detect shifts in oxygen levels within the brain, we can indirectly measure brain activity,” Facebook said.

The company doesn’t think this technique will allow it to decode thought phrases, but sees potential for recognizing key words like ‘home’, ‘select’, and ‘delete’.
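As a rough illustration of what spotting a few key words from an oxygenation signal might look like, the toy sketch below matches short windows of a simulated multi-channel signal against per-keyword templates. The channel count, the templates, and the classify_window helper are invented for illustration; this is not Facebook’s method, and real near-infrared decoding is far harder.

```python
import numpy as np

rng = np.random.default_rng(0)
KEYWORDS = ["home", "select", "delete"]

# Pretend each keyword produces a characteristic oxygenation pattern across
# 8 optical channels; these templates stand in for trained class averages.
templates = {word: rng.normal(size=8) for word in KEYWORDS}

def classify_window(window):
    """Return the keyword whose template is closest to the time-averaged window."""
    features = window.mean(axis=0)  # average over time within the window
    return min(KEYWORDS, key=lambda w: np.linalg.norm(features - templates[w]))

# Simulate a 2-second window (20 samples x 8 channels) resembling "select".
window = templates["select"] + 0.1 * rng.normal(size=(20, 8))
print(classify_window(window))  # should print "select"
```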
