Stanford researchers have successfully implemented a brain-computer interface capable of interpreting handwriting attempts.
The latest development in brain-computer interfaces (BCIs) is converting a person's imagined handwriting into text. Using an artificial neural network, researchers at Stanford University in California have successfully transformed the brain signals of a 65-year-old man, paralyzed from the neck down by a spinal cord injury, into text on a screen.
Their BCI was faster than traditional head- or eye-tracking systems, which let users type messages by moving a mouse pointer. A recent example of such cursor-control work is Neuralink's demonstration with Pager, a macaque that played Pong with its mind.
Jaime Henderson, a functional neurosurgeon at Stanford University's health center, points to flaws in these systems. "If you use eye tracking to work with a computer, your eyes are tied to everything you do," he said. "You can't look up, or look around, or do anything else. Having this extra input channel could be very important."
Henderson and his team implanted two small sensor arrays beneath the surface of the participant's brain. Each array can detect signals from about 100 neurons.
Despite this limited sample, observing roughly 200 neurons gave the artificial neural network enough information to reliably interpret the brain's signals.
The participant was asked to imagine writing letters and words on a sheet of paper. Using only this imagined handwriting, he was able to type 90 characters per minute as the implants detected his writing attempts.
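The idea of mapping multichannel neural activity to characters can be illustrated with a toy sketch. This is not the Stanford team's pipeline (they used a recurrent neural network on real recordings); it is a minimal nearest-centroid decoder on simulated data, and every name and number in it is an illustrative assumption.

```python
import numpy as np

# Toy sketch: decode "imagined letters" from simulated multielectrode
# activity. Each letter is assumed to evoke a characteristic firing-rate
# pattern across the recorded channels; decoding picks the closest pattern.
rng = np.random.default_rng(0)
N_CHANNELS = 200          # roughly the number of neurons the arrays observed
LETTERS = list("abc")     # tiny alphabet, for the demo only

# Assumed per-letter activity templates (in reality these are learned).
templates = {c: rng.normal(size=N_CHANNELS) for c in LETTERS}

def simulate_trial(letter, noise=0.5):
    """Simulated neural activity: the letter's template plus Gaussian noise."""
    return templates[letter] + rng.normal(scale=noise, size=N_CHANNELS)

def decode(activity):
    """Return the letter whose template is nearest to the recorded activity."""
    return min(LETTERS, key=lambda c: np.linalg.norm(activity - templates[c]))

# Decode a short sequence of imagined letters.
decoded = "".join(decode(simulate_trial(c)) for c in "abcab")
print(decoded)  # → abcab
```

With 200 channels the per-letter patterns are well separated, so even this crude distance rule decodes reliably; the real system faces far noisier, time-varying signals, which is why a trained neural network is needed.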
The paper, published in Nature, says: "To our knowledge, these typing speeds exceed those reported for any other BCI and are comparable to typical smartphone typing speeds for individuals in our participant's age group (115 characters per minute)."
The device converted imagined handwriting with 94% accuracy, rising to over 99% when the researchers applied an autocorrection tool.
This is a first for BCI research. According to New Scientist, previous BCIs could interpret larger-scale signals, such as those produced by arm movements, but the Stanford device is the first to successfully decode "fine, dexterous movements like handwriting".
Although the model used in this study is specific to this participant and does not transfer to other people, the researchers plan to develop the technology further. The goal is to create a speech decoder for people who can no longer speak but who likely still have the necessary neural pathways intact.
"Our results open a new approach for BCIs and demonstrate the feasibility of accurately decoding rapid, dexterous movements years after paralysis," they wrote.