AI decoding what mice see could improve future BCIs
An AI tool that deciphers what mice see could improve future brain-computer interfaces, according to a new study.
The system, called CEBRA, was developed by researchers at EPFL, a university in Switzerland. Their goal? To uncover hidden relationships between the brain and behavior.
To test CEBRA (pronounced “zebra”), the team tried to decipher what a mouse sees when it watches a video.
First, the researchers collected open-access neural data from rodents watching movies. Part of the brain activity was measured with electrode probes inserted into the visual cortex. The rest came from optical probes in genetically modified mice, whose neurons were engineered to glow green when activated.
All this data was used to train the base algorithm underlying CEBRA. As a result, the system learned to map brain activity to specific frames in the video.
Then the team applied the tool to another mouse that had watched the video. After analyzing the data, CEBRA was able to accurately predict what the mouse had seen based on the brain signals alone.
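To give a rough sense of what "predicting what the mouse had seen" means, here is a minimal, purely illustrative sketch. CEBRA's real model is a contrastive neural network; the stand-in below is a simple nearest-neighbour lookup on synthetic embeddings, with hypothetical names and data invented for the example:

```python
# Illustrative sketch only: synthetic "embeddings" stand in for CEBRA's
# learned latent space. The idea: activity recorded from one mouse is
# aligned to movie frames, then frames are decoded for a second mouse.
import numpy as np

rng = np.random.default_rng(0)

n_frames = 100   # frames in the hypothetical movie
dim = 8          # dimensionality of the latent embedding

# Pretend these are embeddings of mouse #1's neural activity,
# one embedding per movie frame it watched.
train_embeddings = rng.normal(size=(n_frames, dim))
frame_labels = np.arange(n_frames)

# Mouse #2 watches the same movie; in this toy setup its embeddings are
# noisy versions of the same latent structure (the shared structure
# across animals is exactly what CEBRA aims to expose).
test_embeddings = train_embeddings + 0.05 * rng.normal(size=(n_frames, dim))

# Decode: for each test embedding, predict the nearest training frame.
dists = np.linalg.norm(
    test_embeddings[:, None, :] - train_embeddings[None, :, :], axis=-1
)
predicted_frames = dists.argmin(axis=1)

accuracy = (predicted_frames == frame_labels).mean()
print(f"decoding accuracy: {accuracy:.0%}")
```

With the low noise used here, every frame is recovered; real neural data is far messier, which is why a learned embedding is needed in the first place.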
The team then reconstructed the clip from the neural activity. You can see the result for yourself in the video below:
Unsurprisingly, the researchers are interested in more than the movie-viewing habits of rodents.
“The goal of CEBRA is to bring structure to complex systems. And given that the brain is the most complex structure in our universe, it’s the ultimate testing ground for CEBRA,” said EPFL’s Mackenzie Mathis, the study’s principal investigator.
“It could also give us insights into how the brain processes information and could provide a platform for discovering new principles in neuroscience by combining data across animals and even species.”
Nor is CEBRA limited to neuroscientific research. It can also be applied to many other datasets involving time or joint information, Mathis says, including data on animal behavior and gene expression. But perhaps its most exciting application is in brain-computer interfaces (BCIs).
As the movie-loving mice showed, even the primary visual cortex – often believed to support only fairly basic visual processing – can be used for BCI-style video decoding. For the researchers, an obvious next step is to use CEBRA to improve neural decoding in BCIs.
“This work is just a step toward the theoretically grounded algorithms needed in neurotechnology to enable powerful BMIs,” said Mathis.
You can read the full study paper in Nature.