
A.I. translation tool sheds light on the secret language of mice

Breaking the communication code

Ever wanted to know what animals are saying? Neuroscientists at the University of Delaware have taken a big step toward decoding the sounds made by one particular animal, getting closer than anyone has before. The animal in question? The humble mouse.


To study mouse vocalizations, the team gathered data as groups of four mice — two males and two females — interacted for five hours at a time in a chamber kitted out with eight microphones and a video camera. In total, the researchers recorded encounters between 44 mice. From the enormous amount of resulting video and audio data, they then used machine learning to develop a system able to connect specific sounds with distinct animal behaviors. In short, it could work out which mouse was squeaking, where, and in what scenario.

“To link mouse vocalizations to specific actions, we needed multiple technological advances,” University of Delaware neuroscientist Joshua Neunuebel told Digital Trends. “First, we needed to be able to assign specific vocalizations to individual mice that were socially interacting. To do this, we developed a sound source localization system that simultaneously recorded mouse ultrasonic vocalizations on eight different microphones, as well as the position of the mice with a video camera.”
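The core idea behind a multi-microphone localization system like the one Neunuebel describes is that sound reaches each microphone at a slightly different time, and those time differences pin down the source. Here is a minimal sketch of that principle, assuming a hypothetical 1 m square chamber with eight microphones; the mic layout, grid search, and all numbers are illustrative, not the team's actual method.

```python
# Minimal sketch of sound source localization from time differences of
# arrival (TDOA). The chamber size, mic layout, and grid search are
# hypothetical stand-ins for the system described in the article.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air

# Eight microphones around a 1 m x 1 m chamber (hypothetical layout).
mics = np.array([
    [0.0, 0.0], [0.5, 0.0], [1.0, 0.0], [1.0, 0.5],
    [1.0, 1.0], [0.5, 1.0], [0.0, 1.0], [0.0, 0.5],
])

def arrival_times(source):
    """Time for sound from `source` to reach each microphone."""
    return np.linalg.norm(mics - source, axis=1) / SPEED_OF_SOUND

def localize(tdoa, grid_step=0.01):
    """Grid-search the chamber for the point whose predicted TDOAs
    (relative to mic 0) best match the observed ones."""
    xs = np.arange(0.0, 1.0 + grid_step, grid_step)
    best, best_err = None, np.inf
    for x in xs:
        for y in xs:
            t = arrival_times(np.array([x, y]))
            err = np.sum((t - t[0] - tdoa) ** 2)
            if err < best_err:
                best, best_err = np.array([x, y]), err
    return best

true_source = np.array([0.3, 0.7])    # where the simulated squeak came from
t = arrival_times(true_source)
estimate = localize(t - t[0])         # observed TDOAs relative to mic 0
print(estimate)                       # close to [0.3, 0.7]
```

In practice the arrival times must first be extracted from noisy ultrasonic recordings, which is far harder than the geometry shown here, but the TDOA principle is the same.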

The combination of microphones and camera allowed the team to estimate where a particular vocal signal was emitted and then assign it to a specific mouse. Once vocalizations could be assigned to individual animals, the team used an unsupervised learning algorithm, which groups items with similar features, to categorize the calls. Finally, they used a tool called JAABA, the Janelia Automatic Animal Behavior Annotator, to automatically extract specific social behaviors with high fidelity.
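The article doesn't specify which unsupervised algorithm the team used, but the grouping step it describes can be illustrated with k-means clustering on simple acoustic features. Everything below is a toy stand-in: synthetic calls described by two made-up features (peak frequency and duration), not real mouse data.

```python
# Hedged sketch of the vocalization-grouping step: cluster calls with
# similar acoustic features using k-means. The features, data, and
# algorithm choice are illustrative, not the researchers' actual method.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic "calls": three loose groups in (peak_freq_kHz, duration_ms).
calls = np.vstack([
    rng.normal([60, 20], [2, 3], size=(30, 2)),   # short, high-pitched
    rng.normal([40, 50], [2, 5], size=(30, 2)),   # long, mid-pitched
    rng.normal([75, 35], [2, 4], size=(30, 2)),   # very high-pitched
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(calls)
print(np.bincount(labels))  # roughly 30 calls per cluster
```

Each call ends up with a cluster label, which is what makes the next step possible: asking which call types co-occur with which behaviors.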

“It’s not necessarily a translational tool per se, but it’s a tool that helps us interpret mouse social behaviors,” Neunuebel said. “However, this being said, mice are good models for understanding the neural basis of social behavior, which may ultimately shed light upon how the brain circuitry of humans is functioning.”

The work is the subject of two new papers published in the journals Nature Neuroscience and Scientific Reports. The papers cover different aspects of the research: how communication shapes social behavior, and the neural networks that encode this information.

As Neunuebel says, this isn't a full-fledged human-to-mouse translation tool. Nonetheless, the work — alongside similar research into the communication of animals like dolphins — certainly brings us closer to understanding the subtleties of animal chat.

Luke Dormehl