Say Woof? AI in Animal Language Translation

5 min read · Jan 18, 2020


No matter whether we’re sharing our lives with beloved household pets or protecting wildlife in a remote location, wouldn’t it be wonderful if we could somehow lift the language barrier that has impeded interspecies communication for millennia?

The first digital “dog translator” was a Japanese novelty item awarded the satirical Ig Nobel Prize in 2002. But since then the idea of tech-based animal language translation has become more than just a joke, as we’ve discovered parallels between birdsong and human speech and developed dolphin whistle decoders. Now, artificial intelligence is bringing exciting new power and potential to the topic of human-animal communication.

We know that natural language processing (NLP) technologies can enable a machine to understand human language, but what about animal language? Researchers believe the simplest way to do this would be to build a deep learning model and train it on an animal language database in which various voice signals are linked with expressions of emotions (happy/sad/etc.) and feelings (cold/hot/etc.). It’s hoped that as the database evolves additional nuances and innovative approaches could be explored.
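The setup described above can be sketched in a few lines. This is a toy illustration, not any project's actual model: the feature vectors, labels, and the nearest-centroid classifier are all hypothetical stand-ins for a real deep learning pipeline, where each vocalisation would be reduced to a fixed-length feature vector paired with an emotion label.

```python
import numpy as np

# Hypothetical training data: each row is a feature vector extracted from a
# vocalisation (say, pitch in Hz, loudness, duration in seconds), paired
# with a human-assigned emotion label.
X_train = np.array([
    [300.0, 0.9, 0.2],   # high pitch, loud, short calls
    [320.0, 0.8, 0.3],
    [110.0, 0.3, 1.1],   # low pitch, quiet, long calls
    [100.0, 0.4, 1.0],
])
y_train = ["happy", "happy", "sad", "sad"]

def fit_centroids(X, y):
    """Compute one mean feature vector (centroid) per emotion label."""
    return {label: X[[i for i, lab in enumerate(y) if lab == label]].mean(axis=0)
            for label in set(y)}

def predict(centroids, x):
    """Assign the label whose centroid lies closest to feature vector x."""
    return min(centroids, key=lambda label: np.linalg.norm(x - centroids[label]))

centroids = fit_centroids(X_train, y_train)
print(predict(centroids, np.array([310.0, 0.85, 0.25])))  # -> happy
print(predict(centroids, np.array([105.0, 0.35, 1.05])))  # -> sad
```

A production system would swap the hand-picked features for learned audio representations and the centroid rule for a trained neural network, but the supervised mapping from sound features to emotion labels is the same.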

Zoolingua, founded in 2018, is a startup aimed at “helping people have great relationships with animals.” That idea can be traced back to 2012, when founder Con Slobodchikoff published his findings on animal communication in the book Chasing Doctor Dolittle. A biologist, Slobodchikoff discovered detailed messages in the prairie dog sounds he had recorded and studied for decades.

With the goal of improving the relationship between dogs and their owners, Zoolingua is now developing a mobile app designed to translate dog body language and sounds to English. The team is collecting data on dog body language, facial expressions and vocalisations for model training. Zoolingua says the tech could be extended to other pets to improve communication and human understanding: “Where we’ve only seen behavior issues, we might hear and understand their fear, pain, and needs.”

Understanding animal language and improving animal-human communication is also appealing to farmers. Over the past five years, researchers from the Georgia Institute of Technology have been collecting sound data from chickens.

The recordings were made while the chickens were exposed to certain conditions, e.g. heat or cold, light or dark. The collected sounds were then used to train a machine learning model to distinguish contented from distressed birds. The model has proven capable of detecting “emotional” changes in chickens with near-perfect accuracy. The researchers believe the study’s findings can be used to improve poultry conditions and productivity.
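The published work doesn't specify the model used, but the core idea of learning a contented/distressed decision boundary from labeled recordings can be shown with invented numbers: here a single acoustic feature per recording and a threshold placed midway between the two class means, both purely illustrative.

```python
import numpy as np

# Toy stand-in for the chicken study: one acoustic feature per recording
# (say, average call intensity). Values and labels are invented.
intensity = np.array([0.2, 0.3, 0.25, 0.8, 0.9, 0.85])
labels    = np.array([0, 0, 0, 1, 1, 1])   # 0 = contented, 1 = distressed

# Learn a decision threshold: the midpoint between the two class means.
threshold = (intensity[labels == 0].mean() + intensity[labels == 1].mean()) / 2

def classify(x, threshold=threshold):
    """Flag a recording as distressed (1) if its intensity exceeds the threshold."""
    return int(x > threshold)

print(threshold)       # 0.55
print(classify(0.3))   # 0 -> contented
print(classify(0.95))  # 1 -> distressed
```

With real recordings there would be many features per clip and a trained classifier instead of a single threshold, but the principle is the same: labeled examples under known conditions define the boundary between the two states.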

An important application for animal language translation is in wildlife protection, as more and more wild species are being pushed to the edge of extinction. Silicon Valley-based Conservation Metrics has been working with researchers in Africa to apply new artificial intelligence techniques to wild elephant protection. The project has collected 900,000 hours of recordings with elephant vocalisations in the Central African Forest. Using deep learning to analyze features in the sound data, the researchers are able to identify the sounds for greetings and other daily communications within a particular elephant herd and, most importantly, the sound the elephants make when poachers are spotted. The project plays an important role in anti-poaching and other conservation efforts.

Although an increasing number of projects are targeting animal language translation, there are also doubters. Some question the depth of animal language, or ask why most technologies attempt to translate animal languages into English, which may be an inappropriate vehicle for expressing primal or simple thoughts. There are also concerns about studies based on supervised learning, where small biases in human-labeled training datasets can exert outsized influence. These are issues researchers can take into consideration in future studies.

When we were kids, we enjoyed stories about people who could understand the language of cats, birds and other animals through meditation or spiritualism. And every pet owner knows there are times when we truly believe we can understand the cat or dog we’ve lived with for years.

There are certainly patterns and features in animal communication, expressed for example in loudness, frequency, and tone, that lie beyond the limits of human auditory perception. The complex layered structure of a neural network, however, excels at extracting such elusive features and patterns from rich sound data. That promise and potential is motivating AI researchers to develop technologies that will enable us to decipher more and more animal language patterns and delve ever deeper into interspecies communication and understanding.

Author: Linyang Yu | Editor: Michael Sarazen

Thinking of contributing to Synced Review? Sharing My Research welcomes scholars to share their own research breakthroughs with global AI enthusiasts.

We know you don’t want to miss any story. Subscribe to our popular Synced Global AI Weekly to get weekly AI updates.

Need a comprehensive review of the past, present and future of modern AI research development? Trends of AI Technology Development Report is out!

2018 Fortune Global 500 Public Company AI Adaptivity Report is out!
Purchase a Kindle-formatted report on Amazon.
Apply for Insight Partner Program to get a complimentary full PDF report.

AI Technology & Industry Review | Share My Research | Twitter: @Synced_Global