Talking To Dolphins Isn't As Far Away As You Might Think — Here's Why
Humans have long been fascinated by the idea of a real-life "Doctor Dolittle," someone who "could talk to the animals, learn their languages, maybe take an animal degree." Pet owners wonder whether dogs and cats understand what we say to them, and whether parrots actually understand the human words they repeat. Many scientists and zoologists have delved into this area of research, teaching and studying communication with a menagerie of animals. Koko the gorilla learned more than 1,000 signs based on American Sign Language and understood roughly 2,000 spoken English words; Alex the African grey parrot communicated using human language and grasped concepts such as shapes, colors, and numbers; and Chaser the border collie knew over 1,000 proper nouns, could categorize her toys, and understood basic grammar.
One animal that comes up repeatedly in any discussion of animal communication and intelligence is the dolphin. In Douglas Adams' beloved book and radio series "The Hitchhiker's Guide to the Galaxy," dolphins are identified as one of the world's smartest creatures (second only to mice), and their final message — sadly misinterpreted by ignorant humans — before departing the about-to-be-destroyed planet Earth is "So long, and thanks for all the fish."
If dolphins really are trying to talk to us, we might soon be able to understand what they're saying, thanks to DolphinGemma. Google teamed up with researchers from the Wild Dolphin Project and the Georgia Institute of Technology to create this large language model focused on dolphins' complex communication.
What is DolphinGemma?
Dolphins communicate with each other using three main types of sounds: whistles, clicks, and burst pulses (groups of clicks). Since 1985, the Wild Dolphin Project has been gathering audio and video recordings of one pod of Atlantic spotted dolphins in the Bahamas, including the ways they communicate with one another. Over the years, researchers have accumulated an impressive collection of vocalizations.
"DolphinGemma was trained on that audio dataset, allowing the model to identify recurring patterns and structures in dolphin vocalizations," reads the DolphinGemma website. "Eventually, we hope the model will be able to predict the next sounds in a given sequence of dolphin noises — much like how LLM models can predict the next word in human languages." In addition to predicting patterns, DolphinGemma aims to interpret the meanings behind the sounds and, ultimately, create a shared vocabulary for dolphins and humans to use. (Next time dolphins try to tell us the world will be destroyed to make way for a hyperspace bypass, maybe we'll understand their warning.)
Work on DolphinGemma is ongoing, and Google intends to make the model openly available for anyone to use once it's released. Dolphins aren't the only animals whose vocalizations scientists have analyzed with AI in pursuit of interspecies communication, however: DeepSqueak was built to detect and classify rodents' ultrasonic squeaks, while MeowTalk aims to translate feline communication. Talking to the animals might just happen sooner than we expect.