Google is training AI to speak dolphin

Big advances in artificial intelligence have renewed some scientists' dream of eventually understanding what dolphins are saying. But what if we humans could also respond?

Now a few researchers think a form of two-way communication with dolphins could be on the horizon.

In collaboration with the Georgia Institute of Technology and the nonprofit Wild Dolphin Project (WDP), Google has announced progress on what the team describes as the first large language model (LLM) for dolphin vocalizations, called DolphinGemma.

The team now wants to see how the AI completes vocalization sequences, much like "when I'm typing into Google and it's finishing my sentence," says WDP founder Denise Herzing.
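As a rough illustration of what "completing a sequence" means here (a toy sketch, not DolphinGemma's actual architecture, which Google has not published), a sequence model predicts the most likely next sound unit given the ones heard so far, the way text autocomplete predicts the next word:

```python
# Hypothetical sketch: completing a sequence of discretized dolphin
# "sound tokens" (whistle/click units are assumed labels). Simple bigram
# counts stand in for a trained large language model.
from collections import Counter, defaultdict

def train_bigrams(sequences):
    """Count which token tends to follow which in recorded vocalizations."""
    follows = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            follows[cur][nxt] += 1
    return follows

def complete(prefix, follows, max_new=3):
    """Extend a partial vocalization with the most likely next tokens."""
    seq = list(prefix)
    for _ in range(max_new):
        candidates = follows.get(seq[-1])
        if not candidates:
            break
        seq.append(candidates.most_common(1)[0][0])
    return seq

# Toy "recordings": each list is one tokenized vocalization.
recordings = [
    ["whistle_A", "click_burst", "whistle_B"],
    ["whistle_A", "click_burst", "whistle_B", "whistle_A"],
]
model = train_bigrams(recordings)
print(complete(["whistle_A"], model))  # e.g. ['whistle_A', 'click_burst', 'whistle_B', ...]
```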

This is an interesting approach, but researchers must take care that they aren't unintentionally training the dolphins, Taylor says.

If the dolphins repeat the sound, "we have to think whether that's actually an understanding of language, or whether it's the same as teaching a dog to sit because they get a reward."

This summer, WDP plans to use Google’s Pixel 9 smartphone to power a platform that can create synthetic dolphin vocalizations and listen to dolphin sounds for a matching “reply.”
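The basic loop of such a platform would be: emit a synthetic whistle, then listen for a dolphin sound that matches it. The following is a minimal sketch of that idea only; every function, label, and threshold here is a made-up stand-in, not the actual software WDP and Google are building for the Pixel 9.

```python
# Hypothetical play-then-listen loop. All helpers are placeholders.
import random
import time

def play_synthetic_whistle(label: str) -> None:
    """Stand-in for emitting a synthetic dolphin whistle underwater."""
    print(f"playing synthetic whistle: {label}")

def hear_matching_sound(label: str) -> bool:
    """Stand-in for a recognizer checking live audio for a matching whistle."""
    return random.random() > 0.9  # placeholder: no real audio matching here

def await_reply(label: str, listen_seconds: float = 10.0) -> bool:
    """Play one synthetic whistle, then listen briefly for a matching 'reply'."""
    play_synthetic_whistle(label)
    deadline = time.time() + listen_seconds
    while time.time() < deadline:
        if hear_matching_sound(label):
            return True
        time.sleep(0.5)
    return False

print(await_reply("whistle_A"))
```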
