Artificial Intelligence has been used for everything from teaching computers to play chess to helping speed ride-sharing services on their way. And now one government agency is using it to track humpback whales in the Pacific.
For more than a decade, the National Oceanic and Atmospheric Administration has been tracking whales by recording them.
But there are challenges – like the sheer volume of data. Researchers have to sift through years of audio. Literally. Years.
“So far we’ve collected over 170,000 hours of data. Let’s put that in real terms. If you were to sit and listen straight, not sleeping, not eating, taking no breaks, it would take you 19 years to listen to all that data,” says Ann Allen, a research oceanographer with NOAA’s Cetacean Research Program at the Pacific Islands Fisheries Science Center.
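Ann's "19 years" figure checks out with back-of-the-envelope arithmetic; here's a quick sketch of the conversion (the 170,000-hour total is from the article, the rest is plain unit math):

```python
# Convert NOAA's reported audio archive into years of nonstop listening.
hours = 170_000
days = hours / 24         # hours of audio -> days of continuous playback
years = days / 365.25     # days -> years (365.25 accounts for leap years)
print(round(years, 1))    # roughly 19 years, matching the figure Ann cites
```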
Her job is essentially to assess the health and status of whales and dolphins around the U.S. Pacific Islands and across the Pacific Ocean. Part of that job is working with all that data.
When Ann first started working at NOAA, she told her dad about the hundreds of terabytes of data she had to analyze.
“His response was ‘Well, why don’t you just get those Google or Shazam people to do it. They learn or can identify human song, I bet they can do it with whale songs.’
And I was a bit taken aback at that. I was like, ‘But that’s not . . . But we don’t do it . . . huh.’ So I went out and looked into whether these Google people would be interested in our data, and they were actually really enthusiastic. They have a lot of machine learning and artificial intelligence techniques that they developed for their commercial purposes, and they were really excited about putting them to use on our scientific data set.”
Ann sent Google hard drives full of humpback whale songs. Humpbacks are difficult to identify because, unlike other whale species, their songs (the way they communicate) can change. So instead of clicks or grunts, you’ll get something that includes growls, high-pitched whistles, and something that sounds like a moan.
So Google, with the help of Ann, used artificial intelligence to recognize and identify various sounds – speeding up the process of analyzing the hundreds of thousands of hours of audio.
“So machine learning is different from normal programming, in that instead of teaching the computer step by step to do a task, it’s actually learning like a human would. By giving it examples of humpback calls, it starts to recognize what’s a humpback whale and what is a ship or other noise, and starts to pick those out from the data.”
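The learn-from-examples idea Ann describes can be sketched in miniature. The real system uses deep networks trained on spectrograms; the toy classifier below is only an illustration of the principle, with made-up three-number "features" per audio clip (hypothetical: peak frequency in kHz, a tonality score, duration in seconds) and a simple nearest-centroid rule:

```python
import math

# Hypothetical labeled examples: each clip is summarized by three numbers
# (peak frequency in kHz, tonality score 0-1, duration in seconds).
# These values are invented for illustration, not real NOAA measurements.
training_data = {
    "humpback": [[0.4, 0.90, 2.0], [0.5, 0.80, 1.5], [0.3, 0.95, 2.5]],
    "ship":     [[0.1, 0.20, 30.0], [0.15, 0.10, 45.0], [0.05, 0.25, 60.0]],
}

def centroid(examples):
    """Average each feature across the labeled examples for one class."""
    n = len(examples)
    return [sum(e[i] for e in examples) / n for i in range(len(examples[0]))]

# "Training": summarize what a typical example of each class looks like.
centroids = {label: centroid(ex) for label, ex in training_data.items()}

def classify(features):
    """Label a new clip by whichever class centroid is closest (Euclidean)."""
    return min(centroids, key=lambda lbl: math.dist(features, centroids[lbl]))

print(classify([0.45, 0.85, 1.8]))  # resembles the humpback examples
```

The point is the workflow, not the model: instead of hand-coding rules for what a humpback sounds like, the program derives its decision boundary from labeled examples, which is what lets it scale to 170,000 hours of audio.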
Right now, this is still a pilot project. But Ann says it’s showing promise.
“This would actually be pretty revolutionary for our field. And it would make it much easier for us to address a lot of questions that have been pressing in the last few years, especially around conservation. I know that Google is very interested in whether or not it could be applied to reducing ship strikes for whales.”
Ann says she hopes the collaboration between NOAA and Google will continue, and that a tool can be developed to identify other species and data sets.
While AI may sound to some like the early stage of computers taking over the world, Ann isn’t worried.
“I definitely don’t worry about the machines taking over our whale data. It’s actually really going to help me do my job and get back to doing what I’m good at, rather than trying to program myself.”