

Photo: Montie lab students seine in the estuary for young fish.

ASPIRE AI funding enables estuarine eavesdropping

 

ASPIRE AI is now accepting proposals

Faculty may submit an ASPIRE AI proposal by 5:00 p.m. on Wednesday, February 18, 2026.

Learn more and apply today

 

There are a lot of ways marine biologists can study estuaries: surveying dolphin populations from a boat, sampling water for temperature and quality, monitoring fish reproduction by counting the abundance of young captured in seines. For Eric Montie and his student researchers, that’s all part of the job description, but what occupies most of their time is poring over spectrograms to annotate vessel noise, fish sounds and dolphin vocalizations.

Montie, professor of biology at the University of South Carolina Beaufort (USCB), runs the Marine Sensory and Neurobiology Lab. And since 2013, he’s been listening in on underwater sounds.

“We use soundscapes as ocean observatories,” says Montie, who founded the Estuarine Soundscape Observatory Network in the Southeast (ESONS). “This approach allows us to eavesdrop on animal behavior across multiple levels of biological complexity, which is especially valuable in estuaries, where visibility is limited.”

There’s a lot to learn from listening to marine organisms and vessel noise. Fish produce sounds to attract mates, while dolphins use vocalizations to communicate and locate prey in murky estuaries. Careful analysis of acoustic data can help identify how underwater soundscapes reflect the impact of hurricanes, tropical storms, impervious surface cover, rainfall, salinity drops, overfishing and vessel traffic on marine life.

With their multiyear dataset, the team has done a remarkable job identifying acoustic patterns in marine life. Now, with over a decade of data from the May River and eight years of data from the Charleston Harbor, they can begin to understand how extreme weather events and human-generated noise are influencing these behaviors.

The challenge: there is a lot of data to sift through.

Montie’s team has 10 passive acoustic recording stations located throughout South Carolina, and each station collects two-minute sound files once an hour. Each year, stations from the May River, Charleston Harbor, Chechessee Creek, Colleton River and Pritchard’s Island produce a total of 87,600 sound files that need to be reviewed and annotated by a team of nine undergraduate and graduate students. Processing a year’s worth of data, even incorporating only half of the files, takes a little over seven months.
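The arithmetic is easy to check. The short Python sketch below is a back-of-the-envelope tally using only the figures reported here (10 stations, one two-minute recording per hour, year-round operation); it reproduces the annual file count and shows how many hours of audio that represents.

# Back-of-the-envelope tally using the figures reported above.
STATIONS = 10
FILES_PER_STATION_PER_DAY = 24       # one two-minute file each hour
DAYS_PER_YEAR = 365
FILE_LENGTH_MIN = 2

files_per_year = STATIONS * FILES_PER_STATION_PER_DAY * DAYS_PER_YEAR
audio_hours = files_per_year * FILE_LENGTH_MIN / 60

print(f"Sound files per year: {files_per_year:,}")      # 87,600
print(f"Hours of audio per year: {audio_hours:,.0f}")   # 2,920

That is nearly 3,000 hours of recordings to work through each year before any annotation even begins.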

That’s what inspired Montie to apply for a grant through ASPIRE AI, an internal funding program under the Office of the Vice President for Research that aims to boost meritorious research projects incorporating artificial intelligence across disciplines.

“We have big questions, but it’s very challenging for us to answer them because we’re spending all our time manually reviewing data,” says Montie. “We can use the data from ESONS to understand impacts on marine life, but we really need AI to help expedite that process.”

The ASPIRE AI funding will support students in Montie’s lab, including Jessica Miller, a USCB student in the computational science master’s program. It will also strengthen the relationship between the marine biology and computational science programs at USCB, where there is a lot of potential for AI to revolutionize data processing.

“Jessica’s goal, with the help of undergraduate researchers, is to build an annotated library of 5,000–10,000 examples for each sound-producing organism in our dataset. To achieve this monumental objective, she will develop an interactive Python tool—or use existing software such as Raven Pro—to manually identify species-specific vocalizations in WAV files and export those annotations and sound clips for machine learning,” Montie says.
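The workflow Montie describes boils down to marking where a call occurs in a recording, then saving both the label and the audio clip for later model training. The sketch below is only an illustration of that step, not the lab’s actual tool: the file names, species label and clip times are hypothetical, and the real annotation work may be done in Raven Pro instead.

# Illustrative sketch of the annotate-and-export step; names, labels and times are hypothetical.
import csv
from scipy.io import wavfile

def export_clip(wav_path, start_s, end_s, label, clip_path, log_path="annotations.csv"):
    """Cut a labeled clip out of a recording and log the annotation for later ML training."""
    rate, audio = wavfile.read(wav_path)
    clip = audio[int(start_s * rate):int(end_s * rate)]
    wavfile.write(clip_path, rate, clip)
    with open(log_path, "a", newline="") as f:
        csv.writer(f).writerow([wav_path, start_s, end_s, label, clip_path])

# Example: a fish call heard 42-45 seconds into one two-minute recording.
export_clip("MayRiver_2025-07-04_2100.wav", 42.0, 45.0,
            "spotted_seatrout", "clips/seatrout_000123.wav")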

Once the annotated library is complete, Miller and the team can use it to teach a computer program to automatically recognize and count different sounds, such as fish calls, dolphin vocalizations and vessel noise, in new recordings. Lightening the load of manual data processing will allow the team to make greater headway on questions about the impact of not only noise but also larger climate trends and events on marine life in the Southeast.
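As a rough illustration of what that automation could look like (not the team’s actual pipeline), a classifier can be trained on the exported clips and then pointed at new recordings. The sketch below assumes the hypothetical annotations.csv layout from the example above and leans on off-the-shelf libraries: librosa for spectrogram features and scikit-learn for the model.

# Illustrative sketch only; the lab's eventual model and feature choices may differ.
import csv
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def features(clip_path, sr=16000):
    """Summarize a clip as mean mel-spectrogram energy per frequency band."""
    y, _ = librosa.load(clip_path, sr=sr)
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=64)
    return librosa.power_to_db(mel).mean(axis=1)   # one 64-value vector per clip

# Load the clips and labels produced during manual annotation.
X, labels = [], []
with open("annotations.csv") as f:
    for wav, start, end, label, clip in csv.reader(f):
        X.append(features(clip))
        labels.append(label)

model = RandomForestClassifier(n_estimators=200).fit(np.array(X), labels)

# New, unreviewed recordings can then be screened automatically.
print(model.predict([features("clips/unreviewed_000456.wav")]))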

That’s just the beginning. With AI-enabled technologies that can recognize the sounds of marine life, Montie envisions future instruments that could alert container ships to the presence of critically endangered species, allowing them to adjust their speed appropriately.

With ASPIRE AI funding in hand and larger external grant proposals in the works with collaborators, ESONS is poised to turn sci-fi dreams of artificial intelligence into real-life analysis of marine environments.

