Coral reefs are among the most diverse ecosystems on our planet. Although they cover less than 1% of the ocean’s surface, an astonishing one-quarter of all marine species rely on these reefs at some point in their lives. With such a rich variety of life concentrated in one area, researchers often find it challenging to accurately assess which species are present and in what numbers.
In a study published in The Journal of the Acoustical Society of America, scientists from the Woods Hole Oceanographic Institution paired passive acoustic monitoring with a neural network to detect fish activity on coral reefs from sound alone.
For many years, passive acoustic monitoring has been employed to observe coral reef dynamics. Typically, an underwater acoustic recorder is placed in the ocean, where it captures audio from the reef over several months. While existing signal processing tools can analyse large volumes of acoustic data, they fall short when it comes to identifying specific sounds, which usually requires painstaking manual review of the data.
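The study itself doesn’t walk through these processing steps, but a common first pass over months of hydrophone audio is simple band-limited energy screening: flag the moments that rise above the background in the frequency range where fish calls sit, then hand those clips to a human reviewer. The Python sketch below illustrates the idea; the function name, frequency band, and threshold are illustrative assumptions, not the researchers’ actual tooling.

```python
# Minimal sketch of first-pass acoustic screening: flag time windows whose
# energy in a fish-call frequency band rises well above the background.
# All parameter choices here are illustrative, not from the study.
import numpy as np
from scipy import signal

def detect_candidate_events(audio, sample_rate, band=(100, 1000),
                            threshold_db=10.0):
    """Return times (s) where in-band power exceeds the median
    background level by `threshold_db` decibels."""
    freqs, times, sxx = signal.spectrogram(audio, fs=sample_rate,
                                           nperseg=1024, noverlap=512)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_power_db = 10 * np.log10(sxx[in_band].sum(axis=0) + 1e-12)
    return times[band_power_db > np.median(band_power_db) + threshold_db]

# Demo: one minute of synthetic noise with a louder 400 Hz burst at 30 s.
rate = 16_000
audio = np.random.randn(60 * rate) * 0.01
audio[30 * rate:31 * rate] += 0.1 * np.sin(2 * np.pi * 400 *
                                           np.arange(rate) / rate)
print(detect_candidate_events(audio, rate))  # times clustered near 30 s
```

Even with screening like this, every flagged clip still has to be inspected and labelled by hand, which is where the tedium comes in.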
“But for the people that are doing that, it’s awful work, to be quite honest,” said author Seth McCammon. “It’s incredibly tedious work. It’s miserable.”
Moreover, this manual analysis is too slow to be practical. With many coral reefs facing threats from climate change and human activities, the ability to quickly identify and monitor changes in reef populations is vital for conservation efforts.
“It takes years to analyse data to that level with humans,” said McCammon. “The analysis of the data in this way is not useful at scale.”
To address this issue, the researchers developed a neural network capable of automatically sifting through vast amounts of acoustic data, analysing audio recordings in real time. Their algorithm matches the accuracy of human experts in recognising acoustic patterns on a reef but operates over 25 times faster, potentially revolutionising ocean monitoring and research practices.
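The article doesn’t describe the network’s internals, so the following is only a generic sketch of how such a detector is often built: a small convolutional network (here in PyTorch) that labels short spectrogram windows as background or fish call. The architecture, layer sizes, and input shape are all assumptions for illustration, not the study’s published design.

```python
# Hypothetical stand-in for a fish-call detector: a small CNN that
# classifies spectrogram windows. Not the study's published architecture.
import torch
import torch.nn as nn

class CallDetector(nn.Module):
    def __init__(self, n_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),   # one value per channel, any input size
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        # spectrogram: (batch, 1, n_freq_bins, n_time_frames)
        return self.classifier(self.features(spectrogram).flatten(1))

# One 128x64 spectrogram window -> logits for background vs. fish call.
model = CallDetector()
print(model(torch.randn(1, 1, 128, 64)).shape)  # torch.Size([1, 2])
```

Because inference on a window this small takes only milliseconds on modest hardware, a trained model of this kind can keep pace with incoming audio rather than lagging years behind it.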
“Now that we no longer need to have a human in the loop, what other sorts of devices—moving beyond just recorders—could we use?” said McCammon.
“Some work that my co-author Aran Mooney is doing involves integrating this type of neural network onto a floating mooring that’s broadcasting real-time updates of fish call counts. We are also working on putting our neural network onto our autonomous underwater vehicle, CUREE, so that it can listen for fish and map out hot spots of biological activity.”
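As a rough illustration of that mooring scenario, the loop below tallies detections from a live audio stream and broadcasts the totals at a fixed interval. The callables `read_window`, `classify`, and `publish` are hypothetical placeholders for the hardware- and telemetry-specific pieces; none of this is the team’s actual code.

```python
# Hypothetical sketch of a real-time call-count broadcaster for a mooring.
import time
from collections import Counter

def broadcast_call_counts(read_window, classify, publish, interval_s=60.0):
    """Tally classified fish calls from a live hydrophone stream and
    publish the counts once per interval, then reset the tally."""
    counts, t0 = Counter(), time.monotonic()
    while True:
        label = classify(read_window())  # e.g. "fish_call" or "background"
        if label != "background":
            counts[label] += 1
        if time.monotonic() - t0 >= interval_s:
            publish(dict(counts))        # e.g. send over a satellite link
            counts, t0 = Counter(), time.monotonic()
```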
This technology also has the potential to solve a long-standing problem in marine acoustic studies: matching each unique call to the species of fish that produces it.
“For the vast majority of species, we haven’t gotten to the point yet where we can say with certainty that a call came from a particular species of fish,” said McCammon. “That’s, at least in my mind, the holy grail we’re looking for. By being able to do fish call detection in real time, we can start to build devices that are able to automatically hear a call and then see what fish are nearby.”
Ultimately, McCammon envisions that this neural network will enable researchers to track fish populations in real time, pinpoint endangered species, and react to emergencies. This innovative technology will assist conservationists in obtaining a better understanding of coral reef health, especially during a time when these ecosystems require significant support.
[Image: CUREE, an autonomous underwater robot, is used by the researchers to collect acoustic data for analysis. Credit: Austin Greene, Woods Hole Oceanographic Institution]