To protect coral reefs and all of their inhabitants, learning as much as possible about them is key. Now, a team of researchers from the University of Exeter has created an algorithm to track coral reef health by listening to the reefs' song.
A coral reef is a complex collective with its own soundscape, produced by the abundance of fish and other creatures that live in and around it. Until now, studying that soundscape has been difficult because it requires painstaking manual analysis of recorded sound.
Using recordings from the Mars Coral Reef Restoration Project, which restores heavily damaged coral reefs in Indonesia, the team trained an AI program to recognize the difference between damaged and healthy reefs.
“Coral reefs are facing multiple threats including climate change, so monitoring their health and the success of conservation projects is vital,” said lead author Ben Williams of the University of Exeter. “One major difficulty is that visual and acoustic surveys of reefs usually rely on labor-intensive methods.”
“Our approach to that problem was to use machine learning – to see whether a computer could learn the song of the reef. Our findings show that a computer can pick up patterns that are undetectable to the human ear. It can tell us faster, and more accurately, how the reef is doing,” adds Williams.
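The core idea, as Williams describes it, is to let a computer learn acoustic patterns that separate healthy reefs from degraded ones. The toy sketch below illustrates that idea with a nearest-centroid classifier on synthetic feature vectors; the feature names and data are invented placeholders, not the study's actual ecoacoustic indices or method.

```python
# Toy illustration of classifying reef recordings from acoustic features.
# The feature vectors below are synthetic placeholders (hypothetical
# "ecoacoustic indices"); the real study uses far richer features and models.

def centroid(vectors):
    """Element-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def classify(sample, healthy_centroid, degraded_centroid):
    """Label a recording's feature vector by its nearest class centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    if dist(sample, healthy_centroid) <= dist(sample, degraded_centroid):
        return "healthy"
    return "degraded"

# Hypothetical index vectors, e.g. [acoustic complexity, snap rate]
healthy_recordings = [[0.9, 0.8], [0.85, 0.75], [0.95, 0.9]]
degraded_recordings = [[0.2, 0.3], [0.25, 0.2], [0.3, 0.35]]

h_c = centroid(healthy_recordings)
d_c = centroid(degraded_recordings)

print(classify([0.8, 0.7], h_c, d_c))    # → healthy
print(classify([0.3, 0.25], h_c, d_c))   # → degraded
```

The point of the sketch is only the workflow: compute numeric features from each recording, learn what "healthy" and "degraded" sound like from labeled examples, then label new recordings automatically.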
Thanks to this method, the health of a coral reef can be identified far more quickly and accurately than with manual analysis. This both improves coral reef health monitoring and increases the speed and ease of conservation work.
“This is a really exciting development. Sound recorders and AI could be used around the world to monitor the health of reefs, and discover whether attempts to protect and restore them are working,” said co-author Dr Tim Lamont. “In many cases, it’s easier and cheaper to deploy an underwater hydrophone on a reef and leave it there than to have expert divers visiting the reef repeatedly to survey it – especially in remote locations.”
If you’d like to read the whole study, you can find it in Ecological Indicators: “Enhancing automated analysis of marine soundscapes using machine learning to combine ecoacoustic indices.”