
Unlocking the soundscape of coral reefs with artificial intelligence: pretrained networks and unsupervised learning win out

  • Ben Williams
  • Santiago M Balvanera
  • Sarab S Sethi
  • Timothy A C Lamont
  • Jamaluddin Jompa
  • Mochyudho Prasetya
  • Laura Richardson
  • Lucille Chapuis
  • Emma Weschke
  • Andrew Hoey
  • Ricardo Beldade
  • Suzanne C Mills
  • Anne Haguenauer
  • Frederic Zuberer
  • Stephen D Simpson
  • David Curnick
  • Kate E Jones
  • University College London
  • Grand Challenges in Ecosystem and the Environment Initiative, Imperial College London, Silwood Park Campus, Ascot, Berkshire SL5 7PY, UK [email protected].
  • Lancaster University
  • Mattersey Graduate School
  • MARS Sustainable Solutions
  • University of Bristol
  • Australian Research Council (ARC) Centre of Excellence for Coral Reef Studies, James Cook University, Townsville, Queensland
  • PSL Research University
  • Zoological Society of London

Research output: Contribution to journal › Article › peer-review

4 Downloads (Pure)

Abstract

Passive acoustic monitoring can offer insights into the state of coral reef ecosystems at low cost and over extended periods. Comparison of whole-soundscape properties can rapidly deliver broad insights from acoustic data, in contrast to the detailed but time-consuming analysis of individual bioacoustic events. However, a lack of effective automated analysis for whole-soundscape data has impeded progress in this field. Here, we show that machine learning (ML) can be used to unlock greater insights from reef soundscapes. We showcase this on a diverse set of tasks using three biogeographically independent datasets, each containing fish community (high or low), coral cover (high or low) or depth zone (shallow or mesophotic) classes. We show that supervised learning can be used to train models that identify ecological classes and individual sites from whole soundscapes. However, we report that unsupervised clustering achieves this whilst providing a more detailed understanding of ecological and site groupings within soundscape data. We also compare three different approaches for extracting feature embeddings from soundscape recordings for input into ML algorithms: acoustic indices commonly used by soundscape ecologists, a pretrained convolutional neural network (P-CNN) trained on 5.2 million hours of YouTube audio, and CNNs trained on each individual task (T-CNN). Although the T-CNN performs marginally better across tasks, we reveal that the P-CNN offers a powerful tool for generating insights from marine soundscape data, as it requires orders of magnitude less computational resources whilst achieving near-comparable performance to the T-CNN, with significant performance improvements over the acoustic indices. Our findings have implications for soundscape ecology in any habitat. [Abstract copyright: Copyright: © 2025 Williams et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.]
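The pipeline the abstract describes — extracting fixed-length feature embeddings from recordings and clustering them without labels — can be sketched as follows. This is a minimal illustration, not the authors' code: `pretrained_embed` is a hypothetical stand-in for a real pretrained audio network (a fixed random projection of a spectral summary), and the clustering is a bare-bones k-means in NumPy on simulated recordings.

```python
import numpy as np

def pretrained_embed(waveform, dim=128, seed=0):
    """Stand-in for a pretrained audio CNN: maps a 1-D waveform to a
    fixed-length embedding via a fixed random projection of its log
    spectrum. A real pipeline would call a pretrained network here."""
    spectrum = np.abs(np.fft.rfft(waveform, n=1024))
    log_spec = np.log1p(spectrum)
    rng = np.random.default_rng(seed)  # same seed -> same projection for every clip
    proj = rng.standard_normal((log_spec.size, dim))
    return log_spec @ proj

def kmeans(X, k, n_iter=50):
    """Bare-bones k-means with deterministic farthest-point initialisation;
    returns a cluster label for each row of X."""
    centers = [X[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    centers = np.array(centers)
    for _ in range(n_iter):
        dists = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Simulated recordings: two "sites" with different dominant frequencies
rng = np.random.default_rng(42)
t = np.arange(8000) / 8000.0
recordings = [np.sin(2 * np.pi * f * t) + 0.1 * rng.standard_normal(t.size)
              for f in [200] * 5 + [900] * 5]

X = np.stack([pretrained_embed(w) for w in recordings])
labels = kmeans(X, k=2)
# Recordings from the same simulated site should share a cluster
print(labels)
```

In the paper's setting, the embedding step is where the three approaches differ (acoustic indices, P-CNN features, or task-trained T-CNN features); the downstream clustering or classifier can stay the same, which is what makes the comparison clean.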
Original language: English
Pages (from-to): e1013029
Journal: PLoS Computational Biology
Volume: 21
Issue number: 4
Early online date: 28 Apr 2025
Digital Object Identifiers (DOIs)
Status: E-pub ahead of print - 28 Apr 2025

UN SDGs

This output contributes to the following UN Sustainable Development Goal(s):

  1. SDG 14 - Life Below Water
