In this paper, we present SonarSplat, a novel Gaussian splatting framework for imaging sonar that demonstrates realistic novel view synthesis and models acoustic streaking phenomena. Our method represents the scene as a set of 3D Gaussians with acoustic reflectance and saturation properties.
We develop a method to efficiently rasterize Gaussians into a range/azimuth image that is faithful to the acoustic image formation model of imaging sonar. In particular, we introduce a novel approach to modeling azimuth streaking within a Gaussian splatting framework.
We evaluate SonarSplat using datasets of sonar images collected from an underwater robotic platform in a controlled test tank and in a real-world river environment. Compared to the state-of-the-art, SonarSplat offers improved image synthesis capabilities (+3.2 dB PSNR) and more accurate 3D reconstruction (52% lower Chamfer Distance). We also demonstrate that SonarSplat can be leveraged for azimuth streak removal.
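
To make the range/azimuth image formation concrete, below is a minimal sketch (not the authors' implementation) of splatting 3D Gaussian centers into a range/azimuth intensity image. It assumes the sonar sits at the origin with a known maximum range and azimuth field of view; the function name, parameters, and the random test scene are hypothetical, and the paper's Gaussian covariances, elevation integration, saturation model, and streak modeling are all omitted.

# Minimal sketch (not the authors' implementation): bin 3D Gaussian centers
# into a range/azimuth image, assuming the sonar is at the origin with
# x forward, y left, z up. Gaussian extents, elevation integration, and the
# paper's saturation/streak model are intentionally left out.
import numpy as np

def splat_range_azimuth(means, intensities, n_range=512, n_azimuth=256,
                        max_range=30.0, fov_rad=np.deg2rad(130)):
    """means: (N, 3) Gaussian centers in the sonar frame.
    intensities: (N,) acoustic reflectance values in [0, 1]."""
    img = np.zeros((n_range, n_azimuth), dtype=np.float32)

    r = np.linalg.norm(means, axis=1)          # slant range to each center
    az = np.arctan2(means[:, 1], means[:, 0])  # azimuth angle of each center

    # Keep only points inside the sensor's range and azimuth field of view.
    keep = (r < max_range) & (np.abs(az) < fov_rad / 2)
    r, az, w = r[keep], az[keep], intensities[keep]

    # Discretize into range/azimuth bins and accumulate reflectance.
    ri = (r / max_range * (n_range - 1)).astype(int)
    ai = ((az + fov_rad / 2) / fov_rad * (n_azimuth - 1)).astype(int)
    np.add.at(img, (ri, ai), w)

    return np.clip(img, 0.0, 1.0)              # crude saturation clamp

# Example: 1000 random scatterers in front of the sonar (synthetic test data).
rng = np.random.default_rng(0)
pts = rng.uniform([1, -5, -2], [25, 5, 2], size=(1000, 3))
image = splat_range_azimuth(pts, rng.uniform(0, 1, size=1000))
print(image.shape)  # (512, 256)

In the sketch each Gaussian contributes to a single bin; the paper's rasterizer instead spreads each Gaussian's footprint over neighboring range/azimuth cells and composites contributions along the beam, which is what enables modeling effects such as azimuth streaking.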
@ARTICLE{11223217,
author={Sethuraman, Advaith V. and Rucker, Max and Bagoren, Onur and Kung, Pou-Chun and Amutha, Nibarkavi N.B. and Skinner, Katherine A.},
journal={IEEE Robotics and Automation Letters},
title={SonarSplat: Novel View Synthesis of Imaging Sonar via Gaussian Splatting},
year={2025},
volume={},
number={},
pages={1-8},
keywords={Sonar;Three-dimensional displays;Azimuth;Imaging;Acoustics;Rendering (computer graphics);Reflectivity;Neural radiance field;Robots;Covariance matrices;Mapping;Deep Learning for Visual Perception;Marine Robotics},
doi={10.1109/LRA.2025.3627089}}