
lianjinke/SonarSAM


SonarSAM

This work introduces the Segment-Anything-Model (SAM) to sonar images. We conduct a comprehensive investigation of fine-tuning methods for SAM, including LoRA and visual prompt tuning, and provide a framework that integrates these fine-tuning methods for side-by-side comparison. If this project is helpful to your research, please consider citing our paper.

@article{wang2023sonarsam,
  title={When SAM Meets Sonar Images},
  author={Wang, Lin and Ye, Xiufen and Zhu, Liqiang and Wu, Weijie and Zhang, Jianguo and Xing, Huiming and Hu, Chao},
  journal={arXiv preprint arXiv:2306.14109},
  year={2023}
}
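To illustrate the LoRA idea investigated in this work, here is a minimal, dependency-free sketch of the LoRA update rule (the function names, shapes, and values are purely illustrative assumptions, not taken from SonarSAM's actual implementation): a frozen weight matrix W is augmented with a trainable low-rank product B·A, so only A and B need gradients during fine-tuning.

```python
# Minimal sketch of the LoRA update rule:
#   y = x @ (W + (alpha / r) * B @ A)
# with W frozen and only the low-rank factors A, B trained.
# All names and shapes are illustrative, not SonarSAM's code.

def matmul(X, Y):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

def add(X, Y):
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def scale(X, s):
    return [[s * a for a in row] for row in X]

def lora_forward(x, W, A, B, alpha=1.0, r=1):
    """Linear layer whose frozen weight W is augmented by a rank-r update B @ A."""
    delta = scale(matmul(B, A), alpha / r)   # low-rank weight update
    return matmul(x, add(W, delta))

# Example: 2x2 frozen identity weight, rank-1 update.
W = [[1.0, 0.0], [0.0, 1.0]]   # frozen base weight
B = [[1.0], [0.0]]             # shape (2, r)
A = [[0.0, 1.0]]               # shape (r, 2)
x = [[1.0, 2.0]]
print(lora_forward(x, W, A, B))   # → [[1.0, 3.0]]
```

Because only A and B (2·r·d parameters instead of d²) are updated, LoRA keeps the pretrained SAM backbone intact while adapting it cheaply to the sonar domain.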

Update

  • 2023-06-30 Support fine-tuning with LoRA on the Mobile SAM backbone.
  • 2023-06-29 Support full fine-tuning on the Mobile SAM backbone.

Dataset

This work uses the Marine Debris dataset, available at Forward-Looking Sonar Marine Debris Datasets.

Training

  • Box-prompt segmentation
python train_SAM_box.py --config ./configs/sam_box.yaml
  • Semantic segmentation
python train_SAM.py --config ./configs/sam.yaml
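The YAML files passed via --config ship with the repository; as a purely hypothetical illustration (every key name below is an assumption, not the actual schema in configs/), such a config typically pins down the checkpoint, the fine-tuning strategy, and the training hyper-parameters:

```yaml
# Hypothetical example only -- key names are assumptions, not SonarSAM's schema.
model:
  backbone: mobile_sam          # or one of the original SAM ViT backbones
  checkpoint: ./weights/mobile_sam.pt
  finetune: lora                # e.g. lora | visual_prompt | full
  lora_rank: 4
train:
  epochs: 100
  batch_size: 8
  lr: 1.0e-4
data:
  root: ./data/marine_debris
```

Consult the actual files under configs/ (e.g. sam_box.yaml, sam.yaml) for the supported options.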

Acknowledgment

This project was developed based on the following awesome codebases.
