UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation

UniUSNet is a universal framework for ultrasound image classification and segmentation, featuring:

  • A novel promptable module for incorporating detailed information into the model's learning process.
  • Versatility across various ultrasound natures, anatomical positions, and input types, with proficiency in both segmentation and classification tasks.
  • Strong generalization capabilities demonstrated through zero-shot and fine-tuning experiments on new datasets.

For more details, see the accompanying paper and the Project Page.

UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation
Zehui Lin, Zhuoneng Zhang, Xindi Hu, Zhifan Gao, Xin Yang, Yue Sun, Dong Ni, Tao Tan. arXiv, Jun 3, 2024. https://arxiv.org/abs/2406.01154

Installation

  • Clone this repository.
git clone https://github.com/Zehui-Lin/UniUSNet.git
cd UniUSNet
  • Create a new conda environment.
conda create -n UniUSNet python=3.10
conda activate UniUSNet
  • Install the required packages.
pip install -r requirements.txt
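
As an optional sanity check (not part of the original instructions), you can verify that PyTorch imported correctly and whether a CUDA GPU is visible:

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"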

Data

data
├── classification
│   ├── UDIAT
│   │   ├── 0
│   │   │   ├── 000001.png
│   │   │   └── ...
│   │   ├── 1
│   │   │   ├── 000100.png
│   │   │   └── ...
│   │   ├── config.yaml
│   │   ├── test.txt
│   │   ├── train.txt
│   │   └── val.txt
│   └── ...
└── segmentation
    ├── BUSIS
    │   ├── config.yaml
    │   ├── imgs
    │   │   ├── 000001.png
    │   │   └── ...
    │   ├── masks
    │   │   ├── 000001.png
    │   │   └── ...
    │   ├── test.txt
    │   ├── train.txt
    │   └── val.txt
    └── ...
  • Please refer to the data_demo folder for examples.
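
As an optional sanity check (the paths below assume the BUSIS layout shown above), you can confirm that the image and mask counts match and see how many cases each split file lists:

wc -l data/segmentation/BUSIS/train.txt data/segmentation/BUSIS/val.txt data/segmentation/BUSIS/test.txt
ls data/segmentation/BUSIS/imgs | wc -l
ls data/segmentation/BUSIS/masks | wc -l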

Training

We use torch.distributed for multi-GPU training (single-GPU training is also supported). To train the model, run the following command:

python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_train.py --output_dir exp_out/trial_1 --prompt
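
To train on multiple GPUs, increase --nproc_per_node to the number of available GPUs; for example, a four-GPU run with the same flags and output directory as above:

python -m torch.distributed.launch --nproc_per_node=4 --master_port=1234 omni_train.py --output_dir exp_out/trial_1 --prompt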

Testing

To test the model, run the following command:

python -m torch.distributed.launch --nproc_per_node=1 --master_port=1234 omni_test.py --output_dir exp_out/trial_1 --prompt

Checkpoints

  • You can download the pre-trained checkpoints from BaiduYun.

Citation

If you find this work useful, please consider citing:

@article{lin2024uniusnet,
  title={UniUSNet: A Promptable Framework for Universal Ultrasound Disease Prediction and Tissue Segmentation},
  author={Lin, Zehui and Zhang, Zhuoneng and Hu, Xindi and Gao, Zhifan and Yang, Xin and Sun, Yue and Ni, Dong and Tan, Tao},
  journal={arXiv preprint arXiv:2406.01154},
  year={2024}
}

Acknowledgements

This repository is based on the Swin-Unet repository. We thank the authors for their contributions.