Panoptic-DeepLab is a state-of-the-art bottom-up method for panoptic segmentation, where the goal is to assign a semantic label (e.g., person, dog, cat) to every pixel in the input image, as well as an instance label (e.g., an id of 1, 2, 3, etc.) to pixels belonging to thing classes.
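To make that output format concrete, here is a minimal sketch (illustrative class ids and values only, not this repo's API) of a panoptic result as a pair of semantic and instance maps, plus the single-map `label_divisor` encoding commonly used by panoptic segmentation codebases:

```python
import numpy as np

# Hypothetical 4x4 prediction with one "sky" stuff region and two "person"
# instances (class ids and values are made up for illustration).
semantic = np.array([[0, 0, 0, 0],
                     [0, 1, 1, 0],
                     [1, 1, 1, 1],
                     [1, 1, 1, 1]])  # 0 = sky (stuff), 1 = person (thing)
instance = np.array([[0, 0, 0, 0],
                     [0, 1, 2, 0],
                     [1, 1, 2, 2],
                     [1, 1, 2, 2]])  # stuff pixels get id 0; each person a unique id

# A common single-map encoding:
# panoptic_id = semantic_label * label_divisor + instance_id
label_divisor = 1000
panoptic = semantic * label_divisor + instance
```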
This is the PyTorch re-implementation of our CVPR 2020 paper: *Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation*.
- This is a re-implementation of Panoptic-DeepLab; it is not guaranteed to reproduce all numbers in the paper. Please refer to the original numbers from *Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation* when making comparisons.
- We release a detailed technical report with implementation details and supplementary analysis of Panoptic-DeepLab. In particular, we find that center prediction is almost perfect, and that the bottleneck of bottom-up methods still lies in semantic segmentation (a sketch of the center-based grouping step follows this list).
- It is powered by the PyTorch deep learning framework.
- It can be trained on as few as four 1080 Ti GPUs (no need for 32 TPUs!).
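For context on the center-prediction finding above: the bottom-up grouping step is simple. Every thing pixel predicts an offset pointing to its instance center, and the pixel is assigned to the nearest detected center. Below is a minimal PyTorch sketch of that step, assuming instance centers have already been extracted from the center heatmap; the function name and tensor layout here are illustrative, not this repo's exact API.

```python
import torch

def group_pixels(centers, offsets):
    """Assign every pixel to its nearest predicted instance center.

    centers: (K, 2) tensor of (y, x) coordinates of detected instance centers.
    offsets: (2, H, W) tensor; offsets[:, y, x] points from pixel (y, x)
        toward the center of the instance it belongs to.
    Returns an (H, W) tensor of instance ids in 1..K.
    """
    _, height, width = offsets.shape
    ys = torch.arange(height).view(height, 1).expand(height, width)
    xs = torch.arange(width).view(1, width).expand(height, width)
    coords = torch.stack([ys, xs], dim=0).float()   # (2, H, W) pixel grid
    ctr_loc = coords + offsets                      # where each pixel points
    ctr_loc = ctr_loc.view(2, -1).t().unsqueeze(1)  # (H*W, 1, 2)
    # Distance from each pixel's predicted center location to every detected center.
    dists = torch.norm(ctr_loc - centers.float().unsqueeze(0), dim=2)  # (H*W, K)
    return (dists.argmin(dim=1) + 1).view(height, width)

# Toy usage: two centers, zero offsets -> each pixel joins its nearest center.
centers = torch.tensor([[2.0, 1.0], [2.0, 6.0]])
offsets = torch.zeros(2, 4, 8)
print(group_pixels(centers, offsets))
```

In the full pipeline, this assignment is applied only to pixels predicted as thing classes by the semantic head, and centers are obtained by keypoint-style non-maximum suppression (max pooling) on the center heatmap.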
- Installation: see INSTALL.md.
- Getting started: see GETTING_STARTED.md.
- Model zoo: see MODEL_ZOO.md.
- Changelog: see the changelog.
If you find this code helpful in your research or wish to refer to the baseline results, please use the following BibTeX entries.
@inproceedings{cheng2020panoptic,
  title={Panoptic-DeepLab: A Simple, Strong, and Fast Baseline for Bottom-Up Panoptic Segmentation},
  author={Cheng, Bowen and Collins, Maxwell D and Zhu, Yukun and Liu, Ting and Huang, Thomas S and Adam, Hartwig and Chen, Liang-Chieh},
  booktitle={CVPR},
  year={2020}
}

@inproceedings{cheng2019panoptic,
  title={Panoptic-DeepLab},
  author={Cheng, Bowen and Collins, Maxwell D and Zhu, Yukun and Liu, Ting and Huang, Thomas S and Adam, Hartwig and Chen, Liang-Chieh},
  booktitle={ICCV COCO + Mapillary Joint Recognition Challenge Workshop},
  year={2019}
}
We have used utility functions from other wonderful open-source projects; we would especially like to thank their authors.
Contact: Bowen Cheng (bcheng9 AT illinois DOT edu), openseg-group (yuyua AT microsoft DOT com).