This repository contains the main codebase for the paper: Unsupervised Image Decomposition with Phase-Correlation Networks. [PDF]
The repository allows you to reproduce the experiments and results from the paper.
To download the code, fork the repository or clone it using the following command:
git clone [email protected]:angelvillar96/Unsupervised-Decomposition-PCDNet.git
To get the repository running, you will need several Python packages, e.g., NumPy, OpenCV, PyTorch, and Matplotlib.
You can install them all easily, while avoiding dependency issues, by creating the conda environment from the file included in the repository. To do so, run the following commands from the Conda Command Window or from a terminal:
$ conda env create -f environment.yml
$ conda activate PCDNet
Note: This step might take a few minutes.
The following tree diagram displays the detailed directory structure of the project. Some directory names and paths can be modified in the CONFIG File.
Unsupervised-Decomposition-PCDNet
|
├── resources/
|
├── datasets/
| ├── AAD/
| ├── Tetrominoes/
| └── cars/
|
├── models/
|
├── src/
│ ├── data/
│ ├── lib/
│ ├── models/
│ ├── notebooks/
│ ├── 01_create_experiment.py
│ ├── 02_train.py
│ ├── 03_evaluate.py
│ └── 04_generate_figures.py
|
├── environment.yml
└── README.md
Now, we give a short overview of the different directories:
- resources/: Images used in the README.md file
- datasets/: Image datasets used in the paper. They can be downloaded from here
- models/: Exemplary experiments and pretrained models
- src/: Code to reproduce the results from the paper
In this section, we explain how to use the repository to reproduce the experiments from the paper.
For creating an experiment, run
$ python 01_create_experiment.py [-h] -d EXP_DIRECTORY
This command automatically generates a directory under the specified EXP_DIRECTORY, containing a JSON file with the experiment parameters and subdirectories for models and plots.
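As a rough illustration of the layout that the experiment-creation step produces, the sketch below builds a directory with an experiment_params.json file and models/ and plots/ subdirectories. The parameter names inside the JSON are purely illustrative; the actual keys are defined by the repository's CONFIG file.

```python
import json
import os
import tempfile

def create_experiment(exp_directory):
    """Create an experiment folder with a parameters file and
    subdirectories for models and plots, mimicking the layout that
    01_create_experiment.py produces (field names are illustrative)."""
    os.makedirs(os.path.join(exp_directory, "models"), exist_ok=True)
    os.makedirs(os.path.join(exp_directory, "plots"), exist_ok=True)
    params = {
        "dataset": "Tetrominoes",  # illustrative parameter names
        "num_epochs": 100,
        "learning_rate": 1e-3,
    }
    params_file = os.path.join(exp_directory, "experiment_params.json")
    with open(params_file, "w") as f:
        json.dump(params, f, indent=4)
    return params_file

# Example: create a throwaway experiment directory
exp_dir = os.path.join(tempfile.mkdtemp(), "my_experiment")
params_file = create_experiment(exp_dir)
print(os.path.basename(params_file))  # experiment_params.json
```

After creation, you would edit the JSON file to set the parameters before launching training.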
Once the experiment is initialized and the parameters in the experiment_params.json file are set, a PCDNet model can be trained by running the following command:
$ CUDA_VISIBLE_DEVICES=0 python 02_train.py -d YOUR_EXP_DIRECTORY [--checkpoint CHECKPOINT]
Additionally, you can resume training from a checkpoint using the --checkpoint argument.
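The actual checkpoint format is handled by PCDNet's training code; the following is only a generic sketch of the save/resume pattern, with hypothetical helper names and a plain dict standing in for the model's state:

```python
import os
import pickle
import tempfile

def save_checkpoint(path, epoch, model_state):
    """Store the training state so a run can be resumed later."""
    with open(path, "wb") as f:
        pickle.dump({"epoch": epoch, "model_state": model_state}, f)

def load_checkpoint(path):
    """Restore the training state; training continues at epoch + 1."""
    with open(path, "rb") as f:
        ckpt = pickle.load(f)
    return ckpt["epoch"], ckpt["model_state"]

# Example: save at epoch 10, then resume
ckpt_path = os.path.join(tempfile.mkdtemp(), "checkpoint.pth")
save_checkpoint(ckpt_path, epoch=10, model_state={"weights": [0.1, 0.2]})
epoch, state = load_checkpoint(ckpt_path)
print(epoch + 1)  # 11
```

Passing --checkpoint to the training script follows the same idea: the stored state is loaded and optimization picks up where it left off instead of starting from scratch.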
Run the following command to generate several figures, including the learned object and mask prototypes, reconstructions, segmentations, and decompositions.
$ CUDA_VISIBLE_DEVICES=0 python 04_generate_figures.py [-h] -d EXP_DIRECTORY [--checkpoint CHECKPOINT]
optional arguments:
-h, --help show this help message and exit
-d EXP_DIRECTORY, --exp_directory EXP_DIRECTORY
Path to the experiment directory
--checkpoint CHECKPOINT
Relative path to the model to evaluate
For example:
$ CUDA_VISIBLE_DEVICES=0 python 04_generate_figures.py -d Tetrominoes --checkpoint tetrominoes_model.pth
Please consider citing our paper if you find our work or our repository helpful.
@inproceedings{villar2021PCDNet,
title={Unsupervised Image Decomposition with Phase-Correlation Networks},
author={Villar-Corrales, A. and Behnke, S.},
booktitle={17th International Conference on Computer Vision Theory and Applications (VISAPP)},
year={2022}
}
This repository is maintained by Angel Villar-Corrales.
In case of any questions or problems regarding the project or repository, do not hesitate to contact the authors at [email protected].