Merge pull request #80 from CosmoStat/feature_documentation
Feature documentation
sfarrens authored Nov 7, 2023
2 parents 425cee7 + a58fd20 commit 25dfa3a
Showing 235 changed files with 18,266 additions and 5,822 deletions.
44 changes: 44 additions & 0 deletions .github/workflows/cd.yml
@@ -0,0 +1,44 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CD

on:
workflow_dispatch:
push:
branches:
- feature_documentation
- dummy_main


jobs:
docs:
name: Deploy API documentation
runs-on: [ubuntu-latest]

steps:
- name: Checkout
uses: actions/checkout@v3

- name: Set up Python 3.10.5
uses: actions/setup-python@v3
with:
python-version: "3.10.5"

- name: Check Python Version
run: python --version

- name: Install dependencies
run: |
python -m pip install ".[docs]"
- name: Build API documentation
run: |
sphinx-apidoc -Mfeo docs/source src/wf_psf
sphinx-build docs/source docs/build
- name: Deploy API documentation
uses: peaceiris/[email protected]
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
publish_dir: docs/build
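The two documentation build steps in the workflow above can be reproduced locally. The sketch below is a hypothetical helper, assuming Sphinx is installed and the script runs from the repository root; it skips any step whose tool or input tree is missing. In `-Mfeo`: `-M` puts module pages before submodule pages, `-f` overwrites existing files, `-e` gives each module its own page, and `-o` sets the output directory.

```python
import os
import shutil
import subprocess

# The same two steps the CD workflow runs: generate API stubs, then build HTML.
steps = [
    ["sphinx-apidoc", "-M", "-f", "-e", "-o", "docs/source", "src/wf_psf"],
    ["sphinx-build", "docs/source", "docs/build"],
]

# Only attempt the build when the repository layout is actually present.
have_repo = os.path.isdir("src/wf_psf") and os.path.isdir("docs/source")

for cmd in steps:
    if shutil.which(cmd[0]) is None or not have_repo:
        print(f"skipping: {' '.join(cmd)}")
    else:
        subprocess.run(cmd, check=True)
```

Running this before pushing catches Sphinx errors locally instead of in the Actions log; the deploy step itself (peaceiris/actions-gh-pages) only makes sense inside the workflow, where `GITHUB_TOKEN` exists.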
2 changes: 1 addition & 1 deletion .github/workflows/ci.yml
@@ -1,7 +1,7 @@
# This workflow will install Python dependencies, run tests and lint with a single version of Python
# For more information see: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: CI-test
name: CI

on:
pull_request:
10 changes: 8 additions & 2 deletions .gitignore
@@ -5,6 +5,7 @@ __pycache__/

# Remove method comparison data
method-comparison/compatible-datasets/*
tf_notebooks

# Log files from slurm
*.err
@@ -66,6 +67,7 @@ coverage.xml
*.py,cover
.hypothesis/
.pytest_cache/
src/wf_psf/pytest.xml

# Translations
*.mo
@@ -86,6 +88,11 @@ instance/

# Sphinx documentation
docs/_build/
docs/source/wf_psf*.rst
docs/source/_static/file.png
docs/source/_static/images/logo_colab.png
docs/source/_static/minus.png
docs/source/_static/plus.png

# PyBuilder
target/
@@ -144,5 +151,4 @@ dmypy.json
# Pyre type checker
.pyre/

# UML plots
*.png

108 changes: 5 additions & 103 deletions README.md
@@ -3,109 +3,11 @@
<h1 align='center'>WaveDiff</h1>
<h2 align='center'>A differentiable data-driven wavefront-based PSF modelling framework.</h2>

WaveDiff is a differentiable PSF modelling pipeline constructed with [Tensorflow](https://github.com/tensorflow/tensorflow). It was developed at the [CosmoStat lab](https://www.cosmostat.org) at CEA Paris-Saclay.

See the [documentation](https://cosmostat.github.io/wf-psf/) for details on how to install and run WaveDiff.

This repository includes:
- A differentiable PSF model entirely built in [Tensorflow](https://github.com/tensorflow/tensorflow).
- A numpy-based PSF simulator [here](https://github.com/tobias-liaudat/wf-psf/blob/main/wf_psf/SimPSFToolkit.py).
- A [numpy-based PSF simulator](https://github.com/CosmoStat/wf-psf/tree/dummy_main/src/wf_psf/sims).
- All the scripts, jobs and notebooks required to reproduce the results in [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) and [arXiv:2111.12541](https://arxiv.org/abs/2111.12541).

~~For more information on how to use the WaveDiff model through configurable scripts see the `long-runs` directory's [README](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/README.md).~~ (Scripts will become obsolete with the next release.)

## Proposed framework

A schematic of the proposed framework can be seen below. The PSF model is estimated (trained) using star observations in the field-of-view.

<img height=300 src="assets/PSF_model_diagram_v6.png" >

<!-- Visual reconstruction example of the WaveDiff-original PSF model trained on a simplified Euclid-like setting.
<img height=800 src="assets/PSF_reconstruction_example.png" > -->


## Requirements
- [numpy](https://github.com/numpy/numpy) [>=1.19.2]
- [scipy](https://github.com/scipy/scipy) [>=1.5.2]
- [TensorFlow](https://www.tensorflow.org/) [==2.4.1]
- [TensorFlow Addons](https://github.com/tensorflow/addons) [==0.12.1]
- [Astropy](https://github.com/astropy/astropy) [==4.2]
- [zernike](https://github.com/jacopoantonello/zernike) [==0.0.31]
- [opencv-python](https://github.com/opencv/opencv-python) [>=4.5.1.48]
- [pillow](https://github.com/python-pillow/Pillow) [>=8.1.0]
- [galsim](https://github.com/GalSim-developers/GalSim) [>=2.3.1]

Optional packages:
- [matplotlib](https://github.com/matplotlib/matplotlib) [=3.3.2]
- [seaborn](https://github.com/mwaskom/seaborn) [>=0.11]

## Install

`wf-psf` is pure python and can be easily installed with `pip`. After cloning the repository, run the following commands:

```bash
$ cd wf-psf
$ git checkout dummy_main
$ pip install .
```

The package can then be imported in Python as `import wf_psf as wf`. ~~We recommend using the release `1.2.0` for stability as the current main branch is under development.~~

## Running `WaveDiff`

To run `WaveDiff`, we prepared a step-by-step [instruction guide](https://github.com/CosmoStat/wf-psf/wiki/Getting-started-tutorial).

[Read the tutorial to get started!](https://github.com/CosmoStat/wf-psf/wiki/Getting-started-tutorial)

## Reproducible research

#### [arXiv:2203.04908](http://arxiv.org/abs/2203.04908) Rethinking data-driven point spread function modeling with a differentiable optical model (2022)
_Submitted._

- Use the release 1.2.0.
- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP).
- The trained PSF models are found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/article_IOP/data/models).
- The input PSF field can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/data).
- The script used to generate the input PSF field is [this one](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/LR-PSF-field-gen-coherentFields.py).
- The code required to run the comparison against pixel-based PSF models is in [this directory](https://github.com/tobias-liaudat/wf-psf/tree/main/method-comparison).
- The training of the models was done using [this script](https://github.com/tobias-liaudat/wf-psf/blob/main/long-runs/train_eval_plot_script_click.py). In order to match the script's option for the different models with the article you should follow:
- `poly->WaveDiff-original`
- `graph->WaveDiff-graph`
- `mccd->WaveDiff-Polygraph`

_Note: To run the comparison to other PSF models you need to install them first. See [RCA](https://github.com/CosmoStat/rca), [PSFEx](https://github.com/astromatic/psfex) and [MCCD](https://github.com/CosmoStat/mccd)._


#### [arXiv:2111.12541](https://arxiv.org/abs/2111.12541) Rethinking the modeling of the instrumental response of telescopes with a differentiable optical model (2021)
_NeurIPS 2021 Workshop on Machine Learning and the Physical Sciences._

- Use the release 1.2.0.
- All the scripts, jobs and notebooks to reproduce the figures from the article can be found [here](https://github.com/tobias-liaudat/wf-psf/tree/main/papers/Neurips2021_ML4Physics_workshop).



## Citation

If you use `wf-psf` in a scientific publication, we would appreciate citations to the following paper:

*Rethinking data-driven point spread function modeling with a differentiable optical model*, T. Liaudat, J.-L. Starck, M. Kilbinger, P.-A. Frugier, [arXiv:2203.04908](http://arxiv.org/abs/2203.04908), 2022.


The BibTeX citation is the following:
```
@misc{https://doi.org/10.48550/arxiv.2203.04908,
doi = {10.48550/ARXIV.2203.04908},
url = {https://arxiv.org/abs/2203.04908},
author = {Liaudat, Tobias and Starck, Jean-Luc and Kilbinger, Martin and Frugier, Pierre-Antoine},
keywords = {Instrumentation and Methods for Astrophysics (astro-ph.IM), Computer Vision and Pattern Recognition (cs.CV), FOS: Physical sciences, FOS: Physical sciences, FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Rethinking data-driven point spread function modeling with a differentiable optical model},
publisher = {arXiv},
year = {2022},
copyright = {arXiv.org perpetual, non-exclusive license}
}
```

6 changes: 3 additions & 3 deletions config/data_config.yaml
@@ -1,10 +1,10 @@
# Training and test datasets for training and/or metrics evaluation
# Training and test data sets for training and/or metrics evaluation
data:
training:
# Specify directory path to data; Default setting is /path/to/repo/data
data_dir: data/coherent_euclid_dataset/
file: train_Euclid_res_200_TrainStars_id_001.npy
# if training dataset file does not exist, generate a new one by setting values below
# if training data set file does not exist, generate a new one by setting values below
stars: null
positions: null
SEDS: null
@@ -28,7 +28,7 @@ data:
test:
data_dir: data/coherent_euclid_dataset/
file: test_Euclid_res_id_001.npy
# If test dataset file not provided produce a new one
# If test data set file not provided produce a new one
stars: null
noisy_stars: null
positions: null
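The `data_config.yaml` layout above can be read with a few lines of Python. The sketch below is illustrative only, assuming PyYAML is available; the inline document mirrors the fields shown in the diff rather than the full shipped config.

```python
import yaml  # assumes PyYAML is installed

# A trimmed inline copy of the data config structure shown above.
data_config = """
data:
  training:
    data_dir: data/coherent_euclid_dataset/
    file: train_Euclid_res_200_TrainStars_id_001.npy
    stars: null
  test:
    data_dir: data/coherent_euclid_dataset/
    file: test_Euclid_res_id_001.npy
    stars: null
"""

cfg = yaml.safe_load(data_config)["data"]
print(cfg["training"]["file"])  # train_Euclid_res_200_TrainStars_id_001.npy

# `stars: null` parses to Python None, which is the
# "generate a new data set" signal described in the comments above.
assert cfg["training"]["stars"] is None
```

Note that YAML `null` becomes Python `None`, so "leave the field as `null`" and "the code sees `None` and regenerates the data set" are the same condition.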
18 changes: 9 additions & 9 deletions config/metrics_config.yaml
@@ -4,21 +4,21 @@ metrics:
# Choose the training cycle for which to evaluate the psf_model. Can be: 1, 2, ...
saved_training_cycle: 2
# Metrics-only run: Specify model_params for a pre-trained model else leave blank if running training + metrics
# Specify path to Trained Model
trained_model_path: </path/to/trained/model>
# Specify path to Parent Directory of Trained Model
trained_model_path: </path/to/parent/directory/of/trained/model>
# Path to Trained Model Config file inside /trained_model_path/ parent directory
trained_model_config: </path/to/trained/model>
# Name of Plotting Config file - Enter name of yaml file to run plot metrics else if empty run metrics evaluation only
plotting_config: <enter name of plotting_config .yaml file or leave empty>
trained_model_config: </path/to/trained/model/config/file>
#Evaluate the monchromatic RMSE metric.
eval_mono_metric_rmse: True
#Evaluate the OPD RMSE metric.
eval_opd_metric_rmse: True
#Evaluate the super-resolution and the shape RMSE metrics for the train dataset.
eval_train_shape_sr_metric_rmse: True
# Name of Plotting Config file - Enter name of yaml file to run plot metrics else if empty run metrics evaluation only
plotting_config: <enter name of plotting_config .yaml file or leave empty>
ground_truth_model:
model_params:
#Model used as ground truth for the evaluation. Options are: 'poly', 'physical'.
#Model used as ground truth for the evaluation. Options are: 'poly' for polychromatic and 'physical' [not available].
model_name: poly

# Evaluation parameters
@@ -72,15 +72,15 @@ metrics:
#Zernike polynomial modes to use on the parametric part.
n_zernikes: 45

#Max polynomial degree of the parametric part. chg to max_deg_param
#Max polynomial degree of the parametric part.
d_max: 2

#Flag to save optimisation history for parametric model
save_optim_history_param: true

# Hyperparameters for non-parametric model
nonparam_hparams:
#Max polynomial degree of the non-parametric part. chg to max_deg_nonparam
#Max polynomial degree of the non-parametric part.
d_max_nonparam: 5

# Number of graph features
@@ -96,7 +96,7 @@ metrics:
reset_dd_features: False

#Flag to save optimisation history for non-parametric model
save_optim_history_nonparam: true
save_optim_history_nonparam: True

metrics_hparams:
# Batch size to use for the evaluation.
5 changes: 3 additions & 2 deletions config/plotting_config.yaml
@@ -1,12 +1,13 @@
plotting_params:
# Specify path to parent folder containing wf-psf metrics outputs for all runs, ex: $WORK/wf-outputs/
# Specify path to parent folder containing wf-outputs-xxxxxxxxxxx for all runs, ex: $WORK/wf-outputs/
metrics_output_path: <PATH>
# List directory(s) for metrics of trained PSF models to include in plot,
# List all of the parent output directories (i.e. wf-outputs-xxxxxxxxxxx) that contain metrics results to be included in the plot
metrics_dir:
# - wf-outputs-xxxxxxxxxxx1
# - wf-outputs-xxxxxxxxxxx2
# List the names of the metrics config files to add to the plot (would like to change this so the code finds them in the metrics_dir)
metrics_config:
# - metrics_config_1.yaml
# - metrics_config_2.yaml
# Show Plots Flag
plot_show: False
29 changes: 13 additions & 16 deletions config/training_config.yaml
@@ -1,10 +1,11 @@
training:
# ID name
# Run ID name
id_name: -coherent_euclid_200stars
# Name of Data Config file
data_config: data_config.yaml
# Metrics Config file - Enter file to run metrics evaluation else if empty run train only
metrics_config: metrics_config.yaml
# PSF model parameters
model_params:
# Model type. Options are: 'mccd', 'graph', 'poly', 'param', 'poly_physical'.
model_name: poly
@@ -50,16 +51,16 @@ training:

# Hyperparameters for Parametric model
param_hparams:
# Random seed for Tensor Flow Initialization
# Set the random seed for TensorFlow initialization
random_seed: 3877572

# Parameter for the l2 loss function for the Optical path differences (OPD)/WFE
# Set the parameter for the l2 loss function for the Optical path differences (OPD)/WFE
l2_param: 0.

#Zernike polynomial modes to use on the parametric part.
n_zernikes: 15

#Max polynomial degree of the parametric part. chg to max_deg_param
#Max polynomial degree of the parametric part.
d_max: 2

#Flag to save optimisation history for parametric model
@@ -87,35 +88,31 @@ training:

# Training hyperparameters
training_hparams:
n_epochs_params: [2, 2]

n_epochs_non_params: [2, 2]

# Batch Size
batch_size: 32

# Multi-cyclic Parameters
multi_cycle_params:

# Total amount of cycles to perform.
# Total number of cycles to perform for training.
total_cycles: 1

# Train cycle definition. It can be: 'parametric', 'non-parametric', 'complete', 'only-non-parametric' and 'only-parametric'.
cycle_def: complete

# Make checkpoint at every cycle or just save the checkpoint at the end of the training."
# Flag to save all cycles. If "True", create a checkpoint at every cycle, else if "False" only save the checkpoint at the end of the training.
save_all_cycles: False

#"Saved cycle to use for the evaluation. Can be 'cycle1', 'cycle2', ..."
saved_cycle: cycle1

# Learning rates for the parametric parts. It should be a str where numeric values are separated by spaces.
# Learning rates for training the parametric model parameters per cycle.
learning_rate_params: [1.0e-2, 1.0e-2]

# Learning rates for the non-parametric parts. It should be a str where numeric values are separated by spaces."
# Learning rates for training the non-parametric model parameters per cycle.
learning_rate_non_params: [1.0e-1, 1.0e-1]

# Number of training epochs of the parametric parts. It should be a strign where numeric values are separated by spaces."
# Number of training epochs for training the parametric model parameters per cycle.
n_epochs_params: [20, 20]

# Number of training epochs of the non-parametric parts. It should be a str where numeric values are separated by spaces."
# Number of training epochs for training the non-parametric model parameters per cycle.
n_epochs_non_params: [100, 120]
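The four per-cycle lists at the end of the file line up index by index: entry i of each list applies to training cycle i. A stdlib-only sketch of that pairing follows; the variable names mirror the config keys, but the loop itself is illustrative, not WaveDiff code.

```python
# Per-cycle hyperparameters, mirroring training_config.yaml above.
learning_rate_params = [1.0e-2, 1.0e-2]
learning_rate_non_params = [1.0e-1, 1.0e-1]
n_epochs_params = [20, 20]
n_epochs_non_params = [100, 120]

# Zip the lists so each cycle gets its own (lr, epochs) settings.
schedule = list(zip(learning_rate_params, n_epochs_params,
                    learning_rate_non_params, n_epochs_non_params))

for cycle, (lr_p, ep_p, lr_np, ep_np) in enumerate(schedule, start=1):
    # A 'complete' cycle trains the parametric part, then the non-parametric part.
    print(f"cycle {cycle}: parametric lr={lr_p} x {ep_p} epochs, "
          f"non-parametric lr={lr_np} x {ep_np} epochs")
```

Keeping the lists the same length avoids silently dropping settings: `zip` stops at the shortest list, so a two-entry learning-rate list paired with a one-entry epoch list would only schedule one cycle.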

20 changes: 20 additions & 0 deletions docs/Makefile
@@ -0,0 +1,20 @@
# Minimal makefile for Sphinx documentation
#

# You can set these variables from the command line, and also
# from the environment for the first two.
SPHINXOPTS ?=
SPHINXBUILD ?= sphinx-build
SOURCEDIR = source
BUILDDIR = build

# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)

.PHONY: help Makefile

# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
