Commit

update plot and package names
TimRoith committed May 15, 2024
1 parent ccb8278 commit b1d6133
Showing 3 changed files with 44 additions and 3 deletions.
Binary file modified JOSS.png
41 changes: 41 additions & 0 deletions paper.bib
@@ -175,6 +175,47 @@ @Article{Hunter_2007
year = {2007},
}

@misc{gpyopt2016,
  author = {{The GPyOpt authors}},
  title  = {{GPyOpt}: A {Bayesian} Optimization framework in {Python}},
  year   = {2016},
  url    = {http://github.com/SheffieldML/GPyOpt}
}

@inproceedings{balandat2020botorch,
  author    = {Balandat, Maximilian and Karrer, Brian and Jiang, Daniel R. and Daulton, Samuel and Letham, Benjamin and Wilson, Andrew Gordon and Bakshy, Eytan},
  title     = {{BoTorch}: A Framework for Efficient {Monte-Carlo} {Bayesian} Optimization},
  booktitle = {Advances in Neural Information Processing Systems 33},
  year      = {2020},
  url       = {http://arxiv.org/abs/1910.06403}
}

@online{GPflowOpt2017,
  author     = {Knudde, Nicolas and {van der Herten}, Joachim and Dhaene, Tom and Couckuyt, Ivo},
  title      = {{GPflowOpt}: A {Bayesian} Optimization Library using {TensorFlow}},
  eprint     = {1711.03845},
  eprinttype = {arXiv},
  year       = {2017},
  url        = {https://arxiv.org/abs/1711.03845}
}

@article{Jiménez2017,
  author    = {José Jiménez and Josep Ginebra},
  title     = {{pyGPGO}: {Bayesian} Optimization for {Python}},
  journal   = {Journal of Open Source Software},
  publisher = {The Open Journal},
  year      = {2017},
  volume    = {2},
  number    = {19},
  pages     = {431},
  doi       = {10.21105/joss.00431},
  url       = {https://doi.org/10.21105/joss.00431}
}

@misc{Bayesian14,
  author = {Fernando Nogueira},
  title  = {{Bayesian Optimization}: Open source constrained global optimization tool for {Python}},
  year   = {2014--},
  url    = {https://github.com/bayesian-optimization/BayesianOptimization}
}

@software{scikitopt,
author = {Guo, Fei},
6 changes: 3 additions & 3 deletions paper.md
@@ -94,7 +94,7 @@ $$
},
$$

- where $\lambda$ and $\sigma$ are positive parameters, and where $B_t^i$ are independent Brownian motions in $d$ dimensions. The _consensus drift_ is a deterministic term that drives each agent towards the consensus point, with rate $\lambda$. Meanwhile, the _scaled diffusion_ is a stochastic term that encourages exploration of the landscape. While both the agents' positions and the consensus point evolve in time, it has been proven that all agents eventually reach the same position and that the consensus point $c_\alpha(x_t)$ is a good approximation of $x^*$ [@carrillo2018analytical;@fornasier2021consensus]. Other variations of the method, such as CBO with anisotropic noise [@carrillo2021consensus], _polarised CBO_ [@bungert2022polarized], or _consensus-based sampling_ (CBS) [@carrillo2022consensus] have also been proposed.
+ where $\lambda$ and $\sigma$ are positive parameters, and where $B_t^i$ are independent Brownian motions in $d$ dimensions. The _consensus drift_ is a deterministic term that drives each agent towards the consensus point, with rate $\lambda$. Meanwhile, the _scaled diffusion_ is a stochastic term that encourages exploration of the landscape. The scaling factor of the diffusion is proportional to the distance of the agent from the consensus point: an agent whose position coincides with the weighted mean stops moving, while an agent far away from the consensus explores the landscape more strongly. While both the agents' positions and the consensus point evolve in time, it has been proven that all agents eventually reach the same position and that the consensus point $c_\alpha(x_t)$ is a good approximation of $x^*$ [@carrillo2018analytical;@fornasier2021consensus]. Other variations of the method, such as CBO with anisotropic noise [@carrillo2021consensus], _polarised CBO_ [@bungert2022polarized], or _consensus-based sampling_ (CBS) [@carrillo2022consensus] have also been proposed.
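
For illustration, here is a minimal NumPy sketch of these dynamics: one step of the Euler--Maruyama discretisation discussed below, with the consensus point computed as the weighted mean $c_\alpha(x_t) = \sum_i x_t^i \, e^{-\alpha f(x_t^i)} / \sum_j e^{-\alpha f(x_t^j)}$ standard in the cited CBO literature. Function names and parameter defaults are illustrative only, not the CBXPy API.

```python
import numpy as np

def consensus_point(x, f, alpha=10.0):
    # Weighted mean c_alpha(x) with Gibbs-type weights exp(-alpha * f(x^i));
    # subtracting the minimum energy stabilises the exponentials.
    energies = np.array([f(xi) for xi in x])
    weights = np.exp(-alpha * (energies - energies.min()))
    return weights @ x / weights.sum()

def cbo_step(x, f, lam=1.0, sigma=1.0, dt=0.01, alpha=10.0):
    # One Euler--Maruyama step of standard isotropic CBO; x has shape (N, d).
    c = consensus_point(x, f, alpha)
    drift = -lam * (x - c) * dt                          # consensus drift
    dB = np.sqrt(dt) * np.random.randn(*x.shape)         # Brownian increments
    dist = np.linalg.norm(x - c, axis=1, keepdims=True)  # distance to consensus
    return x + drift + sigma * dist * dB                 # scaled diffusion

# Toy usage: 50 agents in d = 2 contract towards the minimiser at (1, 1).
f = lambda v: np.sum((v - 1.0) ** 2)
x = np.random.randn(50, 2)
for _ in range(500):
    x = cbo_step(x, f)
```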

In practice, the solution to the SDE above cannot be found exactly. Instead, an _Euler--Maruyama scheme_ [@KP1992] is used to update the position of the agents. The update is given by

@@ -118,9 +118,9 @@ CBX methods have been successfully applied and extended to several different set

In general, very few implementations of CBO already exist, and none have been designed with the generality of other CBX methods in mind. We summarise here the related software:

- Regarding Python, we refer to @duan2023pypop7 and @scikitopt for a collection of various derivative-free optimisation strategies. A very recent implementation of Bayesian optimisation is described by @Kim2023. Furthermore, CMA-ES [@hansen1996adapting] was implemented in @hansen2019pycma. To the best of our knowledge the connection between consensus-based methods and evolution strategies is not fully understood, and is therefore an interesting future direction. PSO and SA implementations are already available [@miranda2018pyswarms;@scikitopt;@deapJMLR2012;@pagmo2017]. They are widely used by the community and provide a rich framework for the respective methods. However, adjusting these implementations to CBO is not straightforward. The first publicly available Python packages implementing CBX algorithms were given by some of the authors together with collaborators. @Igor_CBOinPython implement standard CBO [@pinnau2017consensus], and @Roith_polarcbo provide an implementation of polarised CBO [@bungert2022polarized]. [CBXPy](https://pdips.github.io/CBXpy/) is a significant extension of the latter.
+ Regarding Python, we refer to `PyPop7` [@duan2023pypop7] and `scikit-opt` [@scikitopt] for collections of various derivative-free optimisation strategies. A very recent implementation of Bayesian optimisation is described in `BayesO` [@Kim2023]; see also `bayesian-optimization` [@Bayesian14], `GPyOpt` [@gpyopt2016], `GPflowOpt` [@GPflowOpt2017], `pyGPGO` [@Jiménez2017] and `BoTorch` [@balandat2020botorch]. Furthermore, CMA-ES [@hansen1996adapting] is implemented in `pycma` [@hansen2019pycma]. To the best of our knowledge, the connection between consensus-based methods and evolution strategies is not fully understood, and is therefore an interesting future direction. PSO and SA implementations are already available in `PySwarms` [@miranda2018pyswarms], `scikit-opt` [@scikitopt], `DEAP` [@deapJMLR2012] and `pagmo` [@pagmo2017]. They are widely used by the community and provide rich frameworks for the respective methods. However, adjusting these implementations to CBO is not straightforward. The first publicly available Python packages implementing CBX algorithms were given by some of the authors together with collaborators: @Igor_CBOinPython implement standard CBO [@pinnau2017consensus], and the package `PolarCBO` [@Roith_polarcbo] provides an implementation of polarised CBO [@bungert2022polarized]. [CBXPy](https://pdips.github.io/CBXpy/) is a significant extension of the latter: while `PolarCBO` was tailored to polarised CBO, the code architecture of CBXPy was generalised, allowing the whole CBO zoo to be implemented within a common framework.

- Regarding Julia, PSO and SA methods are, among others, implemented by @mogensen2018optim, @mejia2022metaheuristics, and @Bergmann2022. PSO and SA are also included in the meta-library [@DR2023], as well as Nelder--Mead, which is a direct search method. The latter is also implemented in @Bergmann2022, which further provides a manifold variant of CMA-ES [@colutto2009cma]. One of the authors gave the first specific Julia implementation of standard CBO [@Bailo_consensus]; that package has now been deprecated in favour of [ConsensusBasedX.jl](https://pdips.github.io/ConsensusBasedX.jl/), which offers additional CBX methods and a far more general interface.
+ Regarding Julia, PSO and SA methods are, among others, implemented in `Optim.jl` [@mogensen2018optim], `Metaheuristics.jl` [@mejia2022metaheuristics], and `Manopt.jl` [@Bergmann2022]. PSO and SA are also included in the meta-library `Optimization.jl` [@DR2023], as is Nelder--Mead, a direct search method. The latter is also implemented in `Manopt.jl` [@Bergmann2022], which further provides a manifold variant of CMA-ES [@colutto2009cma]. One of the authors gave the first dedicated Julia implementation of standard CBO, `Consensus.jl` [@Bailo_consensus]; that package has now been deprecated in favour of [ConsensusBasedX.jl](https://pdips.github.io/ConsensusBasedX.jl/), which offers additional CBX methods and a far more general interface.

# Features

