From 80bfdda1f4a89382605a83c6ae717463cf1792a4 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?=E2=84=9Dafael=20Bailo?=
Date: Fri, 7 Jun 2024 10:27:03 +0200
Subject: [PATCH] Specific differences

---
 paper.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/paper.md b/paper.md
index 3015226..c2552f1 100644
--- a/paper.md
+++ b/paper.md
@@ -120,7 +120,7 @@ In general, very few implementations of CBO already exist, and none have been de
 
 Regarding Python, we refer to `PyPop7` [@duan2023pypop7] and `scikit-opt` [@scikitopt] for a collection of various derivative-free optimisation strategies. For packages connected to Bayesian optimisation, we refer to `BayesO` [@Kim2023], `bayesian-optimization` [@Bayesian14], `GPyOpt` [@gpyopt2016], `GPflowOpt` [@GPflowOpt2017], `pyGPGO` [@Jiménez2017], `PyBADS` [@Singh2024] and `BoTorch` [@balandat2020botorch]. Furthermore, CMA-ES [@hansen1996adapting] was implemented in `pycma` [@hansen2019pycma]. To the best of our knowledge the connection between consensus-based methods and evolution strategies is not fully understood, and is therefore an interesting future direction. PSO and SA implementations are already available in `PySwarms` [@miranda2018pyswarms], `scikit-opt` [@scikitopt], `DEAP` [@deapJMLR2012] and `pagmo` [@pagmo2017]. They are widely used by the community and provide a rich framework for the respective methods. However, adjusting these implementations to CBO is not straightforward. The first publicly available Python packages implementing CBX algorithms were given by some of the authors together with collaborators. @Igor_CBOinPython implement standard CBO [@pinnau2017consensus], and the package `PolarCBO` [@Roith_polarcbo] provides an implementation of polarised CBO [@bungert2022polarized]. [CBXPy](https://pdips.github.io/CBXpy/) is a significant extension of the latter, which was tailored to the polarised variant. The code architecture was generalised, which allowed the implementation of the whole CBX family within a common framework.
 
-Regarding Julia, PSO and SA methods are, among others, implemented in `optim.jl` [@mogensen2018optim], `Metaheuristics.jl` [@mejia2022metaheuristics], and `Manopt.jl` [@Bergmann2022]. PSO and SA are also included in the meta-library `Optimization.jl` [@DR2023], as well as Nelder--Mead, which is a direct search method. The latter is also implemented in `Manopt.jl` [@Bergmann2022], which further provides a manifold variant of CMA-ES [@colutto2009cma]. One of the authors gave the first specific Julia implementation of standard CBO `Consensus.jl` [@Bailo_consensus]. That package has now been deprecated in favour of [ConsensusBasedX.jl](https://pdips.github.io/ConsensusBasedX.jl/), which improves the performance of the CBO implementation. The package also adds a CBS implementation, and overall presents a more general interface that accomodates the wider CBX class of methods.
+Regarding Julia, PSO and SA methods are, among others, implemented in `Optim.jl` [@mogensen2018optim], `Metaheuristics.jl` [@mejia2022metaheuristics], and `Manopt.jl` [@Bergmann2022]. PSO and SA are also included in the meta-library `Optimization.jl` [@DR2023], as well as Nelder--Mead, which is a direct search method. The latter is also implemented in `Manopt.jl` [@Bergmann2022], which further provides a manifold variant of CMA-ES [@colutto2009cma]. One of the authors gave the first specific Julia implementation of standard CBO, `Consensus.jl` [@Bailo_consensus]. That package has now been deprecated in favour of [ConsensusBasedX.jl](https://pdips.github.io/ConsensusBasedX.jl/), which improves the performance of the CBO method through a type-stable and allocation-free implementation. The package also adds a CBS implementation, and overall presents a more general interface that accommodates the wider CBX class of methods.
 
 # Features
 
@@ -153,11 +153,13 @@ More examples and details on the implementation are available in the [documentat
 [ConsensusBasedX.jl](https://pdips.github.io/ConsensusBasedX.jl/) has been almost entirely written in native Julia (with the exception of a single call to LAPACK). The code has been developed with performance in mind, thus the critical routines are fully type-stable and allocation-free. A specific tool is provided to benchmark a typical method iteration, which can be used to detect allocations. Through this tool, unit tests are in place to ensure zero allocations in all the provided methods. The benchmarking tool is also available to users, who can use it to test their implementations of $f$, as well as any new CBX methods.
 
 Basic function minimisation can be performed by running:
+
 ```julia
 using ConsensusBasedX # load the ConsensusBasedX package
 f(x) = x[1]^2 + x[2]^2 # define the function to minimise
 x = minimise(f, D = 2) # run the minimisation
 ```
+
 The library is available on [GitHub](https://github.com/PdIPS/ConsensusBasedX.jl). It has been registered in the [general Julia registry](https://github.com/JuliaRegistries/General), and therefore it can be installed by running `]add ConsensusBasedX`. It is licensed under the MIT license. More examples and full instructions are available in the [documentation](https://pdips.github.io/ConsensusBasedX.jl/).
 
 # Acknowledgements
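Supplementary usage sketch (not part of the patch above): the example quoted in the second hunk defines a quadratic objective and calls `minimise(f, D = 2)`. The snippet below applies the same quoted call to a different user-supplied objective; the Ackley function and the choice `D = 3` are illustrative assumptions, not taken from the patch.

```julia
using ConsensusBasedX # provides `minimise`, as in the example quoted above

# Illustrative objective (an assumption, not from the patch): the D-dimensional
# Ackley function, a standard non-convex benchmark with global minimum 0 at the origin.
function ackley(x)
    d = length(x)
    return -20 * exp(-0.2 * sqrt(sum(abs2, x) / d)) -
           exp(sum(cos, 2π .* x) / d) + 20 + exp(1)
end

x = minimise(ackley, D = 3) # same call pattern as the quoted example, now in 3 dimensions
```

In the same way, any Julia function that maps a length-`D` vector to a scalar should be usable as the objective in the quoted `minimise` call.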