Consensus-based optimization

Consensus-based optimization (CBO) is a multi-agent derivative-free optimization method, designed to obtain solutions for global optimization problems of the form

\min_{x\in\mathcal{X}} f(x),

where f:\mathcal{X}\to\R denotes the objective function acting on the state space \mathcal{X}, which is assumed to be a normed vector space. The function f can potentially be nonconvex and nonsmooth. The algorithm employs particles or agents to explore the state space, which communicate with each other to update their positions. Their dynamics follows the paradigm of metaheuristics, which blend exploration with exploitation. In this sense, CBO is comparable to ant colony optimization, wind driven optimization, particle swarm optimization or simulated annealing.


Algorithm

Consider an ensemble of points x_t = (x_t^1,\dots, x_t^N) \in \mathcal{X}^N, dependent on the time t\in[0,\infty). Then the update for the ith particle is formulated as a stochastic differential equation,

dx^i_t = -\lambda\, \underbrace{(x^i_t - c_\alpha(x_t))\,dt}_{\text{consensus drift}} + \sigma\, \underbrace{D(x^i_t - c_\alpha(x_t))\,dB^i_t}_{\text{scaled noise}},

with the following components:

* The consensus point c_\alpha(x_t): the key idea of CBO is that in each step the particles “agree” on a common consensus point, by computing an average of their positions, weighted by their current objective function value,

c_\alpha(x_t) = \frac{1}{\sum_{i=1}^N \omega_\alpha(x^i_t)} \sum_{i=1}^N x^i_t\, \omega_\alpha(x^i_t), \quad\text{where}\quad \omega_\alpha(\,\cdot\,) = \exp(-\alpha f(\,\cdot\,)).

This point is then used in the drift term x^i_t - c_\alpha(x_t), which moves each particle in the direction of the consensus point (a numerical sketch follows after this list).
* Scaled noise: for each t\geq 0 and i=1,\dots,N, we denote by B^i_t independent standard Brownian motions. The function D:\mathcal{X}\to\R^s takes the drift of the ith particle as input and determines the noise model. The most common choices are:
** ''Isotropic noise'', D(\,\cdot\,) = \|\cdot\|_2: in this case s=1 and every component of the noise vector is scaled equally. This was used in the original version of the algorithm.
** ''Anisotropic noise'', D(\,\cdot\,) = |\cdot|: in the special case where \mathcal{X}\subset\R^d, this means that s=d and D applies the absolute value function component-wise. Here, every component of the noise vector is scaled depending on the corresponding entry of the drift vector.
* Hyperparameters: the parameter \sigma \geq 0 scales the influence of the noise term. The parameter \alpha \geq 0 determines the separation effect of the particles:
** in the limit \alpha\to 0, every particle is assigned the same weight and the consensus point is a regular mean;
** in the limit \alpha\to\infty, the consensus point corresponds to the particle with the best objective value, completely ignoring the positions of the other points in the ensemble.
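The computation of the consensus point is the core primitive of the method. The following Python sketch shows it for a Euclidean state space using NumPy; the array shapes and function names are illustrative assumptions, not part of any reference implementation.

```python
import numpy as np

def consensus_point(x, f, alpha):
    """Consensus point c_alpha(x) of an ensemble x with shape (N, d):
    the average of the particles, weighted by omega_alpha = exp(-alpha f)."""
    fx = np.array([f(xi) for xi in x])   # objective values of all particles
    w = np.exp(-alpha * fx)              # Gibbs-type weights; this plain form
                                         # can underflow for large alpha
    return (w[:, None] * x).sum(axis=0) / w.sum()
```

For \alpha close to 0 this returns roughly the plain ensemble mean, while for large \alpha it approaches the position of the currently best particle, matching the two limits described above.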


Implementation notes

In practice, the SDE is discretized via the Euler–Maruyama method such that the following explicit update formula for the ensemble x = (x^1,\dots,x^N) is obtained,

x^i \gets x^i - \lambda\, (x^i - c_\alpha(x))\, dt + \sigma\, D(x^i - c_\alpha(x))\, B^i,

where dt denotes the step size and B^i \sim \mathcal{N}(0, dt\,\mathrm{Id}) are independent Gaussian increments. Employing an efficient implementation of the LogSumExp function is beneficial for the numerical stability of the consensus point computation, since the weights \omega_\alpha(x^i) = \exp(-\alpha f(x^i)) can underflow for large values of \alpha. We refer to existing implementations in Python and in Julia (programming language).
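Below is a minimal sketch of the full discretized scheme in Python, using scipy.special.logsumexp for the stabilized weight normalization; the default parameter values and the quadratic test objective are illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp

def cbo_minimize(f, d, N=100, steps=1000, dt=0.01,
                 lam=1.0, sigma=1.0, alpha=30.0, anisotropic=True):
    """Euler-Maruyama discretization of the CBO dynamics (a sketch)."""
    rng = np.random.default_rng(0)
    x = rng.normal(size=(N, d))                    # initial ensemble
    for _ in range(steps):
        logw = -alpha * np.array([f(xi) for xi in x])
        w = np.exp(logw - logsumexp(logw))         # stabilized, normalized weights
        c = w @ x                                  # consensus point c_alpha(x)
        drift = x - c
        B = np.sqrt(dt) * rng.normal(size=(N, d))  # Brownian increments
        if anisotropic:
            noise = np.abs(drift) * B              # component-wise scaling
        else:
            noise = np.linalg.norm(drift, axis=1, keepdims=True) * B
        x = x - lam * drift * dt + sigma * noise
    return c

# Example: minimize a shifted quadratic in two dimensions.
x_star = cbo_minimize(lambda z: np.sum((z - 1.0) ** 2), d=2)
```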


Variants


Sampling

Consensus-based optimization can be transformed into a sampling method by modifying the noise term and choosing appropriate hyperparameters. Namely, one considers the following SDE,

dx^i_t = -(x^i_t - c_\alpha(x_t))\,dt + \sqrt{2\lambda^{-1}\, C_\alpha(x_t)}\; dB^i_t,

where the weighted covariance matrix is defined as

C_\alpha(x_t) := \frac{1}{\sum_{i=1}^N \omega_\alpha(x_t^i)} \sum_{i=1}^N (x_t^i - c_\alpha(x_t)) \otimes (x_t^i - c_\alpha(x_t))\, \omega_\alpha(x_t^i).

If the parameters are chosen such that \lambda^{-1} = (1 + \alpha), the above scheme creates approximate samples of a probability distribution with a density that is proportional to \exp(-\alpha f).
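One step of the sampling dynamics could be sketched as follows in Python; computing the matrix square root via scipy.linalg.sqrtm and the function names are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm

def cbs_step(x, f, alpha, dt, rng):
    """One Euler-Maruyama step of the sampling SDE (a sketch), with the
    sampling choice lambda^{-1} = 1 + alpha."""
    logw = -alpha * np.array([f(xi) for xi in x])
    w = np.exp(logw - logw.max())
    w /= w.sum()                          # normalized weights omega_alpha
    c = w @ x                             # consensus point c_alpha(x)
    diff = x - c
    C = (w[:, None] * diff).T @ diff      # weighted covariance C_alpha(x)
    root = np.real(sqrtm(2.0 * (1.0 + alpha) * C))
    B = np.sqrt(dt) * rng.normal(size=x.shape)
    return x - diff * dt + B @ root
```

Iterating such steps drives the ensemble toward approximate samples of the density proportional to \exp(-\alpha f), as stated above.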


Polarization

If the function f is multi-modal, i.e., has more than one global minimum, the standard CBO algorithm can only find one of these points. However, one can “polarize” the consensus computation by introducing a kernel k: \mathcal{X}\times\mathcal{X}\to[0,\infty) that includes local information into the weighting. In this case, every particle has its own version of the consensus point, which is computed as

c_\alpha^j(x) = \frac{1}{\sum_{i=1}^N \omega_\alpha^j(x^i)} \sum_{i=1}^N x^i\, \omega_\alpha^j(x^i), \quad\text{where}\quad \omega_\alpha^j(\,\cdot\,) = \exp(-\alpha f(\,\cdot\,))\, k(\,\cdot\,, x^j).

In this case, the drift is a vector field over the state space \mathcal{X}. Intuitively, particles are now not only attracted to other particles based on their objective value, but also based on their spatial locality. For a constant kernel function, the polarized version reduces to standard CBO and is therefore a generalization. Some common configurations are listed below; a sketch of the kernel-weighted consensus computation follows after the list.

* Gaussian kernel, k(x,\tilde x) = \exp\left(-\tfrac{1}{2\kappa^2}\, \| x - \tilde x \|_2^2 \right): the parameter \kappa determines the communication radius of the particles. This choice corresponds to a local convex regularization of the objective function f.
* Mean-shift algorithm: employing polarized CBO for a constant objective function f, together with no noise (i.e., \sigma = 0) and an Euler–Maruyama discretization with step size dt = 1, corresponds to the mean-shift algorithm.
* Bounded confidence model: when choosing a constant objective function and no noise, but the special kernel function k(x,\tilde x) = 1_{\|x - \tilde x\| \leq \kappa}, the SDE transforms into an ODE known as the bounded confidence model, which arises in opinion dynamics.
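The polarized consensus computation with a Gaussian kernel can be sketched in Python as follows; the kernel width kappa and the vectorized pairwise-distance computation are illustrative assumptions.

```python
import numpy as np

def polarized_consensus(x, f, alpha, kappa):
    """Per-particle consensus points c_alpha^j(x) for an ensemble x of
    shape (N, d) under a Gaussian kernel; returns an (N, d) array."""
    fx = np.array([f(xi) for xi in x])
    sq_dist = ((x[:, None, :] - x[None, :, :]) ** 2).sum(axis=-1)
    # log of omega_alpha^j(x^i) = -alpha f(x^i) + log k(x^i, x^j)
    logw = -alpha * fx[:, None] - sq_dist / (2.0 * kappa**2)
    logw -= logw.max(axis=0, keepdims=True)   # stabilize each column j
    w = np.exp(logw)
    w /= w.sum(axis=0, keepdims=True)         # normalize over particles i
    return w.T @ x                            # row j holds c_alpha^j(x)
```

For \kappa \to \infty the kernel becomes constant and all rows coincide with the standard CBO consensus point, reflecting that polarized CBO generalizes the standard method.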


See also

* Particle swarm optimization
* Simulated annealing
* Ant colony optimization algorithms

