Test functions for optimization

In applied mathematics, test functions, also known as artificial landscapes, are used to evaluate characteristics of optimization algorithms, such as convergence rate, precision, robustness, and general performance.

The test functions presented here are intended to give an idea of the different situations that optimization algorithms face when coping with such problems. The first part presents objective functions for single-objective optimization; the second part gives test functions, together with their respective Pareto fronts, for multi-objective optimization problems (MOP).

The artificial landscapes for single-objective optimization are taken from Bäck, Haupt et al. and from Rody Oldenhuis' software. Given the number of problems (55 in total), only a few are presented here. The test functions used to evaluate algorithms for MOP were taken from Deb (2002), Binh and Korn (1997), and Binh (1999). The software developed by Deb, which implements the NSGA-II procedure with genetic algorithms, can be downloaded (Deb, 2011); a program implementing the NSGA-II procedure with evolution strategies has also been posted on the Internet. Only the general form of each equation, a plot of the objective function, the boundaries of the object variables, and the coordinates of the global minima are given herein.


Test functions for single-objective optimization
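The table of single-objective benchmark functions is not reproduced in this copy. As a minimal sketch of how such an artificial landscape is typically coded, the example below implements the Rastrigin function (listed under See also). The function name, the NumPy dependency, the value A = 10, and the domain -5.12 ≤ x_i ≤ 5.12 are the usual conventions assumed for this illustration, not values taken from the missing table.

```python
import numpy as np

def rastrigin(x, A=10.0):
    """Rastrigin function: f(x) = A*n + sum_i (x_i^2 - A*cos(2*pi*x_i)).

    Highly multimodal, with a single global minimum f(0, ..., 0) = 0;
    the search domain is commonly taken as -5.12 <= x_i <= 5.12.
    """
    x = np.asarray(x, dtype=float)
    return A * x.size + np.sum(x**2 - A * np.cos(2.0 * np.pi * x))

# Evaluate at the global minimum and at a random point of the domain.
print(rastrigin([0.0, 0.0]))                             # -> 0.0
print(rastrigin(np.random.uniform(-5.12, 5.12, size=2)))
```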


Test functions for constrained optimization


Test functions for multi-objective optimization

{, class="wikitable" style="text-align:center" , - ! Name !! Plot !! Functions !! Constraints !! Search domain , - , Binh and Korn function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = 4x^{2} + 4y^{2} \\ f_{2}\left(x,y\right) = \left(x - 5\right)^{2} + \left(y - 5\right)^{2} \\ \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(x,y\right) = \left(x - 5\right)^{2} + y^{2} \leq 25 \\ g_{2}\left(x,y\right) = \left(x - 8\right)^{2} + \left(y + 3\right)^{2} \geq 7.7 \\ \end{cases} , , 0\le x \le 5, 0\le y \le 3 , - , Chankong and Haimes function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = 2 + \left(x-2\right)^{2} + \left(y-1\right)^{2} \\ f_{2}\left(x,y\right) = 9x - \left(y - 1\right)^{2} \\ \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(x,y\right) = x^{2} + y^{2} \leq 225 \\ g_{2}\left(x,y\right) = x - 3y + 10 \leq 0 \\ \end{cases} , , -20\le x,y \le 20 , - , Fonseca–Fleming function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = 1 - \exp \left \sum_{i=1}^{n} \left(x_{i} - \frac{1}{\sqrt{n \right)^{2} \right\\ f_{2}\left(\boldsymbol{x}\right) = 1 - \exp \left \sum_{i=1}^{n} \left(x_{i} + \frac{1}{\sqrt{n \right)^{2} \right\\ \end{cases} , , , , -4\le x_{i} \le 4, 1\le i \le n , - , Test function 4: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = x^{2} - y \\ f_{2}\left(x,y\right) = -0.5x - y - 1 \\ \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(x,y\right) = 6.5 - \frac{x}{6} - y \geq 0 \\ g_{2}\left(x,y\right) = 7.5 - 0.5x - y \geq 0 \\ g_{3}\left(x,y\right) = 30 - 5x - y \geq 0 \\ \end{cases} , , -7\le x,y \le 4 , - , Kursawe function:F. Kursawe,
A variant of evolution strategies for vector optimization
” in PPSN I, Vol 496 Lect Notes in Comput Sc. Springer-Verlag, 1991, pp. 193–197.
, , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = \sum_{i=1}^{2} \left 10 \exp \left(-0.2 \sqrt{x_{i}^{2} + x_{i+1}^{2 \right) \right\\ & \\ f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{3} \left x_{i}\^{0.8} + 5 \sin \left(x_{i}^{3} \right) \right\\ \end{cases} , , , , -5\le x_{i} \le 5, 1\le i \le 3. , - , Schaffer function N. 1: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x\right) = x^{2} \\ f_{2}\left(x\right) = \left(x-2\right)^{2} \\ \end{cases} , , , , -A\le x \le A. Values of A from 10 to 10^{5} have been used successfully. Higher values of A increase the difficulty of the problem. , - , Schaffer function N. 2: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x\right) = \begin{cases} -x, & \text{if } x \le 1 \\ x-2, & \text{if } 1 < x \le 3 \\ 4-x, & \text{if } 3 < x \le 4 \\ x-4, & \text{if } x > 4 \\ \end{cases} \\ f_{2}\left(x\right) = \left(x-5\right)^{2} \\ \end{cases} , , , , -5\le x \le 10. , - , Poloni's two objective function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = \left + \left(A_{1} - B_{1}\left(x,y\right) \right)^{2} + \left(A_{2} - B_{2}\left(x,y\right) \right)^{2} \right\\ f_{2}\left(x,y\right) = \left(x + 3\right)^{2} + \left(y + 1 \right)^{2} \\ \end{cases} \text{where} = \begin{cases} A_{1} = 0.5 \sin \left(1\right) - 2 \cos \left(1\right) + \sin \left(2\right) - 1.5 \cos \left(2\right) \\ A_{2} = 1.5 \sin \left(1\right) - \cos \left(1\right) + 2 \sin \left(2\right) - 0.5 \cos \left(2\right) \\ B_{1}\left(x,y\right) = 0.5 \sin \left(x\right) - 2 \cos \left(x\right) + \sin \left(y\right) - 1.5 \cos \left(y\right) \\ B_{2}\left(x,y\right) = 1.5 \sin \left(x\right) - \cos \left(x\right) + 2 \sin \left(y\right) - 0.5 \cos \left(y\right) \end{cases} , , , , -\pi\le x,y \le \pi , - , Zitzler–Deb–Thiele's function N. 1: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = x_{1} \\ f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\ g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\ h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right) \\ \end{cases} , , , , 0\le x_{i} \le 1, 1\le i \le 30. , - , Zitzler–Deb–Thiele's function N. 2: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = x_{1} \\ f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\ g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\ h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)}\right)^{2} \\ \end{cases} , , , , 0\le x_{i} \le 1, 1\le i \le 30. , - , Zitzler–Deb–Thiele's function N. 
3: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = x_{1} \\ f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\ g\left(\boldsymbol{x}\right) = 1 + \frac{9}{29} \sum_{i=2}^{30} x_{i} \\ h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right) - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x}\right)} \right) \sin \left(10 \pi f_{1} \left(\boldsymbol{x} \right) \right) \end{cases} , , , , 0\le x_{i} \le 1, 1\le i \le 30. , - , Zitzler–Deb–Thiele's function N. 4: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = x_{1} \\ f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\ g\left(\boldsymbol{x}\right) = 91 + \sum_{i=2}^{10} \left(x_{i}^{2} - 10 \cos \left(4 \pi x_{i}\right) \right) \\ h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \sqrt{\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right) \end{cases} , , , , 0\le x_{1} \le 1, -5\le x_{i} \le 5, 2\le i \le 10 , - , Zitzler–Deb–Thiele's function N. 6: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = 1 - \exp \left(-4x_{1}\right)\sin^{6}\left(6 \pi x_{1} \right) \\ f_{2}\left(\boldsymbol{x}\right) = g\left(\boldsymbol{x}\right) h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) \\ g\left(\boldsymbol{x}\right) = 1 + 9 \left frac{\sum_{i=2}^{10} x_{i{9}\right{0.25} \\ h \left(f_{1}\left(\boldsymbol{x}\right),g\left(\boldsymbol{x}\right)\right) = 1 - \left(\frac{f_{1}\left(\boldsymbol{x}\right)}{g\left(\boldsymbol{x} \right)}\right)^{2} \\ \end{cases} , , , , 0\le x_{i} \le 1, 1\le i \le 10. , - , Osyczka and Kundu function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(\boldsymbol{x}\right) = -25 \left(x_{1}-2\right)^{2} - \left(x_{2}-2\right)^{2} - \left(x_{3}-1\right)^{2} - \left(x_{4}-4\right)^{2} - \left(x_{5}-1\right)^{2} \\ f_{2}\left(\boldsymbol{x}\right) = \sum_{i=1}^{6} x_{i}^{2} \\ \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(\boldsymbol{x}\right) = x_{1} + x_{2} - 2 \geq 0 \\ g_{2}\left(\boldsymbol{x}\right) = 6 - x_{1} - x_{2} \geq 0 \\ g_{3}\left(\boldsymbol{x}\right) = 2 - x_{2} + x_{1} \geq 0 \\ g_{4}\left(\boldsymbol{x}\right) = 2 - x_{1} + 3x_{2} \geq 0 \\ g_{5}\left(\boldsymbol{x}\right) = 4 - \left(x_{3}-3\right)^{2} - x_{4} \geq 0 \\ g_{6}\left(\boldsymbol{x}\right) = \left(x_{5} - 3\right)^{2} + x_{6} - 4 \geq 0 \end{cases} , , 0\le x_{1},x_{2},x_{6} \le 10, 1\le x_{3},x_{5} \le 5, 0\le x_{4} \le 6. , - , CTP1 function (2 variables): , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = x \\ f_{2}\left(x,y\right) = \left(1 + y\right) \exp \left(-\frac{x}{1+y} \right) \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.858 \exp \left(-0.541 f_{1}\left(x,y\right)\right)} \geq 1 \\ g_{2}\left(x,y\right) = \frac{f_{2}\left(x,y\right)}{0.728 \exp \left(-0.295 f_{1}\left(x,y\right)\right)} \geq 1 \end{cases} , , 0\le x,y \le 1. 
, - , Constr-Ex problem: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = x \\ f_{2}\left(x,y\right) = \frac{1 + y}{x} \\ \end{cases} , , \text{s.t.} = \begin{cases} g_{1}\left(x,y\right) = y + 9x \geq 6 \\ g_{2}\left(x,y\right) = -y + 9x \geq 1 \\ \end{cases} , , 0.1\le x \le 1, 0\le y \le 5 , - , Viennet function: , , , , \text{Minimize} = \begin{cases} f_{1}\left(x,y\right) = 0.5\left(x^{2} + y^{2}\right) + \sin\left(x^{2} + y^{2} \right) \\ f_{2}\left(x,y\right) = \frac{\left(3x - 2y + 4\right)^{2{8} + \frac{\left(x - y + 1\right)^{2{27} + 15 \\ f_{3}\left(x,y\right) = \frac{1}{x^{2} + y^{2} + 1} - 1.1 \exp \left(- \left(x^{2} + y^{2} \right) \right) \\ \end{cases} , , , , -3\le x,y \le 3.
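To show how the problems in the table above are used in practice, the sketch below codes the Binh and Korn problem: both objectives, the two constraints, and the box bounds from the table. The helper names (binh_korn, pareto_filter) and the brute-force sampling of the Pareto front are assumptions made for this illustration only; MOP solvers such as NSGA-II handle constraints and non-dominance far more efficiently.

```python
import numpy as np

def binh_korn(x, y):
    """Objectives and feasibility test of the Binh and Korn problem."""
    f1 = 4.0 * x**2 + 4.0 * y**2
    f2 = (x - 5.0)**2 + (y - 5.0)**2
    g1 = (x - 5.0)**2 + y**2              # feasible when g1 <= 25
    g2 = (x - 8.0)**2 + (y + 3.0)**2      # feasible when g2 >= 7.7
    return (f1, f2), (g1 <= 25.0) and (g2 >= 7.7)

def pareto_filter(points):
    """Keep the non-dominated points (minimization of both objectives)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

# Brute-force approximation of the Pareto front: sample the search domain
# 0 <= x <= 5, 0 <= y <= 3 and discard infeasible points.
rng = np.random.default_rng(0)
samples = [binh_korn(x, y)
           for x, y in zip(rng.uniform(0.0, 5.0, 2000), rng.uniform(0.0, 3.0, 2000))]
feasible = [f for f, ok in samples if ok]
front = pareto_filter(feasible)
print(f"{len(front)} non-dominated samples out of {len(feasible)} feasible points")
```

The dominance check (q no worse than p in both objectives and not equal to it) is the usual Pareto dominance for two-objective minimization; the retained points form a sampled approximation of the problem's Pareto front.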


See also

* Ackley function
* Himmelblau's function
* Rastrigin function
* Rosenbrock function
* Shekel function
* Binh function


References

* Binh, T.; Korn, U. (1997). "MOBES: A Multiobjective Evolution Strategy for Constrained Optimization Problems". In: Proceedings of the Third International Conference on Genetic Algorithms. Czech Republic, pp. 176–182.
* Binh, T. (1999). A Multiobjective Evolutionary Algorithm: The Study Cases. Technical report. Institute for Automation and Communication, Barleben, Germany.
* Deb, Kalyanmoy (2002). Multiobjective Optimization Using Evolutionary Algorithms (Repr. ed.). Chichester: Wiley.
* Deb, K. (2011). Software for multi-objective NSGA-II code in C. Available at: https://www.iitk.ac.in/kangal/codes.shtml
* Kursawe, F. (1991). "A variant of evolution strategies for vector optimization". In: PPSN I, Vol. 496, Lecture Notes in Computer Science. Springer-Verlag, pp. 193–197.