In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a subset of solutions which is otherwise too large to be completely enumerated or otherwise explored. Metaheuristics may make relatively few assumptions about the optimization problem being solved and so may be usable for a variety of problems.

Compared to optimization algorithms and iterative methods, metaheuristics do not guarantee that a globally optimal solution can be found on some class of problems. Many metaheuristics implement some form of stochastic optimization, so that the solution found is dependent on the set of random variables generated. In combinatorial optimization, by searching over a large set of feasible solutions, metaheuristics can often find good solutions with less computational effort than optimization algorithms, iterative methods, or simple heuristics. As such, they are useful approaches for optimization problems. Several books and survey papers have been published on the subject.

Most literature on metaheuristics is experimental in nature, describing empirical results based on computer experiments with the algorithms. But some formal theoretical results are also available, often on convergence and the possibility of finding the global optimum. Many metaheuristic methods have been published with claims of novelty and practical efficacy. While the field also features high-quality research, many of the publications have been of poor quality; flaws include vagueness, lack of conceptual elaboration, poor experiments, and ignorance of previous literature.


Properties

These are properties that characterize most metaheuristics:
* Metaheuristics are strategies that guide the search process.
* The goal is to efficiently explore the search space in order to find near-optimal solutions.
* Techniques which constitute metaheuristic algorithms range from simple local search procedures to complex learning processes.
* Metaheuristic algorithms are approximate and usually non-deterministic.
* Metaheuristics are not problem-specific.


Classification

There is a wide variety of metaheuristics, and a number of properties with respect to which to classify them.


Local search vs. global search

One approach is to characterize the type of search strategy. One type of search strategy is an improvement on simple local search algorithms. A well-known local search algorithm is the hill climbing method, which is used to find local optima. However, hill climbing does not guarantee finding globally optimal solutions. Many metaheuristic ideas were proposed to improve on local search heuristics in order to find better solutions. Such metaheuristics include simulated annealing, tabu search, iterated local search, variable neighborhood search, and GRASP. These metaheuristics can be classified as both local search-based and global search metaheuristics. Other global search metaheuristics that are not local search-based are usually population-based metaheuristics. Such metaheuristics include ant colony optimization, evolutionary computation, particle swarm optimization, genetic algorithms, and the rider optimization algorithm.
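
As an illustration of the difference between a plain local search and a metaheuristic that can escape local optima, the following minimal sketch contrasts hill climbing with simulated annealing on a simple one-dimensional objective. The objective function, step sizes, cooling schedule, and iteration counts are hypothetical choices for the example, not standard benchmark settings.

```python
import math
import random

def objective(x):
    # Hypothetical multimodal objective with many local minima; global minimum near x = 0.
    return x * x + 10 * math.sin(3 * x)

def hill_climb(x, step=0.1, iters=1000):
    """Simple local search: accept a neighbor only if it improves the objective."""
    for _ in range(iters):
        candidate = x + random.uniform(-step, step)
        if objective(candidate) < objective(x):
            x = candidate
    return x

def simulated_annealing(x, step=0.5, iters=10000, t0=5.0):
    """Metaheuristic: occasionally accept worse neighbors to escape local minima."""
    for i in range(iters):
        t = t0 * (1 - i / iters) + 1e-9              # linear cooling schedule
        candidate = x + random.uniform(-step, step)
        delta = objective(candidate) - objective(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
    return x

random.seed(1)
start = 4.0
print("hill climbing:       x =", round(hill_climb(start), 3))
print("simulated annealing: x =", round(simulated_annealing(start), 3))
```

Started far from the global minimum, the pure hill climber tends to stop at the nearest local minimum, while the annealing variant can cross barriers early on and settle near the global one as the temperature drops.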


Single-solution vs. population-based

Another classification dimension is single-solution vs. population-based searches. Single-solution approaches focus on modifying and improving a single candidate solution; single-solution metaheuristics include simulated annealing, iterated local search, variable neighborhood search, and guided local search. Population-based approaches maintain and improve multiple candidate solutions, often using population characteristics to guide the search; population-based metaheuristics include evolutionary computation, genetic algorithms, and particle swarm optimization. Another category of metaheuristics is swarm intelligence, which is the collective behavior of decentralized, self-organized agents in a population or swarm. Ant colony optimization (M. Dorigo, ''Optimization, Learning and Natural Algorithms'', PhD thesis, Politecnico di Milano, Italy, 1992), particle swarm optimization, and social cognitive optimization are examples of this category.
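
As a sketch of a population-based approach, the following minimal particle swarm optimizer maintains a swarm of candidate solutions and moves each one toward its own best position and the swarm's best position. The objective (a simple sphere function), swarm size, and coefficient values are illustrative assumptions, not canonical settings.

```python
import random

def objective(x):
    # Hypothetical objective: sphere function, global minimum at the origin.
    return sum(v * v for v in x)

def pso(dim=2, swarm_size=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization on a continuous search space."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                  # each particle's best position so far
    gbest = min(pbest, key=objective)[:]         # swarm-wide best position so far
    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pbest[i]) < objective(gbest):
                    gbest = pbest[i][:]
    return gbest

random.seed(0)
best = pso()
print("best position:", [round(v, 4) for v in best], "value:", round(objective(best), 6))
```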


Hybridization and memetic algorithms

A hybrid metaheuristic is one that combines a metaheuristic with other optimization approaches, such as algorithms from mathematical programming, constraint programming, and machine learning. Both components of a hybrid metaheuristic may run concurrently and exchange information to guide the search. On the other hand, memetic algorithms represent the synergy of evolutionary or any population-based approach with separate individual learning or local improvement procedures for problem search. An example of a memetic algorithm is the use of a local search algorithm instead of a basic mutation operator in evolutionary algorithms.
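
To make the memetic idea concrete, the sketch below embeds a short hill-climbing local search into an otherwise standard evolutionary loop, refining each offspring after crossover in place of a basic mutation operator. The bit-string objective (one-max) and all parameter values are hypothetical choices made for this example only.

```python
import random

GENES = 30  # length of each bit-string individual (illustrative)

def fitness(ind):
    # One-max: maximize the number of ones (a simple stand-in objective).
    return sum(ind)

def local_search(ind, tries=10):
    """Individual learning step: flip single bits, keep improvements."""
    for _ in range(tries):
        i = random.randrange(GENES)
        neighbor = ind[:]
        neighbor[i] ^= 1
        if fitness(neighbor) > fitness(ind):
            ind = neighbor
    return ind

def memetic(pop_size=20, generations=50):
    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection of two parents, then one-point crossover.
        parents = [max(random.sample(pop, 3), key=fitness) for _ in range(2)]
        cut = random.randrange(1, GENES)
        child = parents[0][:cut] + parents[1][cut:]
        # Local improvement in place of a basic mutation operator.
        child = local_search(child)
        # Replace the current worst individual.
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        pop[worst] = child
    return max(pop, key=fitness)

random.seed(2)
best = memetic()
print("best fitness:", fitness(best), "of", GENES)
```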


Parallel metaheuristics

A parallel metaheuristic is one that uses the techniques of parallel programming to run multiple metaheuristic searches in parallel; these may range from simple distributed schemes to concurrent search runs that interact to improve the overall solution.
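
A minimal sketch of the simplest parallel scheme follows: several independent metaheuristic runs (here, restarts of a random local search) are distributed across worker processes and the best result is kept. The objective, run count, and process count are illustrative assumptions; cooperative schemes in which runs exchange solutions during the search are more involved.

```python
import random
from multiprocessing import Pool

def objective(x):
    # Hypothetical objective: minimum at x = 3.
    return (x - 3.0) ** 2

def single_run(seed):
    """One independent local-search run with its own random seed."""
    rng = random.Random(seed)
    x = rng.uniform(-10, 10)
    for _ in range(5000):
        candidate = x + rng.uniform(-0.5, 0.5)
        if objective(candidate) < objective(x):
            x = candidate
    return objective(x), x

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        results = pool.map(single_run, range(8))   # 8 independent runs on 4 workers
    best_value, best_x = min(results)
    print("best x:", round(best_x, 4), "value:", round(best_value, 8))
```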


Nature-inspired and metaphor-based metaheuristics

A very active area of research is the design of nature-inspired metaheuristics. Many recent metaheuristics, especially evolutionary computation-based algorithms, are inspired by natural systems. Nature acts as a source of concepts, mechanisms and principles for the design of artificial computing systems to deal with complex computational problems. Such metaheuristics include simulated annealing, evolutionary algorithms, ant colony optimization and particle swarm optimization. A large number of more recent metaphor-inspired metaheuristics have started to attract criticism in the research community for hiding their lack of novelty behind an elaborate metaphor.


Applications

Metaheuristics are used for combinatorial optimization, in which an optimal solution is sought over a discrete search-space. An example problem is the travelling salesman problem, where the search-space of candidate solutions grows faster than exponentially as the size of the problem increases, which makes an exhaustive search for the optimal solution infeasible. Additionally, multidimensional combinatorial problems, including most design problems in engineering such as form-finding and behavior-finding, suffer from the curse of dimensionality, which also makes them infeasible for exhaustive search or analytical methods. Metaheuristics are also widely used for job-shop scheduling and job selection problems. Popular metaheuristics for combinatorial problems include simulated annealing by Kirkpatrick et al., genetic algorithms by Holland et al., scatter search and tabu search by Glover. A literature review on metaheuristic optimization suggested that it was Fred Glover who coined the word ''metaheuristics''.
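
For a combinatorial example in the spirit of the travelling salesman problem mentioned above, the following sketch applies simulated annealing with a 2-opt neighborhood (reversing a segment of the tour) to a small set of random city coordinates. The instance, cooling schedule, and parameter values are all illustrative assumptions.

```python
import math
import random

def tour_length(tour, cities):
    """Total length of a closed tour visiting all cities."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal_tsp(cities, iters=20000, t0=10.0):
    """Simulated annealing over 2-opt moves."""
    n = len(cities)
    tour = list(range(n))
    random.shuffle(tour)
    best, best_len = tour[:], tour_length(tour, cities)
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9                              # linear cooling
        i, j = sorted(random.sample(range(n), 2))
        candidate = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]    # 2-opt move
        delta = tour_length(candidate, cities) - tour_length(tour, cities)
        if delta < 0 or random.random() < math.exp(-delta / t):
            tour = candidate
            if tour_length(tour, cities) < best_len:
                best, best_len = tour[:], tour_length(tour, cities)
    return best, best_len

random.seed(3)
cities = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(25)]
tour, length = anneal_tsp(cities)
print("tour length:", round(length, 2))
```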


Metaheuristic optimization frameworks

A metaheuristic optimization framework (MOF) can be defined as "a set of software tools that provide a correct and reusable implementation of a set of metaheuristics, and the basic mechanisms to accelerate the implementation of its partner subordinate heuristics (possibly including solution encodings and technique-specific operators), which are necessary to solve a particular problem instance using techniques provided". Many candidate optimization tools can be considered a MOF of varying features: Comet, EvA2, evolvica, Evolutionary::Algorithm, GAPlayground, jaga, JCLEC, JGAP, jMetal, n-genes, Open Beagle, Opt4j, ParadisEO/EO, Pisa, Watchmaker, FOM, Hypercube, HotFrame, Templar, EasyLocal, iOpt, OptQuest, JDEAL, Optimization Algorithm Toolkit, HeuristicLab, MAFRA, Localizer, GALIB, DREAM, Discropt, MALLBA, MAGMA, Metaheuristics.jl, UOF and OptaPlanner.


Contributions

Many different metaheuristics are in existence and new variants are continually being proposed. Some of the most significant contributions to the field are:
* 1952: Robbins and Monro work on stochastic optimization methods.
* 1954: Barricelli carries out the first simulations of the evolution process and uses them on general optimization problems.
* 1963: Rastrigin proposes random search.
* 1965: Matyas proposes random optimization.
* 1965: Nelder and Mead propose a simplex heuristic, which was shown by Powell to converge to non-stationary points on some problems.
* 1965: Ingo Rechenberg discovers the first evolution strategies algorithm.
* 1966: Fogel et al. propose evolutionary programming.
* 1970: Hastings proposes the Metropolis–Hastings algorithm.
* 1970: Cavicchio proposes adaptation of control parameters for an optimizer.
* 1970: Kernighan and Lin propose a graph partitioning method, related to variable-depth search and prohibition-based (tabu) search.
* 1975: Holland proposes the genetic algorithm.
* 1977: Glover proposes scatter search.
* 1978: Mercer and Sampson propose a metaplan for tuning an optimizer's parameters by using another optimizer.
* 1980: Smith describes genetic programming.
* 1983: Kirkpatrick et al. propose simulated annealing.
* 1986: Glover proposes tabu search, the first mention of the term ''metaheuristic''.
* 1989: Moscato proposes memetic algorithms.
* 1990: Moscato and Fontanari, and Dueck and Scheuer, independently propose a deterministic update rule for simulated annealing which accelerates the search; this leads to the threshold accepting metaheuristic.
* 1992: Dorigo introduces ant colony optimization in his PhD thesis.
* 1995: Wolpert and Macready prove the no free lunch theorems.


See also

* Stochastic search
* Meta-optimization
* Matheuristics
* Hyper-heuristics
* Swarm intelligence
* Genetic algorithms
* Genetic programming
* Simulated annealing
* Workforce modeling


References


Further reading

* Ashish Sharma (2022), "Nature Inspired Algorithms with Randomized Hypercomputational Perspective". ''Information Sciences''. https://doi.org/10.1016/j.ins.2022.05.020


External links

* EU/ME, a forum for researchers in the field.