Variable neighborhood search (VNS), proposed by Mladenović and Hansen in 1997, is a metaheuristic method for solving combinatorial optimization and global optimization problems. It explores distant neighborhoods of the current incumbent solution and moves to a new solution if and only if an improvement is made. A local search method is applied repeatedly to get from solutions in the neighborhood to local optima. VNS was designed for approximating solutions of discrete and continuous optimization problems, and accordingly it is aimed at solving linear programming problems, integer programming problems, mixed integer programming problems, nonlinear programming problems, etc.


Introduction

VNS systematically changes the neighborhood in two phases: first, a descent to find a local optimum, and then a perturbation phase to get out of the corresponding valley. Applications are rapidly increasing in number and pertain to many fields: location theory, cluster analysis, scheduling, vehicle routing, network design, lot-sizing, artificial intelligence, engineering, pooling problems, biology, phylogeny, reliability, geometry, telecommunication design, etc. There are several books important for understanding VNS, such as ''Handbook of Metaheuristics'' (2010), ''Handbook of Metaheuristics'' (2003), and ''Search Methodologies'' (2005). Earlier work that motivated this approach can be found in Davidon, W.C.; Fletcher, R. and Powell, M.J.D.; Mladenović, N.; and Brimberg, J. and Mladenović, N. Recent surveys on VNS methodology, as well as numerous applications, can be found in ''4OR'' (2008) and ''Annals of OR'' (2010).


Definition of the problem

Define one deterministic optimization problem with

\min \{ f(x) \mid x \in X,\ X \subseteq S \},     (1)

where ''S'', ''X'', ''x'', and ''f'' are the solution space, the feasible set, a feasible solution, and a real-valued objective function, respectively. If ''S'' is a finite but large set, a combinatorial optimization problem is defined. If S = \mathbb{R}^n, there is a continuous optimization model.

A solution x^* \in X is optimal if f(x^*) \le f(x),\ \forall x \in X.

An exact algorithm for problem (1) finds an optimal solution ''x*'' together with the validation of its optimality, or, if no optimal solution exists, shows that there is no feasible solution, i.e., X = \varnothing, or that the problem is unbounded. Moreover, the CPU time has to be finite and short. For continuous optimization, it is reasonable to allow for some degree of tolerance, i.e., to stop when a feasible solution x^* has been found such that

f(x^*) < f(x) + \epsilon,\ \forall x \in X, or (f(x^*) - f(x))/f(x^*) < \epsilon,\ \forall x \in X.

Some heuristics speedily deliver an approximate solution, or even an optimal solution, but with no validation of its optimality. Some of them have an incorrect certificate, i.e., the solution x_h obtained satisfies (f(x_h) - f(x^*))/f(x_h) \le \epsilon for some \epsilon, though this is rarely small. Heuristics are faced with the problem of local optima as a result of avoiding unbounded computing time. A local optimum x_L of the problem is such that

f(x_L) \le f(x),\ \forall x \in N(x_L) \cap X,

where N(x_L) denotes a neighborhood of x_L.
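
For illustration only, the following Python sketch instantiates problem (1) for a tiny combinatorial case: S = X = \{0,1\}^n, with an arbitrary objective ''f'' chosen purely for the example (none of these names come from the source).

import itertools

n = 4  # small enough that S can be enumerated exhaustively

def f(x):
    # Arbitrary objective chosen for illustration; x is a binary tuple.
    return sum((i + 1) * xi for i, xi in enumerate(x)) \
        - 3 * sum(x[i] * x[i + 1] for i in range(len(x) - 1))

# Since S is finite and small, an optimal solution x* can be found exactly
# by exhaustive enumeration; heuristics become necessary as n grows.
x_star = min(itertools.product((0, 1), repeat=n), key=f)
print(x_star, f(x_star))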


Description

According to (Mladenović, 1995), VNS is a metaheuristic which systematically performs the procedure of neighborhood change, both in descent to local minima and in escape from the valleys which contain them.

VNS is built upon the following perceptions:
# A local minimum with respect to one neighborhood structure is not necessarily a local minimum for another neighborhood structure.
# A global minimum is a local minimum with respect to all possible neighborhood structures.
# For many problems, local minima with respect to one or several neighborhoods are relatively close to each other.

Unlike many other metaheuristics, the basic schemes of VNS and its extensions are simple and require few, and sometimes no, parameters. Therefore, in addition to providing very good solutions, often in simpler ways than other methods, VNS gives insight into the reasons for such a performance, which, in turn, can lead to more efficient and sophisticated implementations. This is discussed in several of the papers mentioned above, such as (Hansen and Mladenović 1999, 2001a, 2003, 2005; Moreno-Pérez et al.).


Local search

A local search heuristic is performed by choosing an initial solution ''x'', discovering a direction of descent from ''x'' within a neighborhood ''N(x)'', and proceeding to the minimum of ''f(x)'' within ''N(x)'' in that direction. If there is no direction of descent, the heuristic stops; otherwise, it is iterated. Usually the steepest direction of descent, also referred to as best improvement, is used. This set of rules is summarized in Algorithm 1, where we assume that an initial solution ''x'' is given. The output consists of a local minimum, denoted by ''x' '', and its value. Observe that a neighborhood structure ''N(x)'' is defined for all ''x ∈ X''. At each step, the neighborhood ''N(x)'' of ''x'' is explored completely. As this may be time-consuming, an alternative is to use the first descent heuristic. Vectors x^i \in N(x) are then enumerated systematically and a move is made as soon as a descent direction is found. This is summarized in Algorithm 2.


Algorithm 1: Best improvement (steepest descent) heuristic

Function BestImprovement(x)
  1: repeat
  2:     x' ← x
  3:     x ← argmin_{y ∈ N(x)} f(y)
  4: until ( f(x) ≥ f(x') )
  5: return x'
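
A direct Python transcription of Algorithm 1 might look as follows; ''f'' is the objective function and ''neighbors(x)'' enumerates ''N(x)'', both illustrative names assumed to be supplied by the caller, not part of the source.

def best_improvement(x, f, neighbors):
    # Algorithm 1: move to the best neighbor until no neighbor improves f.
    while True:
        x_prev = x
        # Explore N(x) completely and take the steepest descent step.
        x = min(neighbors(x_prev), key=f, default=x_prev)
        if f(x) >= f(x_prev):
            return x_prev  # local minimum with respect to N

# Example neighborhood for binary tuples: flip exactly one bit.
def neighbors(x):
    return [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(len(x))]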


Algorithm 2: First improvement (first descent) heuristic

Function FirstImprovement(x)
  1: repeat
  2:    x' ← x; i ← 0
  3:    repeat
  4:       i ← i + 1
  5:       x ← argmin{ f(x), f(x^i) }, x^i ∈ N(x)
  6:    until ( f(x) < f(x^i) or i = |N(x)| )
  7: until ( f(x) ≥ f(x') )
  8: return x'
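
Algorithm 2 admits an equally short sketch under the same assumed interface; the inner loop accepts the first improving neighbor instead of scanning all of ''N(x)''.

def first_improvement(x, f, neighbors):
    # Algorithm 2: accept the first improving neighbor; stop when none exists.
    while True:
        x_prev = x
        for y in neighbors(x_prev):
            if f(y) < f(x_prev):
                x = y  # first descent direction found: move immediately
                break
        if f(x) >= f(x_prev):
            return x_prev
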
Let us denote by \mathcal{N}_k (k = 1, \ldots, k_{max}) a finite set of pre-selected neighborhood structures, and by \mathcal{N}_k(x) the set of solutions in the ''k''th neighborhood of ''x''. We will also use the notation \mathcal{N}'_k(x), k = 1, \ldots, k'_{max}, when describing local descent. Neighborhoods \mathcal{N}_k(x) or \mathcal{N}'_k(x) may be induced from one or more metric (or quasi-metric) functions introduced into a solution space ''S''. An optimal solution x^* (or global minimum) is a feasible solution where a minimum of the problem is reached. We call ''x' ∈ X'' a local minimum of the problem with respect to \mathcal{N}_k, if there is no solution x \in \mathcal{N}_k(x') \subseteq X such that f(x) < f(x').

In order to solve the problem by using several neighborhoods, facts 1–3 can be used in three different ways: (i) deterministic; (ii) stochastic; (iii) both deterministic and stochastic. We first give in Algorithm 3 the steps of the neighborhood change function which will be used later. The function NeighborhoodChange() compares the new value f(x') with the incumbent value f(x) obtained in the neighborhood k (line 1). If an improvement is obtained, k is returned to its initial value and the incumbent is updated (lines 2–3). Otherwise, the next neighborhood is considered (line 5).


Algorithm 3: Neighborhood change

Function NeighborhoodChange (x, x', k)
 1: if f (x') < f(x) then
 2:    x ← x'     // Make a move
 3:    k ← 1      // Initial neighborhood
 4: else
 5:    k ← k+1    // Next neighborhood
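
In Python, NeighborhoodChange() is naturally written as a pure function returning the updated incumbent and neighborhood index; the tuple-returning style is an implementation choice for this sketch, not prescribed by the source.

def neighborhood_change(x, x_new, k, f):
    # Algorithm 3: accept x_new and reset k on improvement,
    # otherwise keep the incumbent and move to the next neighborhood.
    if f(x_new) < f(x):
        return x_new, 1  # make a move, restart from the first neighborhood
    return x, k + 1      # next neighborhood
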
When VNS does not render a good solution, there are several steps that may help in the process, such as comparing first and best improvement strategies in local search, reducing the neighborhood, intensifying shaking, adopting VND, adopting FSS, and experimenting with parameter settings.

The basic VNS (BVNS) method (''Handbook of Metaheuristics'', 2010) combines deterministic and stochastic changes of neighborhood. Its steps are given in Algorithm 4. Often successive neighborhoods \mathcal{N}_k are nested. Observe that point ''x' '' is generated at random in Step 4 in order to avoid cycling, which might occur if a deterministic rule were applied. In Step 5, the best improvement local search (Algorithm 1) is usually adopted. However, it can be replaced with first improvement (Algorithm 2).


Algorithm 4: Basic VNS

Function VNS (x, kmax, tmax)
 1: repeat
 2:    k ← 1
 3:    repeat
 4:       x' ← Shake(x, k)                   // Shaking
 5:       x'' ← BestImprovement(x')          // Local search
 6:       x ← NeighborhoodChange(x, x'', k)  // Change neighborhood
 7:    until k = kmax
 8:    t ← CpuTime()
 9: until t > tmax
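
Combining the sketches above (best_improvement and neighborhood_change from earlier sections, plus a shaking step that, as an assumption for this example, flips ''k'' random bits of a binary solution), Algorithm 4 can be rendered as:

import random, time

def shake(x, k):
    # Draw a random point from the kth neighborhood: flip k distinct bits.
    # Assumes k <= len(x).
    idx = set(random.sample(range(len(x)), k))
    return tuple(1 - xi if i in idx else xi for i, xi in enumerate(x))

def bvns(x, k_max, t_max, f, neighbors):
    # Algorithm 4: Basic VNS with best-improvement local search.
    start = time.time()
    while time.time() - start <= t_max:
        k = 1
        while k <= k_max:
            x1 = shake(x, k)                         # shaking
            x2 = best_improvement(x1, f, neighbors)  # local search
            x, k = neighborhood_change(x, x2, k, f)  # change neighborhood
    return x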


VNS variants

The basic VNS is a best improvement descent method with randomization. Without much additional effort, it can be transformed into a descent-ascent method: in the NeighborhoodChange() function, also replace ''x'' by ''x"'' with some probability, even if the solution is worse than the incumbent. It can also be changed into a first improvement method. Another variant of the basic VNS is to find the solution ''x' '' in the 'Shaking' step as the best among ''b'' (a parameter) randomly generated solutions from the ''k''th neighborhood. There are two possible variants of this extension: (1) to perform only one local search from the best among ''b'' points; (2) to perform all ''b'' local searches and then choose the best; both are sketched below. An algorithm of this kind can be found in Fleszar and Hindi.
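
Both variants of the ''b''-point shaking extension can be expressed compactly, reusing shake and best_improvement from the sketches above; the function names here are hypothetical.

def shake_variant_1(x, k, b, f, neighbors):
    # (1) One local search, started from the best of b shaken points.
    x1 = min((shake(x, k) for _ in range(b)), key=f)
    return best_improvement(x1, f, neighbors)

def shake_variant_2(x, k, b, f, neighbors):
    # (2) b local searches, one per shaken point; keep the best result.
    return min((best_improvement(shake(x, k), f, neighbors) for _ in range(b)),
               key=f)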


Extensions

* VND
:The variable neighborhood descent (VND) method is obtained if a change of neighborhoods is performed in a deterministic way (a minimal sketch is given after this list). In the descriptions of its algorithms, we assume that an initial solution ''x'' is given. Most local search heuristics use very few neighborhoods in their descent phase. The final solution should be a local minimum with respect to all k_{max} neighborhoods; hence the chances of reaching a global one are larger when using VND than with a single neighborhood structure.
* RVNS
:The reduced VNS (RVNS) method is obtained if random points are selected from \mathcal{N}_k(x) and no descent is made. Rather, the values of these new points are compared with that of the incumbent, and an update takes place in case of improvement. It is assumed that a stopping condition has been chosen, such as the maximum CPU time allowed t_{max} or the maximum number of iterations between two improvements.
:To simplify the description of the algorithms, t_{max} is used below. Therefore, RVNS uses two parameters: t_{max} and k_{max}. RVNS is useful in very large instances, for which local search is costly. It has been observed that the best value for the parameter k_{max} is often 2. In addition, the maximum number of iterations between two improvements is usually used as a stopping condition. RVNS is akin to a Monte Carlo method, but is more systematic.
* Skewed VNS
:The skewed VNS (SVNS) method (Hansen et al.) addresses the problem of exploring valleys far from the incumbent solution. Indeed, once the best solution in a large region has been found, it is necessary to go some way to obtain an improved one. Solutions drawn at random in distant neighborhoods may differ substantially from the incumbent, and VNS can then degenerate, to some extent, into the multistart heuristic (in which descents are made iteratively from solutions generated at random, a heuristic which is known not to be very efficient). Consequently, some compensation for distance from the incumbent must be made.
* Variable Neighborhood Decomposition Search
:The variable neighborhood decomposition search (VNDS) method (Hansen et al.) extends the basic VNS into a two-level VNS scheme based upon a decomposition of the problem. For ease of presentation, but without loss of generality, it is assumed that the solution x represents a set of some elements.
* Parallel VNS
:Several ways of parallelizing VNS have recently been proposed for solving the p-median problem. In García-López et al., three of them are tested: (i) parallelize the local search; (ii) augment the number of solutions drawn from the current neighborhood and perform a local search in parallel from each of them; and (iii) do the same as (ii), but update the information about the best solution found. Three parallel VNS strategies are also suggested for solving the travelling purchaser problem in Ochi et al.
* Primal-dual VNS
:For most modern heuristics, the difference in value between the optimal solution and the obtained one is completely unknown. Guaranteed performance of the primal heuristic may be determined if a lower bound on the objective function value is known. To this end, the standard approach is to relax the integrality condition on the primal variables, based on a mathematical programming formulation of the problem.
:However, when the dimension of the problem is large, even the relaxed problem may be impossible to solve exactly by standard commercial solvers. Therefore, it seems a good idea to solve dual relaxed problems heuristically as well. In this way, guaranteed bounds on the performance of the primal heuristics are obtained. In primal-dual VNS (PD-VNS) (Hansen et al.), one possible general way to attain both the guaranteed bounds and the exact solution is proposed.
* Variable Neighborhood Branching
:The mixed integer linear programming (MILP) problem consists of maximizing or minimizing a linear function, subject to equality or inequality constraints and integrality restrictions on some of the variables.
* Variable Neighborhood Formulation Space Search
:FSS is a method which is very useful because one problem can be defined in several additional formulations, and moving through the formulations is legitimate. It has been proved that local search works within formulations, implying a final solution when started from some initial solution in the first formulation. Local search systematically alternates between different formulations; this was investigated for the circle packing problem, where a stationary point of a nonlinear programming formulation in Cartesian coordinates is not necessarily a stationary point in polar coordinates.
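
As referenced in the VND item above, here is a minimal VND sketch, assuming a list ''neighborhoods'' of functions, each enumerating one neighborhood structure \mathcal{N}_k(x); this interface is illustrative, not from the source.

def vnd(x, f, neighborhoods):
    # Variable neighborhood descent: deterministic change of neighborhoods.
    # neighborhoods[k] maps a solution x to an iterable over N_{k+1}(x).
    k = 0
    while k < len(neighborhoods):
        y = min(neighborhoods[k](x), key=f, default=x)  # best neighbor in N_k
        if f(y) < f(x):
            x, k = y, 0  # improvement: move and restart from first neighborhood
        else:
            k += 1       # no improvement: try the next neighborhood structure
    return x  # local minimum with respect to all neighborhood structures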