Unrelated-machines scheduling is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. We need to schedule ''n'' jobs ''J<sub>1</sub>'', ''J<sub>2</sub>'', ..., ''J<sub>n</sub>'' on ''m'' different machines, such that a certain objective function is optimized (usually, the makespan should be minimized). The time that machine ''i'' needs in order to process job ''j'' is denoted by ''p<sub>i,j</sub>''. The term ''unrelated'' emphasizes that there is no relation between the values of ''p<sub>i,j</sub>'' for different ''i'' and ''j''. This is in contrast to two special cases of this problem:
* uniform-machines scheduling - in which ''p<sub>i,j</sub>'' = ''p<sub>j</sub>'' / ''s<sub>i</sub>'' (where ''p<sub>j</sub>'' is the length of job ''j'' and ''s<sub>i</sub>'' is the speed of machine ''i''), and
* identical-machines scheduling - in which ''p<sub>i,j</sub>'' = ''p<sub>j</sub>'' (each job has the same run-time on all machines).
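As a concrete illustration, the following Python sketch represents an instance by a matrix of processing times and computes the makespan of a given assignment of jobs to machines; the numbers, the helper name and the assignment are arbitrary examples, not taken from any reference.
<syntaxhighlight lang="python">
# A small illustrative instance: p[i][j] = time machine i needs for job j.
# The values are arbitrary example data.
p = [
    [3, 7, 2, 5],   # machine 0
    [4, 1, 6, 8],   # machine 1
]

def makespan(assignment, p):
    """Return the maximum completion time, where assignment[j] is the machine of job j."""
    loads = [0] * len(p)
    for j, i in enumerate(assignment):
        loads[i] += p[i][j]
    return max(loads)

# Jobs 0 and 2 on machine 0, jobs 1 and 3 on machine 1: loads 5 and 9, makespan 9.
print(makespan([0, 1, 0, 1], p))
</syntaxhighlight>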
In the standard three-field notation for optimal job scheduling problems, the unrelated-machines variant is denoted by R in the first field. For example, the problem denoted by "R||''C<sub>max</sub>''" is an unrelated-machines scheduling problem with no constraints, where the goal is to minimize the maximum completion time.
In some variants of the problem, instead of minimizing the ''maximum'' completion time, it is desired to minimize the ''average'' completion time (averaged over all ''n'' jobs); this variant is denoted by R||Σ''C<sub>j</sub>''. More generally, when some jobs are more important than others, it may be desired to minimize a ''weighted average'' of the completion time, where each job has a different weight. This is denoted by R||Σ''w<sub>j</sub>C<sub>j</sub>''.
In a third variant, the goal is to ''maximize'' the ''minimum'' completion time, "R||''C<sub>min</sub>''". This variant corresponds to the problem of egalitarian item allocation (max-min item allocation).
Algorithms
Minimizing the maximum completion time (makespan)
Minimizing the ''maximum'' completion time is NP-hard even for ''identical'' machines, by reduction from the
partition problem.
Horowitz and Sahni
presented:
* Exact dynamic programming algorithms for minimizing the ''maximum'' completion time on both uniform and unrelated machines. These algorithms run in exponential time (recall that these problems are all NP-hard); a brute-force sketch of such an exact search appears after this list.
* Polynomial-time approximation schemes, which for any ''ε''>0 attain a makespan of at most (1+''ε'')OPT. For minimizing the ''maximum'' completion time on two ''uniform'' machines, their algorithm runs in time polynomial in ''n'' and in 1/''ε'', so it is an FPTAS. For minimizing the ''maximum'' completion time on two ''unrelated'' machines, the run-time is also polynomial in ''n'' and 1/''ε''. They claim that their algorithms can be easily extended to any number of uniform machines, but do not analyze the run-time in this case.
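The following Python sketch is not the Horowitz–Sahni dynamic program itself; it merely illustrates what an exponential-time exact search computes, by enumerating all ''m''<sup>''n''</sup> assignments (the function name and the example data are illustrative assumptions).
<syntaxhighlight lang="python">
from itertools import product

def min_makespan_bruteforce(p):
    """Exact minimum makespan by enumerating all m**n job-to-machine assignments.

    p[i][j] is the time machine i needs for job j.  Exponential time; meant
    only to illustrate the objective, not an efficient exact algorithm.
    """
    m, n = len(p), len(p[0])
    best_value, best_assignment = float("inf"), None
    for assignment in product(range(m), repeat=n):
        loads = [0] * m
        for j, i in enumerate(assignment):
            loads[i] += p[i][j]
        if max(loads) < best_value:
            best_value, best_assignment = max(loads), assignment
    return best_value, best_assignment

print(min_makespan_bruteforce([[3, 7, 2, 5], [4, 1, 6, 8]]))
</syntaxhighlight>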
Lenstra, Shmoys and Tardos presented a polynomial-time 2-factor approximation algorithm, and proved that no polynomial-time algorithm with an approximation factor smaller than 3/2 is possible unless P=NP. Closing the gap between 2 and 3/2 is a long-standing open problem.
Verschae and Wiese presented a different 2-factor approximation algorithm.
Glass, Potts and Shade compare various local search techniques for minimizing the makespan on unrelated machines. Using computerized simulations, they find that tabu search and simulated annealing perform much better than genetic algorithms.
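As a rough sketch of the kind of neighbourhood search compared in such studies (a plain hill-climber with single-job reassignment moves, not any of the specific tabu-search, annealing or genetic variants; the function name is an illustrative assumption):
<syntaxhighlight lang="python">
def improve_by_reassignment(p, assignment):
    """Hill-climbing local search with single-job reassignment moves.

    p[i][j] is the time machine i needs for job j; assignment[j] is the machine
    currently running job j.  Repeatedly moves one job to another machine while
    this strictly decreases the makespan.  Tabu search and simulated annealing
    build on such moves but add mechanisms to escape local optima.
    """
    m, n = len(p), len(p[0])
    assignment = list(assignment)

    def makespan(a):
        loads = [0] * m
        for j, i in enumerate(a):
            loads[i] += p[i][j]
        return max(loads)

    improved = True
    while improved:
        improved = False
        for j in range(n):
            for i in range(m):
                candidate = assignment[:j] + [i] + assignment[j + 1:]
                if makespan(candidate) < makespan(assignment):
                    assignment, improved = candidate, True
    return assignment, makespan(assignment)
</syntaxhighlight>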
Minimizing the average completion time
Bruno, Coffman and Sethi present a polynomial-time algorithm for minimizing the average job completion time on ''unrelated'' machines, R||Σ''C<sub>j</sub>'' (the average, over all ''jobs'', of the time it takes to complete them).
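One standard polynomial-time route to this objective (a sketch of the classic reduction to an assignment problem, not necessarily the exact algorithm of Bruno, Coffman and Sethi; it assumes NumPy and SciPy are available and the function name is illustrative): if a job runs ''k''-th from the end on machine ''i'', it contributes ''k''·''p<sub>i,j</sub>'' to the sum of completion times, so minimizing the total (equivalently, the average) completion time is a minimum-cost matching of the ''n'' jobs to the ''m''·''n'' (machine, position-from-the-end) slots.
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linear_sum_assignment

def min_total_completion_time(p):
    """Minimize the sum of completion times on unrelated machines (R||sum C_j).

    p[i][j] is the time machine i needs for job j.  A job scheduled k-th from
    the end of machine i contributes k * p[i][j] to the total completion time,
    so the problem reduces to a min-cost matching of jobs to
    (machine, position-from-the-end) slots.
    """
    p = np.asarray(p, dtype=float)
    m, n = p.shape
    positions = np.arange(1, n + 1)                     # k = 1..n
    cost = p[:, :, None] * positions[None, None, :]     # shape (machine, job, k)
    cost = cost.transpose(1, 0, 2).reshape(n, m * n)    # rows: jobs, columns: slots
    rows, cols = linear_sum_assignment(cost)
    return cost[rows, cols].sum()                       # optimal sum of completion times

print(min_total_completion_time([[3, 7, 2, 5], [4, 1, 6, 8]]))
</syntaxhighlight>
Dividing the returned total by ''n'' gives the minimum average completion time.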
Minimizing the ''weighted average'' completion time, R||Σ''w<sub>j</sub>C<sub>j</sub>'' (where ''w<sub>j</sub>'' is the weight of job ''j''), is NP-hard even on ''identical'' machines, by reduction from the knapsack problem. It is NP-hard even if the number of machines is fixed and at least 2, by reduction from the partition problem.
Schulz and Skutella present a (3/2+''ε'')-approximation algorithm using randomized rounding. Their algorithm is a (2+''ε'')-approximation for the problem with job release times, R|''r<sub>j</sub>''|Σ''w<sub>j</sub>C<sub>j</sub>''.
Maximizing the profit
Bar-Noy, Bar-Yehuda, Freund, Naor and Schieber consider a setting in which, for each job and machine, there is a ''profit'' for running this job on that machine. They present a 1/2-approximation for discrete input and a (1-''ε'')/2-approximation for continuous input.
Maximizing the minimum completion time
Suppose that, instead of "jobs", we have valuable items, and instead of "machines", we have people. Person ''i'' values item ''j'' at ''p<sub>i,j</sub>''. We would like to allocate the items to the people such that the least-happy person is as happy as possible. This problem is equivalent to unrelated-machines scheduling in which the goal is to maximize the minimum completion time. It is better known by the name ''egalitarian'' or ''max-min item allocation''.
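For very small instances the max-min objective can be computed by brute force, as in the following sketch (exponential time, since the problem is NP-hard; the function name and data layout are illustrative assumptions):
<syntaxhighlight lang="python">
from itertools import product

def max_min_completion_time(p):
    """Maximize the minimum machine completion time by brute force.

    In the fair-division view, p[i][j] is the value person i assigns to item j,
    and the goal is to make the least-happy person as happy as possible.
    Enumerates all m**n allocations, so it is only usable for tiny instances.
    """
    m, n = len(p), len(p[0])
    best_value, best_assignment = -1, None
    for assignment in product(range(m), repeat=n):
        loads = [0] * m
        for j, i in enumerate(assignment):
            loads[i] += p[i][j]
        if min(loads) > best_value:
            best_value, best_assignment = min(loads), assignment
    return best_value, best_assignment
</syntaxhighlight>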
Linear programming formulation
A natural way to formulate the problem as a linear program is called the ''Lenstra–Shmoys–Tardos linear program (LST LP)''. For each machine ''i'' and job ''j'', define a variable ''x<sub>i,j</sub>'', which equals 1 iff machine ''i'' processes job ''j'', and 0 otherwise. Then, for a target makespan ''T'', the LP constraints are:
* Σ<sub>''i''</sub> ''x<sub>i,j</sub>'' = 1 for every job ''j'' in 1,...,''n;''
* Σ<sub>''j''</sub> ''p<sub>i,j</sub>''·''x<sub>i,j</sub>'' ≤ ''T'' for every machine ''i'' in 1,...,''m;''
* ''x<sub>i,j</sub>'' ∈ {0,1} for every ''i'', ''j''.
Relaxing the integer constraints gives a linear program with size polynomial in the input. The relaxed problem can be solved, and the fractional solution rounded, to obtain a 2-approximation to the problem.
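A minimal sketch of solving the relaxed LP with SciPy (assuming NumPy and SciPy are available; the function name is illustrative, and the extreme-point rounding step of Lenstra, Shmoys and Tardos is deliberately not included):
<syntaxhighlight lang="python">
import numpy as np
from scipy.optimize import linprog

def lst_lp_relaxation(p):
    """Solve the relaxed LST LP: minimize T subject to
    sum_i x[i,j] = 1 for every job j, sum_j p[i,j]*x[i,j] <= T for every
    machine i, and x >= 0.  Returns the fractional assignment and the LP
    value, which is a lower bound on the optimal makespan.
    """
    p = np.asarray(p, dtype=float)
    m, n = p.shape
    num_x = m * n                        # variables x[i,j] first, then T last
    c = np.zeros(num_x + 1)
    c[-1] = 1.0                          # objective: minimize T

    A_eq = np.zeros((n, num_x + 1))      # each job is fully assigned
    for j in range(n):
        for i in range(m):
            A_eq[j, i * n + j] = 1.0
    b_eq = np.ones(n)

    A_ub = np.zeros((m, num_x + 1))      # load of machine i minus T is at most 0
    for i in range(m):
        A_ub[i, i * n:(i + 1) * n] = p[i]
        A_ub[i, -1] = -1.0
    b_ub = np.zeros(m)

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (num_x + 1))
    return res.x[:num_x].reshape(m, n), res.fun

fractional_x, lp_lower_bound = lst_lp_relaxation([[3, 7, 2, 5], [4, 1, 6, 8]])
print(lp_lower_bound)
</syntaxhighlight>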
Another LP formulation is the configuration linear program. For each machine ''i'', there are finitely many subsets of jobs that can be processed by machine ''i'' in time at most ''T''. Each such subset is called a ''configuration'' for machine ''i''. Denote by ''C<sub>i</sub>''(''T'') the set of all configurations for machine ''i''. For each machine ''i'' and configuration ''c'' in ''C<sub>i</sub>''(''T''), define a variable ''x<sub>i,c</sub>'', which equals 1 iff the actual configuration used on machine ''i'' is ''c'', and 0 otherwise. Then, the LP constraints are:
* Σ<sub>''c''∈''C<sub>i</sub>''(''T'')</sub> ''x<sub>i,c</sub>'' = 1 for every machine ''i'' in 1,...,''m;''
* Σ<sub>''i''</sub> Σ<sub>''c'': ''j''∈''c''</sub> ''x<sub>i,c</sub>'' = 1 for every job ''j'' in 1,...,''n;''
* ''x<sub>i,c</sub>'' ∈ {0,1} for every machine ''i'' and configuration ''c''.
Note that the number of configurations is usually exponential in the size of the problem, so the size of the configuration LP is exponential. However, in some cases it is possible to bound the number of possible configurations, and therefore find an approximate solution in polynomial time.
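For tiny instances the configuration LP can be written down explicitly, as in the following feasibility-checking sketch (assuming NumPy and SciPy; the function name is illustrative, and enumerating all configurations is exponential, so this is only a demonstration):
<syntaxhighlight lang="python">
import numpy as np
from itertools import combinations
from scipy.optimize import linprog

def configuration_lp_feasible(p, T):
    """Check feasibility of the relaxed configuration LP for a makespan bound T.

    A configuration for machine i is a subset of jobs with total time at most T.
    The LP requires x[i,c] >= 0, each machine to use exactly one configuration,
    and each job to be covered exactly once.
    """
    p = np.asarray(p, dtype=float)
    m, n = p.shape

    configs = []                                    # list of (machine, frozenset of jobs)
    for i in range(m):
        for size in range(n + 1):
            for subset in combinations(range(n), size):
                if sum(p[i, j] for j in subset) <= T:
                    configs.append((i, frozenset(subset)))

    A_eq = np.zeros((m + n, len(configs)))
    b_eq = np.ones(m + n)
    for v, (i, subset) in enumerate(configs):
        A_eq[i, v] = 1.0                            # machine i uses exactly one configuration
        for j in subset:
            A_eq[m + j, v] = 1.0                    # job j is covered exactly once
    res = linprog(np.zeros(len(configs)), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * len(configs))
    return res.success

print(configuration_lp_feasible([[3, 7, 2, 5], [4, 1, 6, 8]], T=7))
</syntaxhighlight>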
Special cases
There is a special case in which ''p<sub>i,j</sub>'' is either 1 or infinity. In other words, each job can be processed only on a subset of ''allowed machines'', and its run-time on each of these machines is 1. This variant is sometimes denoted by "P|''p<sub>j</sub>''=1,''M<sub>j</sub>''|''C<sub>max</sub>''". It can be solved in polynomial time.
Extensions
Kim, Kim, Jang and Chen extend the problem by allowing each job to have a setup time, which depends on the job but not on the machine. They present a solution using simulated annealing. Vallada and Ruiz present a solution using a genetic algorithm.
Caragiannis extends the problem in a different way, by assuming that the jobs are owned by selfish agents (see truthful job scheduling).
External links
Summary of parallel machine problems without preemption