Unrelated-machines Scheduling

Unrelated-machines scheduling is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. We need to schedule ''n'' jobs ''J''1, ''J''2, ..., ''Jn'' on ''m'' different machines, such that a certain objective function is optimized (usually, the makespan should be minimized). The time that machine ''i'' needs in order to process job ''j'' is denoted by ''pi,j''. The term ''unrelated'' emphasizes that there is no relation between the values of ''pi,j'' for different ''i'' and ''j''. This is in contrast to two special cases of this problem: uniform-machines scheduling - in which ''pi,j'' = ''pj'' / ''si'' (where ''pj'' is the length of job ''j'' and ''si'' is the speed of machine ''i''), and identical-machines scheduling - in which ''pi,j'' = ''pj'' (the same run-time on all machines).

In the standard three-field notation for optimal job scheduling problems, the unrelated-machines variant is denoted by R in the first field. For example, the problem denoted by "R||C_\max" is an unrelated-machines scheduling problem with no constraints, where the goal is to minimize the maximum completion time. In some variants of the problem, instead of minimizing the ''maximum'' completion time, it is desired to minimize the ''average'' completion time (averaged over all ''n'' jobs); this is denoted by R||\sum C_i. More generally, when some jobs are more important than others, it may be desired to minimize a ''weighted average'' of the completion times, where each job has a different weight. This is denoted by R||\sum w_i C_i. In a third variant, the goal is to ''maximize'' the ''minimum'' completion time, "R||C_\min". This variant corresponds to the problem of egalitarian item allocation.
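
To make the notation concrete, the following small sketch (an illustration added here, not taken from the cited literature) represents an instance by its matrix of processing times ''pi,j'' and evaluates the two assignment-based objectives, C_\max and C_\min, for a given assignment of jobs to machines. The \sum C_i objective additionally depends on the order of jobs within each machine, so it is not shown here.

# An unrelated-machines instance is fully described by an m-by-n matrix,
# where p[i][j] is the time machine i needs to process job j.
p = [[4, 2, 7],   # processing times on machine 0
     [3, 6, 1]]   # processing times on machine 1   (m = 2 machines, n = 3 jobs)

def machine_loads(p, assignment):
    """assignment[j] = machine chosen for job j; returns each machine's completion time."""
    loads = [0] * len(p)
    for j, i in enumerate(assignment):
        loads[i] += p[i][j]
    return loads

loads = machine_loads(p, [1, 0, 1])   # job 0 -> machine 1, job 1 -> machine 0, job 2 -> machine 1
print(max(loads))   # C_max = 4 (makespan)
print(min(loads))   # C_min = 2 (relevant for the max-min variant)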


Algorithms


Minimizing the maximum completion time (makespan)

Minimizing the ''maximum'' completion time is NP-hard even for ''identical'' machines, by reduction from the partition problem.

Horowitz and Sahni presented:
* Exact dynamic programming algorithms for minimizing the ''maximum'' completion time on both uniform and unrelated machines. These algorithms run in exponential time (recall that these problems are all NP-hard).
* Polynomial-time approximation schemes, which for any ''ε''>0, attain at most (1+''ε'')OPT. For minimizing the ''maximum'' completion time on two ''uniform'' machines, their algorithm runs in time O(10^{2l} n), where ''l'' is the smallest integer for which \epsilon \geq 2\cdot 10^{-l}. Therefore, the run-time is in O(n / \epsilon^2), so it is an FPTAS. For minimizing the ''maximum'' completion time on two ''unrelated'' machines, the run-time is O(10^{l} n^2) = O(n^2 / \epsilon). They claim that their algorithms can be easily extended to any number of uniform machines, but do not analyze the run-time in this case.

Lenstra, Shmoys and Tardos presented a polynomial-time 2-factor approximation algorithm, and proved that no polynomial-time algorithm with an approximation factor smaller than 3/2 is possible unless P=NP. Closing the gap between 2 and 3/2 is a long-standing open problem. Verschae and Wiese presented a different 2-factor approximation algorithm.

Glass, Potts and Shade compare various local search techniques for minimizing the makespan on unrelated machines. Using computerized simulations, they find that tabu search and simulated annealing perform much better than genetic algorithms.
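
For very small instances, the exponential-time exact approach can be illustrated by plain enumeration. The sketch below is an illustration only (it is not the Horowitz–Sahni dynamic program): it tries all m^n possible assignments and keeps the one with the smallest makespan.

# Naive exact solver for the makespan objective on unrelated machines.
# Runs in time O(m^n), so it is only usable for tiny instances.
from itertools import product

def brute_force_makespan(p):
    m, n = len(p), len(p[0])
    best_value, best_assignment = float("inf"), None
    for assignment in product(range(m), repeat=n):
        loads = [0] * m
        for j, i in enumerate(assignment):
            loads[i] += p[i][j]
        if max(loads) < best_value:
            best_value, best_assignment = max(loads), assignment
    return best_value, best_assignment

print(brute_force_makespan([[4, 2, 7], [3, 6, 1]]))   # -> (4, (1, 0, 1))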


Minimizing the average completion time

Bruno, Coffman and Sethi present an algorithm, running in time O(\max(m n^2,n^3)), for minimizing the average job completion time on ''unrelated'' machines, R||\sum C_j (the average, over all jobs, of the time it takes to complete them).

Minimizing the ''weighted average'' completion time, R||\sum w_j C_j (where ''wj'' is the weight of job ''j''), is NP-hard even on ''identical'' machines, by reduction from the knapsack problem. It is NP-hard even if the number of machines is fixed and at least 2, by reduction from the partition problem.

Schulz and Skutella present a (3/2+''ε'')-approximation algorithm using randomized rounding. Their algorithm is a (2+''ε'')-approximation for the problem with job release times, R|r_j|\sum w_j C_j.
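
The polynomial-time solvability of R||\sum C_j is commonly explained through a reduction to a weighted bipartite matching (assignment) problem: on any single machine, the job scheduled ''k''-th from the end contributes ''k'' times its processing time to the sum of completion times. The sketch below illustrates that matching formulation using SciPy's assignment-problem solver; it is an illustration added here, not the authors' original algorithm.

# Minimize the sum of completion times on unrelated machines (R||sum C_j)
# via an assignment problem over "slots" (machine i, position k-th from the end).
import numpy as np
from scipy.optimize import linear_sum_assignment

def min_total_completion_time(p):
    p = np.asarray(p)
    m, n = p.shape
    # Placing job j in slot (i, k) costs k * p[i, j], because that job delays
    # itself and the k - 1 jobs scheduled after it on machine i.
    cost = np.empty((n, m * n))
    for i in range(m):
        for k in range(1, n + 1):
            cost[:, i * n + (k - 1)] = k * p[i, :]
    rows, cols = linear_sum_assignment(cost)   # optimal job-to-slot matching
    return cost[rows, cols].sum()

print(min_total_completion_time([[4, 2, 7], [3, 6, 1]]))   # -> 7.0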


Maximizing the profit

Bar-Noy, Bar-Yehuda, Freund, Naor and Schieber consider a setting in which, for each job and machine, there is a ''profit'' for running the job on that machine. They present a 1/2-approximation for discrete input and a (1-''ε'')/2-approximation for continuous input.


Maximizing the minimum completion time

Suppose that, instead of "jobs", we have valuable items, and instead of "machines", we have people. Person ''i'' values item ''j'' at ''pi,j''. We would like to allocate the items to the people, such that the least-happy person is as happy as possible. This problem is equivalent to unrelated-machines scheduling in which the goal is to maximize the minimum completion time. It is better known by the name ''egalitarian'' or ''max-min item allocation''.


Linear programming formulation

A natural way to formulate the problem as a linear program is called the ''Lenstra–Shmoys–Tardos linear program (LST LP)''. For each machine ''i'' and job ''j'', define a variable z_{i,j}, which equals 1 if machine ''i'' processes job ''j'', and 0 otherwise. Then, the LP constraints are:
* \sum_{i=1}^m z_{i,j} = 1 for every job ''j'' in 1,...,''n'';
* \sum_{j=1}^n z_{i,j}\cdot p_{i,j} \leq T for every machine ''i'' in 1,...,''m'';
* z_{i,j} \in \{0,1\} for every ''i'', ''j''.
Relaxing the integer constraints gives a linear program with size polynomial in the input. The solution of the relaxed problem can be rounded to obtain a 2-approximation to the problem.

Another LP formulation is the configuration linear program. For each machine ''i'', there are finitely many subsets of jobs that can be processed by machine ''i'' in time at most ''T''. Each such subset is called a ''configuration'' for machine ''i''. Denote by ''Ci''(''T'') the set of all configurations for machine ''i''. For each machine ''i'' and configuration ''c'' in ''Ci''(''T''), define a variable x_{i,c}, which equals 1 if the actual configuration used on machine ''i'' is ''c'', and 0 otherwise. Then, the LP constraints are:
* \sum_{c\in C_i(T)} x_{i,c} = 1 for every machine ''i'' in 1,...,''m'';
* \sum_{i=1}^m \sum_{c\in C_i(T):\, j\in c} x_{i,c} = 1 for every job ''j'' in 1,...,''n'';
* x_{i,c} \in \{0,1\} for every ''i'', ''c''.
Note that the number of configurations is usually exponential in the size of the problem, so the size of the configuration LP is exponential. However, in some cases it is possible to bound the number of possible configurations, and therefore find an approximate solution in polynomial time.
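
As a concrete illustration of the LST LP (a sketch added here, assuming SciPy is available; it is not the rounding algorithm itself), the relaxation can be tested for feasibility for a guessed makespan ''T'' as follows. The variable layout follows the constraints above.

# Feasibility check of the relaxed LST LP for a guessed makespan T.
import numpy as np
from scipy.optimize import linprog

def lst_lp_feasible(p, T):
    p = np.asarray(p, dtype=float)
    m, n = p.shape
    num_vars = m * n                      # z[i, j] flattened as index i * n + j
    # Equality constraints: for each job j, sum_i z[i, j] = 1.
    A_eq = np.zeros((n, num_vars))
    for j in range(n):
        for i in range(m):
            A_eq[j, i * n + j] = 1.0
    b_eq = np.ones(n)
    # Inequality constraints: for each machine i, sum_j p[i, j] * z[i, j] <= T.
    A_ub = np.zeros((m, num_vars))
    for i in range(m):
        A_ub[i, i * n: (i + 1) * n] = p[i]
    b_ub = np.full(m, T)
    res = linprog(c=np.zeros(num_vars), A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq, bounds=(0.0, 1.0))
    return res.success                    # True iff the relaxation is feasible for this T

print(lst_lp_feasible([[4, 2, 7], [3, 6, 1]], T=4))   # True: the integral optimum 4 is feasible
print(lst_lp_feasible([[4, 2, 7], [3, 6, 1]], T=1))   # False: T is too small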


Special cases

There is a special case in which ''pi,j'' is either 1 or infinity. In other words, each job can be processed on a subset of ''allowed machines'', and its run-time on each of these machines is 1. This variant is sometimes denoted by "P|p_j=1,M_j|C_\max". It can be solved in polynomial time.
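
One standard way to see the polynomial-time solvability of this unit-time, restricted-assignment case is via maximum flow: guess a makespan ''T'', connect a source to every job with capacity 1, every job to each of its allowed machines with capacity 1, and every machine to a sink with capacity ''T''; a flow of value ''n'' exists if and only if a schedule with makespan at most ''T'' exists. The sketch below (an illustration using the networkx library, not taken from a specific reference) binary-searches the smallest such ''T''.

# Unit jobs with machine-eligibility constraints: minimize the makespan
# by binary search over T plus a max-flow feasibility test.
import networkx as nx

def min_makespan_unit_jobs(allowed, m):
    """allowed[j] = set of machines on which job j may run (all run-times are 1).
    Assumes every job has at least one allowed machine."""
    n = len(allowed)

    def feasible(T):
        G = nx.DiGraph()
        for j in range(n):
            G.add_edge("s", ("job", j), capacity=1)
            for i in allowed[j]:
                G.add_edge(("job", j), ("machine", i), capacity=1)
        for i in range(m):
            G.add_edge(("machine", i), "t", capacity=T)
        value, _ = nx.maximum_flow(G, "s", "t")
        return value == n

    lo, hi = 1, n
    while lo < hi:                        # smallest T for which a schedule exists
        mid = (lo + hi) // 2
        if feasible(mid):
            hi = mid
        else:
            lo = mid + 1
    return lo

# Three jobs, two machines; job 2 may only run on machine 0.
print(min_makespan_unit_jobs([{0, 1}, {0, 1}, {0}], m=2))   # -> 2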


Extensions

Kim, Kim, Jang and Chen extend the problem by allowing each job to have a setup time, which depends on the job but not on the machine. They present a solution using simulated annealing. Vallada and Ruiz present a solution using a genetic algorithm. Nisan and Ronen, in their 1999 paper on algorithmic mechanism design, extend the problem in a different way, by assuming that the jobs are owned by selfish agents (see Truthful job scheduling).


External links


Summary of parallel machine problems without preemption

