In the analysis of algorithms, the master theorem for divide-and-conquer recurrences provides an asymptotic analysis (using Big O notation) for recurrence relations of types that occur in the analysis of many divide-and-conquer algorithms. The approach was first presented by Jon Bentley, Dorothea Blostein (née Haken), and James B. Saxe in 1980, where it was described as a "unifying method" for solving such recurrences. The name "master theorem" was popularized by the widely used algorithms textbook ''Introduction to Algorithms'' by Cormen, Leiserson, Rivest, and Stein. Not all recurrence relations can be solved with the use of this theorem; its generalizations include the Akra–Bazzi method.


Introduction

Consider a problem that can be solved using a recursive algorithm such as the following:

 procedure p(input ''x'' of size ''n''):
     if ''n'' < some constant ''k'':
         Solve ''x'' directly without recursion
     else:
         Create ''a'' subproblems of ''x'', each having size ''n''/''b''
         Call procedure ''p'' recursively on each subproblem
         Combine the results from the subproblems

The above algorithm divides the problem into a number of subproblems recursively, each subproblem being of size ''n''/''b''. Its solution tree has a node for each recursive call, with the children of that node being the other calls made from that call. The leaves of the tree are the base cases of the recursion, the subproblems (of size less than ''k'') that do not recurse. The above example would have ''a'' child nodes at each non-leaf node. Each node does an amount of work that corresponds to the size of the subproblem passed to that instance of the recursive call, given by f(n). The total amount of work done by the entire algorithm is the sum of the work performed by all the nodes in the tree.

The runtime of an algorithm such as ''p'' above on an input of size ''n'', usually denoted T(n), can be expressed by the recurrence relation
:T(n) = a \; T\!\left(\frac{n}{b}\right) + f(n),
where f(n) is the time to create the subproblems and combine their results in the above procedure. This equation can be successively substituted into itself and expanded to obtain an expression for the total amount of work done. The master theorem allows many recurrence relations of this form to be converted to Θ-notation directly, without doing an expansion of the recursive relation.
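Unrolling the recurrence in this way (a sketch, assuming for simplicity that ''n'' is an exact power of ''b'') expresses T(n) as a sum over the levels of the recursion tree:
:T(n) = \sum_{i=0}^{\log_b n - 1} a^i f\!\left(\frac{n}{b^i}\right) + \Theta\left(n^{\log_b a}\right),
where the last term counts the constant work at each of the a^{\log_b n} = n^{\log_b a} leaves. The three cases of the theorem correspond to this sum being dominated by the leaves, spread evenly across the levels, or dominated by the root term f(n).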


Generic form

The master theorem always yields asymptotically tight bounds to recurrences from divide-and-conquer algorithms that partition an input into smaller subproblems of equal sizes, solve the subproblems recursively, and then combine the subproblem solutions to give a solution to the original problem. The time for such an algorithm is the work it performs at the top level of its recursion (to divide the problem into subproblems and then combine the subproblem solutions) plus the time spent in the recursive calls of the algorithm. If T(n) denotes the total time for the algorithm on an input of size n, and f(n) denotes the amount of time taken at the top level of the recurrence, then the time can be expressed by a recurrence relation that takes the form:
:T(n) = a \; T\!\left(\frac{n}{b}\right) + f(n)
Here n is the size of an input problem, a is the number of subproblems in the recursion, and b is the factor by which the subproblem size is reduced in each recursive call (b > 1). Crucially, a and b must not depend on n. The theorem below also assumes that, as a base case for the recurrence, T(n)=\Theta(1) when n is less than some bound \kappa > 0, the smallest input size that will lead to a recursive call.

Recurrences of this form often satisfy one of the three following regimes, based on how the work to split/recombine the problem f(n) relates to the ''critical exponent''
:c_\operatorname{crit} = \log_b a = \log(\text{#subproblems})/\log(\text{relative subproblem size}).
(The cases below use standard big O notation.)
* '''Case 1''': the work to split/recombine the problem is dwarfed by the subproblems, i.e. f(n) = O(n^c) for some c < c_\operatorname{crit}. Then the recursion tree is leaf-dominated and T(n) = \Theta\left(n^{c_\operatorname{crit}}\right).
* '''Case 2''': the work to split/recombine the problem is comparable to the subproblems, i.e. f(n) = \Theta\left(n^{c_\operatorname{crit}} \log^{k} n\right) for some k \ge 0. Then the work is spread evenly over the levels of the recursion tree and T(n) = \Theta\left(n^{c_\operatorname{crit}} \log^{k+1} n\right).
* '''Case 3''': the work to split/recombine the problem dominates the subproblems, i.e. f(n) = \Omega(n^c) for some c > c_\operatorname{crit}, and f(n) additionally satisfies the ''regularity condition'' a f\!\left(\frac{n}{b}\right) \le k f(n) for some constant k < 1 and all sufficiently large n. Then the recursion tree is root-dominated and T(n) = \Theta\left(f(n)\right).
A useful extension of Case 2 handles all real values of k: if f(n) = \Theta\left(n^{c_\operatorname{crit}} \log^{k} n\right), then T(n) = \Theta\left(n^{c_\operatorname{crit}} \log^{k+1} n\right) when k > -1 (case 2a), T(n) = \Theta\left(n^{c_\operatorname{crit}} \log\log n\right) when k = -1 (case 2b), and T(n) = \Theta\left(n^{c_\operatorname{crit}}\right) when k < -1 (case 2c). Chee Yap, A real elementary approach to the master recurrence and generalizations, Proceedings of the 8th annual conference on Theory and applications of models of computation (TAMC'11), pages 14–26, 2011.
Online copy.
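For driving functions of the form f(n) = \Theta(n^c \log^k n), the case analysis above is mechanical. The following sketch (a hypothetical helper, not part of the theorem's statement) classifies such a recurrence and prints the resulting bound; it assumes constants a \ge 1 and b > 1, and, for Case 3, that the regularity condition holds (as it does for any f(n) = \Theta(n^c \log^k n) with c > \log_b a):

 import math

 def master_theorem(a, b, c, k=0):
     """Return a Theta-bound for T(n) = a*T(n/b) + Theta(n^c * log^k n).

     Sketch only: assumes a >= 1 and b > 1; uses the extended Case 2 (2a/2b/2c).
     """
     crit = math.log(a) / math.log(b)          # critical exponent log_b(a)
     if abs(c - crit) < 1e-9:                  # Case 2: f matches n^crit up to log factors
         if k > -1:
             return f"Theta(n^{crit:g} log^{k + 1} n)"   # case 2a
         if k == -1:
             return f"Theta(n^{crit:g} log log n)"       # case 2b
         return f"Theta(n^{crit:g})"                     # case 2c
     if c < crit:
         return f"Theta(n^{crit:g})"                     # Case 1: leaf-dominated
     return f"Theta(n^{c:g} log^{k} n)" if k else f"Theta(n^{c:g})"  # Case 3: root-dominated

 # The three worked examples from the next section:
 print(master_theorem(8, 2, 2))   # Theta(n^3)                            (Case 1)
 print(master_theorem(2, 2, 1))   # Theta(n^1 log^1 n), i.e. Theta(n log n)  (Case 2)
 print(master_theorem(2, 2, 2))   # Theta(n^2)                            (Case 3)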


Examples


Case 1 example

:T(n) = 8 T\left(\frac{n}{2}\right) + 1000n^2
As one can see from the formula above:
:a = 8, \, b = 2, \, f(n) = 1000n^2, so
:f(n) = O\left(n^c\right), where c = 2.
Next, we see if we satisfy the case 1 condition:
:\log_b a = \log_2 8 = 3 > c.
It follows from the first case of the master theorem that
:T(n) = \Theta\left( n^{\log_b a} \right) = \Theta\left( n^{3} \right)
(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = 1001 n^3 - 1000 n^2, assuming T(1) = 1.)
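The quoted closed form can be checked numerically (a sketch; the recursion below only handles ''n'' a power of 2, matching the assumption T(1) = 1):

 def T(n):
     # T(n) = 8 T(n/2) + 1000 n^2 with T(1) = 1, for n a power of 2
     return 1 if n == 1 else 8 * T(n // 2) + 1000 * n**2

 for n in (2, 16, 1024):
     assert T(n) == 1001 * n**3 - 1000 * n**2   # exact solution, Theta(n^3) growth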


Case 2 example

:T(n) = 2 T\left(\frac{n}{2}\right) + 10n
As we can see in the formula above the variables get the following values:
:a = 2, \, b = 2, \, c = 1, \, f(n) = 10n
:f(n) = \Theta\left(n^{c} \log^{k} n\right), where c = 1, k = 0
Next, we see if we satisfy the case 2 condition:
:\log_b a = \log_2 2 = 1, and therefore, c and \log_b a are equal.
So it follows from the second case of the master theorem:
:T(n) = \Theta\left( n^{\log_b a} \log^{k+1} n\right) = \Theta\left( n^{1} \log^{1} n\right) = \Theta\left(n \log n\right)
Thus the given recurrence relation ''T''(''n'') was in Θ(''n'' log ''n'').
(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = n + 10 n\log_2 n, assuming T(1) = 1.)
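The bound can also be read directly off the recursion tree (a sketch, assuming ''n'' is a power of 2): level i contains 2^i subproblems of size n/2^i, so every level contributes the same amount of work,
:\sum_{i=0}^{\log_2 n - 1} 2^i \cdot 10\,\frac{n}{2^i} = 10\, n \log_2 n,
and adding the \Theta(n) work done at the leaves gives T(n) = \Theta(n \log n), in agreement with the exact solution above.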


Case 3 example

:T(n) = 2 T\left(\frac{n}{2}\right) + n^2
As we can see in the formula above the variables get the following values:
:a = 2, \, b = 2, \, f(n) = n^2
:f(n) = \Omega\left(n^c\right), where c = 2
Next, we see if we satisfy the case 3 condition:
:\log_b a = \log_2 2 = 1, and therefore, yes, c > \log_b a
The regularity condition also holds:
:2 \left(\frac{n^2}{4}\right) \le k n^2, choosing k = 1/2
So it follows from the third case of the master theorem:
:T\left(n \right) = \Theta\left(f(n)\right) = \Theta\left(n^2 \right).
Thus the given recurrence relation T(n) was in \Theta(n^2), matching the f(n) of the original formula.
(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = 2 n^2 - n, assuming T(1) = 1.)
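Here the work at the root dominates: summing the work over the levels of the recursion tree (a sketch, assuming ''n'' is a power of 2) gives a decreasing geometric series,
:\sum_{i=0}^{\log_2 n - 1} 2^i \left(\frac{n}{2^i}\right)^2 = n^2 \sum_{i=0}^{\log_2 n - 1} 2^{-i} < 2 n^2,
so, together with the \Theta(n) work at the leaves, the total is within a constant factor of f(n) = n^2, as Case 3 asserts.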


Inadmissible equations

The following equations cannot be solved using the master theorem (Massachusetts Institute of Technology (MIT), "Master Theorem: Practice Problems and Solutions", https://people.csail.mit.edu/thies/6.046-web/master.pdf):
*T(n) = 2^n T\left(\frac{n}{2}\right) + n^n
*:''a'' is not a constant; the number of subproblems should be fixed
*T(n) = 2T\left(\frac{n}{2}\right) + \frac{n}{\log n}
*:there is a non-polynomial difference between f(n) and n^{\log_b a} (see below; the extended version applies)
*T(n) = 0.5\,T\left(\frac{n}{2}\right) + n
*:a < 1: an algorithm cannot have fewer than one subproblem
*T(n) = 64T\left(\frac{n}{8}\right) - n^2\log n
*:f(n), which is the combination time, is not positive
*T(n) = T\left(\frac{n}{2}\right) + n(2-\cos n)
*:case 3, but the regularity condition is violated
In the second inadmissible example above, the difference between f(n) and n^{\log_b a} can be expressed with the ratio
:\frac{f(n)}{n^{\log_b a}} = \frac{n / \log n}{n^{\log_2 2}} = \frac{n / \log n}{n} = \frac{1}{\log n}.
Since \frac{1}{\log n} asymptotically exceeds n^{-\epsilon} for every constant \epsilon > 0, f(n) is smaller than n^{\log_b a} but not polynomially smaller, so the basic form of the master theorem does not apply. The extended form (case 2b, with k = -1) does apply, giving the solution T(n) = \Theta(n\log\log n).
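For that second example, the \Theta(n \log\log n) bound can be sketched from the recursion tree (taking logarithms base 2 and assuming ''n'' is a power of 2): level i contributes 2^i \cdot \frac{n/2^i}{\log_2(n/2^i)} = \frac{n}{\log_2 n - i}, so the total over the levels is
:\sum_{i=0}^{\log_2 n - 1} \frac{n}{\log_2 n - i} = n \sum_{j=1}^{\log_2 n} \frac{1}{j} = n\, H_{\log_2 n} = \Theta(n \log\log n),
using the fact that the harmonic number H_m grows as \Theta(\log m).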


Application to common algorithms

The theorem gives the asymptotic running time of several classic divide-and-conquer algorithms directly from their recurrences, for example:
* Binary search: T(n) = T\left(\frac{n}{2}\right) + O(1), so \log_b a = 0 and case 2 gives T(n) = \Theta(\log n).
* Binary tree traversal: T(n) = 2 T\left(\frac{n}{2}\right) + O(1), so \log_b a = 1 and case 1 gives T(n) = \Theta(n).
* Merge sort: T(n) = 2 T\left(\frac{n}{2}\right) + O(n), so \log_b a = 1 and case 2 gives T(n) = \Theta(n \log n).


See also

* Akra–Bazzi method
* Asymptotic complexity




References

* Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein. ''Introduction to Algorithms'', Second Edition. MIT Press and McGraw–Hill, 2001. Sections 4.3 (The master method) and 4.4 (Proof of the master theorem), pp. 73–90.
* Michael T. Goodrich and Roberto Tamassia. ''Algorithm Design: Foundation, Analysis, and Internet Examples''. Wiley, 2002. The master theorem (including the version of Case 2 included here, which is stronger than the one from CLRS) is on pp. 268–270.


External links


Teorema Mestre e Exemplos Resolvidos (in Portuguese: "Master Theorem and Solved Examples")