
Master theorem (analysis of algorithms)

From Wikipedia, the free encyclopedia
Tool for analyzing divide-and-conquer algorithms

In the analysis of algorithms, the master theorem for divide-and-conquer recurrences provides an asymptotic analysis for many recurrence relations that occur in the analysis of divide-and-conquer algorithms. The approach was first presented by Jon Bentley, Dorothea Blostein (née Haken), and James B. Saxe in 1980, where it was described as a "unifying method" for solving such recurrences.[1] The name "master theorem" was popularized by the widely used algorithms textbook Introduction to Algorithms by Cormen, Leiserson, Rivest, and Stein.

Not all recurrence relations can be solved by this theorem; its generalizations include the Akra–Bazzi method.

Introduction


Consider a problem that can be solved using a recursive algorithm such as the following:

procedure p(input x of size n):
    if n < some constant k:
        Solve x directly without recursion
    else:
        Create a subproblems of x, each having size n/b
        Call procedure p recursively on each subproblem
        Combine the results from the subproblems
(Figure: solution tree.)

The above algorithm divides the problem into a number (a) of subproblems recursively, each subproblem being of size n/b. The factor by which the size of subproblems is reduced (b) need not, in general, be the same as the number of subproblems (a). Its solution tree has a node for each recursive call, with the children of that node being the other calls made from that call. The leaves of the tree are the base cases of the recursion, the subproblems (of size less than k) that do not recurse. The above example would have a child nodes at each non-leaf node. Each node does an amount of work that corresponds to the size of the subproblem n passed to that instance of the recursive call, given by f(n). The total amount of work done by the entire algorithm is the sum of the work performed by all the nodes in the tree.

The runtime of an algorithm such as p above on an input of size n, usually denoted T(n), can be expressed by the recurrence relation

T(n) = a T(n/b) + f(n),

where f(n) is the time to create the subproblems and combine their results in the above procedure. This equation can be successively substituted into itself and expanded to obtain an expression for the total amount of work done.[2] The master theorem allows many recurrence relations of this form to be converted to Θ-notation directly, without doing an expansion of the recursive relation.
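This expansion can also be checked numerically. The following Python sketch (the helper name T is our own; it assumes n is a power of b so subproblem sizes stay integral) evaluates the recurrence directly and compares it against the closed form for a merge-sort-like instance:

```python
import math

def T(n, a, b, f):
    """Total work for T(n) = a*T(n/b) + f(n), taking T(n) = 1 below the base size b."""
    if n < b:
        return 1
    return a * T(n // b, a, b, f) + f(n)

# Merge-sort-like instance: a = 2, b = 2, f(n) = n.
# Expanding the recurrence gives T(n) = n*log2(n) + n for powers of 2,
# consistent with the Theta(n log n) bound the master theorem predicts.
for n in [2**5, 2**10]:
    assert T(n, 2, 2, lambda m: m) == n * int(math.log2(n)) + n
```

Summing f(n) over every node of the solution tree is exactly what this direct recursion does.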

Generic form


The master theorem always yields asymptotically tight bounds to recurrences from divide-and-conquer algorithms that partition an input into smaller subproblems of equal sizes, solve the subproblems recursively, and then combine the subproblem solutions to give a solution to the original problem. The time for such an algorithm can be expressed by adding the work that it performs at the top level of its recursion (to divide the problem into subproblems and then combine the subproblem solutions) to the time taken in the recursive calls of the algorithm. If T(n) denotes the total time for the algorithm on an input of size n, and f(n) denotes the amount of time taken at the top level of the recurrence, then the time can be expressed by a recurrence relation that takes the form:

T(n) = a T(n/b) + f(n)

Here n is the size of an input problem, a is the number of subproblems in the recursion, and b is the factor by which the subproblem size is reduced in each recursive call (b > 1). Crucially, a and b must not depend on n. The theorem below also assumes that, as a base case for the recurrence, T(n) = Θ(1) when n is less than some bound κ > 0, the smallest input size that will lead to a recursive call.

Recurrences of this form often satisfy one of the three following regimes, based on how the work to split/recombine the problem, f(n), relates to the critical exponent c_crit = log_b a. (The cases below use standard big O notation.) Throughout, (log n)^k is written out for clarity, though in textbooks it is usually rendered log^k n.

c_crit = log_b a = log(#subproblems) / log(relative subproblem size)
Case 1: The work to split/recombine a problem is dominated by the subproblems; i.e., the recursion tree is leaf-heavy.
    Condition: f(n) = O(n^c) where c < c_crit (f is upper-bounded by a lesser-exponent polynomial).
    Bound: T(n) = Θ(n^c_crit). (The splitting term does not appear; the recursive tree structure dominates.)
    Example: if b = a^2 and f(n) = O(n^(1/2 - ε)), then T(n) = Θ(n^(1/2)).

Case 2: The work to split/recombine a problem is comparable to the subproblems.
    Condition: f(n) = Θ(n^c_crit (log n)^k) for a k ≥ 0 (f is rangebound by the critical-exponent polynomial, times zero or more optional logs).
    Bound: T(n) = Θ(n^c_crit (log n)^(k+1)). (The bound is the splitting term, where the log is augmented by a single power.)
    Examples: if b = a^2 and f(n) = Θ(n^(1/2)), then T(n) = Θ(n^(1/2) log n); if b = a^2 and f(n) = Θ(n^(1/2) log n), then T(n) = Θ(n^(1/2) (log n)^2).

Case 3: The work to split/recombine a problem dominates the subproblems; i.e., the recursion tree is root-heavy.
    Condition: f(n) = Ω(n^c) where c > c_crit (f is lower-bounded by a greater-exponent polynomial). This alone does not necessarily yield anything. If, furthermore, a f(n/b) ≤ k f(n) for some constant k < 1 and all sufficiently large n (often called the regularity condition),
    Bound: then the total is dominated by the splitting term f(n): T(n) = Θ(f(n)).
    Example: if b = a^2, f(n) = Ω(n^(1/2 + ε)), and the regularity condition holds, then T(n) = Θ(f(n)).

A useful extension of case 2 handles all values of k:[3]

Case 2a:
    Condition: f(n) = Θ(n^c_crit (log n)^k) for any k > -1.
    Bound: T(n) = Θ(n^c_crit (log n)^(k+1)). (The bound is the splitting term, where the log is augmented by a single power.)
    Example: if b = a^2 and f(n) = Θ(n^(1/2)/(log n)^(1/2)), then T(n) = Θ(n^(1/2) (log n)^(1/2)).

Case 2b:
    Condition: f(n) = Θ(n^c_crit (log n)^k) for k = -1.
    Bound: T(n) = Θ(n^c_crit log log n). (The bound is the splitting term, where the log reciprocal is replaced by an iterated log.)
    Example: if b = a^2 and f(n) = Θ(n^(1/2)/log n), then T(n) = Θ(n^(1/2) log log n).

Case 2c:
    Condition: f(n) = Θ(n^c_crit (log n)^k) for any k < -1.
    Bound: T(n) = Θ(n^c_crit). (The bound is the splitting term, where the log disappears.)
    Example: if b = a^2 and f(n) = Θ(n^(1/2)/(log n)^2), then T(n) = Θ(n^(1/2)).
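The case analysis above can be summarized as a small classifier. The following Python sketch (the name master_bound is our own) assumes f(n) = Θ(n^c (log n)^k) exactly and, for case 3, that the regularity condition holds:

```python
import math

def master_bound(a: int, b: int, c: float, k: float = 0.0) -> str:
    """Theta-bound for T(n) = a*T(n/b) + f(n), f(n) = Theta(n^c * (log n)^k),
    per the master theorem with the extended case 2 (2a/2b/2c)."""
    assert a >= 1 and b > 1, "need at least one subproblem and b > 1"
    c_crit = math.log(a, b)  # critical exponent log_b(a)
    if c < c_crit:
        # Case 1: leaf-heavy
        return f"Theta(n^{c_crit:g})"
    if c > c_crit:
        # Case 3: root-heavy, assuming the regularity condition holds
        return f"Theta(n^{c:g} (log n)^{k:g})" if k else f"Theta(n^{c:g})"
    # c == c_crit: extended case 2
    if k > -1:
        return f"Theta(n^{c_crit:g} (log n)^{k + 1:g})"  # case 2a
    if k == -1:
        return f"Theta(n^{c_crit:g} log log n)"          # case 2b
    return f"Theta(n^{c_crit:g})"                        # case 2c
```

For instance, master_bound(8, 2, 2) classifies T(n) = 8T(n/2) + 1000n^2 as case 1, and master_bound(2, 2, 1) classifies merge sort as case 2.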

Examples


Case 1 example

T(n) = 8T(n/2) + 1000n^2

As one can see from the formula above:

a = 8, b = 2, f(n) = 1000n^2, so
f(n) = O(n^c), where c = 2.

Next, we see if we satisfy the case 1 condition:

log_b a = log_2 8 = 3 > c.

It follows from the first case of the master theorem that

T(n) = Θ(n^(log_b a)) = Θ(n^3)

(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = 1001n^3 - 1000n^2, assuming T(1) = 1.)
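The claimed exact solution can be verified mechanically for powers of 2; a minimal Python check, assuming integer division in the recurrence:

```python
def T(n):
    """Direct evaluation of T(n) = 8*T(n/2) + 1000*n^2 with T(1) = 1."""
    return 1 if n == 1 else 8 * T(n // 2) + 1000 * n * n

# Matches the closed form 1001*n^3 - 1000*n^2 at every power of 2 tested.
for n in [1, 2, 4, 8, 1024]:
    assert T(n) == 1001 * n**3 - 1000 * n**2
```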

Case 2 example


T(n) = 2T(n/2) + 10n

From the formula above, the variables take the following values:

a = 2, b = 2, c = 1, f(n) = 10n
f(n) = Θ(n^c (log n)^k) where c = 1, k = 0.

Next, we see if we satisfy the case 2 condition:

log_b a = log_2 2 = 1, and therefore c and log_b a are equal.

So it follows from the second case of the master theorem:

T(n) = Θ(n^(log_b a) (log n)^(k+1)) = Θ(n^1 (log n)^1) = Θ(n log n)

Thus the given recurrence relation T(n) is in Θ(n log n).

(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = n + 10n log_2 n, assuming T(1) = 1.)
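As in the previous example, the exact solution can be checked mechanically for powers of 2 with a short Python sketch:

```python
import math

def T(n):
    """Direct evaluation of T(n) = 2*T(n/2) + 10*n with T(1) = 1."""
    return 1 if n == 1 else 2 * T(n // 2) + 10 * n

# Matches the closed form n + 10*n*log2(n) at every power of 2 tested.
for n in [1, 2, 4, 1024]:
    assert T(n) == n + 10 * n * int(math.log2(n))
```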

Case 3 example

T(n) = 2T(n/2) + n^2

From the formula above, the variables take the following values:

a = 2, b = 2, f(n) = n^2
f(n) = Ω(n^c), where c = 2.

Next, we see if we satisfy the case 3 condition:

log_b a = log_2 2 = 1, and therefore, yes, c > log_b a.

The regularity condition also holds:

2(n^2/4) = n^2/2 ≤ kn^2, choosing k = 1/2.

So it follows from the third case of the master theorem:

T(n) = Θ(f(n)) = Θ(n^2).

Thus the given recurrence relation T(n) is in Θ(n^2), matching the f(n) of the original formula.

(This result is confirmed by the exact solution of the recurrence relation, which is T(n) = 2n^2 - n, assuming T(1) = 1.)
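Again, the exact solution can be checked mechanically for powers of 2:

```python
def T(n):
    """Direct evaluation of T(n) = 2*T(n/2) + n^2 with T(1) = 1."""
    return 1 if n == 1 else 2 * T(n // 2) + n * n

# Matches the closed form 2*n^2 - n at every power of 2 tested.
for n in [1, 2, 4, 1024]:
    assert T(n) == 2 * n * n - n
```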

Inadmissible equations


The following equations cannot be solved using the master theorem:[4]

- T(n) = 2^n T(n/2) + n^n: a is not a constant; the number of subproblems must be fixed.
- T(n) = 2T(n/2) + n/log n: there is a non-polynomial difference between f(n) and n^(log_b a) (see below).
- T(n) = 0.5 T(n/2) + n: a < 1; there cannot be fewer than one subproblem.
- T(n) = 64 T(n/8) - n^2 log n: f(n), the combination time, is not positive.
- T(n) = T(n/2) + n(2 - cos n): this is a case 3 recurrence, but it violates the regularity condition.

In the second inadmissible example above, the difference between f(n) and n^(log_b a) can be expressed with the ratio f(n)/n^(log_b a) = (n/log n)/n^(log_2 2) = n/(n log n) = 1/log n. It is clear that 1/log n < n^ε for any constant ε > 0. Therefore, the difference is not polynomial and the basic form of the master theorem does not apply. The extended form (case 2b) does apply, giving the solution T(n) = Θ(n log log n).

Application to common algorithms

Binary search: T(n) = T(n/2) + O(1); run time O(log n). Apply the case c = log_b a, where a = 1, b = 2, c = 0, k = 0.[5]
Binary tree traversal: T(n) = 2T(n/2) + O(1); run time O(n). Apply the case c < log_b a, where a = 2, b = 2, c = 0.[5]
Optimal sorted matrix search: T(n) = 2T(n/2) + O(log n); run time O(n). Apply the Akra–Bazzi theorem for p = 1 and g(u) = log u to get Θ(2n - log n).
Merge sort: T(n) = 2T(n/2) + O(n); run time O(n log n). Apply the case c = log_b a, where a = 2, b = 2, c = 1, k = 0.
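As a concrete illustration of the binary search entry, the probe count of an iterative binary search stays within the O(log n) bound predicted by T(n) = T(n/2) + O(1). A sketch (the helper bisect_steps is our own, not a standard library routine):

```python
def bisect_steps(sorted_xs, target):
    """Count the probes made by an iterative binary search for target."""
    lo, hi, steps = 0, len(sorted_xs), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2  # each probe halves the remaining range
        if sorted_xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

# T(n) = T(n/2) + O(1) predicts at most about log2(n) + 1 probes.
assert bisect_steps(list(range(1024)), 1023) <= 11
```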


Notes

  1. ^ Bentley, Jon Louis; Haken, Dorothea; Saxe, James B. (September 1980), "A general method for solving divide-and-conquer recurrences", ACM SIGACT News, 12 (3): 36–44, doi:10.1145/1008861.1008865, S2CID 40642274, archived from the original on September 22, 2017
  2. ^ Duke University, "Big-Oh for Recursive Functions: Recurrence Relations", http://www.cs.duke.edu/~ola/ap/recurrence.html
  3. ^ Chee Yap, "A real elementary approach to the master recurrence and generalizations", Proceedings of the 8th Annual Conference on Theory and Applications of Models of Computation (TAMC'11), pages 14–26, 2011. Online copy.
  4. ^ Massachusetts Institute of Technology (MIT), "Master Theorem: Practice Problems and Solutions", https://people.csail.mit.edu/thies/6.046-web/master.pdf
  5. ^ a b Dartmouth College, http://www.math.dartmouth.edu/archive/m19w03/public_html/Section5-2.pdf

Retrieved from "https://en.wikipedia.org/w/index.php?title=Master_theorem_(analysis_of_algorithms)&oldid=1277959662"