Asymptotically optimal algorithm

From Wikipedia, the free encyclopedia
Measure of algorithm performance for large inputs

In computer science, an algorithm is said to be asymptotically optimal if, roughly speaking, for large inputs it performs at worst a constant factor (independent of the input size) worse than any possible algorithm. The term is commonly encountered in computer science research as a result of the widespread use of big O notation.

More formally, an algorithm is asymptotically optimal with respect to a particular resource if the problem has been proven to require Ω(f(n)) of that resource, and the algorithm has been proven to use only O(f(n)).

These proofs require an assumption of a particular model of computation, i.e., certain restrictions on the operations allowed on the input data.

As a simple example, it is known that all comparison sorts require at least Ω(n log n) comparisons in the average and worst cases. Mergesort and heapsort are comparison sorts that perform O(n log n) comparisons, so they are asymptotically optimal in this sense.
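The Ω(n log n) bound can be observed empirically. Below is a minimal sketch (not from the article) of a merge sort instrumented to count comparisons; for n = 1000 the count stays near n log₂ n ≈ 10,000:

```python
import random

def merge_sort(a):
    """Sort a list, returning (sorted_list, number_of_comparisons)."""
    if len(a) <= 1:
        return a, 0
    mid = len(a) // 2
    left, cl = merge_sort(a[:mid])
    right, cr = merge_sort(a[mid:])
    merged, comparisons = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comparisons += 1                 # one element-to-element comparison
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comparisons

data = random.sample(range(10000), 1000)
result, comps = merge_sort(data)
# comps is at most about n * log2(n), i.e. roughly 10,000 for n = 1000
```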

If the input data have some a priori properties that can be exploited in the construction of algorithms, in addition to comparisons, then asymptotically faster algorithms may be possible. For example, if it is known that the N objects are (not necessarily distinct) integers from the range [1, N], then they may be sorted in O(N) time, e.g., by bucket sort.
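As an illustration, a counting variant of bucket sort (a sketch; the function name and interface are illustrative, not from the article) sorts N such integers in O(N) time without comparing elements to one another:

```python
def counting_sort(values, n):
    """Sort integers drawn from [1, n] in O(n + len(values)) time.

    Exploits the a priori knowledge of the value range: instead of
    comparing elements, it tallies occurrences of each possible value.
    """
    counts = [0] * (n + 1)          # counts[k] = occurrences of value k
    for v in values:
        counts[v] += 1
    out = []
    for k in range(1, n + 1):       # emit each value as many times as seen
        out.extend([k] * counts[k])
    return out

print(counting_sort([3, 1, 4, 1, 5, 2, 2, 5], 8))  # [1, 1, 2, 2, 3, 4, 5, 5]
```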

A consequence of an algorithm being asymptotically optimal is that, for large enough inputs, no algorithm can outperform it by more than a constant factor. For this reason, asymptotically optimal algorithms are often seen as the "end of the line" in research: the attainment of a result that cannot be dramatically improved upon. Conversely, if an algorithm is not asymptotically optimal, then as the input grows in size, it performs increasingly worse than the best possible algorithm.

In practice it is useful to find algorithms that perform better, even if they do not enjoy any asymptotic advantage. New algorithms may also present advantages such as better performance on specific inputs, decreased use of other resources, or being simpler to describe and implement. Thus asymptotically optimal algorithms are not always the "end of the line".

Although asymptotically optimal algorithms are important theoretical results, an asymptotically optimal algorithm might not be used in a number of practical situations:

  • It only outperforms more commonly used methods for n beyond the range of practical input sizes, such as inputs with more bits than could fit in any computer storage system.
  • It is too complex, so that the difficulty of comprehending and implementing it correctly outweighs its potential benefit in the range of input sizes under consideration.
  • The inputs encountered in practice fall into special cases that have more efficient algorithms, or that heuristic algorithms with bad worst-case times can nevertheless solve efficiently.
  • On modern computers, hardware optimizations such as memory caching and parallel processing may be "broken" by an asymptotically optimal algorithm (assuming the analysis did not take these hardware optimizations into account). In this case, there could be sub-optimal algorithms that make better use of these features and outperform an optimal algorithm on realistic data.

An example of an asymptotically optimal algorithm not used in practice is Bernard Chazelle's linear-time algorithm for triangulation of a simple polygon. Another is the resizable array data structure published in "Resizable Arrays in Optimal Time and Space",[1] which can index in constant time but on many machines carries a heavy practical penalty compared to ordinary array indexing.

Formal definitions


Formally, suppose that we have a lower-bound theorem showing that a problem requires Ω(f(n)) time to solve for an instance (input) of size n (see Big O notation § Big Omega notation for the definition of Ω). Then an algorithm that solves the problem in O(f(n)) time is said to be asymptotically optimal.
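In symbols, this definition can be restated compactly (a paraphrase of the statement above, writing T_A(n) for the running time of algorithm A on inputs of size n):

```latex
% A is asymptotically optimal if both of the following hold:
%  (1) lower bound: every algorithm B for the problem satisfies
%      T_B(n) = \Omega(f(n)), and
%  (2) upper bound: T_A(n) = O(f(n)).
% Together they pin down the running time up to a constant factor:
T_A(n) = O(f(n)) \quad\text{and}\quad
T_B(n) = \Omega(f(n)) \ \text{for every algorithm}\ B
\;\Longrightarrow\; T_A(n) = \Theta(f(n)).
```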

Although usually applied to time efficiency, an algorithm can be said to use asymptotically optimal space, random bits, number of processors, or any other resource commonly measured using big O notation.

Sometimes vague or implicit assumptions can make it unclear whether an algorithm is asymptotically optimal. For example, a lower-bound theorem might assume a particular abstract machine model, as in the case of comparison sorts, or a particular organization of memory. By violating these assumptions, a new algorithm could potentially asymptotically outperform the lower bound and the "asymptotically optimal" algorithms.

Speedup


The nonexistence of an asymptotically optimal algorithm is called speedup. Blum's speedup theorem shows that there exist artificially constructed problems with speedup. However, it is an open problem whether many of the most well-known algorithms today are asymptotically optimal or not. For example, there is an O(n α(n)) algorithm for finding minimum spanning trees, where α(n) is the very slowly growing inverse of the Ackermann function, but the best known lower bound is the trivial Ω(n). Whether this algorithm is asymptotically optimal is unknown, and resolving the question either way would likely be hailed as a significant result. Coppersmith and Winograd (1982) proved that matrix multiplication has a weak form of speed-up among a restricted class of algorithms (Strassen-type bilinear identities with lambda-computation).
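The inverse-Ackermann factor α(n) typically enters minimum-spanning-tree algorithms through the disjoint-set (union-find) structure used to track connected components. A sketch of that structure (illustrative; not the specific MST algorithm the article alludes to), where a sequence of m operations on n elements takes O(m α(n)) amortized time:

```python
class DisjointSet:
    """Union-find with union by rank and path halving.

    Any sequence of m find/union operations on n elements runs in
    O(m * alpha(n)) amortized time, alpha being the inverse Ackermann
    function -- effectively constant for all practical n.
    """

    def __init__(self, n):
        self.parent = list(range(n))   # each element starts as its own root
        self.rank = [0] * n            # upper bound on tree height

    def find(self, x):
        while self.parent[x] != x:
            # path halving: point x at its grandparent as we climb
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False               # already in the same component
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx            # attach the shorter tree to the taller
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

ds = DisjointSet(6)
ds.union(0, 1); ds.union(1, 2); ds.union(3, 4)
print(ds.find(0) == ds.find(2))  # True: 0, 1, 2 are connected
print(ds.find(0) == ds.find(3))  # False: {0,1,2} and {3,4} are separate
```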


References

  1. Brodnik, Andrej; Carlsson, Svante; Sedgewick, Robert; Munro, J. Ian; Demaine, Erik D. (1999), Resizable Arrays in Optimal Time and Space (PDF), Department of Computer Science, University of Waterloo.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Asymptotically_optimal_algorithm&oldid=1324271129"