Powell's method

From Wikipedia, the free encyclopedia
Algorithm for finding a local minimum of a function

Powell's method, strictly Powell's conjugate direction method, is an algorithm proposed by Michael J. D. Powell for finding a local minimum of a function. The function need not be differentiable, and no derivatives are taken.

The function must be a real-valued function of a fixed number of real-valued inputs. The caller passes in the initial point, together with a set of initial search vectors. Typically $N$ search vectors (say $\{s_1, \dots, s_N\}$) are passed in, which are simply the normals aligned to each axis.[1]

The method minimises the function by a bi-directional search along each search vector, in turn. The bi-directional line search along each search vector can be done by golden-section search or Brent's method. Let the minima found during each bi-directional line search be $\{x_0 + \alpha_1 s_1,\; x_0 + \sum_{i=1}^{2} \alpha_i s_i,\; \dots,\; x_0 + \sum_{i=1}^{N} \alpha_i s_i\}$, where $x_0$ is the initial starting point and $\alpha_i$ is the scalar determined during the bi-directional search along $s_i$. The new position $x_1$ can then be expressed as a linear combination of the search vectors, i.e. $x_1 = x_0 + \sum_{i=1}^{N} \alpha_i s_i$. The new displacement vector $\sum_{i=1}^{N} \alpha_i s_i$ becomes a new search vector and is added to the end of the search vector list. Meanwhile, the search vector which contributed most to the new direction, i.e. the one which was most successful ($i_d = \arg\max_{i=1}^{N} |\alpha_i|\,\|s_i\|$), is deleted from the search vector list. The new set of $N$ search vectors is $\{s_1, \dots, s_{i_d - 1}, s_{i_d + 1}, \dots, s_N, \sum_{i=1}^{N} \alpha_i s_i\}$. The algorithm iterates an arbitrary number of times until no significant improvement is made.[1]
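As an illustration, the iteration above can be sketched in Python. This is a minimal sketch, not a production implementation: the helper names (`golden_section`, `powell`), the fixed search bracket, and the tolerances are all assumptions made here for the example, not part of the method's specification.

```python
import numpy as np

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search for a minimum of a 1-D function f on [a, b]."""
    invphi = (np.sqrt(5) - 1) / 2  # 1/phi
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

def powell(f, x0, tol=1e-8, max_iter=100, bracket=2.0):
    """Sketch of Powell's conjugate direction method (derivative-free)."""
    x = np.asarray(x0, dtype=float)
    n = len(x)
    dirs = [np.eye(n)[i] for i in range(n)]  # start with axis-aligned directions
    for _ in range(max_iter):
        x_start = x.copy()
        alphas = []
        for s in dirs:
            # bi-directional line search along s (bracket width is an assumption)
            line = lambda a, s=s, x=x.copy(): f(x + a * s)
            a_min = golden_section(line, -bracket, bracket)
            x = x + a_min * s
            alphas.append(a_min)
        new_dir = x - x_start  # total displacement over this sweep
        if np.linalg.norm(new_dir) < tol:
            break  # no significant improvement
        # delete the direction that contributed most: argmax |alpha_i| * ||s_i||
        i_d = max(range(n), key=lambda i: abs(alphas[i]) * np.linalg.norm(dirs[i]))
        dirs.pop(i_d)
        dirs.append(new_dir / np.linalg.norm(new_dir))  # append new search vector
    return x

# Minimize a simple quadratic whose minimum is at (1, -2); no derivatives used.
xmin = powell(lambda x: (x[0] - 1) ** 2 + 3 * (x[1] + 2) ** 2, [0.0, 0.0])
```

Note that the fixed symmetric bracket `[-bracket, bracket]` is a simplification; a practical implementation would bracket each 1-D minimum adaptively (e.g. with Brent's method) before refining it.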

The method is useful for calculating the local minimum of a continuous but complex function, especially one without an underlying mathematical definition, because it is not necessary to take derivatives. The basic algorithm is simple; the complexity is in the linear searches along the search vectors, which can be achieved via Brent's method.
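In practice one rarely writes the loop by hand: SciPy provides a refined variant of this algorithm via `method='Powell'` in `scipy.optimize.minimize`. A minimal usage sketch (the test function and starting point below are arbitrary choices for illustration):

```python
from scipy.optimize import minimize

# A classic derivative-free test case: the Rosenbrock function.
# No gradient is supplied or needed; Powell's method uses only function values.
f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

res = minimize(f, x0=[-1.0, 1.0], method="Powell")
```

The result object exposes the minimizer as `res.x` and the final function value as `res.fun`.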

References

  1. Mathews, John H. "Module for Powell Search Method for a Minimum". California State University, Fullerton. Retrieved 16 June 2017.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Powell%27s_method&oldid=1262812869"