The truncated Newton method, which originated in a paper by Ron Dembo and Trond Steihaug,[1] and is also known as Hessian-free optimization,[2] refers to a family of optimization algorithms designed for optimizing non-linear functions with large numbers of independent variables. A truncated Newton method consists of repeated application of an iterative optimization algorithm to approximately solve Newton's equations, in order to determine an update to the function's parameters. The inner solver is truncated, i.e., run for only a limited number of iterations. It follows that, for truncated Newton methods to work, the inner solver needs to produce a good approximation in a finite number of iterations;[3] conjugate gradient has been suggested and evaluated as a candidate inner loop.[2] Another prerequisite is good preconditioning for the inner algorithm.[4]
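The scheme above can be illustrated with a minimal sketch: an outer Newton loop whose step is computed by a truncated conjugate gradient inner solve of the Newton equations H p = −g. The sketch is "Hessian-free" in that it never forms the Hessian H, only Hessian–vector products, which are approximated here by a finite difference of gradients. All function and parameter names below (`truncated_newton`, `max_cg`, the finite-difference step `eps`, the quadratic test problem) are illustrative choices, not part of any reference implementation.

```python
import numpy as np

def truncated_newton(f, grad, x0, max_outer=50, max_cg=10, tol=1e-6):
    """Sketch of a truncated Newton (Hessian-free) optimizer.

    Approximately solves H p = -g with a few conjugate gradient
    iterations, using finite-difference Hessian-vector products.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_outer):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break

        # Hessian-vector product without forming H:
        # H v ~= (grad(x + eps*v) - grad(x)) / eps
        def hvp(v, eps=1e-6):
            return (grad(x + eps * v) - g) / eps

        # Truncated CG on H p = -g: stop after max_cg iterations
        # or on negative curvature.
        p = np.zeros_like(x)
        r = -g.copy()            # residual of H p = -g at p = 0
        d = r.copy()
        rs = r @ r
        for _ in range(max_cg):
            Hd = hvp(d)
            dHd = d @ Hd
            if dHd <= 0:         # negative curvature: truncate
                break
            alpha = rs / dHd
            p += alpha * d
            r -= alpha * Hd
            rs_new = r @ r
            if np.sqrt(rs_new) < 1e-10:
                break
            d = r + (rs_new / rs) * d
            rs = rs_new
        if np.allclose(p, 0.0):
            p = -g               # fall back to steepest descent

        # Backtracking (Armijo) line search along the truncated step.
        t, fx = 1.0, f(x)
        while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        x = x + t * p
    return x
```

On a strictly convex quadratic, for example, a single outer iteration with an exact inner solve recovers the minimizer; limiting `max_cg` trades inner-solve accuracy for cheaper iterations, which is the method's central idea.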