brent
- scipy.optimize.brent(func, args=(), brack=None, tol=1.48e-08, full_output=0, maxiter=500)
Given a function of one variable and a possible bracket, return a local minimizer of the function isolated to a fractional precision of tol.
- Parameters:
- func : callable f(x, *args)
Objective function.
- args : tuple, optional
Additional arguments (if present).
- brack : tuple, optional
Either a triple (xa, xb, xc) satisfying xa < xb < xc and func(xb) < func(xa) and func(xb) < func(xc), or a pair (xa, xb) to be used as initial points for a downhill bracket search (see scipy.optimize.bracket). The minimizer x will not necessarily satisfy xa <= x <= xb.
- tol : float, optional
Relative error in solution xopt acceptable for convergence.
- full_output : bool, optional
If True, return all output args (xmin, fval, iter, funcalls).
- maxiter : int, optional
Maximum number of iterations in solution.
- Returns:
- xmin : ndarray
Optimum point.
- fval : float
(Optional output) Optimum function value.
- iter : int
(Optional output) Number of iterations.
- funcalls : int
(Optional output) Number of objective function evaluations made.
See also
minimize_scalar : Interface to minimization algorithms for scalar univariate functions. See the ‘Brent’ method in particular.
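As a quick illustration of this see-also entry, the same Brent minimization can be run through minimize_scalar with method='brent', which returns an OptimizeResult object rather than a bare float. The objective f below is an illustrative choice, not taken from this docstring:

```python
from scipy import optimize

def f(x):
    # Simple quadratic with its minimum at x = 1 (illustrative choice).
    return (x - 1) ** 2

# method='brent' uses the same underlying algorithm as optimize.brent,
# but the answer comes back on a result object (res.x, res.fun).
res = optimize.minimize_scalar(f, bracket=(1, 2), method='brent')
print(res.x, res.fun)
```
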
Notes
Uses inverse parabolic interpolation when possible to speed up convergence of the golden section method.
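The golden-section step that Brent's method falls back on (and accelerates with inverse parabolic interpolation) can be sketched as follows. This is a minimal illustration of plain golden-section search, not SciPy's actual implementation:

```python
import math

def golden_section(f, a, b, tol=1e-8):
    # Shrink the bracket [a, b] by the inverse golden ratio each step,
    # reusing one interior point per iteration.
    invphi = (math.sqrt(5) - 1) / 2  # 1/phi ~ 0.618
    c = b - invphi * (b - a)
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d
        else:
            a = c
        c = b - invphi * (b - a)
        d = a + invphi * (b - a)
    return (a + b) / 2

xmin = golden_section(lambda x: (x - 1) ** 2, 0, 3)
```

Golden-section alone converges only linearly; the parabolic-interpolation steps are what make Brent's method fast near a smooth minimum.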
Does not ensure that the minimum lies in the range specified by brack. See scipy.optimize.fminbound.
Examples
We illustrate the behaviour of the function when brack is of size 2 and 3, respectively. In the case where brack is of the form (xa, xb), we can see for the given values, the output does not necessarily lie in the range (xa, xb).

>>> def f(x):
...     return (x - 1) ** 2
>>> from scipy import optimize

>>> minimizer = optimize.brent(f, brack=(1, 2))
>>> minimizer
1
>>> res = optimize.brent(f, brack=(-1, 0.5, 2), full_output=True)
>>> xmin, fval, iter, funcalls = res
>>> f(xmin), fval
(0.0, 0.0)