scipy.optimize.shgo

scipy.optimize.shgo(func, bounds, args=(), constraints=None, n=100, iters=1, callback=None, minimizer_kwargs=None, options=None, sampling_method='simplicial', *, workers=1)

Finds the global minimum of a function using SHG optimization.

SHGO stands for “simplicial homology global optimization”.

Parameters:
func : callable

The objective function to be minimized. Must be in the form f(x, *args), where x is the argument in the form of a 1-D array and args is a tuple of any additional fixed parameters needed to completely specify the function.

bounds : sequence or Bounds

Bounds for variables. There are two ways to specify the bounds:

  1. Instance of Bounds class.

  2. Sequence of (min, max) pairs for each element in x.

args : tuple, optional

Any additional fixed parameters needed to completely specify the objective function.
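For example, a fixed parameter can be supplied through args rather than hard-coded in the objective. A minimal sketch (the shifted quadratic below is purely illustrative):

>>> import numpy as np
>>> from scipy.optimize import shgo
>>> def shifted_quadratic(x, shift):
...     # ``shift`` is a fixed parameter supplied via ``args``
...     return np.sum((x - shift)**2)
...
>>> bounds = [(-5, 5), (-5, 5)]
>>> res = shgo(shifted_quadratic, bounds, args=(1.5,))
>>> # res.x is expected to be close to [1.5, 1.5]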

constraints : {Constraint, dict} or List of {Constraint, dict}, optional

Constraints definition. Only for COBYLA, COBYQA, SLSQP and trust-constr. See the tutorial [5] for further details on specifying constraints.

Note

Only COBYLA, COBYQA, SLSQP, and trust-constr local minimize methods currently support constraint arguments. If the constraints sequence used in the local optimization problem is not defined in minimizer_kwargs and a constrained method is used, then the global constraints will be used. (Defining a constraints sequence in minimizer_kwargs means that constraints will not be added, so if equality constraints and so forth need to be added then the inequality functions in constraints need to be added to minimizer_kwargs too.) COBYLA only supports inequality constraints.

Changed in version 1.11.0: constraints accepts NonlinearConstraint, LinearConstraint.
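As a hedged sketch of the NonlinearConstraint form (requires SciPy 1.11+ per the note above; the objective and constraint below are illustrative only), an inequality x_0 + x_1 <= 1.5 can be supplied as:

>>> import numpy as np
>>> from scipy.optimize import NonlinearConstraint, shgo
>>> def g(x):
...     # constrained quantity, bounded above by 1.5 via the constraint object
...     return x[0] + x[1]
...
>>> nlc = NonlinearConstraint(g, -np.inf, 1.5)
>>> bounds = [(0, 2), (0, 2)]
>>> res = shgo(lambda x: (x[0] - 1)**2 + (x[1] - 1)**2, bounds,
...            constraints=nlc)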

n : int, optional

Number of sampling points used in the construction of the simplicial complex. For the default simplicial sampling method 2**dim + 1 sampling points are generated instead of the default n=100. For all other specified values n sampling points are generated. For sobol, halton and other arbitrary sampling_methods n=100 or another specified number of sampling points are generated.

iters : int, optional

Number of iterations used in the construction of the simplicial complex. Default is 1.

callback : callable, optional

Called after each iteration, as callback(xk), where xk is the current parameter vector.
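A minimal sketch of a callback that records the iterate after each global iteration (the list and function names are arbitrary):

>>> import numpy as np
>>> from scipy.optimize import rosen, shgo
>>> history = []
>>> def record_iterate(xk):
...     # xk is the current parameter vector after each global iteration
...     history.append(np.copy(xk))
...
>>> res = shgo(rosen, [(0, 2)] * 5, iters=3, callback=record_iterate)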

minimizer_kwargs : dict, optional

Extra keyword arguments to be passed to the minimizer scipy.optimize.minimize. Some important options could be:

method : str

The minimization method. If not given, chosen to be one of BFGS, L-BFGS-B, SLSQP, depending on whether or not the problem has constraints or bounds.

args : tuple

Extra arguments passed to the objective function (func) and its derivatives (Jacobian, Hessian).

options : dict, optional

Note that by default the tolerance is specified as {ftol: 1e-12}
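As a hedged sketch (method and tolerance chosen for illustration only), the local solver and its options can be selected like so:

>>> from scipy.optimize import rosen, shgo
>>> minimizer_kwargs = {'method': 'L-BFGS-B',
...                     'options': {'ftol': 1e-10}}
>>> res = shgo(rosen, [(0, 2)] * 5, minimizer_kwargs=minimizer_kwargs)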

options : dict, optional

A dictionary of solver options. Many of the options specified for the global routine are also passed to the scipy.optimize.minimize routine. The options that are also passed to the local routine are marked with "(L)".

Stopping criteria: the algorithm will terminate if any of the specified criteria are met. However, the default algorithm does not require any to be specified (a usage sketch follows the list of criteria below):

maxfev : int (L)

Maximum number of function evaluations in the feasible domain. (Note that only methods that support this option will terminate the routine at precisely the specified value. Otherwise the criterion will only terminate during a global iteration.)

f_min : float

Specify the minimum objective function value, if it is known.

f_tol : float

Precision goal for the value of f in the stopping criterion. Note that the global routine will also terminate if a sampling point in the global routine is within this tolerance.

maxiter : int

Maximum number of iterations to perform.

maxev : int

Maximum number of sampling evaluations to perform (includes searching in infeasible points).

maxtime : float

Maximum processing runtime allowed.

minhgrd : int

Minimum homology group rank differential. The homology group of the objective function is calculated (approximately) during every iteration. The rank of this group has a one-to-one correspondence with the number of locally convex subdomains in the objective function (after adequate sampling points each of these subdomains contains a unique global minimum). If the difference in the hgr is 0 between iterations for minhgrd specified iterations the algorithm will terminate.
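A hedged sketch of supplying stopping criteria through options (the values are arbitrary; rosen has a known global minimum value of 0):

>>> from scipy.optimize import rosen, shgo
>>> options = {'f_min': 0.0,     # known global minimum value
...            'f_tol': 1e-8,    # precision goal on f
...            'maxtime': 10.0}  # seconds of processing runtime
>>> res = shgo(rosen, [(0, 2)] * 5, options=options)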

Objective function knowledge:

symmetry : list or bool

Specify if the objective function contains symmetric variables. The search space (and therefore performance) is decreased by up to O(n!) times in the fully symmetric case. If True is specified then all variables will be set symmetric to the first variable. Default is set to False.

E.g. f(x) = (x_1 + x_2 + x_3) + (x_4)**2 + (x_5)**2 + (x_6)**2

In this equation x_2 and x_3 are symmetric to x_1, while x_5 and x_6 are symmetric to x_4; this can be specified to the solver as:

symmetry = [0,  # Variable 1
            0,  # symmetric to variable 1
            0,  # symmetric to variable 1
            3,  # Variable 4
            3,  # symmetric to variable 4
            3,  # symmetric to variable 4
            ]
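A hedged sketch of passing this specification to the solver (the objective and bounds below are illustrative only):

>>> from scipy.optimize import shgo
>>> def f_sym(x):
...     # matches the structure above: x_1..x_3 symmetric, x_4..x_6 symmetric
...     return (x[0] + x[1] + x[2]) + x[3]**2 + x[4]**2 + x[5]**2
...
>>> bounds = [(-1, 1)] * 6
>>> res = shgo(f_sym, bounds, options={'symmetry': [0, 0, 0, 3, 3, 3]})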
jac : bool or callable, optional

Jacobian (gradient) of objective function. Only for CG, BFGS, Newton-CG, L-BFGS-B, TNC, SLSQP, dogleg, trust-ncg. If jac is a boolean and is True, fun is assumed to return the gradient along with the objective function. If False, the gradient will be estimated numerically. jac can also be a callable returning the gradient of the objective. In this case, it must accept the same arguments as fun. (Passed to scipy.optimize.minimize automatically)

hess, hessp : callable, optional

Hessian (matrix of second-order derivatives) of objective function or Hessian of objective function times an arbitrary vector p. Only for Newton-CG, dogleg, trust-ncg. Only one of hessp or hess needs to be given. If hess is provided, then hessp will be ignored. If neither hess nor hessp is provided, then the Hessian product will be approximated using finite differences on jac. hessp must compute the Hessian times an arbitrary vector. (Passed to scipy.optimize.minimize automatically)

Algorithm settings:

minimize_every_iter : bool

If True then promising global sampling points will be passed to a local minimization routine every iteration. If False then only the final minimizer pool will be run. Defaults to True.

local_iter : int

Only evaluate a few of the best minimizer pool candidates every iteration. If False all potential points are passed to the local minimization routine.

infty_constraints : bool

If True then any sampling points generated which are outside the feasible domain will be saved and given an objective function value of inf. If False then these points will be discarded. Using this functionality could lead to higher performance with respect to function evaluations before the global minimum is found; specifying False will use less memory at the cost of a slight decrease in performance. Defaults to True.
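A hedged sketch combining these algorithm settings (values chosen for illustration only):

>>> from scipy.optimize import rosen, shgo
>>> options = {'minimize_every_iter': False,  # only minimize the final pool
...            'infty_constraints': False}    # discard infeasible samples
>>> res = shgo(rosen, [(0, 2)] * 5, iters=3, options=options)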

Feedback:

disp : bool (L)

Set to True to print convergence messages.

sampling_method : str or function, optional

Current built in sampling method options are halton, sobol and simplicial. The default simplicial provides the theoretical guarantee of convergence to the global minimum in finite time. The halton and sobol methods are faster in terms of sampling point generation at the cost of the loss of guaranteed convergence. They are more appropriate for most "easier" problems where the convergence is relatively fast. User defined sampling functions must accept two arguments of n sampling points of dimension dim per call and output an array of sampling points with shape n x dim.
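A hedged sketch of a user-defined sampling function, assuming (as with the built-in QMC samplers) that the returned points lie in the unit hypercube and are scaled to the bounds by shgo:

>>> import numpy as np
>>> from scipy.optimize import rosen, shgo
>>> rng = np.random.default_rng(12345)
>>> def uniform_sampling(n, dim):
...     # return an (n, dim) array of points in [0, 1]**dim
...     return rng.uniform(size=(n, dim))
...
>>> res = shgo(rosen, [(0, 2)] * 5, n=128, sampling_method=uniform_sampling)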

workers : int or map-like callable, optional

Sample and run the local serial minimizations in parallel. Supply -1 to use all available CPU cores, or an int to use that many Processes (uses multiprocessing.Pool).

Alternatively, supply a map-like callable, such as multiprocessing.Pool.map for parallel evaluation. This evaluation is carried out as workers(func, iterable). Requires that func be pickleable.

Added in version 1.11.0.
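A hedged sketch of both forms (the pool size is illustrative; parallel execution requires that func be importable and pickleable):

>>> from multiprocessing import Pool
>>> from scipy.optimize import rosen, shgo
>>> bounds = [(0, 2)] * 5
>>> res = shgo(rosen, bounds, workers=-1)  # use all available CPU cores
>>> with Pool(4) as pool:
...     res = shgo(rosen, bounds, workers=pool.map)  # map-like callable
...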

Returns:
res : OptimizeResult

The optimization result represented as an OptimizeResult object. Important attributes are: x the solution array corresponding to the global minimum, fun the function output at the global solution, xl an ordered list of local minima solutions, funl the function output at the corresponding local solutions, success a Boolean flag indicating if the optimizer exited successfully, message which describes the cause of the termination, nfev the total number of objective function evaluations including the sampling calls, nlfev the total number of objective function evaluations culminating from all local search optimizations, nit the number of iterations performed by the global routine.

Notes

Global optimization using simplicial homology global optimization [1]. Appropriate for solving general purpose NLP and blackbox optimization problems to global optimality (low-dimensional problems).

In general, the optimization problems are of the form:

minimize f(x)

subject to g_i(x) >= 0, i = 1, ..., m
           h_j(x) = 0,  j = 1, ..., p

where x is a vector of one or more variables. f(x) is the objective function R^n -> R, g_i(x) are the inequality constraints, and h_j(x) are the equality constraints.

Optionally, the lower and upper bounds for each element in x can also be specified using the bounds argument.

While most of the theoretical advantages of SHGO are only proven for when f(x) is a Lipschitz smooth function, the algorithm is also proven to converge to the global optimum for the more general case where f(x) is non-continuous, non-convex and non-smooth, if the default sampling method is used [1].

The local search method may be specified using the minimizer_kwargs parameter which is passed on to scipy.optimize.minimize. By default, the SLSQP method is used. In general, it is recommended to use the SLSQP, COBYLA, or COBYQA local minimization if inequality constraints are defined for the problem since the other methods do not use constraints.

The halton and sobol method points are generated using scipy.stats.qmc. Any other QMC method could be used.

References

[1]

Endres, SC, Sandrock, C, Focke, WW (2018) "A simplicial homology algorithm for Lipschitz optimisation", Journal of Global Optimization.

[2]

Joe, SW and Kuo, FY (2008) "Constructing Sobol' sequences with better two-dimensional projections", SIAM J. Sci. Comput. 30, 2635-2654.

[3]

Hock, W and Schittkowski, K (1981) "Test examples for nonlinear programming codes", Lecture Notes in Economics and Mathematical Systems, 187. Springer-Verlag, New York. http://www.ai7.uni-bayreuth.de/test_problem_coll.pdf

[4]

Wales, DJ (2015) "Perspective: Insight into reaction coordinates and dynamics from the potential energy landscape", Journal of Chemical Physics, 142(13), 2015.

Examples

First consider the problem of minimizing the Rosenbrock function, rosen:

>>> from scipy.optimize import rosen, shgo
>>> bounds = [(0, 2), (0, 2), (0, 2), (0, 2), (0, 2)]
>>> result = shgo(rosen, bounds)
>>> result.x, result.fun
(array([1., 1., 1., 1., 1.]), 2.920392374190081e-18)

Note that bounds determine the dimensionality of the objective function and are therefore a required input; however, you can specify empty bounds using None or objects like np.inf which will be converted to large float numbers.

>>> bounds = [(None, None), ] * 4
>>> result = shgo(rosen, bounds)
>>> result.x
array([0.99999851, 0.99999704, 0.99999411, 0.9999882 ])

Next, we consider the Eggholder function, a problem with several local minima and one global minimum. We will demonstrate the use of arguments and the capabilities of shgo. (https://en.wikipedia.org/wiki/Test_functions_for_optimization)

>>> import numpy as np
>>> def eggholder(x):
...     return (-(x[1] + 47.0)
...             * np.sin(np.sqrt(abs(x[0]/2.0 + (x[1] + 47.0))))
...             - x[0] * np.sin(np.sqrt(abs(x[0] - (x[1] + 47.0))))
...             )
...
>>> bounds = [(-512, 512), (-512, 512)]

shgo has built-in low discrepancy sampling sequences. First, we will input 64 initial sampling points of the Sobol' sequence:

>>> result = shgo(eggholder, bounds, n=64, sampling_method='sobol')
>>> result.x, result.fun
(array([512.        , 404.23180824]), -959.6406627208397)

shgo also has a return for any other local minima that were found; these can be called using:

>>> result.xl
array([[ 512.        ,  404.23180824],
       [ 283.0759062 , -487.12565635],
       [-294.66820039, -462.01964031],
       [-105.87688911,  423.15323845],
       [-242.97926   ,  274.38030925],
       [-506.25823477,    6.3131022 ],
       [-408.71980731, -156.10116949],
       [ 150.23207937,  301.31376595],
       [  91.00920901, -391.283763  ],
       [ 202.89662724, -269.38043241],
       [ 361.66623976, -106.96493868],
       [-219.40612786, -244.06020508]])
>>> result.funl
array([-959.64066272, -718.16745962, -704.80659592, -565.99778097,
       -559.78685655, -557.36868733, -507.87385942, -493.9605115 ,
       -426.48799655, -421.15571437, -419.31194957, -410.98477763])

These results are useful in applications where there are many global minima and the values of other global minima are desired or where the local minima can provide insight into the system (for example morphologies in physical chemistry [4]).

If we want to find a larger number of local minima, we can increase the number of sampling points or the number of iterations. We'll increase the number of sampling points to 64 and the number of iterations from the default of 1 to 3. Using simplicial this would have given us 64 x 3 = 192 initial sampling points.

>>> result_2 = shgo(eggholder,
...                 bounds, n=64, iters=3, sampling_method='sobol')
>>> len(result.xl), len(result_2.xl)
(12, 23)

Note the difference between, e.g., n=192, iters=1 and n=64, iters=3. In the first case the promising points contained in the minimiser pool are processed only once. In the latter case the pool is processed every 64 sampling points, for a total of 3 times.

To demonstrate solving problems with non-linear constraints consider the following example from Hock and Schittkowski problem 73 (cattle-feed) [3]:

minimize: f = 24.55 * x_1 + 26.75 * x_2 + 39 * x_3 + 40.50 * x_4

subject to: 2.3 * x_1 + 5.6 * x_2 + 11.1 * x_3 + 1.3 * x_4 - 5 >= 0,

            12 * x_1 + 11.9 * x_2 + 41.8 * x_3 + 52.1 * x_4 - 21
                - 1.645 * sqrt(0.28 * x_1**2 + 0.19 * x_2**2
                               + 20.5 * x_3**2 + 0.62 * x_4**2) >= 0,

            x_1 + x_2 + x_3 + x_4 - 1 == 0,

            1 >= x_i >= 0 for all i

The approximate answer given in [3] is:

f([0.6355216, -0.12e-11, 0.3127019, 0.05177655]) = 29.894378
>>> def f(x):  # (cattle-feed)
...     return 24.55*x[0] + 26.75*x[1] + 39*x[2] + 40.50*x[3]
...
>>> def g1(x):
...     return 2.3*x[0] + 5.6*x[1] + 11.1*x[2] + 1.3*x[3] - 5  # >=0
...
>>> def g2(x):
...     return (12*x[0] + 11.9*x[1] + 41.8*x[2] + 52.1*x[3] - 21
...             - 1.645 * np.sqrt(0.28*x[0]**2 + 0.19*x[1]**2
...                               + 20.5*x[2]**2 + 0.62*x[3]**2)
...             )  # >=0
...
>>> def h1(x):
...     return x[0] + x[1] + x[2] + x[3] - 1  # == 0
...
>>> cons = ({'type': 'ineq', 'fun': g1},
...         {'type': 'ineq', 'fun': g2},
...         {'type': 'eq', 'fun': h1})
>>> bounds = [(0, 1.0),]*4
>>> res = shgo(f, bounds, n=150, constraints=cons)
>>> res
 message: Optimization terminated successfully.
 success: True
     fun: 29.894378159142136
    funl: [ 2.989e+01]
       x: [ 6.355e-01  1.137e-13  3.127e-01  5.178e-02] # may vary
      xl: [[ 6.355e-01  1.137e-13  3.127e-01  5.178e-02]] # may vary
     nit: 1
    nfev: 142 # may vary
   nlfev: 35 # may vary
   nljev: 5
   nlhev: 0
>>> g1(res.x), g2(res.x), h1(res.x)
(-5.062616992290714e-14, -2.9594104944408173e-12, 0.0)