# SciML/Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
Optimization.jl is a package with a scope that is beyond your normal global optimization package. Optimization.jl seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means you learn one package and you learn them all! Optimization.jl adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most cases, while allowing all of the options in a single unified interface.
Assuming that you already have Julia correctly installed, it suffices to install Optimization.jl in the standard way:
```julia
using Pkg
Pkg.add("Optimization")
```
The packages relevant to the core functionality of Optimization.jl will be imported accordingly and, in most cases, you do not have to worry about the manual installation of dependencies. Below is the list of packages that need to be installed explicitly if you intend to use the specific optimization algorithms offered by them (see the installation sketch after the list):
- OptimizationAuglag for augmented Lagrangian methods
- OptimizationBBO for BlackBoxOptim.jl
- OptimizationCMAEvolutionStrategy for CMAEvolutionStrategy.jl
- OptimizationEvolutionary for Evolutionary.jl (see also this documentation)
- OptimizationGCMAES for GCMAES.jl
- OptimizationIpopt for Ipopt.jl
- OptimizationLBFGSB for LBFGSB.jl
- OptimizationMadNLP for MadNLP.jl
- OptimizationManopt for Manopt.jl (optimization on manifolds)
- OptimizationMetaheuristics for Metaheuristics.jl (see also this documentation)
- OptimizationMOI for MathOptInterface.jl (usage of algorithms via the MathOptInterface API; see also the API documentation)
- OptimizationMultistartOptimization for MultistartOptimization.jl (see also this documentation)
- OptimizationNLopt for NLopt.jl (usage via the NLopt API; see also the available algorithms)
- OptimizationNLPModels for NLPModels.jl
- OptimizationNOMAD for NOMAD.jl (see also this documentation)
- OptimizationODE for optimization of steady-state and time-dependent ODE problems
- OptimizationOptimJL for Optim.jl
- OptimizationOptimisers for Optimisers.jl (machine learning optimizers)
- OptimizationPolyalgorithms for polyalgorithm optimization strategies
- OptimizationPRIMA for PRIMA.jl
- OptimizationPyCMA for Python's CMA-ES implementation via PythonCall.jl
- OptimizationQuadDIRECT for QuadDIRECT.jl
- OptimizationSciPy for SciPy optimization algorithms via PythonCall.jl
- OptimizationSophia for the Sophia optimizer (a second-order stochastic optimizer)
- OptimizationSpeedMapping for SpeedMapping.jl (see also this documentation)
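Each solver subpackage is installed and loaded just like any other Julia package. A minimal sketch using the Optim.jl wrapper as an example (any of the subpackages above works the same way):

```julia
using Pkg
Pkg.add("OptimizationOptimJL")  # install the wrapper for Optim.jl's solvers

using OptimizationOptimJL       # brings solvers such as NelderMead() and BFGS() into scope
```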
For information on using the package, see the stable documentation. Use the in-development documentation for the version that contains the unreleased features.
As a canonical example, consider minimizing the Rosenbrock function, first locally with the derivative-free NelderMead() method from Optim.jl, and then globally with an adaptive differential evolution method from BlackBoxOptim.jl:

```julia
using Optimization

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, x0, p)

using OptimizationOptimJL
sol = solve(prob, NelderMead())

using OptimizationBBO
prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```
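In both cases, the returned solution object can be queried directly. A minimal sketch, assuming the standard `u` and `objective` fields of an Optimization.jl solution:

```julia
sol.u          # the best candidate found, close to the true minimizer [1.0, 1.0]
sol.objective  # the objective value at sol.u
```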
Warning: The output of the second optimization task (BBO_adaptive_de_rand_1_bin_radiuslimited()) is currently misleading in the sense that it returns Status: failure (reached maximum number of iterations). However, convergence is actually reached, and the confusing message stems from the reliance on the Optim.jl output struct (where reaching the maximum number of iterations is rightly regarded as a failure). The improved output struct will soon be implemented.
The output of the first optimization task (with the NelderMead() algorithm) is given below:
```
* Status: success

* Candidate solution
    Final objective value:     3.525527e-09

* Found with
    Algorithm:     Nelder-Mead

* Convergence measures
    √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

* Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    60
    f(x) calls:    118
```

We can also explore other methods in a similar way:
```julia
using ForwardDiff

f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, x0, p)
sol = solve(prob, BFGS())
```
For instance, the above optimization task produces the following output:
```
* Status: success

* Candidate solution
    Final objective value:     7.645684e-21

* Found with
    Algorithm:     BFGS

* Convergence measures
    |x - x'|               = 3.48e-07 ≰ 0.0e+00
    |x - x'|/|x'|          = 3.48e-07 ≰ 0.0e+00
    |f(x) - f(x')|         = 6.91e-14 ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = 9.03e+06 ≰ 0.0e+00
    |g(x)|                 = 2.32e-09 ≤ 1.0e-08

* Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    16
    f(x) calls:    53
    ∇f(x) calls:   53
```

The same gradient-based setup also handles box constraints via Fminbox:

```julia
prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, Fminbox(GradientDescent()))
```
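Automatic differentiation is optional; a hand-written gradient can be supplied instead. A minimal sketch, assuming the in-place gradient signature `(G, u, p)` accepted by the `grad` keyword of `OptimizationFunction` (the `rosenbrock_grad!` helper below is ours, not part of the package):

```julia
# Analytic gradient of the Rosenbrock function, written in-place into G.
function rosenbrock_grad!(G, u, p)
    G[1] = -2.0 * (p[1] - u[1]) - 4.0 * p[2] * u[1] * (u[2] - u[1]^2)
    G[2] = 2.0 * p[2] * (u[2] - u[1]^2)
    return nothing
end

f_manual = OptimizationFunction(rosenbrock; grad = rosenbrock_grad!)
prob_manual = OptimizationProblem(f_manual, zeros(2), [1.0, 100.0])
sol_manual = solve(prob_manual, BFGS())
```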
These examples demonstrate that Optimization.jl provides an intuitive way of specifying optimization tasks and relatively easy access to a wide range of optimization algorithms.