A library that provides moderately fast, accurate, and automatic differentiation (computes derivative / gradient) of mathematical functions.
AutoDiff provides a simple and intuitive API for computing function gradients/derivatives along with a fast algorithm for performing the computation. Such computations are mainly useful in iterative numerical optimization scenarios.
```csharp
using System;
using AutoDiff;

class Program
{
    public static void Main(string[] args)
    {
        // define variables
        var x = new Variable();
        var y = new Variable();
        var z = new Variable();

        // define our function
        var func = (x + y) * TermBuilder.Exp(z + x * y);

        // prepare arrays needed for evaluation/differentiation
        Variable[] vars = { x, y, z };
        double[] values = { 1, 2, -3 };

        // evaluate func at (1, 2, -3)
        double value = func.Evaluate(vars, values);

        // calculate the gradient at (1, 2, -3)
        double[] gradient = func.Differentiate(vars, values);

        // print results
        Console.WriteLine("The value at (1, 2, -3) is " + value);
        Console.WriteLine("The gradient at (1, 2, -3) is ({0}, {1}, {2})",
            gradient[0], gradient[1], gradient[2]);
    }
}
```
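For repeated evaluation inside an optimization loop, the library's documentation also describes compiled terms, which avoid re-walking the expression tree on every call. The sketch below assumes a `Compile` method that fixes a variable order and returns a compiled term whose `Differentiate` yields both the gradient and the function value; these signatures are recalled from the documentation and may differ in the current API, so treat this as an illustration and check the docs.

```csharp
using System;
using AutoDiff;

class CompiledDemo
{
    public static void Main()
    {
        var x = new Variable();
        var y = new Variable();
        var func = TermBuilder.Exp(x * y) + x;

        // Compile once; the variable order given here defines the meaning
        // of the input array in later calls (assumed API, see lead-in).
        var compiled = func.Compile(x, y);

        // Evaluate and differentiate many times without re-interpreting
        // the term tree on each iteration.
        var result = compiled.Differentiate(new double[] { 1, 2 });
        double[] gradient = result.Item1;
        double value = result.Item2;

        Console.WriteLine("value = {0}", value);
        Console.WriteLine("gradient = ({0}, {1})", gradient[0], gradient[1]);
    }
}
```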
The documentation contains some basic tutorials, there is an article on CodeProject, and the source code contains some code examples in addition to the code of the library itself.
There are many open-source and commercial .NET libraries that offer numeric optimization among their features (for example, Microsoft Solver Foundation, AlgLib, Extreme Optimization, and CenterSpace NMath). Most of them require the user to supply both the function and the function's gradient. This library saves the work of manually deriving and coding the gradient: once the developer defines his/her function, the AutoDiff library can automatically evaluate and differentiate it at any point. This allows easy development and prototyping of applications that require numerical optimization.
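Since gradients are the main ingredient of iterative optimization, a typical usage pattern is a simple gradient-descent loop around the library's `Differentiate` call shown in the example above. The sketch below is illustrative and not part of the library; it assumes the term operators accept numeric constants (as in the `z + x * y` example) and that `TermBuilder.Power` takes a term and a constant exponent.

```csharp
using System;
using AutoDiff;

class GradientDescentDemo
{
    public static void Main()
    {
        // minimize f(x, y) = (x - 1)^2 + (y - 2)^2, whose minimum is at (1, 2)
        var x = new Variable();
        var y = new Variable();
        var func = TermBuilder.Power(x - 1, 2) + TermBuilder.Power(y - 2, 2);

        Variable[] vars = { x, y };
        double[] point = { 0, 0 };   // starting guess
        const double rate = 0.1;     // fixed step size, for simplicity

        for (int i = 0; i < 200; ++i)
        {
            // AutoDiff supplies the gradient; no hand-derived derivatives needed
            double[] grad = func.Differentiate(vars, point);
            for (int j = 0; j < point.Length; ++j)
                point[j] -= rate * grad[j];
        }

        // point should now be close to (1, 2)
        Console.WriteLine("argmin is approximately ({0:F4}, {1:F4})", point[0], point[1]);
    }
}
```

A fixed step size is enough for this convex toy function; real applications would pair the AutoDiff gradients with a line search or an off-the-shelf optimizer.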
- Moderate execution speed. We aim to compute a gradient within no more than 50 times the duration of a function evaluation by manually tuned code.
- Composition of functions using arithmetic operators, Exp, Log, Power and user-defined unary and binary functions.
- Function gradient evaluation at specified points
- Function value evaluation at specified points
- Computes gradients using the reverse-mode AD algorithm in linear time, which is substantially faster than numerical gradient approximation for multivariate functions.
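To see why reverse mode is linear-time, here is a minimal tape-based sketch. This is not the AutoDiff library's internal code, just an illustration: each tape node records its value, its parents, and the local partial derivative toward each parent, and a single backward sweep over the tape accumulates the adjoints of every variable at once.

```csharp
using System;
using System.Collections.Generic;

// Minimal tape-based reverse-mode AD (illustration only).
class Tape
{
    // Each node: computed value, plus up to two parents with the local
    // partial derivative of this node with respect to each parent.
    private readonly List<(double val, int p1, double d1, int p2, double d2)> nodes
        = new List<(double, int, double, int, double)>();

    public int Variable(double value)
    {
        nodes.Add((value, -1, 0, -1, 0));
        return nodes.Count - 1;
    }

    public double Value(int i) => nodes[i].val;

    public int Add(int a, int b)
    {
        // d(a+b)/da = 1, d(a+b)/db = 1
        nodes.Add((nodes[a].val + nodes[b].val, a, 1, b, 1));
        return nodes.Count - 1;
    }

    public int Mul(int a, int b)
    {
        // d(a*b)/da = b, d(a*b)/db = a
        nodes.Add((nodes[a].val * nodes[b].val, a, nodes[b].val, b, nodes[a].val));
        return nodes.Count - 1;
    }

    // One reverse sweep over the tape: O(number of nodes), regardless of
    // how many input variables there are.
    public double[] Gradient(int output)
    {
        var adj = new double[nodes.Count];
        adj[output] = 1;
        for (int i = nodes.Count - 1; i >= 0; --i)
        {
            var (_, p1, d1, p2, d2) = nodes[i];
            if (p1 >= 0) adj[p1] += d1 * adj[i];
            if (p2 >= 0) adj[p2] += d2 * adj[i];
        }
        return adj;
    }
}

class ReverseModeDemo
{
    public static void Main()
    {
        var tape = new Tape();
        int x = tape.Variable(3);
        int y = tape.Variable(4);
        int f = tape.Add(tape.Mul(x, y), x);   // f = x*y + x

        double[] adj = tape.Gradient(f);
        Console.WriteLine("f = {0}", tape.Value(f));                       // 15
        Console.WriteLine("df/dx = {0}, df/dy = {1}", adj[x], adj[y]);     // 5, 3
    }
}
```

By contrast, numerical (finite-difference) approximation needs one extra function evaluation per variable, so it costs O(n) evaluations for n variables where the reverse sweep costs roughly one.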
If you like the library and it helps you publish a research paper, please cite the paper I originally wrote the library for: geosemantic.bib
- Andreas Witsch, Hendrik Skubch, Stefan Niemczyk, Kurt Geihs. Using incomplete satisfiability modulo theories to determine robotic tasks. Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference.
- Michael Kommenda, Michael Affenzeller, Gabriel Kronberger, Stephan M. Winkler. Nonlinear Least Squares Optimization of Constants in Symbolic Regression. Revised Selected Papers of the 14th International Conference on Computer Aided Systems Theory - EUROCAST 2013 - Volume 8111.
- Alex Shtof, Alexander Agathos, Yotam Gingold, Ariel Shamir, Daniel Cohen-Or. Geosemantic Snapping for Sketch-Based Modeling. Eurographics 2013 proceedings (code repository).
- Michael Kommenda, Gabriel Kronberger, Stephan Winkler, Michael Affenzeller, Stefan Wagner. Effects of constant optimization by nonlinear least squares minimization in symbolic regression. Proceedings of the 15th Annual Conference Companion on Genetic and Evolutionary Computation.
- Hendrik Skubch. Solving non-linear arithmetic constraints in soft real-time environments. Proceedings of the 27th Annual ACM Symposium on Applied Computing.
- AlicaEngine - a cooperative planning engine for robotics. You can see it in action in this video.
- HeuristicLab - a framework for heuristic and evolutionary algorithms that is developed by members of the Heuristic and Evolutionary Algorithms Laboratory (HEAL).