# vim-autograd

Automatic differentiation library written in pure Vim script.
vim-autograd provides a foundation for automatic differentiation using the Define-by-Run approach, as in Chainer or PyTorch. Since it is written entirely in pure Vim script, it has no dependencies.
This library makes it possible to build next-generation plugins that perform numerical computation on multidimensional arrays, or deep learning with gradient descent.
If you are using vim-plug, you can install it as follows.

```vim
Plug 'pit-ray/vim-autograd'
```

If you want to use the more efficient Vim9 script, please install the experimental vim9 branch implementation.

```vim
Plug 'pit-ray/vim-autograd', {'branch': 'vim9'}
```
A computational graph is constructed by applying the provided differentiable functions to a Tensor object, and the gradient is calculated by backpropagating from the output.
```vim
function! s:f(x) abort
  " y = x^5 - 2x^3
  let y = autograd#sub(a:x.p(5), a:x.p(3).m(2))
  return y
endfunction

function! s:example() abort
  let x = autograd#tensor(2.0)
  let y = s:f(x)
  call y.backward()
  echo x.grad.data
endfunction

call s:example()
```
Output:

```
[56.0]
```

The computational graph is automatically generated, as shown in the image below.
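The gradient stored in `x.grad.data` can drive a simple optimization loop. Below is a minimal gradient-descent sketch that reuses only the calls from the example above (`autograd#tensor`, `.p()`, `.backward()`, `.grad.data`); the loop structure and learning rate are illustrative, not part of the library.

```vim
" Minimize y = x^2 by gradient descent; x should approach 0.
function! s:descend() abort
  let val = 5.0
  let lr = 0.1
  for step in range(50)
    " Rebuild the graph each iteration (Define-by-Run)
    let x = autograd#tensor(val)
    let y = x.p(2)
    call y.backward()
    " dy/dx = 2x, so val shrinks each step
    let val = val - lr * x.grad.data[0]
  endfor
  echo val
endfunction

call s:descend()
```

Because the graph is rebuilt on every iteration, ordinary Vim script control flow (loops, conditionals) composes naturally with the differentiable functions.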
- Basic differentiation and computational graph visualization
- Higher-order differentiation using double-backprop
- Classification using deep learning
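A backpropagated gradient can also be sanity-checked against a central finite difference computed on plain floats. This sketch assumes only the API shown in the usage example above; the `pow()` lambda is ordinary Vim script, not part of the library.

```vim
" Compare the backpropagated dy/dx of y = x^5 - 2x^3 at x = 2.0
" with a central finite difference.
function! s:gradcheck() abort
  let x = autograd#tensor(2.0)
  let y = autograd#sub(x.p(5), x.p(3).m(2))
  call y.backward()
  let grad = x.grad.data[0]

  " Central difference on plain floats
  let F = {v -> pow(v, 5) - 2 * pow(v, 3)}
  let h = 1.0e-4
  let num = (F(2.0 + h) - F(2.0 - h)) / (2 * h)

  " Both values should be close to 56.0
  echo grad num
endfunction

call s:gradcheck()
```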
This library is provided under the MIT License.
- pit-ray