https://github.com/pit-ray/vim-autograd
Automatic differentiation library written in pure Vim script.
- Host: GitHub
- URL: https://github.com/pit-ray/vim-autograd
- Owner: pit-ray
- License: MIT
- Created: 2022-03-13T19:13:18.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-02-23T14:18:10.000Z (9 months ago)
- Last Synced: 2024-05-01T16:30:44.599Z (7 months ago)
- Topics: autograd, deep-learning, neural-network, vim, vim-autograd, vim8, vim9, vim9script, vimscript
- Language: Vim Script
- Homepage:
- Size: 269 KB
- Stars: 26
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Funding: .github/FUNDING.yml
- License: LICENSE
# vim-autograd
**Automatic differentiation library written in pure Vim script.**

[![test](https://github.com/pit-ray/vim-autograd/actions/workflows/test.yml/badge.svg?branch=main)](https://github.com/pit-ray/vim-autograd/actions/workflows/test.yml) [![test-vim9](https://github.com/pit-ray/vim-autograd/actions/workflows/test-vim9.yml/badge.svg?branch=vim9)](https://github.com/pit-ray/vim-autograd/actions/workflows/test-vim9.yml)
vim-autograd provides a foundation for automatic differentiation using the Define-by-Run approach adopted by frameworks such as Chainer and PyTorch. Since it is written entirely in pure Vim script, it has no external dependencies.

This library lets you create next-generation plugins that perform numerical computation on multidimensional arrays or deep learning with the gradient descent method.
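As a sketch of the gradient-descent use case, the snippet below minimizes y = x^2. It relies only on calls that appear later in this README (`autograd#tensor()`, the `.p()` power method, `backward()`, and `grad.data`); mutating `x.data` in place and resetting the gradient by writing to `x.grad.data` are assumptions about the API made for illustration, not confirmed behavior.

```vim
" Gradient-descent sketch: minimize y = x^2 starting from x = 3.0.
" NOTE: the in-place updates of x.data and x.grad.data below are
" assumptions about the Tensor internals, shown for illustration only.
function! s:descend() abort
  let x = autograd#tensor(3.0)
  let lr = 0.1
  for i in range(50)
    let y = x.p(2)            " y = x^2
    call y.backward()         " populates x.grad with dy/dx = 2x
    let x.data[0] = x.data[0] - lr * x.grad.data[0]
    let x.grad.data[0] = 0.0  " assumed way to reset the gradient
  endfor
  echo x.data                 " x should approach the minimum at 0
endfunction

call s:descend()
```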
## Installation
### Vim script
If you are using [vim-plug](https://github.com/junegunn/vim-plug), you can install it as follows.

```vim
Plug 'pit-ray/vim-autograd'
```

### Vim9 script
If you want to use the more efficient Vim9 script, install the experimental [vim9 branch](https://github.com/pit-ray/vim-autograd/tree/vim9) implementation.

```vim
Plug 'pit-ray/vim-autograd', {'branch': 'vim9'}
```

## Usage
A computational graph is constructed by applying the provided differentiable functions to a Tensor object, and the gradient is calculated by backpropagating from the output.
```vim
function! s:f(x) abort
  " y = x^5 - 2x^3
  let y = autograd#sub(a:x.p(5), a:x.p(3).m(2))
  return y
endfunction

function! s:example() abort
  let x = autograd#tensor(2.0)
  let y = s:f(x)
  call y.backward()
  echo x.grad.data
endfunction

call s:example()
```

**Output**
```
[56.0]
```

The computational graph is automatically generated as shown below.
## Examples
- [Basic differentiation and computational graph visualization](examples/README.md#simplest-differentiation)
- [Higher-order differentiation using double-backprop](examples/README.md#higher-order-differentiation)
- [Classification using deep learning](examples/README.md#classification-using-deep-learning)

## Related posts
- https://zenn.dev/pitray/articles/482e89ddff329c

## References
- [oreilly-japan/deep-learning-from-scratch-3](https://github.com/oreilly-japan/deep-learning-from-scratch-3)
- [chainer/chainer](https://github.com/chainer/chainer)
- [pytorch/pytorch](https://github.com/pytorch/pytorch)
- [numpy/numpy](https://github.com/numpy/numpy)
- [mattn/vim-brain](https://github.com/mattn/vim-brain)

## License
This library is provided under the **MIT License**.

## Author
- pit-ray