https://github.com/o1lo01ol1o/diffhask
DSL for forward and reverse mode automatic differentiation in Haskell. Port of DiffSharp.
- Host: GitHub
- URL: https://github.com/o1lo01ol1o/diffhask
- Owner: o1lo01ol1o
- License: MIT
- Created: 2017-11-26T16:10:14.000Z (about 8 years ago)
- Default Branch: master
- Last Pushed: 2019-06-27T12:49:11.000Z (over 6 years ago)
- Last Synced: 2024-10-31T14:40:42.073Z (about 1 year ago)
- Topics: automatic-differentiation, backpropagation, deep-learning, diffsharp, haskell
- Language: Haskell
- Homepage:
- Size: 160 KB
- Stars: 26
- Watchers: 5
- Forks: 3
- Open Issues: 7
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
- awesome-haskell-deep-learning - diffhask - DSL for forward and reverse mode automatic differentiation via a version of operator overloading. Port of DiffSharp to Haskell; currently a work in progress. | [Tim Pierson](https://github.com/o1lo01ol1o) (Haskell Packages / Packages Under Active Development)
README
# diffhask
[Hackage](https://hackage.haskell.org/package/diffhask)
[Build Status](https://travis-ci.org/o1lo01ol1o/diffhask)
[License](https://github.com/o1lo01ol1o/diffhask/blob/master/LICENSE)
> Vive la programmation différentielle! ("Long live differentiable programming!")
>
> -- Yann LeCun
diffhask is a DSL for forward- and reverse-mode automatic differentiation, implemented via a form of operator overloading.
It is a port of [DiffSharp](https://github.com/DiffSharp/DiffSharp) to Haskell and is currently a work in progress.
Example usage:
```haskell
-- Babylonian (Newton) step for sqrt b; iterating g to a fixed point converges to sqrt b.
let g a b = (a + b / a) / (D 2.0 :: D Float)
-- Value and derivative of the fixed point (i.e. sqrt) at 25: sqrt 25 = 5.0, d/dx sqrt x |_25 = 0.1.
compute $ diff' (fixPoint g (D 1.2)) (D 25.0 :: D Float)
-- (D 1.0, D 5.0), (D 1.0, D 0.1)
```
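For readers unfamiliar with the operator-overloading approach mentioned above, the following is a minimal, self-contained sketch of forward-mode AD with dual numbers in plain Haskell. It is only an illustration of the general technique, not diffhask's internals or API; the names `Dual`, `constD`, and `diffD` are hypothetical.

```haskell
-- Forward-mode AD via operator overloading: a dual number carries a value
-- (primal) together with its derivative (tangent), and the Num/Fractional
-- instances propagate both through each arithmetic operation.
data Dual = Dual { primal :: Double, tangent :: Double }
  deriving (Show)

-- Lift a constant: its derivative is zero.
constD :: Double -> Dual
constD x = Dual x 0

instance Num Dual where
  Dual x dx + Dual y dy = Dual (x + y) (dx + dy)
  Dual x dx - Dual y dy = Dual (x - y) (dx - dy)
  Dual x dx * Dual y dy = Dual (x * y) (x * dy + y * dx)  -- product rule
  abs (Dual x dx)       = Dual (abs x) (dx * signum x)
  signum (Dual x _)     = Dual (signum x) 0
  fromInteger n         = constD (fromInteger n)

instance Fractional Dual where
  Dual x dx / Dual y dy = Dual (x / y) ((dx * y - x * dy) / (y * y))  -- quotient rule
  fromRational r        = constD (fromRational r)

-- Evaluate f and its derivative at x by seeding the tangent with 1.
diffD :: (Dual -> Dual) -> Double -> (Double, Double)
diffD f x = let Dual y dy = f (Dual x 1) in (y, dy)

-- Example: diffD (\x -> x * x + 3 * x) 2  ==  (10.0, 7.0)
```

Reverse mode, which diffhask also provides, instead records operations on a tape and propagates adjoints backwards; that bookkeeping is omitted here for brevity.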