https://github.com/lancelet/cropduster
Neural Networks for Near-Optimal Control, in *HASKELL*
- Host: GitHub
- URL: https://github.com/lancelet/cropduster
- Owner: lancelet
- Created: 2023-09-28T11:01:26.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-11-11T00:51:24.000Z (over 1 year ago)
- Last Synced: 2025-01-13T21:27:25.219Z (4 months ago)
- Topics: crazy, haskell, neural-network, optimal-control
- Language: Haskell
- Homepage:
- Size: 107 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
# Optimal Control with Neural Networks
[Haskell CI](https://github.com/lancelet/cropduster/actions/workflows/haskell.yml)
The GitHub Pages site for this repository is at
[https://lancelet.github.io/cropduster](https://lancelet.github.io/cropduster).
The GitHub Pages site currently contains videos of some of the animated
plots generated by the Haskell code.

WIP!
This is a repository containing work for a planned presentation to FP-Syd
(the Sydney, Australia, Functional Programming Group) on near-optimal control
using neural networks.

The presentation will eventually cover:
- Basics of stochastic gradient descent (SGD).
- Backpropagation.
- Training supervised networks on batches of examples.
- Training networks for near-optimal control (i.e. control of dynamical systems
  using networks that have been trained by minimising an objective function
  typical of an optimal control problem).

Code examples will (hopefully) include:
- Least-squares linear fitting by SGD, compared against the closed-form
  solution for a least-squares fit. This introduces SGD with manual gradient
  calculation to cover the basic approach and to motivate automatic
  differentiation.
- Fitting the parameters of a mass-spring-damper system to an observed
trajectory. This demonstrates backpropagation through a fixed-step RK4
ODE solver.
- Training a network to swing up a pole in a cart-pole pendulum system.
- Training a network to do a rocket landing in 2D.

## Plan
Major items:
- [x] Show SGD linear fitting.
- [x] Show SGD linear fitting in phase space.
- [x] Implement RK4 for backprop.
- [x] Fit parameters to a mass-spring-damper system using backprop.
- [x] Run compiling and plotting in GitHub CI.
- [x] Trial Haskell Chart to see if it has better plotting performance.
- [x] Trial video.js in place of plain HTML video tags.
- [x] Generate slides using Pandoc and reveal.js.
- [x] Clean up `Plot.hs` now that I'm using Cairo for plots.
- [x] Set up a flow to render images and then process them to videos straight
from Haskell to avoid crazy bash scripting.
- [x] Figure out how to include reveal.js presentation in the Hugo output.
- [ ] Tidy spring-damper example; add noise to data; make mass a
constant since only spring and damping constants are truly free
parameters.
- [ ] Tidy build.
- [ ] Create other plots using Haskell Chart.
- [ ] Network for cart-pole pendulum balancing example.
- [ ] Network for 2D rocket landing example.
- [ ] Presentation for FP-Syd.
- [ ] Try plyr.js instead of video.js.

## Running
To generate linear fitting example movies:
```
$ ./linfit-examples.sh # generates movies under the plots directory
```

To generate mass-spring-damper example movies:
```
$ ./msd-examples.sh # generates movies under the plots directory
```
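
As a taste of the planned material, the first code example (least-squares
linear fitting by SGD with a manually derived gradient) might look something
like the following minimal sketch. All names here are illustrative and not
taken from the repository's actual code:

```haskell
module Main where

-- Model: y = m * x + c; per-sample loss: (m*x + c - y)^2.
-- Manual gradients: dL/dm = 2*(m*x + c - y)*x, dL/dc = 2*(m*x + c - y).
step :: Double -> (Double, Double) -> (Double, Double) -> (Double, Double)
step lr (m, c) (x, y) =
  let err = m * x + c - y
  in  (m - lr * 2 * err * x, c - lr * 2 * err)

-- One pass of SGD over the data set, updating after every sample.
epoch :: Double -> (Double, Double) -> [(Double, Double)] -> (Double, Double)
epoch lr = foldl (step lr)

main :: IO ()
main = do
  -- Samples drawn from y = 3x + 1 (no noise, for clarity).
  let samples = [ (x, 3 * x + 1) | x <- [0, 0.1 .. 2] ]
      -- 500 epochs from (0, 0); converges towards (3, 1).
      fit     = iterate (\p -> epoch 0.05 p samples) (0, 0) !! 500
  putStrLn ("fitted (slope, intercept) ~ " ++ show fit)
```

Replacing the hand-written gradients in `step` with automatic differentiation
is exactly the move the later examples are intended to motivate.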