Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/ggiuffre/machine-lambda
Purely functional implementation of SGD on feed-forward neural nets
deep-neural-networks functional-programming haskell-library neural-network stochastic-gradient-descent
Last synced: 5 days ago
- Host: GitHub
- URL: https://github.com/ggiuffre/machine-lambda
- Owner: ggiuffre
- License: gpl-3.0
- Created: 2019-12-28T14:50:56.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2020-05-13T06:53:25.000Z (over 4 years ago)
- Last Synced: 2024-11-18T23:56:17.811Z (2 months ago)
- Topics: deep-neural-networks, functional-programming, haskell-library, neural-network, stochastic-gradient-descent
- Language: Haskell
- Homepage:
- Size: 73.2 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Machine Lambda
A purely functional Haskell implementation of stochastic gradient descent for deep feed-forward neural networks.
Why functional programming for neural networks? The goal of this library is mathematical readability: the close correspondence between functional programs and mathematical expressions makes it possible to manipulate neural networks (or anything else that can be expressed mathematically) by declaring expressions, rather than by implementing algorithms.
Approaching a problem by declaring its solution, instead of implementing an algorithm to find that solution, allows the programmer to be succinct and clear. For example, here is the function that computes the output of a neural network for a given input:
```haskell
output :: (Floating t) => Matrix t -> Network t -> Matrix t
output = foldl activation
```
Here, `output` is a function of two arguments: an input `Matrix` and a `Network`. It is defined as a [left fold](https://en.wikipedia.org/wiki/Fold_(higher-order_function)) over the layers of the network, with the input matrix as the initial value of the accumulator and the `activation` function as the combining operation.
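To make the fold concrete, here is a minimal self-contained sketch. The `Layer` representation and the body of `activation` below are illustrative assumptions, not the actual definitions used by `DeepNN`:

```haskell
import Data.Matrix (Matrix, multStd)

-- Hypothetical layer: a weight matrix plus an elementwise nonlinearity.
data Layer t = Layer (Matrix t) (t -> t)

type Network t = [Layer t]

-- One fold step: multiply the layer's weights by the current
-- activations, then apply the nonlinearity to every entry.
activation :: (Floating t) => Matrix t -> Layer t -> Matrix t
activation x (Layer w f) = fmap f (multStd w x)

-- Folding `activation` over the layers threads the input through
-- the whole network, exactly as in the definition above.
output :: (Floating t) => Matrix t -> Network t -> Matrix t
output = foldl activation
```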
## Usage
The `example.hs` program shows how to use this library.
The `DeepNN` module exports data structures and functions to create, train, and use deep feed-forward neural networks:
* `Network` is a data type that represents a deep neural network;
* `CostFunction` is the class of cost functions, whose available instances are `QuadCost` and `CrossEntCost`;
* `output` is the output of a neural network, given some input;
* `sgdUpdates` and `sgdUpdates'` each yield an infinite list of networks whose parameters are updated with SGD across (infinitely many) epochs, with and without re-shuffling the dataset at each epoch, respectively;
* `cost` is the cost of a neural network on a dataset, w.r.t. a given cost function;
* `binAccuracy` and `catAccuracy` are the accuracies (resp. binary and categorical) of a neural network on a dataset;
* `randNet` is a network with random `Double` weights sampled from a normal distribution with given mean and standard deviation.

The `Dataset` module currently exports the following functions:
* `fromCsv` is a dataset of 1D `Double` samples, taken from a given CSV file;
* `shuffled` is a random permutation of the elements in a list, given a random number generator;
* `foreach` is the result of applying a function to each element of a matrix;
* `standardized` is a list of training samples that have been standardized.

To use a module (such as `DeepNN`, for example), have `DeepNN.hs` in the search path of GHC, then `import DeepNN` inside your Haskell program. See the example program.
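As a rough end-to-end illustration, a program using both modules could look like the sketch below. The argument orders of `randNet`, `sgdUpdates`, and `cost`, and the assumption that `fromCsv` runs in `IO`, are guesses for illustration only; `example.hs` is the authoritative reference:

```haskell
import System.Random (getStdGen)
import DeepNN
import Dataset (fromCsv)

main :: IO ()
main = do
  gen     <- getStdGen
  samples <- fromCsv "train.csv"                -- assumed: CSV path -> IO dataset
  let net0    = randNet 0.0 0.01 [4, 8, 3] gen  -- assumed argument order
      nets    = sgdUpdates 0.1 samples net0     -- infinite list of updated nets
      trained = nets !! 500                     -- keep the 500th update
  print (cost QuadCost samples trained)         -- assumed argument order
```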
## Dependencies
`DeepNN` depends on several modules from Haskell's standard library, but also on the [`Data.Matrix` module](https://hackage.haskell.org/package/matrix-0.3.6.1/docs/Data-Matrix.html) from the `matrix` package: to install it, type `cabal install matrix` in a shell.
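For reference, inputs to `output` can be built with `Data.Matrix` constructors such as `fromLists`. Encoding a sample as a column matrix is an assumption about the library's convention; check `example.hs` for the one it actually uses:

```haskell
import Data.Matrix (Matrix, fromLists)

-- A 3-dimensional input encoded as a 3x1 column matrix.
sample :: Matrix Double
sample = fromLists [[0.2], [0.7], [0.1]]
```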