https://github.com/yixuan/MiniDNN
A header-only C++ library for deep neural networks
- Host: GitHub
- URL: https://github.com/yixuan/MiniDNN
- Owner: yixuan
- Created: 2018-01-21T15:02:01.000Z (almost 8 years ago)
- Default Branch: master
- Last Pushed: 2021-04-16T09:25:56.000Z (over 4 years ago)
- Last Synced: 2025-10-13T04:29:49.660Z (about 1 month ago)
- Topics: deep-learning, header-only, machine-learning, neural-network, statistical-models
- Language: C++
- Size: 707 KB
- Stars: 425
- Watchers: 26
- Forks: 96
- Open Issues: 12
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- awesome-robotics-libraries - [github](https://github.com/yixuan/MiniDNN)
- awesome-hpp - MiniDNN - A header-only C++ library for deep neural networks. | [MPL-2.0](https://opensource.org/licenses/MPL-2.0) | (Data Mining, Machine Learning, and Deep Learning)
README
# MiniDNN
**MiniDNN** is a C++ library that implements a number of popular
deep neural network (DNN) models. It has a small codebase, yet it is fully functional
for constructing different types of feed-forward neural networks. **MiniDNN** is
built on top of [Eigen](http://eigen.tuxfamily.org).

**MiniDNN** is a header-only library implemented purely in C++98, whose only
dependency, **Eigen**, is also header-only. These features make it easy to embed
**MiniDNN** into larger projects with a broad range of compiler support.

This project was largely inspired by the [tiny-dnn](https://github.com/tiny-dnn/tiny-dnn/)
library, a header-only C++14 implementation of deep learning models. What makes
**MiniDNN** different is that it is based on the high-performance **Eigen** library
for numerical computing, and it has better compiler support.

**MiniDNN** is still quite **experimental**. Originally I wrote it with the aim of
studying deep learning and practicing model implementation, but I also find it useful in
my own statistical and machine learning research projects.

## Features
- Able to build feed-forward neural networks with a few lines of code
- Header-only, highly portable
- Fast on CPU
- Modularized and extensible
- Provides detailed documentation that is a resource for learning
- Helps in understanding how DNNs work
- A wonderful opportunity to learn and practice both the elegant and the messy parts of DNNs

## Quick Start
The self-explanatory code below is a minimal example of fitting a DNN model:
```cpp
#include <MiniDNN.h>

using namespace MiniDNN;

typedef Eigen::MatrixXd Matrix;
typedef Eigen::VectorXd Vector;

int main()
{
    // Set random seed and generate some data
    std::srand(123);
    // Predictors -- each column is an observation
    Matrix x = Matrix::Random(400, 100);
    // Response variables -- each column is an observation
    Matrix y = Matrix::Random(2, 100);

    // Construct a network object
    Network net;

    // Create three layers
    // Layer 1 -- convolutional, input size 20x20x1, 3 output channels, filter size 5x5
    Layer* layer1 = new Convolutional<ReLU>(20, 20, 1, 3, 5, 5);
    // Layer 2 -- max pooling, input size 16x16x3, pooling window size 3x3
    Layer* layer2 = new MaxPooling<ReLU>(16, 16, 3, 3, 3);
    // Layer 3 -- fully connected, input size 5x5x3, output size 2
    Layer* layer3 = new FullyConnected<Identity>(5 * 5 * 3, 2);

    // Add layers to the network object
    net.add_layer(layer1);
    net.add_layer(layer2);
    net.add_layer(layer3);

    // Set output layer
    net.set_output(new RegressionMSE());

    // Create optimizer object
    RMSProp opt;
    opt.m_lrate = 0.001;

    // (Optional) set callback function object
    VerboseCallback callback;
    net.set_callback(callback);

    // Initialize parameters with N(0, 0.01^2) using random seed 123
    net.init(0, 0.01, 123);

    // Fit the model with a batch size of 100, running 10 epochs with random seed 123
    net.fit(opt, x, y, 100, 10, 123);

    // Obtain prediction -- each column is an observation
    Matrix pred = net.predict(x);

    // Layer objects will be freed by the network object,
    // so do not manually delete them

    return 0;
}
```
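
Because the library is modular, switching to another task type mostly means choosing a different output layer (and, if desired, a different optimizer). The sketch below is a minimal, untested variant for multi-class classification; it assumes the `Softmax` activation, the `MultiClassEntropy` output layer, and one-hot targets with one column per observation, so please check the API reference before relying on it.

```cpp
#include <MiniDNN.h>
#include <cstdlib>

using namespace MiniDNN;

typedef Eigen::MatrixXd Matrix;

int main()
{
    // Hedged sketch: assumes Softmax activation and MultiClassEntropy output are available
    std::srand(123);

    // Predictors -- 10 features, 100 observations (one per column)
    Matrix x = Matrix::Random(10, 100);
    // Targets -- one-hot encoded labels for 3 classes (one column per observation)
    Matrix y = Matrix::Zero(3, 100);
    for (int i = 0; i < 100; i++)
        y(std::rand() % 3, i) = 1.0;

    Network net;
    // A small fully connected network: ReLU hidden layer, Softmax output activation
    net.add_layer(new FullyConnected<ReLU>(10, 16));
    net.add_layer(new FullyConnected<Softmax>(16, 3));
    // Multi-class cross-entropy loss for classification
    net.set_output(new MultiClassEntropy());

    RMSProp opt;
    opt.m_lrate = 0.001;

    // Initialize parameters with N(0, 0.01^2) using random seed 123
    net.init(0, 0.01, 123);
    // Batch size 32, 20 epochs, random seed 123
    net.fit(opt, x, y, 32, 20, 123);

    // Each column of the prediction holds class probabilities for one observation
    Matrix prob = net.predict(x);

    return 0;
}
```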
To compile and run this example, simply download the source code of **MiniDNN** and
[Eigen](http://eigen.tuxfamily.org), and let the compiler know about their paths.
For example:
```bash
g++ -O2 -I/path/to/eigen -I/path/to/MiniDNN/include example.cpp
```
## Documentation
The [API reference](https://yixuan.cos.name/MiniDNN/doc/) page contains the documentation
of **MiniDNN** generated by [Doxygen](http://www.doxygen.org/), including all the class APIs.
## License
**MiniDNN** is an open source project licensed under
[MPL2](https://www.mozilla.org/MPL/2.0/).