https://github.com/sangyx/mlkit
A header-only library that provides a sklearn-like API with GPU support.
c-plus-plus gpu-support header-only machine-learning sklearn
Last synced: about 1 year ago
- Host: GitHub
- URL: https://github.com/sangyx/mlkit
- Owner: sangyx
- License: mit
- Created: 2019-12-14T12:24:30.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2020-03-03T06:54:40.000Z (almost 6 years ago)
- Last Synced: 2025-01-05T03:42:15.830Z (about 1 year ago)
- Topics: c-plus-plus, gpu-support, header-only, machine-learning, sklearn
- Language: C++
- Homepage:
- Size: 134 KB
- Stars: 1
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# MLKIT
[Travis CI](https://travis-ci.com/sangyx/mlkit) · [Codecov](https://codecov.io/gh/sangyx/mlkit)
> A header-only library that provides a sklearn-like API with GPU support.
## Dependencies
* [ArrayFire](http://arrayfire.org/): a general purpose GPU library.
* [Googletest](https://github.com/google/googletest): Google Testing and Mocking Framework.
## Examples
```cpp
#include <cstdlib>
#include <iostream>

#include <arrayfire.h>

#include "mlkit.hpp"

using namespace std;
using namespace mk;

int main(int argc, char **argv)
{
    int device = argc > 1 ? atoi(argv[1]) : -1; // defaults to -1 (CPU)
    try {
        if (device >= 0)
            af::setBackend(AF_BACKEND_CUDA); // use GPU
        else
            af::setBackend(AF_BACKEND_CPU);  // use CPU
        af::info();

        // y = 1*x0 + 2*x1 + 3*x2 + 4 plus uniform noise
        af::array X = af::randn(100, 3);
        af::array y = 1 * X.col(0) + 2 * X.col(1) + 3 * X.col(2) + 4
                    + af::randu(100, 1) * 0.5;

        linear_model::LinearRegression lr(true);
        lr.fit(X, y);

        cout << endl
             << "[linear regression]" << endl
             << "-----------------------------------------------" << endl
             << "expect coef: [1, 2, 3], expect intercept: 4" << endl
             << "-----------------------------------------------" << endl
             << "fit result: " << endl;
        af_print(lr.coef_);
        af_print(lr.intercept_);
        cout << "-----------------------------------------------" << endl;
        lr.score(X, y);
    } catch (af::exception &ae) {
        cerr << ae.what() << endl;
    }
    return 0;
}
```
Compiling and running it produces:
```bash
# compiler command
g++ -std=c++11 -g example.cpp -o test -I/opt/arrayfire/include -Imlkit/include -laf -L/opt/arrayfire/lib
# output
ArrayFire v3.7.0 (CPU, 64-bit Linux, build c30d5455)
[0] Intel: Intel(R) Xeon(R) Silver 4114 CPU @ 2.20GHz, 95293 MB, Max threads(20) GNU Compiler Collection(GCC/G++) 7.4.0
[linear regression]
-----------------------------------------------
expect coef: [1, 2, 3], expect intercept: 4
-----------------------------------------------
fit result:
lr.coef_
[3 1 1 1]
Offset: 1
Strides: [1 4 4 4]
0.9999
1.9851
2.9896
lr.intercept_
[1 1 1 1]
Offset: 0
Strides: [1 4 4 4]
4.2475
-----------------------------------------------
Mean Square Error: 0.01791
```
## Algorithms
* Statistical Learning:
- [x] linear_model.LinearRegression
- [x] linear_model.LogisticRegression
- [x] neighbors.KNeighborsClassifier
- [x] cluster.KMeans
- [x] decomposition.PCA
- [x] tree.DecisionTreeClassifier
- [x] mixture.GaussianMixture
- [x] svm.LinearSVC
## Reference
* Li Hang. 统计学习方法 (Statistical Learning Methods) [M]. 2012.
* Harrington P. Machine Learning in Action[M]. 2012.