https://github.com/howion/activation-functions
JavaScript implementation of some activation functions.
- Host: GitHub
- URL: https://github.com/howion/activation-functions
- Owner: howion
- License: mit
- Created: 2018-09-15T17:44:02.000Z (over 7 years ago)
- Default Branch: master
- Last Pushed: 2020-11-30T00:52:48.000Z (about 5 years ago)
- Last Synced: 2025-02-02T21:25:41.634Z (11 months ago)
- Topics: activation-function, deep-learning, javascript, machine-learning, neural-networks
- Language: JavaScript
- Homepage: none
- Size: 5.86 KB
- Stars: 13
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# activation-functions


[Issues](https://github.com/howion/activation-functions/issues)
[License](https://github.com/howion/activation-functions/blob/master/LICENSE)
## Installation
From [NPM](https://www.npmjs.com/package/activation-functions)
```bash
$ npm install activation-functions
```
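Once installed, the functions can be called straight off the module's export. A minimal usage sketch (the binding name `Activation` is illustrative, not mandated by the package):

```js
// Binding name is illustrative; the package exposes the functions listed below.
const Activation = require('activation-functions');

console.log(Activation.Sigmoid(0)); // 0.5
console.log(Activation.ReLU(-2));   // 0
console.log(Activation.Tanh(1));    // ~0.7616
```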
## Available Functions
```js
// x is always considered to be in radians
.Identity(x)
.Inverse(x)
.BinaryStep(x)
.Bipolar(x)
.Logistic(x) | .Sigmoid(x) | .SoftStep(x)
.BipolarSigmoid(x)
.Tanh(x)
.HardTanh(x)
.ArcTan(x)
.ElliotSig(x) | .SoftSign(x)
.Erf(x)
.Sinc(x)
.Sinusoid(x)
.Gaussian(x)
.ISRU(x, a)
.ReLU(x)
.GELU(x)
.PReLU(x, a)
.ELU(x, a)
.SELU(x)
.SoftPlus(x)
.Mish(x)
.SQNL(x)
.BentIdentity(x)
.SiLU(x) | .Swish1(x)
```
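Most of these have simple closed forms. For orientation, here are a few standard textbook definitions in plain JavaScript (a reference sketch, not the package's source):

```js
// Standard mathematical definitions, not the package's implementation.
const Sigmoid  = x => 1 / (1 + Math.exp(-x));                    // 1 / (1 + e^-x)
const ReLU     = x => Math.max(0, x);                            // max(0, x)
const ELU      = (x, a) => (x >= 0 ? x : a * (Math.exp(x) - 1)); // parameterized by a
const SoftPlus = x => Math.log(1 + Math.exp(x));                 // ln(1 + e^x)
const Mish     = x => x * Math.tanh(SoftPlus(x));                // x * tanh(softplus(x))
```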
Mish: [Official Repository](https://github.com/digantamisra98/Mish)
## License
[**MIT**](https://github.com/howion/activation-functions/blob/master/LICENSE)