# hyper-sinh in TensorFlow, Keras, and PyTorch
## An Accurate and Reliable Activation Function from Shallow to Deep Learning

**hyper-sinh** is a custom Python activation function for both shallow and deep neural networks in TensorFlow, Keras, and PyTorch, designed for Machine Learning- and Deep Learning-based classification. It is distributed under the [CC BY 4.0 license](http://creativecommons.org/licenses/by/4.0/).

Details on this function, its implementation, and its validation against gold-standard activation functions for both shallow and deep neural networks are available in the following paper: **[Parisi et al., 2021a](https://www.sciencedirect.com/science/article/pii/S2666827021000566)**.
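
As a rough, framework-agnostic illustration only (not the repository's implementation), and assuming the piecewise form sinh(x) / 3 for positive inputs and x**3 / 4 otherwise (refer to the paper above for the authoritative definition), a NumPy sketch of the activation would be:

```python
import numpy as np


def hyper_sinh_reference(x: np.ndarray) -> np.ndarray:
    # Assumed piecewise form: sinh(x) / 3 for x > 0, and x**3 / 4 for x <= 0;
    # see Parisi et al. (2021a) for the authoritative definition.
    return np.where(x > 0, np.sinh(x) / 3.0, np.power(x, 3) / 4.0)


print(hyper_sinh_reference(np.array([-2.0, 0.0, 2.0])))  # ≈ [-2.0, 0.0, 1.209]
```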

### Dependencies

The dependencies are included in the `environment.yml` file.
Run the following command to install the required version of Python (v3.9.16) and all dependencies in a conda virtual
environment (replace `<env_name>` with your environment name):

- `conda env create --name <env_name> --file environment.yml`
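
Once the environment has been created, activate it before running any of the code below:

- `conda activate <env_name>`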

### Usage

You can use the custom `HyperSinh` activation function in Keras or PyTorch as a layer:

#### Example of usage in a sequential model in Keras with a `HyperSinh` layer between a convolutional layer and a pooling layer

Either

```python
from tensorflow.keras import layers, models

# HyperSinh is the custom activation layer provided by this repository
model = models.Sequential()
model.add(layers.Conv2D(32, (3, 3), input_shape=(32, 32, 3)))
model.add(HyperSinh())  # hyper-sinh activation between convolution and pooling
model.add(layers.MaxPooling2D((2, 2)))
```

or

```python
from tensorflow import keras
from tensorflow.keras import layers

# HyperSinh is the custom activation layer provided by this repository
model = keras.Sequential(
    [
        keras.Input(shape=(32, 32, 3)),
        layers.Conv2D(32, kernel_size=(3, 3)),
        HyperSinh(),  # hyper-sinh activation between convolution and pooling
        layers.MaxPooling2D(pool_size=(2, 2)),
    ]
)
```
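
In either case, the resulting model can be sanity-checked on a random batch before being extended into a full network (the batch size and this check itself are illustrative, not part of the repository):

```python
import numpy as np

# The HyperSinh layer behaves like any other Keras layer, so either model
# above can be called directly on a batch of inputs
dummy_batch = np.random.rand(4, 32, 32, 3).astype("float32")
features = model(dummy_batch)
print(features.shape)  # (4, 15, 15, 32): 3x3 valid convolution, then 2x2 max-pooling
```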

#### Example of usage in a sequential model in PyTorch with a `HyperSinh` layer between a convolutional layer and a pooling layer

```python
# Inside the __init__ method of an nn.Module subclass; the upper-case
# constants are user-defined hyperparameters
self.conv1 = nn.Conv2d(1, OUT_CHANNEL_CONV1, kernel_size=KERNEL_SIZE_CONV)
self.hyper_sinh1 = HyperSinh()  # hyper-sinh activation between convolution and pooling
self.pool1 = nn.MaxPool2d(kernel_size=KERNEL_SIZE_MAX_POOL)
```
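
For context, a minimal, self-contained sketch of where these lines fit is shown below. The constant values, the single input channel, and the `HyperSinh` stand-in (using the piecewise form assumed from the paper) are illustrative assumptions; in practice, use the `HyperSinh` implementation provided by this repository:

```python
import torch
from torch import nn

# Illustrative hyperparameters (assumptions, not values from the repository)
OUT_CHANNEL_CONV1 = 16
KERNEL_SIZE_CONV = 3
KERNEL_SIZE_MAX_POOL = 2


class HyperSinh(nn.Module):
    """Stand-in for the repository's PyTorch HyperSinh layer, assuming the
    piecewise form sinh(x) / 3 for x > 0, and x**3 / 4 otherwise."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.where(x > 0, torch.sinh(x) / 3, x.pow(3) / 4)


class SmallCnn(nn.Module):
    """Minimal network showing where the three layers above fit."""

    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, OUT_CHANNEL_CONV1, kernel_size=KERNEL_SIZE_CONV)
        self.hyper_sinh1 = HyperSinh()
        self.pool1 = nn.MaxPool2d(kernel_size=KERNEL_SIZE_MAX_POOL)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.conv1(x)
        x = self.hyper_sinh1(x)
        return self.pool1(x)


features = SmallCnn()(torch.randn(4, 1, 28, 28))  # e.g. a batch of 28x28 greyscale images
print(features.shape)  # torch.Size([4, 16, 13, 13])
```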

### Linting
`isort` is used to ensure a consistent order of imports, whilst `autopep8` ensures that the code adheres to PEP 8,
via the following two commands respectively:

- `isort .`
- `autopep8 --in-place --recursive .`

### Citation request

If you use this function, please cite the following papers:
* **[Parisi et al., 2020](https://arxiv.org/abs/2011.07661)**.
* **[Parisi et al., 2021a](https://www.sciencedirect.com/science/article/pii/S2666827021000566)**.
* **[Parisi et al., 2021b](https://www.wseas.org/multimedia/journals/computerresearch/2021/a025118-001(2021).pdf)**.