https://github.com/DanielAvdar/activations-plus
- Host: GitHub
- URL: https://github.com/DanielAvdar/activations-plus
- Owner: DanielAvdar
- License: MIT
- Created: 2025-04-13T13:55:30.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-04-20T12:00:52.000Z (6 months ago)
- Last Synced: 2025-04-22T19:47:58.234Z (6 months ago)
- Language: Python
- Size: 5.21 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Authors: AUTHORS.md
Awesome Lists containing this project
- awesome-opensource-israel - activations-plus - Collection of advanced activation functions for deep learning. (Projects by main language / python)
README
# Activations Plus
Activations Plus is a Python package designed to provide a collection of advanced activation functions for machine learning and deep learning models. These activation functions are implemented to enhance the performance of neural networks by addressing specific challenges such as sparsity, non-linearity, and gradient flow.
[PyPI](https://pypi.org/project/activations-plus/) · [License: MIT](https://opensource.org/licenses/MIT) · [CI](https://github.com/DanielAvdar/activations-plus/actions/workflows/ci.yml) · [Code Checks](https://github.com/DanielAvdar/activations-plus/actions/workflows/code-checks.yml) · [Codecov](https://codecov.io/gh/DanielAvdar/activations-plus) · [Ruff](https://github.com/astral-sh/ruff)
## Features
- **Entmax**: Sparse activation function for probabilistic models.
- **Sparsemax**: Sparse alternative to softmax (see the sketch after this list).
- **Bent Identity**: Smooth approximation of the identity function. *(Experimental; requires review)*
- **ELiSH (Exponential Linear Squared Hyperbolic)**: Combines exponential and linear properties. *(Experimental; requires review)*
- **Maxout**: Learns piecewise linear functions. *(Experimental; requires review)*
- **Soft Clipping**: Smoothly clips values to a range. *(Experimental; requires review)*
- **SReLU (S-shaped Rectified Linear Unit)**: Combines linear and non-linear properties. *(Experimental; requires review)*
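
For intuition about what sparsity means here, the following is a minimal sketch of sparsemax written directly from its definition in Martins & Astudillo (2016): a Euclidean projection onto the probability simplex. It is an illustration only, not the package's own implementation, and the `sparsemax` function below is local to this example:

```python
import torch

def sparsemax(z: torch.Tensor) -> torch.Tensor:
    # Sparsemax over the last dimension: Euclidean projection of z
    # onto the probability simplex (Martins & Astudillo, 2016).
    z_sorted, _ = torch.sort(z, dim=-1, descending=True)
    k = torch.arange(1, z.size(-1) + 1, device=z.device, dtype=z.dtype)
    z_cumsum = z_sorted.cumsum(dim=-1)
    # Support set: coordinates where 1 + k * z_sorted(k) > cumsum(k).
    support = 1 + k * z_sorted > z_cumsum
    k_z = support.sum(dim=-1, keepdim=True)  # support size per row
    tau = (z_cumsum.gather(-1, k_z - 1) - 1) / k_z.to(z.dtype)
    return torch.clamp(z - tau, min=0.0)
```

For `torch.tensor([1.0, 2.0, 3.0])` this projection returns `tensor([0., 0., 1.])`: unlike softmax, sparsemax can assign exactly zero probability to low-scoring entries.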
## Installation

To install the package, use pip:
```bash
pip install activations-plus
```

## Usage
Import and use any activation function in your PyTorch models:
```python
import torch
from activations_plus.sparsemax import Sparsemax
from activations_plus.entmax import Entmax

# Example with Sparsemax
sparsemax = Sparsemax()
x = torch.tensor([[1.0, 2.0, 3.0], [1.0, 2.0, -1.0]])
output_sparsemax = sparsemax(x)
print("Sparsemax Output:", output_sparsemax)# Example with Entmax
entmax = Entmax(alpha=1.5)
output_entmax = entmax(x)
print("Entmax Output:", output_entmax)
```

These examples demonstrate how to use the Sparsemax and Entmax activation functions in PyTorch models.
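Assuming the implementations follow their reference papers, both rows of the sparsemax output here are fully sparse: by the closed-form projection, `[1.0, 2.0, 3.0]` maps to `[0., 0., 1.]` and `[1.0, 2.0, -1.0]` to `[0., 1., 0.]`. Entmax with `alpha=1.5` interpolates between softmax (alpha = 1) and sparsemax (alpha = 2), so its outputs are sparse but less peaked.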
## Documentation
Comprehensive documentation is available at [activations-plus.readthedocs.io](https://activations-plus.readthedocs.io/en/latest/).
## Supported Activation Functions
1. **Entmax**: Sparse activation function for probabilistic models. [Reference Paper](https://arxiv.org/abs/1905.05702)
2. **Sparsemax**: Sparse alternative to softmax for probabilistic outputs. [Reference Paper](https://arxiv.org/abs/1602.02068)
3. **Bent Identity**: A smooth approximation of the identity function (see the sketch after this list). *(Experimental; requires review)* (reference missing)
4. **ELiSH**: Combines exponential and linear properties for better gradient flow (see the sketch after this list). *(Experimental; requires review)* [Reference Paper](https://arxiv.org/abs/1808.00783)
5. **Maxout**: Learns piecewise linear functions for better expressiveness. *(Experimental; requires review)* [Reference Paper](https://arxiv.org/abs/1302.4389)
6. **Soft Clipping**: Smoothly clips values to a range to avoid extreme outputs. *(Experimental; requires review)* [Reference Paper](https://arxiv.org/abs/2406.16640)
7. **SReLU**: Combines linear and non-linear properties for better flexibility. *(Experimental; requires review)* [Reference Paper](https://arxiv.org/abs/1512.07030)
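
As a rough guide to what two of the experimental functions compute, here is a minimal sketch of Bent Identity and ELiSH as plain PyTorch functions, written directly from the formulas in the literature; these are illustrations under that assumption, not the package's own implementations:

```python
import torch

def bent_identity(x: torch.Tensor) -> torch.Tensor:
    # Bent identity: f(x) = (sqrt(x^2 + 1) - 1) / 2 + x
    return (torch.sqrt(x * x + 1) - 1) / 2 + x

def elish(x: torch.Tensor) -> torch.Tensor:
    # ELiSH (Basirat & Roth, 2018):
    #   x * sigmoid(x)            for x >= 0
    #   (exp(x) - 1) * sigmoid(x) for x <  0
    return torch.where(x >= 0,
                       x * torch.sigmoid(x),
                       (torch.exp(x) - 1) * torch.sigmoid(x))
```

Both are smooth everywhere, which is the design point: bent identity keeps a near-identity slope for gradient flow, while ELiSH behaves like SiLU/Swish for positive inputs and saturates gently for negative ones.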
## Contributing

Contributions are welcome! Please read the [CONTRIBUTING.md](CONTRIBUTING.md) file for guidelines.
## Testing
To run the tests, use the following command:
```bash
pytest tests/
```

## License
This project is licensed under the MIT License. See the [LICENSE](LICENSE) file for details.
## Acknowledgments
Special thanks to the contributors and the open-source community for their support.