https://github.com/jameschapman19/proxtorch
An efficient GPU-compatible library built on PyTorch, offering a wide range of proximal operators and constraints for optimization and machine learning tasks.
- Host: GitHub
- URL: https://github.com/jameschapman19/proxtorch
- Owner: jameschapman19
- License: mit
- Created: 2023-08-10T11:34:50.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-10-09T12:45:11.000Z (over 1 year ago)
- Last Synced: 2025-04-08T10:22:53.529Z (about 1 month ago)
- Topics: proximal-gradient-descent, proximal-operator, proximal-operators, proximal-regularization
- Language: Python
- Homepage: https://proxtorch.readthedocs.io/en/latest/
- Size: 429 KB
- Stars: 3
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
- Citation: CITATION.cff
# ProxTorch
**Unleashing Proximal Gradient Descent on PyTorch**
[DOI](https://doi.org/10.5281/zenodo.4382739)
[codecov](https://codecov.io/gh/jameschapman19/ProxTorch)
[PyPI](https://pypi.org/project/ProxTorch/)

## **What is ProxTorch?**
Dive into a rich realm of proximal operators and constraints with `ProxTorch`, a state-of-the-art Python library built on PyTorch. Whether you're tackling optimization problems or machine learning pipelines, `ProxTorch` is designed for speed, efficiency, and seamless GPU integration.

## **Features**
- **GPU-Boosted**: Lightning-fast computations with extensive CUDA support.
- **PyTorch Synergy**: Integrates naturally with your existing PyTorch code.
- **Expansive Library**: From elemental norms (`L0`, `L1`, `L2`, `L∞`) to advanced regularizations like Total Variation and Fused Lasso.
- **User-Friendly**: Jump right in! Intuitive design means minimal disruption to your existing projects.

## **Installation**
Getting started with `ProxTorch` is a breeze. Install from PyPI with:
```bash
pip install proxtorch
```

Or install from source with:
```bash
git clone https://github.com/jameschapman19/proxtorch.git
cd ProxTorch
pip install -e .
```

## **Quick Start**
Dive in with this straightforward example:
```python
import torch
from proxtorch.operators import L1

# Define a sample tensor
x = torch.tensor([0.5, -1.2, 0.3, -0.4, 0.7])

# Initialize the L1 proximal operator
l1_prox = L1(sigma=0.1)

# Compute the regularization component value
reg_value = l1_prox(x)
print("Regularization Value:", reg_value)

# Apply the proximal operator
result = l1_prox.prox(x)
print("Prox Result:", result)
```

## **Diverse Proximal Operators**
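A `prox` call like the one in the Quick Start is exactly the building block that proximal gradient descent (ISTA) iterates: a gradient step on the smooth loss, then a prox step on the penalty. The sketch below is self-contained plain PyTorch, not a ProxTorch API: it uses an explicit soft-thresholding function, which is the same mathematical map an L1 prox with strength `sigma` computes.

```python
import torch

def soft_threshold(x: torch.Tensor, t: float) -> torch.Tensor:
    # Proximal operator of t * ||x||_1 (soft-thresholding): shrink each
    # entry toward zero by t, clipping small entries to exactly zero.
    return torch.sign(x) * torch.clamp(x.abs() - t, min=0.0)

torch.manual_seed(0)
A = torch.randn(20, 10)
x_true = torch.zeros(10)
x_true[:3] = torch.tensor([1.0, -2.0, 0.5])   # sparse ground truth
b = A @ x_true

# ISTA: gradient step on 0.5 * ||A w - b||^2, then prox step on sigma * ||w||_1.
sigma = 0.1
lr = 1.0 / (torch.linalg.matrix_norm(A, 2) ** 2).item()  # step size 1/L
w = torch.zeros(10)
for _ in range(500):
    grad = A.T @ (A @ w - b)
    w = soft_threshold(w - lr * grad, lr * sigma)
```

Because the prox clips small entries to exactly zero, the iterates stay genuinely sparse rather than merely small, which is the practical appeal of proximal methods over plain gradient descent on a smoothed penalty.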
### **Regularizers**
- **L1**, **L2 (Ridge)**, **ElasticNet**, **GroupLasso**, **TV** (includes TV_2D, TV_3D, TVL1_2D, TVL1_3D), **Frobenius**
- **Norms**: TraceNorm, NuclearNorm
- **FusedLasso**, **Huber**

### **Constraints**
- **L0Ball**, **L1Ball**, **L2Ball**, **L∞Ball (Infinity Norm)**, **Frobenius**, **TraceNorm**, **Box**
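Constraints of this kind act by Euclidean projection onto the feasible set. As a self-contained sketch (not the ProxTorch API), here is the classic sort-and-threshold projection onto the L1 ball, the operation an `L1Ball`-style constraint performs:

```python
import torch

def project_l1_ball(x: torch.Tensor, radius: float = 1.0) -> torch.Tensor:
    # Euclidean projection onto {v : ||v||_1 <= radius}
    # via the standard sort-based algorithm.
    if x.abs().sum() <= radius:
        return x.clone()                      # already feasible
    u, _ = torch.sort(x.abs(), descending=True)
    css = torch.cumsum(u, dim=0)              # cumulative sums of sorted |x|
    k = torch.arange(1, x.numel() + 1)
    rho = (u * k > css - radius).nonzero().max()
    theta = (css[rho] - radius) / (rho + 1)   # uniform shrinkage amount
    return torch.sign(x) * torch.clamp(x.abs() - theta, min=0.0)
```

For example, projecting `torch.tensor([0.5, -1.2, 0.3, -0.4, 0.7])` onto the unit L1 ball returns a vector whose absolute values sum to exactly 1, with the smallest entries zeroed out and signs preserved.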
## **Documentation**
Explore the comprehensive documentation on [Read the Docs](https://proxtorch.readthedocs.io/en/latest/).
## **Credits**
`ProxTorch` stands on the shoulders of giants:
- [pyproximal](https://github.com/PyLops/pyproximal)
- [ProxGradPyTorch](https://github.com/KentonMurray/ProxGradPytorch)
- [nilearn](https://github.com/nilearn/nilearn/blob/321494420f95c7a5e2172108400194b37a02e628/nilearn/decoding/proximal_operators.py)

We're thrilled to introduce `ProxTorch` as an exciting addition to the PyTorch ecosystem. We're confident you'll love it!

## **Contribute to the ProxTorch Revolution**
Got ideas? Join our vibrant community and make `ProxTorch` even better!
## **License**
`ProxTorch` is proudly released under the [MIT License](LICENSE).