![GitHub Actions Workflow Status](https://img.shields.io/github/actions/workflow/status/JamesYang007/adelie/test_docs.yml)
[![Downloads](https://static.pepy.tech/badge/adelie)](https://pepy.tech/project/adelie)
[![Downloads](https://static.pepy.tech/badge/adelie/month)](https://pepy.tech/project/adelie)
![versions](https://img.shields.io/pypi/pyversions/adelie.svg)
![PyPI - Version](https://img.shields.io/pypi/v/adelie)
![GitHub Release](https://img.shields.io/github/v/release/JamesYang007/adelie)

Adelie is a fast and flexible Python package for solving
lasso, elastic net, group lasso, and group elastic net problems.

- **Installation**: [https://jamesyang007.github.io/adelie/notebooks/installation.html](https://jamesyang007.github.io/adelie/notebooks/installation.html)
- **Documentation**: [https://jamesyang007.github.io/adelie](https://jamesyang007.github.io/adelie/)
- **Source code**: [https://github.com/JamesYang007/adelie](https://github.com/JamesYang007/adelie)
- **Issue Tracker**: [https://github.com/JamesYang007/adelie/issues](https://github.com/JamesYang007/adelie/issues)
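As a concrete reference for what these problems have in common, the group elastic net objective can be written out in a few lines of NumPy. This is a sketch of the problem statement only, not adelie's solver code; the grouping, weights, and parameter names here are illustrative:

```python
import numpy as np

def group_elastic_net_objective(X, y, beta, groups, lmda, alpha, weights):
    """Compute 1/2 ||y - X beta||^2
    + lmda * sum_g w_g * (alpha * ||beta_g||_2 + (1 - alpha)/2 * ||beta_g||_2^2).

    With singleton groups and alpha = 1 this reduces to the lasso;
    with alpha = 1 and larger groups, to the group lasso.
    """
    resid = y - X @ beta
    loss = 0.5 * resid @ resid
    penalty = 0.0
    for g, w in zip(groups, weights):
        norm = np.linalg.norm(beta[g])
        penalty += w * (alpha * norm + 0.5 * (1.0 - alpha) * norm ** 2)
    return loss + lmda * penalty

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 6))
y = rng.standard_normal(20)
beta = rng.standard_normal(6)
groups = [np.arange(0, 3), np.arange(3, 6)]  # two groups of three features
obj = group_elastic_net_objective(X, y, beta, groups,
                                  lmda=0.1, alpha=0.5, weights=[1.0, 1.0])
```

Setting `alpha=1` recovers the pure group lasso penalty, and `alpha=0` a ridge-like group penalty, which is why a single solver covers all four problem classes listed above.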

It offers a general-purpose group elastic net solver,
a wide range of matrix classes that exploit special structure to handle large-scale inputs,
and an assortment of generalized linear model (GLM) classes for fitting various types of data.
Users can extend the matrix and GLM classes for added flexibility.
Many inner routines, such as matrix-vector products
and the gradient, Hessian, and loss computations for GLMs, are heavily optimized and parallelized.
Algorithmic optimizations, such as the pivot rule for screening variables
and the proximal Newton method, have been carefully tuned for convergence and numerical stability.
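A basic building block of group-penalized solvers of this kind is the group-wise proximal operator of the ℓ2 penalty, i.e. block soft-thresholding: a whole group of coefficients is either shrunk toward zero or zeroed out together. A minimal NumPy sketch of that operator (illustrative only, not adelie's tuned implementation):

```python
import numpy as np

def block_soft_threshold(v, thresh):
    """Proximal operator of thresh * ||.||_2: shrink the whole block v
    toward zero, and set it exactly to zero when ||v||_2 <= thresh."""
    norm = np.linalg.norm(v)
    if norm <= thresh:
        return np.zeros_like(v)
    return (1.0 - thresh / norm) * v

v = np.array([3.0, 4.0])              # ||v||_2 = 5
print(block_soft_threshold(v, 10.0))  # block fully shrunk to zero
print(block_soft_threshold(v, 2.5))   # scaled by 1 - 2.5/5 = 0.5
```

The all-or-nothing behavior of this operator is what produces group-level sparsity, and it is also why screening rules like the pivot rule can safely discard entire groups before solving.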