https://github.com/khushi130404/regulexa
Regulexa is a Python project that showcases and compares Ridge, Lasso, and Elastic-Net regularization techniques in machine learning. It includes visualizations and performance insights to help prevent overfitting and improve model generalization.
- Host: GitHub
- URL: https://github.com/khushi130404/regulexa
- Owner: Khushi130404
- License: MIT
- Created: 2024-12-26T08:56:50.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-12-29T03:16:06.000Z (6 months ago)
- Last Synced: 2025-01-12T04:55:01.202Z (5 months ago)
- Topics: elastic-net-regression, lasso-regression, numpy, ridge-regression
- Language: Jupyter Notebook
- Homepage:
- Size: 877 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# Regulexa
Regulexa is a Python-based project designed to demonstrate and compare different regularization techniques used in machine learning: Ridge, Lasso, and Elastic-Net. Regularization helps to prevent overfitting and improves the generalization of models by adding a penalty term to the loss function. This project includes visualizations and performance comparisons for these techniques, making it a valuable resource for data science enthusiasts and machine learning practitioners.
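To make the idea of "adding a penalty term to the loss function" concrete, here is a minimal NumPy sketch (not taken from the repository) that computes an ordinary MSE loss and then the same loss with an L2 (Ridge-style) penalty added:

```python
import numpy as np

# Illustrative sketch: the data, weights, and alpha below are made up
# for demonstration and are not from the Regulexa notebooks.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=50)

w = np.array([1.0, -1.0, 0.0])  # some candidate weights
alpha = 0.1                     # regularization strength

mse = np.mean((y - X @ w) ** 2)             # ordinary loss
ridge_loss = mse + alpha * np.sum(w ** 2)   # loss + alpha * ||w||_2^2

# The penalty term is alpha * (1^2 + (-1)^2 + 0^2) = 0.1 * 2 = 0.2
print(mse, ridge_loss)
```

Minimizing the penalized loss instead of the plain MSE is what discourages large coefficients and reduces overfitting.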
## Regularization Techniques
1. Ridge Regression
- Ridge regression adds a penalty equal to the square of the magnitude of coefficients. It helps to reduce model complexity and multicollinearity.
- Penalty: α * ||w||₂² (L2 norm)
- Shrinks coefficients towards zero but never sets them exactly to zero.

2. Lasso Regression
- Lasso regression adds a penalty equal to the absolute value of the coefficients. It performs both variable selection and regularization.
- Penalty: α * ||w||₁ (L1 norm)
- Can shrink some coefficients to exactly zero, effectively performing feature selection.

3. Elastic-Net Regression
- Elastic-Net combines both Ridge and Lasso penalties.
- Penalty: α * [(1 - λ) ||w||₂² + λ ||w||₁], where λ ∈ [0, 1] controls the mix between the L2 and L1 penalties
- Suitable for datasets with correlated features and when feature selection is required.

## Features
- Synthetic Data: Generates synthetic datasets for demonstration purposes.
- Visualizations: Plots showing how regularization affects coefficients and model performance.
- Comparisons: Side-by-side comparison of Ridge, Lasso, and Elastic-Net.
- Metrics: Evaluation using metrics like Mean Squared Error (MSE) and R².
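A side-by-side comparison along these lines can be sketched with scikit-learn; this is an illustrative example with assumed hyperparameters (alpha values, l1_ratio, dataset shape), not the repository's actual notebook code:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic regression data, as the project generates for demonstration.
X, y = make_regression(n_samples=200, n_features=20, n_informative=5,
                       noise=10.0, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# The alpha/l1_ratio values here are placeholders, not tuned choices.
models = {
    "Ridge": Ridge(alpha=1.0),
    "Lasso": Lasso(alpha=1.0),
    "Elastic-Net": ElasticNet(alpha=1.0, l1_ratio=0.5),
}

results = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    results[name] = (mean_squared_error(y_test, pred),
                     r2_score(y_test, pred))
    print(f"{name}: MSE={results[name][0]:.2f}, R2={results[name][1]:.2f}")
```

Note that scikit-learn's `ElasticNet` expresses the penalty mix through `l1_ratio`, which plays the role of λ in the formula above.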
## Tech Used
- Jupyter Notebook: For interactive coding and visualizations.
- scikit-learn (sklearn): For implementing regularization techniques and machine learning models.
- Matplotlib: For creating visualizations.
- NumPy: For numerical computations.
- Pandas: For data manipulation and analysis.
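As a quick illustration of the coefficient behavior the project's plots visualize, the sketch below (assumed parameters, not repository code) shows that Lasso drives some coefficients exactly to zero while Ridge only shrinks them:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso

# Synthetic data where only 3 of 10 features carry signal.
X, y = make_regression(n_samples=100, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

# Deliberately strong regularization to make the contrast visible;
# these alpha values are illustrative, not tuned.
ridge = Ridge(alpha=10.0).fit(X, y)
lasso = Lasso(alpha=50.0).fit(X, y)

print("Ridge coefficients set exactly to zero:", int(np.sum(ridge.coef_ == 0)))
print("Lasso coefficients set exactly to zero:", int(np.sum(lasso.coef_ == 0)))
```

The L1 penalty's ability to zero out coefficients is what makes Lasso (and Elastic-Net) useful for feature selection, as described above.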