Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/xgagandeep/linear-regression-from-scratch-with-gradient-descent
This project demonstrates the implementation of linear regression using gradient descent.
- Host: GitHub
- URL: https://github.com/xgagandeep/linear-regression-from-scratch-with-gradient-descent
- Owner: xgagandeep
- Created: 2024-09-11T16:51:42.000Z (2 months ago)
- Default Branch: main
- Last Pushed: 2024-09-11T17:08:46.000Z (2 months ago)
- Last Synced: 2024-09-12T02:30:32.146Z (2 months ago)
- Language: Jupyter Notebook
- Homepage:
- Size: 238 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
Awesome Lists containing this project
README
# Project: Linear Regression with Gradient Descent
**Date:** 2020
**Language:** Python
**Libraries:** NumPy, Pandas, Matplotlib
**Type:** Linear Regression Implementation

## Description
This project demonstrates the implementation of linear regression using gradient descent. The goal is to fit a linear model to the training data and make predictions on test data. The code covers data normalization, hypothesis computation, error calculation, gradient computation, and visualization of the results.
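
For reference, a common formulation of these steps for a single feature is sketched below; the exact scaling constants used in the notebook may differ:

$$
h_\theta(x) = \theta_0 + \theta_1 x,
\qquad
J(\theta) = \frac{1}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2
$$

$$
\theta_j \leftarrow \theta_j - \alpha\,\frac{\partial J(\theta)}{\partial \theta_j}
$$

where $m$ is the number of training examples and $\alpha$ is the learning rate.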
## Features
- **Data Loading:** Reads training and test datasets from CSV files.
- **Data Normalization:** Normalizes feature values to standardize their range (see the sketch after this list).
- **Linear Regression:** Implements hypothesis function, error function, and gradient descent for model training.
- **Visualization:** Plots data points, regression lines, and error convergence.
- **Prediction:** Makes predictions on test data and saves the results to a CSV file.
- **Interactive Visualization:** Displays interactive plots of the regression process.
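
As a rough illustration of the normalization step, here is a minimal sketch using z-score standardization; the variable names and the exact scheme are assumptions, not taken from the notebook:

```python
import pandas as pd

# Load the training features (file name taken from the Files list below).
X = pd.read_csv("Linear_X_Train.csv").values

# Z-score standardization: zero mean, unit variance per feature.
# The notebook may use a different scheme (e.g. min-max scaling).
X = (X - X.mean(axis=0)) / X.std(axis=0)
```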

## Files
- `Linear_X_Train.csv`: Training features dataset.
- `Linear_Y_Train.csv`: Training target values dataset.
- `Linear_X_Test.csv`: Test features dataset.
- `output.csv`: Output file with predictions for the test dataset.

## Installation
To run this project, you need Python and the required libraries installed. Follow these steps:
1. **Clone the repository:**
```bash
git clone https://github.com/xgagandeep/Linear-Regression-from-scratch-with-gradient-descent.git
```

2. **Navigate to the project directory:**
```bash
cd Linear-Regression-from-scratch-with-gradient-descent
```

3. **Install the required libraries:**
```bash
pip install numpy pandas matplotlib
```

4. **Prepare Data Files:**
Make sure to place `Linear_X_Train.csv`, `Linear_Y_Train.csv`, and `Linear_X_Test.csv` in the appropriate directories as specified in the script.
5. **Run the Script:**
```bash
jupyter notebook "Linear Regression Gradient Descent.ipynb"
```

## Usage
1. **Load Data:** The script loads training and test data from CSV files.
2. **Normalize Data:** The feature values are normalized to improve model performance.
3. **Train Model:** Linear regression parameters are learned using gradient descent.
4. **Visualize Results:** The script generates plots of the training data, regression line, and error convergence.
5. **Make Predictions:** Predictions are made on the test data and saved to `output.csv`.
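
A minimal sketch of the prediction step, assuming a single feature, a learned parameter pair `theta`, and the training mean and standard deviation for scaling; the function name and the output column name `y` are assumptions:

```python
import pandas as pd

def predict_and_save(theta, x_mean, x_std,
                     test_path="Linear_X_Test.csv", out_path="output.csv"):
    """Predict on the test set and write the results to a CSV file.

    `theta`, `x_mean`, and `x_std` are assumed to come from the training
    step; the output column name "y" is also an assumption.
    """
    x_test = pd.read_csv(test_path).values.flatten()
    x_test = (x_test - x_mean) / x_std          # same scaling as training
    preds = theta[0] + theta[1] * x_test        # linear hypothesis
    pd.DataFrame({"y": preds}).to_csv(out_path, index=False)
    return preds
```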

## Functions
- `hypothesis(x, theta)`: Computes the hypothesis function for given inputs.
- `error(X, Y, theta)`: Calculates the mean squared error for the given parameters.
- `gradient(X, Y, theta)`: Computes the gradient of the error function.
- `gradientDescent(X, Y, max_iteration, learning_rate)`: Performs gradient descent to optimize the parameters.
- `r2score()`: Calculates the R^2 score to evaluate model performance.
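
A minimal sketch of how these functions might look for a single feature. The signatures follow the list above, but the internals are assumptions based on the descriptions, not the notebook's exact code (in particular, `r2score` is given explicit arguments here to keep the sketch self-contained):

```python
import numpy as np

def hypothesis(x, theta):
    # Linear model for a single feature: theta[0] + theta[1] * x.
    return theta[0] + theta[1] * x

def error(X, Y, theta):
    # Mean squared error over the training set.
    return np.mean((hypothesis(X, theta) - Y) ** 2)

def gradient(X, Y, theta):
    # Gradient of the MSE with respect to theta[0] and theta[1].
    diff = hypothesis(X, theta) - Y
    return np.array([2 * np.mean(diff), 2 * np.mean(diff * X)])

def gradientDescent(X, Y, max_iteration=100, learning_rate=0.1):
    # Iteratively step along the negative gradient, tracking the error.
    theta = np.zeros(2)
    error_list = []
    for _ in range(max_iteration):
        theta = theta - learning_rate * gradient(X, Y, theta)
        error_list.append(error(X, Y, theta))
    return theta, error_list

def r2score(Y_true, Y_pred):
    # Coefficient of determination; the notebook's r2score() takes no
    # arguments and may read globals instead.
    ss_res = np.sum((Y_true - Y_pred) ** 2)
    ss_tot = np.sum((Y_true - np.mean(Y_true)) ** 2)
    return 1 - ss_res / ss_tot
```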

## Contribution
Feel free to contribute to this project by submitting issues or pull requests. For any questions or feedback, please open an issue on the GitHub repository.