https://github.com/winterwind/housingpricesproject
A two-part project involving making predictions using various regressors and then implementing linear regression from scratch and predicting that way
- Host: GitHub
- URL: https://github.com/winterwind/housingpricesproject
- Owner: Winterwind
- Created: 2024-12-12T23:16:41.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-12-12T23:27:23.000Z (about 1 year ago)
- Last Synced: 2025-06-11T05:03:16.126Z (8 months ago)
- Topics: csv, csv-files, data-science, decision-tree, gradient-descent, jupyter, jupyter-notebook, knearest-neighbors, knn, linear-regression, linear-regression-scratch, machine-learning, matplotlib, matplotlib-pyplot, numpy, pandas, python, random-forest, regression, sklearn
- Language: Jupyter Notebook
- Homepage:
- Size: 138 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# HousingPricesProject
A two-part project that first makes predictions using various regressors and then implements linear regression from scratch and predicts with it. Both parts extract data from .csv files and can be found in their respective folders.
## Task A
Here, I parse data from a .csv file, handle missing data, and then apply three different regressors (KNearestNeighbors, DecisionTree, and RandomForest) before comparing and contrasting their results.
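For reference, a minimal sketch of what this workflow might look like with scikit-learn is shown below. The file name, target column, and the assumption that all features are numeric are placeholders, not details taken from the notebook.

```python
# Hypothetical sketch of the Task A workflow: load a CSV, impute missing
# values, then fit and compare three regressors. File name and target
# column are placeholders.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from sklearn.neighbors import KNeighborsRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

df = pd.read_csv("housing.csv")          # placeholder file name
X = df.drop(columns=["price"])           # placeholder target column
y = df["price"]

# Handle missing data: fill each (assumed numeric) column with its mean
X = pd.DataFrame(SimpleImputer(strategy="mean").fit_transform(X),
                 columns=X.columns)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

models = {
    "KNearestNeighbors": KNeighborsRegressor(),
    "DecisionTree": DecisionTreeRegressor(random_state=42),
    "RandomForest": RandomForestRegressor(random_state=42),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.2f}")
```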
## Task B
Here, I parse data from a similar .csv file (except it has no missing data) and apply the premade linear regression model to it. Then, I build a linear regressor class from scratch using gradient descent and apply that to the same data. Finally, I compare and contrast the two to check the validity of my homemade linear regressor.
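A from-scratch gradient-descent regressor could look roughly like the sketch below; the class name `GDLinearRegressor`, the learning rate, and the iteration count are illustrative assumptions rather than the repository's actual implementation.

```python
# Minimal gradient-descent linear regressor (sketch, not the notebook's code).
import numpy as np

class GDLinearRegressor:
    def __init__(self, lr=0.01, n_iters=1000):
        self.lr = lr            # learning rate (assumed value)
        self.n_iters = n_iters  # number of gradient steps (assumed value)
        self.weights = None
        self.bias = 0.0

    def fit(self, X, y):
        X = np.asarray(X, dtype=float)
        y = np.asarray(y, dtype=float)
        n_samples, n_features = X.shape
        self.weights = np.zeros(n_features)
        for _ in range(self.n_iters):
            y_pred = X @ self.weights + self.bias
            error = y_pred - y
            # Gradients of the mean squared error w.r.t. weights and bias
            dw = (2 / n_samples) * (X.T @ error)
            db = (2 / n_samples) * error.sum()
            self.weights -= self.lr * dw
            self.bias -= self.lr * db
        return self

    def predict(self, X):
        return np.asarray(X, dtype=float) @ self.weights + self.bias
```

Comparing this class's predictions against scikit-learn's `LinearRegression` (presumably the "premade" model referred to above) on the same train/test split gives a sense of how close the gradient-descent version gets.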