https://github.com/nonkloq/ml-from-scratch
Machine Learning Algorithms from scratch.
- Host: GitHub
- URL: https://github.com/nonkloq/ml-from-scratch
- Owner: nonkloq
- Created: 2023-04-15T07:11:04.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2023-12-26T09:38:49.000Z (almost 2 years ago)
- Last Synced: 2025-02-04T16:28:24.026Z (8 months ago)
- Topics: candidate-elimination-algorithm, decision-trees, expectation-maximization-algorithm, gaussian-mixture-models, kmeans-clustering, linear-classification, linear-regression, ml-from-scratch, multilayer-perceptron, naive-bayes-classifier, random-forest, unsupervised-learning
- Language: Jupyter Notebook
- Homepage:
- Size: 11 MB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Machine Learning Algorithms From Scratch
Implementations of several machine learning algorithms in Python using NumPy.
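To give a flavor of the from-scratch style, here is a minimal ordinary least squares fit in plain NumPy (an illustrative sketch with hypothetical function names, not the repository's exact code):

```python
import numpy as np

def ols_fit(X, y):
    """Ordinary least squares: find w minimising ||Xb @ w - y||^2,
    where Xb is X with a prepended bias column."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # add bias column
    # lstsq solves the normal equations without forming an explicit inverse
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def ols_predict(X, w):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return Xb @ w

# Recover a known line y = 2x + 1 from noiseless data
X = np.arange(10, dtype=float).reshape(-1, 1)
y = 2.0 * X.ravel() + 1.0
w = ols_fit(X, y)  # w ≈ [1.0, 2.0]
```

Using `np.linalg.lstsq` is numerically more stable than inverting `Xb.T @ Xb` directly, which is why it is a common choice even in from-scratch code.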
## Notebooks
The following notebooks are included:
### Supervised Learning
- [Linear Regression](linear%20regression.ipynb)
  - Simple Least Squares
  - Ordinary Least Squares
  - Bayesian Linear Regression
  - Least Mean Squares
- [Locally Weighted Regression (LWR)](LWR.ipynb)
- [Linear Classification](linear%20classification.ipynb)
  - Perceptron Learning Algorithm
  - Logistic Regression
  - Naive Bayes Classifier
  - Support Vector Machine (not implemented from scratch)
- [Multilayered Perceptron (Neural Network)](https://github.com/nonkloq/nn_dqn-from-scratch/blob/main/nn-mlp_from_scratch.ipynb)
- [Multinomial & Gaussian Naive Bayes](Multinomial_and_GaussianNP.ipynb)
  - Gaussian Naive Bayes (clone from Linear Classification)
  - Multinomial Naive Bayes
#### Inductive Learning
- [Candidate Elimination Algorithm (CEL)](CEL.ipynb)
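The CEL notebook maintains both the specific and general boundaries of the version space; the specific-boundary update on its own is the classic Find-S rule, which is easy to sketch (illustrative code, not the notebook's implementation; the data is Mitchell's EnjoySport example):

```python
def find_s(examples):
    """Find-S: the specific-boundary half of candidate elimination.

    Each example is (attribute_tuple, label). A hypothesis is a tuple
    of attribute values, with '?' meaning "any value matches". Start
    from the first positive example and generalise just enough to
    cover each later positive one; Find-S ignores negatives.
    """
    positives = [x for x, label in examples if label]
    h = list(positives[0])
    for x in positives[1:]:
        for i, (hi, xi) in enumerate(zip(h, x)):
            if hi != xi:
                h[i] = '?'  # minimal generalisation at position i
    return tuple(h)

# Mitchell's EnjoySport training data
data = [
    (('Sunny', 'Warm', 'Normal', 'Strong', 'Warm', 'Same'), True),
    (('Sunny', 'Warm', 'High', 'Strong', 'Warm', 'Same'), True),
    (('Rainy', 'Cold', 'High', 'Strong', 'Warm', 'Change'), False),
    (('Sunny', 'Warm', 'High', 'Strong', 'Cool', 'Change'), True),
]
h = find_s(data)  # ('Sunny', 'Warm', '?', 'Strong', '?', '?')
```

The full algorithm in the notebook additionally shrinks a general boundary using the negative examples, so that the two boundaries bracket every consistent hypothesis.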
#### Ensemble Learning
- [Decision Tree and Random Forest](trees_forest.ipynb)
  - ID3 Algorithm
  - Random Forest
### Unsupervised Learning
- [Unsupervised Learning](unsupervised%20learners.ipynb)
  - K-Means
  - K Nearest Neighbours
    - Brute Force KNN
    - KD-Tree
    - Kernels to Compute Weight: ID_weight, Epanechnikov & Tricube
  - Expectation Maximisation for Gaussian Mixture Model
- [K-Means and Expectation-Maximisation for Gaussian Mixture Model (Viz)](EM_for_GMM_and_Kmeans.ipynb) (clone from unsupervised learners)
  - Visual Comparison of K-Means vs. EM for GMM Using 2-Dimensional MNIST Data Reduced by PCA & Synthetic Data from N Gaussian Distributions
- [K-Nearest Neighbors (KNN) for classification (iris dataset)](KNN_for_iris.ipynb) (clone from unsupervised learners)
## Usage
The code is provided as Jupyter notebooks (.ipynb), which you can view directly on GitHub or download and run locally. Each notebook contains an implementation of the algorithm together with the relevant formulas and pseudocode; for a deeper understanding, open the notebook of interest.
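As an example of the style used in the unsupervised notebook, K-Means (Lloyd's algorithm) alternates between assigning points to their nearest centroid and moving each centroid to the mean of its cluster. A minimal NumPy sketch (illustrative, with hypothetical function names; not the notebook's code):

```python
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    # initialise centroids as k distinct data points
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # distance of every point to every centroid, shape (n, k)
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to its cluster mean (keep it if cluster is empty)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break  # converged
        centroids = new
    return centroids, labels

# Two tight, well-separated blobs are recovered as two clusters
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),
               rng.normal(5.0, 0.1, (20, 2))])
centroids, labels = kmeans(X, 2)
```

The same assignment/update loop reappears in the EM-for-GMM notebook, where hard assignments are replaced by per-point responsibilities.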
## References
- Pattern Recognition and Machine Learning by Christopher Bishop
- Machine Learning: An Algorithmic Perspective, Second Edition by Stephen Marsland