Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/daodavid/classic-ml
Implementation of classic machine learning concepts and algorithms from scratch, with the math behind their implementation. Written in Jupyter Notebook (Python).
baysian cross-entropy entropy gradient-descent information-gain k-fold-cross-validation lasso-regression linear machine-learning maximum-likelihood-estimation naive-bayes-classifier pca principle-component-analysis probability python regression ridge-regression sigmoid-function softmax-regression suprise
Last synced: about 14 hours ago
JSON representation
Implementation of classic machine learning concepts and algorithms from scratch, with the math behind their implementation. Written in Jupyter Notebook (Python).
- Host: GitHub
- URL: https://github.com/daodavid/classic-ml
- Owner: daodavid
- License: apache-2.0
- Created: 2022-06-20T13:44:07.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2022-06-28T08:34:42.000Z (over 2 years ago)
- Last Synced: 2023-03-08T04:49:53.014Z (almost 2 years ago)
- Topics: baysian, cross-entropy, entropy, gradient-descent, information-gain, k-fold-cross-validation, lasso-regression, linear, machine-learning, maximum-likelihood-estimation, naive-bayes-classifier, pca, principle-component-analysis, probability, python, regression, ridge-regression, sigmoid-function, softmax-regression, suprise
- Language: HTML
- Homepage:
- Size: 1.49 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Classic Machine Learning Algorithms
## Linear Regression
- Simple Linear Regression
- Gradient Descent over simple linear regression
- Effect of different values for the learning rate
- Multiple Linear Regression
- Implementation of gradient descent for Multiple Linear Regression using NumPy (a minimal sketch follows this list)
- Test of our implementation on the `insurance.csv` dataset
- The probabilistic approach to linear regression: Maximum Likelihood Estimation
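The notebooks in this section implement gradient descent for linear regression; as a rough illustration of the idea (a minimal sketch, not the repository's exact code — the function name and toy data are assumed), batch gradient descent on the mean squared error might look like:

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Fit linear-regression weights by batch gradient descent on the MSE loss."""
    X = np.c_[np.ones(len(X)), X]             # prepend a bias column
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(n_iters):
        grad = (2.0 / n) * X.T @ (X @ w - y)  # gradient of the mean squared error
        w -= lr * grad                        # step against the gradient
    return w

# Toy usage (assumed data, not insurance.csv): recover y = 1 + 2*x1 + 3*x2.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 1 + X @ np.array([2.0, 3.0]) + rng.normal(scale=0.1, size=200)
print(gradient_descent(X, y))                 # approximately [1.0, 2.0, 3.0]
```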
### Regularization
- Polynomial Regression, Bias and Variance
- Lasso Regression (L1 regularization)
- Lasso as feature selection
- Ridge Regression (L2 regularization)
- K-fold cross-validation (sketched after this section)
References
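The regularization notebooks pair ridge regression with k-fold cross-validation; a compact sketch of how the two fit together (illustrative only — `ridge_fit` and `k_fold_mse` are assumed names, and features are assumed centered so no bias term is needed):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def k_fold_mse(X, y, lam=1.0, k=5, seed=0):
    """Average held-out MSE of ridge regression over k folds."""
    idx = np.random.default_rng(seed).permutation(len(y))
    folds = np.array_split(idx, k)
    errors = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = ridge_fit(X[train], y[train], lam)
        errors.append(np.mean((X[test] @ w - y[test]) ** 2))
    return float(np.mean(errors))
```

Scanning a grid of lambda values with `k_fold_mse` and keeping the one with the lowest average error is the usual way these two pieces are combined.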
## Logistic Regression
- Log-odds, or the logit function
- The math origin of the sigmoid function
- Properties and identities of the sigmoid function
- Maximum likelihood for logistic regression; the cross-entropy loss
- Mathematical derivation of the cross-entropy loss; gradient descent
- Implementation of BinaryLogisticRegression using NumPy (see the sketch after this section)
- Regularization of logistic regression
References
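For the sigmoid and cross-entropy items above, a minimal NumPy version could look like the following (a sketch under assumptions: the class name `BinaryLogisticRegression` comes from the list item, but its body here is illustrative, not the repository's code):

```python
import numpy as np

def sigmoid(z):
    """sigmoid(z) = 1 / (1 + exp(-z)), mapping log-odds to probabilities."""
    z = np.clip(z, -500, 500)                  # avoid overflow in exp
    return 1.0 / (1.0 + np.exp(-z))

class BinaryLogisticRegression:
    """Logistic regression trained by gradient descent on cross-entropy loss."""

    def __init__(self, lr=0.1, n_iters=2000):
        self.lr, self.n_iters = lr, n_iters

    def fit(self, X, y):
        X = np.c_[np.ones(len(X)), X]          # bias column
        self.w = np.zeros(X.shape[1])
        for _ in range(self.n_iters):
            p = sigmoid(X @ self.w)
            # Gradient of the mean cross-entropy loss: X'(p - y) / n
            self.w -= self.lr * X.T @ (p - y) / len(y)
        return self

    def predict_proba(self, X):
        return sigmoid(np.c_[np.ones(len(X)), X] @ self.w)
```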
## Soft-Max Regression
- Abstract
- Softmax definition and how it works
- Optimization of the softmax loss with gradient descent (in-depth math derivation)
- Implementation of softmax using NumPy (see the sketch at the end of this section)
- Regularization of softmax via learning rate and max iterations
Conclusion
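To close the softmax section, here is a hedged sketch of softmax regression trained by gradient descent (`softmax` and `fit_softmax` are assumed names; this is an illustration, not the repository's implementation):

```python
import numpy as np

def softmax(Z):
    """Row-wise softmax, with max subtraction for numerical stability."""
    Z = Z - Z.max(axis=1, keepdims=True)
    e = np.exp(Z)
    return e / e.sum(axis=1, keepdims=True)

def fit_softmax(X, y, n_classes, lr=0.1, n_iters=1000):
    """Multiclass (softmax) regression fit by gradient descent on cross-entropy."""
    n, d = X.shape
    W = np.zeros((d, n_classes))
    Y = np.eye(n_classes)[y]            # one-hot targets from integer labels
    for _ in range(n_iters):
        P = softmax(X @ W)              # predicted class probabilities
        W -= lr * X.T @ (P - Y) / n     # gradient of the mean cross-entropy loss
    return W
```

Capping `n_iters` and tuning `lr` provide the early-stopping style regularization the last list item refers to.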