{"id":24310646,"url":"https://github.com/daodavid/classic-ml","last_synced_at":"2025-12-04T21:14:57.001Z","repository":{"id":39164882,"uuid":"505458658","full_name":"daodavid/classic-ML","owner":"daodavid","description":"Implementation of classic machine learning concepts and algorithms from scratch and math behind their implementation.Written in Jupiter Notebook Python","archived":false,"fork":false,"pushed_at":"2022-06-28T08:34:42.000Z","size":1560,"stargazers_count":1,"open_issues_count":1,"forks_count":0,"subscribers_count":1,"default_branch":"main","last_synced_at":"2025-01-17T06:17:21.684Z","etag":null,"topics":["baysian","cross-entropy","entropy","gradient-descent","information-gain","k-fold-cross-validation","lasso-regression","linear","machine-learning","maximum-likelihood-estimation","naive-bayes-classifier","pca","principle-component-analysis","probability","python","regression","ridge-regression","sigmoid-function","softmax-regression","suprise"],"latest_commit_sha":null,"homepage":"","language":"HTML","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"apache-2.0","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/daodavid.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2022-06-20T13:44:07.000Z","updated_at":"2022-06-28T00:30:30.000Z","dependencies_parsed_at":"2022-09-02T13:10:55.382Z","dependency_job_id":null,"html_url":"https://github.com/daodavid/classic-ML","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/daodavid%2Fclassic-ML","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/daodavid%2Fclassic-ML/tags","releases_url":"
https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/daodavid%2Fclassic-ML/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/daodavid%2Fclassic-ML/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/daodavid","download_url":"https://codeload.github.com/daodavid/classic-ML/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":242483273,"owners_count":20135784,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["baysian","cross-entropy","entropy","gradient-descent","information-gain","k-fold-cross-validation","lasso-regression","linear","machine-learning","maximum-likelihood-estimation","naive-bayes-classifier","pca","principle-component-analysis","probability","python","regression","ridge-regression","sigmoid-function","softmax-regression","suprise"],"created_at":"2025-01-17T06:17:33.014Z","updated_at":"2025-12-04T21:14:56.960Z","avatar_url":"https://github.com/daodavid.png","language":"HTML","readme":"# Classic Machine Learning Algorithms\n\u003chr\u003e \u003chr\u003e\n\u003ch1\u003e \u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html\"\u003e Linear Regression \u003c/a\u003e\u003c/h1\u003e\n  \u003ch4\u003e\n  \u003cfont size=\"4\" face=\"Times New Roman\" color=\"#3f134f\"\u003e   \n  \u003cul style=\"margin-left: 30px\"\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#simple~linear~regression\"\u003eSimple Linear Regression \u003c/a\u003e \u003c/li\u003e 
\u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#grad~sim~linear\"\u003eGradient Descent on simple linear regression\u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#learning-rate\"\u003eEffect of different learning-rate values\u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#m-linear-r\"\u003eMultiple Linear Regression\u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n    \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#impl-multi\"\u003eImplementation of gradient descent for Multiple Linear Regression using NumPy\u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n     \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#insurence\"\u003eTesting our implementation on the 'insurance.csv' dataset \u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n     \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/linear-regression.html#MLE\"\u003eThe probabilistic approach to linear regression. Maximum likelihood estimation \u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n\u003c/ul\u003e \n\u003c/font\u003e \n\u003c/h4\u003e\n\u003chr\u003e \u003chr\u003e\n\n \u003ch1\u003e \u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html\"\u003eRegularization\u003c/a\u003e\u003c/h1\u003e\n \u003cfont size=\"4\" face=\"Times New Roman\" color=\"#3f134f\"\u003e \n    \u003cul style=\"margin-left: 30px\"\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#intro-pol\"\u003e Polynomial Regression, Bias and Variance \u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n      \u003cli\u003e\u003ca 
href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#lasso\"\u003e Lasso Regression (L1 Regularization)\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#feature\"\u003e Lasso as feature selection\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#ridge\"\u003e Ridge regression (L2 regularization)\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e          \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#k-fold\"\u003e  K-fold cross validation \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e       \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/reguralization.html#ref\"\u003e References \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e     \n\u003c/ul\u003e    \n \u003c/font\u003e\n\u003chr\u003e \u003chr\u003e\n\u003ch1\u003e \u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html\"\u003e Logistic Regression \u003c/a\u003e\u003c/h1\u003e\n \u003cfont size=\"4\" face=\"Times New Roma\" color=\"#3f134f\"\u003e \n \u003cul style=\"margin-left: 30px\"\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#odds-ration\"\u003e Log-odds or Loggit function  \u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n         \u003cli\u003e\u003ca href=\"#origin\"\u003eThe math origin of the Sigmoid function\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#prop\"\u003e Properties and Identities Of Sigmoid Function\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#max-li\"\u003eMaximum Likelihood of Logistic regression, Cross-entropy 
Loss\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e   \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#grad-descent\"\u003e  Mathematical derivation of the cross-entropy loss. Gradient Descent \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e   \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#impl\"\u003e   Implementation of BinaryLogisticRegression using NumPy \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e       \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#reg\"\u003e Regularization of Logistic Regression  \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e       \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/logistic_regression.html#ref\"\u003e References \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e     \n\u003c/ul\u003e    \n \u003c/font\u003e\n\u003chr\u003e \u003chr\u003e\n\n\u003ch1\u003e \u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html\"\u003eSoftmax Regression\u003c/a\u003e\u003c/h1\u003e\n\u003cfont size=\"4\" face=\"Times New Roman\" color=\"#3f134f\"\u003e \n    \u003cul style=\"margin-left: 30px\"\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#abstract\"\u003eAbstract \u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n      \u003c!--\u003cli\u003e\u003ca href='#int-1'\u003eIntroduction \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e --\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#deff_softmax\"\u003eSoftmax: definition and how it works\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#optimization\"\u003eOptimization of the Softmax Loss with Gradient Descent (detailed math derivation)\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e 
 \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#impl\"\u003eImplementation of Softmax using NumPy \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e\n       \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#reg\"\u003eRegularization of softmax via the learning rate and maximum iterations\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e \n       \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/softmax-regression.html#conclusion\"\u003eConclusion\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n\n\u003c/ul\u003e    \n \u003c/font\u003e\n\u003chr\u003e \u003chr\u003e\n\n\u003ch1\u003e \u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html\"\u003eNaive Bayes Classifier\u003c/a\u003e\u003c/h1\u003e\n\u003cfont size=\"4\" face=\"Times New Roman\" color=\"#3f134f\"\u003e \n    \u003cul style=\"margin-left: 30px\"\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#bayes_theorem\"\u003eBayes' Theorem\u003c/a\u003e \u003c/li\u003e \u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#works\"\u003eHow does Binomial Naive Bayes work?\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e\n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#likeli-invest\"\u003eInvestigation of likelihood and posterior probability through feature values of the Titanic dataset\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n      \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#testing\"\u003e Implementation of a likelihood table for Gaussian Naive Bayes, tested on the Titanic dataset \u003c/a\u003e \u003c/li\u003e\u003cbr\u003e\n       \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#bernuli\"\u003e 
Bernoulli Naive Bayes\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e \n       \u003cli\u003e\u003ca href=\"https://daodavid.github.io/classic-ML/notes/naive_bayes_classifier.html#ref\"\u003eReferences\u003c/a\u003e \u003c/li\u003e\u003cbr\u003e  \n    \u003c/ul\u003e    \n\u003c/font\u003e\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdaodavid%2Fclassic-ml","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fdaodavid%2Fclassic-ml","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fdaodavid%2Fclassic-ml/lists"}