{"id":16509266,"url":"https://github.com/mohd-faizy/machine-learning-algorithms","last_synced_at":"2025-10-08T02:38:26.687Z","repository":{"id":152100100,"uuid":"291106531","full_name":"mohd-faizy/Machine-Learning-Algorithms","owner":"mohd-faizy","description":"This Repository consist of some popular Machine Learning Algorithms and their implementation of both theory and code in Jupyter Notebooks","archived":false,"fork":false,"pushed_at":"2020-11-30T11:07:40.000Z","size":6175,"stargazers_count":5,"open_issues_count":0,"forks_count":4,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-04-08T17:22:47.918Z","etag":null,"topics":["bayesian-algorithm","convolutional-neural-networks","decision-trees","deep-learning","dimensionality-reduction-algorithms","gradient-boosting","k-means-clustering","k-nearest-neighbors","linear-regression","logistic-regression","machine-learning-algorithms","naive-bayes-classifier","neural-network","random-forest","supervised-learning","support-vector-machines","unsupervised-learning"],"latest_commit_sha":null,"homepage":"","language":"Jupyter 
Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/mohd-faizy.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2020-08-28T17:28:03.000Z","updated_at":"2025-03-02T17:05:51.000Z","dependencies_parsed_at":"2023-07-09T13:01:28.760Z","dependency_job_id":null,"html_url":"https://github.com/mohd-faizy/Machine-Learning-Algorithms","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mohd-faizy%2FMachine-Learning-Algorithms","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mohd-faizy%2FMachine-Learning-Algorithms/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mohd-faizy%2FMachine-Learning-Algorithms/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/mohd-faizy%2FMachine-Learning-Algorithms/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/mohd-faizy","download_url":"https://codeload.github.com/mohd-faizy/Machine-Learning-Algorithms/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":252445678,"owners_count":21749096,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/a
pi/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["bayesian-algorithm","convolutional-neural-networks","decision-trees","deep-learning","dimensionality-reduction-algorithms","gradient-boosting","k-means-clustering","k-nearest-neighbors","linear-regression","logistic-regression","machine-learning-algorithms","naive-bayes-classifier","neural-network","random-forest","supervised-learning","support-vector-machines","unsupervised-learning"],"created_at":"2024-10-11T15:49:33.720Z","updated_at":"2025-10-08T02:38:21.644Z","avatar_url":"https://github.com/mohd-faizy.png","language":"Jupyter Notebook","readme":"![author](https://img.shields.io/badge/author-mohd--faizy-red)\n![made-with-Markdown](https://img.shields.io/badge/Made%20with-markdown-blue)\n![Language](https://img.shields.io/github/languages/top/mohd-faizy/Machine-Learning-Algorithms)\n![Platform](https://img.shields.io/badge/platform-jupyter%20labs-blue)\n![Maintained](https://img.shields.io/maintenance/yes/2020)\n![Last Commit](https://img.shields.io/github/last-commit/mohd-faizy/Machine-Learning-Algorithms)\n[![GitHub issues](https://img.shields.io/github/issues/mohd-faizy/Machine-Learning-Algorithms)](https://github.com/mohd-faizy/Machine-Learning-Algorithms/issues)\n[![Open Source Love svg2](https://badges.frapsoft.com/os/v2/open-source.svg?v=103)](https://opensource.com/resources/what-open-source)\n![Stars GitHub](https://img.shields.io/github/stars/mohd-faizy/Machine-Learning-Algorithms)\n[![GitHub license](https://img.shields.io/github/license/mohd-faizy/Machine-Learning-Algorithms)](https://github.com/mohd-faizy/Machine-Learning-Algorithms/blob/master/LICENSE)\n![Size](https://img.shields.io/github/repo-size/mohd-faizy/Machine-Learning-Algorithms)\n\n\u003cstrong\u003e\u003ch1 align='center'\u003e\n    Machine Learning Algorithms\n\u003c/h1\u003e\u003c/strong\u003e\n\n\u003cimg 
src='https://github.com/mohd-faizy/__Machine_Learning_Algorithms__/blob/master/Algorithms_png/Head_ML.png'\u003e\n\n## __Classification according to the ways of learning:__\n\n:black_circle: Supervised learning\n\n:white_circle: Unsupervised learning\n\n:black_circle: Semi-supervised learning\n\n:white_circle: Reinforcement learning\n\n\n---\n\u003ch2 style=\"text-align: left;\"\u003eClassification according to function\u003c/h2\u003e\n\u003ctable style=\"height: 496px; width: 629px;\"\u003e\n    \u003ctbody\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Regression Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eLinear regression\u003c/li\u003e\n                    \u003cli\u003eLogistic regression\u003c/li\u003e\n                    \u003cli\u003eMultivariate Adaptive Regression Splines (MARS)\u003c/li\u003e\n                    \u003cli\u003eLocally Estimated Scatterplot Smoothing (LOESS)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Instance-based Learning Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003ek-Nearest Neighbors (kNN)\u003c/li\u003e\n                    \u003cli\u003eLearning Vector Quantization (LVQ)\u003c/li\u003e\n                    \u003cli\u003eSelf-Organizing Map (SOM)\u003c/li\u003e\n                    \u003cli\u003eLocally Weighted Learning (LWL)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        
\u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u0026nbsp;\u003cstrong\u003eRegularization Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eRidge Regression\u003c/li\u003e\n                    \u003cli\u003eLASSO (Least Absolute Shrinkage and Selection Operator)\u003c/li\u003e\n                    \u003cli\u003eElastic Net\u003c/li\u003e\n                    \u003cli\u003eLeast Angle Regression (LARS)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Decision Tree Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eClassification and Regression Tree (CART)\u003c/li\u003e\n                    \u003cli\u003eID3 algorithm (Iterative Dichotomiser 3)\u003c/li\u003e\n                    \u003cli\u003eC4.5 and C5.0\u003c/li\u003e\n                    \u003cli\u003eCHAID (Chi-squared Automatic Interaction Detection)\u003c/li\u003e\n                    \u003cli\u003eRandom Forest\u003c/li\u003e\n                    \u003cli\u003eMultivariate Adaptive Regression Splines (MARS)\u003c/li\u003e\n                    \u003cli\u003eGradient Boosting Machine (GBM)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Bayesian Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                
\u003cul\u003e\n                    \u003cli\u003eNaive Bayes\u003c/li\u003e\n                    \u003cli\u003eGaussian Naive Bayes\u003c/li\u003e\n                    \u003cli\u003eMultinomial Naive Bayes\u003c/li\u003e\n                    \u003cli\u003eAODE (Averaged One-Dependence Estimators)\u003c/li\u003e\n                    \u003cli\u003eBayesian Belief Network\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Kernel-based Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eSupport Vector Machine (SVM)\u003c/li\u003e\n                    \u003cli\u003eRadial Basis Function (RBF)\u003c/li\u003e\n                    \u003cli\u003eLinear Discriminant Analysis (LDA)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\u0026nbsp;\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Clustering Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003ek-Means\u003c/li\u003e\n                    \u003cli\u003ek-Medians\u003c/li\u003e\n                    \u003cli\u003eEM algorithm\u003c/li\u003e\n                    \u003cli\u003eHierarchical clustering\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\u0026nbsp;\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Association Rule Learning\u003c/strong\u003e\u003c/h3\u003e\n  
          \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eApriori algorithm\u003c/li\u003e\n                    \u003cli\u003eEclat algorithm\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\u0026nbsp;\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Neural Networks\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003ePerceptron\u003c/li\u003e\n                    \u003cli\u003eBackpropagation algorithm (BP)\u003c/li\u003e\n                    \u003cli\u003eHopfield network\u003c/li\u003e\n                    \u003cli\u003eRadial Basis Function Network (RBFN)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Deep Learning\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003eDeep Boltzmann Machine (DBM)\u003c/li\u003e\n                    \u003cli\u003eConvolutional Neural Network (CNN)\u003c/li\u003e\n                    \u003cli\u003eRecurrent Neural Network (RNN, LSTM)\u003c/li\u003e\n                    \u003cli\u003eStacked Auto-Encoder\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Dimensionality Reduction Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            
\u003ctd style=\"width: 372px;\"\u003e\n                \u003cul\u003e\n                    \u003cli\u003ePrincipal Component Analysis (PCA)\u003c/li\u003e\n                    \u003cli\u003ePrincipal Component Regression (PCR)\u003c/li\u003e\n                    \u003cli\u003ePartial Least Squares Regression (PLSR)\u003c/li\u003e\n                    \u003cli\u003eSammon Mapping\u003c/li\u003e\n                    \u003cli\u003eMultidimensional Scaling (MDS)\u003c/li\u003e\n                    \u003cli\u003eProjection Pursuit (PP)\u003c/li\u003e\n                    \u003cli\u003eLinear Discriminant Analysis (LDA)\u003c/li\u003e\n                    \u003cli\u003eMixture Discriminant Analysis (MDA)\u003c/li\u003e\n                    \u003cli\u003eQuadratic Discriminant Analysis (QDA)\u003c/li\u003e\n                    \u003cli\u003eFlexible Discriminant Analysis (FDA)\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Ensemble Algorithm\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\u0026nbsp;\n                \u003cul\u003e\n                    \u003cli\u003eBoosting\u003c/li\u003e\n                    \u003cli\u003eBagging\u003c/li\u003e\n                    \u003cli\u003eAdaBoost\u003c/li\u003e\n                    \u003cli\u003eStacked Generalization (Stacking)\u003c/li\u003e\n                    \u003cli\u003eGBM algorithm\u003c/li\u003e\n                    \u003cli\u003eGBRT algorithm\u003c/li\u003e\n                    \u003cli\u003eRandom Forest\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n        \u003ctr\u003e\n            \u003ctd style=\"width: 241px;\"\u003e\n                \u003ch3\u003e\u003cstrong\u003e\u0026nbsp;Other 
Algorithms\u003c/strong\u003e\u003c/h3\u003e\n            \u003c/td\u003e\n            \u003ctd style=\"width: 372px;\"\u003e\u0026nbsp;\n                \u003cul\u003e\n                    \u003cli\u003eFeature selection algorithms\u003c/li\u003e\n                    \u003cli\u003ePerformance evaluation algorithms\u003c/li\u003e\n                    \u003cli\u003eNatural language processing\u003c/li\u003e\n                    \u003cli\u003eComputer vision\u003c/li\u003e\n                    \u003cli\u003eRecommender systems\u003c/li\u003e\n                    \u003cli\u003eReinforcement learning\u003c/li\u003e\n                    \u003cli\u003eTransfer learning\u003c/li\u003e\n                \u003c/ul\u003e\n            \u003c/td\u003e\n        \u003c/tr\u003e\n    \u003c/tbody\u003e\n\u003c/table\u003e\n\u003cp\u003e\u0026nbsp;\u003c/p\u003e\n\n---\n## __Popular Machine Learning Algorithms__\n\n## :one:__Linear Regression:__\n\n```python\n\n# Import Library\n# Import other necessary libraries like pandas, numpy...\n\nfrom sklearn import linear_model\n\n# Load Train and Test datasets\n# Identify feature and response variable(s);\n# values must be numeric and numpy arrays\n\nx_train = input_variables_values_training_datasets\ny_train = target_variables_values_training_datasets\nx_test = input_variables_values_test_datasets\n\n# Create linear regression object\nlinear = linear_model.LinearRegression()\n\n# Train the model using the training sets and\n# check score\n\nlinear.fit(x_train, y_train)\nlinear.score(x_train, y_train)\n\n# Equation coefficient and intercept\n\nprint('Coefficient: \\n', linear.coef_)\nprint('Intercept: \\n', linear.intercept_)\n\n# Predict Output\npredicted = linear.predict(x_test)\n```\n\n\n\n## :two:__Logistic Regression:__\n\n```python\n\n# Import Library\nfrom sklearn.linear_model import LogisticRegression\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create logistic regression object\nmodel = LogisticRegression()\n\n# Train the model using the training sets and check score\nmodel.fit(X, y)\nmodel.score(X, y)\n\n# Equation coefficient and intercept\nprint('Coefficient: \\n', model.coef_)\nprint('Intercept: \\n', model.intercept_)\n\n# Predict Output\npredicted = model.predict(x_test)\n\n```\n\n\n## :three:__Decision Tree:__\n\n```python\n\n# Import Library\n# Import other necessary libraries like pandas, numpy...\n\nfrom sklearn import tree\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create tree object\nmodel = tree.DecisionTreeClassifier(criterion='gini')\n\n# for classification, you can change the criterion to\n# 'gini' or 'entropy' (information gain);\n# by default it is 'gini'\n\n# model = tree.DecisionTreeRegressor() # for regression\n\n# Train the model using the training sets and check score\nmodel.fit(X, y)\nmodel.score(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n```\n\n\n## :four:__Support Vector Machine (SVM):__\n\n```python\n\n# Import Library\nfrom sklearn import svm\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create SVM classification object\nmodel = svm.SVC()\n\n# there are various options associated with it; this is a simple one for classification.\n\n# Train the model using the training sets \u0026 check the score\nmodel.fit(X, y)\nmodel.score(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n\n```\n\n## :five:__Naive Bayes:__\n\n```python\n\n# Import Library\nfrom sklearn.naive_bayes import GaussianNB\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create Gaussian Naive Bayes object\nmodel = GaussianNB()\n\n# there are other variants for other distributions, e.g. MultinomialNB and BernoulliNB\n\n# Train the model using the training sets\nmodel.fit(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n\n```\n\n\n## :six:__K-Nearest Neighbors (kNN):__\n\n```python\n\n# Import Library\nfrom sklearn.neighbors import KNeighborsClassifier\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create KNeighbors classifier object\nmodel = KNeighborsClassifier(n_neighbors=6) # default value for n_neighbors is 5\n\n# Train the model using the training sets\nmodel.fit(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n\n```\n\n## :seven:__k-Means Clustering:__\n\n```python\n\n# Import Library\nfrom sklearn.cluster import KMeans\n\n# Assumes you have X (attributes) for the training data set\n# and x_test (attributes) of the test dataset\n\n# Create k-Means object\nk_means = KMeans(n_clusters=3, random_state=0)\n\n# Train the model using the training sets\nk_means.fit(X)\n\n# Predict Output\npredicted = k_means.predict(x_test)\n\n```\n\n## :eight:__Random Forest:__\n\n```python\n\n# Import Library\nfrom sklearn.ensemble import RandomForestClassifier\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create Random Forest object\nmodel = RandomForestClassifier()\n\n# Train the model using the training sets\nmodel.fit(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n```\n\n\n\n## :nine:__Dimensionality Reduction Algorithms (e.g. 
PCA):__\n\n```python\n\n# Import Library\nfrom sklearn import decomposition\n\n# Assumes you have training and test data sets as train and test\n\n# Create PCA object\npca = decomposition.PCA(n_components=k) # default value of n_components is min(n_samples, n_features)\n\n# For Factor analysis\nfa = decomposition.FactorAnalysis()\n\n# Reduce the dimension of the training dataset using PCA\ntrain_reduced = pca.fit_transform(train)\n\n# Reduce the dimension of the test dataset\ntest_reduced = pca.transform(test)\n```\n\n## :one::zero:__Gradient Boosting \u0026 AdaBoost (e.g. GBDT):__\n\n```python\n\n# Import Library\nfrom sklearn.ensemble import GradientBoostingClassifier\n\n# Assumes you have X (predictor) and y (target) for the\n# training data set and x_test (predictor) of the test dataset\n\n# Create Gradient Boosting Classifier object\nmodel = GradientBoostingClassifier(n_estimators=100,\n                                   learning_rate=1.0, max_depth=1, random_state=0)\n\n# Train the model using the training sets\nmodel.fit(X, y)\n\n# Predict Output\npredicted = model.predict(x_test)\n```\n\n\n### Connect with me:\n\n\n[\u003cimg align=\"left\" alt=\"codeSTACKr | Twitter\" width=\"22px\" src=\"https://cdn.jsdelivr.net/npm/simple-icons@v3/icons/twitter.svg\" /\u003e][twitter]\n[\u003cimg align=\"left\" alt=\"codeSTACKr | LinkedIn\" width=\"22px\" src=\"https://cdn.jsdelivr.net/npm/simple-icons@v3/icons/linkedin.svg\" /\u003e][linkedin]\n[\u003cimg align=\"left\" alt=\"codeSTACKr.com\" width=\"22px\" src=\"https://raw.githubusercontent.com/iconic/open-iconic/master/svg/globe.svg\" /\u003e][StackExchange AI]\n\n[twitter]: https://twitter.com/F4izy\n[linkedin]: https://www.linkedin.com/in/faizy-mohd-836573122/\n[StackExchange AI]: https://ai.stackexchange.com/users/36737/cypher\n\n\n---\n\n\n![Faizy's github stats](https://github-readme-stats.vercel.app/api?username=mohd-faizy\u0026show_icons=true)\n\n\n[![Top 
Langs](https://github-readme-stats.vercel.app/api/top-langs/?username=mohd-faizy\u0026layout=compact)](https://github.com/mohd-faizy/github-readme-stats)\n\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmohd-faizy%2Fmachine-learning-algorithms","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fmohd-faizy%2Fmachine-learning-algorithms","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fmohd-faizy%2Fmachine-learning-algorithms/lists"}