{"id":13468270,"url":"https://github.com/Avik-Jain/100-Days-Of-ML-Code","last_synced_at":"2025-03-26T05:30:57.801Z","repository":{"id":37549502,"uuid":"139824423","full_name":"Avik-Jain/100-Days-Of-ML-Code","owner":"Avik-Jain","description":"100 Days of ML Coding","archived":false,"fork":false,"pushed_at":"2023-12-29T07:57:53.000Z","size":10955,"stargazers_count":45249,"open_issues_count":62,"forks_count":10583,"subscribers_count":2440,"default_branch":"master","last_synced_at":"2024-10-29T09:17:12.525Z","etag":null,"topics":["100-days-of-code-log","100daysofcode","deep-learning","implementation","infographics","linear-algebra","linear-regression","logistic-regression","machine-learning","machine-learning-algorithms","naive-bayes-classifier","python","scikit-learn","siraj-raval","siraj-raval-challenge","support-vector-machines","svm","tutorial"],"latest_commit_sha":null,"homepage":"","language":null,"has_issues":false,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/Avik-Jain.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":"CONTRIBUTING.md","funding":null,"license":"LICENSE","code_of_conduct":"CODE_OF_CONDUCT.md","threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null}},"created_at":"2018-07-05T09:11:43.000Z","updated_at":"2024-10-29T08:16:16.000Z","dependencies_parsed_at":"2024-02-14T04:43:45.949Z","dependency_job_id":null,"html_url":"https://github.com/Avik-Jain/100-Days-Of-ML-Code","commit_stats":{"total_commits":109,"total_committers":8,"mean_commits":13.625,"dds":0.5045871559633027,"last_synced_commit":"5d67810c1498082e7bb262cf6397d7861dfd9891"},"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Avik-Jain%2F1
00-Days-Of-ML-Code","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Avik-Jain%2F100-Days-Of-ML-Code/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Avik-Jain%2F100-Days-Of-ML-Code/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/Avik-Jain%2F100-Days-Of-ML-Code/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/Avik-Jain","download_url":"https://codeload.github.com/Avik-Jain/100-Days-Of-ML-Code/tar.gz/refs/heads/master","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":245507060,"owners_count":20626537,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["100-days-of-code-log","100daysofcode","deep-learning","implementation","infographics","linear-algebra","linear-regression","logistic-regression","machine-learning","machine-learning-algorithms","naive-bayes-classifier","python","scikit-learn","siraj-raval","siraj-raval-challenge","support-vector-machines","svm","tutorial"],"created_at":"2024-07-31T15:01:07.950Z","updated_at":"2025-03-26T05:30:57.765Z","avatar_url":"https://github.com/Avik-Jain.png","language":null,"readme":"# 100-Days-Of-ML-Code\n\n100 Days of Machine Learning Coding as proposed by [Siraj Raval](https://github.com/llSourcell)\n\nGet the datasets from [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/tree/master/datasets)\n\n## Data PreProcessing | Day 1\nCheck out the code from [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day%201_Data%20PreProcessing.md).\n\n\u003cp 
align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%201.jpg\"\u003e\n\u003c/p\u003e\n\n## Simple Linear Regression | Day 2\nCheck out the code from [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day2_Simple_Linear_Regression.md).\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%202.jpg\"\u003e\n\u003c/p\u003e\n\n## Multiple Linear Regression | Day 3\nCheck out the code from [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day3_Multiple_Linear_Regression.md).\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%203.jpg\"\u003e\n\u003c/p\u003e\n\n## Logistic Regression | Day 4\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%204.jpg\"\u003e\n\u003c/p\u003e\n\n## Logistic Regression | Day 5\nMoving forward into #100DaysOfMLCode, today I dived deeper into what Logistic Regression actually is and the math behind it. Learned how the cost function is calculated and then how to apply the gradient descent algorithm to the cost function to minimize the prediction error.  
\nDue to time constraints, I will now be posting an infographic on alternate days.\nAlso, if someone wants to help me out with the documentation of the code, already has some experience in the field, and knows Markdown for GitHub, please contact me on LinkedIn :) .\n\n## Implementing Logistic Regression | Day 6\nCheck out the Code [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day%206%20Logistic%20Regression.md)\n\n## K Nearest Neighbours | Day 7\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%207.jpg\"\u003e\n\u003c/p\u003e\n\n## Math Behind Logistic Regression | Day 8 \n\n#100DaysOfMLCode To clarify my understanding of logistic regression, I searched the internet for resources and came across this article (https://towardsdatascience.com/logistic-regression-detailed-overview-46c4da4303bc) by Saishruthi Swaminathan. \n\nIt gives a detailed description of Logistic Regression. Do check it out.\n\n## Support Vector Machines | Day 9\nGot an intuition of what SVM is and how it is used to solve classification problems.\n\n## SVM and KNN | Day 10\nLearned more about how SVM works and how to implement the K-NN algorithm.\n\n## Implementation of K-NN | Day 11  \n\nImplemented the K-NN algorithm for classification. #100DaysOfMLCode \nSupport Vector Machine Infographic is halfway complete. Will update it tomorrow.\n\n## Support Vector Machines | Day 12\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%2012.jpg\"\u003e\n\u003c/p\u003e\n\n## Naive Bayes Classifier | Day 13\n\nContinuing with #100DaysOfMLCode, today I went through the Naive Bayes classifier.\nI am also implementing the SVM in Python using scikit-learn. Will update the code soon.\n\n## Implementation of SVM | Day 14\nToday I implemented SVM on linearly separable data. Used the Scikit-Learn library. 
In Scikit-Learn we have the SVC classifier, which we use to achieve this task. Will be using the kernel trick in the next implementation.\nCheck the code [here](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day%2013%20SVM.md).\n\n## Naive Bayes Classifier and Black Box Machine Learning | Day 15\nLearned about the different types of Naive Bayes classifiers. Also started the lectures by [Bloomberg](https://bloomberg.github.io/foml/#home). The first one in the playlist was Black Box Machine Learning. It gives a whole overview of prediction functions, feature extraction, learning algorithms, performance evaluation, cross-validation, sample bias, nonstationarity, overfitting, and hyperparameter tuning.\n\n## Implemented SVM using Kernel Trick | Day 16\nUsing the Scikit-Learn library, implemented the SVM algorithm along with a kernel function, which maps our data points into a higher dimension to find the optimal hyperplane. \n\n## Started Deep learning Specialization on Coursera | Day 17\nCompleted Week 1 and Week 2 in a single day. Learned about Logistic Regression as a Neural Network. \n\n## Deep learning Specialization on Coursera | Day 18\nCompleted Course 1 of the deep learning specialization. Implemented a neural net in Python.\n\n## The Learning Problem, Professor Yaser Abu-Mostafa | Day 19\nStarted Lecture 1 of 18 of Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa. It was basically an introduction to the upcoming lectures. He also explained the Perceptron Algorithm.\n\n## Started Deep learning Specialization Course 2 | Day 20\nCompleted Week 1 of Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.\n\n## Web Scraping | Day 21\nWatched some tutorials on how to do web scraping using Beautiful Soup in order to collect data for building a model.\n\n## Is Learning Feasible? | Day 22\nLecture 2 of 18 of Caltech's Machine Learning Course - CS 156 by Professor Yaser Abu-Mostafa. 
Learned about the Hoeffding Inequality.\n\n## Decision Trees | Day 23\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%2023.jpg\"\u003e\n\u003c/p\u003e\n\n## Introduction To Statistical Learning Theory | Day 24\nLecture 3 of the Bloomberg ML course introduced some of the core concepts like input space, action space, outcome space, prediction functions, loss functions, and hypothesis spaces.\n\n## Implementing Decision Trees | Day 25\nCheck the code [here.](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day%2025%20Decision%20Tree.md)\n\n## Jumped To Brush up Linear Algebra | Day 26\nFound an amazing [channel](https://www.youtube.com/channel/UCYO_jab_esuFRV4b17AJtAw) on YouTube, 3Blue1Brown. It has a playlist called Essence of Linear Algebra. Started off by completing 4 videos, which gave a complete overview of Vectors, Linear Combinations, Spans, Basis Vectors, Linear Transformations and Matrix Multiplication. \n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)\n\n## Jumped To Brush up Linear Algebra | Day 27\nContinuing with the playlist, completed the next 4 videos, discussing 3D Transformations, Determinants, Inverse Matrices, Column Space, Null Space and Non-Square Matrices.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)\n\n## Jumped To Brush up Linear Algebra | Day 28\nIn the 3Blue1Brown playlist, completed another 3 videos from the Essence of Linear Algebra. \nTopics covered were the Dot Product and Cross Product.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)\n\n\n## Jumped To Brush up Linear Algebra | Day 29\nCompleted the whole playlist today, videos 12-14. 
Really an amazing playlist to refresh the concepts of Linear Algebra.\nTopics covered were the Change of Basis, Eigenvectors and Eigenvalues, and Abstract Vector Spaces.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDPD3MizzM2xVFitgF8hE_ab)\n\n## Essence of calculus | Day 30\nAfter completing the playlist Essence of Linear Algebra by 3Blue1Brown, YouTube suggested a series of videos from the same channel. Being already impressed by the previous series on Linear Algebra, I dived straight into it.\nCompleted about 5 videos on topics such as Derivatives, the Chain Rule, the Product Rule, and the derivative of the exponential.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr)\n\n## Essence of calculus | Day 31\nWatched 2 videos on Implicit Differentiation and Limits from the playlist Essence of Calculus.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr)\n\n## Essence of calculus | Day 32\nWatched the remaining 4 videos, covering topics like Integration and Higher-order Derivatives.\n\nLink to the playlist [here.](https://www.youtube.com/playlist?list=PLZHQObOWTQDMsr9K-rj53DwVRMYO3t5Yr)\n\n## Random Forests | Day 33\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%2033.jpg\"\u003e\n\u003c/p\u003e\n\n## Implementing Random Forests | Day 34\nCheck the code [here.](https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Code/Day%2034%20Random_Forest.md)\n\n## But what *is* a Neural Network? | Deep learning, chapter 1  | Day 35\nAn amazing video on neural networks by the 3Blue1Brown YouTube channel. This video gives a good understanding of Neural Networks and uses the handwritten digit dataset to explain the concept. 
\nLink To the [video.](https://www.youtube.com/watch?v=aircAruvnKk\u0026t=7s)\n\n## Gradient descent, how neural networks learn | Deep learning, chapter 2 | Day 36\nPart two of neural networks by the 3Blue1Brown YouTube channel. This video explains the concepts of Gradient Descent in an interesting way. A must watch and highly recommended.\nLink To the [video.](https://www.youtube.com/watch?v=IHZwWFHWa-w)\n\n## What is backpropagation really doing? | Deep learning, chapter 3 | Day 37\nPart three of neural networks by the 3Blue1Brown YouTube channel. This video mostly discusses the partial derivatives and backpropagation.\nLink To the [video.](https://www.youtube.com/watch?v=Ilg3gGewQ5U)\n\n## Backpropagation calculus | Deep learning, chapter 4 | Day 38\nPart four of neural networks by the 3Blue1Brown YouTube channel. The goal here is to represent, in somewhat more formal terms, the intuition for how backpropagation works; the video mostly discusses the partial derivatives and backpropagation.\nLink To the [video.](https://www.youtube.com/watch?v=tIeHLnjs5U8)\n\n## Deep Learning with Python, TensorFlow, and Keras tutorial | Day 39\nLink To the [video.](https://www.youtube.com/watch?v=wQ8BIBpya2k\u0026t=19s\u0026index=2\u0026list=PLQVvvaa0QuDfhTox0AjmQ6tvTgMBZBEXN)\n\n## Loading in your own data - Deep Learning basics with Python, TensorFlow and Keras p.2 | Day 40\nLink To the [video.](https://www.youtube.com/watch?v=j-3vuBynnOE\u0026list=PLQVvvaa0QuDfhTox0AjmQ6tvTgMBZBEXN\u0026index=2)\n\n## Convolutional Neural Networks - Deep Learning basics with Python, TensorFlow and Keras p.3 | Day 41\nLink To the [video.](https://www.youtube.com/watch?v=WvoLTXIjBYU\u0026list=PLQVvvaa0QuDfhTox0AjmQ6tvTgMBZBEXN\u0026index=3)\n\n## Analyzing Models with TensorBoard - Deep Learning with Python, TensorFlow and Keras p.4 | Day 42\nLink To the [video.](https://www.youtube.com/watch?v=BqgTU7_cBnk\u0026list=PLQVvvaa0QuDfhTox0AjmQ6tvTgMBZBEXN\u0026index=4)\n\n## K Means Clustering | Day 
43\nMoved to Unsupervised Learning and studied Clustering.\nWorking on my website; check it out at [avikjain.me](http://www.avikjain.me/)\nAlso found a wonderful animation that can help to easily understand K - Means Clustering [Link](http://shabal.in/visuals/kmeans/6.html)\n\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%2043.jpg\"\u003e\n\u003c/p\u003e\n\n## K Means Clustering Implementation | Day 44\nImplemented K Means Clustering. Check the code [here.]()\n\n## Digging Deeper | NUMPY  | Day 45\nGot a new book, \"Python Data Science Handbook\" by Jake VanderPlas. Check the Jupyter notebooks [here.](https://github.com/jakevdp/PythonDataScienceHandbook)\n\u003cbr\u003eStarted with Chapter 2: Introduction to NumPy. Covered topics like Data Types, NumPy arrays and Computations on NumPy arrays.\n\u003cbr\u003eCheck the code - \n\u003cbr\u003e[Introduction to NumPy](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.00-Introduction-to-NumPy.ipynb)\n\u003cbr\u003e[Understanding Data Types in Python](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.01-Understanding-Data-Types.ipynb)\n\u003cbr\u003e[The Basics of NumPy Arrays](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.02-The-Basics-Of-NumPy-Arrays.ipynb)\n\u003cbr\u003e[Computation on NumPy Arrays: Universal Functions](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.03-Computation-on-arrays-ufuncs.ipynb)\n\n## Digging Deeper | NUMPY | Day 46\nChapter 2: Aggregations, Comparisons and Broadcasting\n\u003cbr\u003eLink to Notebook:\n\u003cbr\u003e[Aggregations: Min, Max, and Everything In Between](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.04-Computation-on-arrays-aggregates.ipynb)\n\u003cbr\u003e[Computation on Arrays: 
Broadcasting](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.05-Computation-on-arrays-broadcasting.ipynb)\n\u003cbr\u003e[Comparisons, Masks, and Boolean Logic](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.06-Boolean-Arrays-and-Masks.ipynb)\n\n## Digging Deeper | NUMPY | Day 47\nChapter 2: Fancy Indexing, Sorting Arrays, Structured Data\n\u003cbr\u003eLink to Notebook:\n\u003cbr\u003e[Fancy Indexing](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.07-Fancy-Indexing.ipynb)\n\u003cbr\u003e[Sorting Arrays](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.08-Sorting.ipynb)\n\u003cbr\u003e[Structured Data: NumPy's Structured Arrays](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/02.09-Structured-Data-NumPy.ipynb)\n\n## Digging Deeper | PANDAS | Day 48\nChapter 3: Data Manipulation with Pandas\n\u003cbr\u003e Covered various topics like Pandas Objects, Data Indexing and Selection, Operating on Data, Handling Missing Data, Hierarchical Indexing, Concat and Append.\n\u003cbr\u003eLink To the Notebooks:\n\u003cbr\u003e[Data Manipulation with Pandas](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.00-Introduction-to-Pandas.ipynb)\n\u003cbr\u003e[Introducing Pandas Objects](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.01-Introducing-Pandas-Objects.ipynb)\n\u003cbr\u003e[Data Indexing and Selection](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.02-Data-Indexing-and-Selection.ipynb)\n\u003cbr\u003e[Operating on Data in Pandas](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.03-Operations-in-Pandas.ipynb)\n\u003cbr\u003e[Handling Missing Data](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.04-Missing-Values.ipynb)\n\u003cbr\u003e[Hierarchical 
Indexing](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.05-Hierarchical-Indexing.ipynb)\n\u003cbr\u003e[Combining Datasets: Concat and Append](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.06-Concat-And-Append.ipynb)\n\n## Digging Deeper | PANDAS | Day 49\nChapter 3: Completed the following topics: Merge and Join, Aggregation and Grouping, and Pivot Tables.\n\u003cbr\u003e[Combining Datasets: Merge and Join](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.07-Merge-and-Join.ipynb)\n\u003cbr\u003e[Aggregation and Grouping](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.08-Aggregation-and-Grouping.ipynb)\n\u003cbr\u003e[Pivot Tables](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.09-Pivot-Tables.ipynb)\n\n## Digging Deeper | PANDAS | Day 50\nChapter 3: Vectorized String Operations, Working with Time Series\n\u003cbr\u003eLinks to Notebooks:\n\u003cbr\u003e[Vectorized String Operations](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.10-Working-With-Strings.ipynb)\n\u003cbr\u003e[Working with Time Series](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.11-Working-with-Time-Series.ipynb)\n\u003cbr\u003e[High-Performance Pandas: eval() and query()](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/03.12-Performance-Eval-and-Query.ipynb)\n\n## Digging Deeper | MATPLOTLIB | Day 51\nChapter 4: Visualization with Matplotlib \nLearned about Simple Line Plots, Simple Scatter Plots, and Density and Contour Plots.\n\u003cbr\u003eLinks to Notebooks: \n\u003cbr\u003e[Visualization with Matplotlib](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.00-Introduction-To-Matplotlib.ipynb)\n\u003cbr\u003e[Simple Line 
Plots](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.01-Simple-Line-Plots.ipynb)\n\u003cbr\u003e[Simple Scatter Plots](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.02-Simple-Scatter-Plots.ipynb)\n\u003cbr\u003e[Visualizing Errors](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.03-Errorbars.ipynb)\n\u003cbr\u003e[Density and Contour Plots](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.04-Density-and-Contour-Plots.ipynb)\n\n## Digging Deeper | MATPLOTLIB | Day 52\nChapter 4: Visualization with Matplotlib \nLearned about Histograms, how to customize plot legends and colorbars, and building Multiple Subplots.\n\u003cbr\u003eLinks to Notebooks: \n\u003cbr\u003e[Histograms, Binnings, and Density](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.05-Histograms-and-Binnings.ipynb)\n\u003cbr\u003e[Customizing Plot Legends](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.06-Customizing-Legends.ipynb)\n\u003cbr\u003e[Customizing Colorbars](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.07-Customizing-Colorbars.ipynb)\n\u003cbr\u003e[Multiple Subplots](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.08-Multiple-Subplots.ipynb)\n\u003cbr\u003e[Text and Annotation](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.09-Text-and-Annotation.ipynb)\n\n## Digging Deeper | MATPLOTLIB | Day 53\nChapter 4: Covered Three-Dimensional Plotting in Matplotlib.\n\u003cbr\u003eLinks to Notebooks:\n\u003cbr\u003e[Three-Dimensional Plotting in Matplotlib](https://github.com/jakevdp/PythonDataScienceHandbook/blob/master/notebooks/04.12-Three-Dimensional-Plotting.ipynb)\n\n## Hierarchical Clustering | Day 54\nStudied Hierarchical Clustering.\nCheck out this amazing 
[Visualization.](https://cdn-images-1.medium.com/max/800/1*ET8kCcPpr893vNZFs8j4xg.gif)\n\u003cp align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/Avik-Jain/100-Days-Of-ML-Code/blob/master/Info-graphs/Day%2054.jpg\"\u003e\n\u003c/p\u003e\n","funding_links":[],"categories":["Tutorials","Don't forget to give a :star: to make the project popular","Python","Others","AI, ML, DL","Machine Learning","大数据/人工智能","Repos","机器学习","tutorial",":octocat: GitHub Repositories","Uncategorized","🧠 AI / ML / DS Resources","List of Most Starred Github Projects related to Deep Learning","Welcome to Learn101","🤖 Machine Learning \u0026 AI"],"sub_categories":["Machine Learning","ML","Ukraine","Uncategorized","📂 GitHub Repositories","机器学习","Roadmaps","Resources"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FAvik-Jain%2F100-Days-Of-ML-Code","html_url":"https://awesome.ecosyste.ms/projects/github.com%2FAvik-Jain%2F100-Days-Of-ML-Code","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2FAvik-Jain%2F100-Days-Of-ML-Code/lists"}