ML Coding Practices
https://github.com/sohaib90/ml-coding-practice
- Host: GitHub
- URL: https://github.com/sohaib90/ml-coding-practice
- Owner: Sohaib90
- Created: 2018-10-24T11:02:58.000Z (over 6 years ago)
- Default Branch: master
- Last Pushed: 2019-12-15T13:24:39.000Z (over 5 years ago)
- Last Synced: 2025-01-24T12:25:16.583Z (4 months ago)
- Topics: 100daysofmlcode, datascience, deeplearnign-ai, machinelearning-python
- Language: Jupyter Notebook
- Homepage:
- Size: 59.4 MB
- Stars: 1
- Watchers: 1
- Forks: 1
- Open Issues: 1
Metadata Files:
- Readme: README.md
README
# Machine Learning Practice
Day 1:
Data Preprocessing for Machine Learning Models
Things learned:
- Data Selection: Consider what data is available, what data is missing and what data can be removed.
- Data Preprocessing: Organize selected data by formatting, cleaning and sampling from it.
- Data Transformation: Transform preprocessed data into a form ready for machine learning by engineering features through scaling, attribute decomposition and attribute aggregation.
- Used the numpy, matplotlib and sklearn libraries to handle, visualize and manipulate data (see the sketch below)
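A minimal sketch of this preprocessing flow, assuming a toy feature matrix with a missing value (the data here is made up purely for illustration):

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler

# Hypothetical feature matrix with one missing entry
X = np.array([[1.0, 200.0],
              [2.0, np.nan],
              [3.0, 180.0]])

# Cleaning: fill the missing value with the column mean
X_clean = SimpleImputer(strategy="mean").fit_transform(X)

# Transformation: scale features to zero mean and unit variance
X_scaled = StandardScaler().fit_transform(X_clean)
print(X_scaled)
```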
Day 2:
Support Vector Machines: Implementation in Python
Things learned:
- What is SVM? Concepts and theory
- Implementation in Python: SVM and kernel SVM
- Different kernel SVMs and a comparison (sketch below)
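One way to compare kernels, as a hedged sketch using sklearn's SVC on a toy two-moons dataset (not necessarily the notebook's setup):

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy non-linearly-separable data
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit a linear SVM and two kernel SVMs on the same split
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(kernel, clf.score(X_test, y_test))
```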
Day 3:
Decision Trees: Implementation in Python (also the sklearn implementation)
Things learned: (tutorial help from machinelearningmastery)
- How Decision Trees work
- Splits made on the basis of entropy or the Gini index
- Implementation in sklearn and from scratch (a Gini sketch follows)
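In the spirit of the machinelearningmastery tutorial, a sketch of scoring a candidate split by its Gini index (the helper name and test data are illustrative):

```python
def gini_index(groups, classes):
    """Gini impurity of a candidate split; each row stores its class label last."""
    n_instances = sum(len(group) for group in groups)
    gini = 0.0
    for group in groups:
        size = len(group)
        if size == 0:
            continue  # avoid division by zero for empty groups
        score = sum(
            ([row[-1] for row in group].count(c) / size) ** 2 for c in classes
        )
        # Weight each group's impurity by its relative size
        gini += (1.0 - score) * (size / n_instances)
    return gini

# A perfect split scores 0.0; a 50/50 mix of two classes scores 0.5
print(gini_index([[[1, 0], [1, 0]], [[1, 1], [1, 1]]], [0, 1]))  # 0.0
print(gini_index([[[1, 0], [1, 1]], [[1, 0], [1, 1]]], [0, 1]))  # 0.5
```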
Day 4:
Logistic Regression and a Feed-Forward Network to recognize handwritten digits (TensorFlow)
Things learned: (Fundamentals of Deep Learning, Chapter 3)
- Logistic Regression on MNIST Data
- Feed Forward Network on MNIST data and comparison
- TensorFlow implementation: using variable scope and name scope for the network (a modern sketch follows)
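The book's code uses TF1-style variable scopes; as a rough modern equivalent, here is a hedged tf.keras sketch comparing logistic regression against a one-hidden-layer network on MNIST (layer sizes are illustrative):

```python
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train.reshape(-1, 784) / 255.0, x_test.reshape(-1, 784) / 255.0

# Logistic regression: a single softmax layer over the flattened pixels
logreg = tf.keras.Sequential([tf.keras.layers.Dense(10, activation="softmax")])

# Feed-forward network: one hidden ReLU layer before the softmax
mlp = tf.keras.Sequential([
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

for model in (logreg, mlp):
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=1, verbose=0)
    print(model.evaluate(x_test, y_test, verbose=0))  # [loss, accuracy]
```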
Day 5:
Beyond Gradient Descent (Chapter 4 of Fundamentals of Deep Learning by Nikhil Buduma)
Things learned:
- Challenges with gradient descent: local minima and their effect on deep learning error surfaces
- Momentum-based optimization: keeping a memory of past gradients to smooth out updates on rough error surfaces
- Learning rate adaptation: (1) Adagrad (2) RMSProp (3) Adam
- Adagrad accumulates historical squared gradients and uses them to adapt the learning rate per parameter
- RMSProp keeps an exponentially weighted moving average of squared gradients: it enables us to "toss out" measurements we made a long time ago
- Adam is a variant combining ideas from both RMSProp and momentum (update rules sketched below)
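A minimal numpy sketch of the three update rules as single-step helpers (function names and hyperparameter defaults are illustrative):

```python
import numpy as np

def adagrad_step(w, g, cache, lr=0.01, eps=1e-8):
    # Accumulate all squared gradients; the effective step shrinks over time
    cache = cache + g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def rmsprop_step(w, g, cache, lr=0.01, decay=0.9, eps=1e-8):
    # Exponential moving average gradually "tosses out" old gradients
    cache = decay * cache + (1 - decay) * g ** 2
    return w - lr * g / (np.sqrt(cache) + eps), cache

def adam_step(w, g, m, v, t, lr=0.001, b1=0.9, b2=0.999, eps=1e-8):
    # Momentum-style first moment plus RMSProp-style second moment,
    # with bias correction for the early steps (t starts at 1)
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat, v_hat = m / (1 - b1 ** t), v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```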
Day 6:
MADL Videos Day (Machine and Deep Learning Videos Day)
Watched documentaries and videos related to ML and DL
- Watched the AlphaGo documentary to learn how the system was built to be smart enough to beat the world champion
- Watched Jeremy Howard's talk on machine learning and its integration with computer vision
- Fei-Fei Li: How we're teaching computers to understand pictures
Day 7:
Naive Bayes Classification on a diabetes dataset
Things learned:
- Implementation of a Naive Bayes classifier in Python
- A classifier based on class probabilities and attribute probabilities
- Functions for computing class probabilities and attribute probabilities (see the sketch below)
- More work required on the concepts
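A compact Gaussian Naive Bayes sketch showing the two ingredients above, class priors and per-attribute likelihoods (the class name and demo data are illustrative, not the notebook's code):

```python
import numpy as np

class GaussianNaiveBayes:
    """Minimal Gaussian Naive Bayes: class priors plus per-feature Gaussians."""

    def fit(self, X, y):
        self.classes = np.unique(y)
        self.priors = {c: np.mean(y == c) for c in self.classes}
        self.means = {c: X[y == c].mean(axis=0) for c in self.classes}
        self.vars = {c: X[y == c].var(axis=0) + 1e-9 for c in self.classes}
        return self

    def predict(self, X):
        preds = []
        for x in X:
            # Log class prior plus summed log attribute likelihoods
            scores = {
                c: np.log(self.priors[c])
                - 0.5 * np.sum(np.log(2 * np.pi * self.vars[c]))
                - 0.5 * np.sum((x - self.means[c]) ** 2 / self.vars[c])
                for c in self.classes
            }
            preds.append(max(scores, key=scores.get))
        return np.array(preds)

X = np.array([[1.0, 2.0], [1.2, 1.9], [7.0, 8.0], [7.2, 8.1]])
y = np.array([0, 0, 1, 1])
print(GaussianNaiveBayes().fit(X, y).predict(X))  # [0 0 1 1]
```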
Day 8:
Lectures 2 and 3 of Bloomberg's Foundations of Machine Learning: (2) Churn Prediction (3) Statistical Machine Learning
Things learned:
- How to think about a machine learning problem
- How to think about the output and analyse what we want to predict from the model
- While thinking about features and input values, consider whether all of the data will actually be available at deployment time
Day 9:
Speech Recognition with Python: a simple guess-the-word game
Things learned:
- How to use the SpeechRecognition library in Python to recognize speech from a microphone (sketch below)
- Developed a small guessing game based on the speech recognized from the microphone
- Theory of how it all works
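A minimal microphone-capture sketch with the SpeechRecognition library (requires PyAudio; the game logic around it is omitted here):

```python
import speech_recognition as sr

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)  # calibrate for background noise
    print("Say your guess...")
    audio = recognizer.listen(source)

try:
    # The free Google Web Speech API is the library's default backend
    guess = recognizer.recognize_google(audio)
    print("You said:", guess)
except sr.UnknownValueError:
    print("Could not understand the audio")
```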
Day 10:
Convolutional Neural Networks: Introduction and Implementation
Things learned:
- What convolutional neural networks are and why they were needed
- What filters and feature maps are, and how convolution helps extract features
- Implementation on MNIST data using TensorFlow (a Keras sketch follows)
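A hedged tf.keras sketch of a small MNIST CNN (the original notebooks use lower-level TensorFlow; the architecture here is illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add a channel axis for the conv layers

model = tf.keras.Sequential([
    layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),  # 32 filters
    layers.MaxPooling2D(),  # downsample the feature maps
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=64)
```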
Day 11:
Convolutional Neural Networks: Day 10 continued
Things learned:
- Batch normalization and how it helps training
- How the CIFAR-10 dataset is handled by using batch normalization
- Implementation of the network (a batch-norm block is sketched below)
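A sketch of the usual conv, batch norm, activation pattern for CIFAR-10-shaped inputs (the exact network in the notebook may differ):

```python
from tensorflow.keras import layers, Sequential

# Normalizing activations between layers stabilizes and speeds up training
model = Sequential([
    layers.Conv2D(32, 3, padding="same", input_shape=(32, 32, 3)),
    layers.BatchNormalization(),  # normalize per-channel activations per batch
    layers.Activation("relu"),
    layers.Conv2D(32, 3, padding="same"),
    layers.BatchNormalization(),
    layers.Activation("relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
```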
Day 12:
Chapter 6 of the Fundamentals of Deep Learning book (Embedding and Representation Learning)
Things learned:
- Embedding and representation learning: a way to escape the curse of dimensionality
- Principal Component Analysis: study of the concepts and mathematical formulation (sketch below)
- Autoencoders: introduction and basic concepts
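A minimal numpy PCA sketch via the SVD, matching the standard formulation (the function name and random data are illustrative):

```python
import numpy as np

def pca(X, k):
    """Project X onto its top-k principal components."""
    X_centered = X - X.mean(axis=0)  # PCA assumes centered data
    # Rows of Vt are the principal directions, ordered by explained variance
    _, _, Vt = np.linalg.svd(X_centered, full_matrices=False)
    return X_centered @ Vt[:k].T

X = np.random.randn(100, 5)
print(pca(X, 2).shape)  # (100, 2)
```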
Day 13:
Chapter 6 (continued) of the Fundamentals of Deep Learning book (Embedding and Representation Learning)
Things learned:
- Denoising autoencoders: more robust autoencoders
- Introducing sparsity in autoencoders
- When context is important in representations: the English language as an example for representation learning
- Coded an MNIST autoencoder (using dense layers) in Keras (sketch below)
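A dense MNIST autoencoder sketch in Keras along the lines described above (layer sizes are illustrative); training on noisy inputs with clean targets would turn the same model into a denoising autoencoder:

```python
import tensorflow as tf
from tensorflow.keras import layers

(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784) / 255.0

# 32-dimensional bottleneck: the learned embedding of each digit
autoencoder = tf.keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(784,)),
    layers.Dense(32, activation="relu"),      # encoder bottleneck
    layers.Dense(128, activation="relu"),
    layers.Dense(784, activation="sigmoid"),  # reconstruct pixel intensities
])
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.fit(x_train, x_train, epochs=1, batch_size=256)  # input is the target
```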
Day 14:
Autoencoders continued (using convolutional neural networks)
Things learned:
- Implementing autoencoders using convolutional neural networks
- Using convolutional layers works better than fully connected layers in terms of reconstruction quality (sketch below)
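A convolutional autoencoder sketch for MNIST-shaped inputs; the spatial filters keep local structure that dense layers flatten away (filter counts are illustrative):

```python
from tensorflow.keras import layers, Sequential

conv_ae = Sequential([
    layers.Conv2D(16, 3, activation="relu", padding="same",
                  input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),  # 28x28 -> 14x14
    layers.Conv2D(8, 3, activation="relu", padding="same"),
    layers.MaxPooling2D(),  # 14x14 -> 7x7 bottleneck
    layers.Conv2D(8, 3, activation="relu", padding="same"),
    layers.UpSampling2D(),  # 7x7 -> 14x14
    layers.Conv2D(16, 3, activation="relu", padding="same"),
    layers.UpSampling2D(),  # 14x14 -> 28x28
    layers.Conv2D(1, 3, activation="sigmoid", padding="same"),
])
conv_ae.compile(optimizer="adam", loss="binary_crossentropy")
```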
Day 15:
Deep Learning and PyTorch tutorials (a deep learning framework with numpy-like tensors and GPU support)
Things learned:
- PyTorch methods for building neural networks
- Data loading and manipulation of tensors in PyTorch
- Autograd and backprop concepts in PyTorch (autograd sketch below)
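A minimal autograd sketch: PyTorch records operations on tensors that require gradients and backpropagates through the recorded graph:

```python
import torch

# Autograd tracks operations on tensors created with requires_grad=True
x = torch.tensor([2.0, 3.0], requires_grad=True)
y = (x ** 2).sum()  # y = x1^2 + x2^2
y.backward()        # backprop through the recorded graph
print(x.grad)       # dy/dx = 2x -> tensor([4., 6.])
```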
Day 16:
Data Loading and Processing with PyTorch (PyTorch tutorials continued)
Things learned:
- How to create your own custom dataloader
- How to transform your data so every sample has the same size
- How to use PyTorch's DataLoader class to enable batching, shuffling and parallel loading of data (sketch below)
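A minimal custom Dataset plus DataLoader sketch (the dataset here is random tensors, purely for illustration):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class RandomDataset(Dataset):
    """Hypothetical dataset: __len__ and __getitem__ are all a Dataset needs."""
    def __init__(self, n=100):
        self.x = torch.randn(n, 3)
        self.y = torch.randint(0, 2, (n,))

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

# DataLoader adds batching and shuffling; num_workers > 0 adds parallel loading
loader = DataLoader(RandomDataset(), batch_size=16, shuffle=True)
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([16, 3]) torch.Size([16])
    break
```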
Day 17:
Learning PyTorch with Examples
Things learned:
- Main features of PyTorch: n-dimensional tensors, similar to numpy arrays but able to run on a GPU, and automatic differentiation for building and training networks
- Deeper understanding of autograd and how PyTorch builds computational graphs to compute gradients and weight updates
- Difference between TensorFlow and PyTorch: static versus dynamic computational graphs
- How to build custom nn modules and optimizers (a custom module is sketched below)
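A sketch of a custom nn.Module with a single optimizer step (the module name and layer sizes are illustrative):

```python
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    """Custom module: define layers in __init__, wire them up in forward."""
    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

model = TwoLayerNet(10, 32, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x, y = torch.randn(8, 10), torch.randint(0, 2, (8,))
optimizer.zero_grad()
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()   # autograd fills .grad on every parameter
optimizer.step()  # one gradient-descent update
```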
Day 18:
Transfer Learning with PyTorch
Things learned:
- What transfer learning is and why it matters: fine-tuning, or using the network as a fixed feature extractor
- Constraints of transfer learning
- Implementation of transfer learning using PyTorch (with help from the PyTorch tutorial; sketch below)
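A fixed-feature-extractor sketch in the style of the PyTorch tutorial (the two-class head is illustrative; newer torchvision versions pass weights= instead of pretrained=True):

```python
import torch.nn as nn
from torchvision import models

# Freeze the pretrained backbone and train only a new classification head
model = models.resnet18(pretrained=True)
for param in model.parameters():
    param.requires_grad = False  # backbone weights stay fixed

model.fc = nn.Linear(model.fc.in_features, 2)  # new head for a 2-class task

# For fine-tuning instead, skip the freezing loop and train everything
# with a small learning rate
```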
Day 19:
Recurrent Neural Networks and Sequence-to-Sequence Models (tutorials and concepts)
Day 20:
How Google does Machine Learning (learning the Google Cloud Platform and ML APIs)
The course took 5 days, with extensive lab introductions and practice