Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/adityapathakk/cats-vs-dogs
Couldn't decide if I saw a cat or a dog, so I made a guide on CNNs & Transfer Learning with the Dogs Vs Cats dataset on Kaggle (check README.md).
- Host: GitHub
- URL: https://github.com/adityapathakk/cats-vs-dogs
- Owner: adityapathakk
- Created: 2024-08-24T09:48:49.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-08-24T18:31:17.000Z (6 months ago)
- Last Synced: 2024-12-21T05:42:12.776Z (about 2 months ago)
- Topics: cats-vs-dogs, cnn-classification, deep-learning, kaggle, keras, python, tensorflow, transfer-learning
- Language: Jupyter Notebook
- Homepage: https://www.kaggle.com/code/adityapathak03/comprehensive-guide-to-cnns-with-cats-vs-dogs
- Size: 2.34 MB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Project: Classifier for Cats and Dogs
### Tech-Stack: Deep Learning in Python
- Libraries: Tensorflow, Keras, OpenCV, Matplotlib, Pandas
- Concepts: CNNs, Data Augmentation, Transfer Learning, Data Visualisation
- Tools: Google Colab, Kaggle, VS Code

I developed this FOR FUN, to revise Deep Learning fundamentals using the popular classification problem - Cats Vs Dogs!

If you'd like to upvote/comment/copy this notebook on Kaggle, here's the link to my [Comprehensive Guide to CNNs with Cats Vs Dogs](https://www.kaggle.com/code/adityapathak03/comprehensive-guide-to-cnns-with-cats-vs-dogs)!

# Project Methodology & Details
### 1. Importing Libraries and Loading Dataset
The dataset used for this project can be found here. It is a popular dataset with a file size of 1GB.
Refer to the picture below to see some of the libraries that were used.
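For reference, a minimal sketch of the loading step using Keras' ```image_dataset_from_directory``` (the directory layout, image size, and batch size here are stand-ins, not the notebook's exact values):

```python
import os
import numpy as np
import tensorflow as tf

# Build a tiny stand-in directory tree (train/cats, train/dogs) with random
# dummy images; in the notebook this would be the 1GB Kaggle download.
for cls in ("cats", "dogs"):
    os.makedirs(f"train/{cls}", exist_ok=True)
    for i in range(2):
        tf.keras.utils.save_img(
            f"train/{cls}/{i}.png",
            np.random.randint(0, 256, size=(32, 32, 3)).astype("uint8"))

# Labels are inferred from the sub-directory names (cats -> 0, dogs -> 1)
ds = tf.keras.utils.image_dataset_from_directory(
    "train", image_size=(150, 150), batch_size=4)
images, labels = next(iter(ds))
```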
#### 1.1 Data Normalisation
A ```process``` function was defined to scale the pixel values of each image from the range [0, 255] down to [0, 1].
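A minimal sketch of such a function (the notebook's exact ```process``` may differ):

```python
import numpy as np
import tensorflow as tf

def process(image, label):
    """Scale pixel values from [0, 255] to [0, 1]."""
    return tf.cast(image, tf.float32) / 255.0, label

# Example on a dummy 2x2 RGB "image"
img = np.array([[[0, 128, 255]] * 2] * 2, dtype=np.uint8)
scaled, _ = process(img, 0)
```

Mapped over a ```tf.data.Dataset``` with ```dataset.map(process)```, this normalises every batch on the fly.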
### 2. Data Visualisation
I observed some of the images present in the dataset.
### 3. Developing Simple CNNs
I developed 5 simple CNNs with increasing complexity:
- Model 1 is a basic CNN for introductory purposes.
- Model 2 shows how Batch Normalisation and Dropout, a form of regularisation, increase validation accuracy quite a bit.
- Model 3 tries L1 regularisation, while Model 4 tries L2 regularisation.
- Model 5 is a complex CNN that does quite well. Its many convolutional layers can extract a lot of information from the dataset.

Take a look below for some snapshots of the models and their training process.
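To illustrate the idea behind Models 2-4, here is a small CNN combining Batch Normalisation, Dropout, and an L2 kernel regulariser (the layer sizes and input shape are assumptions, not the notebook's exact architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

model = tf.keras.Sequential([
    layers.Input(shape=(150, 150, 3)),
    # L2 regularisation penalises large kernel weights
    layers.Conv2D(32, 3, activation="relu",
                  kernel_regularizer=regularizers.l2(1e-4)),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Flatten(),
    # Dropout randomly zeroes activations during training
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```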
Out of all these, Model 5 performed the best, with both training accuracy and validation accuracy exceeding 91%!

### 4. CNN with Augmented Data
Using some simple data augmentation layers like ```RandomFlip```, ```RandomRotation```, ```RandomZoom```, ```RandomTranslation```, and ```RandomContrast```, I implemented a CNN trained on augmented data.
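These layers can be stacked into a small preprocessing pipeline, roughly like this (the factor values are illustrative):

```python
import tensorflow as tf
from tensorflow.keras import layers

data_augmentation = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
    layers.RandomTranslation(0.1, 0.1),
    layers.RandomContrast(0.1),
])

# Augmentation layers are only active when called in training mode
batch = tf.random.uniform((4, 150, 150, 3))
augmented = data_augmentation(batch, training=True)
```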
### 5. Transfer Learning - VGG16
I harnessed the popular pre-trained model VGG16 to achieve highly accurate results.
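The usual transfer-learning recipe with VGG16 is to freeze the convolutional base and train a small classifier head on top. A sketch (the head layers are assumptions; ```weights=None``` here only skips the ImageNet download, in practice you'd use ```weights="imagenet"```):

```python
import tensorflow as tf
from tensorflow.keras import layers

base = tf.keras.applications.VGG16(
    include_top=False, weights=None, input_shape=(150, 150, 3))
base.trainable = False  # freeze the convolutional base

model = tf.keras.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # cat vs dog
])
```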
### 6. Visualized Accuracy with Graphs
I plotted graphs of training accuracy vs validation accuracy and training loss vs validation loss for all the models.

![image](https://github.com/user-attachments/assets/dff73ade-ff40-487d-8acf-4c1548259bb6)
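Such curves come straight from the ```History``` object returned by ```model.fit```. A sketch with made-up numbers:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt

# Hypothetical values; in practice use history = model.fit(...).history
history = {
    "accuracy": [0.62, 0.74, 0.81, 0.86],
    "val_accuracy": [0.60, 0.70, 0.76, 0.79],
}

fig, ax = plt.subplots()
ax.plot(history["accuracy"], label="training accuracy")
ax.plot(history["val_accuracy"], label="validation accuracy")
ax.set_xlabel("epoch")
ax.set_ylabel("accuracy")
ax.legend()
fig.savefig("accuracy.png")
```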
### 7. Making Predictions
2 images were selected at random. After resizing them to the model's input size, predictions were made using the VGG16 model. Both were correct :D
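The prediction step boils down to resize, add a batch dimension, and threshold the sigmoid output. A sketch with a stand-in model (the real one being the trained VGG16 classifier):

```python
import tensorflow as tf

# Tiny stand-in model with the same input/output contract as the trained one
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(150, 150, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])

# Resize an arbitrary image to the model's input size before predicting
image = tf.random.uniform((200, 180, 3))
resized = tf.image.resize(image, (150, 150))
prob = model.predict(resized[tf.newaxis, ...], verbose=0)[0, 0]
label = "dog" if prob >= 0.5 else "cat"
```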
### 8. Closing Note
- If you observe the model-fitting process of some models (for example, CNN Model #4), you'll notice that the model reached a certain peak in accuracy during training, but by the time the epochs ended it was nowhere near that peak. To fix this, we can use callback functions like 'Early Stopping' and 'Model Checkpoint'.
- You have to be extremely patient with Deep Learning. You have to keep evaluating model performance, fine-tuning and updating weights and layers, and finding out what works and what doesn't. All this can be time-consuming, but it has to be done! So don't worry about taking your time.

# Thanks!
Thanks for checking this out. I hope you learnt something new here!
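P.S. the 'Early Stopping' and 'Model Checkpoint' callbacks mentioned in the closing note can be set up like this (the monitored metric, patience, and filename are illustrative):

```python
import tensorflow as tf

callbacks = [
    # Stop once val_accuracy stops improving, and roll back to the best weights
    tf.keras.callbacks.EarlyStopping(
        monitor="val_accuracy", patience=3, restore_best_weights=True),
    # Save the best-performing model seen so far during training
    tf.keras.callbacks.ModelCheckpoint(
        "best_model.keras", monitor="val_accuracy", save_best_only=True),
]
# Then pass them to training: model.fit(..., callbacks=callbacks)
```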