# Project: Classifier for Cats and Dogs
### Tech-Stack: Deep Learning in Python
- Libraries: TensorFlow, Keras, OpenCV, Matplotlib, Pandas
- Concepts: CNNs, Data Augmentation, Transfer Learning, Data Visualisation
- Tools: Google Colab, Kaggle, VS Code

I developed this FOR FUN to revise Deep Learning fundamentals using the popular classification problem - Cats Vs Dogs!

If you'd like to upvote/comment/copy this notebook on Kaggle, here's the link to my Comprehensive Guide to CNNs with Cats Vs Dogs!

# Project Methodology & Details

### 1. Importing Libraries and Loading Dataset
The dataset used for this project is the popular Dogs vs. Cats dataset on Kaggle, with a file size of about 1 GB.

The libraries listed above (TensorFlow, Keras, OpenCV, Matplotlib, and Pandas) were imported, and the dataset was loaded.

#### 1.1 Data Normalisation
A `process` function was defined to scale the pixel values of the images from the 0–255 range down to the 0–1 range.
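A sketch of such a `process` function, written for a `tf.data` pipeline; the exact signature in the notebook may differ:

```python
import tensorflow as tf

def process(image, label):
    """Scale pixel values from [0, 255] down to [0, 1]."""
    image = tf.cast(image, tf.float32) / 255.0
    return image, label

# Applied to a dataset, e.g.:
# train_ds = train_ds.map(process)
```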

### 2. Data Visualisation
I displayed a sample of the images present in the dataset to get a feel for the data.

### 3. Developing Simple CNNs
I developed five simple CNNs of increasing complexity:

- Model 1 is a basic CNN for introductory purposes.
- Model 2 shows how Batch Normalisation and Dropout, two forms of regularisation, noticeably improve validation accuracy.
- Model 3 tries L1 regularisation, while Model 4 tries L2 regularisation.
- Model 5 is a deeper CNN that does quite well; its additional convolutional layers extract richer features from the dataset.


Out of all these, Model 5 performed the best, with both training accuracy and validation accuracy reaching > 91%!
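As an illustration, here is a Model-2-style CNN with Batch Normalisation after the convolutions and Dropout in the dense head. The layer sizes and the 150×150 input are illustrative, not the exact architecture from the notebook; the commented line shows how Models 3 and 4 would swap in L1/L2 regularisation:

```python
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(150, 150, 3)),
    layers.Conv2D(32, 3, activation="relu"),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.BatchNormalization(),
    layers.MaxPooling2D(),
    layers.Flatten(),
    # Models 3/4 would instead use e.g.:
    # layers.Dense(128, activation="relu",
    #              kernel_regularizer=keras.regularizers.l2(1e-4)),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),   # binary output: cat vs. dog
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```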

### 4. CNN with Augmented Data
Using simple data augmentation layers such as `RandomFlip`, `RandomRotation`, `RandomZoom`, `RandomTranslation`, and `RandomContrast`, I implemented a CNN trained on augmented data.

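A sketch of such an augmentation pipeline using exactly those Keras layers; the factor values here are illustrative:

```python
from tensorflow import keras
from tensorflow.keras import layers

# These layers are only active during training; at inference time
# they pass images through unchanged.
data_augmentation = keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
    layers.RandomZoom(0.1),
    layers.RandomTranslation(0.1, 0.1),
    layers.RandomContrast(0.1),
])

# Typically used as the first block of a model, e.g.:
# model = keras.Sequential([keras.Input(shape=(150, 150, 3)),
#                           data_augmentation, ...])
```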

### 5. Transfer Learning - VGG16
I harnessed the popular pre-trained VGG16 model to achieve highly accurate results.


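The usual transfer-learning recipe with VGG16 looks like this: load the ImageNet weights without the classifier head, freeze the convolutional base, and train a small head on top. The input size and head layers are assumptions, not the notebook's exact setup:

```python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.applications import VGG16

# Pre-trained convolutional base, with the ImageNet classifier removed.
base = VGG16(weights="imagenet", include_top=False,
             input_shape=(150, 150, 3))
base.trainable = False   # freeze the pre-trained features

model = keras.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
```

Freezing the base means only the small head is trained, which is why transfer learning converges so quickly on a dataset of this size.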

### 6. Visualized Accuracy with Graphs
I plotted training vs. validation accuracy and training vs. validation loss for all the models.

![image](https://github.com/user-attachments/assets/dff73ade-ff40-487d-8acf-4c1548259bb6)
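A helper along these lines produces plots like the one above, assuming `history` is the `History` object returned by `model.fit` (the metric keys match Keras's defaults for `metrics=["accuracy"]`):

```python
import matplotlib.pyplot as plt

def plot_history(history):
    """Plot training vs. validation accuracy and loss side by side."""
    acc = history.history["accuracy"]
    val_acc = history.history["val_accuracy"]
    loss = history.history["loss"]
    val_loss = history.history["val_loss"]
    epochs = range(1, len(acc) + 1)

    fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
    ax1.plot(epochs, acc, label="training accuracy")
    ax1.plot(epochs, val_acc, label="validation accuracy")
    ax1.set_xlabel("epoch")
    ax1.legend()
    ax2.plot(epochs, loss, label="training loss")
    ax2.plot(epochs, val_loss, label="validation loss")
    ax2.set_xlabel("epoch")
    ax2.legend()
    plt.show()
```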

### 7. Making Predictions
Two images were selected at random. After resizing them to the model's input size, predictions were made using the VGG16 model, and both were correct :D


### 8. Closing Note
- If you look at the fitting process of some models (for example, CNN Model 4), you'll notice that the model hit a peak in accuracy partway through training, but by the time the epochs ended it was nowhere near that peak. To fix this, we can use callbacks such as `EarlyStopping` and `ModelCheckpoint`.
- You have to be extremely patient with Deep Learning. You have to keep evaluating model performance, fine-tuning weights and layers, and finding out what works and what doesn't. All of this can be time-consuming, but it has to be done, so don't worry about taking your time!
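The two callbacks mentioned above can be wired up like this; the monitored metric, patience, and filename are illustrative choices:

```python
from tensorflow import keras

callbacks = [
    # Stop training once validation accuracy stops improving,
    # and roll back to the best weights seen so far.
    keras.callbacks.EarlyStopping(
        monitor="val_accuracy", patience=5, restore_best_weights=True),
    # Save the best model seen during training to disk.
    keras.callbacks.ModelCheckpoint(
        "best_model.keras", monitor="val_accuracy", save_best_only=True),
]

# Passed to training, e.g.:
# model.fit(train_ds, validation_data=val_ds, epochs=50, callbacks=callbacks)
```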

# Thanks!
Thanks for checking this out. I hope you learnt something new here!