https://github.com/mttcrn/anndl-project
Artificial Neural Network and Deep Learning - Challenges - A.Y. 2024/2025
deep-neural-networks image-classifier image-segmentation keras-tensorflow
- Host: GitHub
- URL: https://github.com/mttcrn/anndl-project
- Owner: mttcrn
- Created: 2024-11-11T09:47:32.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2025-04-01T08:42:51.000Z (about 2 months ago)
- Last Synced: 2025-04-01T09:32:54.848Z (about 2 months ago)
- Topics: deep-neural-networks, image-classifier, image-segmentation, keras-tensorflow
- Language: Jupyter Notebook
- Homepage:
- Size: 150 MB
- Stars: 0
- Watchers: 2
- Forks: 2
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Kaggle challenges - Artificial Neural Networks and Deep Learning
This repository contains projects developed for the Artificial Neural Networks and Deep Learning course at Politecnico di Milano during the academic year 2024-2025. The repository is organized into two main challenges:
## Challenge 1: Image Classification
- Objective: Classify images of red blood cells into different categories based on their morphology.
- Approach: Implemented **transfer learning** with **DenseNet161**, **fine-tuning** the model to optimize classification accuracy.
- Challenges: Dealing with **class imbalance**, enhancing feature extraction, and optimizing generalization.
- Results: Achieved high classification accuracy using **data augmentation**, **SMOTE** for dataset balancing, and fine-tuning of selected layers (a minimal sketch follows this list).
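
For illustration, a minimal Keras transfer-learning sketch of the approach described above (not the repository's code): `tf.keras.applications` ships DenseNet121/169/201 rather than DenseNet161, so DenseNet121 stands in here, and the class count, image size, dropout rate, and unfreezing cutoff are assumptions. SMOTE balancing and augmentation are assumed to happen in the data pipeline.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 8          # hypothetical number of blood-cell classes
IMG_SIZE = (224, 224)    # hypothetical input resolution

# ImageNet-pretrained backbone. tf.keras.applications provides DenseNet121/169/201,
# so DenseNet121 stands in for the DenseNet161 used in the project.
backbone = keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,)
)
backbone.trainable = False  # phase 1: freeze the backbone, train only the new head

# Inputs are assumed to be preprocessed upstream with
# keras.applications.densenet.preprocess_input.
inputs = keras.Input(shape=IMG_SIZE + (3,))
x = backbone(inputs, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = keras.Model(inputs, outputs)

model.compile(
    optimizer=keras.optimizers.Adam(1e-3),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
# ... train the head first with model.fit(...), then fine-tune:

# Phase 2: fine-tuning. Unfreeze only the last layers of the backbone and
# recompile with a much lower learning rate.
backbone.trainable = True
for layer in backbone.layers[:-50]:   # 50 is an arbitrary illustrative cutoff
    layer.trainable = False
model.compile(
    optimizer=keras.optimizers.Adam(1e-5),
    loss="categorical_crossentropy",
    metrics=["accuracy"],
)
```
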
## Challenge 2: Image Segmentation
- Objective: Segment Mars surface images into distinct terrain categories (per-pixel classification).
- Approach: Implemented **U-Net** with custom architectural modifications (SE blocks, **attention mechanisms**).
- Challenges: No pre-trained models allowed, requiring training from scratch.
- Results: Achieved competitive segmentation accuracy, with performance gains from **data augmentation** and **loss function tuning** (see the sketch below).
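
Below is a minimal from-scratch Keras sketch of a U-Net with squeeze-and-excitation (SE) blocks, illustrating the kind of architecture described above. It is not the repository's implementation: the class count, tile size, filter widths, and the Dice + cross-entropy loss are assumptions standing in for the project's loss function tuning.

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

NUM_CLASSES = 5          # hypothetical number of terrain classes
IMG_SIZE = (128, 128)    # hypothetical tile resolution (grayscale input assumed)

def se_block(x, ratio=8):
    # Squeeze-and-Excitation: reweight channels using global spatial context.
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)
    s = layers.Dense(channels // ratio, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)
    return layers.Multiply()([x, layers.Reshape((1, 1, channels))(s)])

def conv_block(x, filters):
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return se_block(x)

inputs = keras.Input(shape=IMG_SIZE + (1,))

# Encoder: conv blocks with SE, downsampled by max pooling.
c1 = conv_block(inputs, 32); p1 = layers.MaxPooling2D()(c1)
c2 = conv_block(p1, 64);     p2 = layers.MaxPooling2D()(c2)
c3 = conv_block(p2, 128);    p3 = layers.MaxPooling2D()(c3)

# Bottleneck.
b = conv_block(p3, 256)

# Decoder: transposed convolutions with skip connections from the encoder.
u3 = layers.Conv2DTranspose(128, 2, strides=2, padding="same")(b)
c4 = conv_block(layers.Concatenate()([u3, c3]), 128)
u2 = layers.Conv2DTranspose(64, 2, strides=2, padding="same")(c4)
c5 = conv_block(layers.Concatenate()([u2, c2]), 64)
u1 = layers.Conv2DTranspose(32, 2, strides=2, padding="same")(c5)
c6 = conv_block(layers.Concatenate()([u1, c1]), 32)

outputs = layers.Conv2D(NUM_CLASSES, 1, activation="softmax")(c6)
model = keras.Model(inputs, outputs)

def dice_loss(y_true, y_pred, smooth=1e-6):
    # Soft Dice over one-hot masks; an assumed component of the loss tuning.
    y_true = tf.cast(y_true, y_pred.dtype)
    intersection = tf.reduce_sum(y_true * y_pred, axis=[1, 2])
    union = tf.reduce_sum(y_true + y_pred, axis=[1, 2])
    return 1.0 - tf.reduce_mean((2.0 * intersection + smooth) / (union + smooth))

def combined_loss(y_true, y_pred):
    # Cross-entropy handles per-pixel classification; Dice counteracts class imbalance.
    cce = tf.reduce_mean(keras.losses.categorical_crossentropy(y_true, y_pred))
    return cce + dice_loss(y_true, y_pred)

model.compile(optimizer="adam", loss=combined_loss, metrics=["accuracy"])
```

The SE block follows the standard squeeze-and-excitation pattern (global average pooling, a bottleneck MLP, and channel-wise rescaling); the attention mechanisms mentioned above could be added analogously on the skip connections.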