https://github.com/ialwayslikedgrime/deep_learning_exam_implementation
Multi Input Multi Output model for sentiment analysis task on a Hotel Reviews Dataset
- Host: GitHub
- URL: https://github.com/ialwayslikedgrime/deep_learning_exam_implementation
- Owner: ialwayslikedgrime
- Created: 2025-08-28T22:53:54.000Z (about 1 month ago)
- Default Branch: main
- Last Pushed: 2025-08-28T23:15:36.000Z (about 1 month ago)
- Last Synced: 2025-08-29T03:54:47.495Z (about 1 month ago)
- Topics: ai, deep-learning, deep-neural-networks, deeplearning, keras, long-short-term-memory, lstm, machine-learning, neural-networks, nlp, rnn, sentiment-analysis, sentiment-classification, sentiment-classifier, tensorflow, text-analysis, text-classification, text-processing
- Language: Jupyter Notebook
- Homepage:
- Size: 152 KB
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
# Deep Learning Exam Implementation

This repository contains the implementation of my Deep Learning exam project for the course *Machine Learning, Artificial Neural Networks and Deep Learning* (exam session: June 2025). I scored full marks on this exam.
## Exam Context
The exam problem was structured into **six open questions**, covering the entire deep learning pipeline:
1. **Model** — choice of the most appropriate architecture and rationale
2. **Input** — preprocessing strategy, input types, shapes, and value domains
3. **Output** — design of output layers and justification
4. **Loss** — choice of loss functions and label formatting
5. **Model Configuration** — layer composition, hyperparameters, and optimization strategy
6. **Evaluation** — assessing generalization on unseen data

**Format**:
- Students first answered these questions in writing, **without access to the dataset**.
- Afterwards, each student had to deliver a Colab notebook implementation that **faithfully adhered** to their written design choices, with no changes allowed.

The original exam text is available at: [Exam text (PDF)](docs/exam_test.pdf)

## My Solution
- **Model Architecture**
  Multi-input, multi-output neural network implemented with the **Keras Functional API**.
- **Inputs**
  - **Text reviews** → tokenized, padded sequences → Embedding + LSTM branch
  - **Categorical metadata** → seasons, reviewer continent, hotel popularity quartiles → Dense layers branch
- **Outputs**
  - **Binary classification** → good vs. bad review (sigmoid activation)
  - **Regression** → review score (linear activation)
- **Loss Functions**
  - Binary cross-entropy (classification)
  - Mean squared error (regression)
  - Combined via a weighted sum
- **Optimization**
  Adam optimizer with tuned hyperparameters: learning rate, dropout, batch size, LSTM units
- **Evaluation**
  - Baseline training
  - Random search for hyperparameter tuning
  - 5-fold cross-validation to assess generalization

The original dataset (`input_data.pkl`) is no longer publicly available, which is why this repository no longer includes a `requirements.txt`.
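An architecture of this shape can be sketched with the Keras Functional API roughly as follows. All dimensions, layer sizes, and loss weights below are illustrative assumptions (the original dataset and the notebook's tuned hyperparameters are not reproduced here), not the exam submission itself:

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Hypothetical dimensions -- the real values came from the (now unavailable) dataset
VOCAB_SIZE = 10_000  # tokenizer vocabulary size
MAX_LEN = 200        # padded review length
N_META = 9           # width of the one-hot-encoded categorical metadata

# Text branch: tokenized/padded review -> Embedding -> LSTM
text_in = layers.Input(shape=(MAX_LEN,), name="review_tokens")
x = layers.Embedding(VOCAB_SIZE, 64)(text_in)
x = layers.LSTM(32)(x)

# Metadata branch: one-hot categorical features -> Dense
meta_in = layers.Input(shape=(N_META,), name="metadata")
m = layers.Dense(16, activation="relu")(meta_in)

# Merge both branches, then add a shared hidden layer with dropout
merged = layers.concatenate([x, m])
merged = layers.Dense(32, activation="relu")(merged)
merged = layers.Dropout(0.3)(merged)

# Two heads: binary good/bad classification and linear score regression
cls_out = layers.Dense(1, activation="sigmoid", name="good_bad")(merged)
reg_out = layers.Dense(1, activation="linear", name="score")(merged)

model = Model(inputs=[text_in, meta_in], outputs=[cls_out, reg_out])

# The two losses are combined as a weighted sum and optimized with Adam;
# the weights here (1.0 / 0.5) are placeholder values, not the tuned ones
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss={"good_bad": "binary_crossentropy", "score": "mse"},
    loss_weights={"good_bad": 1.0, "score": 0.5},
)
```

Naming the output layers (`good_bad`, `score`) lets `compile` attach a different loss to each head via a dict, which is the idiomatic Functional-API way to train multi-output models.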
Made by me, ialwayslikedgrime, alias grimey_s.
Let's connect on X! https://x.com/grimey_s