https://github.com/marta-barea/mlp-iris-classifier
A simple project to train and evaluate a multilayer perceptron on the Iris Species dataset using TensorFlow, SciKeras, and Scikit-Learn.
- Host: GitHub
- URL: https://github.com/marta-barea/mlp-iris-classifier
- Owner: Marta-Barea
- License: gpl-3.0
- Created: 2025-06-08T12:29:18.000Z (4 months ago)
- Default Branch: main
- Last Pushed: 2025-06-19T13:12:12.000Z (4 months ago)
- Last Synced: 2025-06-19T14:24:52.034Z (4 months ago)
- Topics: classification-algorithm, deep-learning, iris-classification, iris-dataset, multilayer-perceptron, neural-networks, python
- Language: Python
- Homepage:
- Size: 101 KB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- License: LICENSE
# MLP Iris Classifier
A simple project to train and evaluate a multilayer perceptron on the Iris Species dataset using TensorFlow, SciKeras, and Scikit-Learn.
---
# Installation
1. Clone the repo
```bash
git clone https://github.com/yourusername/mlp-iris-classifier.git
cd mlp-iris-classifier
```

2. Set up the Conda environment
An `environment.yml` is included for Conda users:
```bash
conda env create -f environment.yml
conda activate mlp-iris-classifier
```

# Dependencies
- Python 3.7+
- numpy, scikit-learn, tensorflow, scikeras, PyYAML, matplotlib

You can also install them with:
```bash
pip install -r requirements.txt
```

# Usage
1. Verify the dataset
The [Iris Species Dataset](https://archive.ics.uci.edu/dataset/53/iris) from the UCI Machine Learning Repository is already included under `data/Iris.csv`.
2. Adjust settings
Open `config.yaml` and tweak any values you like (seed, test_size, hyperparameters list, etc.).
3. Run the full pipeline
```bash
python run_all.py
```

This will:
- Train the MLP with randomized hyperparameter search (see the sketch after this list)
- Save the best model to `models/best_mlp.h5`
- Print train/test accuracy and predictions
- Save evaluation plots to `reports/`
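The randomized search is wired through SciKeras, which wraps a Keras model so scikit-learn's `RandomizedSearchCV` can tune it. The following is a minimal sketch of that pattern, not the repo's `src/train.py`: the builder name `build_mlp`, the searched parameters (`hidden_units`, `learning_rate`), and the split settings are illustrative assumptions.

```python
# Minimal sketch of SciKeras + RandomizedSearchCV on Iris.
# Names and hyperparameter ranges below are assumptions, not values
# taken from this repo's config.yaml or src/train.py.
from scikeras.wrappers import KerasClassifier
from sklearn.datasets import load_iris
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from tensorflow import keras

def build_mlp(hidden_units=16, learning_rate=1e-3):
    """Small MLP for the 4-feature, 3-class Iris problem."""
    model = keras.Sequential([
        keras.layers.Input(shape=(4,)),
        keras.layers.Dense(hidden_units, activation="relu"),
        keras.layers.Dense(3, activation="softmax"),
    ])
    model.compile(
        optimizer=keras.optimizers.Adam(learning_rate=learning_rate),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

clf = KerasClassifier(model=build_mlp, epochs=50, batch_size=16, verbose=0)
search = RandomizedSearchCV(
    clf,
    param_distributions={
        "model__hidden_units": [8, 16, 32],
        "model__learning_rate": [1e-2, 1e-3],
    },
    n_iter=4,
    cv=3,
    random_state=42,
)
search.fit(X_train, y_train)

print("Best params:", search.best_params_)
print("Test accuracy:", search.best_estimator_.score(X_test, y_test))
```

Keeping model construction in a plain function with keyword arguments is what lets SciKeras route `model__*` parameters into it during the search.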
# Project Structure

```
mlp-iris-classifier/
│
├── config.yaml # Experiment settings
├── environment.yml # Conda environment spec
├── requirements.txt # Pinned pip dependencies (for Docker)
├── docker-compose.yml # Docker Compose setup
├── Dockerfile # Image build definition
├── .dockerignore
├── .gitignore
├── pytest.ini
│
├── data/
│ └── Iris.csv # Iris Species Dataset
│
├── models/ # (Auto-created) Trained model & params
│
├── reports/
│ └── figures # (Auto-created) Plots
│
├── tests/ # Test suite
│ ├── unit
│ ├── integration
│ └── e2e
│
├── src/
│ ├── config.py # Loads config.yaml
│ ├── data_loader.py # Reads & splits data
│ ├── model_builder.py # Defines the Keras MLP
│ ├── train.py # Hyperparameter search & model saving
│ └── evaluate.py # Loads model & prints metrics
│
└── run_all.py # Runs train.py then evaluate.py
```
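As the tree above notes, `run_all.py` chains training and evaluation. A minimal sketch of that idea, assuming each stage is runnable as a standalone script (the actual file may instead import functions from `src/` directly):

```python
# Hedged sketch of the orchestration run_all.py performs: training first,
# then evaluation. The real script may call into src/ instead of shelling out.
import subprocess
import sys

for script in ("src/train.py", "src/evaluate.py"):
    subprocess.run([sys.executable, script], check=True)
```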
# Dockerized Support
This project is fully containerized for portability and reproducibility.
## Docker Dependencies
Before using Docker, you need to have the following installed locally on your system:
- [Docker Engine](https://docs.docker.com/get-started/get-docker/)
- [Docker Compose](https://docs.docker.com/compose/install/)

✅ Note: These tools are required only if you want to run the project in a containerized environment. If you're using Conda, Docker is optional.
## How to Run
To build the image and run the project inside a container:
```bash
docker-compose up --build
```

This will:
- Build the Docker image using the included Dockerfile
- Run the run_all.py pipeline (training + evaluation)
- Save the best trained model in the `models/` directory
- Save plots and metrics in the `reports/` directory

✅ Note: Both `models/` and `reports/` are mounted to your host machine, so your outputs are preserved outside the container.
# Testing
The project includes a complete test suite using [pytest](https://docs.pytest.org/en/stable/). Tests use temporary directories, mock inputs, and validate expected outputs including saved models and plots.
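For orientation, here is a hedged sketch of what a unit test in this style can look like; it is not taken from `tests/`, and the model shape and file name are assumptions.

```python
# Illustrative unit test: build a tiny stand-in MLP, save it into pytest's
# temporary directory, reload it, and check the round trip. Not copied from
# this repo's tests/ suite.
from tensorflow import keras

def test_saved_model_roundtrip(tmp_path):
    model = keras.Sequential([
        keras.layers.Input(shape=(4,)),
        keras.layers.Dense(8, activation="relu"),
        keras.layers.Dense(3, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

    path = tmp_path / "best_mlp.h5"   # tmp_path keeps test outputs isolated
    model.save(str(path))
    reloaded = keras.models.load_model(str(path))

    assert reloaded.count_params() == model.count_params()
```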
## Run all tests
```bash
pytest
```

This will automatically discover and run:
- Unit tests (`tests/unit/`)
- Integration tests (`tests/integration/`)
- End-to-End tests (`tests/e2e/`)

## Run a specific group
```bash
pytest tests/unit/
```