# MetaPerceptron: Unleashing the Power of Metaheuristic-optimized Multi-Layer Perceptron - A Python Library

https://github.com/thieu1995/metaperceptron
- Host: GitHub
- URL: https://github.com/thieu1995/metaperceptron
- Owner: thieu1995
- License: gpl-3.0
- Created: 2023-08-08T12:04:48.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-11-04T07:29:33.000Z (over 1 year ago)
- Last Synced: 2024-11-04T07:46:45.991Z (over 1 year ago)
- Topics: adagrad-approach, adam-optimizer, adelta-optimizer, classification-models, genetic-algorithm, global-search, gradient-free-based-multi-layer-perceptron, metaheuristic-algorithms, metaheuristic-based-multi-layer-perceptron, metaheuristics, mlp, multi-layer-perceptron, nature-inspired-optimization, neural-network, particle-swarm-optimization, regression-models, sgd-optimizer, whale-optimization-algorithm
- Language: Python
- Homepage: https://metaperceptron.readthedocs.org
- Size: 330 KB
- Stars: 7
- Watchers: 2
- Forks: 1
- Open Issues: 0
- Metadata Files:
- Readme: README.md
- Changelog: ChangeLog.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
README
---
[Releases](https://github.com/thieu1995/MetaPerceptron/releases) ·
[PyPI](https://pypi.python.org/pypi/metaperceptron) ·
[Version](https://badge.fury.io/py/metaperceptron) ·
[Downloads](https://pepy.tech/project/metaperceptron) ·
[Build](https://github.com/thieu1995/metaperceptron/actions/workflows/publish-package.yml) ·
[Docs](https://metaperceptron.readthedocs.io/en/latest/?badge=latest) ·
[Chat](https://t.me/+fRVCJGuGJg1mNDg1) ·
[DOI](https://zenodo.org/doi/10.5281/zenodo.10251021) ·
[License: GPL-3.0](https://www.gnu.org/licenses/gpl-3.0)
`MetaPerceptron` (Metaheuristic-optimized Multi-Layer Perceptron) is a powerful and extensible Python library that
brings together the best of two worlds: metaheuristic optimization and deep learning via the Multi-Layer Perceptron (MLP).
Whether you are working with classic gradient-descent techniques or state-of-the-art metaheuristic algorithms
such as GA, PSO, WOA, and DE, `MetaPerceptron` has you covered. With `MetaPerceptron`, you can perform model search,
feature selection, and hyperparameter tuning through a Scikit-Learn-compatible interface.
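To make the core idea concrete, here is a minimal, self-contained sketch of what "metaheuristic-trained MLP" means: a genetic algorithm searches the network's flattened weight vector instead of backpropagation. This is an illustration in plain NumPy, not the library's implementation; the network shape, fitness function, and GA operators are all assumptions chosen for brevity.

```python
import numpy as np

rng = np.random.default_rng(42)

# Tiny MLP: 2 inputs -> 4 hidden units (tanh) -> 1 output (sigmoid)
SHAPES = [(2, 4), (4,), (4, 1), (1,)]
N_PARAMS = sum(int(np.prod(s)) for s in SHAPES)  # 17 weights + biases

def unpack(vec):
    """Slice a flat parameter vector back into weight/bias arrays."""
    params, i = [], 0
    for s in SHAPES:
        n = int(np.prod(s))
        params.append(vec[i:i + n].reshape(s))
        i += n
    return params

def forward(vec, X):
    W1, b1, W2, b2 = unpack(vec)
    h = np.tanh(X @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

# Toy task: XOR, which a linear model cannot solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0.0], [1.0], [1.0], [0.0]])

def loss(vec):
    return float(np.mean((forward(vec, X) - y) ** 2))

# Genetic algorithm over the flattened weight vector:
# elitism + uniform crossover + Gaussian mutation, no gradients anywhere.
pop = rng.normal(0.0, 1.0, size=(30, N_PARAMS))
for gen in range(300):
    fitness = np.array([loss(ind) for ind in pop])
    pop = pop[np.argsort(fitness)]        # best individuals first
    elite = pop[:10]                      # survivors
    children = []
    while len(children) < 20:
        a = elite[rng.integers(10)]
        b = elite[rng.integers(10)]
        mask = rng.random(N_PARAMS) < 0.5           # uniform crossover
        child = np.where(mask, a, b)
        child = child + rng.normal(0.0, 0.1, N_PARAMS)  # mutation
        children.append(child)
    pop = np.vstack([elite, children])

best = min(pop, key=loss)
print(round(loss(best), 4))  # MSE well below the 0.25 chance level
```

The same pattern generalizes to any population-based metaheuristic: only the "propose new candidate weights" step changes, which is why libraries like this one can expose hundreds of optimizer variants behind a single estimator interface.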
## Features at a Glance
- **Estimators**: `MlpRegressor`, `MlpClassifier`, `MhaMlpRegressor`, `MhaMlpClassifier`
- **Utilities**: `MhaMlpTuner`, `MhaMlpComparator`
- **Model Zoo**:
  - 200+ metaheuristic-trained MLP regressors
  - 200+ metaheuristic-trained MLP classifiers
  - 12 gradient-descent-trained MLP regressors
  - 12 gradient-descent-trained MLP classifiers
- **67+ performance metrics** (47 for regression, 20 for classification)
- **Support**: GPU support (for gradient-descent-based models), Scikit-Learn-compatible API
- **Documentation**: https://metaperceptron.readthedocs.io
- **Python**: 3.8+
- **Dependencies**: numpy, scipy, scikit-learn, pytorch, mealpy, pandas, permetrics
## Citation
If MetaPerceptron supports your work, please consider citing the following:
```bibtex
@article{van2025metaperceptron,
  title={MetaPerceptron: A Standardized Framework for Metaheuristic-Driven Multi-Layer Perceptron Optimization},
  author={Van Thieu, Nguyen and Mirjalili, Seyedali and Garg, Harish and Hoang, Nguyen Thanh},
  journal={Computer Standards \& Interfaces},
  pages={103977},
  year={2025},
  publisher={Elsevier},
  doi={10.1016/j.csi.2025.103977},
  url={https://doi.org/10.1016/j.csi.2025.103977}
}

@article{van2023mealpy,
  title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
  author={Van Thieu, Nguyen and Mirjalili, Seyedali},
  journal={Journal of Systems Architecture},
  year={2023},
  publisher={Elsevier},
  doi={10.1016/j.sysarc.2023.102871}
}

@article{van2023groundwater,
  title={Groundwater level modeling using Augmented Artificial Ecosystem Optimization},
  author={Van Thieu, Nguyen and Barma, Surajit Deb and Van Lam, To and Kisi, Ozgur and Mahesha, Amai},
  journal={Journal of Hydrology},
  volume={617},
  pages={129034},
  year={2023},
  publisher={Elsevier},
  doi={10.1016/j.jhydrol.2022.129034}
}
```
## Quick Start
Install the latest version using pip:
```bash
pip install metaperceptron
```
After that, check the version to ensure successful installation:
```python
import metaperceptron
print(metaperceptron.__version__)
```
### Import core components
Here is how to import all of the classes provided by `MetaPerceptron`:
```python
from metaperceptron import DataTransformer, Data
from metaperceptron import MhaMlpRegressor, MhaMlpClassifier, MlpRegressor, MlpClassifier
from metaperceptron import MhaMlpTuner, MhaMlpComparator
```
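The `DataTransformer` imported above chains multiple scaling methods (the example below passes `scaling_methods=("standard", "minmax")`). As a point of reference only, assuming that chaining is what the parameter means (the library's internals may differ), the same effect can be sketched with a scikit-learn `Pipeline` of scalers:

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Chain z-score scaling followed by min-max scaling; a rough analogue of
# DataTransformer(scaling_methods=("standard", "minmax")), not its source.
scaler = Pipeline([
    ("standard", StandardScaler()),
    ("minmax", MinMaxScaler()),
])

X = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_scaled = scaler.fit_transform(X)
print(X_scaled.min(), X_scaled.max())  # each column ends up in [0, 1]
```

Like the pipeline above, the transformer is fit on the training split only and then applied to the test split, which is exactly the `fit_transform` / `transform` pattern used in the next example.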
### Example: Training an MLP Classifier with a Genetic Algorithm
In this tutorial, we use a Genetic Algorithm to train a Multi-Layer Perceptron for a classification task.
For more complex examples and use cases, please check the folder [examples](examples).
```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from metaperceptron import DataTransformer, MhaMlpClassifier
## Load the dataset
X, y = load_iris(return_X_y=True)
## Split train and test
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
## Scale dataset with two methods: standard and minmax
dt = DataTransformer(scaling_methods=("standard", "minmax"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)
## Define Genetic Algorithm-trained Multi-Layer Perceptron
model = MhaMlpClassifier(hidden_layers=(50, 15), act_names="Tanh",
                         dropout_rates=None, act_output=None,
                         optim="BaseGA", optim_params={"epoch": 100, "pop_size": 20, "name": "GA"},
                         obj_name="F1S", seed=42, verbose=True)
## Train the model
model.fit(X=X_train_scaled, y=y_train)
## Test the model
y_pred = model.predict(X_test_scaled)
print(y_pred)
## Print the score
print(model.score(X_test_scaled, y_test))
## Calculate some metrics
print(model.evaluate(y_true=y_test, y_pred=y_pred, list_metrics=["AS", "PS", "RS", "F2S", "CKS", "FBS"]))
```
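Because the estimators advertise a Scikit-Learn-compatible API, they should compose with standard model-selection utilities such as `cross_val_score` and `Pipeline`. The sketch below shows the pattern using scikit-learn's own gradient-based `MLPClassifier` as a runnable stand-in, so it works without `metaperceptron` installed; that `MhaMlpClassifier` can be dropped into the same slot is the assumption here, not something this snippet verifies.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)

# Stand-in estimator; in principle, swap in a configured MhaMlpClassifier
# here to cross-validate a metaheuristic-trained network the same way.
clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(50, 15), max_iter=1000, random_state=42),
)

# Scaling happens inside the pipeline, so each CV fold is fit cleanly.
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

Wrapping the scaler and model in one pipeline also avoids the leakage pitfall of scaling before splitting, since each fold refits the scaler on its own training portion.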
## Support
- [Source Code](https://github.com/thieu1995/MetaPerceptron)
- [Documentation](https://metaperceptron.readthedocs.io/)
- [PyPI Releases](https://pypi.org/project/metaperceptron/)
- [Report Issues](https://github.com/thieu1995/MetaPerceptron/issues)
- [Changelog](https://github.com/thieu1995/MetaPerceptron/blob/master/ChangeLog.md)
- [Chat Group](https://t.me/+fRVCJGuGJg1mNDg1)
---
Developed by: [Thieu](mailto:nguyenthieu2102@gmail.com?Subject=MetaPerceptron_QUESTIONS) @ 2025