# MetaPerceptron: Unleashing the Power of Metaheuristic-optimized Multi-Layer Perceptron - A Python Library

https://github.com/thieu1995/metaperceptron
- Host: GitHub
- URL: https://github.com/thieu1995/metaperceptron
- Owner: thieu1995
- License: gpl-3.0
- Created: 2023-08-08T12:04:48.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2024-11-04T07:29:33.000Z (7 months ago)
- Last Synced: 2024-11-04T07:46:45.991Z (7 months ago)
- Topics: adagrad-approach, adam-optimizer, adelta-optimizer, classification-models, genetic-algorithm, global-search, gradient-free-based-multi-layer-perceptron, metaheuristic-algorithms, metaheuristic-based-multi-layer-perceptron, metaheuristics, mlp, multi-layer-perceptron, nature-inspired-optimization, neural-network, particle-swarm-optimization, regression-models, sgd-optimizer, whale-optimization-algorithm
- Language: Python
- Homepage: https://metaperceptron.readthedocs.org
- Size: 330 KB
- Stars: 7
- Watchers: 2
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: ChangeLog.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
# README
MetaPerceptron (Metaheuristic-optimized Multi-Layer Perceptron) is a Python library that implements the traditional Multi-Layer Perceptron together with its variants. These include metaheuristic-optimized MLP models (GA, PSO, WOA, TLO, DE, ...) and gradient descent-optimized MLP models (SGD, Adam, Adelta, Adagrad, ...). It provides a comprehensive list of optimizers for training MLP models and is compatible with the Scikit-Learn library, so you can perform model search and hyperparameter tuning with Scikit-Learn's own tools (see the sketch after the feature list below).

* **Free software:** GNU General Public License (GPL) V3 license
* **Provided Estimator**: `MlpRegressor`, `MlpClassifier`, `MhaMlpRegressor`, `MhaMlpClassifier`
* **Provided Utility**: `MhaMlpTuner` and `MhaMlpComparator`
* **Total Metaheuristic-trained MLP Regressor**: > 200 Models
* **Total Metaheuristic-trained MLP Classifier**: > 200 Models
* **Total Gradient Descent-trained MLP Regressor**: 12 Models
* **Total Gradient Descent-trained MLP Classifier**: 12 Models
* **Supported performance metrics**: >= 67 (47 for regression and 20 for classification)
* **Documentation:** https://metaperceptron.readthedocs.io
* **Python versions:** >= 3.8.x
* **Dependencies:** numpy, scipy, scikit-learn, pytorch, mealpy, pandas, permetrics
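
Because the estimators expose a Scikit-Learn-style interface, a hyperparameter search can be wrapped around them with standard Scikit-Learn tooling. The snippet below is a minimal sketch, not taken from the official examples: it assumes `MhaMlpClassifier` is fully compatible with `GridSearchCV` (i.e., clonable via `get_params`/`set_params`) and reuses the constructor arguments shown in the tutorial further down; adjust the names to match your installed version.

```python
# Hedged sketch: grid search over a metaheuristic-trained MLP classifier.
# Assumes MhaMlpClassifier behaves as a standard Scikit-Learn estimator.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from metaperceptron import MhaMlpClassifier

X, y = load_iris(return_X_y=True)

# Base estimator; constructor arguments copied from the tutorial below.
model = MhaMlpClassifier(hidden_layers=(30,), act_names="Tanh", dropout_rates=None, act_output=None,
                         optim="BaseGA", optim_paras={"epoch": 50, "pop_size": 20},
                         obj_name="F1S", seed=42, verbose=False)

# Candidate architectures and GA settings to try (illustrative values only).
param_grid = {
    "hidden_layers": [(30,), (50, 15)],
    "optim_paras": [{"epoch": 50, "pop_size": 20}, {"epoch": 100, "pop_size": 30}],
}

search = GridSearchCV(model, param_grid, cv=3, scoring="f1_macro")
search.fit(X, y)
print(search.best_params_)
print(search.best_score_)
```

Note that the library also ships its own `MhaMlpTuner` and `MhaMlpComparator` utilities for this purpose; see the documentation for their exact interfaces.
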
# Citation Request

If you want to understand how metaheuristics are applied to the Multi-Layer Perceptron, please read the paper titled **"Let a biogeography-based optimizer train your Multi-Layer Perceptron"**, available at this [link](https://doi.org/10.1016/j.ins.2014.01.038).

Please include these citations if you plan to use this library:
```bibtex
@software{nguyen_van_thieu_2023_10251022,
author = {Nguyen Van Thieu},
title = {MetaPerceptron: A Standardized Framework for Metaheuristic-Trained Multi-Layer Perceptron},
month = dec,
year = 2023,
publisher = {Zenodo},
doi = {10.5281/zenodo.10251021},
url = {https://github.com/thieu1995/MetaPerceptron}
}

@article{van2023mealpy,
title={MEALPY: An open-source library for latest meta-heuristic algorithms in Python},
author={Van Thieu, Nguyen and Mirjalili, Seyedali},
journal={Journal of Systems Architecture},
year={2023},
publisher={Elsevier},
doi={10.1016/j.sysarc.2023.102871}
}

@article{van2023groundwater,
title={Groundwater level modeling using Augmented Artificial Ecosystem Optimization},
author={Van Thieu, Nguyen and Barma, Surajit Deb and Van Lam, To and Kisi, Ozgur and Mahesha, Amai},
journal={Journal of Hydrology},
volume={617},
pages={129034},
year={2023},
publisher={Elsevier}
}

@article{thieu2019efficient,
title={Efficient time-series forecasting using neural network and opposition-based coral reefs optimization},
author={Nguyen, Thieu and Nguyen, Tu and Nguyen, Binh Minh and Nguyen, Giang},
journal={International Journal of Computational Intelligence Systems},
volume={12},
number={2},
pages={1144--1161},
year={2019}
}
```
# Simple Tutorial
* Install the [current PyPI release](https://pypi.python.org/pypi/metaperceptron):
```sh
$ pip install metaperceptron==2.0.0
```

* Check the version:
```sh
$ python
>>> import metaperceptron
>>> metaperceptron.__version__
```

* Here is how you can import all provided classes from `MetaPerceptron`:
```python
from metaperceptron import DataTransformer, Data
from metaperceptron import MhaMlpRegressor, MhaMlpClassifier, MlpRegressor, MlpClassifier
from metaperceptron import MhaMlpTuner, MhaMlpComparator
```

* In this tutorial, we will use a Genetic Algorithm to train a Multi-Layer Perceptron network on a classification task. For more complex examples and use cases, please check the folder [examples](examples).

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from metaperceptron import DataTransformer, MhaMlpClassifier

## Load the dataset
X, y = load_iris(return_X_y=True)

## Split train and test
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)

## Scale dataset with two methods: standard and minmax
dt = DataTransformer(scaling_methods=("standard", "minmax"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)

## Define Genetic Algorithm-trained Multi-Layer Perceptron
opt_paras = {"epoch": 100, "pop_size": 20}
model = MhaMlpClassifier(hidden_layers=(50, 15), act_names="Tanh", dropout_rates=None, act_output=None,
                         optim="BaseGA", optim_paras=opt_paras, obj_name="F1S", seed=42, verbose=True)

## Train the model
model.fit(X=X_train_scaled, y=y_train)

## Test the model on the scaled test set
y_pred = model.predict(X_test_scaled)
print(y_pred)

## Print the score
print(model.score(X_test_scaled, y_test))

## Calculate some metrics
print(model.evaluate(y_true=y_test, y_pred=y_pred, list_metrics=["AS", "PS", "RS", "F2S", "CKS", "FBS"]))
```
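
The same workflow applies to regression. Below is a minimal sketch, not taken from the official examples: it assumes `MhaMlpRegressor` accepts the same constructor arguments as `MhaMlpClassifier` and that `"RMSE"` is an accepted `obj_name`; check the documentation for the metric names and defaults supported by your installed version.

```python
# Hedged sketch: Genetic Algorithm-trained MLP for regression.
# Assumes MhaMlpRegressor mirrors the MhaMlpClassifier constructor shown above.
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from metaperceptron import DataTransformer, MhaMlpRegressor

## Load a regression dataset and split it
X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

## Scale the features with the same two-step scaling as the classification example
dt = DataTransformer(scaling_methods=("standard", "minmax"))
X_train_scaled = dt.fit_transform(X_train)
X_test_scaled = dt.transform(X_test)

## Define and train the regressor ("RMSE" as objective is an assumption)
regressor = MhaMlpRegressor(hidden_layers=(50, 15), act_names="Tanh", dropout_rates=None, act_output=None,
                            optim="BaseGA", optim_paras={"epoch": 100, "pop_size": 20},
                            obj_name="RMSE", seed=42, verbose=False)
regressor.fit(X=X_train_scaled, y=y_train)

## Evaluate on the held-out test set
print(regressor.score(X_test_scaled, y_test))
```
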
# Support (questions, problems)

### Official Links
* Official source code repo: [link](https://github.com/thieu1995/MetaPerceptron)
* Official document: [link](https://metaperceptron.readthedocs.io/)
* Download releases: [link](https://pypi.org/project/metaperceptron/)
* Issue tracker: [link](https://github.com/thieu1995/MetaPerceptron/issues)
* Notable changes log: [link](https://github.com/thieu1995/MetaPerceptron/blob/master/ChangeLog.md)
* Official chat group: [link](https://t.me/+fRVCJGuGJg1mNDg1)