Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/jamormoussa/firefly-hyperparameter-tuning-for-word2vec-embedding-model
The FireFly algorithm is a metaheuristic inspired by the flashing behavior of fireflies.
- Host: GitHub
- URL: https://github.com/jamormoussa/firefly-hyperparameter-tuning-for-word2vec-embedding-model
- Owner: JamorMoussa
- Created: 2024-05-08T12:47:45.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2024-05-29T12:04:01.000Z (7 months ago)
- Last Synced: 2024-12-16T04:17:11.928Z (9 days ago)
- Language: Jupyter Notebook
- Homepage:
- Size: 2.7 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
README
# FireFly Hyperparameter Tuning for Word2Vec Embedding Model
## Motivation
**Hyperparameter tuning** is a critical step in developing deep learning models. This project uses the **FireFly Algorithm** to perform hyperparameter tuning for a Word2Vec embedding model, leveraging metaheuristic optimization to search for the best-performing hyperparameters.
## FireFly Algorithm
In this project, we provide an implementation of the FireFly Algorithm with some additional features, such as:
- Ensuring that parameters remain within specified ranges.
- Giving different parameters different scales, respecting their nature as described by the problem.

### Example
Below is an example demonstrating how to use the FireFly Algorithm for optimization:
```python
from firefly import FireFlyOptimizer, FireFlyConfig, FireFlyParameterBounder

def fitness(x):
    return x[0]**2 + x[1]**2

configs = FireFlyConfig.get_defaults()
configs.pop_size = 100
configs.max_iters = 10
print(configs)

bounder = FireFlyParameterBounder(bounds=[(-5, 5), (-5, 5)])
print(bounder)

FA = FireFlyOptimizer(config=configs, bounder=bounder)
FA.run(func=fitness, dim=2)

print("Best Fitness:", FA.best_intensity)
print("Best position:", FA.best_pos)
```

```bash
➜ FireFly-Optimizer-Deep-Learning git:(main) ✗ python3 firefly_example.py
FireFlyConfig(pop_size=100, alpha=1.0, beta0=1.0, gamma=0.01, max_iters=10, seed=None)
FireFlyParameterBounder(bounds=[(-5, 5), (-5, 5)])
Best Fitness: 0.009908765479110442
Best position: [0.06986411 0.07090678]
```

## Hyperparameter Tuning With FireFly
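For intuition about what the optimizer is doing under the hood, the classic firefly loop, including the bounds clipping described earlier, can be sketched independently in plain NumPy. This is a generic re-implementation of the textbook algorithm, not this repository's `firefly` module; the function name, defaults, and alpha-decay schedule are all illustrative assumptions:

```python
import numpy as np

def firefly_minimize(func, bounds, pop_size=25, alpha=0.5, beta0=1.0,
                     gamma=0.01, max_iters=60, seed=0):
    """Generic firefly search over a box: brighter (lower-loss) fireflies
    attract the others; a decaying random walk keeps exploration going."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T  # bounds given as (low, high) per dimension
    dim = len(bounds)
    pos = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([func(p) for p in pos])

    for _ in range(max_iters):
        for i in range(pop_size):
            for j in range(pop_size):
                if fit[j] < fit[i]:  # firefly j is brighter: move i toward j
                    r2 = np.sum((pos[i] - pos[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attraction fades with distance
                    pos[i] += beta * (pos[j] - pos[i]) + alpha * (rng.random(dim) - 0.5)
                    pos[i] = np.clip(pos[i], lo, hi)  # keep parameters inside their ranges
                    fit[i] = func(pos[i])
        alpha *= 0.95  # shrink the random walk as the swarm settles
    best = int(np.argmin(fit))
    return pos[best], fit[best]

best_pos, best_fit = firefly_minimize(lambda x: x[0] ** 2 + x[1] ** 2,
                                      bounds=[(-5, 5), (-5, 5)])
print("Best fitness:", best_fit)
print("Best position:", best_pos)
```

The decaying `alpha` is one common variant: a constant random-walk term would keep the swarm from settling near the minimum.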
To use FireFly as an optimizer for hyperparameter tuning, run the following command with the desired arguments:
```bash
➜ python3 hyperparam.py --popsize=5 --alpha=1.0 --beta0=1.0 --gamma=0.01 --maxiters=5
```

## Train CBoW Model
After obtaining the optimal hyperparameters with the previous script, let's train a CBoW embedding model with these parameters:
```bash
➜ python3 train.py --lr=0.0066 --beta1=0.899 --beta2=0.9989 --windowsize=1 --embdim=2 --epochs=50
```
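`train.py` itself is not reproduced in this README. As a rough, independent toy sketch of what a CBoW model does — predict a center word from the average of its context embeddings — the core training loop can be written in plain NumPy. The corpus, names, and values below are illustrative assumptions (only `window = 1` and the embedding dimension of 2 mirror the flags above), not the repository's code:

```python
import numpy as np

# Toy corpus and vocabulary (illustrative only).
corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
word2idx = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 2, 1  # vocab size, embedding dim, context window

# Build (context, target) pairs: CBoW predicts the center word from its neighbors.
pairs = []
for t in range(window, len(corpus) - window):
    ctx = [word2idx[corpus[t + o]] for o in range(-window, window + 1) if o != 0]
    pairs.append((ctx, word2idx[corpus[t]]))

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (context) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output projection
lr = 0.1

losses = []
for epoch in range(50):
    total = 0.0
    for ctx, target in pairs:
        h = W_in[ctx].mean(axis=0)            # average the context embeddings
        logits = h @ W_out
        p = np.exp(logits - logits.max())
        p /= p.sum()                          # softmax over the vocabulary
        total += -np.log(p[target])           # cross-entropy loss
        g = p
        g[target] -= 1.0                      # gradient w.r.t. logits: p - one_hot
        gh = W_out @ g                        # gradient w.r.t. the hidden average
        W_out -= lr * np.outer(h, g)
        W_in[ctx] -= lr * gh / len(ctx)
    losses.append(total / len(pairs))

print("first epoch loss:", losses[0])
print("last epoch loss:", losses[-1])
```

A real run would of course use the tuned learning rate and Adam betas from the command above rather than plain SGD; this sketch only shows the forward/backward structure of CBoW.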