An example mate project
- Host: GitHub
- URL: https://github.com/ilex-paraguariensis/examples
- Owner: ilex-paraguariensis
- License: mit
- Created: 2022-08-15T19:22:28.000Z (about 2 years ago)
- Default Branch: main
- Last Pushed: 2022-11-12T21:45:52.000Z (almost 2 years ago)
- Last Synced: 2024-10-09T12:54:20.923Z (28 days ago)
- Language: TypeScript
- Size: 22.5 MB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
- Metadata Files:
  - Readme: README.md
  - License: LICENSE
README
## Examples with [Maté 🧉](https://github.com/ilex-paraguariensis/yerbamate/tree/v2)
## Install locally
First install mate by:
```bash
git clone https://github.com/ilex-paraguariensis/yerbamate -b v2
```
Then move to the yerbamate directory and run the install script:
```bash
cd yerbamate
python install.py
```
Then install the requirements:
```bash
pip install -r requirements.txt
```
Then clone this repo:
```bash
git clone https://github.com/ilex-paraguariensis/vision -b v2
```
And test that everything is working by running:
```bash
mate list models
```
Should output:
```bash
vit
lightning
jax
keras
```

## Running the project
To run the project, you can use Mate to run different configurations. Look at `resnet/hyperparameters/vanilla.json` and `vit/hyperparameters/vanilla.json` for example configurations. Any configuration file can be selected for training. To train a model, run:
```bash
mate train {model_name} {hyperparameter_file}
```
where `{model_name}` can be any model, e.g., `resnet` or `vit`, and `{hyperparameter_file}` is the name of the hyperparameter file, which also names the experiment.
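To make the mechanics concrete, here is a small, hypothetical sketch of how arguments from such a JSON file can be turned into PyTorch objects by name; the keys and values are invented for illustration and are not this repo's actual schema:

```python
import torch

# Hypothetical config contents -- placeholder keys/values, not the
# actual schema of the JSON files in this repo.
config = {"optimizer": "Adam", "lr": 3e-4}

model = torch.nn.Linear(10, 10)  # stand-in for a real model

# Look the optimizer class up by name in torch.optim and instantiate
# it from the config arguments.
optimizer_cls = getattr(torch.optim, config["optimizer"])
optimizer = optimizer_cls(model.parameters(), lr=config["lr"])
```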
## Logging

By default, the project uses [Weights and Biases](https://wandb.ai/) to log the training process. You can also select any PyTorch Lightning logger, e.g., `TensorBoardLogger` or `CSVLogger`. See `/vit/hyperparameters/tensorboard.json` for an example.
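As a rough sketch of what choosing a different logger amounts to on the PyTorch Lightning side (the `save_dir` and `name` values below are made-up placeholders, not code from this repo):

```python
import pytorch_lightning as pl
from pytorch_lightning.loggers import CSVLogger, TensorBoardLogger

# Roughly what selecting TensorBoardLogger in a hyperparameter file
# maps to; save_dir and name are illustrative placeholders.
logger = TensorBoardLogger(save_dir="logs", name="vit_vanilla")

# Or a CSVLogger for plain-text metrics instead:
# logger = CSVLogger(save_dir="logs", name="vit_vanilla")

trainer = pl.Trainer(logger=logger, max_epochs=10)
```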
## Training

You can select any combination of your models and hyperparameters, for example:
```bash
mate train vit cifar100 # train vit on cifar100
mate train resnet fine_tune # fine tune a resnet trained on imagenet on cifar
mate train vit small_datasets # model from Vision Transformer for Small-Size Datasets paper
mate train vit vanilla # original ViT paper: An Image is Worth 16x16 Words
```

You can then restart the training with the same configuration by running:
```bash
mate restart vit vanilla
```
## Experimenting and trying other models
You can try other models by changing the model in the hyperparameter file or by creating a new configuration file. Over 30 ViTs are available to experiment with. You can also fork the vit models and change the source code as you wish:
```bash
mate clone vit awesome_vit
```
Then, change the models in `project/models/awesome_vit` and keep on experimenting.

## Customizing the hyperparameters
You can customize the hyperparameters by editing the hyperparameter file. For example, you can change the model, learning rate, batch size, optimizer, etc. The project is not limited to the CIFAR datasets: by adding a PyTorch Lightning `DataModule`, you can train on any dataset. Optimizers, trainers, models, and PyTorch Lightning modules are created directly from the arguments in the configuration file and the corresponding PyTorch packages.
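For instance, a minimal `LightningDataModule` could look like the sketch below; the class name, dataset (MNIST), and paths are illustrative, not part of this repo:

```python
import pytorch_lightning as pl
from torch.utils.data import DataLoader
from torchvision import transforms
from torchvision.datasets import MNIST

# Hedged sketch of a minimal LightningDataModule; dataset choice,
# class name, and paths are placeholders, not this repo's code.
class MNISTDataModule(pl.LightningDataModule):
    def __init__(self, data_dir: str = "./data", batch_size: int = 64):
        super().__init__()
        self.data_dir = data_dir
        self.batch_size = batch_size
        self.transform = transforms.ToTensor()

    def prepare_data(self):
        # Download once; runs on a single process.
        MNIST(self.data_dir, train=True, download=True)
        MNIST(self.data_dir, train=False, download=True)

    def setup(self, stage=None):
        self.train_set = MNIST(self.data_dir, train=True, transform=self.transform)
        self.test_set = MNIST(self.data_dir, train=False, transform=self.transform)

    def train_dataloader(self):
        return DataLoader(self.train_set, batch_size=self.batch_size, shuffle=True)

    def test_dataloader(self):
        return DataLoader(self.test_set, batch_size=self.batch_size)
```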
## Special thanks

Special thanks to the legend lucidrains for the [vit-pytorch](https://github.com/lucidrains/vit-pytorch) library. His license applies to the ViT models in this project.