# QuickTune
## Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How [ICLR2024]

This repo contains the code for reproducing the main experiments in the **QuickTune** [paper](https://openreview.net/forum?id=tqh1zdXIra).

![Architecture](figures/figure.svg)

## Prepare environment

Create the environment and install the requirements:

```bash
conda create -n quick_tune python=3.9
conda activate quick_tune
pip install -r requirements_qt.txt
```

Install the pinned torch and gpytorch versions:

```bash
conda install pytorch==1.12.1 torchvision==0.13.1 torchaudio==0.12.1 cudatoolkit=10.2 -c pytorch
conda install gpytorch -c gpytorch
```

## Finetune a pipeline (fixed hyperparameters)

You can download a dataset and fine-tune a pipeline on it. In this example we use a dataset from Meta-Album; the learning curves in the meta-dataset were generated the same way.

```bash
mkdir data && cd data
mkdir mtlbm && cd mtlbm
wget https://rewind.tf.uni-freiburg.de/index.php/s/pGyowo3WBp7f33S/download/PLT_VIL_Micro.zip
unzip PLT_VIL_Micro.zip
```

From the root folder, you can fine-tune the network, passing any hyperparameters as follows:

```bash
mkdir output
python finetune.py data --model dla46x_c \
    --pct_to_freeze 0.8 \
    --dataset "mtlbm/PLT_VIL_Micro" \
    --train-split train \
    --val-split val \
    --experiment test_experiment \
    --output output \
    --pretrained \
    --num_classes 20 \
    --epochs 50
```

## Run Quick-Tune on the meta-dataset

Download the QuickTune meta-dataset:

```bash
mkdir data && cd data
wget https://rewind.tf.uni-freiburg.de/index.php/s/oMxC5sfrkA53ESo/download/qt_metadataset.zip
unzip qt_metadataset.zip
```

Run the examples on the meta-dataset:

```bash
mkdir output
# quicktune on micro
./bash_scripts/run_micro.sh
# quicktune on mini
./bash_scripts/run_mini.sh
# quicktune on extended
./bash_scripts/run_extended.sh

# generate the plot for an experiment;
# the plots are saved automatically in a folder called "plots"
python plots_generation/plot_results_benchmark.py --experiment_id qt_micro
```

## Run on a new dataset

For *quick-tuning* on a new dataset, you can use the following examples as a reference. They run QuickTune on *Imagenette2-320* and *iNaturalist*.

```bash
# example on imagenette2-320
cd data
wget https://s3.amazonaws.com/fast-ai-imageclas/imagenette2-320.tgz
tar -xvzf imagenette2-320.tgz
cd ..  # back to the root folder

# before this, run quicktune on mini (above) to create the optimizer
./bash_scripts/run_imagenette.sh

# before this, run quicktune on extended (above) to create the optimizer
./bash_scripts/run_inaturalist.sh

# generate the plots and save them in a folder called "plots"
python plots_generation/plots_results_user_interface.py
```

If you use any other dataset, make sure it is stored in a format accepted by the timm library, and pass the dataset descriptors to the run as shown in the example bash scripts.

## Query the QuickTune meta-dataset

If you want to query the meta-dataset, follow [this tutorial](example_query_metadataset.ipynb). Make sure `ipykernel` is installed.

## QuickTuneTool (QTT)

If you are interested in applying QTT to real image-classification datasets, we suggest trying our package [QuickTuneTool](https://github.com/automl/QTT). We plan to extend it to other modalities.

## Citation

You can cite our work as follows:

```bib
@inproceedings{
arango2024quicktune,
title={Quick-Tune: Quickly Learning Which Pretrained Model to Finetune and How},
author={Sebastian Pineda Arango and Fabio Ferreira and Arlind Kadra and Frank Hutter and Josif Grabocka},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=tqh1zdXIra}
}
```
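For reference, a minimal sketch of the image-folder layout that timm's folder-based dataset reader accepts: one subdirectory per split, each containing one subdirectory per class. The dataset name `my_dataset` and the class names below are made up for illustration; your own splits and classes go in their place.

```bash
# Hypothetical timm-compatible folder layout: split dirs (train/val)
# containing one subdirectory per class.
mkdir -p data/my_dataset/{train,val}/{cats,dogs}

# Images then go under each class directory, e.g.:
#   data/my_dataset/train/cats/img_0001.jpg
#   data/my_dataset/val/dogs/img_0042.jpg

# Show the resulting directory tree.
find data/my_dataset -type d | sort
```

The split directory names are what you would pass as `--train-split` and `--val-split` when fine-tuning.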