https://github.com/speediedan/finetuning-scheduler
A PyTorch Lightning extension that accelerates and enhances foundation model experimentation with flexible fine-tuning schedules.
- Host: GitHub
- URL: https://github.com/speediedan/finetuning-scheduler
- Owner: speediedan
- License: apache-2.0
- Created: 2022-02-04T19:08:54.000Z (about 3 years ago)
- Default Branch: main
- Last Pushed: 2025-04-03T18:03:21.000Z (19 days ago)
- Last Synced: 2025-04-03T19:22:22.532Z (19 days ago)
- Topics: artificial-intelligence, fine-tuning, finetuning, machine-learning, neural-networks, pytorch, pytorch-lightning, superglue, transfer-learning
- Language: Python
- Homepage: https://finetuning-scheduler.readthedocs.io
- Size: 2.59 MB
- Stars: 64
- Watchers: 3
- Forks: 6
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: .github/CONTRIBUTING.md
- License: LICENSE
- Code of conduct: .github/CODE_OF_CONDUCT.md
- Citation: CITATION.cff
- Codeowners: .github/CODEOWNERS
- Security: SECURITY.md
Awesome Lists containing this project
- awesome-llmops: finetuning-scheduler, a PyTorch Lightning extension for flexible fine-tuning schedules (listed under Training / Foundation Model Fine Tuning)
README
**A PyTorch Lightning extension that enhances model experimentation with flexible fine-tuning schedules.**
______________________________________________________________________
Docs • Setup • Examples • Community

[PyPI package](https://pypi.org/project/finetuning-scheduler/) •
[PyPI version](https://badge.fury.io/py/finetuning-scheduler) •
[Coverage](https://codecov.io/gh/speediedan/finetuning-scheduler) •
[Documentation](https://finetuning-scheduler.readthedocs.io/en/stable/) •
[DOI](https://zenodo.org/badge/latestdoi/455666112) •
[License](https://github.com/speediedan/finetuning-scheduler/blob/master/LICENSE)

______________________________________________________________________
[FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) is simple to use yet powerful, offering a number of features that facilitate model research and exploration:
- easy specification of flexible fine-tuning schedules with explicit or regex-based parameter selection (see the schedule sketch below)
- implicit schedules for initial/naive model exploration
- explicit schedules for performance tuning, fine-grained behavioral experimentation and computational efficiency
- automatic restoration of best per-phase checkpoints driven by iterative application of early-stopping criteria to each fine-tuning phase
- composition of early-stopping and manually-set epoch-driven fine-tuning phase transitions
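For example, here is a minimal sketch of an explicit schedule passed as an in-memory dict. Schedules are typically defined in YAML files; this assumes the dict form mirrors the documented YAML schedule structure, and the `model.*` parameter names are hypothetical placeholders for your own model:

```python
from finetuning_scheduler import FinetuningScheduler

# Phases are thawed in ascending order of their integer keys; "params" entries
# may be explicit parameter names or regex patterns (hypothetical names shown).
ft_schedule = {
    0: {"params": ["model.classifier.bias", "model.classifier.weight"]},
    1: {"params": ["model.pooler.dense.*"]},
    2: {"params": ["model.encoder.*"]},
}
fts_callback = FinetuningScheduler(ft_schedule=ft_schedule)
```

______________________________________________________________________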
## Setup
### Step 0: Install from PyPI
```bash
pip install finetuning-scheduler
```

Additional installation options
#### *Install Optional Packages*
#### To install additional packages required for examples:
```bash
pip install finetuning-scheduler['examples']
```

#### or to include packages for examples, development and testing:
```bash
pip install finetuning-scheduler['all']
```

#### *Source Installation Examples*
#### To install from (editable) source (includes docs as well):
```bash
# FTS pins Lightning to a specific commit for CI and development
# This is similar to PyTorch's approach with Triton.
export USE_CI_COMMIT_PIN="1"
git clone https://github.com/speediedan/finetuning-scheduler.git
cd finetuning-scheduler
python -m pip install -e ".[all]" -r requirements/docs.txt
```

#### Install a specific FTS version from source using the standalone `pytorch-lightning` package:
```bash
export FTS_VERSION=2.6.0
export PACKAGE_NAME=pytorch
git clone -b v${FTS_VERSION} https://github.com/speediedan/finetuning-scheduler
cd finetuning-scheduler
python -m pip install -e ".[all]" -r requirements/docs.txt
```

#### *Latest Docker Image*
Note: publishing of new version-specific `finetuning-scheduler` Docker images was paused after the `2.0.2` patch release. If new version-specific images are required, please raise an issue.

### Step 1: Import the FinetuningScheduler callback and start fine-tuning!
```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler

trainer = L.Trainer(callbacks=[FinetuningScheduler()])
```

Get started by following [the Fine-Tuning Scheduler introduction](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) which includes a [CLI-based example](https://finetuning-scheduler.readthedocs.io/en/stable/index.html#example-scheduled-fine-tuning-for-superglue) or by following the [notebook-based](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html) Fine-Tuning Scheduler tutorial.
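Because FTS drives phase transitions with early-stopping criteria and restores best per-phase checkpoints, it is typically composed with its callback subclasses. A minimal sketch (the `ft_schedule.yaml` path and `val_loss` metric are placeholder assumptions; `val_loss` must be a metric your LightningModule actually logs):

```python
import lightning as L
from finetuning_scheduler import FinetuningScheduler, FTSCheckpoint, FTSEarlyStopping

# FTSEarlyStopping and FTSCheckpoint extend Lightning's EarlyStopping and
# ModelCheckpoint callbacks with schedule-aware behavior.
trainer = L.Trainer(
    callbacks=[
        FinetuningScheduler(ft_schedule="ft_schedule.yaml"),  # omit ft_schedule for the default implicit schedule
        FTSEarlyStopping(monitor="val_loss", min_delta=0.001, patience=2),
        FTSCheckpoint(monitor="val_loss", save_top_k=1),
    ]
)
```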
______________________________________________________________________
### Installation Using the Standalone `pytorch-lightning` Package
*applicable to versions >= `2.0.0`*
Now that the core Lightning package is `lightning` rather than `pytorch-lightning`, Fine-Tuning Scheduler (FTS) by default depends upon the `lightning` package rather than the standalone `pytorch-lightning`. If you would like to continue to use FTS with the standalone `pytorch-lightning` package instead, you can still do so as follows:
Install a given FTS release (for example v2.0.0) using standalone `pytorch-lightning`:
```bash
export FTS_VERSION=2.0.0
export PACKAGE_NAME=pytorch
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
pip install finetuning-scheduler-${FTS_VERSION}.tar.gz
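# optionally verify the install (a generic Python version check, not an FTS-specific command)
python -c "import finetuning_scheduler; print(finetuning_scheduler.__version__)"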
```

### Dynamic Versioning
FTS (as of version `2.6.0`) now enables dynamic versioning both at installation time and via CLI post-installation. Initially, the dynamic versioning system allows toggling between Lightning unified and standalone imports. The two conversion operations are individually idempotent and mutually reversible.
#### Toggling Between Unified and Standalone Lightning Imports
FTS provides a simple CLI tool to easily toggle between unified and standalone import installation versions post-installation:
```bash
# Toggle from unified to standalone Lightning imports
toggle-lightning-mode --mode standalone

# Toggle from standalone to unified Lightning imports (default)
toggle-lightning-mode --mode unified
```

> **Note:** If you have the standalone package (`pytorch-lightning`) installed but not the unified package (`lightning`), toggling to unified mode will be prevented. You must install the `lightning` package first before toggling.
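Concretely, the two modes correspond to the two Lightning import styles. A quick illustration (running this requires both packages to be installed):

```python
# Unified imports (default mode): everything lives under the ``lightning`` package
import lightning as L
from lightning.pytorch.callbacks import ModelCheckpoint

# Standalone imports: the separate ``pytorch_lightning`` package
import pytorch_lightning as pl
from pytorch_lightning.callbacks import ModelCheckpoint  # shadows the unified import above
```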
Toggling can be useful when:
- You need to adapt existing code to work with a different Lightning package
- You're switching between projects using different Lightning import styles
- You want to test compatibility with both import styles

______________________________________________________________________
## Examples
### Scheduled Fine-Tuning For SuperGLUE
- [Notebook-based Tutorial](https://pytorch-lightning.readthedocs.io/en/stable/notebooks/lightning_examples/finetuning-scheduler.html)
- [CLI-based Tutorial](https://finetuning-scheduler.readthedocs.io/en/stable/#example-scheduled-fine-tuning-for-superglue)
- [FSDP Scheduled Fine-Tuning](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/fsdp_scheduled_fine_tuning.html)
- [LR Scheduler Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/lr_scheduler_reinitialization.html) (advanced)
- [Optimizer Reinitialization](https://finetuning-scheduler.readthedocs.io/en/stable/advanced/optimizer_reinitialization.html) (advanced)

______________________________________________________________________
## Continuous Integration
Fine-Tuning Scheduler is rigorously tested across multiple CPU and GPU configurations and against major Python and PyTorch versions. Each Fine-Tuning Scheduler minor release (major.minor) is paired with a Lightning minor release (e.g. Fine-Tuning Scheduler 2.0 depends upon Lightning 2.0).

To ensure maximum stability, the latest Lightning patch release fully tested with Fine-Tuning Scheduler is set as the maximum dependency in Fine-Tuning Scheduler's requirements.txt (e.g. `<= 1.7.1`). If you'd like to test a Lightning patch version newer than the current maximum, it will likely work, but you should install Fine-Tuning Scheduler from source and update requirements.txt as desired.
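A sketch of that workflow (which pin to edit depends on the release you're working from):

```bash
# install from source so the Lightning version bound can be adjusted
git clone https://github.com/speediedan/finetuning-scheduler.git
cd finetuning-scheduler
# relax the Lightning upper bound in requirements.txt as desired, then:
python -m pip install -e .
```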
Current build statuses for Fine-Tuning Scheduler:

| System / (PyTorch/Python ver) | 2.3.1/3.9 | 2.7.0/3.9, 2.7.0/3.12 |
| :---------------------------: | :-------: | :-------------------: |
| Linux \[GPUs\*\*\] | - | [Build Status](https://dev.azure.com/speediedan/finetuning-scheduler/_build/latest?definitionId=1&branchName=main) |
| Linux (Ubuntu 22.04) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |
| OSX (14) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |
| Windows (2022) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) | [Test](https://github.com/speediedan/finetuning-scheduler/actions/workflows/ci_test-full.yml) |

\*\* tests run on one RTX 4090 and one RTX 2070
## Community
Fine-Tuning Scheduler is developed and maintained by the community in close communication with the [Lightning team](https://pytorch-lightning.readthedocs.io/en/stable/governance.html). Thanks to everyone in the community for their tireless effort building and improving the immensely useful core Lightning project.
PRs are welcome! Please see the [contributing guidelines](https://finetuning-scheduler.readthedocs.io/en/stable/generated/CONTRIBUTING.html) (which are essentially the same as Lightning's).
______________________________________________________________________
## Citing Fine-Tuning Scheduler
Please cite:
```tex
@misc{Dan_Dale_2022_6463952,
  author    = {Dan Dale},
  title     = {{Fine-Tuning Scheduler}},
  month     = feb,
  year      = 2022,
  publisher = {Zenodo},
  doi       = {10.5281/zenodo.6463952},
  url       = {https://zenodo.org/record/6463952}
}
```

Feel free to star the repo as well if you find it useful or interesting. Thanks 😊!