Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
Container for LLaMA-Factory
https://github.com/entelecheia/llama-factory-container
- Host: GitHub
- URL: https://github.com/entelecheia/llama-factory-container
- Owner: entelecheia
- License: mit
- Created: 2024-05-13T00:49:18.000Z (6 months ago)
- Default Branch: main
- Last Pushed: 2024-08-13T19:08:17.000Z (3 months ago)
- Last Synced: 2024-08-13T22:24:56.242Z (3 months ago)
- Topics: llama, llms
- Language: Shell
- Homepage:
- Size: 62.5 KB
- Stars: 0
- Watchers: 1
- Forks: 3
- Open Issues: 1
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
README
# Container for LLaMA-Factory: Unify Efficient Fine-Tuning of 100+ LLMs
[![version-image]][release-url]
[![release-date-image]][release-url]
[![license-image]][license-url]

llama-factory-container is a comprehensive AI model training and deployment framework supporting various models, methods, and optimizations.
- GitHub: [https://github.com/entelecheia/llama-factory-container][repo-url]
- LLaMA-Factory: [https://github.com/hiyouga/LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory)

llama-factory-container is a versatile and feature-rich framework for training and deploying AI models. It supports a wide range of models, including LLaMA, LLaVA, Mistral, Mixtral-MoE, Qwen, Yi, Gemma, Baichuan, ChatGLM, and Phi. The framework integrates various training methods such as pre-training, supervised fine-tuning, reward modeling, and reinforcement learning algorithms like PPO, DPO, and ORPO.
## Prerequisites
- Docker
- Docker Compose
- NVIDIA Docker (optional, for GPU support)

## Build and Run
1. Build the Docker image:
```bash
make docker-build
```

The `docker.dev.env` file includes various configuration options and environment variables. The `docker-compose.dev.yaml` file uses these variables to customize the behavior of the services. This is a common practice when you want to set different configurations for development, testing, and production environments. The `Dockerfile.dev` file uses these variables to customize the Docker build. These files are automatically generated by Copier when you run the `copier copy` command (an illustrative excerpt of such an env file is sketched after the steps below).
2. Start the Docker container:
```bash
make docker-up
```

This will start a Docker container from the image built in the previous step. The container runs a bash launch script that starts the application.
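
For reference, the two steps above wrap a fairly standard Compose workflow. The excerpt below is a hypothetical sketch of a generated `docker.dev.env`; apart from `BUILD_FROM`, `USERNAME`, and `APP_HOST_SSH_PORT`, which this README mentions, the variable names and values are illustrative guesses, and your generated file will differ.

```bash
# Hypothetical excerpt of .docker/docker.dev.env (values are placeholders)
BUILD_FROM="nvidia/cuda:12.1.0-devel-ubuntu22.04"  # base image consumed by Dockerfile.dev
USERNAME="$(whoami)"                               # evaluated only when exported by the shell (see below)
APP_HOST_PORT=19090                                # host port for the application (variable name assumed)
APP_HOST_SSH_PORT=2233                             # host port mapped to the container's SSH server
```

Under the same assumptions about file locations, `make docker-up` is roughly equivalent to exporting these variables and invoking Compose directly:

```bash
set -a                                   # export every variable defined in the env file
source .docker/docker.dev.env
set +a
docker-compose -f docker-compose.dev.yaml up -d   # start the service in the background (path assumed)
```
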
## Usage
After starting the container, you can access the application at `localhost:<port>`. By default, the port is set to `19090`.
You can also SSH into the container using the SSH port specified in `APP_HOST_SSH_PORT`. By default, the port is set to `2233`.
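
For example, with the default ports you can check that the application responds and open an SSH session into the container. The username below is an assumption based on the `USERNAME` variable described later; substitute whatever user your build configured.

```bash
curl http://localhost:19090          # application port (default 19090)
ssh -p 2233 "$(whoami)"@localhost    # SSH port (default 2233); the remote user depends on USERNAME
```
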
## Volumes
The `docker-compose.dev.yaml` file specifies several volumes that bind mount directories on the host to directories in the container. These include the cache, the workspace directory, and the scripts directory. Changes made in these directories will persist across container restarts.
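
To verify which host directories are actually mounted into a running container, you can inspect its mounts. The container name below is a placeholder; substitute the name Compose assigned in your setup (see `docker ps`).

```bash
docker inspect --format '{{ json .Mounts }}' llama-factory-container | python3 -m json.tool
```
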
## Troubleshooting
If you encounter any issues with this setup, please check the following:
- Make sure Docker and Docker Compose are installed correctly.
- Make sure NVIDIA Docker is installed if you're planning to use GPU acceleration.
- Ensure the environment variables in the `docker.dev.env` file are correctly set.
- Check the Docker and Docker Compose logs for any error messages (a few example commands are shown below).
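
The following generic commands can help with these checks; the Compose file path is an assumption based on the layout described above.

```bash
docker --version && docker-compose --version    # confirm Docker and Docker Compose are installed
docker info | grep -i runtime                   # the runtimes list should include `nvidia` for GPU support
docker-compose -f docker-compose.dev.yaml logs  # inspect service logs for error messages (path assumed)
```
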
## Environment Variables

In Docker, environment variables can be used in the `docker-compose.dev.yaml` file to customize the behavior of the services.
The `docker-compose` command accepts an `--env-file` argument, but it only sets environment variables for the services defined in the `docker-compose.yaml` file, not for the `docker-compose` command itself. The variables defined in that file will not be substituted into the `docker-compose.yaml` file.
However, the environment variables we set in the `.docker/docker.dev.env` file are used in the `docker-compose.app.yaml` file. For example, the `$BUILD_FROM` variable is used to set the base image for the Docker build. Therefore, we need to export these variables to the shell environment before running the `docker-compose` command.
This method also allows us to use shell commands in the variable definitions, like `"$(whoami)"` for the `USERNAME` variable, which wouldn't be possible with the `--env-file` argument, since shell commands in an env file are passed through literally rather than evaluated.
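
To make the contrast concrete, here is a minimal sketch, reusing the file paths assumed earlier:

```bash
# Exported by the shell: "$(whoami)" is evaluated before docker-compose reads it.
export USERNAME="$(whoami)"
docker-compose -f docker-compose.dev.yaml up -d

# Passed via --env-file: the literal text "$(whoami)" would reach the services unevaluated.
# docker-compose --env-file .docker/docker.dev.env -f docker-compose.dev.yaml up -d
```
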
### Files for Environment Variables
- `.docker/docker.common.env`: Common environment variables for all Docker images.
- `.docker/docker.dev.env`: Environment variables for the Docker image.
- `.env.secret`: Secret environment variables that are not committed to the repository.

## Changelog
See the [CHANGELOG] for more information.
## Contributing
Contributions are welcome! Please see the [contributing guidelines] for more information.
## License
This project is released under the [MIT License][license-url].
[license-image]: https://img.shields.io/github/license/entelecheia/llama-factory-container
[license-url]: https://github.com/entelecheia/llama-factory-container/blob/main/LICENSE
[version-image]: https://img.shields.io/github/v/release/entelecheia/llama-factory-container?sort=semver
[release-date-image]: https://img.shields.io/github/release-date/entelecheia/llama-factory-container
[release-url]: https://github.com/entelecheia/llama-factory-container/releases
[repo-url]: https://github.com/entelecheia/llama-factory-container
[changelog]: https://github.com/entelecheia/llama-factory-container/blob/main/CHANGELOG.md
[contributing guidelines]: https://github.com/entelecheia/llama-factory-container/blob/main/CONTRIBUTING.md