https://github.com/huggingface/transformers
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
bert deep-learning flax hacktoberfest jax language-model language-models machine-learning model-hub natural-language-processing nlp nlp-library pretrained-models python pytorch pytorch-transformers seq2seq speech-recognition tensorflow transformer
Last synced: 5 days ago
- Host: GitHub
- URL: https://github.com/huggingface/transformers
- Owner: huggingface
- License: apache-2.0
- Created: 2018-10-29T13:56:00.000Z (over 6 years ago)
- Default Branch: main
- Last Pushed: 2025-04-12T10:34:30.000Z (8 days ago)
- Last Synced: 2025-04-13T10:10:51.894Z (7 days ago)
- Topics: bert, deep-learning, flax, hacktoberfest, jax, language-model, language-models, machine-learning, model-hub, natural-language-processing, nlp, nlp-library, pretrained-models, python, pytorch, pytorch-transformers, seq2seq, speech-recognition, tensorflow, transformer
- Language: Python
- Homepage: https://huggingface.co/transformers
- Size: 279 MB
- Stars: 142,871
- Watchers: 1,147
- Forks: 28,618
- Open Issues: 1,733
Metadata Files:
- Readme: README.md
- Contributing: CONTRIBUTING.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
- Security: SECURITY.md
Awesome Lists containing this project
- awesome-local-llms - transformers - State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. | 138,058 | 27,674 | 1,538 | 436 | 175 | Apache License 2.0 | 0 days, 9 hrs, 5 mins | (Open-Source Local LLM Projects)
- awesome - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- stars - huggingface/transformers - State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (HarmonyOS / Windows Manager)
- awesome-data-science-viz - huggingface/transformers - State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0 (NLP / Analysis)
- awesome-ccamel - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- my-awesome-starred - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-python-machine-learning-resources - GitHub - 4% open issues · ⏱️ 25.08.2022 (Text Data & NLP)
- awesome-transformer-nlp - 🤗 Hugging Face Transformers - (formerly known as [pytorch-transformers](https://github.com/huggingface/pytorch-transformers) and [pytorch-pretrained-bert](https://github.com/huggingface/pytorch-pretrained-BERT)) provides state-of-the-art general-purpose architectures (BERT, GPT-2, RoBERTa, XLM, DistilBert, XLNet, CTRL...) for Natural Language Understanding (NLU) and Natural Language Generation (NLG) with 32+ pretrained models in 100+ languages and deep interoperability between TensorFlow 2.0 and PyTorch. [[Paper](https://arxiv.org/abs/1910.03771)] (Transformer Implementations By Communities / PyTorch and TensorFlow)
- awesome-rainmana - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-ChatGPT-repositories - transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (NLP)
- awesome-starred - transformers - 🤗Transformers: State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0. (Python)
- Awesome-pytorch-list-CNVersion - transformers
- awesome-starts - huggingface/transformers - 🤗Transformers: State-of-the-art Natural Language Processing for Pytorch and TensorFlow 2.0. (Python)
- best-of-awesome - transformers - State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (data science)
- awesome-open-data-centric-ai - Huggingface transformers - State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Embeddings and pre-trained models)
- Awesome-Tensorflow2 - huggingface/transformers
- awesome-production-machine-learning - 🤗 Transformers - Huggingface's library of state-of-the-art pretrained models for Natural Language Processing (NLP). (Industrial Strength NLP)
- awesome-huggingface - transformers - State-of-the-art natural language processing for Jax, PyTorch and TensorFlow. (🤗 Official Libraries)
- awesome - transformers - State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. | 7 minutes ago | ([Python](#python))
- awesome-list - HuggingFace Transformers - A high-level machine learning library for text, images and audio data, with support for Pytorch, TensorFlow and JAX. (Natural Language Processing / General Purpose NLP)
- awesome-tokenizers - transformers BertTokenizer
- Jupyter-Guide - Transformer - State-of-the-art Natural Language Processing for Pytorch, TensorFlow, and JAX. (PyTorch Tools, Libraries, and Frameworks)
- awesome-ARTificial - transformers - Hugging Face transformers. (Uncategorized / Uncategorized)
- StarryDivineSky - huggingface/transformers
- awesome-ML-NLP - transformers - Python package providing thousands of pretrained models to perform tasks on different modalities such as text, vision, and audio (Libraries, Softwares)
- Awesome-pytorch-list - transformers - State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. huggingface.co/transformers (Pytorch & related libraries / NLP & Speech Processing:)
- awesome-text-ml - Transformers - Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. https://huggingface.co/transformers (Frameworks and libraries / :snake: Python)
- bert-in-production - huggingface/transformers - State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. The transformers library is focused on using publicly-available pretrained models and has wide support for many of the most popular varieties. (Implementations)
- awesome-nlg - Transformers - State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch. (Neural Natural Language Generation)
- project-awesome - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-starred - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (deep-learning)
- awesome-pytorch-list - transformers
- awesome-ai - Transformers - HuggingFace's NLP (Natural Language Processing) library. Transformers is backed by the three most popular deep learning libraries — [Jax](https://jax.readthedocs.io/en/latest/), [PyTorch](https://pytorch.org/) and [TensorFlow](https://www.tensorflow.org/) — with a seamless integration between them. (Communities & Organizations)
- awesome-pydantic - Transformers - State-of-the-art Natural Language Processing for PyTorch and TensorFlow 2.0. (Machine Learning)
- AiTreasureBox - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Repos)
- awesome-jax - HuggingFace Transformers - Ecosystem of pretrained Transformers for a wide range of natural language tasks (Flax). (Libraries)
- awesome-colab-project - Huggingface Transformers
- my-awesome - huggingface/transformers - bert,deep-learning,flax,hacktoberfest,jax,language-model,language-models,machine-learning,model-hub,natural-language-processing,nlp,nlp-library,pretrained-models,python,pytorch,pytorch-transformers,seq2seq,speech-recognition,tensorflow,transformer pushed_at:2025-04 star:143.0k fork:28.6k 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-repositories - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-production-machine-learning - Transformers - Huggingface's library of state-of-the-art pretrained models for Natural Language Processing (NLP). (Industry Strength Natural Language Processing)
- awesome-arsenal - Transformers - Pretrained models. (Arsenal / Artificial Intelligence)
- awesome-docker - huggingface/transformers
- awesome-sciml - huggingface/transformers: 🤗 Transformers: State-of-the-art Natural Language Processing for Pytorch, TensorFlow, and JAX.
- awesome - huggingface/transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. (Python)
- awesome-tensorflow-2 - Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch
README
English |
简体中文 |
繁體中文 |
한국어 |
Español |
日本語 |
हिन्दी |
Русский |
Português |
తెలుగు |
Français |
Deutsch |
Tiếng Việt |
العربية |
اردو |
State-of-the-art pretrained models for inference and training
Transformers is a library of pretrained text, computer vision, audio, video, and multimodal models for inference and training. Use Transformers to fine-tune models on your data, build inference applications, and power generative AI use cases across multiple modalities.
There are over 500,000 Transformers [model checkpoints](https://huggingface.co/models?library=transformers&sort=trending) on the [Hugging Face Hub](https://huggingface.co/models) that you can use.
Explore the [Hub](https://huggingface.co/) today to find a model and use Transformers to help you get started right away.
## Installation
Transformers works with Python 3.9+, [PyTorch](https://pytorch.org/get-started/locally/) 2.1+, [TensorFlow](https://www.tensorflow.org/install/pip) 2.6+, and [Flax](https://flax.readthedocs.io/en/latest/) 0.4.1+.
Create and activate a virtual environment with [venv](https://docs.python.org/3/library/venv.html) or [uv](https://docs.astral.sh/uv/), a fast Rust-based Python package and project manager.
```shell
# venv
python -m venv .my-env
source .my-env/bin/activate

# uv
uv venv .my-env
source .my-env/bin/activate
```

Install Transformers in your virtual environment.

```shell
# pip
pip install transformers

# uv
uv pip install transformers
```

Install Transformers from source if you want the latest changes in the library or are interested in contributing. However, the *latest* version may not be stable. Feel free to open an [issue](https://github.com/huggingface/transformers/issues) if you encounter an error.

```shell
git clone https://github.com/huggingface/transformers.git
cd transformers
pip install .
```

## Quickstart

Get started with Transformers right away with the [Pipeline](https://huggingface.co/docs/transformers/pipeline_tutorial) API. The `Pipeline` is a high-level inference class that supports text, audio, vision, and multimodal tasks. It handles preprocessing the input and returns the appropriate output.
Instantiate a pipeline and specify the model to use for text generation. The model is downloaded and cached so you can easily reuse it. Finally, pass some text to prompt the model.
```py
from transformers import pipeline

pipeline = pipeline(task="text-generation", model="Qwen/Qwen2.5-1.5B")
pipeline("the secret to baking a really good cake is ")
[{'generated_text': 'the secret to baking a really good cake is 1) to use the right ingredients and 2) to follow the recipe exactly. the recipe for the cake is as follows: 1 cup of sugar, 1 cup of flour, 1 cup of milk, 1 cup of butter, 1 cup of eggs, 1 cup of chocolate chips. if you want to make 2 cakes, how much sugar do you need? To make 2 cakes, you will need 2 cups of sugar.'}]
```

To chat with a model, the usage pattern is the same. The only difference is you need to construct a chat history (the input to `Pipeline`) between you and the system.

> [!TIP]
> You can also chat with a model directly from the command line.
> ```shell
> transformers-cli chat --model_name_or_path Qwen/Qwen2.5-0.5B-Instruct
> ```

```py
import torch
from transformers import pipeline

chat = [
    {"role": "system", "content": "You are a sassy, wise-cracking robot as imagined by Hollywood circa 1986."},
    {"role": "user", "content": "Hey, can you tell me any fun things to do in New York?"}
]

pipeline = pipeline(task="text-generation", model="meta-llama/Meta-Llama-3-8B-Instruct", torch_dtype=torch.bfloat16, device_map="auto")
response = pipeline(chat, max_new_tokens=512)
print(response[0]["generated_text"][-1]["content"])
```

Expand the examples below to see how `Pipeline` works for different modalities and tasks.

Automatic speech recognition
```py
from transformers import pipeline

pipeline = pipeline(task="automatic-speech-recognition", model="openai/whisper-large-v3")
pipeline("https://huggingface.co/datasets/Narsil/asr_dummy/resolve/main/mlk.flac")
{'text': ' I have a dream that one day this nation will rise up and live out the true meaning of its creed.'}
```

Image classification
```py
from transformers import pipeline

pipeline = pipeline(task="image-classification", model="facebook/dinov2-small-imagenet1k-1-layer")
pipeline("https://huggingface.co/datasets/Narsil/image_dummy/raw/main/parrots.png")
[{'label': 'macaw', 'score': 0.997848391532898},
 {'label': 'sulphur-crested cockatoo, Kakatoe galerita, Cacatua galerita',
  'score': 0.0016551691805943847},
 {'label': 'lorikeet', 'score': 0.00018523589824326336},
 {'label': 'African grey, African gray, Psittacus erithacus',
  'score': 7.85409429227002e-05},
 {'label': 'quail', 'score': 5.502637941390276e-05}]
```

Visual question answering
```py
from transformers import pipeline

pipeline = pipeline(task="visual-question-answering", model="Salesforce/blip-vqa-base")
pipeline(
    image="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/idefics-few-shot.jpg",
    question="What is in the image?",
)
[{'answer': 'statue of liberty'}]
```

## Why should I use Transformers?

1. Easy-to-use state-of-the-art models:
- High performance on natural language understanding & generation, computer vision, audio, video, and multimodal tasks.
- Low barrier to entry for researchers, engineers, and developers.
- Few user-facing abstractions with just three classes to learn.
- A unified API for using all our pretrained models.

1. Lower compute costs, smaller carbon footprint:
- Share trained models instead of training from scratch.
- Reduce compute time and production costs.
- Dozens of model architectures with 1M+ pretrained checkpoints across all modalities.

1. Choose the right framework for every part of a model's lifetime:
- Train state-of-the-art models in 3 lines of code (see the fine-tuning sketch after this list).
- Move a single model between PyTorch/JAX/TF2.0 frameworks at will.
- Pick the right framework for training, evaluation, and production.

1. Easily customize a model or an example to your needs:
- We provide examples for each architecture to reproduce the results published by its original authors.
- Model internals are exposed as consistently as possible.
- Model files can be used independently of the library for quick experiments.
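
To make the "3 lines of code" point above concrete, here is a minimal fine-tuning sketch with the `Trainer` API. The checkpoint, dataset slice, and hyperparameters below are illustrative assumptions, not a recommended recipe.

```py
# Minimal fine-tuning sketch (assumes the `datasets` library is installed;
# the checkpoint, dataset, and hyperparameters are placeholder choices).
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

dataset = load_dataset("imdb", split="train[:1%]")  # tiny slice, just to demonstrate
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

def tokenize(batch):
    # Tokenize the raw text so the default data collator can batch it
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", per_device_train_batch_size=8, num_train_epochs=1),
    train_dataset=dataset,
)
trainer.train()
```
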
## Why shouldn't I use Transformers?

- This library is not a modular toolbox of building blocks for neural nets. The code in the model files is deliberately not refactored with additional abstractions, so that researchers can quickly iterate on each model without diving into extra abstractions/files.
- The training API is optimized to work with PyTorch models provided by Transformers. For generic machine learning loops, you should use another library like [Accelerate](https://huggingface.co/docs/accelerate) (a rough sketch follows this list).
- The [example scripts](https://github.com/huggingface/transformers/tree/main/examples) are only *examples*. They may not work out-of-the-box for your specific use case, and you'll need to adapt the code to make it work.
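
For the generic-loop case mentioned above, the following is a rough sketch of what a hand-written PyTorch loop prepared with Accelerate might look like. The tiny linear model, random data, and optimizer are placeholder assumptions, included only so the loop runs end to end; they are not part of the library's documented examples.

```py
# Sketch of a generic training loop prepared with Accelerate.
import torch
from accelerate import Accelerator
from torch.utils.data import DataLoader, TensorDataset

model = torch.nn.Linear(16, 2)                      # stand-in for any PyTorch model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
train_dataloader = DataLoader(dataset, batch_size=8)

accelerator = Accelerator()                          # handles device placement, mixed precision, multi-GPU
model, optimizer, train_dataloader = accelerator.prepare(model, optimizer, train_dataloader)

model.train()
for inputs, labels in train_dataloader:
    optimizer.zero_grad()
    logits = model(inputs)
    loss = torch.nn.functional.cross_entropy(logits, labels)
    accelerator.backward(loss)                       # use this instead of loss.backward()
    optimizer.step()
```
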
## 100 projects using Transformers

Transformers is more than a toolkit for using pretrained models; it's a community of projects built around it and the Hugging Face Hub. We want Transformers to enable developers, researchers, students, professors, engineers, and anyone else to build their dream projects.

To celebrate Transformers reaching 100,000 stars, we wanted to put the spotlight on the community with the [awesome-transformers](./awesome-transformers.md) page, which lists 100 incredible projects built with Transformers.

If you own or use a project that you believe should be part of the list, please open a PR to add it!
## Example models
You can test most of our models directly on their [Hub model pages](https://huggingface.co/models).
Expand each modality below to see a few example models for various use cases (a short usage sketch follows the lists).
Audio
- Audio classification with [Whisper](https://huggingface.co/openai/whisper-large-v3-turbo)
- Automatic speech recognition with [Moonshine](https://huggingface.co/UsefulSensors/moonshine)
- Keyword spotting with [Wav2Vec2](https://huggingface.co/superb/wav2vec2-base-superb-ks)
- Speech to speech generation with [Moshi](https://huggingface.co/kyutai/moshiko-pytorch-bf16)
- Text to audio with [MusicGen](https://huggingface.co/facebook/musicgen-large)
- Text to speech with [Bark](https://huggingface.co/suno/bark)

Computer vision
- Automatic mask generation with [SAM](https://huggingface.co/facebook/sam-vit-base)
- Depth estimation with [DepthPro](https://huggingface.co/apple/DepthPro-hf)
- Image classification with [DINO v2](https://huggingface.co/facebook/dinov2-base)
- Keypoint detection with [SuperGlue](https://huggingface.co/magic-leap-community/superglue_outdoor)
- Keypoint matching with [SuperGlue](https://huggingface.co/magic-leap-community/superglue)
- Object detection with [RT-DETRv2](https://huggingface.co/PekingU/rtdetr_v2_r50vd)
- Pose Estimation with [VitPose](https://huggingface.co/usyd-community/vitpose-base-simple)
- Universal segmentation with [OneFormer](https://huggingface.co/shi-labs/oneformer_ade20k_swin_large)
- Video classification with [VideoMAE](https://huggingface.co/MCG-NJU/videomae-large)

Multimodal
- Audio or text to text with [Qwen2-Audio](https://huggingface.co/Qwen/Qwen2-Audio-7B)
- Document question answering with [LayoutLMv3](https://huggingface.co/microsoft/layoutlmv3-base)
- Image or text to text with [Qwen-VL](https://huggingface.co/Qwen/Qwen2.5-VL-3B-Instruct)
- Image captioning with [BLIP-2](https://huggingface.co/Salesforce/blip2-opt-2.7b)
- OCR-based document understanding with [GOT-OCR2](https://huggingface.co/stepfun-ai/GOT-OCR-2.0-hf)
- Table question answering with [TAPAS](https://huggingface.co/google/tapas-base)
- Unified multimodal understanding and generation with [Emu3](https://huggingface.co/BAAI/Emu3-Gen)
- Vision to text with [Llava-OneVision](https://huggingface.co/llava-hf/llava-onevision-qwen2-0.5b-ov-hf)
- Visual question answering with [Llava](https://huggingface.co/llava-hf/llava-1.5-7b-hf)
- Visual referring expression segmentation with [Kosmos-2](https://huggingface.co/microsoft/kosmos-2-patch14-224)

NLP
- Masked word completion with [ModernBERT](https://huggingface.co/answerdotai/ModernBERT-base)
- Named entity recognition with [Gemma](https://huggingface.co/google/gemma-2-2b)
- Question answering with [Mixtral](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1)
- Summarization with [BART](https://huggingface.co/facebook/bart-large-cnn)
- Translation with [T5](https://huggingface.co/google-t5/t5-base)
- Text generation with [Llama](https://huggingface.co/meta-llama/Llama-3.2-1B)
- Text classification with [Qwen](https://huggingface.co/Qwen/Qwen2.5-0.5B)
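
As a small illustration of how the models listed above can be tried, this sketch runs one of them (summarization with BART) through `Pipeline`. The input text and length settings are made up for the example.

```py
# Illustrative sketch: trying one of the example models above with Pipeline.
from transformers import pipeline

summarizer = pipeline(task="summarization", model="facebook/bart-large-cnn")
summarizer(
    "The Eiffel Tower is 324 metres tall, about the same height as an 81-storey "
    "building, and was the tallest man-made structure in the world for 41 years.",
    max_length=40,
    min_length=10,
)
```
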
## Citation

We now have a [paper](https://www.aclweb.org/anthology/2020.emnlp-demos.6/) you can cite for the 🤗 Transformers library:
```bibtex
@inproceedings{wolf-etal-2020-transformers,
title = "Transformers: State-of-the-Art Natural Language Processing",
author = "Thomas Wolf and Lysandre Debut and Victor Sanh and Julien Chaumond and Clement Delangue and Anthony Moi and Pierric Cistac and Tim Rault and Rémi Louf and Morgan Funtowicz and Joe Davison and Sam Shleifer and Patrick von Platen and Clara Ma and Yacine Jernite and Julien Plu and Canwen Xu and Teven Le Scao and Sylvain Gugger and Mariama Drame and Quentin Lhoest and Alexander M. Rush",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations",
month = oct,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/2020.emnlp-demos.6",
pages = "38--45"
}
```