Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/chengchingwen/Transformers.jl
Julia Implementation of Transformer models
attention deep-learning flux machine-learning natural-language-processing nlp transformer
Last synced: 3 months ago
- Host: GitHub
- URL: https://github.com/chengchingwen/Transformers.jl
- Owner: chengchingwen
- License: MIT
- Created: 2018-12-27T06:24:53.000Z (almost 6 years ago)
- Default Branch: master
- Last Pushed: 2023-12-18T06:28:38.000Z (11 months ago)
- Last Synced: 2023-12-18T14:37:30.429Z (11 months ago)
- Topics: attention, deep-learning, flux, machine-learning, natural-language-processing, nlp, transformer
- Language: Julia
- Size: 2.57 MB
- Stars: 477
- Watchers: 16
- Forks: 60
- Open Issues: 24
- Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
Awesome Lists containing this project
- awesome-sciml - chengchingwen/Transformers.jl: Julia Implementation of Transformer models
- awesome-generative-ai-meets-julia-language - Transformers.jl - Transformers.jl is a Julia package that provides a high-level API for using pre-trained transformer models. It also allows downloading any model from the Hugging Face hub with the `@hgf_str` string macro, as demonstrated in the README example below. (Models)
README
[![Build status](https://github.com/chengchingwen/Transformers.jl/actions/workflows/CI.yml/badge.svg?branch=master)](https://github.com/chengchingwen/Transformers.jl/actions/workflows/CI.yml?query=branch%3Amaster)
[![codecov](https://codecov.io/gh/chengchingwen/Transformers.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/chengchingwen/Transformers.jl)
[![](https://img.shields.io/badge/docs-dev-blue.svg)](https://chengchingwen.github.io/Transformers.jl/dev/)

Julia implementation of [transformer](https://arxiv.org/abs/1706.03762)-based models, with [Flux.jl](https://github.com/FluxML/Flux.jl).
*Notice: the current version is almost completely different from the 0.1.x version. If you are using the old version, make sure to update your code for the changes, or stick with the old version.*
# Installation
In the Julia REPL:
```julia
]add Transformers
```
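Equivalently, a minimal sketch using the Pkg API instead of the REPL's Pkg mode:

```julia
using Pkg
Pkg.add("Transformers")  # same effect as `]add Transformers` at the REPL prompt
```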
# Example
Using pretrained BERT with `Transformers.jl`:
```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace

textencoder, bert_model = hgf"bert-base-uncased"
text1 = "Peter Piper picked a peck of pickled peppers"
text2 = "Fuzzy Wuzzy was a bear"

text = [[ text1, text2 ]] # 1 batch of contiguous sentences
sample = encode(textencoder, text) # tokenize + pre-process (add special tokens + truncate / padding + one-hot encode)

@assert reshape(decode(textencoder, sample.token), :) == [
"[CLS]", "peter", "piper", "picked", "a", "peck", "of", "pick", "##led", "peppers", "[SEP]",
"fuzzy", "wu", "##zzy", "was", "a", "bear", "[SEP]"
]

bert_features = bert_model(sample).hidden_state
```

See the `example` folder for the complete example.
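`hidden_state` holds the final-layer embedding of every token. As a minimal sketch of how those features might be consumed, assuming the feature-major (hidden size × tokens × batch) layout that Flux models conventionally use (verify with `size(bert_features)` in your session):

```julia
using Statistics

# Layout check: for bert-base-uncased this should be (768, tokens, batch).
@show size(bert_features)

# Embedding of the leading [CLS] token, a common sentence-level feature.
cls_embedding = bert_features[:, 1, :]

# Mean-pool over the token axis for an alternative fixed-size representation
# (fine here since the single-sequence batch above has no padding).
pooled = dropdims(mean(bert_features; dims = 2); dims = 2)
```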
# For more information
If you want to know more about this package, see the [documentation](https://chengchingwen.github.io/Transformers.jl/dev/)
and read the code in the `example` folder. You can also tag me (@chengchingwen) on Julia's Slack or Discourse if
you have any questions, or just open a new issue on GitHub.