Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/warner-benjamin/commented-transformers
Highly commented implementations of Transformers in PyTorch
- Host: GitHub
- URL: https://github.com/warner-benjamin/commented-transformers
- Owner: warner-benjamin
- License: MIT
- Created: 2023-07-28T03:32:56.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2023-08-02T05:02:57.000Z (over 1 year ago)
- Last Synced: 2024-10-07T08:03:04.091Z (5 months ago)
- Topics: bert, gpt, pytorch, transformers
- Language: Python
- Homepage:
- Size: 9.77 KB
- Stars: 127
- Watchers: 3
- Forks: 9
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
README
# Commented Transformers
Highly commented implementations of Transformers in PyTorch for the *Creating a Transformer From Scratch* series:
1. [The Attention Mechanism](https://benjaminwarner.dev/2023/07/01/attention-mechanism.html)
2. [The Rest of the Transformer](https://benjaminwarner.dev/2023/07/28/rest-of-the-transformer.html)

The layers folder contains implementations for Bidirectional Attention, Causal Attention, and CausalCrossAttention.
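For orientation, a minimal single-head causal self-attention layer in PyTorch might look like the sketch below. This is an illustrative assumption, not code from the repository, whose implementations are more complete and heavily commented.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalAttentionSketch(nn.Module):
    """Illustrative single-head causal self-attention (not the repo's implementation)."""
    def __init__(self, dim: int):
        super().__init__()
        self.qkv = nn.Linear(dim, 3 * dim)   # joint query/key/value projection
        self.proj = nn.Linear(dim, dim)      # output projection

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # is_causal=True applies the lower-triangular mask so each position
        # attends only to itself and earlier positions
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.proj(out)
```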
The models folder contains single-file implementations of GPT-2 and BERT. Both models are compatible with `torch.compile(..., fullgraph=True)`.
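As a rough sketch of what `fullgraph=True` implies, the snippet below compiles a stand-in module rather than the repository's GPT-2 or BERT classes; the repo's models would be compiled the same way.

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    """Stand-in module; the repo's GPT-2 and BERT models are compiled analogously."""
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(64, 64)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(x))

# fullgraph=True asks torch.compile to capture the entire forward pass as a
# single graph and to raise an error instead of silently inserting graph breaks.
compiled = torch.compile(TinyModel(), fullgraph=True)
out = compiled(torch.randn(2, 16, 64))
```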