https://github.com/hasanisaeed/c-transformer
Implementation of the core Transformer architecture in pure C
- Host: GitHub
- URL: https://github.com/hasanisaeed/c-transformer
- Owner: hasanisaeed
- License: MIT
- Created: 2024-08-14T18:31:04.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-09-08T18:54:08.000Z (over 1 year ago)
- Last Synced: 2025-01-21T13:35:26.151Z (11 months ago)
- Topics: c, c-transformers, generative-ai, llm, transformers
- Language: C
- Homepage:
- Size: 164 KB
- Stars: 2
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
## C-Transformer
I created this repo to test my C programming skills by implementing the core Transformer architecture from the famous [Attention is All You Need](https://arxiv.org/abs/1706.03762) paper.

*Figure: The Transformer model architecture*
## Output
```
***** Encoding *****
Humanity 1
thrives 2
on 3
compassion 4
a 5
fundamental 6
trait 7
...
***** Positional Embedding *****
0.000000, 0.010000, 0.000200, 0.000003,
1.000000, 0.999950, 1.000000, 1.000000,
...
***** Single Head Attention *****
0.589237, 0.589154, 0.603617, 0.518480,
0.614376, 0.629398, 0.638245, 0.511109,
...
***** Feed Forward Output *****
0.260464, 1.185559, 0.141220, 1.783113,
...
***** Decoder Output *****
0.872295, -0.787095, -1.186268, 1.101068,
...
>> Predicted Word: true
```