Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/follgad/hpong
in progress
- Host: GitHub
- URL: https://github.com/follgad/hpong
- Owner: FOLLGAD
- Created: 2024-12-02T16:03:44.000Z (about 1 month ago)
- Default Branch: main
- Last Pushed: 2024-12-02T16:28:20.000Z (about 1 month ago)
- Last Synced: 2024-12-02T17:34:34.575Z (about 1 month ago)
- Language: Python
- Homepage:
- Size: 29.9 MB
- Stars: 2
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# hPong (WIP)
![hPong Demo 2](./assets/pong_vae.png)
![hPong Demo](./assets/pong_simulation.png)
## Doing:
- [ ] implement **temporal** position embedding using RoPE
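The core of a rotary position embedding is rotating each pair of feature channels by a position-dependent angle, so relative offsets between timesteps show up in attention dot products. A minimal NumPy sketch of that idea for temporal (frame-index) positions, with illustrative names only, not hPong's actual module:

```python
import numpy as np

def rope(x, positions, base=10000.0):
    """Apply rotary position embedding along the last dimension.

    x: (seq_len, dim) array with even dim; positions: (seq_len,)
    timesteps (frame indices for temporal RoPE). Illustrative sketch;
    hPong's real implementation may differ.
    """
    seq_len, dim = x.shape
    assert dim % 2 == 0, "RoPE pairs up channels, so dim must be even"
    # One frequency per channel pair, geometrically spaced as in Su et al.
    inv_freq = base ** (-np.arange(0, dim, 2) / dim)      # (dim/2,)
    angles = positions[:, None] * inv_freq[None, :]       # (seq_len, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    # Rotate each (x1, x2) channel pair by its angle; norms are preserved.
    out = np.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```

Because the operation is a pure rotation, position 0 is left unchanged and vector norms are preserved, which makes it easy to sanity-check.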
## Todo:
- [ ] Read up on rotary position embeddings in transformers and look at example implementations
- [ ] implement **spatial** position embedding using RoPE
- [ ] make VAE training output sample images after each epoch
- [ ] Figure out why the KL divergence of the DiT is so high (on the order of 10^9)
- [ ] guidance for user actions on the DiT (STGuidance)

## Done:
- [x] Split each frame into its own VAE encoding/decoding.
- [x] in the dataset, the "player" paddle should sometimes play like a good player for data quality

# References
- [RoPE: Rotary Position Embedding](https://arxiv.org/pdf/2104.09864v5)
- [Spatiotemporal Skip Guidance](https://arxiv.org/pdf/2411.18664)
- [Classifier-free Guidance](https://arxiv.org/pdf/2207.12598)
- [Axial Attention](https://arxiv.org/pdf/1912.12180)
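The guidance references above reduce, at sampling time, to combining two denoiser predictions. A minimal sketch of classifier-free guidance (Ho & Salimans); the function name and array inputs are illustrative, not hPong's API:

```python
import numpy as np

def cfg_combine(eps_uncond, eps_cond, guidance_scale):
    """Classifier-free guidance: extrapolate from the unconditional
    prediction toward the conditional one. A scale of 1 recovers the
    plain conditional output; larger scales strengthen conditioning."""
    return eps_uncond + guidance_scale * (eps_cond - eps_uncond)
```

Spatiotemporal Skip Guidance follows the same combination pattern but replaces the unconditional branch with a weaker model obtained by skipping layers, so no separate unconditional training pass is needed.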