https://github.com/ambidextrous9/knowledge-distillation-using-flant5-teacher-student-method
Knowledge Distillation using FlanT5: Teacher-Student Method
- Host: GitHub
- URL: https://github.com/ambidextrous9/knowledge-distillation-using-flant5-teacher-student-method
- Owner: ambideXtrous9
- Created: 2023-12-20T18:40:19.000Z (almost 2 years ago)
- Default Branch: main
- Last Pushed: 2023-12-22T21:32:33.000Z (almost 2 years ago)
- Last Synced: 2025-01-11T21:32:55.782Z (9 months ago)
- Language: Jupyter Notebook
- Size: 373 KB
- Stars: 0
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Knowledge-Distillation-using-FlanT5-Teacher-Student-Method
Knowledge Distillation using FlanT5: Teacher-Student Method

Useful references:

[Fast Beam Search Decoding in PyTorch](https://pytorch.org/blog/fast-beam-search-decoding-in-pytorch-with-torchaudio-and-flashlight-text/)
[Beam Search for Machine Translation](https://kikaben.com/beam-search-for-machine-translation/)
[Transformer’s Evaluation Details](https://kikaben.com/transformers-evaluation-details/#beam-search-translator)
[Language Translation with nn.Transformer](https://pytorch.org/tutorials/beginner/translation_transformer.html)
[PyTorch-seq2seq](https://github.com/bentrevett/pytorch-seq2seq/blob/master/6%20-%20Attention%20is%20All%20You%20Need.ipynb)
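The README lists references but does not summarize the teacher-student technique itself. As a rough illustration, below is a minimal sketch of the classic soft-target knowledge-distillation loss in plain Python: the student is trained on a weighted mix of a temperature-softened KL term against the teacher's distribution and the ordinary cross-entropy against the gold label. The function names and hyperparameter values here are illustrative assumptions, not code taken from this repository.

```python
import math

def softmax(logits, temperature=1.0):
    """Numerically stable temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label,
                      temperature=2.0, alpha=0.5):
    """Teacher-student KD loss:
    alpha * T^2 * KL(teacher_T || student_T) + (1 - alpha) * CE(student, label).
    The T^2 factor keeps the gradient scale of the soft term comparable
    to the hard-label term as the temperature changes."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    # KL divergence from the softened student to the softened teacher.
    kl = sum(pt * math.log(pt / ps)
             for pt, ps in zip(p_teacher, p_student) if pt > 0)
    # Standard cross-entropy of the student against the gold label.
    ce = -math.log(softmax(student_logits)[label])
    return alpha * temperature ** 2 * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the (weighted) cross-entropy remains; in a real FlanT5 setup this same loss would be applied per decoding step over the vocabulary logits.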