https://github.com/ukplab/starsem2023-arithmetic-based-pretraining
Code and data for the StarSem 2023 paper "Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models"
bart contrastive-learning flan-t5 language-model numerical-reasoning pretraining t5 transformers
Code and data for the StarSem 2023 paper "Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models"
- Host: GitHub
- URL: https://github.com/ukplab/starsem2023-arithmetic-based-pretraining
- Owner: UKPLab
- License: apache-2.0
- Created: 2022-01-01T09:12:16.000Z (almost 4 years ago)
- Default Branch: main
- Last Pushed: 2023-07-23T08:50:26.000Z (about 2 years ago)
- Last Synced: 2023-07-23T09:34:51.467Z (about 2 years ago)
- Topics: bart, contrastive-learning, flan-t5, language-model, numerical-reasoning, pretraining, t5, transformers
- Language: Julia
- Homepage: https://arxiv.org/pdf/2205.06733.pdf
- Size: 108 MB
- Stars: 0
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE