https://github.com/adrianbzg/sfavel
Code for "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (ICLR 2024)
- Host: GitHub
- URL: https://github.com/adrianbzg/sfavel
- Owner: AdrianBZG
- License: MIT
- Created: 2023-11-12T23:41:02.000Z (over 2 years ago)
- Default Branch: main
- Last Pushed: 2024-03-03T15:18:01.000Z (almost 2 years ago)
- Last Synced: 2025-03-21T17:24:14.781Z (11 months ago)
- Topics: deep-learning, knowledge-distillation, knowledge-graphs, language-model, multimodal-deep-learning, natural-language-processing, self-supervised-learning
- Language: Python
- Homepage: https://arxiv.org/abs/2309.16540
- Size: 14.6 KB
- Stars: 4
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
# SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation
### [Paper](https://arxiv.org/abs/2309.16540) | [ICLR 2024](https://openreview.net/forum?id=1mjsP8RYAw)
This is the official implementation of the paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation".
Code coming soon, stay tuned!
## Citation
```bibtex
@inproceedings{
bazaga2024unsupervised,
title={Unsupervised Pretraining for Fact Verification by Language Model Distillation},
author={Adrián Bazaga and Pietro Liò and Gos Micklem},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024},
url={https://openreview.net/forum?id=1mjsP8RYAw}
}
```
## Contact
For feedback, questions, or press inquiries, please contact [Adrián Bazaga](mailto:ar989@cam.ac.uk).