Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/adrianbzg/sfavel
Code for "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (ICLR 2024)
deep-learning knowledge-distillation knowledge-graphs language-model multimodal-deep-learning natural-language-processing self-supervised-learning
JSON representation
Code for "Unsupervised Pretraining for Fact Verification by Language Model Distillation" (ICLR 2024)
- Host: GitHub
- URL: https://github.com/adrianbzg/sfavel
- Owner: AdrianBZG
- License: MIT
- Created: 2023-11-12T23:41:02.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-03-03T15:18:01.000Z (8 months ago)
- Last Synced: 2024-05-21T02:14:11.656Z (6 months ago)
- Topics: deep-learning, knowledge-distillation, knowledge-graphs, language-model, multimodal-deep-learning, natural-language-processing, self-supervised-learning
- Language: Python
- Homepage: https://arxiv.org/abs/2309.16540
- Size: 14.6 KB
- Stars: 4
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation
### [Paper](https://arxiv.org/abs/2309.16540) | [ICLR 2024](https://openreview.net/forum?id=1mjsP8RYAw)

This is the official implementation of the paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation".
Code is coming soon, stay tuned!
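Until the official code is released, the following is a minimal, generic sketch of what feature-level language model distillation can look like in PyTorch. All names, dimensions, and the contrastive objective here are illustrative assumptions for exposition, not the SFAVEL implementation.

```python
# Illustrative sketch of feature-level LM distillation (NOT the SFAVEL code).
# Assumes PyTorch; the teacher features would normally come from a frozen
# pretrained language model, but are random tensors here to stay runnable.
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentEncoder(nn.Module):
    """Small student network trained to mimic a frozen teacher LM's features."""
    def __init__(self, vocab_size: int, dim: int = 128, teacher_dim: int = 768):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, dim)  # mean-pooled token embeddings
        self.proj = nn.Linear(dim, teacher_dim)        # project into the teacher's space

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.proj(self.embed(token_ids))

def distillation_loss(student_feats, teacher_feats, temperature: float = 0.1):
    """Contrastive (InfoNCE-style) alignment of student and teacher features:
    each student embedding should be most similar to its own teacher embedding."""
    s = F.normalize(student_feats, dim=-1)
    t = F.normalize(teacher_feats, dim=-1)
    logits = s @ t.T / temperature                      # pairwise similarities
    targets = torch.arange(s.size(0), device=s.device)  # positives on the diagonal
    return F.cross_entropy(logits, targets)

# Usage sketch with stand-in data.
student = StudentEncoder(vocab_size=30522)
token_ids = torch.randint(0, 30522, (8, 16))  # batch of 8 sequences of 16 tokens
teacher_feats = torch.randn(8, 768)           # placeholder for frozen LM features
loss = distillation_loss(student(token_ids), teacher_feats)
loss.backward()
```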
## Citation
```
@inproceedings{bazaga2024unsupervised,
  title={Unsupervised Pretraining for Fact Verification by Language Model Distillation},
  author={Adrián Bazaga and Pietro Liò and Gos Micklem},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024},
  url={https://openreview.net/forum?id=1mjsP8RYAw}
}
```

## Contact
For feedback, questions, or press inquiries, please contact [Adrián Bazaga](mailto:[email protected]).