# SFAVEL: Unsupervised Pretraining for Fact Verification by Language Model Distillation
### [Paper](https://arxiv.org/abs/2309.16540) | [ICLR 2024](https://openreview.net/forum?id=1mjsP8RYAw)

This is the official implementation of the paper "Unsupervised Pretraining for Fact Verification by Language Model Distillation".

Code is coming soon, stay tuned!
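
As the title indicates, the core idea of the paper is to pretrain a fact-verification model without labels by distilling features from a pretrained language model. Since the official code has not been released yet, the snippet below is only a minimal, generic sketch of a feature-distillation objective in PyTorch: the projection heads, dimensions, and InfoNCE-style loss are illustrative assumptions, not SFAVEL's actual implementation.

```python
# Illustrative only: a generic feature-distillation objective in PyTorch.
# This is NOT the SFAVEL implementation (the official code is not yet released);
# the heads, dimensions, and loss below are placeholder assumptions.
import torch
import torch.nn.functional as F

teacher_dim, student_dim, proj_dim = 768, 256, 128

# Toy stand-ins for a frozen pretrained LM head (teacher) and a smaller student head.
teacher_head = torch.nn.Linear(teacher_dim, proj_dim)
student_head = torch.nn.Linear(student_dim, proj_dim)
for p in teacher_head.parameters():
    p.requires_grad = False  # the teacher side is kept frozen during distillation

def distillation_loss(teacher_feats, student_feats, temperature=0.1):
    """InfoNCE-style loss aligning student features with frozen teacher features."""
    t = F.normalize(teacher_head(teacher_feats), dim=-1)
    s = F.normalize(student_head(student_feats), dim=-1)
    logits = s @ t.T / temperature           # pairwise similarities within the batch
    targets = torch.arange(logits.size(0))   # matching teacher/student pairs on the diagonal
    return F.cross_entropy(logits, targets)

# Usage with random batch features standing in for claim/evidence encodings.
loss = distillation_loss(torch.randn(8, teacher_dim), torch.randn(8, student_dim))
loss.backward()
```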

## Citation

```bibtex
@inproceedings{bazaga2024unsupervised,
  title     = {Unsupervised Pretraining for Fact Verification by Language Model Distillation},
  author    = {Adrián Bazaga and Pietro Liò and Gos Micklem},
  booktitle = {The Twelfth International Conference on Learning Representations},
  year      = {2024},
  url       = {https://openreview.net/forum?id=1mjsP8RYAw}
}
```

## Contact

For feedback, questions, or press inquiries, please contact [Adrián Bazaga](mailto:[email protected]).