https://github.com/ggldnl/bert
PyTorch Lightning implementation of an encoder-only architecture
- Host: GitHub
- URL: https://github.com/ggldnl/bert
- Owner: ggldnl
- Created: 2024-03-31T10:58:57.000Z (about 1 year ago)
- Default Branch: main
- Last Pushed: 2024-03-31T20:46:23.000Z (about 1 year ago)
- Last Synced: 2025-01-14T02:47:37.744Z (5 months ago)
- Topics: bert, next-poi-recommendation, transformer
- Language: Python
- Homepage:
- Size: 55.7 KB
- Stars: 0
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Bidirectional Encoder Representations from Transformers (BERT)
PyTorch Lightning implementation of the BERT encoder-only architecture
as described in the [BERT: Pre-training of Deep Bidirectional Transformers for
Language Understanding](https://arxiv.org/pdf/1810.04805.pdf) paper.
Along with the architecture, the repo contains the code to run training and inference on
a next-POI recommendation task. A tokenizer and a dataloader are provided as well.
The dataloader uses the [Foursquare](https://sites.google.com/site/yangdingqi/home/foursquare-dataset) dataset.
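To make the setup concrete, here is a minimal sketch of what an encoder-only next-POI model might look like in PyTorch Lightning. The class name, dimensions, and training step below are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch of an encoder-only next-POI model; not the repo's actual code.
import torch
import torch.nn as nn
import pytorch_lightning as pl


class POIEncoder(pl.LightningModule):
    """Encoder-only Transformer that predicts the next POI in a check-in sequence."""

    def __init__(self, vocab_size: int, d_model: int = 128, n_heads: int = 4,
                 n_layers: int = 2, max_len: int = 64, lr: float = 1e-3):
        super().__init__()
        self.save_hyperparameters()
        self.token_emb = nn.Embedding(vocab_size, d_model)
        self.pos_emb = nn.Embedding(max_len, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, vocab_size)  # scores over candidate POIs
        self.loss_fn = nn.CrossEntropyLoss()
        self.lr = lr

    def forward(self, poi_ids: torch.Tensor) -> torch.Tensor:
        # poi_ids: (batch, seq_len) integer POI indices
        positions = torch.arange(poi_ids.size(1), device=poi_ids.device)
        x = self.token_emb(poi_ids) + self.pos_emb(positions)
        h = self.encoder(x)            # (batch, seq_len, d_model)
        return self.head(h[:, -1, :])  # predict the next POI from the last position

    def training_step(self, batch, batch_idx):
        poi_ids, next_poi = batch      # next_poi: (batch,) target POI index
        logits = self(poi_ids)
        loss = self.loss_fn(logits, next_poi)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=self.lr)
```

Under these assumptions, training would amount to wrapping a dataloader of `(poi_ids, next_poi)` batches with `pl.Trainer().fit(model, dataloader)`; the repo's own tokenizer and Foursquare dataloader would supply those batches.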