Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/JiachengLi1995/TiSASRec
TensorFlow implementation for paper Time Interval Aware Self-Attention for Sequential Recommendation.
- Host: GitHub
- URL: https://github.com/JiachengLi1995/TiSASRec
- Owner: JiachengLi1995
- Created: 2020-03-07T20:56:44.000Z (almost 5 years ago)
- Default Branch: master
- Last Pushed: 2020-10-06T19:49:58.000Z (over 4 years ago)
- Last Synced: 2024-08-09T13:18:33.328Z (5 months ago)
- Language: Python
- Homepage:
- Size: 6.68 MB
- Stars: 118
- Watchers: 1
- Forks: 25
- Open Issues: 3
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- StarryDivineSky - JiachengLi1995/TiSASRec - Time Interval Aware Self-Attention for Sequential Recommendation: a time-interval-aware self-attention model for sequential recommendation. It models interaction timestamps within a sequential framework and explores the effect of different time intervals on next-item prediction. (Other_Recommender Systems / Web Services_Other)
README
# TiSASRec: Time Interval Aware Self-Attention for Sequential Recommendation
This is our TensorFlow implementation for the paper:
Jiacheng Li, Yujie Wang, [Julian McAuley](http://cseweb.ucsd.edu/~jmcauley/) (2020). *[Time Interval Aware Self-Attention for Sequential Recommendation.](https://cseweb.ucsd.edu/~jmcauley/pdfs/wsdm20b.pdf)* WSDM'20
Our implementation builds on the [SASRec](https://github.com/kang205/SASRec) repo.
Please cite our paper if you use the code or datasets.
The code was tested on a Linux desktop (with a GTX 1080 Ti GPU) running TensorFlow.
For a PyTorch version of TiSASRec, please refer to [this repo](https://github.com/pmixer/TiSASRec.pytorch).
## Datasets
This repo includes ml-1m dataset as an example.
For the Amazon datasets, you can download the review data from *[here](http://jmcauley.ucsd.edu/data/amazon/index.html)*.
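Downloaded reviews need to be converted into the repo's plain-text format. A minimal preparation sketch, assuming (as in SASRec-style loaders) one interaction per line as `user_id item_id timestamp`, with IDs re-indexed from 1 and each user's interactions sorted chronologically; verify the exact format against the repo's data-loading code before relying on it:

```python
from collections import defaultdict

def prepare(reviews, out_path):
    """Convert (reviewer, asin, unix_time) triples into the assumed
    'user item timestamp' line format, re-indexing IDs from 1."""
    user_map, item_map = {}, {}
    per_user = defaultdict(list)
    for reviewer, asin, ts in reviews:
        u = user_map.setdefault(reviewer, len(user_map) + 1)
        i = item_map.setdefault(asin, len(item_map) + 1)
        per_user[u].append((ts, i))
    with open(out_path, "w") as f:
        for u in sorted(per_user):
            for ts, i in sorted(per_user[u]):  # chronological order per user
                f.write(f"{u} {i} {ts}\n")

# toy usage
rows = [("A1", "B001", 20), ("A1", "B002", 10), ("A2", "B001", 5)]
prepare(rows, "Toy.txt")
```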
## Model Training
To train our model on `ml-1m` (with default hyper-parameters):
```
python main.py --dataset=ml-1m --train_dir=default
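# Further tuning flags and defaults below are assumed from this repo's
# SASRec-style argparse setup, not verified -- run `python main.py --help`
# to confirm names and values before use:
#   --batch_size=128 --lr=0.001 --maxlen=50 --hidden_units=50
#   --num_blocks=2 --num_heads=1 --dropout_rate=0.2 --time_span=256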
```

## Misc
The implementation of self-attention is adapted from *[this repo](https://github.com/Kyubyong/transformer)*.
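Relative to that transformer reference, the paper's key addition is attending over pairwise time intervals. A minimal NumPy sketch of the personalized interval matrix (function name and details are illustrative, not the repo's API): intervals are scaled by the user's smallest nonzero gap and clipped to a maximum span, as described in the paper.

```python
import numpy as np

def relation_matrix(timestamps, time_span):
    """Pairwise relative time intervals for one user's sequence,
    scaled by the user's smallest nonzero gap, clipped to time_span."""
    t = np.asarray(timestamps, dtype=np.int64)
    diff = np.abs(t[:, None] - t[None, :])        # raw |t_i - t_j|
    nonzero = diff[diff > 0]
    t_min = nonzero.min() if nonzero.size else 1  # personalized minimum gap
    return np.clip(diff // t_min, 0, time_span)   # bucketed, clipped

# toy usage: timestamps 0, 10, 40 with min gap 10 -> buckets 0..4
relation_matrix([0, 10, 40], time_span=100)
```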
## Contact
If you have any questions, please send me an email ([email protected]).