# Disentangled Contrastive Collaborative Filtering
This is the PyTorch implementation by @Re-bin for the DCCF model proposed in the following paper:
>**Disentangled Contrastive Collaborative Filtering**\
>Xubin Ren, Lianghao Xia, Jiashu Zhao, Dawei Yin, Chao Huang\*\
>*SIGIR 2023* ([arXiv](https://arxiv.org/abs/2305.02759))
>
>\* denotes the corresponding author
In this paper, we propose a disentangled contrastive learning method for recommendation, which explores the latent factors underlying the implicit intents behind user-item interactions. In particular, a graph structure learning layer is devised to enable adaptive interaction augmentation, based on the learned disentangled user (item) intent-aware dependencies. Along the augmented intent-aware graph structures, we propose an intent-aware contrastive learning scheme that brings the benefits of disentangled self-supervision signals. A minimal sketch of these two ideas appears below.
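The following is a rough sketch (not the authors' implementation) of the two ingredients above: disentangling node representations over a small set of learnable intent prototypes, and contrasting each node's original view with its intent-aware view via an InfoNCE-style loss. All function names, shapes, and the toy data are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def intent_aware_view(emb, intents):
    """Re-express node embeddings as attention-weighted mixtures of intent prototypes.

    emb:     (n_nodes, d)  node embeddings, e.g. from a GNN layer
    intents: (k, d)        learnable intent prototypes
    """
    attn = torch.softmax(emb @ intents.t(), dim=1)   # (n_nodes, k) intent weights
    return attn @ intents                            # (n_nodes, d) disentangled view

def infonce(view_a, view_b, temperature=0.2):
    """Contrast two views of each node; other nodes in the batch act as negatives."""
    a = F.normalize(view_a, dim=1)
    b = F.normalize(view_b, dim=1)
    logits = a @ b.t() / temperature                 # (n, n) pairwise similarities
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)           # positive pairs lie on the diagonal

# Toy usage with random embeddings standing in for GNN outputs.
n_users, d, k = 64, 32, 4
user_emb = torch.randn(n_users, d, requires_grad=True)
intents = torch.randn(k, d, requires_grad=True)

loss = infonce(user_emb, intent_aware_view(user_emb, intents))
loss.backward()
```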
## Environment
The code is written in Python 3.8.13 with the following dependencies:
- numpy == 1.22.3
- pytorch == 1.11.0 (GPU version)
- torch-scatter == 2.0.9
- torch-sparse == 0.6.14
- scipy == 1.9.3
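For reference, a possible environment setup (a sketch, not an official install script; `-f` points pip at the PyG wheel index, and the URL below assumes a CUDA 11.3 build of PyTorch 1.11.0 — the torch-scatter / torch-sparse wheels must match your local torch/CUDA versions):

```bash
pip install numpy==1.22.3 scipy==1.9.3 torch==1.11.0
pip install torch-scatter==2.0.9 torch-sparse==0.6.14 \
    -f https://data.pyg.org/whl/torch-1.11.0+cu113.html
```

## Dataset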
We utilized three public datasets to evaluate DCCF: *Gowalla*, *Amazon-book*, and *Tmall*.
Note that the validation set is used only for tuning hyperparameters; for *Gowalla* and *Tmall*, the validation set is merged into the training set for the final training, as sketched below.
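A minimal illustration of that protocol, assuming interactions are held as (user, item) pairs; the variable names and format are hypothetical, not the repo's actual data layout:

```python
# Hypothetical (user, item) interaction lists; the real datasets are loaded
# from the files shipped with the repo.
train_pairs = [(0, 10), (0, 11), (1, 12)]   # training interactions
val_pairs   = [(0, 13), (1, 14)]            # validation interactions (tuning only)

# After hyperparameters are fixed on the validation set, fold it back into
# the training data for the final run (done for Gowalla / Tmall).
final_train_pairs = train_pairs + val_pairs
```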
## Examples to run the code
The commands to train DCCF on the Gowalla / Amazon-book / Tmall datasets are as follows.
We train DCCF with a fixed number of epochs and save the parameters obtained after the final epoch for testing.
- Gowalla:
```python DCCF_PyTorch.py --dataset gowalla --epoch 150```
- Amazon-book:
```python DCCF_PyTorch.py --dataset amazon --epoch 100```
- Tmall:
```python DCCF_PyTorch.py --dataset tmall --epoch 100```
**For advanced usage of the arguments, run the code with the `--help` argument.**
**Thanks for your interest in our work.**
## Citation
If you find this work helpful to your research, please consider citing our paper:
```bibtex
@inproceedings{ren2023disentangled,
title={Disentangled contrastive collaborative filtering},
author={Ren, Xubin and Xia, Lianghao and Zhao, Jiashu and Yin, Dawei and Huang, Chao},
booktitle={Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval},
pages={1137--1146},
year={2023}
}
```