# HAMUR
Official implementation of our paper [HAMUR: Hyper Adapter for Multi-Domain Recommendation](https://arxiv.org/pdf/2309.06217.pdf) in CIKM 2023.
Please cite our paper if you find this repository interesting or helpful:
```
@inproceedings{li2023hamur,
title={HAMUR: Hyper Adapter for Multi-Domain Recommendation},
author={Li, Xiaopeng and Yan, Fan and Zhao, Xiangyu and Wang, Yichao and Chen, Bo and Guo, Huifeng and Tang, Ruiming},
booktitle={Proceedings of the 32nd ACM International Conference on Information and Knowledge Management},
pages={1268--1277},
year={2023}
}
```

## Introduction
Source code of HAMUR: Hyper Adapter for Multi-Domain Recommendation, published in the Proceedings of the 32nd ACM International Conference on Information and Knowledge Management (CIKM '23).
## Environment Setup
* torch >=1.7.0
* numpy >=1.23.5
* pandas >=1.5.3
* scikit-learn >=0.23.2
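If you want to confirm that your environment matches the versions above, an optional quick check in Python is:

```python
# Optional sanity check: print the installed versions of the dependencies
# listed above and compare them against the minimum requirements.
import torch
import numpy
import pandas
import sklearn

print("torch:", torch.__version__)
print("numpy:", numpy.__version__)
print("pandas:", pandas.__version__)
print("scikit-learn:", sklearn.__version__)
```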
## Dataset Download

In this paper, we use two datasets, **Ali-CCP** and **MovieLens**. Dataset samples are shown in `example/data` (see the loading sketch after the download links below). Full dataset download:
* Ali-CCP: download it from https://tianchi.aliyun.com/dataset/408.
* MovieLens: the raw data file can be found in [Torch-Rechub-ml-1m](https://github.com/morningsky/Torch-RecHub/tree/main/examples/matching/data/ml-1m), and you can download the processed file directly from https://cowtransfer.com/s/5a3ab69ebd314e.
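To get a feel for the data format before downloading the full datasets, you can load one of the bundled samples with pandas. The file path below is a placeholder, not an actual file in this repository; point it at whichever sample file sits under `example/data`:

```python
# Peek at one of the bundled dataset samples.
# NOTE: the path/filename below is illustrative -- replace it with an actual
# sample file under example/data.
import pandas as pd

sample = pd.read_csv("example/data/ali-ccp/ali_ccp_train_sample.csv")
print(sample.shape)
print(sample.head())
```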
## Models

In this repo, we offer the following models; their structures are shown in the figure below, and a simplified sketch of the adapter mechanism follows the list.

* Pure MLP as multi-domain backbone models.
* MLP + HAMUR
* Pure Wide & Deep as multi-domain backbone models.
* Wide & Deep + HAMUR
* Pure DCN as multi-domain backbone models.
* DCN + HAMUR
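For intuition, the sketch below illustrates the general hyper-network-plus-adapter pattern that HAMUR builds on: a shared hyper-network generates the weights of a small bottleneck adapter from a learnable domain embedding, so each domain gets its own adapter parameters while the hyper-network itself is shared across domains. This is a simplified illustration, not the implementation in this repository, and all class and parameter names are ours.

```python
# Simplified sketch (not the repository's code): a hyper-network produces
# domain-specific adapter weights from a learnable domain embedding.
import torch
import torch.nn as nn


class HyperAdapterSketch(nn.Module):
    def __init__(self, hidden_dim, bottleneck_dim, num_domains, domain_emb_dim=16):
        super().__init__()
        self.hidden_dim = hidden_dim
        self.bottleneck_dim = bottleneck_dim
        # One learnable embedding per domain.
        self.domain_emb = nn.Embedding(num_domains, domain_emb_dim)
        # Hyper-network: domain embedding -> flattened adapter parameters
        # (down-projection, up-projection and their biases).
        n_params = 2 * hidden_dim * bottleneck_dim + bottleneck_dim + hidden_dim
        self.hyper_net = nn.Sequential(
            nn.Linear(domain_emb_dim, 64),
            nn.ReLU(),
            nn.Linear(64, n_params),
        )

    def forward(self, x, domain_id):
        # x: (batch, hidden_dim) activations from a shared backbone layer.
        emb = self.domain_emb(torch.tensor(domain_id, device=x.device))
        params = self.hyper_net(emb)
        h, b = self.hidden_dim, self.bottleneck_dim
        w_down, params = params[: h * b].view(b, h), params[h * b:]
        w_up, params = params[: h * b].view(h, b), params[h * b:]
        bias_down, bias_up = params[:b], params[b:]
        # Bottleneck adapter with a residual connection around it.
        z = torch.relu(x @ w_down.t() + bias_down)
        return x + z @ w_up.t() + bias_up


# Usage: each mini-batch comes from one domain identified by domain_id.
adapter = HyperAdapterSketch(hidden_dim=128, bottleneck_dim=32, num_domains=3)
out = adapter(torch.randn(8, 128), domain_id=1)  # -> shape (8, 128)
```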
## Usage

### Step 1: Clone the repository
```Shell
git clone https://github.com/Applied-Machine-Learning-Lab/HAMUR.git
```

### Step 2: Run the model
```Shell
cd examples
# For Aliccp
python run_ali_ccp_ctr_ranking_multi_domain.py --model_name mlp_adp --epoch 200 --device cpu --seed 2022
# For MovieLens
python run_movielens_rank_multi_domain.py --model_name mlp_adp --epoch 200 --device cpu --seed 2022
```
## Credits
Our code is developed based on [Torch-RecHub](https://github.com/datawhalechina/torch-rechub). Thanks to the authors for their contribution.