{"id":13771358,"url":"https://github.com/easezyc/Multitask-Recommendation-Library","last_synced_at":"2025-05-11T04:30:39.849Z","repository":{"id":38250663,"uuid":"466753328","full_name":"easezyc/Multitask-Recommendation-Library","owner":"easezyc","description":"MTReclib provides a PyTorch implementation of multi-task recommendation models and common datasets.","archived":false,"fork":false,"pushed_at":"2022-11-30T02:54:55.000Z","size":48,"stargazers_count":263,"open_issues_count":4,"forks_count":42,"subscribers_count":2,"default_branch":"main","last_synced_at":"2024-05-22T11:33:00.493Z","etag":null,"topics":["advertising","ctr-prediction","multitask-learning","multitask-recommendation","recommendation","recommender-system","transfer-learning"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/easezyc.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2022-03-06T13:58:54.000Z","updated_at":"2024-05-22T02:50:52.000Z","dependencies_parsed_at":"2022-08-09T01:31:36.401Z","dependency_job_id":null,"html_url":"https://github.com/easezyc/Multitask-Recommendation-Library","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/easezyc%2FMultitask-Recommendation-Library","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/easezyc%2FMultitask-Recommendation-Library/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/easezyc%2FMultitask-Recommendation-Library/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/
hosts/GitHub/repositories/easezyc%2FMultitask-Recommendation-Library/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/easezyc","download_url":"https://codeload.github.com/easezyc/Multitask-Recommendation-Library/tar.gz/refs/heads/main","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":253518941,"owners_count":21921074,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["advertising","ctr-prediction","multitask-learning","multitask-recommendation","recommendation","recommender-system","transfer-learning"],"created_at":"2024-08-03T17:00:50.572Z","updated_at":"2025-05-11T04:30:39.572Z","avatar_url":"https://github.com/easezyc.png","language":"Python","readme":"# Multi-task Recommendation in PyTorch\n[![MIT License](https://img.shields.io/badge/license-MIT-green.svg)](https://opensource.org/licenses/MIT)  [![Awesome](https://awesome.re/badge.svg)](https://awesome.re)\n\n![MTRec](./mtreclib.png)\n\n-------------------------------------------------------------------------------\n\n## Introduction\nMTReclib provides a PyTorch implementation of multi-task recommendation models and common datasets. Currently, we have implemented 7 multi-task recommendation models to enable fair comparisons and boost the development of multi-task recommendation algorithms. 
The currently supported algorithms include:\n* SingleTask: Train a separate model for each task\n* Shared-Bottom: A traditional multi-task model with a shared bottom network and task-specific towers\n* OMoE: [Adaptive Mixtures of Local Experts](https://ieeexplore.ieee.org/abstract/document/6797059) (Neural Computation 1991)\n* MMoE: [Modeling Task Relationships in Multi-task Learning with Multi-Gate Mixture-of-Experts](https://dl.acm.org/doi/pdf/10.1145/3219819.3220007) (KDD 2018)\n* PLE: [Progressive Layered Extraction (PLE): A Novel Multi-Task Learning (MTL) Model for Personalized Recommendations](https://dl.acm.org/doi/pdf/10.1145/3383313.3412236?casa_token=8fchWD8CHc0AAAAA:2cyP8EwkhIUlSFPRpfCGHahTddki0OEjDxfbUFMkXY5fU0FNtkvRzmYloJtLowFmL1en88FRFY4Q) (RecSys 2020 best paper)\n* AITM: [Modeling the Sequential Dependence among Audience Multi-step Conversions with Multi-task Learning in Targeted Display Advertising](https://dl.acm.org/doi/pdf/10.1145/3447548.3467071?casa_token=5YtVOYjJClUAAAAA:eVczwdynmE9dwoyElCG4da9fC5gsRiyX6zKt0_mIJF1K8NkU-SlNkGmpAu0c0EHbM3hBUe3zZc-o) (KDD 2021)\n* MetaHeac: [Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising](https://easezyc.github.io/data/kdd21_metaheac.pdf) (KDD 2021)\n\n## Datasets\n* AliExpressDataset: This is a dataset gathered from real-world traffic logs of the search system in AliExpress. It was collected from 5 countries: Russia, Spain, France, the Netherlands, and the United States, and can be utilized as 5 multi-task datasets. [Original_dataset](https://tianchi.aliyun.com/dataset/dataDetail?dataId=74690) [Processed_dataset Google Drive](https://drive.google.com/drive/folders/1F0TqvMJvv-2pIeOKUw9deEtUxyYqXK6Y?usp=sharing) [Processed_dataset Baidu Netdisk](https://pan.baidu.com/s/1AfXoJSshjW-PILXZ6O19FA?pwd=4u0r)\n\n\u003e For the processed dataset, you should directly put the dataset in './data/' and unpack it. 
For the original dataset, you should put it in './data/' and run 'python preprocess.py --dataset_name NL'.\n\n## Requirements\n* Python 3.6\n* PyTorch \u003e 1.10\n* pandas\n* numpy\n* tqdm\n\n\n## Run\n\nParameter Configuration:\n\n- dataset_name: choose a dataset from ['AliExpress_NL', 'AliExpress_FR', 'AliExpress_ES', 'AliExpress_US'], defaults to `AliExpress_NL`\n- dataset_path: defaults to `./data`\n- model_name: choose a model from ['singletask', 'sharedbottom', 'omoe', 'mmoe', 'ple', 'aitm', 'metaheac'], defaults to `metaheac`\n- epoch: the number of epochs for training, defaults to `50`\n- task_num: the number of tasks, defaults to `2` (CTR \u0026 CVR)\n- expert_num: the number of experts for ['omoe', 'mmoe', 'ple', 'metaheac'], defaults to `8`\n- learning_rate: defaults to `0.001`\n- batch_size: defaults to `2048`\n- weight_decay: defaults to `1e-6`\n- device: the device to run the code, defaults to `cuda:0`\n- save_dir: the folder to save parameters, defaults to `chkpt`\n\nYou can run a model through:\n\n```bash\npython main.py --model_name metaheac --num_expert 8 --dataset_name AliExpress_NL\n```\n\n## Results\n\u003e For fair comparisons, the learning rate is 0.001, the embedding dimension is 128, and the mini-batch size is 2048 for all models. We report the mean AUC and Logloss over five random runs. 
Best results are in boldface.\n\n\u003ctable\u003e\n\t\u003chead \u003e\n\t\t\u003ctr\u003e\n      \u003cth rowspan=\"3\"; center\u003eMethods\u003c/th\u003e\n\t\t\t\u003cth colspan=\"4\"\u003e\u003ccenter\u003eAliExpress (Netherlands, NL)\u003c/center\u003e\u003c/th\u003e\n\t\t\t\u003cth colspan=\"4\"\u003e\u003ccenter\u003eAliExpress (Spain, ES)\u003c/center\u003e\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\t\u003ctr \u003e\n\t\t\t\u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTR\u003c/center\u003e\u003c/th\u003e\n      \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTCVR\u003c/center\u003e\u003c/th\u003e\n\t  \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTR\u003c/center\u003e\u003c/th\u003e\n      \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTCVR\u003c/center\u003e\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\t\u003ctr\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n\t\t\t\u003cth \u003eLogloss\u003c/th\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n      \u003cth \u003eLogloss\u003c/th\u003e\n\t  \u003cth \u003eAUC\u003c/th\u003e\n\t\t\t\u003cth \u003eLogloss\u003c/th\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n      \u003cth \u003eLogloss\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\u003c/head\u003e\n\t\u003cbody\u003e\n\t\t\u003ctr\u003e\n\t\t\t\u003ctd\u003eSingleTask\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7222 \u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1085\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8590\u003c/td\u003e\n      \u003ctd\u003e0.00609\u003c/td\u003e\n\t  \u003ctd\u003e0.7266\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1207\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8855\u003c/td\u003e\n      \u003ctd\u003e0.00456\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eShared-Bottom\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7228\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1083\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8511\u003c/td\u003e\n      \u003ctd\u003e0.00620\u003c/td\u003e\n\t  
\u003ctd\u003e0.7287\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1204\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8866\u003c/td\u003e\n      \u003ctd\u003e0.00452\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eOMoE\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7254\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1081\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8611\u003c/td\u003e\n      \u003ctd\u003e0.00614\u003c/td\u003e\n\t  \u003ctd\u003e0.7253\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1209\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8859\u003c/td\u003e\n      \u003ctd\u003e0.00452\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eMMoE\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7234\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1080\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8606\u003c/td\u003e\n      \u003ctd\u003e0.00607\u003c/td\u003e\n\t  \u003ctd\u003e0.7285\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1205\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8898\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00450\u003c/strong\u003e\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003ePLE\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.7292\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1088\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8591\u003c/td\u003e\n      \u003ctd\u003e0.00631\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7273\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1223\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.8913\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd\u003e0.00461\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eAITM\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7240\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1078\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8577\u003c/td\u003e\n      \u003ctd\u003e0.00611\u003c/td\u003e\n\t  \u003ctd\u003e0.7290\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.1203\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8885\u003c/td\u003e\n      
\u003ctd\u003e0.00451\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eMetaHeac\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7263\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.1077\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.8615\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00606\u003c/strong\u003e\u003c/td\u003e\n\t  \u003ctd\u003e\u003cstrong\u003e0.7299\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.1203\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8883\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00450\u003c/strong\u003e\u003c/td\u003e\n\t\t\u003c/tr\u003e\n\t\u003c/body\u003e\n\u003c/table\u003e\n\n\u003ctable\u003e\n\t\u003chead \u003e\n\t\t\u003ctr\u003e\n      \u003cth rowspan=\"3\"\u003eMethods\u003c/th\u003e\n\t\t\t\u003cth colspan=\"4\"\u003e\u003ccenter\u003eAliExpress (France, FR)\u003c/center\u003e\u003c/th\u003e\n\t\t\t\u003cth colspan=\"4\"\u003e\u003ccenter\u003eAliExpress (United States, US)\u003c/center\u003e\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\t\u003ctr \u003e\n\t\t\t\u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTR\u003c/center\u003e\u003c/th\u003e\n      \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTCVR\u003c/center\u003e\u003c/th\u003e\n\t  \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTR\u003c/center\u003e\u003c/th\u003e\n      \u003cth colspan=\"2\"\u003e\u003ccenter\u003eCTCVR\u003c/center\u003e\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\t\u003ctr\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n\t\t\t\u003cth \u003eLogloss\u003c/th\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n      \u003cth \u003eLogloss\u003c/th\u003e\n\t  \u003cth \u003eAUC\u003c/th\u003e\n\t\t\t\u003cth \u003eLogloss\u003c/th\u003e\n\t\t\t\u003cth \u003eAUC\u003c/th\u003e\n      \u003cth 
\u003eLogloss\u003c/th\u003e\n\t\t\u003c/tr\u003e\n\t\u003c/head\u003e\n\t\u003cbody\u003e\n\t\t\u003ctr\u003e\n\t\t\t\u003ctd\u003eSingleTask\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7259\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.1002\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8737\u003c/td\u003e\n      \u003ctd\u003e0.00435\u003c/td\u003e\n\t  \u003ctd\u003e0.7061\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1004\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8637\u003c/td\u003e\n      \u003ctd\u003e0.00381\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eShared-Bottom\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7245\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1004\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8700\u003c/td\u003e\n      \u003ctd\u003e0.00439\u003c/td\u003e\n\t  \u003ctd\u003e0.7029\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1008\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8698\u003c/td\u003e\n      \u003ctd\u003e0.00381\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eOMoE\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7257\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1006\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8781\u003c/td\u003e\n      \u003ctd\u003e0.00432\u003c/td\u003e\n\t  \u003ctd\u003e0.7049\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1007\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8701\u003c/td\u003e\n      \u003ctd\u003e0.00381\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eMMoE\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7216\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1010\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8811\u003c/td\u003e\n      \u003ctd\u003e0.00431\u003c/td\u003e\n\t  \u003ctd\u003e0.7043\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1006\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.8758\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00377\u003c/strong\u003e\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    
\u003ctr\u003e\n\t\t\t\u003ctd\u003ePLE\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.7276\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1014\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8805\u003c/td\u003e\n      \u003ctd\u003e0.00451\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.7138\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.0992\u003c/strong\u003e\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8675\u003c/td\u003e\n      \u003ctd\u003e0.00403\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eAITM\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7236\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1005\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8763\u003c/td\u003e\n      \u003ctd\u003e0.00431\u003c/td\u003e\n\t  \u003ctd\u003e0.7048\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1004\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8730\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00377\u003c/strong\u003e\u003c/td\u003e\n\t\t\u003c/tr\u003e\n    \u003ctr\u003e\n\t\t\t\u003ctd\u003eMetaHeac\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.7249\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1005\u003c/td\u003e\n\t\t\t\u003ctd\u003e\u003cstrong\u003e0.8813\u003c/strong\u003e\u003c/td\u003e\n      \u003ctd\u003e\u003cstrong\u003e0.00429\u003c/strong\u003e\u003c/td\u003e\n\t  \u003ctd\u003e0.7089\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.1001\u003c/td\u003e\n\t\t\t\u003ctd\u003e0.8743\u003c/td\u003e\n      \u003ctd\u003e0.00378\u003c/td\u003e\n\t\t\u003c/tr\u003e\n\t\u003c/body\u003e\n\u003c/table\u003e\n\n## File Structure\n\n```\n.\n├── main.py\n├── README.md\n├── models\n│   ├── layers.py\n│   ├── aitm.py\n│   ├── omoe.py\n│   ├── mmoe.py\n│   ├── metaheac.py\n│   ├── ple.py\n│   ├── singletask.py\n│   └── sharedbottom.py\n└── data\n    ├── preprocess.py         # Preprocess the original data\n    ├── AliExpress_NL         # AliExpressDataset from the Netherlands\n    │   ├── train.csv\n    │   └── test.csv\n    ├── AliExpress_ES         # AliExpressDataset from 
Spain\n    ├── AliExpress_FR         # AliExpressDataset from France\n    └── AliExpress_US         # AliExpressDataset from the United States\n```\n\n\n\n## Contact\nIf you have any problems with this library, please create an issue or send us an email at:\n* zhuyc0204@gmail.com\n\n\n## Reference\nIf you use this repository, please cite the following papers:\n\n```\n@inproceedings{zhu2021learning,\n  title={Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising},\n  author={Zhu, Yongchun and Liu, Yudan and Xie, Ruobing and Zhuang, Fuzhen and Hao, Xiaobo and Ge, Kaikai and Zhang, Xu and Lin, Leyu and Cao, Juan},\n  booktitle={Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery \\\u0026 Data Mining},\n  pages={4005--4013},\n  year={2021}\n}\n```\n\n```\n@inproceedings{xi2021modeling,\n  title={Modeling the sequential dependence among audience multi-step conversions with multi-task learning in targeted display advertising},\n  author={Xi, Dongbo and Chen, Zhen and Yan, Peng and Zhang, Yinger and Zhu, Yongchun and Zhuang, Fuzhen and Chen, Yu},\n  booktitle={Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery \\\u0026 Data Mining},\n  pages={3745--3755},\n  year={2021}\n}\n```\n\nSome model implementations and utility functions refer to these nice repositories.\n\n- [pytorch-fm](https://github.com/rixwew/pytorch-fm): This package provides a PyTorch implementation of factorization machine models and common datasets in CTR prediction. 
\n- [MetaHeac](https://github.com/easezyc/MetaHeac): This is an official implementation of Learning to Expand Audience via Meta Hybrid Experts and Critics for Recommendation and Advertising.\n","funding_links":[],"categories":["Codebase","其他_推荐系统"],"sub_categories":["Recommendation","网络服务_其他"],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feasezyc%2FMultitask-Recommendation-Library","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Feasezyc%2FMultitask-Recommendation-Library","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Feasezyc%2FMultitask-Recommendation-Library/lists"}