{"id":26781869,"url":"https://github.com/jiwidi/behavior-sequence-transformer-pytorch","last_synced_at":"2025-07-29T01:38:18.024Z","repository":{"id":41575761,"uuid":"380786162","full_name":"jiwidi/Behavior-Sequence-Transformer-Pytorch","owner":"jiwidi","description":"This is a pytorch implementation for the BST model from Alibaba https://arxiv.org/pdf/1905.06874.pdf","archived":false,"fork":false,"pushed_at":"2022-07-11T14:55:20.000Z","size":229,"stargazers_count":166,"open_issues_count":3,"forks_count":36,"subscribers_count":1,"default_branch":"master","last_synced_at":"2025-07-25T03:45:57.017Z","etag":null,"topics":["alibaba","behavior","behavior-sequence-transformer","pytorch","recommenders","seq2seq","seq2seq-attn","transformers"],"latest_commit_sha":null,"homepage":"","language":"Jupyter Notebook","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/jiwidi.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null}},"created_at":"2021-06-27T16:20:17.000Z","updated_at":"2025-07-17T02:34:27.000Z","dependencies_parsed_at":"2022-07-09T00:16:09.768Z","dependency_job_id":null,"html_url":"https://github.com/jiwidi/Behavior-Sequence-Transformer-Pytorch","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/jiwidi/Behavior-Sequence-Transformer-Pytorch","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jiwidi%2FBehavior-Sequence-Transformer-Pytorch","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jiwidi%2FBehavior-Sequence-Transformer-Pytorch/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jiwidi%2FBehavior-Sequenc
e-Transformer-Pytorch/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jiwidi%2FBehavior-Sequence-Transformer-Pytorch/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/jiwidi","download_url":"https://codeload.github.com/jiwidi/Behavior-Sequence-Transformer-Pytorch/tar.gz/refs/heads/master","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/jiwidi%2FBehavior-Sequence-Transformer-Pytorch/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":267616877,"owners_count":24116171,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-07-28T02:00:09.689Z","response_time":68,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["alibaba","behavior","behavior-sequence-transformer","pytorch","recommenders","seq2seq","seq2seq-attn","transformers"],"created_at":"2025-03-29T08:18:24.788Z","updated_at":"2025-07-29T01:38:18.003Z","avatar_url":"https://github.com/jiwidi.png","language":"Jupyter Notebook","readme":"# Behavior-Sequence-Transformer-Pytorch\n[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/jiwidi/Behavior-Sequence-Transformer-Pytorch/blob/master/colab-bst.ipynb)\n\nThis is a pytorch implementation for the BST model from Alibaba https://arxiv.org/pdf/1905.06874.pdf\n\n![](img/bst.png \"BST ARCHITECTURE\")\n\n\nThis model is a 
novel recommender architecture based on seq2seq models. We translate user behaviour into sequences and predict a rating for each target item (movie).\n# Dataset\nFor this implementation we used the MovieLens [1M Dataset](https://movielens.org/), which contains a timestamp for each rating, making it well suited to sequence-based recommendation models.\n\n\n# Running\n\nYou can run it in Colab [here](https://colab.research.google.com/github/jiwidi/Behavior-Sequence-Transformer-Pytorch/blob/master/colab-bst.ipynb). If you prefer to run locally, the model architecture is in `pytorch-best.ipynb`, while data processing is in the `prepare_data.ipynb` notebook, which should be run first.\n\n# Results\nTraining on all of each user's ratings except the latest, which we hold out for test, we obtain the following results:\n\n| Dataset |  MAE  | RMSE |\n| :------ | :---: | ---: |\n| Train   | 0.72  | 0.84 |\n| Test    | 0.74  | 0.93 |\n\nHere is a screenshot of the training logs, where we see overfitting from epochs 12-15.\n\n![](img/logs.png)\n\n# References\n\n* Original paper [1](https://arxiv.org/pdf/1905.06874.pdf)\n* Keras implementation [2](https://keras.io/examples/structured_data/movielens_recommendations_transformers/)\n* TensorFlow implementation [3](https://github.com/shenweichen/DeepCTR/blob/master/deepctr/models/bst.py)\n","funding_links":[],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fjiwidi%2Fbehavior-sequence-transformer-pytorch","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fjiwidi%2Fbehavior-sequence-transformer-pytorch","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fjiwidi%2Fbehavior-sequence-transformer-pytorch/lists"}