{"id":28676568,"url":"https://github.com/zjunlp/relphormer","last_synced_at":"2025-06-13T23:05:13.098Z","repository":{"id":107436193,"uuid":"440418285","full_name":"zjunlp/Relphormer","owner":"zjunlp","description":"[Neurocomputing 2023] Relational Graph Transformer for Knowledge Graph Representation","archived":false,"fork":false,"pushed_at":"2024-07-15T14:50:11.000Z","size":30093,"stargazers_count":113,"open_issues_count":1,"forks_count":11,"subscribers_count":7,"default_branch":"main","last_synced_at":"2024-07-15T18:01:05.921Z","etag":null,"topics":["fb15k-237","kg","knowledge-graph","knowledge-graph-completion","knowledge-graph-representation","link-prediction","question-answering","recommendation-system","relphormer","transformer","umls","wn18rr"],"latest_commit_sha":null,"homepage":"","language":"Python","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/zjunlp.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2021-12-21T06:47:48.000Z","updated_at":"2024-07-15T14:50:15.000Z","dependencies_parsed_at":"2024-01-27T09:37:44.877Z","dependency_job_id":null,"html_url":"https://github.com/zjunlp/Relphormer","commit_stats":null,"previous_names":[],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/zjunlp/Relphormer","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FRelphormer","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FRelphormer/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FRelphormer/releases","ma
nifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FRelphormer/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/zjunlp","download_url":"https://codeload.github.com/zjunlp/Relphormer/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/zjunlp%2FRelphormer/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":259732770,"owners_count":22903087,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["fb15k-237","kg","knowledge-graph","knowledge-graph-completion","knowledge-graph-representation","link-prediction","question-answering","recommendation-system","relphormer","transformer","umls","wn18rr"],"created_at":"2025-06-13T23:05:12.351Z","updated_at":"2025-06-13T23:05:13.090Z","avatar_url":"https://github.com/zjunlp.png","language":"Python","readme":"# Relphormer\n\nCode for the Neurocomputing 2023 paper: \"[Relphormer: Relational Graph Transformer for Knowledge Graph Representations](https://arxiv.org/abs/2205.10852)\".\n\n\n\u003e Transformers have achieved remarkable performance in widespread fields, including natural language processing, computer vision and graph mining. However, vanilla Transformer architectures have not yielded promising improvements in the Knowledge Graph (KG) representations, where the translational distance paradigm dominates this area. Note that vanilla Transformer architectures struggle to capture the intrinsically heterogeneous semantic and structural information of knowledge graphs. 
To this end, we propose Relphormer, a new Transformer variant for knowledge graph representation. Specifically, we introduce Triple2Seq, which dynamically samples contextualized sub-graph sequences as input to alleviate the heterogeneity issue. We propose a novel structure-enhanced self-attention mechanism to encode relational information while preserving the global semantic information among sub-graphs. Moreover, we propose masked knowledge modeling as a new paradigm for knowledge graph representation learning. We evaluate Relphormer on three tasks: knowledge graph completion, KG-based question answering, and KG-based recommendation. Experimental results show that Relphormer obtains better performance than baselines on benchmark datasets.

# Model Architecture

<div align=center>
<img src="./resource/model.png" width="85%" height="75%" />
</div>

The model architecture of Relphormer. A contextualized sub-graph is sampled with Triple2Seq and converted into a sequence while maintaining its sub-graph structure. Next, we conduct masked knowledge modeling, which randomly masks nodes of the center triple in the contextualized sub-graph sequence. Within the Transformer, a novel structure-enhanced mechanism preserves the structural features. Finally, the pre-trained KG Transformer is applied to KG-based downstream tasks.
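As a rough illustration of the two ideas above, the sketch below combines single-head self-attention with an additive structural bias and masks one node of the center triple. This is not the repository's implementation: the bias scheme, the zero `[MASK]` vector, and all dimensions are invented placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def structure_enhanced_attention(x, bias):
    """Single-head self-attention whose score matrix is shifted by an
    additive structural bias (e.g. derived from sub-graph adjacency)."""
    d = x.shape[-1]
    # Random projections stand in for learned Q/K/V weight matrices.
    q, k, v = (x @ rng.standard_normal((d, d)) for _ in range(3))
    scores = q @ k.T / np.sqrt(d) + bias          # structural bias added to logits
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v

# Toy contextualized sub-graph sequence: [head, relation, tail, ctx1, ctx2]
seq_len, d = 5, 8
x = rng.standard_normal((seq_len, d))

# Hypothetical bias: 0 between directly connected positions, -1 otherwise.
adj = np.eye(seq_len)
adj[0, 1] = adj[1, 0] = adj[1, 2] = adj[2, 1] = 1
bias = np.where(adj > 0, 0.0, -1.0)

# Masked knowledge modeling: hide the tail entity (position 2); a real model
# would use a learned [MASK] embedding and predict the entity from the output.
x_masked = x.copy()
x_masked[2] = np.zeros(d)
out = structure_enhanced_attention(x_masked, bias)
print(out.shape)  # → (5, 8)
```

In the actual model the structural bias is derived from the sampled sub-graph and trained jointly with the Transformer; here both the bias and the mask vector are fixed placeholders.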
# Environments

- python (3.8.18)
- Ubuntu-18.04.6 (4.15.0-156-generic)

# Requirements

To run the code, install the requirements:
```
pip install -r requirements.txt
```

The expected file structure is:

```
── Relphormer
   ├── data
   ├── dataset
   │   ├── fb15k-237
   │   ├── wn18rr
   │   ├── umls
   │   ├── create_neighbor.py
   ├── lit_models
   │   ├── __init__.py
   │   ├── base.py
   │   ├── transformer.py
   │   └── utils.py
   ├── models
   │   ├── __init__.py
   │   ├── huggingface_relformer.py
   │   ├── model.py
   │   └── utils.py
   ├── resource
   │   └── model.png
   ├── scripts
   │   ├── fb15k-237
   │   └── wn18rr
   ├── QA
   ├── main.py
   └── requirements.txt
```

# How to run

## KGC Task

### Pre-trained Model Preparation

- First, create a directory for the pre-trained model:
```shell
>> cd Relphormer
>> mkdir -p Pre-trained_models/bert-base-uncased
>> cd Pre-trained_models/bert-base-uncased
```
- Then download the pre-trained BERT files:
```
wget https://hf-mirror.com/bert-base-uncased/resolve/main/config.json
wget https://hf-mirror.com/bert-base-uncased/resolve/main/pytorch_model.bin
wget https://hf-mirror.com/bert-base-uncased/resolve/main/tokenizer.json
wget https://hf-mirror.com/bert-base-uncased/resolve/main/tokenizer_config.json
wget https://hf-mirror.com/bert-base-uncased/resolve/main/vocab.txt
```

### Entity Embedding Initialization

- Then use the command below to add the entities to BERT and initialize the entity embedding layer to be used in later training.
For other datasets, just replace the dataset name in the commands (e.g. `fb15k-237` or `wn18rr`).

```shell
>> cd Relphormer
>> bash scripts/fb15k-237/pretrain_fb15k.sh
```

The pretrained models are saved in the `Relphormer/pretrain/output` directory.

For convenience, we provide our processed masked files and **pre-trained checkpoints**; you can download them from [here](https://drive.google.com/drive/folders/1siVVMNJYdYWcFby3PhEZv-UiLxbOYHnx?usp=share_link).

### Entity Prediction

- Next, use the command below to train the model to predict the correct entity at the masked position. The same applies to the other datasets.

```shell
>> cd Relphormer
>> python dataset/create_neighbor.py --dataset "fb15k-237"
>> bash scripts/fb15k-237/fb15k.sh
```

The trained models are saved in the `Relphormer/output` directory.

## QA Task

The QA experiments follow the [HittER](https://arxiv.org/pdf/2008.12813.pdf) experimental settings, and the environment can be installed by referring to its [GitHub repository](https://github.com/microsoft/HittER).
We only modified **hitter-best.py** to fit our model.

- The Relphormer model used for QA can be downloaded [here](https://drive.google.com/file/d/1FK_A_kFq1ECoNm75RfkcvYv8rZiJL1Bw/view?usp=sharing).

```shell
>> cd QA
>> sh scripts/relphormer_fbqa.sh
>> sh scripts/relphormer_fbqa_filtered.sh
>> sh scripts/relphormer_webqsp.sh
>> sh scripts/relphormer_webqsp_filtered.sh
```

# Citation

If you use the code, please cite the following paper:

```bibtex
@article{BI2023127044,
  title   = {Relphormer: Relational Graph Transformer for knowledge graph representations},
  journal = {Neurocomputing},
  pages   = {127044},
  year    = {2023},
  issn    = {0925-2312},
  author  = {Zhen Bi and Siyuan Cheng and Jing Chen and Xiaozhuan Liang and Feiyu Xiong and Ningyu Zhang},
}
```