https://github.com/zjunlp/knn-kg
[NLPCC 2023] Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings with Language Models
- Host: GitHub
- URL: https://github.com/zjunlp/knn-kg
- Owner: zjunlp
- Created: 2021-12-18T05:10:27.000Z (about 4 years ago)
- Default Branch: main
- Last Pushed: 2023-07-31T13:21:27.000Z (over 2 years ago)
- Last Synced: 2025-06-13T23:05:08.894Z (8 months ago)
- Topics: fb15k-237, fb15k237, inductive, inductive-reasoning, kg, kge, knn, knn-kge, knnkg, knowledge-graph, knowledge-graph-embeddings, machine-learning, semiparametric-models, transductive, transductive-reasoning, wn18rr
- Language: Python
- Homepage:
- Size: 32.9 MB
- Stars: 52
- Watchers: 6
- Forks: 9
- Open Issues: 1
Metadata Files:
- Readme: README.md
# KNN-KG
Code for the NLPCC2023 paper "[Reasoning Through Memorization: Nearest Neighbor Knowledge Graph Embeddings with Language Models](https://arxiv.org/abs/2201.05575)".
Requirements
==========
To install requirements:
```shell
pip install -r requirements.txt
```
Run the experiments
==========
## Training
### Entity Embedding Initialization
Use the command below to add the entities to BERT's vocabulary and train the entity embedding layer used in later training. For the `WN18RR` dataset, just replace the dataset name.
```shell
./scripts/pretrain_fb15k.sh
```
The trained entity embedding layer parameters are then used in the entity prediction task below.
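Conceptually, this step adds each entity as a new vocabulary item and initializes its embedding from the entity's description. The sketch below illustrates that idea with plain NumPy (not the repo's actual code); the entity ids, description token ids, and helper names are hypothetical.

```python
# Illustrative sketch (NOT the repo's implementation): kNN-KG adds every
# entity as a new token and initializes its embedding row from the entity
# description; only these new rows are trained while BERT stays frozen.
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for BERT's pretrained word embedding matrix (vocab 30522, dim 768).
vocab_embeddings = rng.normal(size=(30522, 768))

def init_entity_embedding(desc_token_ids):
    """Initialize a new entity row as the mean of its description's token embeddings."""
    return vocab_embeddings[desc_token_ids].mean(axis=0)

# Two hypothetical Freebase entities, each described by a few token ids.
entity_descs = {"/m/02mjmr": [2054, 2003, 1996], "/m/09c7w0": [2023, 2001]}
entity_rows = np.stack([init_entity_embedding(ids) for ids in entity_descs.values()])

# Extended embedding matrix: original vocab (frozen) + entity rows (trainable).
extended = np.concatenate([vocab_embeddings, entity_rows], axis=0)
```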
### Entity Prediction Task
Use the command below to train the model to predict the correct entity at the masked position.
```shell
./scripts/fb15k-237/fb15k.sh
```
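The training objective here can be sketched as standard masked-entity prediction: the triple is linearized with the tail entity replaced by `[MASK]`, and the model is trained with cross-entropy over the entity vocabulary at that position. A minimal NumPy stand-in (the logits and entity ids are made up for illustration):

```python
# Sketch of the entity-prediction loss, assuming the model produces a score
# per entity at the [MASK] position. Not the repo's actual training loop.
import numpy as np

def cross_entropy(logits, target):
    """Negative log-likelihood of `target` under softmax(logits)."""
    z = np.exp(logits - logits.max())
    probs = z / z.sum()
    return -np.log(probs[target])

# Hypothetical scores over a 5-entity vocabulary at the masked position.
logits = np.array([0.1, 3.0, 0.2, 0.0, -1.0])
loss = cross_entropy(logits, target=1)  # gold tail entity has id 1
```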
## Construct the Knowledge Store
After training the model on the entity prediction task, we use it to build the knowledge store from triples and entity descriptions.
```shell
./scripts/fb15k-237/get_knowledge_store.sh
```
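The knowledge store pairs a key (the model's hidden state at the masked position for each training triple) with a value (the gold tail entity). The repo stores these in a faiss index; the sketch below uses exact NumPy nearest-neighbor search as a stand-in, and all triples and helper names are illustrative.

```python
# Hedged sketch of knowledge-store construction: one (key, value) pair per
# training triple, where the key is the [MASK] hidden state and the value is
# the gold tail entity. Plain NumPy stands in for faiss here.
import numpy as np

rng = np.random.default_rng(1)
HIDDEN = 768

def mask_representation(triple):
    """Placeholder for the BERT forward pass returning the [MASK] hidden state."""
    return rng.normal(size=HIDDEN)

train_triples = [("e1", "r1", "e2"), ("e3", "r1", "e2"), ("e1", "r2", "e4")]

keys = np.stack([mask_representation(t) for t in train_triples])  # datastore keys
values = np.array([t[2] for t in train_triples])                  # target tail entities

def knn_lookup(query, k=2):
    """Exact L2 nearest neighbors; faiss does this (approximately) at scale."""
    dists = np.linalg.norm(keys - query, axis=1)
    idx = np.argsort(dists)[:k]
    return values[idx], dists[idx]
```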
## Inference
With a trained model and the knowledge store (e.g., a `faiss.dump` file), use the command below to run inference on the test set.
```shell
./scripts/fb15k-237/inference.sh
```
For the inductive setting, the command is the same as in the transductive setting (just replace `dataset` with the inductive dataset); the code handles the differences automatically.
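At inference time, the paper interpolates the masked-LM entity distribution with a distribution derived from the retrieved neighbors, p = λ·p_kNN + (1−λ)·p_LM. The softmax-over-negative-distance aggregation below follows kNN-LM-style retrieval and is a sketch, not the repo's exact code; the entity ids and distances are made up.

```python
# Sketch of kNN-KG inference: mix the masked-LM distribution over entities
# with a kNN distribution built from knowledge-store neighbors.
import numpy as np

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def knn_distribution(neighbor_entities, neighbor_dists, num_entities):
    """Turn retrieved (entity, distance) pairs into a distribution over entities."""
    weights = softmax(-np.asarray(neighbor_dists, dtype=float))
    p = np.zeros(num_entities)
    for ent, w in zip(neighbor_entities, weights):
        p[ent] += w  # neighbors sharing a target entity accumulate mass
    return p

num_entities = 4
p_lm = softmax(np.array([2.0, 1.0, 0.5, 0.1]))            # masked-LM scores
p_knn = knn_distribution([1, 1, 3], [0.2, 0.4, 1.0], 4)   # retrieved neighbors
lam = 0.5                                                 # interpolation weight
p_final = lam * p_knn + (1 - lam) * p_lm
```

Closer neighbors get larger softmax weight, so entities retrieved at small distance (and retrieved repeatedly) dominate the kNN term.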