https://github.com/HKUST-KnowComp/FKGE
Code for CIKM 2021 paper: Differentially Private Federated Knowledge Graphs Embedding (https://arxiv.org/abs/2105.07615)
differential-privacy federated-learning knowledge-graph
- Host: GitHub
- URL: https://github.com/HKUST-KnowComp/FKGE
- Owner: HKUST-KnowComp
- Created: 2021-05-17T02:10:35.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2022-12-06T05:59:50.000Z (about 2 years ago)
- Last Synced: 2024-08-03T09:07:10.660Z (5 months ago)
- Topics: differential-privacy, federated-learning, knowledge-graph
- Language: Python
- Homepage:
- Size: 11.9 MB
- Stars: 29
- Watchers: 3
- Forks: 4
- Open Issues: 0
Metadata Files:
- Readme: README.md
Awesome Lists containing this project
- StarryDivineSky - HKUST-KnowComp/FKGE
README
## Differentially Private Federated Knowledge Graphs Embedding
### Data Release
The datasets used in our experiments have been partially uploaded.
All the KGs are available at https://drive.google.com/file/d/1oD1Gv2RbpNzO8GWGq7SusbAmYih5r-6Q/view?usp=sharing.
Make sure to put the KGs from Google Drive into ```OpenKE/benchmarks```.

**Update**: The aligned files have been updated and are already in the ```trainse_data/aligned``` folder.
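As a sanity check before training, a small script can confirm the data directories are in place (a minimal sketch; the two paths are the ones named above, and this check is not part of the original code):

```python
import os

# Directories this README expects to be populated before training.
EXPECTED_DIRS = [
    os.path.join("OpenKE", "benchmarks"),     # KGs from the Google Drive link
    os.path.join("trainse_data", "aligned"),  # pre-aligned files shipped in the repo
]

def missing_data_dirs(root="."):
    """Return the expected data directories that are absent under `root`."""
    return [d for d in EXPECTED_DIRS if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_data_dirs()
    if missing:
        print("Data placement incomplete; missing:", ", ".join(missing))
    else:
        print("Data directories in place.")
```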
### Package Dependencies
* numpy
* tensorflow 1.xx
* tensorflow_probability

### Baseline Embeddings
**You need to run the baseline experiments to obtain the KG embeddings through the following code**:
```python Config.py baseline 300 100 1.0 -1```
The parameters denote the mode, number of epochs, embedding dimension, gan_ratio, and pred_id, respectively.
Note that if you want to try other embedding algorithms, or if files like ```1-1.txt``` are missing, you need to run ```n_n.py``` from ```OpenKE/benchmarks``` for each KG in ```OpenKE/benchmarks/KG_1```.
You can replace ```baseline``` with ```strategy_1``` or ```strategy_2``` to conduct the FKGE experiments.

Running the baseline embeddings creates an ```experiment/``` folder; the embeddings are placed inside ```experiment/0/``` if you specify ```pred_id=-1```.
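The five positional arguments can be pictured as follows (a hypothetical sketch of the command-line interface described above; the actual parsing inside `Config.py` may differ):

```python
import sys

def parse_config_args(argv):
    """Map the five positional arguments of Config.py to named values."""
    mode = argv[0]               # "baseline", "strategy_1", or "strategy_2"
    epochs = int(argv[1])        # e.g. 300
    dimension = int(argv[2])     # embedding dimension, e.g. 100
    gan_ratio = float(argv[3])   # e.g. 1.0
    pred_id = int(argv[4])       # -1 writes baseline embeddings to experiment/0/
    return mode, epochs, dimension, gan_ratio, pred_id

if __name__ == "__main__":
    print(parse_config_args(sys.argv[1:]))
```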
### Federated Knowledge Graphs Embedding
**After obtaining the KGs' initial embeddings from the baseline run (make sure there are embeddings in the ```experiment/0/``` folder), run**:
```python Config.py strategy_1 300 100 1.0 0```
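The two stages can be chained with a small driver script (a sketch, assuming `Config.py` sits in the working directory; the command lines are the ones given in this README):

```python
import os
import subprocess
import sys

def stage_cmd(mode, epochs=300, dim=100, gan_ratio=1.0, pred_id=-1):
    """Build the Config.py command line for one training stage."""
    return [sys.executable, "Config.py", mode,
            str(epochs), str(dim), str(gan_ratio), str(pred_id)]

def run_pipeline():
    # Stage 1: baseline embeddings, written to experiment/0/ when pred_id=-1.
    subprocess.run(stage_cmd("baseline", pred_id=-1), check=True)
    if not os.path.isdir(os.path.join("experiment", "0")):
        raise RuntimeError("baseline embeddings not found in experiment/0/")
    # Stage 2: federated training (strategy_1) on top of the baseline embeddings.
    subprocess.run(stage_cmd("strategy_1", pred_id=0), check=True)
```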
### DPFKGE
If you want to train FKGE with the *PATE* mechanism, in `Config.py`, replace ```from FederalTransferLearning.hetro_AGCN_mul_dataset import GAN```
with
```from FederalTransferLearning.hetro_AGCN_mul_dataset_pate import GAN```
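If you prefer not to edit the import by hand, the variant can also be selected at run time (a sketch; the `use_pate` switch is hypothetical, but the two module paths are the ones named above):

```python
import importlib

def gan_module_path(use_pate):
    """Return the module providing GAN, per the two variants in this README."""
    base = "FederalTransferLearning.hetro_AGCN_mul_dataset"
    return base + "_pate" if use_pate else base

def load_gan(use_pate=False):
    """Import the selected module and return its GAN class."""
    return importlib.import_module(gan_module_path(use_pate)).GAN
```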
### Citation
* Paper: https://arxiv.org/abs/2105.07615

If you use this code in your work, please kindly cite it:
```
@inproceedings{Peng-2021-DPFKGE,
title={Differentially Private Federated Knowledge Graphs Embedding},
author={Hao Peng and
Haoran Li and
Yangqiu Song and
Vincent W. Zheng and
Jianxin Li},
booktitle={CIKM 2021},
year={2021},
url={https://arxiv.org/abs/2105.07615}
}
```
### Miscellaneous

Please send any questions about the code and/or the algorithm to [email protected].