https://github.com/aveek-saha/graph-attention-net
A TensorFlow 2 implementation of Graph Attention Networks (GAT)
- Host: GitHub
- URL: https://github.com/aveek-saha/graph-attention-net
- Owner: Aveek-Saha
- License: mit
- Created: 2020-09-03T18:45:04.000Z (about 5 years ago)
- Default Branch: master
- Last Pushed: 2020-09-06T14:06:10.000Z (about 5 years ago)
- Last Synced: 2025-02-15T20:24:50.625Z (8 months ago)
- Topics: gat, graph-attention-networks, graph-neural-networks, graphs, tensorflow, tensorflow2
- Language: Python
- Homepage:
- Size: 165 KB
- Stars: 4
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Graph Attention Networks
A TensorFlow 2 implementation of Graph Attention Networks for node classification, from the paper [Graph Attention Networks](https://arxiv.org/abs/1710.10903) (Veličković et al., ICLR 2018). This is my attempt to understand and recreate the neural network from the paper. You can find the official implementation here: https://github.com/PetarV-/GAT
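
The core building block described in the paper is a masked self-attention layer: each node's features are linearly transformed with a shared weight matrix W, attention coefficients `e_ij = LeakyReLU(a^T [W h_i || W h_j])` are computed for neighbouring node pairs, normalised with a softmax over each neighbourhood, and used to aggregate the transformed features. The sketch below is a minimal single-head version of such a layer in TensorFlow 2; it is illustrative only and not taken from this repository (the class name `GraphAttentionLayer`, its interface, and the dense adjacency-matrix masking are assumptions).

```python
# Minimal single-head GAT layer sketch (illustrative, not this repository's code).
import tensorflow as tf


class GraphAttentionLayer(tf.keras.layers.Layer):
    """Dense (adjacency-matrix based) graph attention layer; assumed interface."""

    def __init__(self, units, **kwargs):
        super().__init__(**kwargs)
        self.units = units

    def build(self, input_shape):
        feature_dim = int(input_shape[0][-1])
        # Shared linear transform W applied to every node's feature vector.
        self.W = self.add_weight(
            name="W", shape=(feature_dim, self.units), initializer="glorot_uniform")
        # Attention vector a, split into "self" and "neighbour" halves so that
        # a^T [W h_i || W h_j] becomes a broadcasted sum instead of a pairwise concat.
        self.a_self = self.add_weight(
            name="a_self", shape=(self.units, 1), initializer="glorot_uniform")
        self.a_neigh = self.add_weight(
            name="a_neigh", shape=(self.units, 1), initializer="glorot_uniform")

    def call(self, inputs):
        # features: (N, F) node features; adjacency: (N, N), 1.0 where an edge exists.
        features, adjacency = inputs
        h = tf.matmul(features, self.W)                                   # (N, units)
        # e_ij = LeakyReLU(a_self . h_i + a_neigh . h_j) for all pairs (i, j).
        scores = tf.matmul(h, self.a_self) + tf.transpose(tf.matmul(h, self.a_neigh))
        scores = tf.nn.leaky_relu(scores, alpha=0.2)                      # (N, N)
        # Mask non-neighbours before the softmax so attention stays on the graph.
        attention = tf.nn.softmax(scores - 1e9 * (1.0 - adjacency), axis=-1)
        return tf.nn.elu(tf.matmul(attention, h))                         # (N, units)


if __name__ == "__main__":
    # Toy 3-node graph: random features, ring adjacency with self-loops.
    x = tf.random.normal((3, 5))
    adj = tf.constant([[1., 1., 0.], [0., 1., 1.], [1., 0., 1.]])
    print(GraphAttentionLayer(units=8)([x, adj]).shape)  # (3, 8)
```

Splitting the attention vector into "self" and "neighbour" halves is the usual trick in dense GAT implementations: the pairwise scores come from broadcasting two (N, 1) projections rather than materialising every concatenated pair.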
## Requirements
- tensorflow 2
- networkx
- numpy
- scikit-learn
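
One way to install these dependencies is with pip; the package names below are assumed to match their PyPI names, and the project does not specify exact versions here:

```bash
# Assumed install command; adjust versions as needed for TensorFlow 2 compatibility.
pip install tensorflow networkx numpy scikit-learn
```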
## Run

To train and test the network on the CORA dataset, run:
```bash
python train.py
```

## Cite
Please cite the original paper if you use this code in your own work:
```
@article{
velickovic2018graph,
title="{Graph Attention Networks}",
author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
journal={International Conference on Learning Representations},
year={2018},
url={https://openreview.net/forum?id=rJXMpikCZ},
note={accepted as poster},
}
```