Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/vijaydwivedi75/gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
- Host: GitHub
- URL: https://github.com/vijaydwivedi75/gnn-lspe
- Owner: vijaydwivedi75
- License: MIT
- Created: 2021-09-13T06:00:28.000Z (over 3 years ago)
- Default Branch: main
- Last Pushed: 2022-02-10T08:18:34.000Z (almost 3 years ago)
- Last Synced: 2024-12-13T03:34:26.029Z (10 days ago)
- Topics: attention, geometric-deep-learning, gnn, gnn-lspe, graph-deep-learning, graph-neural-networks, graph-representation-learning, graph-transformer, graphs, lspe, message-passing, molecules, positional-encoding, representation-learning, transformer-networks, transformers
- Language: Python
- Homepage: http://arxiv.org/abs/2110.07875
- Size: 267 KB
- Stars: 247
- Watchers: 4
- Forks: 35
- Open Issues: 1
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Graph Neural Networks with Learnable Structural and Positional Representations
Source code for the paper "**[Graph Neural Networks with Learnable Structural and Positional Representations](https://openreview.net/pdf?id=wTTjnvGphYj)**" by Vijay Prakash Dwivedi, Anh Tuan Luu, Thomas Laurent, Yoshua Bengio and Xavier Bresson, at the **Tenth International Conference on Learning Representations (ICLR) 2022**.
We propose a novel GNN architecture in which the structural and positional representations are decoupled and learned separately, so that the network can capture both of these essential properties. The architecture, named **MPGNNs-LSPE** (MPGNNs with **L**earnable **S**tructural and **P**ositional **E**ncodings), is generic: it can be applied to any GNN model that fits the popular 'message-passing framework', including Transformers.
![MPGNNs-LSPE](./docs/gnn-lspe.png)
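The decoupled update is easiest to see in code. Below is a minimal, self-contained PyTorch sketch of one LSPE-style message-passing layer, not the repository's actual implementation: the class name, the use of GRU cells, and the `tanh` on the positional channel are illustrative assumptions. The point is only that node features `h` and positional features `p` travel in separate channels, the structural update conditions on both, and the positional update sees only `p`.

```python
# Conceptual sketch of an LSPE-style layer (illustrative, not the repo's code):
# node features h and learnable positional features p are kept in separate
# channels and updated in parallel within each message-passing layer.
import torch
import torch.nn as nn

class LSPELayerSketch(nn.Module):
    def __init__(self, dim_h, dim_p):
        super().__init__()
        # structural messages condition on the concatenation [h, p]
        self.msg_h = nn.Linear(2 * (dim_h + dim_p), dim_h)
        self.upd_h = nn.GRUCell(dim_h, dim_h)
        # positional messages see only the positional features
        self.msg_p = nn.Linear(2 * dim_p, dim_p)
        self.upd_p = nn.GRUCell(dim_p, dim_p)

    def forward(self, h, p, edge_index):
        src, dst = edge_index  # edge_index: LongTensor of shape (2, num_edges)
        # structural channel: build messages from [h, p] of both endpoints, sum at targets
        hp = torch.cat([h, p], dim=-1)
        m_h = self.msg_h(torch.cat([hp[src], hp[dst]], dim=-1))
        agg_h = torch.zeros_like(h).index_add_(0, dst, m_h)
        # positional channel: messages built from p only, summed at targets
        m_p = self.msg_p(torch.cat([p[src], p[dst]], dim=-1))
        agg_p = torch.zeros_like(p).index_add_(0, dst, m_p)
        # decoupled updates: each channel has its own update function
        return self.upd_h(agg_h, h), torch.tanh(self.upd_p(agg_p, p))

# Toy usage: a 4-node cycle, 8-dim structural and 4-dim positional features.
h, p = torch.randn(4, 8), torch.randn(4, 4)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
h_new, p_new = LSPELayerSketch(8, 4)(h, p, edge_index)
```

In the paper, the positional channel is initialized with a precomputed encoding (e.g., random-walk or Laplacian positional encodings) and then refined layer by layer alongside `h`, rather than being fixed at the input as in standard positional-encoding schemes.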
## 1. Repo installation
[Follow these instructions](./docs/01_repo_installation.md) to install the repo and setup the environment.
## 2. Download datasets
[Proceed as follows](./docs/02_download_datasets.md) to download the benchmark datasets.
## 3. Reproducibility
[Use this page](./docs/03_run_codes.md) to run the codes and reproduce the published results.
## 4. Reference
:page_with_curl: Paper [on arXiv](https://arxiv.org/abs/2110.07875)
:movie_camera: Video by [@vijaydwivedi75](https://github.com/vijaydwivedi75) [on YouTube](https://youtu.be/fft2Q0jEWi0)
:movie_camera: Video by [@xbresson](https://github.com/xbresson) [on YouTube](https://youtu.be/hADjUl4ymoQ)
```
@inproceedings{dwivedi2022graph,
  title={Graph Neural Networks with Learnable Structural and Positional Representations},
  author={Vijay Prakash Dwivedi and Anh Tuan Luu and Thomas Laurent and Yoshua Bengio and Xavier Bresson},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=wTTjnvGphYj}
}
```