Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/diyang/GRAPH-NODE-RECOGINITION-R
MxNet implementation of Node Recognition in Large Scale Network
- Host: GitHub
- URL: https://github.com/diyang/GRAPH-NODE-RECOGINITION-R
- Owner: diyang
- License: gpl-3.0
- Created: 2018-11-21T22:09:06.000Z (about 6 years ago)
- Default Branch: master
- Last Pushed: 2018-12-09T13:17:09.000Z (about 6 years ago)
- Last Synced: 2024-05-22T04:19:52.484Z (8 months ago)
- Language: R
- Homepage:
- Size: 9.25 MB
- Stars: 3
- Watchers: 2
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
Awesome Lists containing this project
- Awesome-MXNet - SAGE-GRAPH
README
# Node Recognition in Large Scale Network
This repo contains an R/MXNet implementation of [this](https://arxiv.org/pdf/1706.02216.pdf) state-of-the-art graph convolutional neural network (GraphSAGE).
![](./docs/sample_and_agg.png)
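The figure above illustrates GraphSAGE's sample-and-aggregate step: sample a fixed number of neighbours, then aggregate their features with the node's own. A minimal base-R sketch of the idea (the toy graph, feature matrix, and function name are hypothetical, not the repo's actual code):

```r
# Toy 5-node graph as an adjacency list (hypothetical example data).
neighbours <- list(c(2, 3), c(1, 3, 4), c(1, 2, 5), c(2, 5), c(3, 4))
# One 2-dimensional feature vector per node (rows = nodes).
feats <- matrix(seq_len(10), nrow = 5, ncol = 2)

# Sample up to `k` neighbours of `node`, then mean-aggregate their
# features together with the node's own (GraphSAGE-mean style).
sample_and_aggregate <- function(node, k = 2) {
  nbrs <- neighbours[[node]]
  sampled <- sample(nbrs, min(k, length(nbrs)))
  colMeans(feats[c(node, sampled), , drop = FALSE])
}

set.seed(42)
h1 <- sample_and_aggregate(1)  # aggregated representation of node 1
```

In the full model this step is applied recursively K times, so a node's final representation mixes information from its K-hop neighbourhood.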
## Requirements
All examples use the MXNet 0.10.1 library for deep learning
and igraph 1.2.2 to store the graph data. To install the libraries, run the following commands in RStudio:
```
# note: the mxnet R package is not on CRAN; see the MXNet docs for its repository
install.packages("mxnet")
install.packages("igraph")
```
## Running the code
Download and extract the training data. Currently the examples in this directory are tested on the CORA dataset. The GraphSAGE model assumes that node features are available. The dataset can be found in the `example_data` folder, and the data are in CSV format.
The following is the description of the dataset:
> The Cora dataset consists of 2708 scientific publications classified into one of seven classes.
> The citation network consists of 5429 links. Each publication in the dataset is described by a
> 0/1-valued word vector indicating the absence/presence of the corresponding word from the dictionary.
> The dictionary consists of 1433 unique words. The README file in the dataset provides more details.
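A dataset shaped like this can be loaded with igraph. A sketch, assuming hypothetical file names (check `example_data/` for the actual ones) and the column layout described above:

```r
library(igraph)

# Hypothetical file names -- the real ones live in example_data/.
edges   <- read.csv("example_data/cora_cites.csv")    # citing id, cited id
content <- read.csv("example_data/cora_content.csv")  # id, 1433 word flags, label

# Build the citation graph; node features and labels stay in `content`.
g <- graph_from_data_frame(edges, directed = TRUE,
                           vertices = data.frame(name = content$id))
```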
## Results & Comparison
- Please run `main.R`
- This MXNet implementation achieves *NLL = 0.780* after 100 epochs on the validation dataset

## Hyperparameters
The default arguments in `main.R` achieve performance equivalent to the published results. For other datasets, the following hyperparameters provide a good starting point:
- K = 2
- hidden num = {20,20}
- random neighbour sampling = {20,10}
- learning rate = 0.005
- Dropout after every layer = 0.3
- Epochs = 100+
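Collected into a single R list, these defaults could look as follows (a sketch; the names are hypothetical and may differ from the arguments actually used in `main.R`):

```r
hyper <- list(
  K            = 2,          # search depth (number of aggregation hops)
  hidden_num   = c(20, 20),  # hidden units per layer
  sample_sizes = c(20, 10),  # random neighbours sampled at hops 1 and 2
  lr           = 0.005,      # learning rate
  dropout      = 0.3,        # dropout after every layer
  epochs       = 100         # train for 100+ epochs
)
```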