Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/szymonrucinski/graph-convolutional-networks
Train your own GCN model using the latest pytorch-geometric to solve an image classification problem.
- Host: GitHub
- URL: https://github.com/szymonrucinski/graph-convolutional-networks
- Owner: szymonrucinski
- Created: 2021-06-12T21:47:29.000Z (over 3 years ago)
- Default Branch: master
- Last Pushed: 2022-06-30T18:01:34.000Z (over 2 years ago)
- Last Synced: 2024-11-11T04:41:12.376Z (3 months ago)
- Topics: graph-convolutional-networks, network, pytorch-geometric, pytrorch
- Language: Jupyter Notebook
- Homepage:
- Size: 1.05 MB
- Stars: 1
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
README
# Spline graph networks
All experiments were performed on a machine with the following configuration:

|CPU|GPU|RAM|
| :-: | :-: | :-: |
|AMD Ryzen 5 2600|GTX 1660 Super OC 6 GB|32 GB DDR4|
## Category 1 experiment
The task is to prepare a program and conduct an experiment that evaluates the effectiveness of a selected graph neural network model in an inductive setting, using the graph equivalent of the MNIST dataset. A sketch of loading such a dataset is shown below.
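A minimal loading sketch, assuming the "graph equivalent of MNIST" is PyTorch Geometric's `MNISTSuperpixels` dataset; the `Cartesian` transform and the batch size are illustrative choices, since the README does not name them explicitly.

```python
# Minimal sketch, assuming the graph version of MNIST is PyG's MNISTSuperpixels dataset.
import torch_geometric.transforms as T
from torch_geometric.datasets import MNISTSuperpixels
from torch_geometric.loader import DataLoader  # torch_geometric.data.DataLoader in older releases

# SplineConv expects pseudo-coordinates as edge attributes; T.Cartesian provides them.
transform = T.Cartesian(cat=False)
train_dataset = MNISTSuperpixels("data/MNIST", train=True, transform=transform)
test_dataset = MNISTSuperpixels("data/MNIST", train=False, transform=transform)

train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
test_loader = DataLoader(test_dataset, batch_size=64)
```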
## Experiment 1.1

|Network architecture|
| :- |
|Net(<br>  (conv1): SplineConv(1, 32, dim=2)<br>  (conv2): SplineConv(32, 64, dim=2)<br>  (fc1): Linear(in_features=64, out_features=128, bias=True)<br>  (fc2): Linear(in_features=128, out_features=10, bias=True)<br>)|

|Optimizer|Learning rate|Number of epochs|Patience|Objective function|Training time|
| :-: | :-: | :-: | :-: | :-: | :-: |
|Adam|0.01|50|10|Cross-entropy|68 minutes|

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.003.png)
Figure 1.1 - Visualization of the analyzed graph and the corresponding MNIST image.

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.004.png)
Figure 1.2 - Classification accuracy (%) on the training set

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.005.png)
Figure 1.3 - Change in the loss (objective function) during training
The final accuracy achieved for the test set is **94.2%**.
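For reference, a minimal sketch of how the Experiment 1.1 architecture above can be written with PyTorch Geometric. The printed repr does not show the SplineConv `kernel_size`, the activation functions, or the graph-level readout, so `kernel_size=5`, ELU, and global mean pooling are assumptions.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import SplineConv, global_mean_pool  # SplineConv needs the torch-spline-conv extension


class Net(torch.nn.Module):
    """Sketch of the Experiment 1.1 model: two SplineConv layers followed by two linear layers."""

    def __init__(self):
        super().__init__()
        # kernel_size is not shown in the printed architecture; 5 is an assumed value
        self.conv1 = SplineConv(1, 32, dim=2, kernel_size=5)
        self.conv2 = SplineConv(32, 64, dim=2, kernel_size=5)
        self.fc1 = torch.nn.Linear(64, 128)
        self.fc2 = torch.nn.Linear(128, 10)

    def forward(self, data):
        x, edge_index, edge_attr = data.x, data.edge_index, data.edge_attr
        x = F.elu(self.conv1(x, edge_index, edge_attr))
        x = F.elu(self.conv2(x, edge_index, edge_attr))
        x = global_mean_pool(x, data.batch)   # graph-level readout (assumed)
        x = F.elu(self.fc1(x))
        return F.log_softmax(self.fc2(x), dim=1)
```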
## Experiment 1.2

|Network architecture|
| :- |
|GCN2(<br>  (conv1): SplineConv(1, 64, dim=2)<br>  (conv2): SplineConv(64, 128, dim=2)<br>  (fc1): Linear(in_features=128, out_features=256, bias=True)<br>  (fc2): Linear(in_features=256, out_features=10, bias=True)<br>)|

|Optimizer|Learning rate|Number of epochs|Patience|Objective function|Training time|
| :-: | :-: | :-: | :-: | :-: | :-: |
|Adam|0.01|50|10|Cross-entropy|70 minutes|

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.006.png)
Figure 1.4 - Classification accuracy (%) on the training set

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.007.png)
Figure 1.5 - Change in the loss (objective function) during training
The final accuracy achieved for the test set is **94.6%**.
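A hedged sketch of a training loop matching the hyperparameters listed for Experiments 1.1 and 1.2 (Adam, learning rate 0.01, up to 50 epochs, patience 10, cross-entropy). It reuses the `Net` model and `train_loader` sketched above; monitoring the training loss as the early-stopping criterion is an assumption.

```python
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = Net().to(device)                     # Net and train_loader come from the sketches above
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

best_loss, patience, bad_epochs = float("inf"), 10, 0
for epoch in range(50):
    model.train()
    epoch_loss = 0.0
    for batch in train_loader:
        batch = batch.to(device)
        optimizer.zero_grad()
        # cross-entropy implemented as negative log-likelihood over log-softmax outputs
        loss = F.nll_loss(model(batch), batch.y)
        loss.backward()
        optimizer.step()
        epoch_loss += loss.item()

    if epoch_loss < best_loss:               # early stopping on the training loss (assumed criterion)
        best_loss, bad_epochs = epoch_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            break
```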
Reference classification error rates (%) reported in the literature for other network architectures on this dataset:

|GCGP|PNCNN|Dynamic Reduction Network|
| :- | :- | :- |
|4.2|1.24|0.95|

## Category 2 experiment
The task is to prepare a program and conduct an experiment that evaluates the effectiveness of a selected graph neural network model in a transductive setting, using one of the datasets from the Planetoid collection. A loading sketch is shown below.
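A minimal sketch of loading a Planetoid dataset with PyTorch Geometric. Cora is assumed here because the layers in the experiments below take 1433 input features and produce 7 classes, which matches Cora; the feature normalization transform is an illustrative choice.

```python
import torch_geometric.transforms as T
from torch_geometric.datasets import Planetoid

# Cora (1433 node features, 7 classes) is assumed from the layer sizes used below.
dataset = Planetoid(root="data/Planetoid", name="Cora", transform=T.NormalizeFeatures())
data = dataset[0]  # a single graph with train/val/test node masks (transductive setting)
```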
## Experiment 2.1

|Network architecture|
| :- |
|GCN(<br>  (conv1): GCNConv(1433, 32)<br>  (conv2): GCNConv(32, 7)<br>)|

|Optimizer|Learning rate|Number of epochs|Objective function|Training time|
| :-: | :-: | :-: | :-: | :-: |
|Adam|0.001|200|Cross-entropy|9.8 s|

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.008.png)
Figure 2.1 - Visualization of the elements of the dataset

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.009.png)
Figure 2.2 - Classification accuracy (%) on the training set

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.010.png)
Figure 2.3 - Change in the loss (objective function) during training
The final accuracy achieved is **78.9%**.
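A minimal sketch of the two-layer GCN above in PyTorch Geometric; the ReLU activation and the dropout layer are assumptions, since the printed architecture only lists the two GCNConv layers.

```python
import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv


class GCN(torch.nn.Module):
    """Sketch of the Experiment 2.1 model; hidden=512 reproduces Experiment 2.2."""

    def __init__(self, num_features=1433, hidden=32, num_classes=7):
        super().__init__()
        self.conv1 = GCNConv(num_features, hidden)
        self.conv2 = GCNConv(hidden, num_classes)

    def forward(self, data):
        x, edge_index = data.x, data.edge_index
        x = F.relu(self.conv1(x, edge_index))               # activation assumed
        x = F.dropout(x, p=0.5, training=self.training)     # dropout assumed
        return F.log_softmax(self.conv2(x, edge_index), dim=1)
```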
## Experiment 2.2

|Network architecture|
| :- |
|GCN(<br>  (conv1): GCNConv(1433, 512)<br>  (conv2): GCNConv(512, 7)<br>)|

|Optimizer|Learning rate|Number of epochs|Objective function|Training time|
| :-: | :-: | :-: | :-: | :-: |
|Adam|0.001|200|Cross-entropy|7.5 s|

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.009.png)
Figure 2.4 - Classification accuracy (%) on the training set

![](images/Aspose.Words.8defa0a5-3800-4dad-b7c5-de76fa555a1b.010.png)
Figure 2.5 - Change in the loss (objective function) during training
The final accuracy achieved is **79.6%**.
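A hedged sketch of the transductive training and evaluation loop matching the hyperparameters listed above (Adam, learning rate 0.001, 200 epochs, cross-entropy). It reuses the `GCN` class and the `data` object from the earlier sketches; training on `train_mask` and evaluating on `test_mask` follows the standard Planetoid split, which is assumed to be the setup used here.

```python
import torch
import torch.nn.functional as F

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = GCN(dataset.num_node_features, 512, dataset.num_classes).to(device)  # 512 = Experiment 2.2 width
data = data.to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

for epoch in range(200):
    model.train()
    optimizer.zero_grad()
    out = model(data)
    # cross-entropy (NLL over log-softmax) computed on the training nodes only
    loss = F.nll_loss(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

model.eval()
pred = model(data).argmax(dim=1)
acc = (pred[data.test_mask] == data.y[data.test_mask]).float().mean().item()
print(f"Test accuracy: {acc:.3f}")
```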
Reference accuracy (%) reported in the literature for other network architectures on this dataset:

|GAT|SplineCNN|SSP|
| :- | :- | :- |
|83.00%|89.48%|90.16%|