Paper: [GTN] Graph Transformer Networks
Extension Paper: [fastGTN] Graph Transformer Networks: Learning Meta-path Graphs to Improve GNNs
Code from author: https://github.com/seongjunyun/Graph_Transformer_Networks
Clone the OpenHGNN repository, then run:
# Run GTN
python main.py -m GTN -t node_classification -d acm4GTN -g 0 --use_best_config
# Run fastGTN
python main.py -m fastGTN -t node_classification -d acm4GTN -g 0 --use_best_config
If you do not have a GPU, set `-g -1`.
Candidate dataset: acm4GTN / imdb4GTN

Performance: node classification

| Node classification | acm4GTN | imdb4GTN |
|---|---|---|
| paper[GTN] | 92.68 | 60.92 |
| OpenHGNN[GTN] | 92.22 | 61.58 |
| OpenHGNN[fastGTN] | 91.80 | 58.90 |
The model is trained for semi-supervised node classification.
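In semi-supervised node classification, only a small labeled subset of nodes contributes to the training loss. A minimal sketch of the masked cross-entropy loss, using NumPy and made-up toy data (the array names here are illustrative, not OpenHGNN's API):

```python
import numpy as np

rng = np.random.default_rng(0)

num_nodes, num_classes = 6, 3
logits = rng.normal(size=(num_nodes, num_classes))   # model outputs for all nodes
labels = np.array([0, 2, 1, 0, 1, 2])                # ground-truth classes
train_mask = np.array([True, True, False, False, True, False])  # labeled nodes only

# Softmax over classes, then negative log-likelihood restricted to masked nodes.
exp = np.exp(logits - logits.max(axis=1, keepdims=True))
probs = exp / exp.sum(axis=1, keepdims=True)
nll = -np.log(probs[np.arange(num_nodes), labels])
loss = nll[train_mask].mean()
```

The unlabeled nodes still participate in message passing; they are simply excluded from the loss.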
- transform_relation_graph_list
  - Extracts a list of graphs, each containing a single relation.
- GTLayer
  - Contains GTConv.
  - Computes the product of the adjacency matrices of the two graphs obtained from GTConv.
- GTConv
  - Creates a weighted graph whose adjacency matrix is a weighted sum of the adjacency matrices of the given graph list.
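The two operations above can be sketched on dense matrices. This is a hypothetical NumPy illustration, not the authors' implementation: `gt_conv` soft-selects a relation via a softmax over learnable weights, and `gt_layer` multiplies two such selections to form a length-2 meta-path adjacency.

```python
import numpy as np

def softmax(w):
    e = np.exp(w - w.max())
    return e / e.sum()

def gt_conv(relation_adjs, conv_weights):
    """GTConv sketch: softmax-weighted sum of relation adjacency matrices."""
    alpha = softmax(conv_weights)
    return sum(a * adj for a, adj in zip(alpha, relation_adjs))

def gt_layer(relation_adjs, w1, w2):
    """GTLayer sketch: product of two GTConv outputs,
    i.e. the adjacency of a length-2 meta-path graph."""
    return gt_conv(relation_adjs, w1) @ gt_conv(relation_adjs, w2)

# Two toy relations on 3 nodes (e.g. paper->author and author->paper).
A_pa = np.array([[0., 1., 0.], [0., 0., 1.], [1., 0., 0.]])
A_ap = A_pa.T

# Large weight gaps make the softmax nearly one-hot, so this approximates
# the meta-path product A_pa @ A_ap.
meta = gt_layer([A_pa, A_ap], np.array([5.0, -5.0]), np.array([-5.0, 5.0]))
```

Because the softmax weights are differentiable, the model can learn which meta-paths to compose instead of requiring them to be hand-designed.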
Supported dataset: acm4GTN, imdb4GTN
Note: every node in the dataset should have the same feature dimension.
We process the ACM dataset provided by HAN. It is saved as a dgl.heterograph and can be loaded with dgl.load_graphs.
You can download the datasets with:
wget https://s3.cn-north-1.amazonaws.com.cn/dgl-data/dataset/acm4GTN.zip
wget https://s3.cn-north-1.amazonaws.com.cn/dgl-data/dataset/imdb4GTN.zip
Or run the commands mentioned above and the datasets will be downloaded automatically.
num_channels = 2        # number of channels
num_layers = 3          # number of layers
adaptive_lr_flag = True # use a different learning rate for the weights in GTLayer
The best configs can be found in best_config.
dgl.adj_product_graph, which is equivalent to SpSpMM (sparse-sparse matrix multiplication).
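The meta-path composition that dgl.adj_product_graph performs on the graph level corresponds to multiplying two sparse adjacency matrices. A small sketch of that SpSpMM using SciPy (the matrices are toy data, not taken from the datasets):

```python
import numpy as np
from scipy.sparse import csr_matrix

# Toy adjacency of one relation and its reverse, stored sparsely.
A = csr_matrix(np.array([[0, 1, 0],
                         [0, 0, 1],
                         [1, 0, 0]], dtype=float))
B = A.T.tocsr()

# SpSpMM: the product is the adjacency of the composed 2-hop meta-path graph.
C = A @ B
```

Keeping both operands sparse is what makes composing meta-paths over large graphs tractable; a dense product would cost O(n^2) memory per relation pair.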
Tianyu Zhao[GAMMA LAB]
Submit an issue or send an email to [email protected].