PyTorch implementation of the CIKM 2023 paper: "GUARD: Graph Universal Adversarial Defense" [arXiv].
Fig. 1. An illustrative example of graph universal defense. The universal patch p can be applied to an arbitrary node (here v1) to protect it from targeted adversarial attacks by removing adversarial edges (if any exist).
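The mechanism in Fig. 1 can be sketched in a few lines. This is a toy illustration only, not the repo's API: the graph representation, function name, and node IDs below are made up. The key idea is that the patch is *universal*: the same fixed set of anchor nodes is used regardless of which node is being protected.

```python
# Toy sketch of a GUARD-style universal patch (illustration only, not the
# repo's API). The patch is a fixed set of "anchor" nodes; defending a
# target node means removing any edges between the target and the patch.

def apply_universal_patch(adj, target, patch):
    """Return a defended copy of `adj` (dict: node -> set of neighbors)
    with edges between `target` and every node in `patch` removed."""
    defended = {u: set(vs) for u, vs in adj.items()}
    for v in patch:
        defended[target].discard(v)
        if v in defended:
            defended[v].discard(target)
    return defended

# Example: v1 is attacked via adversarial edges to nodes 7 and 9.
adj = {1: {2, 7, 9}, 2: {1}, 7: {1}, 9: {1}}
patch = {7, 9}  # universal: the same patch would be applied to any node
defended = apply_universal_patch(adj, target=1, patch=patch)
print(defended[1])  # -> {2}
```

Because the patch is shared across all nodes, it can be computed once and applied at inference time with no per-node optimization.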
## Requirements

- torch==1.9
- dgl==0.7.0
## Datasets

- Cora (available in `data/`)
- Pubmed (available in `data/`)
- ogbn-arxiv (from OGB)
- Reddit (from http://snap.stanford.edu/graphsage/)
## Installation

Install `graphattack`:

```bash
cd GraphAttack
pip install -e .
```
## Usage

See `demo.ipynb` for a walkthrough. To reproduce the evaluation, run:

```bash
python evaluate_guard.py
```
## Citation

If you find this repo helpful, please cite our work:
```bibtex
@inproceedings{li2022guard,
  author    = {Jintang Li and
               Jie Liao and
               Ruofan Wu and
               Liang Chen and
               Zibin Zheng and
               Jiawang Dan and
               Changhua Meng and
               Weiqiang Wang},
  title     = {{GUARD:} Graph Universal Adversarial Defense},
  booktitle = {{CIKM}},
  pages     = {1198--1207},
  publisher = {{ACM}},
  year      = {2023}
}
```