GUARD: Graph Universal Adversarial Defense

PyTorch implementation of the CIKM 2023 paper: "GUARD: Graph Universal Adversarial Defense" [arXiv].

Fig. 1. An illustrative example of graph universal defense. The universal patch p can be applied to an arbitrary node (here v1) to protect it from targeted adversarial attacks by removing adversarial edges (if any exist).
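
The sketch below is only an illustration of this idea, not the repository's actual API: apply_patch, adj, and patch are hypothetical names, and the graph is assumed to be a dense PyTorch adjacency matrix. Applying the universal patch to a target node simply drops its edges to the nodes flagged by the patch; the real implementation lives in this repo (see demo.ipynb).

# Minimal sketch (hypothetical names), assuming a dense adjacency matrix `adj`
# and a binary patch vector `patch` flagging potentially adversarial neighbors.
import torch

def apply_patch(adj: torch.Tensor, patch: torch.Tensor, target: int) -> torch.Tensor:
    """Remove edges between `target` and every node flagged by `patch`."""
    defended = adj.clone()
    flagged = patch.bool()
    defended[target, flagged] = 0  # drop edges from the target to flagged nodes
    defended[flagged, target] = 0  # drop the reverse edges to keep the graph symmetric
    return defended

# Example: a 5-node graph where the patch flags nodes 3 and 4.
adj = torch.ones(5, 5) - torch.eye(5)
patch = torch.tensor([0, 0, 0, 1, 1])
defended_adj = apply_patch(adj, patch, target=1)  # protect node v1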

Requirements

  • torch==1.9
  • dgl==0.7.0

Datasets

Install the bundled graphattack package, which provides the datasets:

cd GraphAttack
pip install -e .

Quick Start

See demo.ipynb

Reproduce results in the paper

Run:

python evaluate_guard.py

Cite

If you find this repo helpful, please cite our work:

@inproceedings{li2022guard,
  author       = {Jintang Li and
                  Jie Liao and
                  Ruofan Wu and
                  Liang Chen and
                  Zibin Zheng and
                  Jiawang Dan and
                  Changhua Meng and
                  Weiqiang Wang},
  title        = {{GUARD:} Graph Universal Adversarial Defense},
  booktitle    = {{CIKM}},
  pages        = {1198--1207},
  publisher    = {{ACM}},
  year         = {2023}
}
