🔥 News

  • [2024-09-13] The new Transformer GAN model, LadaGAN, has been released. It achieves better FID scores, includes model checkpoints, and trains on a single GPU. Its code has been optimized for performance and adds new functionality.

TransGAN

Implementation of the Transformer-based GAN model in the paper:

TransGAN: Two Pure Transformers Can Make One Strong GAN, and That Can Scale Up.

Architecture

See here for the official PyTorch implementation.

Dependencies

  • Python 3.8
  • TensorFlow 2.5

Usage

Train

  1. Use --dataset_path=<path> to specify the dataset path (by default, the CIFAR-10 dataset is built) and --model_name=<name> to specify the checkpoint directory name.
python train.py --dataset_path=<path> --model_name=<name> 
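
For example (the dataset path and run name below are purely illustrative):

python train.py --dataset_path=./data/my_images --model_name=transgan_run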

Hparams setting

Adjust hyperparameters in the hparams.py file.
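
The exact contents of hparams.py are not reproduced here; as a minimal sketch, grounded in the defaults listed under Implementation notes below (the key names and the learning rate are assumptions):

    # Illustrative hyperparameter names; check hparams.py for the real ones.
    hparams = {
        'noise_dim': 64,        # latent vector size (see Implementation notes)
        'batch_size': 64,       # see Implementation notes
        'n_heads': 4,           # attention heads in Generator and Discriminator
        'beta_1': 0.0,          # Adam beta_1 (see Implementation notes)
        'beta_2': 0.99,         # Adam beta_2
        'learning_rate': 1e-4,  # assumption; not stated in this README
    }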

Tensorboard

Run tensorboard --logdir ./ to monitor training.

Examples

  • CIFAR-10 training progress

References

Code:

  • This model depends on other files that may be licensed under different open source licenses.
  • TransGAN uses Differentiable Augmentation, released under the BSD 2-Clause "Simplified" License; the sketch after this list shows the general pattern.
  • Small-TransGAN models are instances of the original TransGAN architecture with fewer layers and lower-dimensional embeddings.
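
The following is a minimal sketch of the differentiable-augmentation pattern, not the BSD-licensed DiffAugment code itself; the function name, the chosen ops, and the static-shape assumption are all illustrative:

    import tensorflow as tf

    def diff_augment(x):
        """Apply the same differentiable ops to real and fake images.

        Gradients flow through the augmentation back to the generator.
        Assumes a statically shaped NHWC batch, e.g. (64, 32, 32, 3).
        """
        b, h, w, c = x.shape
        # Random brightness shift (differentiable w.r.t. x).
        x = x + tf.random.uniform([b, 1, 1, 1], -0.5, 0.5)
        # Random translation via reflect-pad + crop (one offset per batch, for brevity).
        p = h // 8
        x = tf.pad(x, [[0, 0], [p, p], [p, p], [0, 0]], mode='REFLECT')
        return tf.image.random_crop(x, size=[b, h, w, c])

    # Usage inside a train step (generator/discriminator are the repo's models):
    # real_logits = discriminator(diff_augment(real_images), training=True)
    # fake_logits = discriminator(diff_augment(fake_images), training=True)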

Implementation notes:

  • Single layer per resolution in the Generator.
  • Orthogonal initializer and 4 attention heads in both Generator and Discriminator.
  • WGAN-GP loss (a sketch follows this list).
  • Adam with β1 = 0.0 and β2 = 0.99.
  • Noise dimension = 64.
  • Batch size = 64.
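
As a minimal sketch of this loss setup (the gradient-penalty weight of 10 and the 1e-4 learning rate are common defaults, not values stated in this README):

    import tensorflow as tf

    def gradient_penalty(discriminator, real, fake):
        # Penalize the critic's gradient norm on random real/fake interpolates.
        eps = tf.random.uniform([tf.shape(real)[0], 1, 1, 1], 0.0, 1.0)
        interp = eps * real + (1.0 - eps) * fake
        with tf.GradientTape() as tape:
            tape.watch(interp)
            pred = discriminator(interp, training=True)
        grads = tape.gradient(pred, interp)
        norm = tf.sqrt(tf.reduce_sum(tf.square(grads), axis=[1, 2, 3]) + 1e-12)
        return tf.reduce_mean(tf.square(norm - 1.0))

    def d_loss(real_logits, fake_logits, gp, gp_weight=10.0):
        # Critic pushes real scores up and fake scores down; the penalty
        # enforces an approximate 1-Lipschitz constraint.
        return tf.reduce_mean(fake_logits) - tf.reduce_mean(real_logits) + gp_weight * gp

    def g_loss(fake_logits):
        return -tf.reduce_mean(fake_logits)

    # Adam settings from the notes above (the learning rate is an assumption).
    g_opt = tf.keras.optimizers.Adam(1e-4, beta_1=0.0, beta_2=0.99)
    d_opt = tf.keras.optimizers.Adam(1e-4, beta_1=0.0, beta_2=0.99)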

License

MIT
