Episode 374 · The GAN is dead; long live the GAN! A Modern GAN Baseline

20:12

🤗 Upvotes: 27 | cs.LG, cs.CV

Authors:
Yiwen Huang, Aaron Gokaslan, Volodymyr Kuleshov, James Tompkin

Title:
The GAN is dead; long live the GAN! A Modern GAN Baseline

arXiv:
http://arxiv.org/abs/2501.05441v1

Abstract:
There is a widespread claim that GANs are difficult to train, and GAN architectures in the literature are littered with empirical tricks. We provide evidence against this claim and build a modern GAN baseline in a more principled manner. First, we derive a well-behaved regularized relativistic GAN loss that addresses issues of mode dropping and non-convergence that were previously tackled via a bag of ad hoc tricks. We analyze our loss mathematically and prove that it admits local convergence guarantees, unlike most existing relativistic losses. Second, our new loss allows us to discard all ad hoc tricks and replace the outdated backbones used in common GANs with modern architectures. Using StyleGAN2 as an example, we present a roadmap of simplification and modernization that results in a new minimalist baseline -- R3GAN. Despite its simplicity, our approach surpasses StyleGAN2 on the FFHQ, ImageNet, CIFAR, and Stacked MNIST datasets, and compares favorably against state-of-the-art GANs and diffusion models.
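
For readers who want a concrete picture of the loss, the objective described above combines a relativistic pairing term f(D(fake) − D(real)) with zero-centered gradient penalties on both real data (R1) and generated data (R2). Below is a minimal PyTorch-style sketch of that combination, not the authors' implementation; the choice f(t) = softplus(t), the NCHW image layout, and the gamma default are illustrative assumptions.

    import torch
    import torch.nn.functional as F

    def rpgan_d_loss(D, real, fake, gamma=0.1):
        # Discriminator side of the relativistic pairing loss:
        # minimize f(D(fake) - D(real)) with f(t) = softplus(t),
        # plus zero-centered gradient penalties R1 (on real) and R2 (on fake).
        real = real.detach().requires_grad_(True)
        fake = fake.detach().requires_grad_(True)
        d_real, d_fake = D(real), D(fake)
        adv = F.softplus(d_fake - d_real).mean()
        # R1: squared gradient norm of D at real samples.
        (g_real,) = torch.autograd.grad(d_real.sum(), real, create_graph=True)
        # R2: the same penalty at generated samples.
        (g_fake,) = torch.autograd.grad(d_fake.sum(), fake, create_graph=True)
        r1 = g_real.square().sum(dim=[1, 2, 3]).mean()  # assumes NCHW images
        r2 = g_fake.square().sum(dim=[1, 2, 3]).mean()
        return adv + (gamma / 2) * (r1 + r2)  # gamma is an illustrative default

    def rpgan_g_loss(D, real, fake):
        # Generator side: minimize f(D(real) - D(fake)), pushing D(fake) up
        # relative to paired real samples rather than toward a fixed target.
        return F.softplus(D(real) - D(fake)).mean()

In a training loop, rpgan_d_loss would be backpropagated through the discriminator's parameters and rpgan_g_loss through the generator's, with fake = G(z) left attached to the generator's graph in the latter case.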

