This is by no means a complete list of GAN-related papers. We are specifically looking for generative modeling techniques that bridge the gap between two domains, similar to CycleGAN. This repo also collects historically important GAN papers, their difficulties in training, and various techniques to overcome those difficulties and improve performance.
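Since CycleGAN's cycle-consistency idea is the reference point for this list, here is a minimal PyTorch sketch of the cycle-consistency loss. The generator names `G_ab`/`G_ba` and the loss weight are illustrative assumptions, not taken from any particular paper below.

```python
import torch
import torch.nn as nn

def cycle_consistency_loss(G_ab, G_ba, real_a, real_b, weight=10.0):
    """Cycle-consistency loss in the spirit of CycleGAN: translating an
    image to the other domain and back should reconstruct the input.
    G_ab and G_ba are hypothetical generators (A->B and B->A)."""
    l1 = nn.L1Loss()
    recon_a = G_ba(G_ab(real_a))  # A -> B -> A reconstruction
    recon_b = G_ab(G_ba(real_b))  # B -> A -> B reconstruction
    return weight * (l1(recon_a, real_a) + l1(recon_b, real_b))
```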
- 08/10/21 GANBERT: Generative Adversarial Networks with Bidirectional Encoder Representations from Transformers for MRI to PET synthesis
- 07/15/21 CMT: Convolutional Neural Networks Meet Vision Transformers
- 07/13/21 Visual Parser: Representing Part-whole Hierarchies with Transformers
- 07/12/21 Early Convolutions Help Transformers See Better
- 07/10/21 Local-to-Global Self-Attention in Vision Transformers
- 07/09/21 ViTGAN: Training GANs with Vision Transformers
- 05/28/21 Pre-Trained Image Processing Transformer
In reverse chronological order.
- StarGANv2
- Unpaired Image-to-Image Translation using Adversarial Consistency Loss
- CartoonGAN
- Resolution Dependent GAN Interpolation for Controllable Image Synthesis Between Domains
- CycleGAN
- SWAE
- StyleGAN2 Distillation for Feed-forward Image Manipulation
- Controlling generative models with continuous factors of variations
- Counterfactuals uncover the modular structure of deep generative models
- Swapping Autoencoder for Deep Image Manipulation
In reverse chronological order.
- Pix2Pix: Image-to-Image Translation with Conditional Adversarial Networks
- Prototypical Pseudo Label Denoising and Target Structure Learning for Domain Adaptive Semantic Segmentation
- SESAME: Semantic Editing of Scenes by Adding, Manipulating or Erasing Objects
- You Only Need Adversarial Supervision for Semantic Image Synthesis
- U-GAT-IT: Unsupervised Generative Attentional Networks with Adaptive Layer-Instance Normalization for Image-to-Image Translation
In chronological order.
It is hard to find a paper dedicated to the failures of GAN techniques, but some papers include content describing the problems of earlier GANs.
There are many techniques for improving GAN performance, and some of them are tried and true; a sketch of one of them (generator weight averaging) follows the list below.
- Improved Techniques for Training GANs
- Which Training Methods for GANs do actually Converge?
- Arbitrary Style Transfer in Real-time with Adaptive Instance Normalization
- The Unusual Effectiveness of Averaging in GAN Training
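As a concrete example of the last item, a common reading of "The Unusual Effectiveness of Averaging in GAN Training" is to keep an exponential moving average (EMA) of the generator's weights and to sample from the averaged copy rather than the latest iterate. A minimal PyTorch sketch follows; the decay value is an illustrative assumption, not a prescription from the paper.

```python
import copy
import torch

@torch.no_grad()
def update_ema(ema_model, model, decay=0.999):
    """Exponential moving average of generator weights. The EMA copy,
    not the raw generator, is used for evaluation and sampling."""
    for ema_p, p in zip(ema_model.parameters(), model.parameters()):
        ema_p.mul_(decay).add_(p, alpha=1.0 - decay)

# Usage inside a training loop (a generator `G` is assumed to exist):
#   ema_G = copy.deepcopy(G)
#   for each step:
#       ... optimize G ...
#       update_ema(ema_G, G)
```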
- Graph Generative Adversarial Networks for Sparse Data Generation in High Energy Physics: uses a GAN as a "fast simulation" to replace Geant4. While this is not a goal of LS4GAN, the use of sparse data on graphs may be a useful technique for handling LS4GAN's anticipated large but sparse events.
- Here are some related papers on graph neural networks (GNNs) that might be helpful; a minimal graph-convolution sketch follows this list.
- Semi-Supervised Classification with Graph Convolutional Networks: the paper by T. Kipf and M. Welling that introduced graph convolutional networks.
- Neural Message Passing for Quantum Chemistry: the best-known graph convolutional network in physics and chemistry.
- Benchmarking Graph Neural Networks: a benchmark paper comparing different GNNs on different tasks.
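To make the GNN connection concrete, here is a minimal sketch of one Kipf-Welling graph-convolution layer applied to a tiny sparse graph, written in plain PyTorch rather than a dedicated GNN library; the node count, feature sizes, and edges are made up for illustration.

```python
import torch

def gcn_layer(adj, x, weight):
    """One Kipf-Welling GCN layer: H' = relu(D^-1/2 (A+I) D^-1/2 H W).
    adj: dense {0,1} adjacency (N, N); x: node features (N, F_in);
    weight: (F_in, F_out)."""
    n = adj.size(0)
    a_hat = adj + torch.eye(n)            # add self-loops
    deg = a_hat.sum(dim=1)                # node degrees (>= 1 after self-loops)
    d_inv_sqrt = deg.pow(-0.5)
    # Symmetric normalization: entry (i, j) scaled by d_i^-1/2 * d_j^-1/2
    norm_adj = a_hat * d_inv_sqrt.unsqueeze(1) * d_inv_sqrt.unsqueeze(0)
    return torch.relu(norm_adj @ x @ weight)

# Toy sparse event: 4 nodes, 3-dim features, two undirected edges.
x = torch.randn(4, 3)
adj = torch.zeros(4, 4)
adj[0, 1] = adj[1, 0] = 1.0
adj[2, 3] = adj[3, 2] = 1.0
w = torch.randn(3, 8)
h = gcn_layer(adj, x, w)  # (4, 8) updated node features
```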
- CaloGAN: Simulating 3D High Energy Particle Showers in Multi-Layer Electromagnetic Calorimeters with Generative Adversarial Networks
- AI-based Monte Carlo event generator for electron-proton scattering