- What is Minibatch discrimination?
- How do you prevent mode collapse in GANs?
- How can I improve my GAN training?
- Why is GAN unstable?
What is Minibatch discrimination?
Minibatch discrimination is a technique for generative adversarial networks in which the discriminator looks at whole minibatches of samples rather than at individual samples, so it can detect a generator that produces many near-identical outputs. This is intended to prevent collapse of the generator.
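As a minimal NumPy sketch of the idea: each sample's features are projected through a (hypothetical, normally learned) tensor `T`, pairwise L1 similarities between samples are computed, and the summed similarities are appended as extra features the discriminator can use to spot low-diversity batches.

```python
import numpy as np

def minibatch_discrimination(features, T):
    """Append cross-sample similarity features to each sample's features.

    features : (N, A) per-sample feature vectors from the discriminator
    T        : (A, B, C) tensor mapping each feature vector to B rows of dim C
    """
    # M[i] is a (B, C) matrix derived from sample i's features
    M = np.einsum('na,abc->nbc', features, T)            # (N, B, C)
    # L1 distance between corresponding rows of M[i] and M[j]
    diffs = np.abs(M[:, None] - M[None, :]).sum(axis=3)  # (N, N, B)
    c = np.exp(-diffs)                                   # similarity in (0, 1]
    # o[i, b] = total similarity of sample i to the other samples in the batch
    o = c.sum(axis=1) - 1.0  # subtract self-similarity, exp(0) = 1
    return np.concatenate([features, o], axis=1)         # (N, A + B)

rng = np.random.default_rng(0)
f = rng.normal(size=(4, 8))      # 4 samples, 8 features each
T = rng.normal(size=(8, 5, 3))   # stand-in for the learned tensor
out = minibatch_discrimination(f, T)
print(out.shape)  # (4, 13)
```

If the generator collapses, the appended similarity features become large for every sample, which gives the discriminator an easy signal to reject the batch.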
How do you prevent mode collapse in GANs?
A carefully tuned learning rate can mitigate serious GAN problems such as mode collapse. Specifically, lower the learning rate and restart training when mode collapse happens. We can also experiment with different learning rates for the generator and the discriminator.
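A toy sketch of the two-learning-rate idea, with simple quadratic stand-ins for the real losses (the rate values here are illustrative, not recommendations): the discriminator parameter is updated with a larger step size than the generator's, so it adapts faster.

```python
# Separate learning rates for discriminator (D) and generator (G).
d_param, g_param = 1.0, 1.0
lr_d, lr_g = 4e-4, 1e-4   # D often gets the larger rate

for _ in range(100):
    # Stand-in gradients; in a real GAN these come from backprop
    # through each network's own loss.
    d_grad = 2.0 * d_param
    d_param -= lr_d * d_grad
    g_grad = 2.0 * g_param
    g_param -= lr_g * g_grad

print(d_param, g_param)  # D has moved further toward its optimum than G
```

In a real framework this just means constructing two optimizers, one per network, each with its own learning rate.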
How can I improve my GAN training?
Additional Tips and Tricks
- Feature matching. Train the generator to match statistics of the discriminator's intermediate features rather than to fool its output directly.
- Minibatch discrimination. Develop features across multiple samples in a minibatch.
- Historical averaging. Update the loss function to incorporate history.
- One-sided label smoothing. Replace the discriminator's real-sample targets of 1.0 with a softened value such as 0.9, leaving fake targets at 0.0.
- Virtual batch normalization. Normalize each sample using statistics from a fixed reference batch chosen at the start of training.
Why is GAN unstable?
Because a GAN is composed of two networks, each with its own loss function, GANs are inherently unstable. Diving a bit deeper into the problem, the generator (G) loss can drive this instability: it can produce vanishing gradients when the discriminator becomes too good at rejecting generated samples, leaving the generator with almost no signal to learn from.
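The vanishing-gradient effect can be shown with a one-line derivative. Writing the discriminator's output as D = sigmoid(s) for logit s, the original minimax generator loss log(1 - D) has gradient -D with respect to s, while the commonly used non-saturating alternative -log(D) has gradient -(1 - D). When D confidently rejects a fake (s very negative, D near 0), the first gradient collapses to zero while the second stays near -1:

```python
import numpy as np

def sigmoid(s):
    return 1.0 / (1.0 + np.exp(-s))

# Logit of D on a generated sample; very negative = D is sure it is fake
s = -8.0
d = sigmoid(s)

# Minimax generator loss log(1 - D): gradient w.r.t. the logit is -D
grad_minimax = -d
# Non-saturating loss -log(D): gradient w.r.t. the logit is -(1 - D)
grad_nonsat = -(1.0 - d)

print(grad_minimax)  # ~ -0.0003: almost no signal reaches G
print(grad_nonsat)   # ~ -0.9997: strong gradient even when D is winning
```

This is why the non-saturating generator loss is the usual practical choice: it keeps gradients flowing precisely in the regime where the discriminator dominates.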