Class Balancing GAN with a Classifier in the Loop
H. Rangwani, K.R. Mopuri, R.V. Babu
Published in Proceedings of Machine Learning Research
2021
Volume: 161
Pages: 1618 - 1627
Abstract
Generative Adversarial Networks (GANs) have swiftly evolved to imitate increasingly complex image distributions. However, the majority of these developments focus on the performance of GANs on balanced datasets. We find that existing GANs and their training regimes, which work well on balanced datasets, fail to be effective in the case of imbalanced (i.e., long-tailed) datasets. In this work, we introduce a novel, theoretically motivated Class Balancing regularizer for training GANs. Our regularizer makes use of the knowledge from a pretrained classifier to ensure balanced learning of all the classes in the dataset. This is achieved by modelling the effective class frequency based on the exponential forgetting observed in neural networks and encouraging the GAN to focus on underrepresented classes. We demonstrate the utility of our regularizer in learning representations for long-tailed distributions by achieving better performance than existing approaches on multiple datasets. Specifically, when applied to an unconditional GAN, it improves the FID from 13.03 to 9.01 on the long-tailed iNaturalist-2019 dataset.
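To illustrate the idea described in the abstract, below is a minimal sketch of a class-balancing regularizer of this general shape, not the authors' exact formulation: a pretrained classifier labels generated samples, an exponentially decayed estimate of the "effective" class frequency is maintained, and a penalty term steers the generator toward under-represented classes. The class name ClassBalanceRegularizer, the decay rate, the inverse-frequency weighting, and the loss form are all assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

class ClassBalanceRegularizer:
    """Hypothetical sketch of a class-balancing GAN regularizer.

    Tracks an exponentially decayed effective class frequency, estimated from a
    pretrained classifier's predictions on generated samples, and penalizes the
    generator when under-represented classes are rarely produced.
    """

    def __init__(self, num_classes: int, decay: float = 0.99, device: str = "cpu"):
        self.decay = decay
        # Start from a uniform effective-frequency estimate over all classes.
        self.effective_freq = torch.full((num_classes,), 1.0 / num_classes, device=device)

    @torch.no_grad()
    def update(self, classifier_logits: torch.Tensor) -> None:
        """Update effective frequencies from one batch of generated samples."""
        batch_dist = F.softmax(classifier_logits, dim=1).mean(dim=0)
        # Exponential forgetting: older batches decay, recent batches dominate.
        self.effective_freq = self.decay * self.effective_freq + (1.0 - self.decay) * batch_dist

    def penalty(self, classifier_logits: torch.Tensor) -> torch.Tensor:
        """Regularization term to be added to the generator loss.

        Classes with low effective frequency receive larger weights, which
        encourages the generator to produce them more often.
        """
        probs = F.softmax(classifier_logits, dim=1)        # (batch, num_classes)
        weights = 1.0 / (self.effective_freq + 1e-8)       # favour rare classes
        weights = weights / weights.sum()
        # Weighted negative log-likelihood over the generated batch.
        return -(weights * torch.log(probs + 1e-8)).sum(dim=1).mean()
```

In a training loop one might compute logits = classifier(fake_images), add lambda_reg * reg.penalty(logits) to the generator loss, and call reg.update(logits.detach()) after each step; lambda_reg here is an assumed hyperparameter, not a value from the paper.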
About the journal
Journal: Proceedings of Machine Learning Research
Publisher: ML Research Press
ISSN: 2640-3498