Scalable balanced training of conditional generative adversarial neural networks on image data

Keywords:
Advanced Numerical Methods for Scientific Computing
Code:
33/2021
Title:
Scalable balanced training of conditional generative adversarial neural networks on image data
Date:
Wednesday 2nd June 2021
Author(s):
Lupo Pasini, M.; Gabbi, V.; Yin, J.; Perotto, S.; Laanait, N.
Abstract:
We propose a distributed approach to training deep convolutional conditional generative adversarial network (DC-CGAN) models. Our method reduces the imbalance between generator and discriminator by partitioning the training data according to data labels, and it enhances scalability through parallel training in which multiple generators are trained concurrently, each focusing on a single data label. Performance is assessed in terms of inception score and image quality on the MNIST, CIFAR10, CIFAR100, and ImageNet1k datasets, showing a significant improvement over state-of-the-art techniques for training DC-CGANs. Weak scaling is attained on all four datasets using up to 1,000 processes and 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
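
The core idea described in the abstract, splitting the training set by class label and training one generator per label in parallel, can be illustrated with a minimal single-node sketch. The PyTorch code below is not the authors' distributed implementation (which runs the per-label jobs across MPI processes and GPUs on Summit); the network sizes, optimizer settings, and the toy stand-in dataset are illustrative assumptions, and the per-label trainings simply run in a loop rather than concurrently.

# Minimal sketch of label-partitioned GAN training (assumptions throughout):
# the dataset is split by class label, and each label gets its own
# generator/discriminator pair trained only on that homogeneous data slice.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, Subset, TensorDataset

latent_dim, img_dim, n_labels = 64, 28 * 28, 10

def make_generator():
    return nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(),
                         nn.Linear(256, img_dim), nn.Tanh())

def make_discriminator():
    return nn.Sequential(nn.Linear(img_dim, 256), nn.LeakyReLU(0.2),
                         nn.Linear(256, 1))

def train_on_partition(loader, epochs=1):
    """Standard non-saturating GAN updates on one label's data slice."""
    G, D = make_generator(), make_discriminator()
    opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        for real, _ in loader:
            real = real.view(real.size(0), -1)
            z = torch.randn(real.size(0), latent_dim)
            fake = G(z)
            # Discriminator step: push real toward 1, fake toward 0.
            loss_d = (bce(D(real), torch.ones(real.size(0), 1)) +
                      bce(D(fake.detach()), torch.zeros(real.size(0), 1)))
            opt_d.zero_grad()
            loss_d.backward()
            opt_d.step()
            # Generator step: try to make the discriminator output 1 on fakes.
            loss_g = bce(D(fake), torch.ones(real.size(0), 1))
            opt_g.zero_grad()
            loss_g.backward()
            opt_g.step()
    return G

# Toy stand-in dataset in [-1, 1]; replace with MNIST/CIFAR in practice.
images = torch.rand(1000, img_dim) * 2 - 1
labels = torch.randint(0, n_labels, (1000,))
dataset = TensorDataset(images, labels)

# Partition the data by label and train one generator per partition; in the
# distributed setting each iteration of this loop would run on its own rank.
generators = {}
for lbl in range(n_labels):
    idx = (labels == lbl).nonzero(as_tuple=True)[0].tolist()
    loader = DataLoader(Subset(dataset, idx), batch_size=64, shuffle=True)
    generators[lbl] = train_on_partition(loader)

Because each discriminator only ever sees one class, the generator it trains against faces an easier, more uniform task, which is one way to read the abstract's claim of a reduced generator/discriminator imbalance; the per-label independence is also what makes the weak scaling across processes possible.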