Aug 31, 2018
Great work! It was easier to understand than the paper. I just have one doubt: in the implementation, is there just one attention module?
This implementation uses just one attention module in the middle of the generator and one in the middle of the discriminator: https://github.com/taki0112/Self-Attention-GAN-Tensorflow/blob/master/SAGAN.py
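Since the question is about the attention module itself, here is a minimal sketch of a SAGAN-style self-attention block in TensorFlow 2 / Keras. This is not the linked repo's code (which is TF1 and adds spectral normalization to the 1x1 convolutions); the names `SelfAttention`, `f`, `g`, `h`, and `gamma` follow the paper's notation and are my own choices. A single instance of this layer would be inserted between the middle convolutional blocks of the generator, and likewise of the discriminator.

```python
# A minimal sketch of a SAGAN-style self-attention block (assumptions noted above).
import tensorflow as tf

class SelfAttention(tf.keras.layers.Layer):
    def __init__(self, channels):
        super().__init__()
        # 1x1 convolutions produce query/key/value feature maps;
        # the paper reduces query/key channels by a factor of 8.
        self.f = tf.keras.layers.Conv2D(channels // 8, 1)  # query
        self.g = tf.keras.layers.Conv2D(channels // 8, 1)  # key
        self.h = tf.keras.layers.Conv2D(channels, 1)       # value
        # gamma starts at 0, so the block is initially an identity mapping
        # and the network learns how much attention to mix in.
        self.gamma = self.add_weight(name="gamma", shape=(), initializer="zeros")

    def call(self, x):
        b = tf.shape(x)[0]
        n = tf.shape(x)[1] * tf.shape(x)[2]  # number of spatial positions
        q = tf.reshape(self.f(x), (b, n, -1))  # (b, N, c/8)
        k = tf.reshape(self.g(x), (b, n, -1))  # (b, N, c/8)
        v = tf.reshape(self.h(x), (b, n, -1))  # (b, N, c)
        # Attention map over all pairs of spatial positions.
        attn = tf.nn.softmax(tf.matmul(q, k, transpose_b=True))  # (b, N, N)
        o = tf.reshape(tf.matmul(attn, v), tf.shape(x))
        return self.gamma * o + x  # scaled residual connection
```

Because the attention map is N x N over spatial positions, the block is usually placed only once, at a mid-resolution feature map, to keep memory manageable.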
