#009 Developing a DCGAN for MNIST Dataset
Highlights: In the previous posts we have already explored the basic GAN idea and studied the guidelines for more stable training. In particular, in post 007 we analyzed the "GAN Hacks" that were proposed in the DCGAN paper. In addition, we have implemented a simple GAN network to learn the mapping of a 1D function. Hence, to show that GANs are a promising family of generative architectures, we need to move beyond toy examples and develop a DCGAN for the MNIST dataset.
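To make the goal of this post concrete, here is a minimal sketch of what a DCGAN generator and discriminator for 28x28 grayscale MNIST images can look like. This is only an illustrative outline following the DCGAN guidelines recapped above (strided convolutions instead of pooling, batch normalization, LeakyReLU in the discriminator, a tanh output on the generator); the framework choice (TensorFlow/Keras), layer sizes, and function names such as `build_generator` and `build_discriminator` are assumptions, not the exact code from the post.

```python
# Hypothetical DCGAN sketch for 28x28x1 MNIST images (assumed TensorFlow/Keras setup).
import tensorflow as tf
from tensorflow.keras import layers

LATENT_DIM = 100  # size of the random noise vector fed to the generator


def build_generator(latent_dim: int = LATENT_DIM) -> tf.keras.Model:
    """Maps a latent vector to a 28x28x1 image with pixel values in [-1, 1]."""
    return tf.keras.Sequential([
        layers.Dense(7 * 7 * 128, input_shape=(latent_dim,)),
        layers.BatchNormalization(),
        layers.ReLU(),
        layers.Reshape((7, 7, 128)),
        # Two fractionally-strided (transposed) convolutions upsample 7x7 -> 14x14 -> 28x28.
        layers.Conv2DTranspose(64, kernel_size=4, strides=2, padding="same"),
        layers.BatchNormalization(),
        layers.ReLU(),
        layers.Conv2DTranspose(1, kernel_size=4, strides=2, padding="same",
                               activation="tanh"),
    ], name="generator")


def build_discriminator() -> tf.keras.Model:
    """Classifies 28x28x1 images as real (close to 1) or fake (close to 0)."""
    return tf.keras.Sequential([
        # Strided convolutions downsample 28x28 -> 14x14 -> 7x7 (no pooling layers).
        layers.Conv2D(64, kernel_size=4, strides=2, padding="same",
                      input_shape=(28, 28, 1)),
        layers.LeakyReLU(0.2),
        layers.Conv2D(128, kernel_size=4, strides=2, padding="same"),
        layers.LeakyReLU(0.2),
        layers.Flatten(),
        layers.Dropout(0.3),
        layers.Dense(1, activation="sigmoid"),
    ], name="discriminator")


if __name__ == "__main__":
    # Quick shape check: generate a small batch of fake images and score them.
    noise = tf.random.normal((16, LATENT_DIM))
    fake_images = build_generator()(noise)       # shape: (16, 28, 28, 1)
    scores = build_discriminator()(fake_images)  # shape: (16, 1)
    print(fake_images.shape, scores.shape)
```

In a full training loop these two models would be combined as usual: the discriminator is trained on batches of real MNIST digits (rescaled to [-1, 1]) and generated images, while the generator is updated through the combined model to fool the discriminator.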