==== Final Words ====
The Python GAN code has been tested with several changes. First, the generator and discriminator networks were tested with different activation functions. Second, the input in the train-data Python code was modified to supply system inputs with variables of different order. All tests are listed below, and a minimal code sketch of the configurable networks follows the list.
\\
• (1) Activation function: Leaky Rectified Linear Unit (Leaky-ReLU), different inputs
\\
• (2) Activation function: Rectified Linear Unit (ReLU), different inputs
\\
• (3) Activation function: Sigmoid, different inputs
\\
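The networks used for these tests are not reproduced in this section, so the following is only a minimal sketch, assuming a PyTorch-style setup; the framework, layer widths, and names here are illustrative assumptions, not the page's original code. It shows how the generator and discriminator can take a swappable hidden activation so that only the activation differs between tests (1)-(3).

<code python>
# Minimal sketch (not this page's original code): a generator/discriminator
# pair with a swappable hidden activation, as in tests (1)-(3).
# Assumes PyTorch; all dimensions and widths are illustrative.
import torch.nn as nn

def make_net(in_dim, hidden_dim, out_dim, act_cls, final_sigmoid=False):
    """Two-hidden-layer MLP; act_cls builds the hidden activation under test."""
    layers = [nn.Linear(in_dim, hidden_dim), act_cls(),
              nn.Linear(hidden_dim, hidden_dim), act_cls(),
              nn.Linear(hidden_dim, out_dim)]
    if final_sigmoid:                      # discriminator outputs a probability
        layers.append(nn.Sigmoid())
    return nn.Sequential(*layers)

# One generator/discriminator pair per activation under test.
activations = [("Leaky-ReLU", lambda: nn.LeakyReLU(0.2)),  # test (1)
               ("ReLU",       nn.ReLU),                    # test (2)
               ("Sigmoid",    nn.Sigmoid)]                 # test (3)
for name, act_cls in activations:
    G = make_net(in_dim=1, hidden_dim=16, out_dim=1, act_cls=act_cls)
    D = make_net(in_dim=1, hidden_dim=16, out_dim=1, act_cls=act_cls,
                 final_sigmoid=True)
</code>

Building the activation from a constructor keeps the rest of the training loop identical between tests, so any stability difference comes from the activation alone.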
Test (1) is shown in Figs. 8-11: lower-order variable inputs make GAN training more stable, while higher-order inputs lead to unstable training results. Test (2) is shown in Figs. 12-14: the ReLU-based generator and discriminator are less stable than the Leaky-ReLU versions. In the actual run with input x3 the training stopped, so only three figures could be obtained. Lastly, test (3) shows that the sigmoid activation is not suitable for either the generator or the discriminator.
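The train-data code itself is not shown here either; the sketch below gives one plausible reading of the "different order variable system input", in which the real samples follow y = x^n for orders n = 1, 2, 3 (so the unstable "input x3" case would be n = 3). The function name, argument names, and sampling range are hypothetical.

<code python>
# Hedged sketch of the train-data change: "order" is read here as the
# polynomial order of the target data. Names and ranges are hypothetical,
# not taken from the original train-data code.
import numpy as np

def sample_real_data(batch_size, order, x_range=(-1.0, 1.0)):
    """Return (x, x**order) pairs as the "real" samples the GAN must fit."""
    x = np.random.uniform(x_range[0], x_range[1], size=(batch_size, 1))
    return np.concatenate([x, x ** order], axis=1)

batch = sample_real_data(batch_size=8, order=3)  # the unstable x^3 case
print(batch.shape)  # (8, 2)
</code>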
  