WGANs: A stable alternative to traditional GANs || Wasserstein GAN
Author: Developers Hutt
Uploaded: 2023-05-01
Views: 13,673
In this video, we'll explore the Wasserstein GAN with Gradient Penalty (WGAN-GP), which addresses the training instability of traditional GANs. Unlike traditional GANs, WGANs use the Wasserstein distance as the loss function to measure the difference between the real and generated data distributions. The gradient penalty enforces the 1-Lipschitz constraint on the critic (the WGAN counterpart of the discriminator) by keeping the norm of its gradients close to 1, so they neither explode nor vanish. We'll implement the WGAN with Gradient Penalty from scratch and train it on the anime faces dataset. Watch the video to learn how to build this type of GAN and improve its performance.
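To make the idea concrete, here is a minimal sketch (not the video's exact code) of how the gradient penalty is typically computed in PyTorch; the names critic, real, fake, and lambda_gp are illustrative assumptions:

import torch

def gradient_penalty(critic, real, fake, device="cpu"):
    # Interpolate randomly between real and generated samples
    batch_size = real.size(0)
    eps = torch.rand(batch_size, 1, 1, 1, device=device)
    interpolated = (eps * real + (1 - eps) * fake).requires_grad_(True)

    # Critic scores for the interpolated images
    scores = critic(interpolated)

    # Gradient of the scores w.r.t. the interpolated inputs
    grads = torch.autograd.grad(
        outputs=scores,
        inputs=interpolated,
        grad_outputs=torch.ones_like(scores),
        create_graph=True,
        retain_graph=True,
    )[0]

    # Penalize deviation of the gradient norm from 1 (1-Lipschitz constraint)
    grads = grads.view(batch_size, -1)
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()

The critic then maximizes E[critic(real)] - E[critic(fake)], i.e. it minimizes the negation plus the weighted penalty:

# loss_critic = -(critic(real).mean() - critic(fake).mean()) \
#               + lambda_gp * gradient_penalty(critic, real, fake)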
Link to dataset: https://rb.gy/iyolm
Link to code: https://github.com/henry32144/wgan-gp...
Instagram: /developershutt
And as always,
Thanks for watching ❤️
Chapters:
0:00 Intro
0:34 Wasserstein distance
1:15 Wasserstein as loss function
2:43 Gradient Penalty (Lipschitz continuity)
4:38 Code from scratch
11:45 Things to remember