Neural Networks: Stochastic, mini-batch and batch gradient descent
Author: Bevan Smith 2
Uploaded: Jun 25, 2021
Views: 24,918
What is the difference between stochastic, mini-batch and batch gradient descent?
Which is the best? Which one is recommended?
0:00 Introduction
0:20 How do we train a neural network?
1:25 3 types of gradient descent
1:55 My silly training dataset
2:55 Stochastic gradient descent
4:05 Mini-batch gradient descent
5:20 Batch gradient descent
5:45 What is an epoch?
7:10 So why do we not use batch gradient descent?
8:10 What does the literature say on gradient descent in neural networks?
8:12 Goodfellow - stochastic gradient descent
9:10 Wikipedia - stochastic gradient descent
9:25 LeCun - Backprop, stochastic gradient descent
9:58 Andrew Ng - mini-batch and stochastic
10:43 Conclusion
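The three variants covered in the chapters above differ only in how many training examples are used per weight update: one (stochastic), a small subset (mini-batch), or the whole dataset (batch), with an epoch being one full pass over the data. A minimal sketch, not taken from the video, using a toy linear-regression objective (the dataset, learning rate, and epoch count are illustrative assumptions):

```python
import numpy as np

def gradient_descent(X, y, batch_size, lr=0.05, epochs=200, seed=0):
    """Fit y ~ X @ w + b by minimizing MSE with a given batch size.

    batch_size = 1          -> stochastic gradient descent
    1 < batch_size < n      -> mini-batch gradient descent
    batch_size = n          -> (full) batch gradient descent
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):            # one epoch = one full pass over all n examples
        idx = rng.permutation(n)       # shuffle the data each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            err = Xb @ w + b - yb                    # prediction error on this batch
            w -= lr * 2 * Xb.T @ err / len(batch)    # MSE gradient w.r.t. w
            b -= lr * 2 * err.mean()                 # MSE gradient w.r.t. b
    return w, b

# Hypothetical toy dataset: y = 3x + 1 plus a little noise
rng = np.random.default_rng(42)
X = rng.uniform(-1, 1, size=(64, 1))
y = 3 * X[:, 0] + 1 + 0.01 * rng.normal(size=64)

for bs, name in [(1, "stochastic"), (8, "mini-batch"), (64, "batch")]:
    w, b = gradient_descent(X, y, batch_size=bs)
    print(f"{name:10s} w ~ {w[0]:.2f}  b ~ {b:.2f}")
```

All three reach roughly the same solution here; the practical trade-off the video discusses is that smaller batches give more (noisier) updates per epoch, while full-batch computes one exact gradient per epoch at a higher per-update cost.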
