Stochastic Gradient Descent vs Batch Gradient Descent vs Mini Batch Gradient Descent |DL Tutorial 14
Author: codebasics
Uploaded: 2020-08-18
Views: 215119
Stochastic gradient descent, batch gradient descent, and mini-batch gradient descent are three flavors of the gradient descent algorithm. In this video I will go over the differences among these three and then implement them in Python from scratch using a housing price dataset. At the end of the video there is an exercise for you to solve.
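For quick reference, below is a minimal sketch (not the notebook from the video, which is linked further down) of the three variants applied to simple linear regression. It uses NumPy and a tiny synthetic stand-in for the scaled housing features; the learning rates, epoch counts, and batch size are illustrative assumptions.

```python
import numpy as np

def batch_gd(X, y, epochs=1000, lr=0.01):
    # Batch GD: every update uses ALL samples to compute the gradient.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        err = X @ w + b - y
        w -= lr * (2 / n) * (X.T @ err)   # gradient of MSE w.r.t. weights
        b -= lr * (2 / n) * err.sum()     # gradient of MSE w.r.t. bias
    return w, b

def stochastic_gd(X, y, epochs=100, lr=0.01):
    # Stochastic GD: each update uses ONE randomly chosen sample.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for _ in range(n):
            i = np.random.randint(n)
            err = X[i] @ w + b - y[i]
            w -= lr * 2 * err * X[i]
            b -= lr * 2 * err
    return w, b

def mini_batch_gd(X, y, epochs=200, lr=0.01, batch_size=5):
    # Mini-batch GD: each update uses a small random subset of samples.
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        idx = np.random.permutation(n)
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            err = X[batch] @ w + b - y[batch]
            m = len(batch)
            w -= lr * (2 / m) * (X[batch].T @ err)
            b -= lr * (2 / m) * err.sum()
    return w, b

# Tiny synthetic stand-in for scaled housing features (area, bedrooms) -> price.
X = np.array([[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.7, 0.8], [0.9, 1.0]])
y = np.array([0.15, 0.35, 0.55, 0.75, 0.95])
print("batch     :", batch_gd(X, y))
print("stochastic:", stochastic_gd(X, y))
print("mini-batch:", mini_batch_gd(X, y))
```

All three converge to similar weights on this toy data; the difference is how many samples each update sees, which trades off per-step cost against the noisiness of the gradient estimate.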
🔖 Hashtags 🔖
#stochasticgradientdescentpython #stochasticgradientdescent #batchgradientdescent #minibatchgradientdescent #gradientdescent
Do you want to learn technology from me? Check https://codebasics.io/?utm_source=des... for my affordable video courses.
Next Video: • Chain Rule | Deep Learning Tutorial 15 (Te...
Previous video: • Implement Neural Network In Python | Deep ...
Code of this tutorial: https://github.com/codebasics/deep-le...
Exercise: Scroll to the end of the link above to find the exercise description
Deep learning playlist: • Deep Learning With Tensorflow 2.0, Keras a...
Machine learning playlist: https://www.youtube.com/playlist?list...
Prerequisites for this series:
1: Python tutorials (first 16 videos): https://www.youtube.com/playlist?list...
2: Pandas tutorials (first 8 videos): • Pandas Tutorial (Data Analysis In Python)
3: Machine learning playlist (first 16 videos): https://www.youtube.com/playlist?list...
#️⃣ Social Media #️⃣
🔗 Discord: / discord
📸 Dhaval's Personal Instagram: / dhavalsays
📸 Instagram: / codebasicshub
🔊 Facebook: / codebasicshub
📝 Linkedin (Personal): / dhavalsays
📝 Linkedin (Codebasics): / codebasics
📱 Twitter: / codebasicshub
🔗 Patreon: https://www.patreon.com/codebasics?fa...