How to Calculate Loss, Backpropagation, Gradient Descent on the Simplest Neural Network
Author: Johnny Code
Uploaded: 2023-11-26
Views: 6334
End-to-end example of calculating loss, backpropagation, and gradient descent on the most basic neural network. Demonstrated on a whiteboard, with Python code, and simplified math. After this tutorial, you will have a good idea of how neural networks are trained.
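Below is a minimal sketch of the kind of training loop the chapters describe: a single weight plus a bias, mean squared error loss, and plain gradient descent, shown here on the final y = -2x + 2 example. The starting values and learning rate are illustrative assumptions; the demo repository's actual code may differ.

import numpy as np

# Toy dataset for the final target function in the video, y = -2x + 2.
# Zero is dropped from the inputs (see the 04:40 chapter): with x = 0,
# the weight has no effect on the prediction, so it teaches the network nothing.
X = np.array([1.0, 2.0, 3.0, 4.0])
Y = -2.0 * X + 2.0

w = 0.5    # initial weight guess (hypothetical starting value)
b = 0.0    # initial bias guess (hypothetical starting value)
lr = 0.01  # learning rate; too large and the loss diverges (see 13:12)

for epoch in range(5000):
    pred = w * X + b                     # forward pass through the 2-node network
    loss = np.mean((pred - Y) ** 2)      # mean squared error loss
    dw = np.mean(2.0 * (pred - Y) * X)   # derivative of loss wrt weight (backprop)
    db = np.mean(2.0 * (pred - Y))       # derivative of loss wrt bias
    w -= lr * dw                         # gradient descent step on the weight
    b -= lr * db                         # gradient descent step on the bias

print(f"w = {w:.3f}, b = {b:.3f}, loss = {loss:.6f}")  # expect w ≈ -2, b ≈ 2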
Demo code: https://github.com/johnnycode8/basic_...
Buy Me a Coffee: https://www.buymeacoffee.com/johnnycode
00:00 Intro
00:18 Structure of the 2-Node Neural Network
00:40 Define the Problem (Linear Regression)
02:25 Setup Python to Train the Network
03:02 Start by Guessing the Weight
03:40 Loss (How Bad was the Guess?)
04:26 Loss Function (Mean Squared Error)
04:40 Why Drop Zero from Inputs
05:34 How Weight Affects Loss
06:47 Backpropagation (Derivative of Loss wrt Weight)
11:20 Adjust Weight (Gradient Descent)
11:52 Train Network for y=x
13:12 Purpose of Learning Rate
14:36 Train Network Again for y=-2x
15:32 Add Bias (Intercept)
16:16 Find the Derivative of Loss wrt Bias
17:21 Train Network Again for y=-2x+2
19:01 Burnt-out Cat
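For reference, a worked sketch of the calculus behind the 06:47 and 16:16 chapters, assuming the averaged form of mean squared error (a summed form only changes the constant factor): differentiating the loss of the linear model \(\hat{y} = wx + b\) gives the two gradients, and gradient descent (11:20) subtracts each one scaled by the learning rate \(\eta\).

L = \frac{1}{n} \sum_{i=1}^{n} \left( w x_i + b - y_i \right)^2

\frac{\partial L}{\partial w} = \frac{2}{n} \sum_{i=1}^{n} \left( w x_i + b - y_i \right) x_i
\qquad
\frac{\partial L}{\partial b} = \frac{2}{n} \sum_{i=1}^{n} \left( w x_i + b - y_i \right)

w \leftarrow w - \eta \, \frac{\partial L}{\partial w}
\qquad
b \leftarrow b - \eta \, \frac{\partial L}{\partial b}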