Implementing GELU and Its Derivative from Scratch
Author: David Oniani
Uploaded: Sep 29, 2022
Views: 1,889
In this video, we discuss and implement the GELU activation function and its derivative using PyTorch.
Codebase: https://github.com/oniani/ai
GitHub: https://github.com/oniani
Web: https://oniani.org
#ai #softwareengineering #programming #stylepoint #gelu
Chapters
0:00 - Intro
0:39 - Discussing GELU
9:24 - Computing the derivative of GELU
11:19 - Implementing `forward` method
12:33 - Implementing `backward` method
13:42 - Using `gradcheck` for testing
14:12 - The alternative implementation
15:20 - Outro
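The full implementation lives in the codebase linked above. As a minimal sketch of what the `forward`, `backward`, and `gradcheck` chapters cover, here is an exact erf-based GELU written as a custom `torch.autograd.Function`; the class name `GELU` and the tensor shapes are illustrative, not taken from the video:

```python
import math

import torch


class GELU(torch.autograd.Function):
    """Exact GELU: GELU(x) = x * Phi(x), where Phi is the standard normal CDF.

    Derivative: GELU'(x) = Phi(x) + x * phi(x), where phi is the standard normal PDF.
    """

    @staticmethod
    def forward(ctx, x: torch.Tensor) -> torch.Tensor:
        # Save the input tensor for use in the backward pass
        ctx.save_for_backward(x)
        # Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
        return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))

    @staticmethod
    def backward(ctx, grad_output: torch.Tensor) -> torch.Tensor:
        (x,) = ctx.saved_tensors
        cdf = 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))
        pdf = torch.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
        # Chain rule: dL/dx = dL/dy * (Phi(x) + x * phi(x))
        return grad_output * (cdf + x * pdf)


# gradcheck compares the analytical backward pass against finite differences;
# it expects double-precision inputs with requires_grad=True
x = torch.randn(16, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradcheck(GELU.apply, (x,))
```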

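The "alternative implementation" chapter may refer to the tanh approximation of GELU from the original paper; assuming so, a sketch of that variant, which needs no hand-written backward pass since autograd differentiates the expression directly:

```python
import math

import torch


def gelu_tanh(x: torch.Tensor) -> torch.Tensor:
    # Tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + torch.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x**3)))


# The approximation is smooth, so gradcheck works here as well
x = torch.randn(16, dtype=torch.double, requires_grad=True)
assert torch.autograd.gradcheck(gelu_tanh, (x,))
```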