Lecture 8 - Generating Language with Attention [Chris Dyer]
Author: Zafar Mahmood
Uploaded: 2017-03-15
Views: 12068
This lecture introduces one of the most important and influential mechanisms employed in deep neural networks: attention. Attention augments recurrent networks with the ability to condition on specific parts of the input, and it is key to achieving high performance in tasks such as machine translation and image captioning.
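As a rough illustration of the idea described above, the following is a minimal sketch (not the lecture's exact formulation) of dot-product attention: a decoder query is compared against every encoder position, and the resulting softmax weights select which parts of the input the context vector conditions on. All names and the toy data here are illustrative assumptions.

```python
import numpy as np

def dot_product_attention(query, keys, values):
    """Return a context vector: an attention-weighted sum of values.

    query:  (d,)   decoder state at the current step
    keys:   (T, d) one encoder state per input position
    values: (T, d) vectors to aggregate (often identical to keys)
    """
    scores = keys @ query                    # (T,) similarity per position
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    context = weights @ values               # (d,) weighted sum of values
    return context, weights

# Toy example: 3 input positions with orthogonal encoder states.
keys = np.eye(3)
query = np.array([0.0, 5.0, 0.0])  # query aligned with position 1
context, weights = dot_product_attention(query, keys, keys)
# weights concentrate on position 1, so context ≈ keys[1]
```

Because the weights are a differentiable function of the query and keys, the whole mechanism can be trained end-to-end with the rest of the network, which is what makes it practical inside recurrent sequence-to-sequence models.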