Stanford CME295 Transformers & LLMs | Autumn 2025 | Lecture 1 - Transformer
Author: Stanford Online
Uploaded: 2025-10-17
Views: 365,622
For more information about Stanford’s graduate programs, visit: https://online.stanford.edu/graduate-...
September 26, 2025
This lecture covers:
• Background on NLP and tasks
• Tokenization
• Embeddings
• Word2vec, RNN, LSTM
• Attention mechanism
• Transformer architecture
To follow along with the course schedule and syllabus, visit: https://cme295.stanford.edu/syllabus/
Chapters:
00:00:00 Introduction
00:03:54 Class logistics
00:09:40 NLP overview
00:22:57 Tokenization
00:30:28 Word representation
00:53:23 Recurrent neural networks
01:06:47 Self-attention mechanism
01:13:53 Transformer architecture
01:29:53 Detailed example
Afshine Amidi and Shervine Amidi are Adjunct Lecturers at Stanford University.
View the course playlist: • Stanford CME295: Transformers and Large La...