Why LLMs Are Not AGI – The Evolution and Future of AI (Talk)
Author: Barani A
Uploaded: 2025-12-24
Views: 131
Large Language Models like ChatGPT feel intelligent — but are they?
In this talk, I explain why LLMs ≠ AGI, and what’s missing for real intelligence.
This talk reflects my personal understanding of current AI research and industry trends, based on publicly available work and ongoing discussions in the field.
This talk will explore the evolution of AI — from symbolic systems to deep learning, LLMs, agents, and world models.
I’ll cover:
• Why LLMs excel at language but fail at prediction
• Self-attention and autoregressive generation, explained intuitively (see the short sketch after this list)
• Why agents add action but not understanding
• World models, JEPA, and Yann LeCun’s vision for real intelligence
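
For anyone who wants a concrete feel for the self-attention and autoregressive-generation bullet above, here is a minimal NumPy sketch of causal scaled dot-product attention. It is only an illustration under simplifying assumptions: the shapes, weight matrices, and toy data are made up and do not correspond to any real model, and it shows a single attention head without the rest of the transformer.

```python
# A minimal sketch of causal (masked) scaled dot-product self-attention in NumPy.
# All names, shapes, and the random toy data below are illustrative assumptions,
# not the weights or dimensions of any real model.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_head) projection matrices."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])    # similarity of each token to every other token
    # Causal mask: each token may attend only to itself and earlier tokens.
    # This left-to-right constraint is what makes autoregressive generation
    # (predicting the next token from the tokens seen so far) possible.
    mask = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[mask] = -np.inf
    return softmax(scores) @ V                 # weighted mix of value vectors per token

# Toy usage: a "sentence" of 4 tokens with 8-dimensional embeddings, one head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = causal_self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one context-aware vector per input token
```

In generation, a model built from layers like this repeatedly takes the tokens produced so far, computes such context-aware vectors, and samples the next token, which is the autoregressive loop the talk walks through.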
This talk is designed for everyone — whether you’re casually curious about AI, you work with it every day, or you’re deep into research. Using one simple example, I walk through the entire evolution of AI: from the early symbolic days, to deep learning, to world models, and what the road to future AI may actually look like.
So no matter your background, there’s something interesting (and hopefully thought-provoking!) in there.
0:00 Introduction
5:54 Sections/Eras
6:14 Symbolic AI
9:06 Machine Learning
16:31 Deep Learning
28:20 Generative AI
29:09 Why LLMs Exploded
30:12 Self-Attention (the Core of the Transformer Architecture)
37:40 How LLMs Generate Text
43:58 Agentic AI
51:04 What's Next?
51:42 Predictive World Models
56:45 Energy-Based Models
59:26 JEPA
1:01:03 Why JEPA Is Closer to AGI Than LLMs
1:06:13 The Staircase of Intelligence
1:07:21 How the Robot Can Finally Become Intelligent
1:08:05 What This Means for Engineers
Sources and reference links:
https://ourworldindata.org/brief-history-o...
https://proceedings.neurips.cc/paper_files...
• Yann LeCun "Mathematical Obstacles on the ...
https://epichka.com/blog/2023/qkv-transfor...
https://cdn.educba.com/academy/wp-content/...