An AI Expert Explains How to Prevent The Apocalypse

Author: Kainos

Uploaded: 2025-10-18

Views: 2164

Description:

Nate Soares is the co-author, with Eliezer Yudkowsky, of the New York Times bestseller "If Anyone Builds It, Everyone Dies". A former Google and Microsoft engineer, he is also the director of the influential Machine Intelligence Research Institute. In this conversation, we explore how an ASI (Artificial Superintelligence) could annihilate humanity, what most people get wrong about ASI risk, and what we can do about it today.

Check out 'If Anyone Builds It, Everyone Dies' here: https://ifanyonebuildsit.com/
MIRI: (https://intelligence.org/)
Join the Kainos Substack: beiner.kainos.com

Related videos

How Afraid of the AI Apocalypse Should We Be? | The Ezra Klein Show

EP 327 Nate Soares on Why Superhuman AI Would Kill Us All

Godfather of AGI on Why Big Tech Innovation is Over

Will AI Kill Us All? Nate Soares on His Controversial Bestseller

The AI Safety Expert: These Are The Only 5 Jobs That Will Remain In 2030! - Dr. Roman Yampolskiy

A Scientist Is Trying to Save People from AI (featuring Nate Soares)

AI FUTURE THAT CAN DESTROY US | Superintelligence Is Getting Closer — Nick Bostrom × Jonas von Essen

Why Building Superintelligence Means Human Extinction (with Nate Soares)

Superhuman AI: If Anyone Builds It ... Everyone Dies? | Semafor Tech

Godfather of AI WARNS: "You Have No Idea What's Coming"

Full interview: "Godfather of AI" shares prediction for future of AI, issues warnings

Godfather of AI: They Keep Silencing Me But I’m Trying to Warn Them!

OpenAI Is Sinking. Google Is Tearing Up the Industry. AI Is Heading to Space / November AI Recap

Richard Sutton – Father of RL thinks LLMs are a dead end

Daniel Kokotajlo on how superintelligent AIs could build a self-replicating robot economy in months

Ex-Google Whistleblower: Why Tech CEOs Fear A.I. Could End Humanity

You're Not Falling Behind (Yet): How to Master AI in 17 Minutes

ASI Risks: Similar premises, opposite conclusions | Eliezer Yudkowsky vs Mark Miller

Robot Plumbers, Robot Armies, and Our Imminent A.I. Future | Interesting Times with Ross Douthat

Scientists Reveal THE TRUTH About AI TAKEOVER in 2029 | Ray Kurzweil & The Lost Wanderer
