
Modern Hopfield Networks - Dr Sepp Hochreiter

Author: IARAI Research

Uploaded: 2020-12-21

Views: 9375

Description:

Dr Sepp Hochreiter is a pioneer in the field of Artificial Intelligence (AI). He was the first to identify the key obstacle to Deep Learning and then discovered a general approach to address this challenge. He thus became the founding father of modern Deep Learning and AI.

Sepp Hochreiter is a founding director of IARAI, a professor at Johannes Kepler University Linz and a recipient of the 2020 IEEE Neural Networks Pioneer Award.

In a recent groundbreaking paper, "Hopfield Networks is All You Need", Sepp Hochreiter's team introduced a new modern Hopfield network with continuous states that can store exponentially many patterns and converges very quickly.

Abstract: Associative memories are among the earliest artificial neural models, dating back to the 1960s and 1970s. Best known are Hopfield Networks, presented by John Hopfield in 1982. Recently, Modern Hopfield Networks have been introduced, which tremendously increase the storage capacity and converge extremely fast. This new Hopfield network can store exponentially many patterns (in the dimension), converges with one update, and has exponentially small retrieval errors. The number of stored patterns is traded off against convergence speed and retrieval error. The new Hopfield network has three types of energy minima (fixed points of the update): (1) a global fixed point averaging over all patterns, (2) metastable states averaging over a subset of patterns, and (3) fixed points which store a single pattern. Transformer and BERT models operate in their first layers preferentially in the global averaging regime, while in higher layers they operate in metastable states. The gradient in transformers is maximal for metastable states, is uniformly distributed for global averaging, and vanishes for a fixed point near a stored pattern. Using the Hopfield network interpretation, we analyzed the learning of transformer and BERT models. Learning starts with attention heads that average, and then most of them switch to metastable states. However, the majority of heads in the first layers still average and can be replaced by averaging, e.g. our proposed Gaussian weighting. In contrast, heads in the last layers steadily learn and seem to use metastable states to collect information created in lower layers. These heads seem to be a promising target for improving transformers. Neural networks with Hopfield networks outperform other methods on immune repertoire classification, where the Hopfield net stores several hundred thousand patterns. We provide a new PyTorch layer called "Hopfield", which allows deep learning architectures to be equipped with modern Hopfield networks as a new powerful concept comprising pooling, memory, and attention.
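The retrieval rule behind these results is the continuous Hopfield update ξ_new = X softmax(β Xᵀ ξ), which is essentially the attention mechanism of transformers applied to a set of stored patterns. Below is a minimal PyTorch sketch of one such retrieval step, written for illustration only; the function name, the β value, and the toy data are assumptions, and this is not the released "Hopfield" layer itself.

import torch

# One retrieval step of a continuous modern Hopfield network:
# xi_new = X^T softmax(beta * X xi), with the stored patterns X as rows.
def hopfield_update(patterns, state, beta=8.0):
    scores = beta * patterns @ state          # (N,) similarity of the state to each stored pattern
    weights = torch.softmax(scores, dim=0)    # softmax over the N stored patterns (attention weights)
    return patterns.t() @ weights             # updated state: convex combination of stored patterns

# Toy demo (illustrative): store random patterns and retrieve one from a noisy cue.
torch.manual_seed(0)
d, n = 64, 100
patterns = torch.randn(n, d)
noisy_cue = patterns[0] + 0.3 * torch.randn(d)

xi = noisy_cue
for _ in range(3):                            # the paper reports that one update usually suffices
    xi = hopfield_update(patterns, xi)

print(torch.nn.functional.cosine_similarity(xi, patterns[0], dim=0).item())

A large β drives retrieval toward a single stored pattern, while a small β yields the averaging (metastable or global) regimes discussed above. In the paper's released code this update is packaged as the PyTorch "Hopfield" layer mentioned in the abstract, so it can be plugged into existing architectures as pooling, memory, or attention.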

Subscribe to our newsletter and stay in the know:
https://www.iarai.ac.at/event-type/se...
_________
IARAI | Institute of Advanced Research in Artificial Intelligence
www.iarai.org


Related videos

Visual Learning with Reduced Supervision - Dr. Christian Rupprecht

How GNNs and Symmetries can help to solve PDEs - Max Welling

ICML 2021 | Modern Hopfield Networks - Dr Sepp Hochreiter

A Memory Algorithm Inspired by How the Brain Works

Hopfield Networks is All You Need (Paper Explained)

Michael Bronstein: Breakthroughs with AI

Dense Associative Memory in Machine Learning

Neural diffusion PDEs, differential geometry, and graph neural networks - Michael Bronstein

Generative Model That Won 2024 Nobel Prize

The future of intelligence | Demis Hassabis (Co-founder and CEO of DeepMind)

Backpropagation and the brain

The Model That Won the 2024 Nobel Prize in Physics - Hopfield Networks

Abstraction and Analogy are the Keys to Robust AI - Melanie Mitchell

Graph Based Machine Learning Methods for Human Mobility Analysis - Henry Martin

Dmitry Krotov | Modern Hopfield Networks for Novel Transformer Architectures

ICLR 2021 Keynote - "Geometric Deep Learning: The Erlangen Programme of ML" - M Bronstein

Can We Build an Artificial Hippocampus?

The Hopfield Network: How Are Memories Stored in Neural Networks? [2024 Nobel Prize in Physics...

Theoretical Foundations of Graph Neural Networks

John Hopfield: Physics View of the Mind and Neurobiology | Lex Fridman Podcast #76
