
ESWEEK 2021 Education - Spiking Neural Networks

Author: Embedded Systems Week (ESWEEK)

Uploaded: 2021-11-03

Views: 26050

Description:

ESWEEK 2021 - Education Class C1, Sunday, October 10, 2021
Instructor: Priyadarshini Panda, Yale

Abstract: Spiking Neural Networks (SNNs) have recently emerged as an alternative to deep learning due to their huge energy efficiency benefits on neuromorphic hardware. In this presentation, we present important techniques for training SNNs that bring large benefits in terms of latency, accuracy, interpretability, and robustness. We will first delve into how training is performed in SNNs. Training SNNs with surrogate gradients offers computational benefits due to short latency and is also considered a more bio-plausible approach. However, due to the non-differentiable nature of spiking neurons, training becomes problematic, and surrogate methods have thus been limited to shallow networks compared to the conversion method. To address this training issue with surrogate gradients, we will also go over a recently proposed method, Batch Normalization Through Time (BNTT), which allows us to target with SNNs interesting applications beyond traditional image classification, such as video segmentation. Another critical limitation of SNNs is the lack of interpretability. While a considerable amount of attention has been given to optimizing SNNs, the development of explainability is still in its infancy. I will talk about our recent work on a bio-plausible visualization tool for SNNs, called Spike Activation Map (SAM), which is compatible with BNTT training. The proposed SAM highlights spikes with short inter-spike intervals, which carry discriminative information for classification. Finally, using the proposed BNTT and SAM, I will highlight the robustness of SNNs with respect to adversarial attacks. In the end, I will talk about interesting prospects of SNNs for non-conventional learning scenarios such as federated and distributed learning.
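
The abstract does not include code; purely as a rough illustration of the surrogate-gradient idea it mentions, the sketch below (in PyTorch, an assumption) defines a leaky integrate-and-fire layer with a Heaviside spike in the forward pass, a fast-sigmoid surrogate derivative in the backward pass, and a separate BatchNorm per time step in the spirit of BNTT. The neuron model, threshold, leak, and surrogate shape are illustrative choices, not the speaker's exact formulation.

# Minimal sketch: surrogate-gradient training of a spiking layer (assumed PyTorch).
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in backward."""

    @staticmethod
    def forward(ctx, membrane_potential):
        ctx.save_for_backward(membrane_potential)
        return (membrane_potential > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (membrane_potential,) = ctx.saved_tensors
        # Surrogate derivative 1 / (1 + |u|)^2 replaces the undefined
        # derivative of the step function (illustrative choice).
        surrogate = 1.0 / (1.0 + membrane_potential.abs()) ** 2
        return grad_output * surrogate


class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer unrolled over T time steps,
    with a separate BatchNorm per step (BNTT-style, as an assumption)."""

    def __init__(self, in_features, out_features, timesteps=10, leak=0.9):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features, bias=False)
        self.bntt = nn.ModuleList(
            [nn.BatchNorm1d(out_features) for _ in range(timesteps)]
        )
        self.timesteps = timesteps
        self.leak = leak

    def forward(self, x):
        # x: (batch, in_features), presented to the layer at every time step.
        mem = torch.zeros(x.size(0), self.fc.out_features, device=x.device)
        spikes = []
        for t in range(self.timesteps):
            mem = self.leak * mem + self.bntt[t](self.fc(x))
            out = SpikeFn.apply(mem - 1.0)   # spike when membrane crosses threshold 1.0
            mem = mem * (1.0 - out)          # hard reset after a spike
            spikes.append(out)
        return torch.stack(spikes)           # (timesteps, batch, out_features)


if __name__ == "__main__":
    layer = LIFLayer(784, 100)
    dummy = torch.rand(32, 784)
    out = layer(dummy)
    out.sum().backward()                     # gradients flow via the surrogate
    print(out.shape)                         # torch.Size([10, 32, 100])

Because the surrogate derivative stands in for the non-differentiable spike, backpropagation through time runs end-to-end over the unrolled time steps; the per-step BatchNorm is what a BNTT-style scheme adds on top of plain surrogate-gradient training.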

Bio: Priya Panda is an assistant professor in the electrical engineering department at Yale University, USA. She received her B.E. and Master's degrees from BITS, Pilani, India in 2013 and her PhD from Purdue University, USA in 2019. During her PhD, she interned at Intel Labs, where she developed large-scale spiking neural network algorithms for benchmarking the Loihi chip. She is the recipient of the 2019 Amazon Research Award. Her research interests include neuromorphic computing, deep learning, and algorithm-hardware co-design for robust and energy-efficient machine intelligence.


Related videos

ESWEEK 2021 Education - Neural Network Accelerator Design

ACACES 2023: Neuromorphic computing: from theory to applications, Lecture 1 – Yulia Sandamirskaya

Training Spiking Neural Networks Using Lessons From Deep Learning

The Most Important Algorithm in Machine Learning

Spiking Neural Networks for More Efficient AI Algorithms

Intro to Binarized Neural Networks

Dendrites: Why Biological Neurons Are Deep Neural Networks

Neuromorphic Computing - Dr. Kwabena Boahen

Why spiking neural networks are important - Simon Thorpe, CERCO

Neuromorphic Computing Explained | Jeffrey Shainline and Lex Fridman

Cosyne 2022 Tutorial on Spiking Neural Networks - Part 1/2

How the Brain Makes You: Collective Intelligence and Computation by Neural Circuits

Introduction to Next Generation Reservoir Computing

Neuromorphic computing with emerging memory devices

But What Is a Neural Network? | Chapter 1, Deep Learning

Tutorial on snnTorch: Jason Eshraghian ICONS 2021

Brain-Like (Neuromorphic) Computing - Computerphile

MIT Introduction to Deep Learning | 6.S191

Hopfield Networks: How Are Memories Stored in Neural Networks? [2024 Nobel Prize in Physics...]

Why Does Diffusion Work Better Than Autoregression?
