Atmanirbhar Bharat 2nd Lecture Series 2025, on the sub-theme 'Nobel Prize in Physics 2024'
Author: International Hindu School
Uploaded: 2025-11-10
Views: 388
Hopfield Network, Boltzmann Machine, and Restricted Boltzmann Machine (RBM)
1. Hopfield Network
• A recurrent neural network used mainly for associative memory and pattern recall.
• Consists of fully connected neurons with symmetric weights (no self-connections).
• Dynamics:
• Neurons update their states iteratively to minimize an energy function.
• The network converges to stable states (local minima), which represent stored memories.
• Learning rule: Hebbian learning — w_ij = (1/N) Σ_p x_i^p x_j^p over the stored patterns x^p, so neurons that are co-active in a pattern get a stronger mutual weight.
• Limitations:
• Limited storage capacity (~0.15 × number of neurons).
• May converge to wrong memories (local minima).
• Deterministic model (binary neurons).
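The recall dynamics above can be sketched in a few lines. This is a minimal illustrative implementation (not code from the lecture): Hebbian storage of one ±1 pattern, then asynchronous threshold updates that descend the energy until the noisy cue settles into the stored memory.

```python
import numpy as np

def train(patterns):
    """Hebbian rule: W = (1/N) * sum of outer products of stored patterns."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)            # no self-connections
    return W

def recall(W, state, steps=10, seed=0):
    """Asynchronous updates: each neuron takes the sign of its net input."""
    rng = np.random.default_rng(seed)
    state = state.copy()
    for _ in range(steps):
        for i in rng.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[0] = -noisy[0]                  # corrupt one bit
print(recall(W, noisy))               # settles back to the stored pattern
```

With a single stored pattern, the corrupted bit is pulled back because every other neuron's field points toward the memorized state, which is a local minimum of the energy.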
2. Boltzmann Machine (BM)
• A stochastic recurrent neural network.
• Like Hopfield, but:
• Uses probabilistic neurons (binary stochastic units).
• Includes hidden units (not just visible ones).
• Learns complex probability distributions.
• Uses simulated annealing to minimize energy.
• Learning is slow due to sampling over all neuron interactions.
• Energy function: E(s) = −Σ_{i<j} w_ij s_i s_j + Σ_i θ_i s_i, where s_i ∈ {0, 1} are unit states, w_ij are symmetric weights, and θ_i are thresholds; each unit turns on with probability σ(net input / T).
• Limitations:
• Training is computationally expensive due to full connectivity.
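The energy function and the stochastic update rule can be sketched as follows. The network sizes and values here are illustrative assumptions, not taken from the lecture; the point is that a unit switches on with probability σ(net input / T), and lowering the temperature T (simulated annealing) makes updates increasingly deterministic.

```python
import numpy as np

def energy(W, theta, s):
    """E(s) = -1/2 * s^T W s + theta . s  (W symmetric, zero diagonal)."""
    return -0.5 * s @ W @ s + theta @ s

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def update_unit(W, theta, s, i, T=1.0, rng=None):
    """Stochastic update: unit i turns on with probability sigmoid(net/T)."""
    rng = rng or np.random.default_rng()
    net = W[i] @ s - theta[i]
    s[i] = 1.0 if rng.random() < sigmoid(net / T) else 0.0
    return s

# Tiny two-unit example with a single excitatory connection:
W = np.array([[0., 1.],
              [1., 0.]])
theta = np.zeros(2)
s = np.array([1., 1.])
print(energy(W, theta, s))   # -> -1.0 (both units on is the low-energy state)
```

At high T the update is nearly a coin flip; as T → 0 it approaches the deterministic Hopfield rule, which is why annealing helps the network escape local minima early and settle late.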
3. Restricted Boltzmann Machine (RBM)
• A simplified and efficient version of BM.
• Architecture restriction:
• Two layers only: Visible (v) and Hidden (h).
• No connections within a layer (only between layers).
• This makes training faster and practical.
• Learning method: Contrastive Divergence (CD-k) — approximates gradients efficiently.
• Used for:
• Feature learning
• Dimensionality reduction
• Recommendation systems
• Pre-training deep networks (Deep Belief Nets)
• Key advantages:
• Faster training than BM
• Can learn useful hidden representations
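The bipartite restriction is what makes CD-k cheap: given the visibles, all hidden units are conditionally independent (and vice versa), so each phase is a single matrix operation. Below is a minimal CD-1 sketch with assumed layer sizes and data, not the lecture's exact code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 3
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def cd1_step(v0, lr=0.1):
    """One Contrastive Divergence (CD-1) update on a single binary vector."""
    global W, b_v, b_h
    # Positive phase: hidden probabilities driven by the data
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(n_hidden) < ph0).astype(float)
    # Negative phase: one Gibbs step back to the visibles, then up again
    pv1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(n_visible) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b_h)
    # Approximate gradient: data correlations minus reconstruction correlations
    W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
    b_v += lr * (v0 - v1)
    b_h += lr * (ph0 - ph1)

v = np.array([1., 1., 0., 0., 1., 0.])
for _ in range(100):
    cd1_step(v)
```

After training, the hidden probabilities sigmoid(v @ W + b_h) serve as a learned feature representation of v, which is exactly how RBMs are used for feature learning and for pre-training Deep Belief Nets.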