Physics Informed Neural Networks explained for beginners | From scratch implementation and code

Author: Vizuara

Premiered: Feb 23, 2025

Views: 6,650

Description:

Teaching your neural network to "respect" Physics



As universal function approximators, neural networks can learn to fit any dataset, however complex the underlying function. With deep neural networks, overfitting is not a feature; it is a bug.



Let us consider a hypothetical set of experiments. You throw a ball up (or at an angle) and note down the height of the ball at different points in time.



It is easy to train a neural network on this dataset so that you can predict the height of the ball even at time points where you did not record the height in your experiments.



First, let us discuss how this training is done.



You can construct a neural network with a few or many hidden layers. The input is time (t) and the output predicted by the neural network is the height of the ball (h).
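
As a concrete illustration, here is a minimal sketch of such a network in PyTorch (the lecture builds its own implementation; the layer sizes and the library choice here are assumptions for illustration):

import torch
import torch.nn as nn

# A small fully connected network: one input (time t), one output (height h).
model = nn.Sequential(
    nn.Linear(1, 32),   # input layer: t -> 32 hidden units
    nn.Tanh(),
    nn.Linear(32, 32),  # a second hidden layer
    nn.Tanh(),
    nn.Linear(32, 1),   # output layer: predicted height h(t)
)

t = torch.rand(10, 1)  # a batch of hypothetical time points
h_pred = model(t)      # with random weights, these predictions are meaningless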



The neural network is initialized with random weights, which means the predictions of h(t) it makes will initially be very bad.



We need to penalize the neural network for making these bad predictions, right? How do we do that? With a loss function.



The loss of a neural network is a measure of how bad its predictions are compared to the real data: the closer the predictions are to the data, the lower the loss.



The singular goal of neural network training is to minimize the loss.



The Mean Squared Error is at its minimum when the predictions are very close to the experimental data, as shown in the figure below.
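
In code, the data loss and the training loop might look like this (a sketch reusing model from the block above; the synthetic noisy measurements and the optimizer settings are assumptions):

import torch

g = 9.8  # gravitational acceleration in m/s^2
# Hypothetical noisy measurements: ball thrown up at 10 m/s, heights recorded over 2 s.
t_data = torch.linspace(0.0, 2.0, 20).unsqueeze(1)
h_data = 10.0 * t_data - 0.5 * g * t_data**2 + 0.1 * torch.randn_like(t_data)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(2000):
    optimizer.zero_grad()
    loss = torch.mean((model(t_data) - h_data) ** 2)  # Mean Squared Error
    loss.backward()
    optimizer.step()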



But there is a problem with this approach. What if your experimental data were not good? In the image below, you can see that one of the data points does not follow the trend shown by the rest of the dataset.



Knowing that real-life data may contain noise and outliers, it would not be wise to train a neural network to mimic this dataset exactly. Doing so results in something called overfitting.



If you are throwing a ball and observing its physics, then you already have some knowledge about the trajectory of the ball, based on Newton’s laws of motion.



The physics you assume may not be in perfect agreement with the experimental data, as shown above, but it is reasonable to expect that the experiments will not deviate too much from the physics.
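
Concretely, for a ball under gravity alone (neglecting air resistance), Newton's second law gives the ODE below, whose solution is the familiar parabolic trajectory; h_0 and v_0 are the initial height and velocity:

\frac{d^2 h}{dt^2} = -g, \qquad h(t) = h_0 + v_0\, t - \tfrac{1}{2} g t^2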



If you want to teach physics to a neural network, you have to somehow incentivize it to make predictions closer to what the physics suggests.



The goal of PINNs is to solve (or learn solutions to) differential equations by embedding the known ODEs directly into the neural network’s training objective (loss function).



The basic idea of a PINN is to train a neural network to minimize a loss function that includes the following terms (a minimal code sketch follows the list):



1) A data mismatch term (if observational data are available).

2) A physics loss term enforcing the differential equation itself (and initial/boundary conditions).
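
Here is a minimal sketch of how the combined loss might be computed with automatic differentiation (PyTorch autograd; model, t_data, and h_data come from the sketches above, and the equal weighting of the two terms is an assumption, since in practice the terms are often weighted):

import torch

g = 9.8
# Collocation points where the physics residual d^2h/dt^2 + g = 0 is enforced.
t_phys = torch.linspace(0.0, 2.0, 50).unsqueeze(1).requires_grad_(True)

h = model(t_phys)
# First and second time derivatives of the network output, via autograd.
dh_dt = torch.autograd.grad(h, t_phys, torch.ones_like(h), create_graph=True)[0]
d2h_dt2 = torch.autograd.grad(dh_dt, t_phys, torch.ones_like(dh_dt), create_graph=True)[0]

physics_loss = torch.mean((d2h_dt2 + g) ** 2)          # residual of d^2h/dt^2 = -g
data_loss = torch.mean((model(t_data) - h_data) ** 2)  # mismatch with measurements
# (Initial/boundary-condition terms from item 2 would be added similarly.)
loss = data_loss + physics_loss  # minimized jointly, as in the training loop above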


I have made a 60-minute lecture video on PINNs (meant even for absolute beginners) and hosted it on Vizuara's YouTube channel. Do check it out. I hope you enjoy watching this lecture as much as I enjoyed making it:

Here is the article with code: https://www.vizuaranewsletter.com/p/t...
