BERT 05 - Pretraining And Finetuning
Author: Balaji Srinivasan
Uploaded: 2022-09-04
Views: 7138
In this video, we will learn how to pre-train the BERT model. But what does pre-training mean? Suppose we have a model. First, we train it on a huge dataset for a particular task and save the trained weights. For a new task, instead of initializing a new model with random weights, we initialize it with the weights of the already trained (pre-trained) model. Since that model has already learned from a huge dataset, we do not train a new model from scratch; we take the pre-trained model and adjust (fine-tune) its weights for the new task. This is a form of transfer learning.
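The pretrain-then-fine-tune workflow described above can be sketched in miniature. The example below is not BERT; it is a toy logistic-regression "model" in plain Python, with made-up synthetic tasks, used only to illustrate the idea: train on a large dataset, reuse those weights as the starting point for a small related task, then fine-tune.

```python
import math
import random

random.seed(0)

def dot(w, x):
    return sum(wi * xi for wi, xi in zip(w, x))

def train(w, data, lr=0.1, epochs=200):
    """Gradient-descent training of a tiny logistic-regression 'model'."""
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for x, y in data:
            err = 1.0 / (1.0 + math.exp(-dot(w, x))) - y
            for i, xi in enumerate(x):
                grad[i] += err * xi
        for i in range(len(w)):
            w[i] -= lr * grad[i] / len(data)
    return w

def accuracy(w, data):
    return sum((dot(w, x) > 0) == (y == 1.0) for x, y in data) / len(data)

def make_data(n, true_w):
    """Synthetic labeled data for a linearly separable task."""
    out = []
    for _ in range(n):
        x = [random.gauss(0, 1) for _ in true_w]
        out.append((x, 1.0 if dot(true_w, x) > 0 else 0.0))
    return out

# "Pre-training": train from random/zero initialization on a large dataset.
big = make_data(2000, [2.0, -3.0, 1.0])
pretrained = train([0.0, 0.0, 0.0], big)

# "Fine-tuning": a small dataset for a related (hypothetical) new task.
# Instead of starting from scratch, we copy the pre-trained weights
# and adjust them with only a few epochs of further training.
small = make_data(40, [2.3, -2.8, 0.9])
finetuned = train(list(pretrained), small, epochs=20)

print(round(accuracy(finetuned, small), 2))
```

Because the fine-tuned model starts from weights that already capture a closely related task, a handful of epochs on the small dataset is enough; in practice with BERT the same pattern applies, except the pre-trained weights come from masked-language-model training on a large corpus.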
LinkedIn: / balaji2512
GitHub: https://github.com/balajisrinivas