Faithful-Newton: a variant of Newton's method for large-scale optimisation
Author: OPTIMA ARC
Uploaded: 2025-10-15
Views: 48
Speaker: Alexander Lim (University of Queensland)
Title: Faithful-Newton: a variant of Newton's method for large-scale optimisation
Summary: In machine learning (ML), gradient-based algorithms have long been the default choice for solving empirical risk minimisation problems. In this talk, I will provide a brief overview of an alternative approach, Newton’s method, discuss why it is not commonly used in ML, and introduce one possible way to make it more practical and effective in modern ML applications.
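For context on the kind of method the talk contrasts with gradient descent, the following is a minimal sketch of one classical Newton step for L2-regularised logistic regression, a standard empirical risk minimisation problem. This illustrates only the textbook method, not the speaker's Faithful-Newton variant; all names and parameter values here are illustrative assumptions.

```python
import numpy as np

def newton_step(w, X, y, reg=0.1):
    """One classical Newton step for L2-regularised logistic regression.

    Minimises f(w) = mean(log(1 + exp(-y_i * x_i^T w))) + (reg/2) * ||w||^2,
    with labels y in {-1, +1}. Illustrative only (not Faithful-Newton).
    """
    n = len(y)
    z = y * (X @ w)
    s = 1.0 / (1.0 + np.exp(z))                  # sigmoid(-z_i)
    grad = -(X.T @ (y * s)) / n + reg * w
    # Hessian: (1/n) X^T D X + reg * I, with D = diag(s_i * (1 - s_i))
    D = s * (1.0 - s)
    H = (X.T * D) @ X / n + reg * np.eye(X.shape[1])
    # Solve the Newton system H p = grad rather than inverting H explicitly
    return w - np.linalg.solve(H, grad)

# Toy problem: synthetic linearly generated labels
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = np.sign(X @ w_true)

w = np.zeros(5)
for _ in range(10):
    w = newton_step(w, X, y)
```

Each iteration costs a Hessian build and a linear solve, which is why plain Newton is rarely used at ML scale; the talk's premise is about making such second-order updates practical for large problems.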
Bio: Alexander Lim completed his PhD in Mathematics at the University of Queensland (UQ), under the supervision of Dr Fred Roosta. His research focuses on developing and analysing second-order methods for large-scale problems. Prior to his PhD, he obtained a Bachelor of Mathematics from UQ in 2020 and a Bachelor of Arts (Honours), with a major in the philosophy of mathematics and physics, from the University of Tasmania in 2016.