Algorithmic probability and the information distance (Bruno Bauwens)
Author: ФКН ВШЭ (HSE Faculty of Computer Science)
Uploaded: 2025-10-18
Views: 193
The conditional Kolmogorov complexity of a string x given a string y is the minimal length of a program that, on input y, prints x and halts. Andrei Kolmogorov promoted the study of this notion in 1964, hoping for a theory that could somehow justify his famous axioms of probability. To connect complexity to probability, however, one should use a variant based on self-delimiting programs. This variant can be defined in four different ways, one of which is the negative logarithm of algorithmic probability (in discrete form). Algorithmic probability was introduced by Solomonoff in 1960 to describe learning in a very general way.
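For reference, the notions in this paragraph can be written out as follows (standard textbook notation, which may differ from the notation used in the talk). With U a universal machine and U' a universal machine with self-delimiting (prefix-free) programs:

    C(x \mid y) = \min\{\, |p| : U(p, y) = x \,\}       % conditional Kolmogorov complexity
    \mathbf{m}(x) = \sum_{p \,:\, U'(p) = x} 2^{-|p|}   % Solomonoff's algorithmic probability
    K(x) = -\log \mathbf{m}(x) + O(1)                   % coding theorem for prefix complexity K

The coding theorem in the last line is exactly the statement that prefix complexity equals the negative logarithm of algorithmic probability up to an additive constant.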
In various applications, there is a need for a symmetric notion of conditional complexity. The first proposal, from 1998, is the minimal length of a program that prints x on input y and also prints y on input x. The authors also prove that the other symmetrized definitions of conditional complexity are close to each other, but conjecture that they do not coincide exactly. Recently, I proved this conjecture and also showed that the four definitions differ only in strange corner cases (for example, when one string is exponentially more complex than the other).
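Assuming the 1998 proposal refers to the information distance paper of Bennett, Gács, Li, Vitányi and Zurek (my reading, not stated above), the two definitions being compared can be written as:

    E_0(x, y) = \min\{\, |p| : U(p, y) = x \text{ and } U(p, x) = y \,\}  % one program converting both ways
    E_{\max}(x, y) = \max\{ K(x \mid y),\; K(y \mid x) \}                 % max of the two conditional complexities

That paper shows the two quantities are equal up to an additive logarithmic term, which is the sense of "close to each other" here; the conjecture is that the additive term cannot be removed entirely.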
In this talk, I will briefly discuss applications of algorithmic probability and the algorithmic information distance to machine learning. Then I will prove the coding theorem and its approximate bidirectional variant. Finally, I will discuss recent results.
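The information distance itself is uncomputable, so in machine-learning applications Kolmogorov complexity is typically replaced by the length achieved by a real compressor, giving Li and Vitányi's normalized compression distance. A minimal Python sketch of this (my illustration, not code from the talk), with zlib as the compressor:

    import zlib

    def c(data: bytes) -> int:
        # Compressed length in bytes: a crude, computable stand-in for K(data).
        return len(zlib.compress(data, 9))

    def ncd(x: bytes, y: bytes) -> float:
        # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)).
        cx, cy, cxy = c(x), c(y), c(x + y)
        return (cxy - min(cx, cy)) / max(cx, cy)

    # Near-identical strings compress well together, so their NCD is close to 0.
    print(ncd(b"abracadabra" * 50, b"abracadabra" * 49 + b"abracadabrX"))
    # Unrelated data gives an NCD close to 1.
    print(ncd(b"abracadabra" * 50, bytes(range(256)) * 3))

Any real compressor works here; stronger ones such as bzip2 or LZMA usually approximate the uncomputable ideal better, and zlib is used only because it ships with the standard library.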
Bruno Bauwens is an associate professor at the Big Data and Information Retrieval School, HSE Faculty of Computer Science.
3 October 2025
• FCS Mathematics Seminar
HSE Faculty of Computer Science: https://cs.hse.ru/en/
📍 https://vk.com/cshse
📍 https://t.me/fcs_hse
📍 https://t.me/sci_fcs