Aarti Jha - When AI Makes Things Up: Understanding and Tackling Hallucinations - PyData Global 2025
Author: PyData
Uploaded: 2026-01-09
Views: 74
AI systems are increasingly being integrated into real-world products - from chatbots and search engines to summarisation tools and coding assistants. Yet, despite their fluency, these models can produce confident but false or misleading information, a phenomenon known as hallucination. In production settings, such errors can erode user trust, misinform decisions, and introduce serious risks. This talk unpacks the root causes of hallucinations, explores their impact on various applications, and highlights emerging techniques to detect and mitigate them. With a focus on practical strategies, the session offers guidance for building more trustworthy AI systems fit for deployment.
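To make the detection idea concrete, below is a minimal, illustrative Python sketch of one common signal the talk's topic covers: self-consistency checking, where the same prompt is sampled several times and low agreement between answers is treated as a hallucination warning sign. The function name, threshold, and string-based similarity are assumptions for illustration only; production systems typically use semantic similarity (embeddings or NLI models) rather than raw string matching, and this is not necessarily the approach presented in the talk.

```python
from difflib import SequenceMatcher
from itertools import combinations

def consistency_score(answers: list[str]) -> float:
    """Mean pairwise string similarity across independently sampled answers.

    Low agreement between samples suggests the model is guessing rather
    than recalling a stable fact, which correlates with hallucination.
    String matching is a crude stand-in for semantic similarity here.
    """
    if len(answers) < 2:
        return 1.0
    sims = [SequenceMatcher(None, a, b).ratio()
            for a, b in combinations(answers, 2)]
    return sum(sims) / len(sims)

# Toy usage: three samples of the same prompt (hypothetical data).
samples = [
    "The Eiffel Tower was completed in 1889.",
    "The Eiffel Tower was completed in 1889.",
    "The Eiffel Tower opened in 1912.",  # inconsistent sample
]
score = consistency_score(samples)
if score < 0.8:  # threshold is application-specific, chosen here for illustration
    print(f"Low self-consistency ({score:.2f}); flag answer for review.")
else:
    print(f"Answers agree ({score:.2f}).")
```

In practice a flagged answer would be routed to a stronger check such as retrieval-grounded verification or human review rather than rejected outright.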