Lecture 21: CS217 | SVM: Primal-Dual Formulation, KKT Conditions & Kernel Trick | AI-ML | IITB 2025
Author: Prof. Pushpak Bhattacharyya | IIT Bombay
Uploaded: 2025-03-06
Views: 594
Welcome to Lecture 21 of the CS217: AI-ML Course by IIT Bombay. In this session—led by Nihar Ranjan Sahoo (a final-year PhD student)—we extend the discussion on Support Vector Machines (SVMs) from the previous lecture (where we introduced soft margins, slack variables, and outlier handling) and delve into the mathematical framework of the primal-dual approach, the Karush–Kuhn–Tucker (KKT) conditions, and the kernel trick for handling non-linear data.
Topics Covered
Primal & Dual Formulation - Setting up the primal objective to minimize ½‖w‖² (plus penalty terms for misclassifications in the soft-margin case). Converting to the dual problem by introducing Lagrange multipliers (α) and understanding how this yields an equivalent, often more computationally efficient optimization. Interpreting why only certain data points (the support vectors) have nonzero multipliers (αᵢ > 0), and how they alone define the decision boundary.
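For reference, the soft-margin primal and the dual obtained from it (in the standard textbook form the lecture follows) can be written as:

```latex
% Soft-margin primal: margin maximization with slack penalties
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^{2} + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad y_i\,(w^{\top}x_i + b) \ge 1 - \xi_i,\qquad \xi_i \ge 0.

% Dual: introduce Lagrange multipliers \alpha_i, eliminate w, b, \xi
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i
 - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\,\alpha_j\,y_i\,y_j\,x_i^{\top}x_j
\quad\text{s.t.}\quad 0 \le \alpha_i \le C,\qquad \sum_{i=1}^{n}\alpha_i\,y_i = 0.
```

Note that the data enter the dual only through the inner products x_iᵀx_j, which is what makes the kernel trick (below) possible.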
KKT Conditions - The role of partial derivatives (w.r.t. w, b, and slack variables) in deriving the dual. How satisfaction of these conditions guarantees equivalence between primal and dual solutions (strong duality). Connection to support vectors: points that lie on or within the margin boundaries.
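As a sketch of the conditions the lecture derives, the KKT system for the soft-margin Lagrangian (with multipliers αᵢ for the margin constraints and μᵢ for ξᵢ ≥ 0) reads:

```latex
% Stationarity: set partial derivatives of the Lagrangian to zero
\frac{\partial L}{\partial w} = 0 \;\Rightarrow\; w = \sum_i \alpha_i\,y_i\,x_i,
\qquad
\frac{\partial L}{\partial b} = 0 \;\Rightarrow\; \sum_i \alpha_i\,y_i = 0,
\qquad
\frac{\partial L}{\partial \xi_i} = 0 \;\Rightarrow\; \alpha_i + \mu_i = C.

% Complementary slackness: only points on or inside the margin get \alpha_i > 0
\alpha_i \left[\, y_i\,(w^{\top}x_i + b) - 1 + \xi_i \,\right] = 0,
\qquad
\mu_i\,\xi_i = 0.
```

The complementary-slackness conditions explain the support-vector picture: any point strictly outside the margin satisfies its constraint with slack, forcing αᵢ = 0.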
Kernel Trick - Motivation: mapping non-linearly separable data into higher-dimensional spaces for linear separability. Examples of commonly used kernels (Polynomial, RBF), with discussion of their computational advantages and how SVM libraries implement them without explicitly constructing high-dimensional feature vectors. Building a kernel matrix (pairwise dot products in transformed space) and how it simplifies training for high-dimensional or infinite-dimensional feature mappings.
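A minimal sketch of the kernel-matrix idea (not the lecture's exact code): the RBF kernel value exp(−γ‖xᵢ − xⱼ‖²) is computed directly from pairwise distances, so the (infinite-dimensional) feature map is never materialized.

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=0.5):
    """Kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2).

    The transformed feature vectors are never constructed; only
    pairwise squared distances in the input space are needed.
    """
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, computed via broadcasting
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    # clip tiny negative values caused by floating-point round-off
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
K = rbf_kernel_matrix(X, gamma=0.5)
# K is symmetric with ones on the diagonal (each point is at distance 0 from itself)
```

Training then only ever touches K, which is why the same dual solver works for polynomial, RBF, or any other valid kernel.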
Code Examples: Demonstration of SVM implementation using scikit-learn's SVC class with different kernels and parameters.
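A short sketch in the spirit of that demonstration (assumed, not the lecture's exact script): scikit-learn's SVC on a toy dataset that is not linearly separable, comparing kernels.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric circles: impossible for a linear boundary, easy for RBF
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

for kernel in ["linear", "poly", "rbf"]:
    clf = SVC(kernel=kernel, C=1.0, gamma="scale", degree=3)
    clf.fit(X, y)
    # The support vectors are exactly the points with nonzero dual
    # coefficients (alpha_i > 0) found by the dual optimization
    print(kernel, "accuracy:", clf.score(X, y),
          "support vectors:", clf.support_vectors_.shape[0])
```

Varying C trades margin width against slack penalties, and gamma controls the RBF kernel's locality; both are the main knobs tuned in practice.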
This lecture is part of the CS217 course taught by Prof. Pushpak Bhattacharyya at IIT Bombay. It offers a deeper look into how SVMs handle a wide range of real-world datasets—both linearly separable and non-linear—through robust optimization techniques.
#supportvectormachines #svm #primaldual #kktconditions #kerneltrick #machinelearning #aiml #iitbombay #iitb #computerscience #softmargin #hardmargin #svmkernel #rbfkernel #polynomialkernel #cs217 #optimizationtechniques #aimlcourse #aicourse #artificialintelligence #maximummargin #hyperplanes #hyperplane #lagrangian #dualformulation #slackvariables #machinelearningtheory #classifiers #marginoptimization #outliers #iitlecture