This SIMPLE XGBoost Trick Boosts Your Accuracy - XGBoost Classification Step-by-Step Guide
Author: Super Data Science
Uploaded: 2025-02-10
Views: 1200
Learn how XGBoost classification works and why it’s different from regression. In this tutorial, we break down key concepts like probability calculations, log-odds, decision tree splitting, and residuals. Whether you're a beginner or looking for advanced insights, this video will give you a strong foundation in XGBoost classification.
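The description mentions that XGBoost classification works with probabilities, log-odds, and residuals rather than raw target values. As a minimal sketch of those ideas (not the video's own code): XGBoost classification typically starts from an initial predicted probability of 0.5 (log-odds of 0), and residuals are the observed labels minus the predicted probabilities.

```python
import math

def log_odds_to_probability(log_odds):
    """Convert log-odds to a probability via the logistic (sigmoid) function."""
    return 1.0 / (1.0 + math.exp(-log_odds))

# XGBoost classification commonly starts from a default probability of 0.5,
# i.e. log-odds of 0; each tree then adds a correction in log-odds space.
initial_log_odds = 0.0
p = log_odds_to_probability(initial_log_odds)  # 0.5

# Residual for each example: observed label (0 or 1) minus predicted probability.
labels = [1, 0, 1, 1]
residuals = [y - p for y in labels]  # [0.5, -0.5, 0.5, 0.5]
```

Each new tree is fit to these residuals, and its output (in log-odds) is added to the running prediction before converting back to a probability.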
Course Link HERE: https://sds.courses/ml-2
You can also find us here:
Website: https://www.superdatascience.com/
Facebook: / superdatascience
Twitter: / superdatasci
LinkedIn: / superdatascience
Contact us at: [email protected]
📌 Chapters:
00:00 - Introduction to XGBoost Classification
00:28 - Understanding Classification and Probabilities
00:50 - How XGBoost Predicts Classes
01:15 - Calculating Residuals in Classification
01:44 - Similarity Score Differences in Classification
02:17 - How XGBoost Finds the Best Split
02:48 - Why Classification Uses Different Similarity Scores
03:13 - The Role of Probability in XGBoost
03:48 - Why We Can't Use Residual Charts in Classification
04:20 - Decision Tree Splitting and Leaf Outputs
04:53 - The Formula Behind Leaf Outputs
05:26 - Understanding Log-Odds in Classification
05:57 - How Decision Trees Use Log-Odds
06:15 - The Math Behind Log-Odds and Probabilities
06:44 - Why We Use Log-Odds Instead of Probabilities
07:16 - Building Trees in the World of Log-Odds
07:48 - Converting Log-Odds Back to Probabilities
08:17 - Residuals and Probabilities in XGBoost
08:46 - Advanced Probability Calculations in Classification
09:22 - How XGBoost Sums Log-Odds for Predictions
09:39 - Final Thoughts on XGBoost Classification
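Several chapters above (similarity scores, finding the best split, leaf outputs) refer to the standard XGBoost classification formulas. As a hedged sketch of what those formulas typically look like (the regularization parameter `lam` and the toy values here are illustrative assumptions, not from the video):

```python
def similarity_score(residuals, probs, lam=1.0):
    """Classification similarity score:
    (sum of residuals)^2 / (sum of p*(1-p) + lambda)."""
    numerator = sum(residuals) ** 2
    denominator = sum(p * (1 - p) for p in probs) + lam
    return numerator / denominator

def leaf_output(residuals, probs, lam=1.0):
    """Leaf output (in log-odds):
    sum of residuals / (sum of p*(1-p) + lambda)."""
    return sum(residuals) / (sum(p * (1 - p) for p in probs) + lam)

# Toy example: two samples in a leaf, both predicted p = 0.5 initially.
residuals = [0.5, 0.5]
probs = [0.5, 0.5]

sim = similarity_score(residuals, probs, lam=0.0)   # 1 / 0.5 = 2.0
out = leaf_output(residuals, probs, lam=0.0)        # 1 / 0.5 = 2.0

# Gain of a candidate split = left similarity + right similarity - root similarity;
# XGBoost keeps the split with the highest gain.
```

The `p*(1-p)` term is what makes classification different from regression, where the denominator is simply the number of residuals plus lambda.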
#MachineLearning #XGBoost #DataScience #AI #Classification #LogOdds #Probability #DecisionTrees #MLAlgorithms #AIforBeginners #DataAnalytics #PredictiveModeling #PythonML #DeepLearning #XGBoostTutorial