$\color{black}\rule{365px}{3px}$

Gradient Boosting is a powerful machine learning technique used for regression and classification tasks. It builds an ensemble of weak learners, typically decision trees (CART), in a sequential manner to produce a strong predictive model. The core idea is to focus on the mistakes made by previous models and correct them in subsequent iterations.

<aside>

Each new model is trained to predict the residuals (errors) of the combined ensemble of previous models.

</aside>
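This residual-fitting loop can be sketched in a few lines. The sketch below is a minimal, hypothetical illustration (toy data, shallow scikit-learn regression trees as the weak learners, a fixed learning rate), not a production implementation:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Toy regression data (hypothetical): y = x^2 plus noise
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = X[:, 0] ** 2 + rng.normal(scale=0.1, size=200)

n_trees, lr = 50, 0.1
F = np.full_like(y, y.mean())   # F_0: constant initial prediction
trees = []
for _ in range(n_trees):
    residuals = y - F                                  # errors of the current ensemble
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
    F += lr * tree.predict(X)                          # correct those errors
    trees.append(tree)

def predict(X_new):
    """Sum the initial constant and all scaled tree corrections."""
    pred = np.full(len(X_new), y.mean())
    for t in trees:
        pred += lr * t.predict(X_new)
    return pred
```

Each tree is fit to the residuals of the ensemble so far, and its (learning-rate-scaled) prediction is added to the running model.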

Algorithm Steps

$\color{black}\rule{365px}{3px}$


Step 1: Initialize the Model


Start with an initial model $F_0(x)$, a constant that minimizes the loss function over the training data. For regression with squared-error loss, this is the mean of the target variable.

$$ F_0(x)=\arg\min_{\gamma} \sum_{i=1}^{n} L(y_i,\gamma) $$
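A quick numeric check of this claim: for squared-error loss, a grid search over constant predictions $\gamma$ lands on the mean of $y$. The targets below are hypothetical toy values:

```python
import numpy as np

y = np.array([3.0, 5.0, 10.0])  # toy targets (hypothetical)

# Grid-search the constant gamma minimizing sum_i (y_i - gamma)^2;
# the minimizer should coincide with the mean of y.
gammas = np.linspace(0, 15, 1501)
losses = [((y - g) ** 2).sum() for g in gammas]
best = gammas[int(np.argmin(losses))]
print(best, y.mean())  # both 6.0
```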

Step 2: Iterative Boosting Process


For each boosting iteration $m=1$ to $M$: