Gradient Boosting Machine (GBM)
An ensembling technique in which, instead of averaging the predictions of all models, each successive model predicts the error (residual) of the previous one. The final prediction is the sum of the first model's prediction and all the predicted errors.
1. The first model produces a prediction.
2. The difference between this prediction and the actual values is obtained.
3. This difference is set as the new target.
4. The next model attempts to predict this difference.
5. Repeat steps 2-4 for as many models as desired.
6. Sum the first model's prediction and all the predicted differences to obtain the final prediction.
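The steps above can be sketched in a few lines of numpy. This is a minimal illustration, not a production implementation: the base learner here is a hypothetical one-feature regression stump (`fit_stump`), the first "model" is simply the mean of the targets, and `lr` is a shrinkage factor commonly used in GBMs (assumed here, not mentioned above).

```python
import numpy as np

def fit_stump(X, y):
    # Toy base learner (an illustrative assumption): a single-threshold
    # regression stump minimizing squared error on one feature.
    best = None
    for t in np.unique(X):
        left, right = y[X <= t], y[X > t]
        if len(left) == 0 or len(right) == 0:
            continue
        pl, pr = left.mean(), right.mean()
        err = ((left - pl) ** 2).sum() + ((right - pr) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, t, pl, pr)
    _, t, pl, pr = best
    return lambda x: np.where(x <= t, pl, pr)

def gbm_fit(X, y, n_rounds=20, lr=0.5):
    base = y.mean()                       # step 1: first model's prediction
    residual = y - base                   # step 2: difference from actual values
    models = []
    for _ in range(n_rounds):             # step 5: repeat for n_rounds models
        m = fit_stump(X, residual)        # steps 3-4: fit the current residual
        models.append(m)
        residual = residual - lr * m(X)   # shrink, then re-target the remainder
    def predict(x):
        # step 6: first prediction plus all (shrunken) predicted differences
        return base + lr * sum(m(x) for m in models)
    return predict

# Toy data: a step function, learned almost exactly after a few rounds.
X = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([0., 0., 0., 1., 1., 1.])
predict = gbm_fit(X, y)
```

With each round the residual shrinks, so `predict(X)` converges toward `y` on the training data, which is also exactly why the note below about overfitting applies.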
Note
While this technique tends to produce better results, it is also more likely to overfit. This is because each successive model keeps minimizing the difference between the predictions and the actual values in the training set.