XGBoost


Last updated 2020-05-20
Summary
  • when gradient-boosted trees are appropriate and what they assume about the data.
  • the regularised objective and how its penalties shape the fitted trees.
  • implementation and validation choices (learning rate, early stopping) for stable results.

Intuition #

XGBoost builds an ensemble of decision trees sequentially: each new tree is fit to the gradient of the loss on the current predictions, so errors left by earlier trees are corrected step by step. Regularisation on tree size and leaf weights, together with the learning rate, governs how well the model generalises.
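This intuition can be made concrete with the regularised objective from the XGBoost paper, where at round $t$ the tree $f_t$ is chosen to minimise (here $T$ is the number of leaves in $f_t$ and $w_j$ its leaf weights):

$$\mathcal{L}^{(t)} = \sum_{i=1}^{n} l\!\left(y_i,\ \hat{y}_i^{(t-1)} + f_t(x_i)\right) + \Omega(f_t), \qquad \Omega(f) = \gamma T + \tfrac{1}{2}\lambda \sum_{j=1}^{T} w_j^2$$

The $\gamma T$ term penalises extra leaves and $\lambda$ shrinks leaf weights, which is the regularisation that distinguishes XGBoost from plain gradient boosting.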

Detailed Explanation #

XGBoost (eXtreme Gradient Boosting) is a gradient boosting implementation that focuses on regularisation and speed. It offers features such as built-in missing-value handling, sparsity-aware split finding, and parallel tree construction, making it a staple in both competitions and production systems.
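To illustrate the core mechanism, here is a minimal, self-contained sketch of gradient boosting with decision stumps under squared loss (where the negative gradient is simply the residual). This is not the XGBoost library or its API; the function names and hyperparameters (`n_rounds`, `learning_rate`) are illustrative, and real XGBoost adds the regularisation, missing-value handling, and optimisations described above.

```python
def fit_stump(x, residuals):
    """Find the split on x that best fits residuals with two constant leaves."""
    best = None
    for split in sorted(set(x)):
        left = [r for xi, r in zip(x, residuals) if xi <= split]
        right = [r for xi, r in zip(x, residuals) if xi > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, split, lmean, rmean)
    _, split, lmean, rmean = best
    return lambda xi: lmean if xi <= split else rmean

def boost(x, y, n_rounds=50, learning_rate=0.3):
    """Additively fit stumps to the residuals (the negative gradient
    of squared loss), shrinking each stump by the learning rate."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(n_rounds):
        residuals = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        pred = [pi + learning_rate * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(learning_rate * s(xi) for s in stumps)

# Toy 1-D regression: two clusters of targets around 1 and 5.
x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 5.1, 4.8, 5.2]
model = boost(x, y)
```

Each round fits a weak learner to what the current ensemble still gets wrong; the learning rate trades off per-round progress against overfitting, which is why it is one of the first parameters tuned in practice.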