Ensemble | Machine Learning Basics

Chapter 4

Ensemble

Ensembles blend multiple models to boost accuracy and robustness: averaging diverse models reduces variance, while sequential correction can also reduce bias.
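The variance-reduction effect can be seen in a minimal simulation (illustrative sketch only, not from the original text): averaging k unbiased predictors with independent errors shrinks the error variance by roughly a factor of k.

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 1.0
# Simulate 10 unbiased "base models", each predicting truth plus independent noise.
preds = truth + rng.normal(0.0, 1.0, size=(10_000, 10))

single_var = preds[:, 0].var()           # error variance of one model
ensemble_var = preds.mean(axis=1).var()  # error variance of the 10-model average

print(single_var, ensemble_var)  # the average is far less variable
```

With correlated base models the reduction is smaller, which is why diversity among base learners matters.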

Families

  • Bagging: bootstrap samples; reduces variance (e.g., Random Forests).
  • Boosting: sequential learners correcting residuals (e.g., XGBoost, LightGBM, CatBoost).
  • Stacking: meta‑model over base learners; requires careful CV to avoid leakage.
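A minimal sketch of the first two families, assuming scikit-learn and a synthetic dataset (the models and parameters are illustrative choices, not prescribed by the text):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bagging: deep trees on bootstrap samples, averaged to cut variance.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Boosting: shallow trees added sequentially, each fitting the remaining errors.
gb = GradientBoostingClassifier(n_estimators=200, max_depth=2, random_state=0).fit(X_tr, y_tr)

print("random forest accuracy:", rf.score(X_te, y_te))
print("gradient boosting accuracy:", gb.score(X_te, y_te))
```

The libraries named in the list (XGBoost, LightGBM, CatBoost) follow the same boosting idea with their own optimizations.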

Tips

  • Keep base models diverse (algorithms/features/seeds).
  • Use cross‑validated out‑of‑fold predictions for stacking.
  • Tune depth/learning rate/regularization to prevent overfitting.
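The stacking tip above can be sketched as follows, assuming scikit-learn; `cross_val_predict` yields out-of-fold predictions, so the meta-model never trains on a base model's in-sample outputs (the specific base and meta models are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

# Diverse base learners (different algorithms), per the tips above.
base_models = [
    RandomForestClassifier(n_estimators=100, random_state=0),
    GradientBoostingClassifier(random_state=0),
]

# Out-of-fold probabilities: each row is predicted by a model that never saw it,
# which avoids the leakage the stacking bullet warns about.
oof = np.column_stack([
    cross_val_predict(m, X, y, cv=5, method="predict_proba")[:, 1]
    for m in base_models
])

meta = LogisticRegression().fit(oof, y)  # meta-model over base predictions
print("meta-model accuracy on OOF features:", meta.score(oof, y))
```

scikit-learn's `StackingClassifier` packages this same out-of-fold procedure; the manual version here just makes the leakage-avoidance step explicit.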