When nothing works, Boosting does. Nowadays many people use XGBoost, LightGBM, or CatBoost to win competitions on Kaggle or at hackathons. AdaBoost is the first stepping stone into the world of Boosting.
AdaBoost was one of the first boosting algorithms to be adopted in practice. AdaBoost helps you combine multiple “weak classifiers” into a single “strong classifier”. Here are some (fun) facts about AdaBoost!
→ The weak learners in AdaBoost are typically decision trees with a single split, called decision stumps.
→ AdaBoost works by putting more weight on difficult-to-classify instances and less on those already handled well (see the sketch after this list).
→ AdaBoost algorithms can be used for both classification and regression problems.
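To make this concrete, here is a minimal sketch using scikit-learn's `AdaBoostClassifier` with decision stumps as the weak learners. The synthetic dataset and hyperparameter values (`n_estimators=50`, `learning_rate=1.0`) are illustrative choices, not recommendations; also note that in scikit-learn versions before 1.2 the `estimator` parameter is named `base_estimator`.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Toy binary-classification dataset (illustrative only)
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# A decision stump: a tree with a single split (max_depth=1)
stump = DecisionTreeClassifier(max_depth=1)

# AdaBoost combines 50 stumps; after each round it reweights the
# training instances so misclassified ones get more attention next round.
# (In scikit-learn < 1.2, pass the stump via `base_estimator` instead.)
model = AdaBoostClassifier(
    estimator=stump, n_estimators=50, learning_rate=1.0, random_state=42
)
model.fit(X_train, y_train)

print("Test accuracy:", model.score(X_test, y_test))
```

For regression, the analogous `AdaBoostRegressor` follows the same pattern with a shallow `DecisionTreeRegressor` as the weak learner.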