Ensemble models build up complexity by adding new weak models on top of the previous ones, layer by layer, like a 3D printer. ANN models, on the other hand, begin their life complex!
The goal of ‘Boosting’ is to reduce ‘bias’, which causes ‘under-fitting’ (the model fails to capture the relation between the explanatory and response variables).
The goal of ‘Bagging’ is to reduce ‘variance’, which causes ‘over-fitting’ (the model’s predictions fluctuate with small changes in the training data).
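A toy simulation can make the variance-reduction claim concrete. The sketch below does not train real trees; `noisy_model_prediction` is a hypothetical stand-in for one bootstrap-trained weak model, and averaging 25 of them imitates bagging. The roughly `var/B` shrinkage holds exactly only if the models were uncorrelated, which real bootstrap models are not, but the direction of the effect is the same:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_model_prediction():
    # Hypothetical stand-in for a single bootstrap-trained weak model:
    # the correct value 1.0 plus model-to-model variability.
    return 1.0 + rng.normal(0, 0.5)

# Variance of a single weak model's prediction, estimated over many draws.
single = np.array([noisy_model_prediction() for _ in range(2000)])

# Variance of a bagged prediction: the average of 25 weak models.
bagged = np.array([
    np.mean([noisy_model_prediction() for _ in range(25)])
    for _ in range(2000)
])

print(single.var(), bagged.var())  # the bagged variance is far smaller
```

Note that averaging does nothing for bias here: both estimators are centered on the same value, which is exactly why bagging targets variance, not bias.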
Random Forest models use the ‘Bagging’ technique with ‘Decision Tree’-type trees, while Gradient Boosting models use the ‘Boosting’ technique with ‘Regression Tree’-type trees.
Decision Trees are useful for categorical responses, while Regression Trees are useful for numeric ones.
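As a minimal sketch of the two ensemble families side by side, here is how they look in scikit-learn (assuming it is installed); the dataset is synthetic and the hyperparameters are illustrative defaults, not recommendations:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor

# Synthetic regression data, just for illustration.
X, y = make_regression(n_samples=200, n_features=5, noise=0.1, random_state=0)

# Bagging family: deep trees grown on bootstrap samples, predictions averaged.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Boosting family: shallow regression trees fitted sequentially,
# each one correcting the ensemble built so far.
gb = GradientBoostingRegressor(n_estimators=100, max_depth=3,
                               random_state=0).fit(X, y)

print(rf.score(X, y), gb.score(X, y))  # training R^2 of each ensemble
```

The `max_depth=3` on the boosting side reflects the usual design choice: boosting prefers many shallow ‘weak’ trees, while a random forest lets each tree grow deep and relies on averaging.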
Boosting is actually a sequential, iterative procedure in which each new ‘weak’ model is dedicated to the errors (wrong predictions) of the previous ‘weak’ models.
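That sequential idea can be sketched in a few lines: under squared-error loss, ‘the errors of the previous models’ are simply the residuals, so each round fits a small regression tree to the current residuals and adds a shrunken copy of it to the ensemble. This is a bare-bones illustration on synthetic 1D data, not a full gradient boosting implementation (no subsampling, no other loss functions):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, 200)  # noisy sine curve

learning_rate = 0.1
pred = np.full_like(y, y.mean())  # round 0: a constant model
trees = []
for _ in range(100):
    residual = y - pred                       # where the ensemble is still wrong
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    pred += learning_rate * tree.predict(X)   # nudge predictions toward y
    trees.append(tree)

mse = np.mean((y - pred) ** 2)
print(mse)  # far below the variance of y itself
```

The `learning_rate` shrinkage is what keeps each weak tree from over-correcting; many small steps generalize better than a few large ones.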
