Bagging and boosting are popular ensemble methods. While they aren’t without their drawbacks, their strengths are clearly defined: bagging reduces variance, while boosting reduces bias (though at the risk of increasing variance through overfitting). So, what if we combined the two?
Alex Rogozhnikov and Tatiana Likhomanenko from the Higher School of Economics in Moscow developed a new algorithm for this purpose, called InfiniteBoosting. It constructs an arbitrarily large ensemble of trees and trains every weak learner in a gradient-boosting fashion. Thanks to this hybrid design, it builds a classifier that reduces bias at each training step while keeping overfitting under control at the same time.
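To make the idea concrete, here is a minimal sketch of the hybrid in the spirit of InfiniteBoosting, not the authors' exact algorithm: each new tree is fit to the residuals of the current model (the boosting part, here with squared loss and bootstrap sampling for a bagging flavour), while the ensemble prediction is a fixed `capacity` constant times the plain average of all trees, so adding more trees never inflates the model's capacity. The function name, the fixed `capacity` value, and the bootstrap step are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def infinite_boosting_sketch(X, y, n_trees=100, capacity=4.0, max_depth=3, seed=0):
    """Simplified InfiniteBoosting-style regressor (illustrative, not the
    paper's exact procedure): the model is `capacity` times the average of
    all trees, and each tree is fit, on a bootstrap sample, to the
    residuals of the current averaged ensemble."""
    rng = np.random.default_rng(seed)
    trees = []
    sum_preds = np.zeros(len(y))   # running sum of individual tree outputs
    F = np.zeros(len(y))           # current ensemble prediction on X
    for t in range(1, n_trees + 1):
        residual = y - F           # negative gradient of squared loss
        idx = rng.integers(0, len(y), size=len(y))  # bootstrap sample
        tree = DecisionTreeRegressor(max_depth=max_depth, random_state=seed + t)
        tree.fit(X[idx], residual[idx])
        trees.append(tree)
        sum_preds += tree.predict(X)
        # averaging (rather than summing) the trees caps the ensemble's
        # capacity at a constant, no matter how many trees are added
        F = capacity * sum_preds / t

    def predict(X_new):
        return capacity * np.mean([tr.predict(X_new) for tr in trees], axis=0)
    return predict
```

Because the trees are averaged rather than summed, the effective update at step t shrinks like 1/t, which is what lets the ensemble grow arbitrarily large without the variance blow-up of plain gradient boosting.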
Congratulations to Alex Rogozhnikov and Tatiana Likhomanenko on this success.
For the full article, please see here.