Boosting and AdaBoost

Preliminaries: Committee. Boosting: a committee assigns equal weight to every model's prediction, so it offers little improvement over a single model; boosting was developed to address this problem. Boosting is a technique for combining multiple 'base' classifiers to produce a form of committee that performs better than any of the base classifiers, with each base classifier given a different weight factor. AdaBoost: AdaBoost is short for adaptive boosting....
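To make the weighting idea concrete, here is a minimal sketch of how AdaBoost assigns a different weight factor to each base classifier, following the standard discrete AdaBoost formulation (not code from the post itself). It uses scikit-learn decision stumps as base learners; the names `adaboost_fit`, `alpha`, and `w` are illustrative.

```python
# Minimal discrete AdaBoost sketch (assumed formulation; labels y must be in {-1, +1}).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Train an AdaBoost committee of decision stumps."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)                # start with equal sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)   # fit base classifier on weighted data
        miss = (stump.predict(X) != y)
        eps = np.dot(w, miss) / w.sum()    # weighted error rate of this stump
        if eps <= 0 or eps >= 0.5:         # stop if perfect or no better than chance
            break
        alpha = np.log((1 - eps) / eps)    # weight factor of this base classifier
        w *= np.exp(alpha * miss)          # up-weight the misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted vote: sign of the alpha-weighted sum of base predictions."""
    scores = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(scores)
```

Unlike the equal-weight committee, base classifiers with lower weighted error receive a larger `alpha` and thus a larger say in the final vote.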

March 7, 2020 · (Last Modification: August 4, 2022) · Anthony Tan

An Introduction to Combining Models

Preliminaries: ‘Mixtures of Gaussians’, basic machine learning concepts. Combining Models: the mixture of Gaussians was discussed in the post ‘Mixtures of Gaussians’. It was used there to introduce the EM algorithm, but it also suggests a way of improving model performance. All the models we have studied so far, apart from neural networks, are single-distribution models. That is like solving a problem by inviting a single expert who is very good at this kind of problem and then doing whatever that expert says....

March 7, 2020 · (Last Modification: April 28, 2022) · Anthony Tan