Boosting and AdaBoost

Preliminaries Committee Boosting A committee gives every model's prediction an equal weight, so it offers little improvement over a single model; boosting was developed to address this problem. Boosting is a technique for combining multiple ‘base’ classifiers to produce a form of committee that performs better than any of the base classifiers and in which each base classifier carries its own weight. AdaBoost AdaBoost is short for ‘adaptive boosting’....
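The excerpt describes weighting both the training points and the base classifiers. The sketch below is a minimal discrete AdaBoost with decision-stump base classifiers for labels in {−1, +1}; it illustrates the idea only and is not the post's own code, and the helper names `adaboost_fit` and `adaboost_predict` are hypothetical.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=10):
    """Minimal discrete AdaBoost; y must take values in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform data weights to start
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)         # train on the weighted data
        pred = stump.predict(X)
        err = np.clip(w[pred != y].sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)    # this classifier's vote weight
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified points
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    # Weighted majority vote: more accurate base classifiers get larger alphas.
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```

Each round re-weights the data so the next stump concentrates on the points the committee currently gets wrong, which is exactly what distinguishes boosting from an equal-weight committee.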

March 7, 2020 · (Last Modification: August 4, 2022) · Anthony Tan

Bayesian Model Averaging (BMA) and Combining Models

Preliminaries Bayes’ Theorem Bayesian Model Averaging (BMA)1 Bayesian model averaging (BMA) is another widely used method that looks very much like a combining model; however, the difference between BMA and combining models is also significant. Bayesian model averaging is a Bayesian formulation in which the random variable ranges over models (hypotheses) \(h=1,2,\cdots,H\) with prior probability \(\Pr(h)\); the marginal distribution over the data \(X\) is then: \[ \Pr(X)=\sum_{h=1}^{H}\Pr(X|h)\Pr(h) \] BMA is then used to select the model (hypothesis) that explains the data best via Bayes’ theorem....
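A small numerical sketch of the formula above, with made-up priors \(\Pr(h)\) and likelihoods \(\Pr(X|h)\) for three hypothetical models (the numbers are illustrative, not from the post):

```python
import numpy as np

prior = np.array([0.5, 0.3, 0.2])          # Pr(h) for models h = 1, 2, 3
likelihood = np.array([0.02, 0.07, 0.01])  # Pr(X|h): how well each model fits X

# Marginal distribution over the data: Pr(X) = sum_h Pr(X|h) Pr(h)
marginal = (likelihood * prior).sum()

# Bayes' theorem gives the posterior over models, Pr(h|X),
# which BMA uses to judge which model explains the data best.
posterior = likelihood * prior / marginal
print(marginal)                # Pr(X)
print(posterior.argmax() + 1)  # index of the most probable model
```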

March 7, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

An Introduction to Combining Models

Preliminaries ‘Mixtures of Gaussians’ Basic machine learning concepts Combining Models1 The mixture of Gaussians was discussed in the post ‘Mixtures of Gaussians’. It was used there to introduce the ‘EM algorithm’, but it also offered inspiration for improving model performance. All the models we have studied so far, apart from neural networks, are single-distribution models. That is, to solve a problem we invite one expert who is very good at this kind of problem, and then we simply do whatever the expert says....

March 7, 2020 · (Last Modification: April 28, 2022) · Anthony Tan