An Introduction to Probabilistic Generative Models

Preliminaries: Probability, Bayes' Formula, Calculus. Probabilistic Generative Models: The generative model used for making decisions contains an inference step and a decision step. The inference step calculates \(\Pr(\mathcal{C}_k|\mathbf{x})\), the posterior probability that the input \(\mathbf{x}\) belongs to class \(\mathcal{C}_k\). The decision step makes a decision based on the \(\Pr(\mathcal{C}_k|\mathbf{x})\) calculated in step 1. In this post, we give an introduction to and a framework for the probabilistic generative model in classification....
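The two-step structure described above is easy to make concrete. Below is a minimal sketch, assuming Gaussian class-conditional densities estimated from data; the function names and the small covariance regularizer are illustrative, not from the post. The inference step applies Bayes' formula, the decision step takes the largest posterior.

```python
import numpy as np

def fit_gaussian_generative(X, y):
    """Estimate class priors and per-class Gaussian densities from data."""
    params = {}
    for k in np.unique(y):
        Xk = X[y == k]
        params[k] = (
            len(Xk) / len(X),                                      # prior Pr(C_k)
            Xk.mean(axis=0),                                       # class mean
            np.cov(Xk, rowvar=False) + 1e-6 * np.eye(X.shape[1]),  # covariance
        )
    return params

def posterior(x, params):
    """Inference step: Pr(C_k | x) via Bayes' formula."""
    log_joint = {}
    for k, (prior, mu, Sigma) in params.items():
        d = x - mu
        # log Pr(x | C_k) + log Pr(C_k), dropping constants shared by all k
        log_joint[k] = (np.log(prior)
                        - 0.5 * d @ np.linalg.solve(Sigma, d)
                        - 0.5 * np.log(np.linalg.det(Sigma)))
    v = np.array(list(log_joint.values()))
    p = np.exp(v - v.max())                  # subtract max for stability
    return dict(zip(log_joint, p / p.sum()))

def decide(x, params):
    """Decision step: choose the class with the largest posterior."""
    post = posterior(x, params)
    return max(post, key=post.get)
```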

February 20, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

Fisher Linear Discriminant (LDA)

Preliminaries: linear algebra, inner product, projection. The idea of the Fisher linear discriminant: the ‘least-squares method’ in classification can only deal with a small set of tasks, because it was designed for the regression task. So we come to the famous Fisher linear discriminant. This method is also discriminative, for it directly gives the class to which the input \(\mathbf{x}\) belongs. Assume the linear function \[ y=\mathbf{w}^T\mathbf{x}+w_0\tag{1} \]...
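As a concrete companion to the excerpt, here is a minimal sketch of the two-class Fisher direction, assuming the standard criterion \(\mathbf{w}\propto \mathbf{S}_W^{-1}(\mathbf{m}_2-\mathbf{m}_1)\); the names below are illustrative.

```python
import numpy as np

def fisher_direction(X1, X2):
    """Projection vector maximizing between-class separation
    relative to the within-class scatter (two-class case)."""
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # within-class scatter: sum of the two per-class scatter matrices
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m2 - m1)      # w proportional to S_W^{-1}(m2 - m1)
    return w / np.linalg.norm(w)

# usage: project inputs onto w, then threshold y = w^T x + w_0 to classify
```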

February 19, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

Least Squares in Classification

Preliminaries: A Simple Linear Regression, Least Squares Estimation, From Linear Regression to Linear Classification, pseudo-inverse. Least Squares for Classification: Least squares for linear regression was discussed in ‘Simple Linear Regression’, and in this post we want to find out whether this powerful algorithm can be used in classification. Recalling the distinction between the properties of classification and regression, two points need to be emphasized again (‘From Linear Regression to Linear Classification’):...
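Since the excerpt mentions the pseudo-inverse, a minimal sketch of least-squares classification with 1-of-K targets might look like this; the function names are illustrative, not from the post.

```python
import numpy as np

def fit_least_squares_classifier(X, y, n_classes):
    """Fit the weight matrix W via the pseudo-inverse of the
    bias-augmented design matrix, with 1-of-K target coding."""
    X_aug = np.hstack([np.ones((len(X), 1)), X])  # prepend a column of 1s
    T = np.eye(n_classes)[y]                      # one-hot target matrix
    return np.linalg.pinv(X_aug) @ T              # W = X_aug^+ T

def predict(W, X):
    """Assign each input to the class with the largest linear output."""
    X_aug = np.hstack([np.ones((len(X), 1)), X])
    return np.argmax(X_aug @ W, axis=1)
```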

February 17, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

From Linear Regression to Linear Classification

Preliminaries: An Introduction to Linear Regression, A Simple Linear Regression, Bayes' theorem, Feature extraction. Recall Linear Regression: The goal of a regression problem is to find a function, or hypothesis, that, given an input \(\mathbf{x}\), makes a prediction \(\hat{y}\) to estimate the target. Both the target \(y\) and the prediction \(\hat{y}\) here are continuous; they have the properties of numbers: consider 3 inputs \(\mathbf{x}_1\), \(\mathbf{x}_2\), and \(\mathbf{x}_3\) whose corresponding targets are \(y_1=0\), \(y_2=1\), and \(y_3=2\)....
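The point about the "properties of numbers" can be shown in a few lines; the snippet below is purely illustrative and uses the three targets from the excerpt.

```python
import numpy as np

y = np.array([0, 1, 2])             # regression targets behave like numbers
print(y[2] - y[1] == y[1] - y[0])   # True: order and spacing are meaningful

# class labels carry no such arithmetic, so classification usually
# switches to 1-of-K coding, which removes the artificial ordering
one_hot = np.eye(3)[y]
print(one_hot)                      # identity rows: one column per class
```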

February 17, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

Polynomial Regression and Feature Extension of Linear Regression

Preliminaries: A Simple Linear Regression, Least Squares Estimation. Extending Linear Regression with Features: The original linear regression has the form: \[ \begin{aligned} y(\mathbf{x})&= b + \mathbf{w}^T \mathbf{x}\\ &=w_0\cdot 1 + w_1x_1+ w_2x_2+\cdots + w_{m}x_{m} \end{aligned}\tag{1} \] where the input vector \(\mathbf{x}\) and the parameter vector \(\mathbf{w}\) are augmented to \((m+1)\)-dimensional vectors whose first components are \(1\) and the bias \(w_0=b\), respectively. This equation is linear in both the input vector and the parameter vector. This suggests an idea: if we set \(x_i=\phi_i(\mathbf{x})\), then equation (1) converts to:...
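A minimal sketch of this feature-extension idea, assuming scalar inputs mapped to polynomial basis functions \(\phi_i(x)=x^i\); the function names are illustrative, not from the post.

```python
import numpy as np

def polynomial_design(x, degree):
    """Map scalar inputs to the basis [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

def fit_polynomial(x, y, degree):
    """Least-squares fit of w in y ~ Phi(x) w; the model stays
    linear in the parameters even though it is nonlinear in x."""
    Phi = polynomial_design(x, degree)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

# usage: fit a cubic to noisy samples of a sine curve
x = np.linspace(0.0, 2.0 * np.pi, 50)
y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
w = fit_polynomial(x, y, degree=3)
```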

February 15, 2020 · (Last Modification: April 30, 2022) · Anthony Tan