Discriminant Functions and Decision Boundary

Preliminaries: convex definition, linear algebra, vector length, vector direction. Discriminant Function in Classification: The discriminant function, or discriminant model, stands in contrast to the generative model. Here we look at the behavior of the discriminant function in linear classification. In the post ‘Least Squares Classification’, we saw that, in a linear classification task, the decision boundary is a line or hyperplane by which we separate two classes....
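As a minimal sketch of the idea this post develops (the symbols \(\mathbf{w}\) and \(w_0\) are assumed notation here, not necessarily the post's), a linear discriminant function for two classes can be written as

\[
y(\mathbf{x}) = \mathbf{w}^{\top}\mathbf{x} + w_0,
\]

with \(\mathbf{x}\) assigned to class \(\mathcal{C}_1\) if \(y(\mathbf{x}) \ge 0\) and to \(\mathcal{C}_2\) otherwise. The decision boundary is the set \(\{\mathbf{x} : y(\mathbf{x}) = 0\}\): a line in two dimensions and a hyperplane in general, with \(\mathbf{w}\) giving its normal direction and \(-w_0/\lVert\mathbf{w}\rVert\) its signed distance from the origin.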

February 17, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

Least Squares in Classification

Preliminaries: A Simple Linear Regression, Least Squares Estimation, From Linear Regression to Linear Classification, pseudo-inverse. Least Squares for Classification: Least squares for linear regression was discussed in ‘Simple Linear Regression’. In this post, we want to find out whether this powerful algorithm can also be used for classification. Recalling the distinction between the properties of classification and regression, two points need to be emphasized again (‘From Linear Regression to Linear Classification’):...
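As a hedged sketch of how least squares carries over to classification (using 1-of-K target coding; the symbols \(\tilde{\mathbf{X}}\), \(\mathbf{T}\), and \(\tilde{\mathbf{W}}\) are assumed notation here, not necessarily the post's), the weight matrix can be obtained through the pseudo-inverse:

\[
\tilde{\mathbf{W}} = (\tilde{\mathbf{X}}^{\top}\tilde{\mathbf{X}})^{-1}\tilde{\mathbf{X}}^{\top}\mathbf{T} = \tilde{\mathbf{X}}^{\dagger}\mathbf{T},
\qquad
\hat{\mathbf{y}}(\mathbf{x}) = \tilde{\mathbf{W}}^{\top}\tilde{\mathbf{x}},
\]

where \(\tilde{\mathbf{X}}\) stacks the augmented inputs \(\tilde{\mathbf{x}}_n = (1, \mathbf{x}_n^{\top})^{\top}\) as rows, \(\mathbf{T}\) stacks the 1-of-K target vectors, and a new input is assigned to the class whose component of \(\hat{\mathbf{y}}(\mathbf{x})\) is largest.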

February 17, 2020 · (Last Modification: April 28, 2022) · Anthony Tan

From Linear Regression to Linear Classification

Preliminaries: An Introduction to Linear Regression, A Simple Linear Regression, Bayesian theorem, Feature extraction. Recall Linear Regression: The goal of a regression problem is to find a function, or hypothesis, that, given an input \(\mathbf{x}\), makes a prediction \(\hat{y}\) to estimate the target. Both the target \(y\) and the prediction \(\hat{y}\) here are continuous, so they have the properties of numbers: Consider 3 inputs \(\mathbf{x}_1\), \(\mathbf{x}_2\), and \(\mathbf{x}_3\), whose corresponding targets are \(y_1=0\), \(y_2=1\), and \(y_3=2\)....
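To make the "properties of numbers" point concrete for these three targets (a small worked note, not taken from the post itself):

\[
y_2 - y_1 = y_3 - y_2 = 1,
\qquad
\frac{y_1 + y_3}{2} = y_2,
\]

so the targets are ordered, equally spaced, and can be averaged, whereas discrete class labels generally admit no such ordering or arithmetic; this is exactly the gap between regression and classification that the post examines.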

February 17, 2020 · (Last Modification: April 28, 2022) · Anthony Tan