Linear Discriminant Analysis in Machine Learning
Overview
LDA (Linear Discriminant Analysis) is a prominent dimensionality reduction technique used in machine learning for solving classification problems with more than two classes. It is also referred to as NDA (Normal Discriminant Analysis) or DFA (Discriminant Function Analysis). The purpose of LDA is to project the features of a higher-dimensional space onto one with a lower number of dimensions, reducing resource usage and the costs associated with high dimensionality.
Scope
In this article we shall discuss:
- What is LDA?
- Why is LDA required?
- The extensions of LDA.
- Disadvantages of using LDA.
This article, however, is not about:
- Dimensionality reduction or its other techniques.
- Any other unrelated concepts.
What is LDA?
LDA is a popular dimensionality reduction method used in machine learning for supervised classification problems. It is also used as a pre-processing step for modeling class differences in machine learning and pattern classification applications. When multiple classes with several features need to be separated efficiently, LDA is one of the most frequently used techniques. Suppose we want to separate two classes, each described by several features. If we use only one feature to classify them, the two classes overlap.
To resolve this overlap, the number of features used for classification is typically increased. Suppose, for example, that we need to classify two classes whose data points lie in a 2-dimensional plane.
It is not always possible to draw a single straight line in the 2D plane that separates these data points efficiently. However, LDA can be used to reduce the 2D plane to a 1D line, and the projection it chooses maximizes the separability between the classes.
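The following is a minimal sketch of this idea, assuming scikit-learn is available; the synthetic two-class data is purely illustrative:

```python
# Project two 2D classes onto a single LDA axis that maximizes class separability.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Two synthetic 2D classes with different means (illustrative data only).
X_a = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(100, 2))
X_b = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(100, 2))
X = np.vstack([X_a, X_b])
y = np.array([0] * 100 + [1] * 100)

# Reduce the 2D feature space to a single discriminant axis (1D).
lda = LinearDiscriminantAnalysis(n_components=1)
X_1d = lda.fit_transform(X, y)

print(X.shape, "->", X_1d.shape)  # (200, 2) -> (200, 1)
```

After the projection, each sample is described by a single value along the discriminant axis, and the two classes remain well separated on that axis.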
Why is LDA required?
- Logistic regression is a common classification algorithm and performs well in binary classification, but it struggles with multi-class problems, especially when the classes are well separated. LDA handles such cases efficiently.
- LDA is also usable in data pre-processing to lower the feature count, which in turn significantly decreases the cost of computation (a sketch follows this list).
- LDA can also be used in face recognition algorithms; for example, the Fisherfaces method uses LDA to extract discriminative features from face images.
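As a sketch of the pre-processing use case, assuming scikit-learn and its bundled digits dataset, LDA can shrink the feature count before a classifier is fit:

```python
# Use LDA as a pre-processing step to reduce features before classification.
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)          # 64 features, 10 classes
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# LDA keeps at most (n_classes - 1) = 9 components, cutting 64 -> 9 features.
model = make_pipeline(
    LinearDiscriminantAnalysis(n_components=9),
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```

The downstream classifier then trains on 9 discriminant components instead of the original 64 pixel features, which reduces the computational cost noted above.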
Extensions of LDA
LDA is an effective and simple method for solving classification problems. It has several variations and extensions, some of which are listed below; a brief sketch comparing LDA with QDA follows the list:
- QDA (Quadratic Discriminant Analysis)
- FDA (Flexible Discriminant Analysis)
- RDA (Regularized Discriminant Analysis)
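As a minimal sketch, assuming scikit-learn and its bundled iris dataset, QDA exposes the same interface as LDA but fits a separate covariance matrix per class, which yields quadratic rather than linear decision boundaries:

```python
# Compare LDA and QDA as classifiers with cross-validation.
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("QDA", QuadraticDiscriminantAnalysis())]:
    scores = cross_val_score(clf, X, y, cv=5)
    print(name, "mean accuracy:", scores.mean().round(3))
```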
Disadvantages of LDA
LDA is used specifically for solving supervised classification problems with multiple classes, something that standard logistic regression does not handle well. However, LDA breaks down when the class distributions share the same mean.
In such a situation, LDA cannot produce a new axis that linearly separates the classes. To solve this problem, non-linear discriminant analysis is used in machine learning.
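A minimal sketch of this failure mode, assuming scikit-learn: two concentric classes share the same mean, so LDA has no separating axis, while a non-linear discriminant (here QDA, whose decision boundary is quadratic) can still tell them apart:

```python
# Demonstrate LDA failing when both classes share the same mean.
from sklearn.datasets import make_circles
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis,
    QuadraticDiscriminantAnalysis,
)
from sklearn.model_selection import cross_val_score

# Two concentric classes: their means coincide at the origin.
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
qda_acc = cross_val_score(QuadraticDiscriminantAnalysis(), X, y, cv=5).mean()

print("LDA accuracy:", round(lda_acc, 3))  # close to chance (~0.5)
print("QDA accuracy:", round(qda_acc, 3))  # substantially higher
```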
Conclusion
- LDA is a prominent dimensionality reduction technique used in machine learning for solving classification problems with more than two classes.
- LDA is also used as a pre-processing step for modeling class differences in machine learning and pattern classification applications.
- QDA, FDA, RDA, etc are some famous extensions of LDA.
- LDA does not work in cases where the class distributions share the same mean.