# ML | Independent Component Analysis

Prerequisite: Principal Component Analysis

**Independent Component Analysis** (ICA) is a machine learning technique used to separate independent source signals from a mixed signal. Unlike Principal Component Analysis (PCA), which focuses on maximizing the variance of the data points, ICA focuses on the statistical independence of the components.

**Problem:** Extract the individual sources' signals from a mixed signal composed of the signals from those sources.

**Given:** Mixed signal from five different independent sources.

**Aim:** To decompose the mixed signal into independent sources:

- Source 1
- Source 2
- Source 3
- Source 4
- Source 5

**Solution:** **Independent Component Analysis (ICA)**.

Consider the *Cocktail Party Problem*, also known as the *Blind Source Separation* problem, to understand the kind of problem that independent component analysis solves.

Imagine a party going on in a room full of people. There are 'n' speakers in that room, all speaking simultaneously. In the same room, 'n' microphones are placed at different distances from the speakers, each recording the speakers' voice signals. Hence, the number of speakers equals the number of microphones in the room.

Now, using these microphones' recordings, we want to separate the 'n' speakers' voice signals. Each microphone recorded the voice of every speaker, at an intensity that depends on the distance between the microphone and the speaker. Decomposing each microphone's mixed recording into the independent sources' speech signals is exactly what independent component analysis does.

*[ X1, X2, ….., Xn ] => [ Y1, Y2, ….., Yn ]*

where X1, X2, …, Xn are the observed mixed signals, and Y1, Y2, …, Yn are the new features: independent components that are statistically independent of each other.
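As an illustrative sketch (not part of the original article), this separation can be demonstrated with scikit-learn's `FastICA` on two synthetic sources; the signals, mixing matrix, and all variable names below are assumptions chosen for the demo:

```python
# Blind source separation sketch using scikit-learn's FastICA
# (assumes NumPy and scikit-learn are installed).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 8, 2000)

# Two independent, non-Gaussian sources.
s1 = np.sin(2 * t)                   # sinusoidal source
s2 = np.sign(np.sin(3 * t))          # square-wave source
S = np.c_[s1, s2]                    # shape (2000, 2)

# Mix the sources with an arbitrary mixing matrix A: X = S @ A.T
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
X = S @ A.T                          # observed "microphone" recordings

# Recover the independent components from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
Y = ica.fit_transform(X)             # estimated sources (up to order and scale)

print(Y.shape)                       # one column per recovered source
```

Note that ICA recovers the sources only up to permutation and scaling: the columns of `Y` may come out in a different order, or with flipped sign, relative to the original sources.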

### Restrictions on ICA

- The independent components generated by the ICA are assumed to be statistically independent of each other.
- The independent components generated by the ICA must have non-Gaussian distributions (at most one component may be Gaussian).
- The number of independent components generated by the ICA is equal to the number of observed mixtures.
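The non-Gaussianity restriction can be checked empirically. As a small sketch (the distributions and sample size are assumptions for illustration), a Gaussian source has excess kurtosis near zero, while a non-Gaussian source such as a uniform one does not:

```python
# Illustrative check of the non-Gaussianity restriction
# (assumes NumPy and SciPy are installed).
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
gaussian = rng.normal(size=100_000)        # Gaussian source: ICA cannot separate it
uniform = rng.uniform(-1, 1, size=100_000) # non-Gaussian source: separable

# Excess kurtosis: ~0 for a Gaussian, about -1.2 for a uniform distribution.
print(round(kurtosis(gaussian), 2))
print(round(kurtosis(uniform), 2))
```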

**Difference between PCA and ICA:**

| Principal Component Analysis | Independent Component Analysis |
|---|---|
| It reduces the dimensionality of the data to avoid the problem of overfitting. | It decomposes a mixed signal into its independent sources' signals. |
| It deals with principal components. | It deals with independent components. |
| It focuses on maximizing the variance. | It does not focus on the variance among the data points. |
| It focuses on the mutual orthogonality of the principal components. | It does not focus on the mutual orthogonality of the components. |
| It does not focus on the mutual independence of the components. | It focuses on the mutual independence of the components. |

