What is curse of Dimensionality in the field of Machine Learning and Pattern Recognition?
No equations, just theory.
The Attempt at a Solution
Initially, with only a few variables, the feature space is densely covered by the training samples, but as we increase the number of variables (dimensions), the same number of samples becomes sparse in the much larger space. We then need exponentially more data and computational power to cover and test that space. Also, every added variable brings more noise with it. This phenomenon is called the curse of dimensionality. So we often have to reduce the number of dimensions, which may cause the loss of some information.
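To make the sparsity idea concrete, here is a small numerical sketch (assuming Python with NumPy, not part of the original question): for a fixed number of random points, the contrast between the nearest and farthest distance from a query point shrinks as the dimension grows, so distances stop being informative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_points = 500

# For a fixed sample size, measure how distinguishable near and far
# neighbours are as the dimension grows. The "distance contrast"
# (max_dist - min_dist) / min_dist shrinks in high dimensions,
# one symptom of the curse of dimensionality.
contrasts = {}
for dim in (2, 10, 100, 1000):
    x = rng.random((n_points, dim))      # points in the unit hypercube
    q = rng.random(dim)                  # a query point
    d = np.linalg.norm(x - q, axis=1)    # Euclidean distances to all points
    contrasts[dim] = (d.max() - d.min()) / d.min()
    print(f"dim={dim:5d}  distance contrast={contrasts[dim]:.3f}")
```

With the same 500 points, the contrast is large in 2 dimensions but drops toward zero by 1000 dimensions, which is why distance-based methods (e.g. nearest neighbours) degrade without far more data.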
Is the above correct? What else can I add to it in simple words?