Question (difficulty: medium)

How does the curse of dimensionality affect the performance of the K-Nearest Neighbours (K-NN) algorithm?

Hint: In high dimensions, distance metrics become unreliable, so keep the K-NN feature set small.
Updated On: Jan 14, 2026
  (A) K-NN performs better with an increasing number of input variables.
  (B) K-NN struggles to predict the output accurately with an increasing number of input variables.
  (C) K-NN becomes faster with more input variables due to increased computational power.
  (D) K-NN becomes less sensitive to outliers with an increasing number of input variables.

Correct option: (B)

Solution and Explanation

Increasing the number of input features leads to the curse of dimensionality. K-NN relies entirely on distance calculations, and these lose their discriminative power in high-dimensional spaces: as dimensions grow, data points spread out and the distances between a query point and all other points become nearly equal. This makes it hard for K-NN to distinguish genuine nearest neighbours from arbitrary points, which reduces prediction accuracy. In addition, more dimensions raise the computational and storage cost of each distance calculation, slowing K-NN down rather than speeding it up. Options (A), (C), and (D) are therefore incorrect, since they claim benefits that do not materialise. Consequently, K-NN suffers reduced prediction accuracy with many input variables because of the curse of dimensionality.
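The distance-concentration effect described above is easy to demonstrate numerically. The sketch below (a hypothetical illustration, not part of the original solution) draws random points in the unit hypercube and measures the relative spread of distances from a query point, (max − min) / mean. As the dimension grows, this contrast shrinks towards zero, meaning the "nearest" neighbour is barely nearer than the farthest point:

```python
import math
import random


def distance_spread(dim, n_points=500, seed=0):
    """Relative contrast of distances in [0, 1]^dim.

    Draws one random query point and n_points random data points,
    then returns (max_dist - min_dist) / mean_dist. A small value
    means all points look roughly equidistant from the query, which
    is exactly what breaks K-NN in high dimensions.
    """
    rng = random.Random(seed)
    query = [rng.random() for _ in range(dim)]
    dists = [
        math.dist(query, [rng.random() for _ in range(dim)])
        for _ in range(n_points)
    ]
    return (max(dists) - min(dists)) / (sum(dists) / len(dists))


if __name__ == "__main__":
    for dim in (2, 10, 100, 1000):
        print(f"dim={dim:>4}  relative contrast={distance_spread(dim):.3f}")
```

Running this, the contrast drops sharply with dimension (e.g. from above 1 at dim=2 to around 0.1 at dim=1000 in this setup), illustrating why option (B) is correct.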