K-Nearest Neighbours (K-NN) is a straightforward and easily understood machine learning algorithm. It is categorized as a lazy learner because it defers model construction until prediction time, merely storing the training data. To classify a new data point, K-NN identifies the k closest neighbours within the stored training data and assigns the majority label among them. Its non-parametric nature means it makes no assumptions about the form of the underlying data distribution, relying instead on the entire training dataset for decision-making. This characteristic enables K-NN to handle complex data structures effectively, though it may incur significant computational costs on large datasets, since every prediction requires computing distances to all stored points. Consequently, K-NN is correctly classified as both a lazy learning and a non-parametric method.
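The lazy, non-parametric behaviour described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the function name `knn_predict`, the toy two-cluster dataset, and the choice of Euclidean distance with a simple majority vote are all assumptions made for the example.

```python
from collections import Counter
from math import dist

def knn_predict(train_X, train_y, query, k=3):
    # "Lazy learning": no model is fitted in advance; the training data
    # is simply stored, and all work happens at prediction time.
    neighbours = sorted(zip(train_X, train_y),
                        key=lambda pair: dist(pair[0], query))[:k]
    # Non-parametric decision: majority vote among the k nearest labels,
    # with no assumption about the data's distribution.
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical toy dataset: two well-separated clusters in 2-D.
train_X = [(1.0, 1.0), (1.2, 0.8), (0.9, 1.1),
           (5.0, 5.0), (5.2, 4.8), (4.9, 5.1)]
train_y = ["a", "a", "a", "b", "b", "b"]

print(knn_predict(train_X, train_y, (1.1, 0.9), k=3))  # → a
print(knn_predict(train_X, train_y, (5.1, 5.0), k=3))  # → b
```

Note that the sort over all training points at each query is exactly the computational cost the text mentions: prediction scales with the size of the stored dataset.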