k-Nearest Neighbors Regression
- To predict a new data point, the k-NN algorithm looks at the k closest data points in the training dataset. With a single neighbor, the prediction is that neighbor's target value; with multiple nearest neighbors, the prediction is the average of their target values.
- Using a small number of neighbors can cause the model to overfit to the training data, whereas too many neighbors can cause it to underfit.
- While the k-NN algorithm is easy and fast to implement, it is not well suited to large datasets or to datasets with many features.
- Figure 1: Predictions made by k-NN regression models with one nearest neighbor and with three nearest neighbors
- Figure 2: Performance of k-NN regression for different values of n_neighbors
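The prediction rule described above can be sketched with scikit-learn's KNeighborsRegressor. This is a minimal example on synthetic 1-D data; the dataset and the query point are illustrative assumptions, not the data behind the figures:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical toy dataset: noisy sine wave (stands in for the figures' data)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(40, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.2, size=40)

x_new = np.array([[1.5]])  # new point to predict (an assumption for illustration)

# k = 1: the prediction is the target of the single closest training point
pred_1 = KNeighborsRegressor(n_neighbors=1).fit(X, y).predict(x_new)

# k = 3: the prediction is the average of the three closest targets
pred_3 = KNeighborsRegressor(n_neighbors=3).fit(X, y).predict(x_new)

# The same average, computed by hand, to show what the model does internally
nearest = np.argsort(np.abs(X.ravel() - x_new.ravel()))[:3]
manual = y[nearest].mean()

print(pred_1[0], pred_3[0], manual)  # pred_3 equals the hand-computed average
```

Varying n_neighbors in a loop and scoring on a held-out test set reproduces the overfitting/underfitting trade-off: very small k tracks the training noise, while very large k flattens the prediction toward the global mean.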