Distance-Weighted kNN
- We might want to weight the nearer neighbors more heavily:
\[ \hat{f}(x_{q}) \leftarrow \frac{\sum_{i=1}^{k} w_{i} f(x_{i})}{\sum_{i=1}^{k} w_{i}}
\]
where
\[ w_{i} \equiv \frac{1}{d(x_{q}, x_{i})^{2}} \]
and $d(x_{q}, x_{i})$ is the distance between $x_{q}$ and $x_{i}$.
- With this weighting it can make sense to use all training examples
instead of just the nearest $k$, since distant examples receive very small weights.
- If all training examples are used, we call the algorithm a
global method; if only the nearest $k$ are used, it is
a local method (see the sketch below).
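
A minimal sketch of distance-weighted kNN regression, assuming a NumPy setup; the function name, the Euclidean distance, and the `eps` tie-breaking threshold are illustrative choices, not from the slides. Passing `k=None` uses every training example (the global variant), while an integer `k` restricts the average to the nearest neighbors (the local variant):

```python
import numpy as np

def distance_weighted_knn(X_train, y_train, x_q, k=None, eps=1e-12):
    """Predict f(x_q) as a weighted average with w_i = 1 / d(x_q, x_i)^2.

    k=None -> global method (all training examples);
    integer k -> local method (only the k nearest).
    """
    # Euclidean distances from the query to every training example.
    dists = np.linalg.norm(X_train - x_q, axis=1)

    if k is not None:
        idx = np.argsort(dists)[:k]       # indices of the k nearest neighbors
        dists, targets = dists[idx], y_train[idx]
    else:
        targets = y_train

    # If the query coincides with a training point, return its target
    # directly to avoid division by zero in the weights.
    exact = dists < eps
    if exact.any():
        return targets[exact].mean()

    w = 1.0 / dists**2                    # inverse-squared-distance weights
    return np.dot(w, targets) / w.sum()   # weighted average of neighbor targets

# Example usage on a toy 1-D regression problem (values are illustrative).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0.0, 1.0, 4.0, 9.0])
print(distance_weighted_knn(X, y, np.array([1.5]), k=3))  # local method
print(distance_weighted_knn(X, y, np.array([1.5])))       # global method
```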