Nyström method with Kernel K-means++ samples as landmarks

Oglic, Dino and Gaertner, Thomas (2017) Nyström method with Kernel K-means++ samples as landmarks. In: Proceedings of the 34th International Conference on Machine Learning, 6-11 August 2017, Sydney, Australia.

Full text not available from this repository.
Official URL: http://proceedings.mlr.press/v70/oglic17a
Abstract

We investigate, theoretically and empirically, the effectiveness of kernel K-means++ samples as landmarks in the Nyström method for low-rank approximation of kernel matrices. Previous empirical studies (Zhang et al., 2008; Kumar et al., 2012) observe that the landmarks obtained using (kernel) K-means clustering define a good low-rank approximation of kernel matrices. However, the existing work does not provide a theoretical guarantee on the approximation error for this approach to landmark selection. We close this gap and provide the first bound on the approximation error of the Nyström method with kernel K-means++ samples as landmarks. Moreover, for the frequently used Gaussian kernel we provide a theoretically sound motivation for performing Lloyd refinements of kernel K-means++ landmarks in the instance space. We substantiate our theoretical results empirically by comparing the approach to several state-of-the-art algorithms.
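To make the landmark-selection scheme concrete, the following NumPy sketch illustrates the general idea under our own assumptions; it is not the authors' implementation, and the names gaussian_kernel, kernel_kmeanspp_landmarks and nystrom_approximation are illustrative. Landmarks are drawn by D²-sampling in the kernel-induced feature space (the kernel K-means++ seeding step), and the standard Nyström approximation K̂ = C W⁺ Cᵀ is then built from the landmark columns.

import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel values between rows of X and Y.
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_kmeanspp_landmarks(X, m, kernel, rng):
    # Kernel K-means++ seeding: the first landmark is drawn uniformly,
    # each subsequent one with probability proportional to its squared
    # feature-space distance to the closest landmark chosen so far.
    n = X.shape[0]
    diag = np.diag(kernel(X, X))          # k(x_i, x_i) terms
    landmarks = [int(rng.integers(n))]
    d2 = diag - 2 * kernel(X, X[landmarks]).ravel() + diag[landmarks[0]]
    for _ in range(1, m):
        probs = np.maximum(d2, 0)
        probs /= probs.sum()
        j = int(rng.choice(n, p=probs))
        landmarks.append(j)
        d2_new = diag - 2 * kernel(X, X[[j]]).ravel() + diag[j]
        d2 = np.minimum(d2, d2_new)       # keep distance to nearest landmark
    return np.array(landmarks)

def nystrom_approximation(X, landmarks, kernel):
    # Nystrom approximation K_hat = C W^+ C^T with C = K[:, L], W = K[L, L].
    C = kernel(X, X[landmarks])
    W = C[landmarks]
    return C @ np.linalg.pinv(W) @ C.T

# Example usage (synthetic data, illustrative parameters only):
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))
k = lambda A, B: gaussian_kernel(A, B, gamma=0.5)
L = kernel_kmeanspp_landmarks(X, 20, k, rng)
K_hat = nystrom_approximation(X, L, k)

The paper's contribution is a bound on the error of this kind of approximation when the landmarks come from kernel K-means++ sampling; the optional Lloyd refinements discussed for the Gaussian kernel would be applied to the landmarks in the instance space before forming C and W.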