Nyström method with Kernel K-means++ samples as landmarks

Oglic, Dino and Gaertner, Thomas (2017) Nyström method with Kernel K-means++ samples as landmarks. In: Proceedings of the 34th International Conference on Machine Learning, 6-11 August 2017, Sydney, Australia.


Abstract

We investigate, theoretically and empirically, the effectiveness of kernel K-means++ samples as landmarks in the Nyström method for low-rank approximation of kernel matrices. Previous empirical studies (Zhang et al., 2008; Kumar et al., 2012) observe that the landmarks obtained using (kernel) K-means clustering define a good low-rank approximation of kernel matrices. However, the existing work does not provide a theoretical guarantee on the approximation error for this approach to landmark selection. We close this gap and provide the first bound on the approximation error of the Nyström method with kernel K-means++ samples as landmarks. Moreover, for the frequently used Gaussian kernel we provide a theoretically sound motivation for performing Lloyd refinements of kernel K-means++ landmarks in the instance space. We substantiate our theoretical results empirically by comparing the approach to several state-of-the-art algorithms.
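To illustrate the approach described in the abstract, the following is a minimal sketch (not the authors' code) of the Nyström low-rank approximation with landmarks chosen by kernel K-means++ seeding, i.e. D²-sampling in the kernel-induced feature space via the kernel trick. The Gaussian kernel, the `gamma` parameter, and all function names here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel between rows of X and rows of Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_kmeanspp_landmarks(X, m, gamma=1.0, seed=None):
    # Kernel K-means++ seeding: each new landmark is drawn with probability
    # proportional to its squared feature-space distance to the nearest
    # landmark chosen so far, computed via the kernel trick:
    #   ||phi(x) - phi(c)||^2 = k(x,x) - 2 k(x,c) + k(c,c).
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = [int(rng.integers(n))]
    d2 = np.full(n, np.inf)
    for _ in range(1, m):
        c = X[idx[-1]][None, :]
        kxc = gaussian_kernel(X, c, gamma).ravel()
        # For the Gaussian kernel, k(x,x) = k(c,c) = 1.
        d2 = np.maximum(np.minimum(d2, 2.0 - 2.0 * kxc), 0.0)
        idx.append(int(rng.choice(n, p=d2 / d2.sum())))
    return np.array(idx)

def nystrom_approximation(X, landmarks, gamma=1.0):
    # Standard Nyström estimator: K_hat = K_nm K_mm^+ K_mn.
    K_nm = gaussian_kernel(X, X[landmarks], gamma)
    K_mm = gaussian_kernel(X[landmarks], X[landmarks], gamma)
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
```

With m landmarks the estimator never forms the full n × n kernel matrix during fitting; the seeding step costs O(nm) kernel evaluations, and the pseudoinverse acts only on the small m × m landmark kernel matrix.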

Item Type: Conference or Workshop Item (Paper)
Schools/Departments: University of Nottingham, UK > Faculty of Science > School of Computer Science
Related URLs:
Depositing User: Oglic, Dino
Date Deposited: 14 Jun 2017 14:55
Last Modified: 27 Jul 2017 14:35
URI: http://eprints.nottingham.ac.uk/id/eprint/43573
