Mathematical analysis on out-of-sample extensions of dimensionality reduction by diffusion mapping

Many unsupervised learning algorithms provide dimensionality reduction (DR), among which diffusion mapping has proved attractive and effective. However, these methods cannot be straightforwardly applied to the DR of a newly arriving data set. An out-of-sample extension finds the DR of the new set using an extension technique based on the same DR method applied to the training set. Many papers have developed out-of-sample extension algorithms based on the Nyström approximation and shown their validity through numerical experiments. However, the mathematical theory for the extension still needs further consideration. Based on reproducing kernel Hilbert space (RKHS) theory, we give a mathematical analysis of such out-of-sample extensions. We treat an out-of-sample extension operator as an extension of the identity on the RKHS. The Nyström-type extension then becomes an orthogonal embedding on the RKHS. We then present conditions for the exact extension and estimate the errors of the extensions.
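For intuition, the Nyström-type extension discussed in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration of the standard Nyström formula psi_j(x) ≈ (1/lambda_j) * sum_i p(x, x_i) psi_j(x_i), not the author's analysis; the function names, the Gaussian bandwidth eps, and the diffusion time t are illustrative assumptions.

    import numpy as np

    def diffusion_map(X, eps, n_components=2, t=1):
        """Diffusion map of a training set X (n_samples x n_features)."""
        # Gaussian affinities and the Markov transition matrix P = D^{-1} W
        d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
        W = np.exp(-d2 / eps)
        P = W / W.sum(axis=1, keepdims=True)
        # P is conjugate to a symmetric matrix, so its spectrum is real
        vals, vecs = np.linalg.eig(P)
        order = np.argsort(-vals.real)
        vals, vecs = vals.real[order], vecs.real[:, order]
        # Drop the trivial eigenpair (lambda_0 = 1, constant eigenvector)
        lam, psi = vals[1:n_components + 1], vecs[:, 1:n_components + 1]
        return (lam ** t) * psi, lam, psi

    def nystrom_extend(X_train, x_new, eps, lam, psi, t=1):
        """Nystrom-type extension of the diffusion coordinates to x_new."""
        d2 = np.sum((X_train - x_new) ** 2, axis=1)
        p = np.exp(-d2 / eps)
        p /= p.sum()  # normalized kernel row p(x, x_i)
        # psi_j(x) ~ (1/lambda_j) * sum_i p(x, x_i) psi_j(x_i),
        # then scale by lambda_j^t to get the diffusion coordinate
        return (lam ** (t - 1)) * (p @ psi)

    # Example: embed 200 random 3-D points, then extend to a new point
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    emb, lam, psi = diffusion_map(X, eps=1.0)
    emb_new = nystrom_extend(X, rng.normal(size=3), 1.0, lam, psi)

The abstract's point is that this extension, viewed on the RKHS induced by the kernel, acts as an orthogonal embedding, which is what makes exactness conditions and error estimates tractable.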

This presentation is part of Contributed Session CP7.

Authors:
Jianzhong Wang (Sam Houston State University)
Keywords:
diffusion mapping, dimensionality reduction, kernel PCA, machine learning, manifold learning, nonlinear optimization, out-of-sample extension