Abstract
Asymmetry arises naturally in real-life applications, such as directed graphs. Unlike classical kernel methods, which require Mercer kernels that are symmetric and positive semi-definite, Kernel Singular Value Decomposition (KSVD) has recently been proposed as a powerful tool that extends SVD to nonlinear feature spaces and can directly handle asymmetric kernels. Like kernel methods in general, KSVD suffers from inefficiency on large-scale data, but it can be significantly sped up by the asymmetric Nyström method, which facilitates scalability. KSVD is handled within the framework of kernel methods through well-derived primal-dual representations, enabling practitioners to exploit asymmetry in generic feature learning. The insights and methodology behind KSVD pave the way for further exploration of asymmetric kernel methods in machine learning and beyond.
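To make the core idea concrete, below is a minimal sketch, not the keynote's reference implementation: it builds an SNE-style asymmetric kernel matrix whose bandwidth depends on the first argument only, takes its truncated SVD to obtain left and right nonlinear embeddings, and then approximates the matrix with a generic CUR-type subsampling scheme standing in for the asymmetric Nyström method. All names and parameter choices here (kappa, the bandwidth rule, the sampling sizes) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, m, d = 500, 400, 5
X = rng.standard_normal((n, d))   # "source" samples
Z = rng.standard_normal((m, d))   # "target" samples (e.g., the two node roles of a directed graph)

def kappa(X, Z):
    """SNE-style asymmetric kernel (illustrative choice): the bandwidth
    depends on the first argument only, so kappa(x, z) != kappa(z, x)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)   # (n, m) squared distances
    bw = 1.0 + np.linalg.norm(X, axis=1, keepdims=True)   # (n, 1) per-row bandwidth
    return np.exp(-sq / (2 * bw ** 2))

K = kappa(X, Z)   # n x m kernel matrix, asymmetric in general

# KSVD core: a truncated SVD of K yields left/right nonlinear embeddings.
U, s, Vt = np.linalg.svd(K, full_matrices=False)
r = 10
left_embed = U[:, :r] * s[:r]    # rank-r representation of the X samples
right_embed = Vt[:r].T           # rank-r representation of the Z samples

# Nyström-style speed-up (a generic CUR-type sketch, standing in for the
# asymmetric Nystrom method): subsample rows and columns, then reconstruct.
p = 50
rows = rng.choice(n, p, replace=False)
cols = rng.choice(m, p, replace=False)
C, R, W = K[:, cols], K[rows, :], K[np.ix_(rows, cols)]
K_approx = C @ np.linalg.pinv(W) @ R
print("relative error:", np.linalg.norm(K - K_approx) / np.linalg.norm(K))

In this sketch the low-rank factors can be computed from the small subsampled blocks alone, which is where the claimed speed-up on large-scale data comes from.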
KSVD can be derived in different ways, and these derivations reveal both important shared properties and fundamental differences with respect to classical Mercer kernels, leading towards an in-depth understanding and promising future developments of asymmetric kernel methods. This keynote presents the first systematic introduction to the modeling, optimization, and practicality of the recently proposed KSVD. The formulations of KSVD with both kernels and covariance operators are first reviewed with detailed derivations, together with a rigorous discussion of current applications and promising future directions. The connections to existing kernel methods are then discussed comprehensively, aiming to provide an in-depth understanding of KSVD and its potential.
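For orientation, the essential structure reviewed in the keynote can be sketched in the following notation (ours, and possibly differing from the keynote's in details): two feature maps $\phi$ and $\psi$ act on the two data sources, their inner products form the asymmetric kernel matrix, and the SVD of that matrix delivers the nonlinear decomposition:

\[
  K_{ij} = \langle \phi(x_i), \psi(z_j) \rangle, \qquad
  K = U \Sigma V^{\top}, \qquad
  K v_j = \sigma_j u_j, \quad K^{\top} u_j = \sigma_j v_j .
\]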