TITLE: Low-rank approximation via Partial Matrix Sampling: Assumption-free Local Minimum Analysis and Applications in Memory-efficient Kernel PCA


Author: Ji Chen and Xiaodong Li



In this talk, we study nonconvex matrix completion from the perspective of assumption-free approximation: with no assumptions on the underlying positive-semidefinite matrix in terms of rank, eigenvalues, or eigenvectors, we establish a low-rank approximation error bound for any local minimum of the proposed objective function. The approximation error consists of an over-parametrization term and an under-parametrization term, both of which are illustrated by numerical experiments. As interesting byproducts, when certain assumptions are imposed on the underlying matrix, our results improve the state-of-the-art sampling-rate results in the literature on nonconvex matrix completion with no spurious local minima. We also discuss how the proposed low-rank approximation framework applies to memory-efficient kernel PCA, and numerical experiments show that our approach is competitive in approximation accuracy with the well-known Nyström method.
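To make the setup concrete, here is a minimal sketch of the general approach: approximate a positive-semidefinite matrix M from a subset of its entries by running gradient descent on a factored least-squares objective f(X) = Σ_{(i,j) observed} ((XX^T)_ij − M_ij)^2. The specific objective, regularization, and step-size choices from the talk are not given in the abstract, so everything below (sampling rate, initialization, step size) is an illustrative assumption, not the authors' exact method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 60, 3
U = rng.standard_normal((n, r))
M = U @ U.T                                   # underlying PSD matrix of rank r

# Observe each entry with probability p, with a symmetric sampling pattern.
p = 0.5
upper = rng.random((n, n)) < p
mask = np.triu(upper) | np.triu(upper, 1).T

# Spectral initialization from the zero-filled, rescaled observations
# (a common heuristic; the talk's analysis covers arbitrary local minima).
w, V = np.linalg.eigh((mask * M) / p)
top = np.argsort(w)[::-1][:r]
X = V[:, top] * np.sqrt(np.maximum(w[top], 0.0))

# Gradient descent on f(X); the gradient is (R + R^T) X up to a constant,
# where R is the residual restricted to observed entries.
eta = 0.002                                   # heuristic step size
for _ in range(500):
    R = mask * (X @ X.T - M)
    X -= eta * (R + R.T) @ X

rel_err = np.linalg.norm(X @ X.T - M) / np.linalg.norm(M)
```

In the kernel-PCA application, M would be the (never fully formed) kernel matrix; only the sampled entries and the n-by-r factor X need to be stored, which is the source of the memory savings.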


BIO: Dr. Xiaodong Li is an assistant professor in the Department of Statistics at UC Davis. Prior to that, he spent two years in the Department of Statistics at the Wharton School, University of Pennsylvania. He received his Ph.D. in mathematics from Stanford University in 2013 and his B.S. from Peking University in 2008. His general research interests include machine learning, statistics, optimization, and signal processing; in particular, he is interested in the connection between optimization/spectral methods and the underlying low-rank/spectral structures. His papers have been published in journals of statistics, mathematics, and engineering such as AoS, ACHA, FoCM, JACM, and IEEE TIT.