
张驰浩 (Postdoctoral Fellow): Matrix Normal PCA for Interpretable Dimension Reduction and Graphical Noise Modeling




Academy of Mathematics and Systems Science, CAS
Colloquia & Seminars

Speaker: 张驰浩 (Postdoctoral Fellow), The University of Tokyo, Japan
Inviter: 张世华
Title:
Matrix Normal PCA for Interpretable Dimension Reduction and Graphical Noise Modeling
Time & Venue:
2021.10.27 08:00-08:40 S525
Abstract:
Principal component analysis (PCA) is one of the most widely used dimension reduction and multivariate statistical techniques. From a probabilistic perspective, PCA seeks a low-dimensional representation of the data in the presence of independent and identically distributed (IID) Gaussian noise. Probabilistic PCA (PPCA) and its variants have been studied extensively for decades, and most of them assume that the underlying noise is IID. However, real-world noise is usually complicated and structured. To address this challenge, several variants of PCA for non-IID data have been proposed, but most existing methods assume that the noise is correlated only in the feature space, whereas the noise may be structured in both the feature space and the sample space. To this end, we propose a powerful and intuitive PCA method (MN-PCA) that models graphical noise with the matrix normal distribution, which enables us to explore the structure of the noise in both the feature space and the sample space. MN-PCA obtains a low-rank representation of the data and the structure of the noise simultaneously, and it can be interpreted as approximating the data under a generalized Mahalanobis distance. We develop two algorithms to solve this model: one maximizes a regularized likelihood, and the other exploits the Wasserstein distance, which is more robust. Extensive experiments on various data sets demonstrate their effectiveness.
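The abstract does not spell out the model, so the following NumPy sketch is only a rough illustration of the two-way structured noise it describes: noise drawn from a matrix normal distribution MN(0, U, V), with U a row (sample-space) covariance and V a column (feature-space) covariance, together with a generalized-Mahalanobis-type reconstruction loss against a low-rank fit. The variable names (W, Z, U, V), the AR(1) covariances, and the toy data are illustrative assumptions, not the notation or the estimation procedure of MN-PCA itself.

import numpy as np

rng = np.random.default_rng(0)

def sample_matrix_normal(M, U, V, rng):
    """Draw one sample from MN(M, U, V): mean M, row covariance U (n x n),
    column covariance V (p x p). Equivalent to vec(E) ~ N(0, V kron U)."""
    n, p = M.shape
    A = np.linalg.cholesky(U)          # U = A A^T
    B = np.linalg.cholesky(V)          # V = B B^T
    Z = rng.standard_normal((n, p))    # IID N(0, 1) entries
    return M + A @ Z @ B.T

def generalized_mahalanobis_loss(X, L, U, V):
    """Squared generalized-Mahalanobis-type distance between data X and a
    low-rank fit L: whiten rows by U^{-1/2} and columns by V^{-1/2}."""
    Ai = np.linalg.inv(np.linalg.cholesky(U))
    Bi = np.linalg.inv(np.linalg.cholesky(V))
    R = Ai @ (X - L) @ Bi.T
    return np.sum(R ** 2)

# Toy example (assumed): rank-2 signal plus two-way correlated noise.
n, p, k = 50, 30, 2
W = rng.standard_normal((n, k))
Z = rng.standard_normal((k, p))
U = 0.5 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))  # AR(1) sample covariance
V = 0.5 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))  # AR(1) feature covariance
X = W @ Z + sample_matrix_normal(np.zeros((n, p)), U, V, rng)
print(generalized_mahalanobis_loss(X, W @ Z, U, V))

Under this noise model, whitening by the row and column covariances makes the residual entries behave like IID standard Gaussians, which is one way to read the abstract's remark that MN-PCA approximates the data under a generalized Mahalanobis distance.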
