On Multilinear Principal Component Analysis


Su-Yun Huang

11:00 – 11:50

Room 308, Mathematics Research Center Building (orig. New Math. Bldg.)

Principal component analysis is commonly used for dimension reduction in analyzing high-dimensional data. Multilinear principal component analysis aims to serve a similar function for tensor-structured data and has empirically been shown to be effective in reducing dimensionality. In this paper, we investigate its statistical properties and demonstrate its advantages. Conventional principal component analysis, which vectorizes the tensor data, may lead to inefficient and unstable prediction due to the often extremely large dimensionality involved. Multilinear principal component analysis, by preserving the data structure, searches for low-dimensional mode-wise projections and thereby reduces dimensionality more efficiently. Asymptotic theory for order-two multilinear principal component analysis is developed, including the asymptotic efficiency and distributions of the principal components, the associated projections, and the explained variance. A test of dimensionality is also proposed. Finally, multilinear principal component analysis is shown to improve on conventional principal component analysis in analyzing the Olivetti faces data set, by extracting a more modularly-oriented basis for reconstructing test faces. (Joint work with Hung Hung, Pei-Shien Wu and I-Ping Tu.)
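To illustrate the contrast between vectorized PCA and order-two multilinear PCA, the following is a minimal sketch on synthetic matrix-valued data. It uses a simple one-pass estimator (eigenvectors of the two mode-wise covariance matrices) rather than the paper's exact procedure; the data-generating model, dimensions, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic order-two (matrix-valued) observations, e.g. 16x16 "images".
# Hypothetical rank-3 signal plus small noise, purely for illustration.
n, p, q = 200, 16, 16
A = rng.normal(size=(p, 3))
B = rng.normal(size=(q, 3))
X = np.stack([A @ np.diag(rng.normal(size=3)) @ B.T
              + 0.1 * rng.normal(size=(p, q)) for _ in range(n)])

# Center the sample.
Xc = X - X.mean(axis=0)

# Conventional PCA would vectorize each observation, giving a
# (p*q) x (p*q) = 256 x 256 covariance matrix to estimate.
vec = Xc.reshape(n, p * q)

# Order-two MPCA instead estimates two small projections U (p x k1),
# V (q x k2) from the mode-wise covariance matrices.
k1, k2 = 3, 3
S_rows = np.einsum('nij,nkj->ik', Xc, Xc) / n   # (1/n) sum_n X_n X_n^T
S_cols = np.einsum('nij,nik->jk', Xc, Xc) / n   # (1/n) sum_n X_n^T X_n
U = np.linalg.eigh(S_rows)[1][:, ::-1][:, :k1]  # top-k1 eigenvectors
V = np.linalg.eigh(S_cols)[1][:, ::-1][:, :k2]  # top-k2 eigenvectors

# Core scores: each observation is reduced to k1 x k2 = 9 numbers
# instead of a length-256 vector of principal component scores.
cores = np.einsum('pi,npq,qj->nij', U, Xc, V)   # U^T X_n V
recon = np.einsum('pi,nij,qj->npq', U, cores, V)  # U (U^T X_n V) V^T

err = np.linalg.norm(recon - Xc) / np.linalg.norm(Xc)
print(U.shape, V.shape, cores.shape)
print(round(err, 3))
```

Note the efficiency argument in miniature: the mode-wise approach estimates p*p + q*q = 512 covariance entries rather than (p*q)^2 = 65,536, which is the structural reason MPCA can decrease dimensionality with fewer parameters than vectorized PCA.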