Block principal component analysis (BPCA) is a recently developed technique in computer vision and pattern classification. In this paper, we propose a robust and sparse BPCA with Lp-norm, referred to as BPCALp-S, which inherits the robustness of BPCA-L1 through the use of an adjustable Lp-norm. To achieve sparse modelling, the elastic net is integrated into the objective function. An iterative algorithm that greedily extracts feature vectors one by one is carefully designed, and the monotonicity of the proposed iterative procedure is theoretically guaranteed. Experiments on image classification and reconstruction over several benchmark datasets demonstrate the effectiveness of the proposed approach.
TANG Ganyi (唐肝翌), LU Guifu (卢桂馥). Block Principle Component Analysis with Lp-norm for Robust and Sparse Modelling [J]. Journal of Shanghai Jiaotong University (Science), 2018, 23(3): 398.
DOI: 10.1007/s12204-018-1955-4
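As an illustration of the greedy one-by-one extraction described in the abstract, the sketch below shows a fixed-point iteration for a single projection vector under an Lp-norm criterion, with a soft-thresholding step standing in for the L1 part of the elastic net. This is a minimal, hypothetical sketch, not the authors' BPCALp-S algorithm: the function names, parameter choices, deflation scheme, and the decision to absorb the elastic net's L2 part into the unit-norm renormalization are illustrative assumptions.

```python
# Hypothetical sketch (NumPy) of greedy Lp-norm maximization with an
# elastic-net style sparsity step; NOT the authors' exact BPCALp-S procedure.
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft-thresholding: the proximal operator of the L1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def extract_component(blocks, p=1.5, lam1=0.05, n_iter=200, tol=1e-6, seed=0):
    """Extract one unit projection vector w from a list of 2-D image blocks.

    Fixed-point iteration in the spirit of PCA-Lp / BPCA-L1 (p >= 1): reweight
    the rows of every block by sign(X w) * |X w|^(p-1), accumulate the weighted
    rows, apply soft-thresholding (the L1 part of the elastic net), and
    renormalize (the L2 part is absorbed by the unit-norm constraint here).
    """
    d = blocks[0].shape[1]
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        g = np.zeros(d)
        for X in blocks:
            s = X @ w                                       # projections of the block's rows
            g += X.T @ (np.sign(s) * np.abs(s) ** (p - 1))  # Lp-norm ascent direction
        w_new = soft_threshold(g, lam1)                     # sparsify the projection vector
        nrm = np.linalg.norm(w_new)
        if nrm == 0.0:                                      # penalty too strong: all entries shrunk away
            break
        w_new /= nrm
        if np.linalg.norm(w_new - w) < tol:
            return w_new
        w = w_new
    return w

def extract_components(blocks, k, **kwargs):
    """Greedy one-by-one extraction with deflation after each component."""
    W = []
    for _ in range(k):
        w = extract_component(blocks, **kwargs)
        W.append(w)
        blocks = [X - np.outer(X @ w, w) for X in blocks]   # remove the found direction
    return np.stack(W, axis=1)                              # columns are projection vectors
```

In this sketch, `blocks` would hold the sub-matrices obtained by partitioning each training image, and `extract_components(blocks, k)` returns k projection vectors extracted greedily, mirroring the one-by-one scheme the abstract describes.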