TY - GEN
A1 - Gao, Yunlong
A1 - Zhang, Yisong
A1 - Pan, Jinyan
A1 - Luo, Sizhe
A1 - Yang, Chengyu
T1 - Discriminant analysis based on reliability of local neighborhood
T2 - Expert Systems with Applications
N2 - To obtain a compact and effective low-dimensional representation, most existing discriminant manifold learning methods integrate manifold learning into discriminant analysis (DA) to extract the intrinsic structure of the data. These methods learn two kinds of adjacency graphs, the intrinsic graph and the penalty graph, to characterize the similarity between intraclass samples and the pseudo-similarity between interclass samples. However, they treat every sample equally, which leads to the following defects: (1) these methods cannot accurately characterize the marginal region between different classes through penalty graphs alone; (2) they cannot identify noisy and outlier samples, which reduces their robustness. To address these problems, we introduce an adaptive adjacency factor that performs a discrimination-based reliability analysis for each sample. By integrating this adjacency factor into discriminant manifold learning, we propose a novel method for DA, namely discriminant analysis based on the reliability of the local neighborhood (DA-RoLN). This paper makes three main contributions: (1) with the adjacency factor, sample points can be divided into three groups: intraclass samples, marginal samples, and outliers; DA-RoLN therefore emphasizes the effect of valid samples and filters out the influence of outliers. (2) The adjacency factor is calculated adaptively in the low-dimensional space, so the margin between different classes in that space is emphasized. (3) An iterative algorithm is developed to solve the objective function of DA-RoLN; it is easy to implement and has a low computational cost. Extensive experimental results show the effectiveness of DA-RoLN.
KW - Dimensionality reduction
KW - Discriminant analysis
KW - Manifold learning
KW - Graph learning
KW - Adjacency factor
Y1 - 2021
U6 - https://doi.org/10.1016/j.eswa.2021.114790
SN - 1873-6793
SN - 0957-4174
VL - 175
ER -
TY - GEN
A1 - Gao, Yunlong
A1 - Luo, Si-Zhe
A1 - Pan, Jin-Yan
A1 - Chen, Bai-Hua
A1 - Zhang, Yi-Song
T1 - Robust PCA Using Adaptive Probability Weighting
T2 - Acta Automatica Sinica
N2 - Principal component analysis (PCA) is an important method for processing high-dimensional data. In recent years, PCA models based on various norms have been extensively studied to improve robustness. However, these algorithms neither consider the relationship between the reconstruction error and the covariance nor account for the uncertainty in the principal components' description of the data. To address these problems, this paper proposes a new robust PCA algorithm. First, the L2,p-norm is used to measure the reconstruction error and the description variance of the projected data. Based on the reconstruction error and the description variance, an adaptive probability error minimization model is established to estimate the uncertainty of the principal components' description of the data. Based on this uncertainty, the adaptive probability weighting PCA (RPCA-PW) is established, and a corresponding optimization method is designed. Experimental results on artificial data sets, UCI data sets, and face databases show that RPCA-PW is superior to other PCA algorithms.
KW - Principal component analysis (PCA)
KW - weighted principal component analysis (WPCA)
KW - dimensionality reduction
KW - robustness
Y1 - 2021
U6 - https://doi.org/10.16383/j.aas.c180743
SN - 0254-4156
VL - 47
IS - 4
SP - 825
EP - 838
ER -