Abstract: In many statistical settings, high-dimensional data are assumed to lie on a low-dimensional manifold. From this perspective, there is a need to generalize statistical methods to nonlinear spaces. To that end, we propose generalizations of Linear Discriminant Analysis (LDA) to manifolds. First, we generalize the reduced-rank LDA solution by constructing a geodesic subspace that optimizes a criterion equivalent to Fisher's discriminant in the linear case. Second, we generalize LDA formulated as a restricted Gaussian classifier. These two formulations, which are equivalent in the linear case, lead to generalizations that are in general different in the manifold case. We illustrate the first generalization on the 2-sphere. We then propose applications within the Large Deformation Diffeomorphic Metric Mapping (LDDMM) framework, in which we rephrase the second generalization. We perform dimension reduction and classification on the kimia-216 dataset and on a set of 3D brain structures segmented from Alzheimer's disease and control subjects, recovering state-of-the-art performance.
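As a reminder of the linear baseline that the geodesic construction generalizes, the classical Fisher criterion for a projection direction $w$ can be written with the standard between-class and within-class scatter matrices $S_B$ and $S_W$ (notation assumed here, not defined in the abstract):

$$
J(w) \;=\; \frac{w^{\top} S_B\, w}{w^{\top} S_W\, w},
\qquad
w^{\ast} \;=\; \arg\max_{w \neq 0} J(w).
$$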