Keywords: principal component analysis, nonlinear principal component analysis, independent component analysis, single-layer autoencoder
Abstract: Linear principal component analysis (PCA) is marked by orthogonality and an ordering of variances; its conventional nonlinear counterpart is marked by neither.
To yield a useful output, conventional nonlinear PCA requires whitening as a preprocessing step. This makes the overall transformation akin to independent component analysis (ICA). But, as a side effect of whitening, the overall transformation becomes non-orthogonal and the variances become non-estimable. Conventional nonlinear PCA thus lacks both distinctive characteristics of PCA.
To bridge this disparity, we propose $\sigma$-PCA, a unified neural model for linear and nonlinear PCA as single-layer autoencoders. The key is modelling the variances as separate parameters. With our model, we show that whitening is not required: nonlinear PCA can retain both orthogonality and ordering of variances, becoming a special case of ICA in which the overall transformation is assumed orthogonal. Thus, where linear PCA fails to separate components into orthogonal directions when their variances are similar, nonlinear PCA succeeds. And when the overall transformation is non-orthogonal, two isolated layers of nonlinear PCA can perform conventional ICA. Between linear PCA and ICA, we carve out a place for nonlinear PCA as a method in its own right.
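The abstract does not spell out the objective, so the following is only a minimal PyTorch sketch of the stated idea: a single-layer tied-weight autoencoder in which the per-component standard deviations $\sigma$ are modelled as separate parameters alongside the weights $W$. The class name `SigmaPCA`, the `tanh` nonlinearity, and the plain reconstruction loss are illustrative assumptions, not the paper's exact method.

```python
import torch

class SigmaPCA(torch.nn.Module):
    """Hypothetical single-layer autoencoder with a separate diagonal scale.

    Encoder: z = g((x W) / sigma); decoder: x_hat = (z * sigma) W^T.
    Rows of W are *intended* to end up orthonormal; any orthogonality
    constraint or regulariser from the paper is omitted here (assumption).
    """

    def __init__(self, dim, n_components, nonlinear=False):
        super().__init__()
        self.W = torch.nn.Parameter(torch.randn(dim, n_components) * 0.01)
        # Variances modelled as separate parameters, via log std for positivity.
        self.log_sigma = torch.nn.Parameter(torch.zeros(n_components))
        self.nonlinear = nonlinear

    def forward(self, x):
        sigma = self.log_sigma.exp()
        # Encode: project, then normalise each component by its scale.
        z = (x @ self.W) / sigma
        if self.nonlinear:
            z = torch.tanh(z)  # any odd nonlinearity; an assumption here
        # Decode: rescale and map back with the tied (transposed) weights.
        x_hat = (z * sigma) @ self.W.T
        return x_hat, z

# Toy usage on random data with a plain reconstruction objective (sketch).
model = SigmaPCA(dim=64, n_components=16, nonlinear=True)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(128, 64)
x_hat, _ = model(x)
loss = ((x - x_hat) ** 2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

Under this reading, the explicit `sigma` is what lets the model recover an ordering of variances without whitening; the linear case corresponds to `nonlinear=False`.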
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6084