Online Orthogonal Regression Based on a Regularized Squared Loss

Published: 01 Jan 2018, Last Modified: 19 Feb 2025, ICMLA 2018, License: CC BY-SA 4.0
Abstract: Orthogonal regression extends the classical regression framework by assuming that the data may contain errors in both the dependent and independent variables. This approach often outperforms classical regression in real-world scenarios. However, standard algorithms for solving the orthogonal regression problem require computing a singular value decomposition (SVD), which can be computationally expensive and impractical for real-world problems. In this work, we propose a new approach to the orthogonal regression problem based on a regularized squared loss. The method follows an online learning strategy, which makes it more flexible for different types of applications. The algorithm is derived in both primal and dual variables, and the latter formulation allows the introduction of kernels for nonlinear modeling. We compare the proposed orthogonal regression algorithm to a corresponding classical regression algorithm on both synthetic and real-world datasets from different applications. Our algorithm achieved better results on most of the datasets.
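
The abstract does not specify the paper's exact loss or update rule, but a minimal sketch of this kind of method, assuming the per-sample loss is the squared orthogonal distance to the fitted hyperplane plus an L2 penalty, optimized with plain stochastic gradient updates, could look as follows. All function names and hyperparameters here are illustrative assumptions, not the authors' implementation:

```python
# Sketch (not the paper's exact algorithm): online orthogonal regression via
# SGD on a regularized squared orthogonal-distance loss.
import numpy as np

def online_orthogonal_regression(X, y, lr=0.01, lam=1e-3, epochs=10, seed=0):
    """Fit y ~ w.x + b by stochastic gradient descent on the per-sample loss
        l(w, b) = (y - w.x - b)^2 / (1 + ||w||^2) + lam * ||w||^2,
    i.e. the squared orthogonal distance to the hyperplane plus an L2 penalty.
    The specific loss and step rule are assumptions for illustration.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):      # one pass over the stream, shuffled
            x_i, y_i = X[i], y[i]
            r = y_i - w @ x_i - b         # signed residual
            s = 1.0 + w @ w               # orthogonal-distance normalization
            grad_w = -2.0 * r * x_i / s - 2.0 * r**2 * w / s**2 + 2.0 * lam * w
            grad_b = -2.0 * r / s
            w -= lr * grad_w
            b -= lr * grad_b
    return w, b

# Synthetic errors-in-variables data: noise in both inputs and outputs,
# the setting where orthogonal regression is expected to help.
rng = np.random.default_rng(1)
X_true = rng.uniform(-1, 1, size=(500, 1))
y = 2.0 * X_true[:, 0] + 0.5 + 0.1 * rng.standard_normal(500)
X = X_true + 0.1 * rng.standard_normal((500, 1))  # noisy inputs
w, b = online_orthogonal_regression(X, y)
print(f"estimated slope {w[0]:.3f}, intercept {b:.3f}")
```

Because each update touches a single sample and needs no SVD, the cost per step is O(d), which is what makes an online strategy attractive here; a kernelized variant would instead maintain dual coefficients over the observed samples.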
