Pairwise Learning with Adaptive Online Gradient Descent

Published: 26 Nov 2023, Last Modified: 26 Nov 2023. Accepted by TMLR.
Authors that are also TMLR Expert Reviewers: ~Yunwen_Lei1
Abstract: In this paper, we propose an adaptive online gradient descent method with momentum for pairwise learning, in which the stepsize is determined by historical gradient information. Due to the structure of pairwise learning, the sample pairs depend on the model parameters, which complicates the convergence analysis. To this end, we develop novel techniques for analyzing the convergence of the proposed algorithm. We show that the algorithm outputs the desired solution in the strongly convex, convex, and nonconvex settings. Furthermore, we give theoretical explanations for why the proposed algorithm accelerates existing methods for online pairwise learning. All assumptions used in the theoretical analysis are mild and common, making our results applicable to a wide range of pairwise learning problems. To demonstrate the efficiency of the algorithm, we compare the proposed adaptive method with its non-adaptive counterpart on the benchmark online AUC maximization problem.
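To make the setting concrete, the sketch below illustrates the general flavor of adaptive online gradient descent with momentum on a pairwise objective, using online AUC maximization with a squared-hinge pairwise surrogate as the example task. This is not the paper's exact algorithm (whose stepsize rule and analysis are given in the paper); the AdaGrad-norm-style stepsize, the momentum coefficient, and the surrogate loss here are illustrative assumptions.

```python
import numpy as np

def pairwise_hinge_grad(w, x_pos, x_neg):
    """Gradient of a squared-hinge pairwise AUC surrogate:
    loss = max(0, 1 - w . (x_pos - x_neg))^2 ."""
    d = x_pos - x_neg
    margin = 1.0 - w @ d
    if margin <= 0.0:
        return np.zeros_like(w)
    return -2.0 * margin * d

def adaptive_pairwise_sgd(stream, dim, eta=0.5, beta=0.9, eps=1e-8):
    """Online momentum gradient descent whose stepsize adapts to the
    accumulated squared gradient norm (stepsize from historical info)."""
    w = np.zeros(dim)
    m = np.zeros(dim)   # momentum buffer
    accum = 0.0         # running sum of squared gradient norms
    for x_pos, x_neg in stream:
        g = pairwise_hinge_grad(w, x_pos, x_neg)
        accum += g @ g
        step = eta / np.sqrt(accum + eps)  # adaptive stepsize
        m = beta * m + (1.0 - beta) * g
        w = w - step * m
    return w

# Illustrative usage on synthetic separable data.
rng = np.random.default_rng(0)
pos = rng.normal(1.0, 0.5, size=(200, 5))   # positive-class features
neg = rng.normal(-1.0, 0.5, size=(200, 5))  # negative-class features
w = adaptive_pairwise_sgd(list(zip(pos, neg)), dim=5)
pairwise_acc = np.mean([(w @ p) > (w @ n) for p, n in zip(pos, neg)])
```

The adaptive stepsize shrinks automatically as gradients accumulate, so no manually tuned decay schedule is needed; this is the kind of non-adaptive baseline comparison the abstract refers to.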
Certifications: Expert Certification
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Dear Editor, thank you for approving our paper. Based on your suggestion, we have made three changes to the camera-ready version:
1. We have extended Remark 1 to provide a more specific explanation of the condition $\alpha<1/2$.
2. We have revised the formulation of the AUC problem and provided a detailed explanation in the first paragraph of Remark 2.
3. We have provided further clarification on the pairwise model and AUC maximization in paragraphs 2 to 4 of Remark 2.
Best, The authors
Code: https://github.com/normalbasis/adaptive_pairwise_learning.git
Assigned Action Editor: ~Simon_Lacoste-Julien1
Submission Number: 1257