Regret-Optimal Federated Transfer Learning for Kernel Regression – with Applications in American Option Pricing

TMLR Paper 1630 Authors

29 Sept 2023 (modified: 18 Apr 2024) · Rejected by TMLR
Abstract: We propose an optimal iterative scheme for federated transfer learning, where a central planner has access to datasets ${\cal D}_1,\dots,{\cal D}_N$ for the same learning model $f_{\theta}$. Our objective is to minimize the cumulative deviation of the generated parameters $\{\theta_i(t)\}_{t=0}^T$ across all $T$ iterations from the specialized parameters $\theta^\star_{1},\ldots,\theta^\star_N$ obtained for each dataset while respecting the loss function for the model $f_{\theta(T)}$ produced by the algorithm upon halting. We only allow for continual communication between each of the specialized models (nodes/agents) and the central planner (server) at each iteration (round). For the case where the model $f_{\theta}$ is a finite-rank kernel regression, we derive explicit updates for the regret-optimal algorithm. By leveraging symmetries within the regret-optimal algorithm, we further develop a nearly regret-optimal heuristic that runs with $\mathcal{O}(Np^2)$ fewer elementary operations, where $p$ is the dimension of the parameter space. Additionally, we investigate the adversarial robustness of the regret-optimal algorithm, showing that an adversary which perturbs $q$ training pairs by at most $\varepsilon>0$, across all training sets, cannot reduce the regret-optimal algorithm's regret by more than $\mathcal{O}(\varepsilon q \bar{N}^{1/2})$, where $\bar{N}$ is the aggregate number of training pairs. To validate our theoretical findings, we conduct numerical experiments in the context of American option pricing, utilizing a randomly generated finite-rank kernel.
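To make the setup in the abstract concrete, the sketch below simulates the federated setting it describes: $N$ nodes each hold a dataset and a specialized solution $\theta^\star_i$ for a shared finite-rank kernel model, and a central server iterates with one round of communication per step. The feature map, step size, and aggregation rule here are all illustrative assumptions; the loop is plain federated averaging, not the paper's regret-optimal scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-rank feature map phi: R -> R^p (a random cosine sketch);
# the model is f_theta(x) = phi(x)^T theta.
p = 5
W = rng.normal(size=(p, 1))

def phi(x):
    # x: (n, 1) -> features of shape (n, p)
    return np.cos(x @ W.T)

# N nodes, each with its own dataset D_i and specialized ridge solution theta_i^*.
N, n, lam = 3, 50, 1e-2
datasets, theta_star = [], []
for i in range(N):
    X = rng.normal(size=(n, 1))
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=n)
    Phi = phi(X)
    th = np.linalg.solve(Phi.T @ Phi + lam * np.eye(p), Phi.T @ y)
    datasets.append((Phi, y))
    theta_star.append(th)

# Server loop: broadcast theta(t), each node takes one local gradient step on
# its own loss, and the server averages the results (one communication round
# per iteration). This is standard federated averaging, used only to
# illustrate the communication pattern.
theta = np.zeros(p)
eta, T = 0.05, 100
for t in range(T):
    local = []
    for Phi, y in datasets:
        grad = Phi.T @ (Phi @ theta - y) / len(y) + lam * theta
        local.append(theta - eta * grad)
    theta = np.mean(local, axis=0)

# Deviation of the produced parameters from each specialized solution --
# the quantity whose cumulative version the paper's algorithm controls.
dev = [np.linalg.norm(theta - th) for th in theta_star]
```

The regret-optimal algorithm in the paper replaces the naive averaging step with explicit updates that trade off these per-node deviations against the terminal loss of $f_{\theta(T)}$.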
Submission Length: Long submission (more than 12 pages of main content)
Previous TMLR Submission Url: https://openreview.net/forum?id=ISx91ZcKAQ
Changes Since Last Submission:
1. Section 3 of the last submission contained a link to the code, which compromised anonymity. In the new submission, we have anonymized the link and submitted the code as an anonymous supplementary file.
2. In the last submission, the Appendix appeared before the References. In the new submission, we have moved the Appendix after the References.
Assigned Action Editor: ~Nishant_A_Mehta1
Submission Number: 1630