Private Stochastic Optimization with Large Worst-Case Lipschitz Parameter: Optimal Rates for (Non-Smooth) Convex Losses and Extension to Non-Convex Losses

ALT 2023 (modified: 20 Apr 2023)
Abstract: We study differentially private (DP) stochastic optimization (SO) with loss functions whose worst-case Lipschitz parameter over all data points may be extremely large. To date, the vast majority of...