Compressed-VFL: Communication-Efficient Learning with Vertically Partitioned Data

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
ICLR 2022 Submitted
Keywords: Federated Learning, Optimization, Split Learning, Cross-Silo Learning, Compression, Quantization, Sparsification
Abstract: We propose Compressed Vertical Federated Learning (C-VFL) for communication-efficient training on vertically partitioned data. In C-VFL, a server and multiple parties collaboratively train a model on their respective features, performing several local iterations and periodically sharing compressed intermediate results. Our work provides the first theoretical analysis of the effect of message compression on distributed training over vertically partitioned data. We prove convergence of non-convex objectives to a fixed point at a rate of $O(\frac{1}{\sqrt{T}})$ when the compression error is bounded over the course of training. We provide specific requirements for convergence with common compression techniques, such as quantization and top-$k$ sparsification. Finally, we experimentally show that compression can reduce communication by over $90\%$ without a significant decrease in accuracy compared to VFL without compression.
One-sentence Summary: We propose Compressed Vertical Federated Learning (C-VFL), in which a server and multiple parties collaboratively train a model over vertically partitioned data, performing several local iterations and periodically sharing compressed intermediate results.
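The abstract names quantization and top-$k$ sparsification as the kinds of compressors each party applies to its intermediate results before the periodic exchange. As a rough illustration of how such compressors behave, here is a minimal NumPy sketch; the function names (`top_k_sparsify`, `uniform_quantize`), the embedding shape, and the bit width are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def top_k_sparsify(x: np.ndarray, k: int) -> np.ndarray:
    """Keep only the k largest-magnitude entries of x; zero out the rest.
    (Illustrative sketch, not the paper's implementation.)"""
    flat = x.ravel()
    if k >= flat.size:
        return x.copy()
    # Indices of the k largest-magnitude entries.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(x.shape)

def uniform_quantize(x: np.ndarray, num_bits: int = 8) -> np.ndarray:
    """Deterministic uniform quantization of x to 2**num_bits levels
    over its observed range. (Illustrative sketch.)"""
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x.copy()
    levels = 2 ** num_bits - 1
    scaled = (x - lo) / (hi - lo)            # map to [0, 1]
    q = np.round(scaled * levels) / levels   # snap to the quantization grid
    return q * (hi - lo) + lo                # map back to the original range

# Example: one party's batch of intermediate embeddings (batch of 4, dim 16).
rng = np.random.default_rng(0)
embedding = rng.standard_normal((4, 16))

sparse_msg = top_k_sparsify(embedding, k=16)          # ~75% of entries zeroed
quantized_msg = uniform_quantize(embedding, num_bits=4)

print("nonzeros after top-k:", np.count_nonzero(sparse_msg))
print("max quantization error:", np.abs(quantized_msg - embedding).max())
```

In C-VFL, each party would apply a compressor of this kind to the intermediate results it shares with the server; per the abstract, the convergence analysis requires the resulting compression error to remain bounded over the course of training.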
Supplementary Material: zip