Publications

WORKSHOP (INTERNATIONAL) Accelerating Differentially Private Federated Learning via Adaptive Extrapolation

Shokichi Takakura, Seng Pei Liew, Satoshi Hasegawa

Will Synthetic Data Finally Solve the Data Access Problem? Workshop at ICLR 2025 (ICLR 2025)

April 28, 2025

The federated learning (FL) framework enables multiple clients to collaboratively train machine learning models without sharing their raw data, but it remains vulnerable to privacy attacks. One promising approach is to incorporate differential privacy (DP)—a formal notion of privacy—into the FL framework. DP-FedAvg is one of the most popular algorithms for DP-FL, but it is known to suffer from slow convergence in the presence of heterogeneity among clients' data. Most existing methods to accelerate DP-FL require 1) additional hyperparameters or 2) additional computational cost for clients. Neither is desirable: hyperparameter tuning is computationally expensive and a data-dependent choice of hyperparameters raises the risk of privacy leakage, while clients are often resource-constrained. To address this issue, we propose DP-FedEXP, which adaptively selects the global step size based on the diversity of the local updates, without requiring any additional hyperparameters or client-side computation. We show that DP-FedEXP provably accelerates the convergence of DP-FedAvg, and that it empirically outperforms existing methods tailored for DP-FL.
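The abstract does not spell out the exact update rule, so the following is only a rough sketch of the general idea. Assuming DP-FedEXP follows a FedExP-style extrapolation (an adaptive server step size that grows when local updates are diverse, i.e., when individual update norms are large relative to the norm of their average) applied to clipped, noised updates, the server-side computation might look like this; all function names and constants here are illustrative, not the paper's actual implementation:

```python
import numpy as np

def clip(update, C):
    """Standard DP-FL clipping: rescale an update so its L2 norm is at most C."""
    norm = np.linalg.norm(update)
    return update * min(1.0, C / norm) if norm > 0 else update

def adaptive_step_size(clipped_updates, noisy_mean, eps=1e-3):
    """FedExP-style extrapolated global step size (illustrative):
    eta = max(1, avg_i ||Delta_i||^2 / (2 ||mean Delta||^2 + eps)).
    When clients disagree, the mean update is short relative to the
    individual updates, so eta > 1 and the server takes a larger step."""
    num = np.mean([np.linalg.norm(d) ** 2 for d in clipped_updates])
    den = 2 * np.linalg.norm(noisy_mean) ** 2 + eps
    return max(1.0, num / den)

# Toy round: 4 clients, clipping norm C = 1, Gaussian noise on the mean.
rng = np.random.default_rng(0)
C, sigma, M, d = 1.0, 0.1, 4, 10
updates = [clip(rng.normal(size=d), C) for _ in range(M)]
noisy_mean = np.mean(updates, axis=0) + (sigma * C / M) * rng.normal(size=d)
eta = adaptive_step_size(updates, noisy_mean)
# Server update: new_global = global + eta * noisy_mean
```

Note that every quantity used to set the step size is either already released under DP (the noisy mean) or bounded by the clipping norm, which is consistent with the claim that no extra hyperparameters or client-side computation are needed; the paper's precise rule and its privacy accounting may differ from this sketch.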

Paper : Accelerating Differentially Private Federated Learning via Adaptive Extrapolation (external link)