Publications

CONFERENCE (INTERNATIONAL)

FedDuA: Doubly Adaptive Federated Learning

Shokichi Takakura, Seng Pei Liew, Satoshi Hasegawa

The 29th International Conference on Artificial Intelligence and Statistics (AISTATS 2026)

May 02, 2026

Federated learning (FL) is a distributed learning framework where clients collaboratively train a global model without sharing their raw data.
FedAvg is a popular algorithm for FL, but it often suffers from slow convergence due to the heterogeneity of local datasets and anisotropy in the parameter space.
In this work, we formalize the central server optimization procedure through the lens of mirror descent and propose a novel framework, called FedDuA, which adaptively selects the global learning rate based on both inter-client and coordinate-wise heterogeneity in the local updates.
We prove that our proposed doubly adaptive step-size rule is minimax optimal and provide a convergence analysis for convex objectives.
Although the proposed method does not require additional communication or computational cost on clients, extensive numerical experiments show that our proposed framework outperforms baselines in various settings and is robust to the choice of hyperparameters.
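To make the "doubly adaptive" idea concrete, here is a minimal illustrative sketch of a federated server step that adapts the update both per coordinate and globally across clients. This is an assumption-laden toy, not the paper's actual FedDuA rule: the function name, the second-moment coordinate scaling, and the disagreement-based global factor are all placeholders invented for illustration.

```python
import numpy as np

def doubly_adaptive_server_step(global_params, client_updates,
                                base_lr=1.0, eps=1e-8):
    """Toy server update (NOT the paper's algorithm): precondition the
    averaged client update per coordinate and shrink the global step
    when client updates disagree."""
    updates = np.stack(client_updates)            # shape: (num_clients, dim)
    mean_update = updates.mean(axis=0)
    # Coordinate-wise scale: root-mean-square of updates across clients,
    # so large-magnitude coordinates take proportionally smaller steps.
    coord_scale = np.sqrt((updates ** 2).mean(axis=0)) + eps
    # Global scale: relative inter-client disagreement; high variance
    # among clients yields a smaller global learning rate.
    disagreement = np.linalg.norm(updates - mean_update) / (np.linalg.norm(updates) + eps)
    global_lr = base_lr * max(0.0, 1.0 - disagreement)
    return global_params + global_lr * mean_update / coord_scale
```

The sketch only conveys the shape of the mechanism described in the abstract: no extra client-side communication or computation is needed, since both scales are computed on the server from the local updates it already receives.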

Paper: FedDuA: Doubly Adaptive Federated Learning (external link)