Federated Learning based on Consensus ADMM without drift

Abstract

As phones and tablets become the primary computing devices and privacy regulations grow stronger, federated learning is gaining increasing recognition from academic researchers and industrial practitioners. Many researchers have tackled the federated learning problem since McMahan et al. proposed the FedAvg algorithm. However, it remains a challenging research area because of the heterogeneous data massively distributed across clients. This work investigates the federated learning problem through the lens of consensus ADMM, which was previously considered inapplicable to federated learning. First, we show that averaging the dual variables in global updates can cause a drift of the global model. Then we propose a novel dual-based federated optimization algorithm that removes the dual-variable drift while inheriting the superior convergence properties of ADMM. Finally, we demonstrate that the proposed method shows outstanding empirical performance over competing methods across various levels of data heterogeneity and client participation.
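For context, here is a minimal sketch of the standard consensus ADMM updates that this line of work builds on; the notation (f_i for client i's local objective, z for the global model, y_i for the dual variable, rho for the penalty parameter) is a textbook convention assumed here, not the paper's own algorithm:

```latex
% Consensus formulation: minimize \sum_i f_i(x_i) subject to x_i = z for all clients i.
% Augmented Lagrangian with dual variables y_i and penalty parameter \rho:
\[
  \mathcal{L}_\rho(\{x_i\}, z, \{y_i\})
    = \sum_{i=1}^{N} \Big( f_i(x_i) + y_i^\top (x_i - z)
      + \tfrac{\rho}{2} \lVert x_i - z \rVert^2 \Big)
\]
% One round of consensus ADMM:
\begin{align*}
  x_i^{t+1} &= \arg\min_{x_i} \; f_i(x_i) + (y_i^{t})^\top x_i
               + \tfrac{\rho}{2} \lVert x_i - z^{t} \rVert^2
               && \text{(local update on client } i\text{)} \\
  z^{t+1}   &= \frac{1}{N} \sum_{i=1}^{N} \Big( x_i^{t+1} + \tfrac{1}{\rho}\, y_i^{t} \Big)
               && \text{(global update: averages the dual variables)} \\
  y_i^{t+1} &= y_i^{t} + \rho \, \big( x_i^{t+1} - z^{t+1} \big)
               && \text{(dual update on client } i\text{)}
\end{align*}
```

Note that the global z-update averages the dual variables y_i alongside the local models. Under full participation the duals sum to zero after the first round, so this average vanishes; with partial client participation that cancellation can fail, which is one plausible reading of the dual-variable drift the abstract refers to.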