Federated Learning based on Consensus ADMM without drift
- Keywords: federated learning, distributed optimization, consensus ADMM, heterogeneity
- Institution: Sogang University, Graduate School
- Advisor: 김홍석
- Year of publication: 2022
- Degree conferred: February 2022
- Degree: Master's
- Department and major: Graduate School, Department of Electronic Engineering
- URI: http://www.dcollection.net/handler/sogang/000000066529
- UCI: I804:11029-000000066529
- Language of text: English
- Copyright: Sogang University theses are protected by copyright.
Abstract
As phones and tablets become the primary computing devices and privacy regulations grow stricter, federated learning has gained increasing recognition from academic researchers and industrial practitioners. Many researchers have tackled the federated learning problem since McMahan et al. proposed the FedAvg algorithm. However, it remains a challenging research area because it must handle heterogeneous data massively distributed across clients. This work investigates the federated learning problem based on consensus ADMM, which was previously considered inapplicable to federated learning. First, we show that averaging the dual variables in the global update can cause a drift of the global model. Then we propose a novel dual-based federated optimization algorithm that removes the dual-variable drift while inheriting the superior convergence properties of ADMM. Finally, we demonstrate that the proposed method achieves outstanding empirical performance over competing methods at various levels of data heterogeneity and client participation.
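For context, a minimal sketch of the standard consensus ADMM iteration (as in Boyd et al.) is given below, not the thesis's proposed algorithm. It assumes the global consensus problem $\min_{\{x_i\}, z} \sum_{i=1}^{N} f_i(x_i)$ subject to $x_i = z$, where $f_i$ is client $i$'s local loss, $z$ is the global model, $y_i$ is the dual variable of client $i$, and $\rho > 0$ is the penalty parameter; this notation is illustrative and may differ from the thesis's.

\[
\begin{aligned}
x_i^{k+1} &= \operatorname*{arg\,min}_{x_i}\; f_i(x_i) + \langle y_i^{k},\, x_i - z^{k} \rangle + \tfrac{\rho}{2}\,\lVert x_i - z^{k} \rVert^2, \\
z^{k+1} &= \frac{1}{N} \sum_{i=1}^{N} \Bigl( x_i^{k+1} + \tfrac{1}{\rho}\, y_i^{k} \Bigr), \\
y_i^{k+1} &= y_i^{k} + \rho\,\bigl( x_i^{k+1} - z^{k+1} \bigr).
\end{aligned}
\]

Note that the global $z$-update averages the scaled dual variables $y_i^{k}/\rho$ together with the local models; this dual averaging in the global step is the mechanism the abstract points to as a potential source of global-model drift.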

