Federated learning is a distributed learning framework that enables multi-institutional collaborations on decentralized data with improved protection for each collaborator’s data privacy. This paper proposes a communication-efficient scheme for decentralized federated learning called ProxyFL, which eliminates a significant limitation of canonical federated learning by allowing model heterogeneity and providing stronger privacy guarantees. Experiments on popular image datasets and a cancer diagnostic problem using high-quality gigapixel histology whole slide images show that ProxyFL can outperform existing alternatives with much less communication overhead and stronger privacy.
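The core idea suggested by the name — each participant keeps a private model to itself and communicates only a small proxy model to its peers — can be illustrated with a minimal sketch. The class names, the ring-based exchange, and the averaging step standing in for mutual distillation are all illustrative assumptions here, not the paper's exact algorithm:

```python
import numpy as np

class Client:
    """One participant: a private model (any architecture/size the
    institution prefers) and a proxy model with a shared shape."""

    def __init__(self, private_dim, proxy_dim, data):
        self.w_private = np.zeros(private_dim)  # never leaves the client
        self.w_proxy = np.zeros(proxy_dim)      # the only thing communicated
        self.X, self.y = data                   # local data, X: (n, proxy_dim)

    def local_step(self, lr=0.1):
        # One gradient step on the proxy using local data (least squares);
        # in practice this is where DP noise could be added for privacy.
        grad = self.X.T @ (self.X @ self.w_proxy - self.y) / len(self.y)
        self.w_proxy -= lr * grad

    def absorb(self, neighbor_proxy, alpha=0.5):
        # Hypothetical stand-in for mutual distillation: pull the proxy
        # toward the neighbor's proxy. The private model can have a
        # completely different shape, enabling model heterogeneity.
        self.w_proxy = (1 - alpha) * self.w_proxy + alpha * neighbor_proxy


def round_robin(clients):
    # One decentralized communication round: each client sends its proxy
    # to the next client in a directed ring (no central server), then
    # everyone absorbs the proxy it received.
    outgoing = [c.w_proxy.copy() for c in clients]
    for i, c in enumerate(clients):
        c.absorb(outgoing[(i - 1) % len(clients)])
```

Because only the fixed-size proxy crosses institutional boundaries, the communication cost per round is independent of each participant's private model, which is the source of the communication savings the abstract refers to.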