Federated learning is a distributed learning framework that enables multi-institutional collaboration on decentralized data with improved protection of each collaborator's data privacy. This paper proposes a communication-efficient scheme for decentralized federated learning called ProxyFL, which removes a significant limitation of canonical federated learning by allowing model heterogeneity, while also providing stronger privacy guarantees. Experiments on popular image datasets and a cancer diagnosis problem using high-quality gigapixel histology whole-slide images show that ProxyFL can outperform existing alternatives with far less communication overhead and stronger privacy.