TY - JOUR
T1 - CEEP-FL: A comprehensive approach for communication efficiency and enhanced privacy in federated learning
T2 - Applied Soft Computing
AU - Asad, Muhammad
AU - Moustafa, Ahmed
AU - Aslam, Muhammad
N1 - Publisher Copyright:
© 2021 Elsevier B.V.
PY - 2021/6
Y1 - 2021/6
N2 - Federated Learning (FL) is an emerging technique for collaboratively training machine learning models on distributed data under privacy constraints. However, recent studies have shown that FL consumes substantial communication resources during global model updates. In addition, participants’ private data can be compromised by exploiting the parameters shared when local gradient updates are uploaded to the central cloud server, which hinders the wide adoption of FL. To address these challenges, in this paper we propose a comprehensive FL approach, namely Communication Efficient and Enhanced Privacy (CEEP-FL). The proposed approach simultaneously aims to: (1) minimize the communication cost, (2) protect data from being compromised, and (3) maximize the global learning accuracy. To minimize the communication cost, we first apply a novel filtering mechanism to each local gradient update and upload only the important gradients. Then, we apply a Non-Interactive Zero-Knowledge Proof-based Homomorphic Cryptosystem (NIZKP-HC) to protect those local gradient updates while maintaining robustness in the network. Finally, we use Distributed Selective Stochastic Gradient Descent (DSSGD) optimization to minimize the computational cost and maximize the global learning accuracy. Experimental results on commonly used FL datasets demonstrate that CEEP-FL clearly outperforms existing approaches.
AB - Federated Learning (FL) is an emerging technique for collaboratively training machine learning models on distributed data under privacy constraints. However, recent studies have shown that FL consumes substantial communication resources during global model updates. In addition, participants’ private data can be compromised by exploiting the parameters shared when local gradient updates are uploaded to the central cloud server, which hinders the wide adoption of FL. To address these challenges, in this paper we propose a comprehensive FL approach, namely Communication Efficient and Enhanced Privacy (CEEP-FL). The proposed approach simultaneously aims to: (1) minimize the communication cost, (2) protect data from being compromised, and (3) maximize the global learning accuracy. To minimize the communication cost, we first apply a novel filtering mechanism to each local gradient update and upload only the important gradients. Then, we apply a Non-Interactive Zero-Knowledge Proof-based Homomorphic Cryptosystem (NIZKP-HC) to protect those local gradient updates while maintaining robustness in the network. Finally, we use Distributed Selective Stochastic Gradient Descent (DSSGD) optimization to minimize the computational cost and maximize the global learning accuracy. Experimental results on commonly used FL datasets demonstrate that CEEP-FL clearly outperforms existing approaches.
KW - Communication efficient
KW - Federated learning
KW - Privacy preserving
KW - Zero-knowledge proof
UR - http://www.scopus.com/inward/record.url?scp=85102043955&partnerID=8YFLogxK
U2 - 10.1016/j.asoc.2021.107235
DO - 10.1016/j.asoc.2021.107235
M3 - Article
AN - SCOPUS:85102043955
SN - 1568-4946
VL - 104
JO - Applied Soft Computing
JF - Applied Soft Computing
M1 - 107235
ER -