TY - GEN
T1 - Evaluating the Communication Efficiency in Federated Learning Algorithms
AU - Asad, Muhammad
AU - Moustafa, Ahmed
AU - Ito, Takayuki
AU - Aslam, Muhammad
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2021/5/5
Y1 - 2021/5/5
N2 - In the era of advanced technologies, mobile devices are equipped with computing and sensing capabilities that gather vast amounts of data. Such data are suitable for training various learning models. Combined with advancements in Deep Learning (DL), these learning models empower numerous useful applications, e.g., image processing, speech recognition, healthcare, vehicular networks, and many more. Traditionally, Machine Learning (ML) approaches require data to be centralised in cloud-based data-centres. However, these data are often large in quantity and privacy-sensitive, which prevents uploading them to such data-centres for training the learning models. In turn, this results in critical issues of high latency and communication inefficiency. Recently, in light of new privacy legislation in many countries, the concept of Federated Learning (FL) has been introduced. In FL, mobile users collaboratively learn a global model by aggregating their local models without sharing their privacy-sensitive data. Usually, these mobile users have slow network connections to the data-centre where the global model is maintained. Moreover, complicated, large-scale networks involve heterogeneous devices with various energy constraints. This raises the challenge of communication cost when implementing FL at a large scale. To this end, in this research, we begin with the fundamentals of FL, then highlight recent FL algorithms and evaluate their communication efficiency with detailed comparisons. Furthermore, we propose a set of solutions to alleviate the existing FL problems from both a communication perspective and a privacy perspective.
AB - In the era of advanced technologies, mobile devices are equipped with computing and sensing capabilities that gather vast amounts of data. Such data are suitable for training various learning models. Combined with advancements in Deep Learning (DL), these learning models empower numerous useful applications, e.g., image processing, speech recognition, healthcare, vehicular networks, and many more. Traditionally, Machine Learning (ML) approaches require data to be centralised in cloud-based data-centres. However, these data are often large in quantity and privacy-sensitive, which prevents uploading them to such data-centres for training the learning models. In turn, this results in critical issues of high latency and communication inefficiency. Recently, in light of new privacy legislation in many countries, the concept of Federated Learning (FL) has been introduced. In FL, mobile users collaboratively learn a global model by aggregating their local models without sharing their privacy-sensitive data. Usually, these mobile users have slow network connections to the data-centre where the global model is maintained. Moreover, complicated, large-scale networks involve heterogeneous devices with various energy constraints. This raises the challenge of communication cost when implementing FL at a large scale. To this end, in this research, we begin with the fundamentals of FL, then highlight recent FL algorithms and evaluate their communication efficiency with detailed comparisons. Furthermore, we propose a set of solutions to alleviate the existing FL problems from both a communication perspective and a privacy perspective.
KW - Collaborative Learning
KW - Communication Cost
KW - Decentralised Data
KW - Federated Learning
UR - http://www.scopus.com/inward/record.url?scp=85107821087&partnerID=8YFLogxK
U2 - 10.1109/CSCWD49262.2021.9437738
DO - 10.1109/CSCWD49262.2021.9437738
M3 - Conference contribution
AN - SCOPUS:85107821087
T3 - Proceedings of the 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design, CSCWD 2021
SP - 552
EP - 557
BT - Proceedings of the 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design, CSCWD 2021
A2 - Shen, Weiming
A2 - Barthes, Jean-Paul
A2 - Luo, Junzhou
A2 - Shi, Yanjun
A2 - Zhang, Jinghui
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 24th IEEE International Conference on Computer Supported Cooperative Work in Design, CSCWD 2021
Y2 - 5 May 2021 through 7 May 2021
ER -