Abstract
Federated Learning (FL) is a new Machine Learning (ML) paradigm that enables on-device computation by training on decentralized data. However, traditional FL algorithms impose strict requirements on client selection and the selection ratio, and training becomes inefficient when a client's computational resources are limited. We therefore aim to extend FL into a decentralized learning framework that works efficiently with heterogeneous clients in practical industrial scenarios. To this end, we propose the Clients' Eligibility Protocol (CEP), a resource-aware FL solution for heterogeneous environments. CEP places a Trusted Authority (TA) between the clients and the cloud server; the TA calculates each client's eligibility score from local computing resources such as bandwidth, memory, and battery life, and selects the most resourceful clients for training. If a client responds slowly or submits an incorrect model, the TA declares that client ineligible for future training. In addition, CEP leverages an asynchronous FL model, which avoids long delays caused by slow client responses. Empirical results show that CEP gains the benefits of resource-aware client selection, achieving 88% and 93% accuracy with AlexNet and LeNet, respectively.
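The abstract does not specify the eligibility-scoring formula, so the following minimal Python sketch only illustrates the general idea of TA-side, resource-aware client selection: a hypothetical weighted score over the listed resources (bandwidth, memory, battery life), a blacklist for clients previously flagged as slow or as having submitted an incorrect model, and a top-k pick of the most resourceful clients. All names, weights, and caps here are illustrative assumptions, not the paper's actual protocol.

```python
from dataclasses import dataclass

@dataclass
class ClientResources:
    """Locally reported resources used for the eligibility score
    (the abstract lists bandwidth, memory, and battery life)."""
    bandwidth_mbps: float
    memory_gb: float
    battery_pct: float

def eligibility_score(r: ClientResources,
                      weights=(0.4, 0.3, 0.3),
                      caps=(100.0, 16.0, 100.0)) -> float:
    """Hypothetical weighted score in [0, 1]; weights and caps are illustrative."""
    w_bw, w_mem, w_bat = weights
    c_bw, c_mem, c_bat = caps
    return (w_bw * min(r.bandwidth_mbps / c_bw, 1.0)
            + w_mem * min(r.memory_gb / c_mem, 1.0)
            + w_bat * min(r.battery_pct / c_bat, 1.0))

def select_clients(reports: dict, blacklist: set, k: int) -> list:
    """TA-side selection: rank non-blacklisted clients by eligibility score
    and keep the top-k most resourceful ones for the next training round."""
    ranked = sorted(
        ((cid, eligibility_score(r)) for cid, r in reports.items()
         if cid not in blacklist),
        key=lambda item: item[1], reverse=True)
    return [cid for cid, _ in ranked[:k]]

# Example: client "c2" was blacklisted earlier (e.g. for submitting an
# incorrect model), so only "c1" and "c3" compete for the k = 2 slots.
reports = {
    "c1": ClientResources(50.0, 8.0, 90.0),
    "c2": ClientResources(80.0, 4.0, 20.0),
    "c3": ClientResources(10.0, 2.0, 40.0),
}
print(select_clients(reports, blacklist={"c2"}, k=2))  # -> ['c1', 'c3']
```

In this sketch the blacklist stands in for the TA's declaration of ineligibility; how the real protocol detects slow or incorrect updates, and how it interleaves with the asynchronous aggregation, is described in the paper itself rather than here.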
| Original language | English |
| --- | --- |
| Pages (from-to) | 1140-1145 |
| Number of pages | 6 |
| Journal | Proceedings - IEEE Global Communications Conference, GLOBECOM |
| DOIs | |
| Publication status | Published - 2022 |
| Event | 2022 IEEE Global Communications Conference, GLOBECOM 2022 - Virtual, Online, Brazil (Duration: 4 Dec 2022 – 8 Dec 2022) |
ASJC Scopus subject areas
- Artificial Intelligence
- Computer Networks and Communications
- Hardware and Architecture
- Signal Processing
Keywords
- Client Selection
- Clients' Eligibility Protocol
- Federated Learning (FL)
- Resource Awareness