
Communication Efficient Federated Learning for Wireless Networks

  • Posted by: literator
  • Date: 21-02-2024, 04:43
Title: Communication Efficient Federated Learning for Wireless Networks
Author: Mingzhe Chen, Shuguang Cui
Publisher: Springer
Series: Wireless Networks
Year: 2024
Pages: 189
Language: English
Format: pdf (true), epub
Size: 24.2 MB

This book provides a comprehensive study of Federated Learning (FL) over wireless networks. It consists of three main parts: (a) fundamentals and preliminaries of FL, (b) analysis and optimization of FL over wireless networks, and (c) applications of wireless FL for Internet-of-Things systems. In particular, in the first part, the authors provide a detailed overview of the widely studied FL framework. In the second part, the authors comprehensively discuss three key wireless techniques, including wireless resource management, quantization, and over-the-air computation, to support the deployment of FL over realistic wireless networks. This part also presents several solutions based on optimization theory, graph theory, and machine learning to optimize the performance of FL over wireless networks. In the third part, the authors introduce the use of wireless FL algorithms for autonomous vehicle control and mobile edge computing optimization.
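To give a flavor of the quantization theme mentioned above, the following is a minimal illustrative sketch (not taken from the book) of uniform 8-bit quantization applied to a model update before uplink transmission, which shrinks the payload from 32-bit floats to one byte per parameter plus two scalars; the function names and parameters are hypothetical.

```python
# Illustrative sketch (not from the book): uniform 8-bit quantization of a
# model update before wireless transmission to reduce communication cost.
import numpy as np

def quantize(update: np.ndarray, bits: int = 8):
    """Map float update values onto 2**bits uniform levels."""
    lo, hi = float(update.min()), float(update.max())
    scale = (hi - lo) / (2**bits - 1) if hi > lo else 1.0
    q = np.round((update - lo) / scale).astype(np.uint8)
    return q, lo, scale  # the device sends q plus the two scalars

def dequantize(q: np.ndarray, lo: float, scale: float) -> np.ndarray:
    """Server-side reconstruction of the (approximate) update."""
    return q.astype(np.float32) * scale + lo

update = np.random.randn(10_000).astype(np.float32)   # dummy model update
q, lo, scale = quantize(update)
recovered = dequantize(q, lo, scale)
print("payload shrink: %.1fx, max error: %.4f"
      % (update.nbytes / q.nbytes, np.abs(update - recovered).max()))
```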

Machine Learning and data-driven approaches have recently received considerable attention as key enablers for next-generation intelligent networks. Currently, most existing learning solutions for wireless networks rely on centralizing the training and inference processes by uploading data generated at edge devices to data centers. However, such a centralized paradigm may lead to privacy leakage, violate the latency constraints of mobile applications, or be infeasible due to the limited bandwidth or power of edge devices. To address these issues, distributing Machine Learning at the network edge provides a promising solution, where edge devices collaboratively train a shared model using mobile data generated in real time. Avoiding data uploads to a central server not only helps preserve privacy but also reduces network traffic congestion and communication cost. Federated Learning (FL) is one of the most important distributed learning algorithms. In particular, FL enables devices to train a shared Machine Learning model while keeping their data locally. However, in FL, training machine learning models requires communication between wireless devices and edge servers over wireless links. Therefore, wireless impairments such as noise, interference, and uncertainty in wireless channel states will significantly affect the training process and performance of FL. For example, transmission delay can significantly impact the convergence time of FL algorithms. Consequently, it is necessary to optimize wireless network performance for the implementation of FL algorithms.
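The basic training loop described above can be illustrated with a minimal sketch of one federated averaging (FedAvg) round. This is not the book's algorithm or code; the model is a plain NumPy weight vector, the local least-squares update and client setup are simplified placeholders, and all names are hypothetical.

```python
# Minimal FedAvg sketch: clients train locally on their own data and only
# model weights travel over the (wireless) link; the server aggregates them.
import numpy as np

def local_update(global_weights, client_data, lr=0.01, epochs=1):
    """Hypothetical local training step on one device; returns updated
    weights and the local sample count (raw data never leaves the device)."""
    w = global_weights.copy()
    X, y = client_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient as a stand-in
        w -= lr * grad
    return w, len(y)

def fedavg_round(global_weights, clients):
    """Server-side aggregation, weighted by each client's dataset size."""
    results = [local_update(global_weights, data) for data in clients]
    total = sum(n for _, n in results)
    return sum(w * (n / total) for w, n in results)

rng = np.random.default_rng(0)
d = 5
clients = [(rng.normal(size=(50, d)), rng.normal(size=50)) for _ in range(4)]
w_global = np.zeros(d)
for t in range(10):                          # ten communication rounds
    w_global = fedavg_round(w_global, clients)
print("global model after 10 rounds:", w_global)
```

In a wireless deployment, each call to fedavg_round corresponds to one uplink/downlink exchange, which is why transmission delay and channel impairments directly shape the convergence time discussed above.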

On the other hand, FL can also be used to solve wireless communication problems and optimize network performance. For example, FL endows edge devices with the capabilities of user behavior prediction, user identification, and wireless environment analysis. Moreover, federated reinforcement learning leverages distributed computation power and data to solve complex convex and nonconvex optimization problems that arise in various use cases, such as network control, user clustering, resource management, and interference alignment. In addition, FL traditionally makes the idealized assumption that edge devices will unconditionally participate in training tasks when invited, which is not practical in reality due to the resource costs of model training and devices' limited willingness to contribute. Therefore, building incentive mechanisms is indispensable for FL networks.

Download Communication Efficient Federated Learning for Wireless Networks
