
Energy Efficiency and Robustness of Advanced Machine Learning Architectures: A Cross-Layer Approach

  • Added by: literator
  • Date: 13-09-2024, 19:20
  • Comments: 0
Title: Energy Efficiency and Robustness of Advanced Machine Learning Architectures: A Cross-Layer Approach
Author: Alberto Marchisio, Muhammad Shafique
Publisher: CRC Press
Series: Chapman & Hall/CRC Artificial Intelligence and Robotics Series
Year: 2025
Pages: 361
Language: English
Format: pdf (true), epub
Size: 51.9 MB

Machine Learning (ML) algorithms have reached high levels of accuracy, and ML-based applications are widely used in many systems and platforms. However, developing efficient ML-based systems requires addressing three problems: energy efficiency, robustness, and the fact that existing techniques typically optimize for a single objective and thus cover only a limited set of goals. This book tackles these challenges by exploiting the unique features of advanced ML models and investigates cross-layer concepts and techniques that combine hardware-level and software-level methods to build robust and energy-efficient architectures for these advanced ML networks. More specifically, the book improves the energy efficiency of complex models such as CapsNets through a specialized flow of hardware-level designs and software-level optimizations that exploit application-driven knowledge of these systems and their error tolerance via approximations and quantization. It also improves the robustness of ML models, in particular SNNs executed on neuromorphic hardware, which are attractive for their inherently cost-effective features. Finally, it integrates multiple optimization objectives into specialized frameworks that jointly optimize the robustness and energy efficiency of these systems. This is an important resource for students and researchers in computer and electrical engineering who are interested in developing energy-efficient and robust ML.
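
The blurb above points to approximations and quantization as the main software-level levers for energy efficiency. As a rough, self-contained illustration of the general idea (not code from the book; function names are hypothetical), the following NumPy sketch applies uniform symmetric post-training quantization to a weight tensor and reports the approximation error it introduces:

```python
import numpy as np

def quantize_uniform(weights: np.ndarray, num_bits: int = 8):
    """Uniformly quantize a float tensor to signed integers (symmetric scheme)."""
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.max(np.abs(weights)) / qmax    # one scale for the whole tensor
    q = np.clip(np.round(weights / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float tensor from the integer representation."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(64, 64).astype(np.float32)
    q, s = quantize_uniform(w, num_bits=8)
    err = np.abs(w - dequantize(q, s)).mean()
    print(f"mean absolute quantization error: {err:.6f}")
```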

Among Machine Learning (ML) systems, Deep Neural Networks (DNNs) have emerged as an established milestone for several applications, such as computer vision, medicine, finance, and robotics. This has led to the need to deploy DNN inference workloads across various devices, including embedded systems with constrained resources. However, current trends in the ML community point in the opposite direction, since newer networks tend to be deeper and more complex. For instance, Capsule Networks (CapsNets) are a particular type of DNN built from capsules, i.e., arrays of neurons, that learn high-level features with better capabilities than traditional DNNs. As a result, the next generation of computing platforms executing advanced DNNs will exhibit high complexity and energy consumption, challenging their feasible implementation in resource-constrained devices.
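
For readers unfamiliar with capsules, the following minimal NumPy sketch shows the squash nonlinearity from the original CapsNet formulation, which maps each capsule's activity vector to a length in [0, 1) while preserving its orientation; it is an illustrative example only and is not taken from the book:

```python
import numpy as np

def squash(vectors: np.ndarray, axis: int = -1, eps: float = 1e-8) -> np.ndarray:
    """CapsNet squash nonlinearity: keeps each capsule's orientation but
    maps its length into [0, 1) so it can be read as a probability."""
    sq_norm = np.sum(vectors ** 2, axis=axis, keepdims=True)
    scale = sq_norm / (1.0 + sq_norm) / np.sqrt(sq_norm + eps)
    return scale * vectors

# 10 capsules of dimension 16, e.g. one per output class
capsules = np.random.randn(10, 16)
out = squash(capsules)
print(np.linalg.norm(out, axis=-1))  # all capsule lengths now lie in [0, 1)
```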

On the other hand, Spiking Neural Networks (SNNs) have emerged as an efficient computational infrastructure for event-based processing, which represents a closer match to our current understanding of how the human brain functions. This has led to the development of the neuromorphic computing paradigm, whose hardware architectures support the execution of energy-efficient event-based SNNs.
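
As a toy illustration of the event-based computation mentioned here (again, not code from the book, and with arbitrary parameter values), the sketch below simulates a single leaky integrate-and-fire (LIF) neuron, the basic building block used by many SNN and neuromorphic platforms:

```python
import numpy as np

def lif_neuron(input_current: np.ndarray, tau: float = 20.0,
               v_thresh: float = 1.0, v_reset: float = 0.0, dt: float = 1.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential leaks toward zero, integrates the input current,
    and emits a binary spike whenever it crosses the threshold."""
    v = 0.0
    spikes = np.zeros_like(input_current)
    for t, i_t in enumerate(input_current):
        v += (-v + i_t) * (dt / tau)   # leaky integration of the input
        if v >= v_thresh:
            spikes[t] = 1.0            # emit a spike ...
            v = v_reset                # ... and reset the membrane potential
    return spikes

current = 1.5 * np.ones(200)           # constant input for 200 time steps
print(int(lif_neuron(current).sum()), "spikes emitted")
```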

Another fundamental aspect to consider when deploying advanced Deep Learning (DL) architectures is security. When dealing with safety-critical applications, the system requires high robustness against various vulnerability threats. An adversary can threaten the integrity of a DL system through attacks at different levels of the hardware and software stacks, perturbing the inputs, the memory, or the computational engine. As a result, defensive countermeasures must be applied at different abstraction layers of the system, which typically incurs energy and computation overhead. Moreover, while the security of traditional DNNs has been extensively studied, investigating the security of advanced DL systems offers unique opportunities to exploit their peculiar features.
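
To make the notion of input perturbation concrete, here is a minimal sketch of the classic Fast Gradient Sign Method (FGSM) attack applied to a toy logistic model. The book targets advanced architectures such as SNNs and CapsNets; this example only illustrates the general mechanism, and all names and values are illustrative:

```python
import numpy as np

def fgsm_perturb(x: np.ndarray, w: np.ndarray, b: float, y: int,
                 epsilon: float = 0.1) -> np.ndarray:
    """FGSM on a toy logistic-regression 'model'.

    For the cross-entropy loss of p = sigmoid(w.x + b), the gradient w.r.t.
    the input is (p - y) * w, so the attack steps each input feature by
    +/- epsilon in the direction that increases the loss."""
    p = 1.0 / (1.0 + np.exp(-(w @ x + b)))   # model's predicted probability
    grad_x = (p - y) * w                      # d(loss)/dx for cross-entropy
    return x + epsilon * np.sign(grad_x)

rng = np.random.default_rng(0)
w, b = rng.standard_normal(20), 0.0
x, y = rng.standard_normal(20), 1
x_adv = fgsm_perturb(x, w, b, y, epsilon=0.2)
for name, inp in [("clean", x), ("adversarial", x_adv)]:
    print(name, "score:", float(w @ inp + b))
```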

Download Energy Efficiency and Robustness of Advanced Machine Learning Architectures: A Cross-Layer Approach
