- Added by: literator
- Date: 3-06-2024, 19:05
- Comments: 0
Title: Building Transformer Models with PyTorch 2.0: NLP, computer vision, and speech processing with PyTorch and Hugging Face
Author: Prem Timsina
Publisher: BPB Publications
Year: 2024
Pages: 310
Language: English
Format: PDF, EPUB (true)
Size: 12.7 MB
Your key to transformer-based NLP, vision, speech, and multimodal applications. This book covers the transformer architecture for a range of applications, including NLP, computer vision, speech processing, and predictive modeling with tabular data, and is a valuable resource for anyone looking to harness the power of transformers in their machine learning projects.

The book provides a step-by-step guide to building transformer models from scratch and to fine-tuning pre-trained open-source models. It explores foundational model architectures, including GPT, ViT, Whisper, TabTransformer, and Stable Diffusion, along with the core principles for solving various problems with transformers. It also covers transfer learning, model training, and fine-tuning, and discusses how to utilize recent models from Hugging Face. Additionally, it explores advanced topics such as model benchmarking, multimodal learning, reinforcement learning, and deploying and serving transformer models.

The book offers both a theoretical and a practical understanding of the transformer architecture, covering the following ML tasks: natural language processing (NLP), computer vision, speech processing, tabular data processing, reinforcement learning, and multimodality. The prerequisite is a basic understanding of PyTorch and deep learning. The book will benefit data scientists and ML engineers seeking to deepen their knowledge of transformer models and to develop ML engines using the transformer architecture and Hugging Face's transformers library, as well as developers and software architects looking to integrate transformer-based models into their existing software products.
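To give a flavor of the kind of workflow the book describes (loading a pre-trained model from Hugging Face and running it in PyTorch), here is a minimal sketch. The checkpoint name "distilbert-base-uncased" is only an illustrative choice and is not taken from the book.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load a pre-trained checkpoint and its tokenizer from the Hugging Face Hub.
# "distilbert-base-uncased" is used here purely as a small, widely available example.
model_name = "distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Tokenize a sample sentence and run a forward pass (inference only).
inputs = tokenizer("Transformers handle text, vision, and speech.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

print(logits.shape)  # torch.Size([1, 2]): one score per label
```

From a starting point like this, fine-tuning typically amounts to wrapping the model in a standard PyTorch training loop (or the Trainer API) over a labeled dataset.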