Title: Learning PyTorch 2.0: Experiment Deep Learning from basics to complex models using every potential capability of Pythonic PyTorch Author: Matthew Rosch Publisher: GitforGits Year: 2023 Pages: 321 Language: English Format: pdf, azw3, epub, mobi Size: 10.1 MB
This book is a comprehensive guide to understanding and using PyTorch 2.0 for deep learning applications. It starts with an introduction to PyTorch, its advantages over other deep learning frameworks, and how it pairs with CUDA for GPU acceleration. It then turns to the heart of PyTorch: tensors, covering their types, properties, and operations. Through step-by-step examples, the reader learns to perform basic arithmetic on tensors, manipulate them, and understand errors related to tensor shapes.
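To make that concrete, here is a minimal, illustrative sketch (not taken from the book) of creating tensors, doing element-wise and matrix arithmetic, and triggering the kind of shape error the description refers to:

import torch

a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])   # 2x2 tensor
b = torch.ones(2, 2)                          # 2x2 tensor of ones

print(a + b)                      # element-wise addition
print(a @ b)                      # matrix multiplication
print(a.dtype, a.shape, a.device) # basic tensor properties

c = torch.ones(3, 2)
try:
    a + c                         # shapes (2, 2) and (3, 2) do not broadcast
except RuntimeError as e:
    print("shape error:", e)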
A substantial portion of the book is dedicated to building simple PyTorch models. This includes loading and preparing datasets, defining the architecture, training, and predicting, with hands-on exercises on a real-world dataset. The book then explores PyTorch's nn module and compares different network types such as feedforward networks, RNNs, GRUs, CNNs, and their combinations.
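As a rough illustration of what defining a model with the nn module looks like, here is a tiny feedforward network; the layer sizes and the dummy input are hypothetical and not drawn from the book's dataset:

import torch
from torch import nn

class SimpleNet(nn.Module):
    def __init__(self, in_features: int = 10, hidden: int = 32, out_features: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_features),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = SimpleNet()
x = torch.randn(4, 10)   # batch of 4 samples with 10 features each
logits = model(x)        # forward pass -> shape (4, 2)
print(logits.shape)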
Further, the book examines the training process and PyTorch's optim module. It gives an overview of optimization algorithms such as Gradient Descent, SGD, Mini-batch Gradient Descent, Momentum, Adagrad, and Adam. A separate chapter focuses on advanced concepts in PyTorch 2.0, such as model serialization, model optimization, distributed training, and the PyTorch Quantization API.
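For orientation, a single training step with torch.optim might look like the sketch below; the stand-in model, the dummy mini-batch, and the choice of SGD with momentum are assumptions for illustration, not the book's example:

import torch
from torch import nn

model = nn.Linear(10, 2)                                  # tiny stand-in model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
# torch.optim.Adam(model.parameters(), lr=1e-3) would be a drop-in alternative
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(4, 10)                                    # dummy mini-batch
y = torch.randint(0, 2, (4,))                             # dummy class labels

optimizer.zero_grad()          # clear gradients from the previous step
loss = loss_fn(model(x), y)    # forward pass and loss
loss.backward()                # backpropagation
optimizer.step()               # parameter update
print(loss.item())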
In the final chapters, the book discusses the differences between TensorFlow 2.0 and PyTorch 2.0 and the step-by-step process of migrating a TensorFlow model to PyTorch 2.0 using ONNX. It provides an overview of common issues encountered during this process and how to resolve them.
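One commonly used toolchain for such a migration (not necessarily the one the book follows) combines the third-party packages tf2onnx and onnx2torch; the paths saved_model_dir and model.onnx below are placeholders:

# Step 1 (shell): export a TensorFlow SavedModel to ONNX with tf2onnx:
#   python -m tf2onnx.convert --saved-model saved_model_dir --output model.onnx
# Step 2 (Python): load the ONNX graph and convert it into a PyTorch module.
import onnx
from onnx2torch import convert

onnx_model = onnx.load("model.onnx")     # load the graph exported by tf2onnx
onnx.checker.check_model(onnx_model)     # validate it before converting
torch_model = convert(onnx_model)        # build an equivalent torch.nn.Module
torch_model.eval()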
The later chapters revisit these advanced topics in more depth: model serialization and optimization, distributed training, and PyTorch's Quantization API. They also weigh the strengths and weaknesses of TensorFlow 2.0 against those of PyTorch 2.0, arming you with the knowledge needed to choose the framework that best fits your requirements.
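As a hedged sketch of the serialization and quantization workflows mentioned above, the following uses a small stand-in model (not the book's) together with PyTorch's dynamic quantization API:

import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))  # stand-in model

# Serialization: save and reload the state_dict (the usual recommended pattern).
torch.save(model.state_dict(), "model.pt")
model.load_state_dict(torch.load("model.pt"))

# Dynamic quantization: swap Linear layers for int8 versions for faster CPU inference.
quantized = torch.ao.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)
print(quantized)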
Key Learnings:
A comprehensive introduction to PyTorch and CUDA for deep learning.
Detailed understanding of PyTorch tensors and their operations.
Step-by-step guide to building simple PyTorch models.
Insight into PyTorch's nn module and comparison of various network types.
Overview of the training process and exploration of PyTorch's optim module.
Understanding advanced concepts in PyTorch like model serialization and optimization.
Knowledge of distributed training in PyTorch.
Practical guide to using PyTorch's Quantization API.
Differences between TensorFlow 2.0 and PyTorch 2.0.
Guidance on migrating TensorFlow models to PyTorch using ONNX.
Audience: This book is written for machine learning engineers, data scientists, AI engineers, and researchers who want to draw actionable intelligence from their data using PyTorch 2.0. Knowing Python and the basics of deep learning is all you need to follow along.
Contents:
Introduction to PyTorch 2.0 and CUDA 11.8
Getting Started with Tensors
Advanced Tensor Operations
Building Neural Networks with PyTorch 2.0
Training Neural Networks in PyTorch 2.0
PyTorch 2.0 Advanced
Migrating from TensorFlow to PyTorch 2.0
End-to-End PyTorch Regression Model
Download Learning PyTorch 2.0: Experiment Deep Learning from basics to complex models using every potential capability of Pythonic PyTorch