Title: Transformers in Action (MEAP v7)
Author: Nicole Koenigstein
Publisher: Manning Publications
Year: 2024
Pages: 272
Language: English
Format: PDF (true)
Size: 10.3 MB
Transformers are the superpower behind large language models (LLMs) like ChatGPT, Bard, and LLAMA. Transformers in Action gives you the insights, practical techniques, and extensive code samples you need to adapt pretrained transformer models to new and exciting tasks.
Inside Transformers in Action you’ll learn:
- How transformers and LLMs work
- Adapt HuggingFace models to new tasks
- Automate hyperparameter search with Ray Tune and Optuna
- Optimize LLM model performance
- Advanced prompting and zero/few-shot learning
- Text generation with reinforcement learning
- Responsible LLMs
Technically speaking, a “Transformer” is a neural network model that finds relationships in sequences of words or other data by using a mathematical technique called attention in its encoder and decoder components. This setup allows a transformer model to learn context and meaning from even long sequences of text, producing much more natural responses and predictions. Understanding the transformer architecture is the key to unlocking the power of LLMs for your own AI applications.
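To make the idea of attention concrete, here is a minimal, self-contained sketch of scaled dot-product attention in NumPy. It is an illustrative toy, not code from the book's repository; the function name, tensor shapes, and random inputs are assumptions made for this example.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # how relevant each key is to each query
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # context-aware mixture of the values

# Toy example: 3 tokens with 4-dimensional embeddings (illustrative values only)
rng = np.random.default_rng(0)
Q = K = V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 4)

Each output row is a weighted blend of all value vectors, which is exactly how a transformer lets every token draw on the context of every other token in the sequence.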
This comprehensive guide takes you from the origins of transformers all the way to fine-tuning an LLM for your own projects. Author Nicole Königstein demonstrates the vital mathematical and theoretical background of the transformer architecture practically through executable Jupyter notebooks, illuminating how this amazing technology works in action.
Transformers have established themselves as an indispensable tool in the field of Machine Learning and Artificial Intelligence as the research and deployment of Large Language Models (LLMs) continue to expand.
This book will take you on a fascinating journey through the applications of Transformers, which have, in recent years, evolved from their initial use in natural language processing (NLP) to a wide array of domains. These include, but are not limited to, computer vision, speech recognition, reinforcement learning, mathematical operations, and the study of biological systems such as protein folding. Among the most notable innovations are decision Transformers and multimodal models. These groundbreaking models have the potential to reshape our understanding of Deep Learning and broaden its horizons.
about the book Transformers in Action adds the revolutionary transformer architecture to your AI toolkit. You’ll dive into the essential details of the model’s architecture, with all complex concepts explained through easy-to-understand examples and clever analogies, from sock sorting to skiing! Even complex foundational concepts start with practical applications, so you never have to struggle with abstract theory. The book includes an extensive code repository that lets you instantly start exploring and experimenting with different LLMs.
In this practical guide, you’ll start by applying transformers to fundamental NLP tasks like text summarization and text classification. Then you’ll push transformers further with tasks like generating text, honing text generation with reinforcement learning, developing multimodal models, and few-shot learning. You’ll discover one-of-a-kind advice on prompt engineering, as well as tried-and-tested methods for optimizing and tuning large language models. Plus, you’ll find unique coverage of AI ethics topics such as mitigating bias and responsible usage.
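As a taste of the kind of task the book starts with, here is a minimal sketch of text summarization with a pretrained HuggingFace model via the pipeline API. The specific checkpoint and the input text are assumptions chosen for illustration, not examples taken from the book's code repository.

from transformers import pipeline

# Load a pretrained summarization model (checkpoint chosen for illustration)
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

text = (
    "Transformers use attention to relate every token in a sequence to every other "
    "token, which lets them capture long-range context far better than earlier "
    "recurrent architectures."
)

# Generate a short summary of the input text
print(summarizer(text, max_length=30, min_length=10)[0]["summary_text"])

Swapping the task string or the model name is enough to repurpose the same few lines for classification, translation, or text generation, which is the adaptation workflow the book walks through in detail.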
about the reader This book is designed for a diverse audience: ML engineers, data scientists, researchers, students, and AI practitioners who are eager to harness the potential of Transformer models in various domains. Readers should be comfortable with the basics of ML, Python, and common data tools.
about the author Nicole Koenigstein is a distinguished Data Scientist and Quantitative Researcher. She is presently the Chief Data Scientist and Head of AI & Quantitative Research at Wyden Capital.