
LLMs in Production: From language models to successful products (Final Release)

  • Added by: literator
  • Date: 17-01-2025, 06:34
  • Comments: 0
Title: LLMs in Production: From language models to successful products (Final Release)
Author: Christopher Brousseau, Matthew Sharp
Publisher: Manning Publications
Year: 2025
Pages: 456
Language: English
Format: PDF (true)
Size: 34.0 MB

Learn how to put Large Language Model-based applications into production safely and efficiently.

This practical book offers clear, example-rich explanations of how LLMs work, how you can interact with them, and how to integrate LLMs into your own applications. Find out what makes LLMs so different from traditional software and ML, discover best practices for working with them out of the lab, and dodge common pitfalls with experienced advice.

LLMs in Production is not your typical data science book. In fact, you won't find many books like this in the data space at all, mainly because creating a successful data product often requires a large team: data scientists to build models, data engineers to build pipelines, MLOps engineers to build platforms, software engineers to build applications, product managers to go to endless meetings, and, of course, managers over each of these to take the credit despite contributing little beyond asking questions (often the same questions, repeated) while trying to understand what's going on.

There are many books geared toward each of these individuals, but very few tie the entire process together from end to end. While this book focuses on LLMs (indeed, it can be considered an LLMOps book), what you will take away will be much more than how to push a large model onto a server. You will gain a roadmap for creating successful ML products, LLMs or otherwise, that delight end users.

In LLMs in Production you will:

Grasp the fundamentals of LLMs and the technology behind them
Evaluate when to use a premade LLM and when to build your own
Efficiently scale up an ML platform to handle the needs of LLMs
Train LLM foundation models and finetune an existing LLM
Deploy LLMs to the cloud and edge devices using techniques like PEFT and LoRA (a minimal LoRA sketch follows this list)
Build applications leveraging the strengths of LLMs while mitigating their weaknesses
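
As a taste of the LoRA technique named above, here is a minimal sketch (not code from the book) using Hugging Face's transformers and peft libraries; the base model and hyperparameters are illustrative assumptions only.

from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base = "gpt2"  # assumption: any small causal LM works for this sketch
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # GPT-2's fused attention projection
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the small adapter weights are trainable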

LLMs in Production delivers vital insights into LLMOps so you can easily and seamlessly guide an LLM to production usage. Inside, you'll find practical guidance on everything from acquiring an LLM-suitable training dataset to building a platform and compensating for these models' immense size, plus tips and tricks for prompt engineering, retraining, load testing, handling costs, and ensuring security.
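
To give a flavor of the load-testing topic mentioned above, here is a rough sketch (not the book's code) that fires concurrent requests at an LLM inference endpoint and reports latency percentiles; the endpoint URL and payload shape are hypothetical and should be adapted to your own serving API.

import asyncio
import statistics
import time

import httpx

ENDPOINT = "http://localhost:8080/generate"               # hypothetical serving endpoint
PAYLOAD = {"prompt": "Summarize: ...", "max_tokens": 64}   # hypothetical request shape

async def one_request(client: httpx.AsyncClient) -> float:
    """Send one generation request and return its wall-clock latency in seconds."""
    start = time.perf_counter()
    resp = await client.post(ENDPOINT, json=PAYLOAD, timeout=60.0)
    resp.raise_for_status()
    return time.perf_counter() - start

async def main(n: int = 50) -> None:
    # Fire n requests concurrently and summarize the latency distribution.
    async with httpx.AsyncClient() as client:
        latencies = sorted(await asyncio.gather(*(one_request(client) for _ in range(n))))
    print(f"p50={statistics.median(latencies):.2f}s  p95={latencies[int(0.95 * n) - 1]:.2f}s")

if __name__ == "__main__":
    asyncio.run(main())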

Foreword by Joe Reis.

About the technology:
Most business software is developed and improved iteratively, and can change significantly even after deployment. By contrast, because LLMs are expensive to create and difficult to modify, they require meticulous upfront planning, exacting data standards, and carefully executed technical implementation. Integrating LLMs into production products impacts every aspect of your operations plan, including the application lifecycle, data pipeline, compute cost, security, and more. Get it wrong, and you may have a costly failure on your hands.

About the book:
LLMs in Production teaches you how to develop an LLMOps plan that can take an AI app smoothly from design to delivery. You'll learn techniques for preparing an LLM dataset, cost-efficient training hacks like LoRA and RLHF, and industry benchmarks for model evaluation. Along the way, you'll put your new skills to use in three exciting example projects: creating and training a custom LLM, building a VS Code AI coding extension, and deploying a small model to a Raspberry Pi.
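
As a flavor of the Raspberry Pi project mentioned above, here is a minimal, illustrative sketch of running a compact causal LM on CPU with transformers; the model name is an assumption, not necessarily the one the book's project uses.

from transformers import pipeline

# Small model chosen so generation is feasible on modest CPU hardware (assumption).
generator = pipeline("text-generation", model="distilgpt2", device=-1)  # device=-1 -> CPU

result = generator("Edge deployment of language models means", max_new_tokens=30)
print(result[0]["generated_text"])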

What's inside:
Balancing cost and performance
Retraining and load testing
Optimizing models for commodity hardware (a quantization sketch follows this list)
Deploying on a Kubernetes cluster
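
For the commodity-hardware bullet, one common shrinking step is post-training dynamic quantization. This minimal sketch (not from the book) uses PyTorch on a small OPT checkpoint, chosen here only because its layers are standard nn.Linear and therefore quantize cleanly.

import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")  # illustrative small LM
model.eval()

# Swap Linear layers for int8 dynamically quantized versions for CPU inference.
quantized = torch.quantization.quantize_dynamic(model, {torch.nn.Linear}, dtype=torch.qint8)

torch.save(quantized.state_dict(), "opt-125m-int8.pt")  # smaller artifact to ship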

About the reader:
Anyone who finds themselves working on an application that uses LLMs will benefit from this book. This includes all of the previously listed individuals. The individuals who will benefit the most, though, will likely be those who have cross-functional roles with titles like ML engineer. This book is hands-on, and we expect our readers to know Python and, in particular, PyTorch.

Contents:

