
Graph Neural Networks in Action (MEAP v5)

  • Posted by: literator
  • Date: 16-01-2023, 09:51
  • Comments: 0
Title: Graph Neural Networks in Action (MEAP v5)
Author: Keita Broadwater
Publisher: Manning Publications
Year: 2022
Pages: 257
Language: English
Format: pdf, epub
Size: 38.6 MB

A hands-on guide to powerful graph-based Deep Learning models! Learn how to build cutting-edge graph neural networks for recommendation engines, molecular modeling, and more.

In Graph Neural Networks in Action, you will learn how to:

Train and deploy a graph neural network
Generate node embeddings
Use GNNs at scale for very large datasets
Build a graph data pipeline
Create a graph data schema
Understand the taxonomy of GNNs
Manipulate graph data with NetworkX (see the sketch after this list)
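
As a taste of the NetworkX material, the following minimal sketch (illustrative only, not code from the book; the node names are made up) builds a small social graph, queries its structure, and converts it to a matrix form that downstream ML code can consume:

import networkx as nx

# Build a tiny undirected "social network" (names are illustrative)
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"),
    ("bob", "carol"),
    ("carol", "alice"),
    ("carol", "dave"),
])

# Basic structural queries
print(G.number_of_nodes(), G.number_of_edges())  # 4 nodes, 4 edges
print(nx.degree_centrality(G))                   # carol is the best-connected node

# Dense adjacency matrix, ready for use in an ML pipeline
A = nx.to_numpy_array(G)
print(A.shape)                                   # (4, 4)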

Graph Neural Networks in Action teaches you to create powerful deep learning models for working with graph data. You’ll learn how to both design and train your models, and how to develop them into practical applications you can deploy to production. Go hands-on and explore relevant real-world projects as you dive into graph neural networks perfect for node prediction, link prediction, and graph classification. Inside this practical guide, you’ll explore common graph neural network architectures and cutting-edge libraries, all clearly illustrated with well-annotated Python code.

about the technology
Graph neural networks expand the capabilities of Deep Learning beyond traditional tabular data, text, and images. This exciting new approach brings the amazing capabilities of Deep Learning to graph data structures, opening up new possibilities for everything from recommendation engines to pharmaceutical research.

For data practitioners, the fields of Machine Learning and data science initially excite us because of the potential to draw non-intuitive and useful insights from data. In particular, the insights from machine learning and deep learning promise to enhance our understanding of the world. For the working engineer, these tools promise to deliver business value in unprecedented ways. Experience tempers this ideal: real data is messy, dirty, and biased, and statistical methods and learning systems have limitations. An essential part of the practitioner's job is understanding these limitations and bridging the gap between promise and practice to obtain a solution.

For a certain class of data, graphs, the gap has proven difficult to bridge. Graphs are a data type that is rich with information, yet they can explode in size when we try to account for all of it. They are also ubiquitous, appearing in nature (molecules), society (social networks), technology (the internet), and everyday settings (road maps). To use this rich and ubiquitous data type for machine learning, we need a specialized form of neural network dedicated to working with graph data: the graph neural network, or GNN.
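
To make the idea concrete, here is a minimal sketch of one round of message passing, the mechanism at the heart of most GNNs, written against plain PyTorch (an assumption; this is not code from the book): each node averages its neighbors' feature vectors together with its own, then applies a shared learned transform.

import torch
import torch.nn as nn

class SimpleGNNLayer(nn.Module):
    """One round of message passing: mean-aggregate each node's own
    and neighboring features, then apply a shared linear transform."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) adjacency matrix
        adj = adj + torch.eye(adj.size(0))  # self-loops keep each node's own features
        deg = adj.sum(dim=1, keepdim=True)  # per-node degree for mean normalization
        h = (adj / deg) @ x                 # aggregate neighborhood features
        return torch.relu(self.linear(h))   # transform and activate

Stacking several such layers lets information flow across multi-hop neighborhoods, which is exactly the graph structure a plain feed-forward network would ignore.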

about the book
In Graph Neural Networks in Action you’ll create Deep Learning models that are perfect for working with interconnected graph data. Start with a comprehensive introduction to graph data’s unique properties. Then, dive straight into building real-world models, including GNNs that can generate node embeddings from a social network, recommend eCommerce products, and draw insights from social sites. This comprehensive guide covers the essential GNN libraries, including PyTorch Geometric, the Deep Graph Library (DGL), and Alibaba’s GraphScope for training at scale.
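
For a flavor of what that looks like in PyTorch Geometric, here is a minimal two-layer sketch (illustrative only, not the book's code; the dimensions are placeholder choices) that maps raw node features to one embedding vector per node:

import torch
import torch.nn.functional as F
from torch_geometric.nn import GCNConv

class NodeEmbedder(torch.nn.Module):
    """Two graph convolutions mapping raw node features to
    low-dimensional node embeddings."""
    def __init__(self, num_features, hidden_dim=64, embedding_dim=16):
        super().__init__()
        self.conv1 = GCNConv(num_features, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, embedding_dim)

    def forward(self, x, edge_index):
        h = F.relu(self.conv1(x, edge_index))
        return self.conv2(h, edge_index)  # one embedding row per node

# Usage on any torch_geometric.data.Data object `data`:
#   embeddings = NodeEmbedder(data.num_features)(data.x, data.edge_index)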

about the reader
For Python programmers familiar with Machine Learning and the basics of Deep Learning.

Download Graph Neural Networks in Action (MEAP V05)
