- Added by: literator
- Date: 29-11-2021, 14:01
- Comments: 0
Title: Pretrained Transformers for Text Ranking: BERT and Beyond
Author: Jimmy Lin, Rodrigo Nogueira
Publisher: Morgan & Claypool
Year: 2022
Pages: 327
Language: English
Format: pdf (true)
Size: 10.15 MB
The goal of text ranking is to generate an ordered list of texts retrieved from a corpus in response to a query. This book provides an overview of text ranking with a family of neural network models known as transformers, of which BERT (Bidirectional Encoder Representations from Transformers), an invention of Google, is the best-known example. These models have been responsible for a paradigm shift in the fields of natural language processing (NLP) and information retrieval (IR) and, more broadly, human language technologies (HLT), a catch-all term that includes technologies to process, analyze, and otherwise manipulate (human) language data.
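To make the core task concrete, here is a minimal, self-contained sketch of text ranking: given a query, order corpus texts by a relevance score. The transformer models the book covers (e.g., BERT) compute this score with a neural network; the toy term-overlap scorer below is a hypothetical stand-in used only to keep the example runnable.

```python
# Toy text-ranking sketch. The score function is a placeholder: real
# systems described in the book replace it with a transformer-based
# relevance model such as BERT.

def score(query: str, text: str) -> float:
    """Fraction of query terms appearing in the text (toy relevance)."""
    q_terms = set(query.lower().split())
    t_terms = set(text.lower().split())
    return len(q_terms & t_terms) / len(q_terms)

def rank(query: str, corpus: list[str]) -> list[str]:
    """Return corpus texts ordered from most to least relevant."""
    return sorted(corpus, key=lambda t: score(query, t), reverse=True)

corpus = [
    "BERT is a pretrained transformer model",
    "Information retrieval finds relevant documents",
    "Transformers changed natural language processing",
]
print(rank("pretrained transformer model", corpus)[0])
# → BERT is a pretrained transformer model
```

In a real pipeline this ranking step is typically applied as a reranker over candidates retrieved by a fast first-stage method such as BM25.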