Embeddings in Natural Language Processing - Jose Camacho-Collados - Book

About Embeddings in Natural Language Processing

Embeddings have undoubtedly been one of the most influential research areas in Natural Language Processing (NLP). Encoding information into low-dimensional vector representations that integrate easily into modern machine learning models has played a central role in the development of NLP. Embedding techniques initially focused on words, but attention soon shifted to other forms: from graph structures, such as knowledge bases, to other types of textual content, such as sentences and documents. This book provides a high-level synthesis of the main embedding techniques in NLP in the broad sense. It starts by explaining conventional word vector space models and word embeddings (e.g., Word2Vec and GloVe) and then moves on to other types of embeddings, such as word sense, sentence, document, and graph embeddings. The book also provides an overview of recent developments in contextualized representations (e.g., ELMo and BERT) and explains their potential in NLP. Throughout, the reader will find both the essential background for understanding a topic from scratch and a broad overview of the most successful techniques in the literature.
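The core idea is easy to see in code. Below is a minimal sketch of training static word embeddings with the gensim library (the library choice is an assumption for illustration; the book surveys the techniques themselves, not any particular implementation):

```python
# Minimal sketch: static word embeddings with gensim's Word2Vec.
# Assumption: gensim as the illustrative library (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each "sentence" is a list of tokens.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "animals"],
]

# Train a small skip-gram model: each word becomes a dense,
# low-dimensional vector (here 50 dimensions).
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

# The learned vector is easy to feed into downstream ML models.
vec = model.wv["cat"]
print(vec.shape)  # (50,)

# Geometric neighborhoods encode distributional similarity.
print(model.wv.most_similar("cat", topn=3))
```

Contextualized models such as ELMo and BERT, by contrast, produce a different vector for the same word depending on the sentence it appears in. A minimal sketch using the Hugging Face transformers library (again an assumption; any BERT-style model illustrates the point):

```python
# Minimal sketch: contextualized token embeddings from BERT.
# Assumption: Hugging Face transformers (pip install transformers torch).
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Unlike static embeddings, the vector for "bank" differs per sentence.
for text in ["He sat by the river bank.", "She deposited cash at the bank."]:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 768)
    # Locate the token "bank" and inspect its contextual vector.
    idx = inputs.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids("bank"))
    print(text, hidden[0, idx, :4])
```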

  • Language: English
  • ISBN: 9783031010491
  • Binding: Paperback
  • Pages: 176
  • Published: 13 November 2020
  • Dimensions: 191 x 10 x 235 mm
  • Weight: 341 g