Contextual Embedding

A type of word embedding that captures the context-dependent meaning of words.

Description

Unlike traditional static word embeddings such as word2vec or GloVe, which assign each word a single fixed vector, contextual embeddings generate a different vector representation for the same word depending on its surrounding context. This allows a model to capture polysemy (multiple meanings of a word): in "she sat on the bank of the river" and "he deposited cash at the bank", the word "bank" receives two distinct vectors, as the sketch below illustrates. Contextual embeddings have driven significant improvements across a wide range of natural language processing tasks.
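The snippet below is a minimal sketch of this behavior, assuming the Hugging Face transformers library, PyTorch, and the public bert-base-uncased checkpoint are available. It embeds the word "bank" in three sentences and compares the resulting vectors:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word(sentence: str, word: str) -> torch.Tensor:
    """Return the contextual embedding of `word` within `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Find the position of `word` among the tokenized inputs.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)
    # last_hidden_state has shape (batch, sequence_length, hidden_size).
    return outputs.last_hidden_state[0, idx]

# The same surface form "bank" gets a different vector in each context.
river_bank = embed_word("she sat on the bank of the river", "bank")
money_bank = embed_word("he deposited cash at the bank", "bank")
loan_bank = embed_word("the bank approved the loan", "bank")

cos = torch.nn.functional.cosine_similarity
# Expect the two financial senses to be closer to each other than
# either is to the river sense.
print("river vs. deposit:", cos(river_bank, money_bank, dim=0).item())
print("deposit vs. loan: ", cos(money_bank, loan_bank, dim=0).item())
```

A static embedding would score both comparisons at 1.0, since every occurrence of "bank" maps to the same vector; a contextual model typically scores the two financial uses as more similar to each other than either is to the river use.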

Examples

  • πŸ€– BERT (Bidirectional Encoder Representations from Transformers) embeddings
  • πŸ“š ELMo (Embeddings from Language Models)
  • ✍️ GPT (Generative Pre-trained Transformer) embeddings

Applications

  • πŸ˜ƒ Sentiment analysis
  • 🏷️ Named entity recognition
  • ❓ Question answering

Related Terms