Context Window

The range of surrounding tokens that a model considers when processing or generating text.

Description

In natural language processing, the context window is the span of surrounding tokens a model can take into account at once when processing or generating text. It bounds how much context is available for interpreting a word or phrase, so its size strongly affects a model's ability to capture long-range dependencies and maintain coherence in generated text; tokens that fall outside the window cannot influence the model's output.
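
As a minimal sketch of what a finite window means in practice, the snippet below trims a token-ID sequence to a hypothetical maximum window size before it would be fed to a model; the function name and the 2048-token limit are illustrative assumptions, not the API of any particular library.

```python
def apply_context_window(token_ids, max_context=2048):
    # Keep only the most recent tokens that still fit inside the window;
    # anything earlier is dropped and can no longer influence the model.
    return token_ids[-max_context:]

recent = apply_context_window(list(range(5000)), max_context=2048)
print(len(recent))  # 2048 -- the older tokens were truncated away
```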

Examples

  • πŸ”² Fixed-size windows in CNNs for text
  • πŸ” Attention-based variable windows in Transformers (both kinds are contrasted in the sketch after this list)

Applications

πŸ“ Language modeling
🌐 Machine translation
πŸ“Š Text summarization

Related Terms