Context Window


What is Context Window?

Context Window: The maximum number of tokens a model can process in a single request, including both the input and the generated output.

Context Window Explained

Context window size determines how much information you can provide to an LLM in one request. Larger context windows enable processing longer documents, maintaining longer conversation history, and including more examples. However, using more context typically increases cost and latency. In 2026, context windows range from 16K tokens (GPT-3.5) to 2M tokens (Gemini 3.0 Pro).
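To make the input/output trade-off concrete, here is a minimal sketch of fitting conversation history into a context window while reserving room for the model's reply. It assumes a rough 4-characters-per-token heuristic; a real tokenizer (e.g. tiktoken for OpenAI models) would give exact counts, and the function names here are illustrative, not from any particular SDK.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token (heuristic, not exact)."""
    return max(1, len(text) // 4)

def fit_history(messages: list[str], context_window: int, max_output: int) -> list[str]:
    """Keep the most recent messages that fit the window, reserving
    max_output tokens for the model's response."""
    budget = context_window - max_output  # tokens available for input
    kept: list[str] = []
    used = 0
    for msg in reversed(messages):  # walk newest-first
        cost = estimate_tokens(msg)
        if used + cost > budget:
            break  # oldest remaining messages are dropped
        kept.append(msg)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

For example, with a 250-token window and 40 tokens reserved for output, three ~100-token messages cannot all fit, so the oldest is dropped. This "drop oldest" strategy is the simplest option; production systems often summarize older turns instead of discarding them.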

Examples

  • GPT-5.2: 128K tokens
  • Claude Opus 4.5: 200K tokens
  • Gemini 3.0 Pro: 2M tokens

Track Your LLM Costs

Burnwise monitors every metric automatically. Start optimizing today.

Start Free Trial