Core Concepts

What is a Context Window?

The maximum amount of text an AI model can process in a single conversation.

Definition

The context window is the total number of tokens an AI model can handle in a single request, including both the input prompt and the generated output. A larger context window allows the model to process longer documents, maintain longer conversations, and consider more information at once. Claude offers 200K tokens, while GPT-4o offers 128K.
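Because the window covers both the prompt and the generated output, a long prompt leaves less room for the reply. A minimal sketch of this budgeting, using the common (approximate) 4-characters-per-token heuristic and hypothetical function names:

```python
def estimate_tokens(text: str) -> int:
    """Rough estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_window(prompt: str, context_window: int = 200_000,
                   reserved_output: int = 4_096) -> bool:
    """The window covers input AND output, so reserve room for the reply."""
    return estimate_tokens(prompt) + reserved_output <= context_window

short_prompt = "Summarize this paragraph in one sentence."
print(fits_in_window(short_prompt))
```

The exact token count depends on the model's tokenizer; production code would use the provider's tokenizer or token-counting API rather than a character heuristic.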

💡 Example

With a 200K token context window, Claude can process approximately 150,000 words, roughly the length of two full novels. This allows it to analyze entire codebases or lengthy legal documents in a single conversation.
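The 150,000-word figure follows from the common rule of thumb that one token is roughly 0.75 English words:

```python
# Back-of-the-envelope conversion from tokens to words,
# using the ~0.75 words-per-token rule of thumb.
context_window_tokens = 200_000
words_per_token = 0.75

approx_words = int(context_window_tokens * words_per_token)
print(approx_words)  # 150000
```

The ratio varies with language and content (code and non-English text usually yield fewer words per token), so treat this as an estimate, not a guarantee.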

Related concepts

LLM (Large Language Model)

A type of AI trained on massive text datasets to understand and generate human language.

Prompt Engineering

The practice of crafting effective instructions to get better results from AI models.

Token

The basic unit of text that AI models process, roughly 4 characters or 0.75 words.

