What is a Token?
The basic unit of text that AI models process: roughly 4 characters or 0.75 words.
Definition
A token is the smallest unit of text that an AI model processes. In English, one token is roughly 4 characters or about 0.75 words. AI models read, process, and generate text in tokens. API pricing is typically based on the number of input tokens (your prompt) and output tokens (the AI's response). Context window sizes are also measured in tokens.
💡 Example
"Hello, world!" contains 4 tokens: "Hello", ",", " world", "!". A 500-word article is approximately 670 tokens. GPT-4o charges $2.50 per million input tokens.
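The rule of thumb above (about 4 characters or 0.75 words per token) can be sketched in code. This is a rough heuristic only, not a real tokenizer; the function names and the default price ($2.50 per million input tokens, from the GPT-4o example) are illustrative assumptions. Exact counts require the model's actual tokenizer (e.g. the tiktoken library for OpenAI models).

```python
# Rough token and cost estimates based on the heuristics in this article:
# ~4 characters per token, ~0.75 words per token.
# NOTE: a heuristic sketch, not a real tokenizer - exact counts differ.

def estimate_tokens(text: str) -> int:
    """Approximate token count using ~4 characters per token."""
    return max(1, round(len(text) / 4))

def estimate_input_cost(num_tokens: int, usd_per_million: float = 2.50) -> float:
    """Input cost at a given per-million-token rate (default: example GPT-4o rate)."""
    return num_tokens * usd_per_million / 1_000_000

# A 500-word article at ~0.75 words per token is about 667 tokens.
approx_tokens = round(500 / 0.75)
cost = estimate_input_cost(approx_tokens)
print(approx_tokens, "tokens, about $%.6f" % cost)
```

Note that the heuristic slightly undercounts the "Hello, world!" example (13 characters suggests ~3 tokens, while a real tokenizer produces 4), which is why per-model tokenizers matter for precise billing.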
Related concepts
Large Language Model (LLM): A type of AI trained on massive text datasets to understand and generate human language.
Context Window: The maximum amount of text an AI model can process in a single conversation.
API: A way for developers to programmatically access AI models in their own applications.