Output Tokens

AI & LLM Glossary

What Are Output Tokens?

Output Tokens: Output tokens are the tokens a model generates in its response. They're typically 3-10x more expensive than input tokens.

Output Tokens Explained

Output tokens represent the content the model generates. Each output token requires a full forward pass through the model to produce, which is why output pricing is higher than input pricing. The max_tokens parameter caps output length. To control costs, be specific about the desired response length and format so the model doesn't produce verbose answers.
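As a sketch, here is how an output cap might look in an OpenAI-style chat request payload. The model name, payload shape, and the $15-per-1M price are illustrative assumptions, not a specific provider's API or rates:

```python
# Hypothetical sketch: capping output length (and worst-case cost)
# with max_tokens in an OpenAI-style request payload.
payload = {
    "model": "example-model",  # placeholder model name (assumption)
    "messages": [
        {"role": "user", "content": "Summarize this article in 3 bullet points."}
    ],
    "max_tokens": 150,  # hard cap on output tokens
}

# Worst-case output cost if the model uses the full budget,
# at an assumed price of $15 per 1M output tokens:
worst_case_cost = payload["max_tokens"] / 1_000_000 * 15.0
```

Because max_tokens bounds the response, it also bounds the output-side spend for that call, which makes per-request costs predictable.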

Formula

Output Cost = (Output Tokens / 1,000,000) × Output Price per 1M
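In code, the formula above is a one-liner. The price used in the example is an assumed illustrative rate, not a real provider's:

```python
def output_cost(output_tokens: int, price_per_1m: float) -> float:
    """Cost of a response: (output tokens / 1,000,000) x output price per 1M."""
    return output_tokens / 1_000_000 * price_per_1m

# e.g. 2,000 output tokens at an assumed $15 per 1M output tokens:
cost = output_cost(2_000, 15.0)  # 0.03 dollars
```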

Track Your LLM Costs

Burnwise monitors tokens automatically. Start optimizing today.

Start Free Trial