Input Tokens

What Are Input Tokens?

Input Tokens: Input tokens are the tokens in the prompt you send to the model. They're typically cheaper than output tokens.

Input Tokens Explained

Input tokens include everything you send to the model: the system prompt, user messages, context documents, few-shot examples, and any other content. Input pricing is usually 3-10x cheaper than output pricing because the model processes the entire input in a single parallel pass, while output tokens are generated one at a time. For cost optimization, trim redundant input and use prompt caching when your provider supports it.
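
Token counts depend on the model's tokenizer, so the most reliable estimate comes from counting the prompt the same way the provider does. As a minimal sketch, assuming an OpenAI-style tokenizer via the tiktoken library and the cl100k_base encoding (the correct encoding varies by model), you could estimate input tokens before sending a request; most provider APIs also report the exact count in the response's usage data.

    import tiktoken

    # Assumption: cl100k_base encoding; the correct encoding depends on the model.
    encoding = tiktoken.get_encoding("cl100k_base")

    system_prompt = "You are a helpful assistant."
    user_message = "Summarize the following report in three bullet points."

    # Everything sent to the model counts as input: system prompt, messages, context, examples.
    # Chat APIs add a few formatting tokens per message, so treat this as an estimate.
    input_tokens = len(encoding.encode(system_prompt)) + len(encoding.encode(user_message))
    print(f"Estimated input tokens: {input_tokens}")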

Formula

Input Cost = (Input Tokens / 1,000,000) × Input Price per 1M
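
For example, a 2,000-token prompt at an illustrative price of $3.00 per 1M input tokens costs (2,000 / 1,000,000) × $3.00 = $0.006. A minimal sketch of the same calculation in Python (the price is hypothetical; check your provider's current rates):

    def input_cost(input_tokens: int, price_per_million: float) -> float:
        # Input Cost = (Input Tokens / 1,000,000) x Input Price per 1M
        return (input_tokens / 1_000_000) * price_per_million

    # Hypothetical price of $3.00 per 1M input tokens.
    print(input_cost(2_000, 3.00))  # 0.006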
