What are tokens?
Updated 2/3/2026, 7:26:39 PM
Tokens are the basic building blocks of text used by AI models. Your message is broken into tokens, with roughly 4 English characters corresponding to 1 token. The number of tokens in a message determines the Energy required to process it.
For example:
- "Hello, Phi! How are you today?" = ~9 tokens
You can use OpenAI’s tokenizer tool to estimate the token count for your messages. Keep in mind that different AI models use different tokenizers, so the actual token count may differ.
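The ~4 characters per token rule can be sketched as a quick estimator. This is only an illustration of the heuristic described above, not an actual tokenizer; real tokenizers split text differently, so counts will vary (the example message is ~9 tokens under OpenAI's tokenizer, while the heuristic gives 8):

```python
import math

def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 English characters per token heuristic."""
    return max(1, math.ceil(len(text) / 4))

# "Hello, Phi! How are you today?" is 30 characters long.
print(estimate_tokens("Hello, Phi! How are you today?"))  # → 8
```

Treat the result as a ballpark figure for gauging Energy use, not an exact count.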
Need more help?
Contact Us