Token Counter
Count tokens for popular AI providers, compare totals, and include system prompts when budgeting for API calls.
Characters: 254
Words: 41
Lines: 6
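The three counters above come straight from the pasted text. A minimal sketch of the same arithmetic, assuming whitespace-delimited words and newline-delimited lines (the tool's exact splitting rules may differ):

```python
def basic_counts(text: str) -> dict:
    """Character, word, and line counts for a pasted prompt.

    Assumes whitespace-separated words and newline-separated lines,
    which may not match the tool's exact rules.
    """
    return {
        "characters": len(text),
        "words": len(text.split()),
        "lines": text.count("\n") + 1 if text else 0,
    }

print(basic_counts("Hello world\nSecond line"))
# {'characters': 23, 'words': 4, 'lines': 2}
```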
System / template tokens: Add shared tokens such as system prompts or tool instructions.
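The overhead is plain addition: whatever shared token count you enter here is added to each model's prompt estimate to produce the "Total with overhead" column. A minimal sketch with illustrative numbers (in the snapshot below no overhead is set, which is why the two token columns match):

```python
prompt_tokens = 69            # per-model estimate for the prompt alone
system_template_tokens = 150  # shared system prompt + tool instructions (illustrative)

# "Total with overhead" = prompt estimate + shared system/template tokens
total_with_overhead = prompt_tokens + system_template_tokens
print(total_with_overhead)    # 219
```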
Filter providers: Focus the estimates on a single vendor when comparing plans.
Token estimates by provider
OpenAI counts use exact tokenizers; figures for other providers are estimates based on their published guidance.
| Provider & model | Notes | Prompt tokens | Total with overhead |
| --- | --- | --- | --- |
| OpenAI GPT-5 | OpenAI guidance indicates ~3.7 characters per token for GPT-5. | 69 | 69 |
| OpenAI GPT-4.1 Mini | Uses the o200k_base tokenizer; ~3.8 characters per token. | 67 | 67 |
| OpenAI text-embedding-3-large | Embedding models reuse cl100k_base-style tokenization (~3.7 chars/token). | 69 | 69 |
| Anthropic Claude 4.5 Sonnet | Anthropic states ~4 characters per token across Claude 4.x models. | 64 | 64 |
| Anthropic Claude 4.5 Opus | Higher tiers keep the same approximate 4 chars/token ratio. | 64 | 64 |
| Anthropic Claude 4 Haiku | Haiku inherits the Claude 4 tokenizer (~4 chars/token). | 64 | 64 |
| Google Gemini 2.5 Pro | Gemini 2.5 documentation targets ~4 characters per token. | 64 | 64 |
| Google Gemini 2.5 Flash-Lite | Flash-Lite trades depth for speed; estimate ~3.9 chars/token. | 65 | 65 |
| Google Gemini 2.5 Flash | Flash balances latency and reasoning (~3.95 chars/token). | 65 | 65 |
| Google Gemini 2.5 Flash (Image) | Image-specialized Flash shares the same tokenizer. | 65 | 65 |
| Mistral Large 2 | Mistral reports ~3.7–3.8 characters per token for Large 2. | 68 | 68 |
| Meta Llama 4 120B | Meta advises ~3.9 characters per token for the Llama 4 series. | 65 | 65 |
| xAI Grok 3 | Based on xAI public benchmarks (~3.8 chars/token). | 67 | 67 |
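For the non-OpenAI rows, the estimate reduces to dividing the character count by the quoted ratio. A rough sketch under that assumption; the ratios come from the Notes column, and `math.ceil` is an assumed rounding rule rather than documented behavior:

```python
import math

# Characters-per-token ratios quoted in the Notes column above.
CHARS_PER_TOKEN = {
    "Claude 4.5 Sonnet": 4.0,
    "Gemini 2.5 Flash": 3.95,
    "Mistral Large 2": 3.75,  # midpoint of the quoted 3.7–3.8 range
    "Grok 3": 3.8,
}

def estimate_prompt_tokens(text: str, chars_per_token: float) -> int:
    # Heuristic: character count divided by the published ratio, rounded up.
    return math.ceil(len(text) / chars_per_token)

prompt = "x" * 254  # same character count as the prompt measured above
for model, ratio in CHARS_PER_TOKEN.items():
    print(f"{model}: ~{estimate_prompt_tokens(prompt, ratio)} tokens")
```

For a 254-character prompt these ratios give 64, 65, 68, and 67 tokens, in line with the table, though the tool may round differently for other ratios.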
Highest estimate above: 69 prompt tokens (before overhead).
Estimates use heuristic character-per-token ratios. Always compare against live API usage for billing-critical workloads.
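For a billing-grade cross-check on OpenAI models, an exact count with tiktoken is straightforward. The snippet below is a sketch, not the tool's implementation: it assumes the model maps to the o200k_base encoding (the text-embedding-3 family and older chat models use cl100k_base), and other providers generally expose their own token-counting endpoints instead.

```python
import tiktoken  # pip install tiktoken

# Exact token count under the o200k_base encoding, compared with the
# ~3.7 chars/token heuristic used for the estimates above.
enc = tiktoken.get_encoding("o200k_base")

prompt = "Summarize the quarterly report in three bullet points."
exact = len(enc.encode(prompt))
heuristic = round(len(prompt) / 3.7)

print(f"exact={exact}, heuristic~{heuristic}")
```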