Calculate how many tokens your text will use with different AI models. Token counts determine how much of a model's context window your text consumes in AI conversations.
Count tokens for Claude models including Claude Opus 4.1, Sonnet 4, and Haiku
Count tokens for GPT-4, GPT-4o, GPT-3.5 Turbo, and other OpenAI models
Tokens are the basic units that language models process. A token can be as short as a single character or as long as a whole word. On average, one token corresponds to about 4 characters of English text, or roughly three-quarters of a word, so 100 tokens work out to about 75 words.
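For OpenAI models you can count tokens locally with the open-source tiktoken library. A minimal sketch (the sample sentence is illustrative, and resolving "gpt-4o" to its encoding requires a recent tiktoken version):

```python
# pip install tiktoken
import tiktoken

text = "Tokens are the basic units that language models process."

# Resolve the encoding used by GPT-4o (o200k_base) and tokenize the text.
enc = tiktoken.encoding_for_model("gpt-4o")
tokens = enc.encode(text)

print(f"Characters:      {len(text)}")
print(f"Tokens:          {len(tokens)}")
print(f"Chars per token: {len(text) / len(tokens):.1f}")  # typically around 4 for English prose
```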
Different models use different tokenization methods, so the same text may result in different token counts.
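To see the effect, encode the same string with two of OpenAI's encodings: cl100k_base (used by GPT-4 and GPT-3.5 Turbo) and o200k_base (used by GPT-4o). Claude uses its own tokenizer, so exact counts for Claude models come from Anthropic's token-counting API rather than tiktoken. A sketch, with an illustrative sample string:

```python
import tiktoken

text = "Different models use different tokenization methods."

# Compare the GPT-4/GPT-3.5 Turbo encoding with the GPT-4o encoding.
for name in ("cl100k_base", "o200k_base"):
    enc = tiktoken.get_encoding(name)
    print(f"{name}: {len(enc.encode(text))} tokens")

# The counts can differ because each encoding has its own vocabulary and
# merge rules; Claude's tokenizer would give yet another count.
```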