
How Many Tokens Are in a Word? GPT, Claude, Gemini Compared

Last updated: April 2026 · 6 min read · AI Tools

The short answer: about 1.3 tokens per word in English. That holds across GPT, Claude, and Gemini for normal prose. The longer answer is more interesting — and matters if you are estimating costs, sizing context windows, or trimming prompts.

The Quick Conversion

| English text | Approximate tokens |
|---|---|
| 1 word | 1.3 tokens |
| 10 words | 13 tokens |
| 100 words | 130 tokens |
| 500 words | 650 tokens |
| 1,000 words | 1,300 tokens |
| 1 page (~250 words) | 325 tokens |
| 5 pages (~1,250 words) | 1,625 tokens |
| 1 short book (~50,000 words) | 65,000 tokens |

Or going the other way:

| Tokens | Approximate words | Equivalent |
|---|---|---|
| 100 | 75 | One paragraph |
| 500 | 375 | Half a page |
| 1,000 | 750 | One full page |
| 4,000 | 3,000 | One short blog post |
| 8,000 | 6,000 | One long article |
| 32,000 | 24,000 | One short report |
| 128,000 | 96,000 | One short novel |
| 200,000 | 150,000 | One full novel |
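If you would rather compute these conversions than look them up, here is a minimal Python sketch of the same arithmetic. The two constants are the rough ratios used in the tables above, not model-exact values.

```python
# Rule-of-thumb conversions for plain English prose.
# Ratios come from the tables above; real counts vary by tokenizer.
TOKENS_PER_WORD = 1.3   # words -> tokens
WORDS_PER_TOKEN = 0.75  # tokens -> words

def words_to_tokens(words: int) -> int:
    """Estimate token count from a word count."""
    return round(words * TOKENS_PER_WORD)

def tokens_to_words(tokens: int) -> int:
    """Estimate word count from a token budget."""
    return round(tokens * WORDS_PER_TOKEN)

print(words_to_tokens(1_000))    # 1300
print(tokens_to_words(128_000))  # 96000 -- roughly one short novel
```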

Paste your text and see the exact token count instantly.

Open Token Counter →

Why It Is Not Exactly 1 Token Per Word

Tokenizers do not split text on spaces. They split on a learned vocabulary of common subword pieces. This means:

- Common short words ("the", "and", "is") are usually a single token.
- Longer or rarer words get split into several subword pieces.
- Punctuation, whitespace, and capitalization all affect the count.
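You can see this directly with tiktoken, OpenAI's open-source tokenizer library (Claude and Gemini ship their own tokenizers, but they split text the same way in spirit). A quick sketch:

```python
# pip install tiktoken
import tiktoken

# o200k_base is the encoding GPT-4o uses (see the table further down).
enc = tiktoken.get_encoding("o200k_base")

for text in ["the", "tokenizer", "antidisestablishmentarianism"]:
    ids = enc.encode(text)
    pieces = [enc.decode([i]) for i in ids]
    print(f"{text!r} -> {len(ids)} token(s): {pieces}")

# Common words map to a single token; long or rare words split into
# several subword pieces. Exact splits depend on the learned vocabulary.
```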

Tokens Per Word by Content Type

| Content | Words per token | Why |
|---|---|---|
| Plain English prose | 0.75 | Standard tokenizer training data |
| Technical writing | 0.65 | More long words, jargon |
| Code (Python, JS) | 0.50 | Symbols and identifiers split heavily |
| Spanish/French | 0.55 | Different tokenizer coverage |
| Chinese/Japanese | 0.35 | Each character is often 1 token |
| Math/equations | 0.30 | Symbols are individual tokens |
| URLs and emails | 0.25 | Characters split heavily |
| JSON output | 0.45 | Brackets, commas, quotes all count |

If you're processing code or non-English text, your token count will be much higher than the English rule of thumb. For Chinese or Japanese, 1 word is often 2-3 tokens.
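To fold that table into an estimate, a small sketch like the following works. The dictionary keys are my own labels for the rows above, and the ratios are the same rough values, so treat the output as a ballpark figure.

```python
# Words-per-token ratios from the table above (approximate).
WORDS_PER_TOKEN = {
    "english_prose":    0.75,
    "technical":        0.65,
    "code":             0.50,
    "spanish_french":   0.55,
    "chinese_japanese": 0.35,
    "math":             0.30,
    "urls_emails":      0.25,
    "json":             0.45,
}

def estimate_tokens(word_count: int, content_type: str = "english_prose") -> int:
    """Estimate tokens from a word count, by content type."""
    return round(word_count / WORDS_PER_TOKEN[content_type])

print(estimate_tokens(1_000))          # ~1333 tokens of plain prose
print(estimate_tokens(1_000, "code"))  # ~2000 tokens of source code
```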

Tokenizer Differences Across Models

Each major LLM uses a slightly different tokenizer. The same 1,000-word English document tokenizes to different counts:

| Model | Tokenizer | Approx. tokens (1,000 English words) |
|---|---|---|
| GPT-4o, GPT-4.1 | o200k_base | ~1,250 |
| GPT-3.5, older GPT-4 | cl100k_base | ~1,300 |
| Claude (all) | Claude tokenizer | ~1,290 |
| Gemini | SentencePiece | ~1,280 |
| Llama 3, 4 | BPE | ~1,310 |
| DeepSeek | BPE variant | ~1,295 |

The variation is small — within 5% — but it can matter at scale. If you are budgeting for 1 million tokens, a 5% difference is 50,000 tokens, which adds up.
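For the two OpenAI encodings you can measure the gap yourself with tiktoken; the Claude and Gemini tokenizers are not bundled with it, so counting for those models means going through their own SDKs or APIs. A sketch:

```python
import tiktoken

# Any sample text works; longer samples give a more stable comparison.
text = "Tokenizers split the same document differently. " * 50

for name in ["cl100k_base", "o200k_base"]:
    enc = tiktoken.get_encoding(name)
    print(f"{name}: {len(enc.encode(text))} tokens")
```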

How to Count Tokens Without the Math

The fastest way is to use a free online token counter:

  1. Paste your text into the Token Counter
  2. See instant count — tokens, words, characters
  3. See estimated API cost across major models
  4. Adjust your text and watch the count update live

The counter runs entirely in your browser — your text never leaves your device, so it works for confidential prompts and proprietary content.

Count tokens for any text in your browser. Free, no signup.

Open Token Counter →

When the Difference Actually Matters

For most casual use, 1.3 tokens per word is fine. The difference matters when:

- You are budgeting API costs at scale, where a few percent compounds.
- You are sizing a long document against a hard context-window limit.
- You are trimming prompts to fit a token budget.
- You are working with code, non-English text, or structured output, where the ratio shifts sharply.

For everything else, the rule of thumb holds: 1.3 tokens per word, 0.75 words per token. Use the counter when you need precision and the rule when you need a rough estimate.
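As a closing example, here is the kind of back-of-envelope budgeting the rule of thumb is good for. The price constant is a placeholder, not a real rate; plug in your model's actual input-token price.

```python
TOKENS_PER_WORD = 1.3
PRICE_PER_MILLION = 3.00  # placeholder $/1M input tokens -- substitute yours

def estimate_cost(words_per_prompt: int, calls: int = 1) -> float:
    """Rough input cost for `calls` prompts of `words_per_prompt` words."""
    tokens = words_per_prompt * TOKENS_PER_WORD * calls
    return tokens / 1_000_000 * PRICE_PER_MILLION

# A 500-word prompt sent 10,000 times:
print(f"${estimate_cost(500, calls=10_000):.2f}")  # $19.50
```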
