Tokens & Context Windows
Created by admin | Last updated 1 month, 3 weeks ago
How AI processes and "remembers" text.
Tokens:
- AI breaks text into chunks called "tokens"
- Roughly 4 characters per token (in English)
- "Hello world" ≈ 2 tokens
- Pricing is often per-token
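The ~4 characters/token rule of thumb above can be turned into a quick estimator. This is a sketch of the heuristic only; real tokenizers (e.g. OpenAI's `tiktoken`) give exact counts, and the heuristic can be off by a token or two on short strings:

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the ~4 characters/token rule of thumb.

    Only a ballpark for English text; an actual tokenizer counts
    "Hello world" as 2 tokens, while this heuristic says ~3.
    """
    return max(1, round(len(text) / 4))

print(estimate_tokens("Hello world"))  # 11 chars -> 3 by the heuristic
```

Use this for back-of-envelope cost or chunk-size planning, not for billing-accurate counts.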
Context window:
- How much the AI can "see" at once
- GPT-4 Turbo: ~128K tokens (the original GPT-4 shipped with 8K/32K variants)
- Claude 3: ~200K tokens
- Gemini 1.5 Pro: ~1M tokens (announced)
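Before sending a long prompt, it helps to check it against the model's limit, leaving headroom for the reply. A minimal sketch — the model names and limits in the table are illustrative placeholders based on the figures above, not an official API:

```python
# Illustrative limits (tokens); check your provider's docs for current values.
CONTEXT_LIMITS = {
    "gpt-4-turbo": 128_000,
    "claude-3": 200_000,
    "gemini-1.5-pro": 1_000_000,
}

def fits_in_context(prompt_tokens: int, model: str,
                    reserve_for_output: int = 4_000) -> bool:
    """True if the prompt leaves room in the window for the model's reply."""
    return prompt_tokens + reserve_for_output <= CONTEXT_LIMITS[model]

print(fits_in_context(150_000, "gpt-4-turbo"))  # False: over the 128K window
print(fits_in_context(150_000, "claude-3"))     # True: fits within 200K
```

Reserving output tokens matters because the window is shared between what you send and what the model generates.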
Why it matters:
- Long documents may need to be chunked
- Earlier conversation may be "forgotten"
- More context ≠ always better results
- Large contexts cost more to process
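The chunking point above can be sketched with the same character-based heuristic: split a long document on paragraph boundaries so each chunk fits a token budget. This is an assumption-laden sketch (fixed chars-per-token ratio, paragraph-delimited input), not a production splitter:

```python
def chunk_text(text: str, max_tokens: int = 2_000,
               chars_per_token: int = 4) -> list[str]:
    """Split text into chunks that each fit roughly within max_tokens.

    Splits on blank-line paragraph boundaries so chunks stay coherent.
    Note: a single paragraph longer than the budget still becomes one
    oversized chunk; a real splitter would also split within paragraphs.
    """
    max_chars = max_tokens * chars_per_token
    chunks, current = [], ""
    for para in text.split("\n\n"):
        # Start a new chunk if adding this paragraph would blow the budget.
        if current and len(current) + len(para) + 2 > max_chars:
            chunks.append(current.strip())
            current = ""
        current += para + "\n\n"
    if current.strip():
        chunks.append(current.strip())
    return chunks
```

Each chunk can then be summarized or queried separately, with the results combined in a final pass.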
Practical tip: If the AI seems to forget earlier instructions, the conversation may have exceeded its effective context.