Tokens & Context Windows

Created by admin | Last updated 1 month, 3 weeks ago
Technical Level: Intermediate

How AI models break text into pieces, and how much text they can "remember" at once.

Tokens:
- AI breaks text into chunks called "tokens"
- Roughly 4 characters per token (in English)
- "Hello world" ≈ 2 tokens
- Pricing is often per-token
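The ~4 characters/token rule can be sketched as a quick estimator. This is a rough heuristic only; real tokenizers (such as BPE-based ones) split text into subword units, so actual counts differ. The function name is illustrative, not from any library:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # Real tokenizers split on subword units, so actual counts vary;
    # a BPE tokenizer gives 2 tokens for "Hello world", not 3.
    return max(1, round(len(text) / 4))

print(estimate_tokens("Hello world"))  # 3 by this heuristic
```

Useful for ballpark cost estimates; for billing-accurate counts, use the provider's own tokenizer.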

Context window:
- How much the AI can "see" at once
- GPT-4 Turbo: ~128K tokens
- Claude 3: ~200K tokens
- Gemini 1.5 Pro: ~1M tokens

Why it matters:
- Long documents may need to be chunked
- Earlier conversation may be "forgotten"
- More context ≠ always better results
- Large contexts cost more to process
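Chunking a long document can be sketched as splitting it into overlapping character windows (the parameter values below are illustrative, not standard; by the ~4 chars/token rule, 400 characters is roughly 100 tokens):

```python
def chunk_text(text, chunk_chars=400, overlap=50):
    """Split text into overlapping character windows.

    The overlap preserves some context across chunk boundaries so a
    sentence cut in half by one chunk is whole in the next.
    """
    assert chunk_chars > overlap >= 0
    step = chunk_chars - overlap
    chunks, i = [], 0
    while i < len(text):
        chunks.append(text[i:i + chunk_chars])
        if i + chunk_chars >= len(text):
            break  # last window already covers the tail
        i += step
    return chunks
```

Real pipelines usually chunk on token or sentence boundaries rather than raw characters, but the windowing idea is the same.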

Practical tip: If the AI seems to forget earlier instructions, the conversation may have exceeded its effective context.



