Token

AI models like ChatGPT don’t read full sentences the way humans do. Instead, they break input into tokens. For example, the sentence:

“ChatGPT is smart.”

might be split into the tokens "Chat", "G", "PT", " is", " smart", and "." (the exact split depends on the model's tokenizer).
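To make that splitting concrete, here is a minimal sketch using OpenAI's open-source tiktoken library (an assumption on our part; the entry above doesn't name a tokenizer, and different models split text differently):

```python
# pip install tiktoken
import tiktoken

# cl100k_base is the encoding used by GPT-3.5/GPT-4-era OpenAI models
# (assumed here for illustration; other models use other encodings).
enc = tiktoken.get_encoding("cl100k_base")

text = "ChatGPT is smart."
token_ids = enc.encode(text)

# Decode each token id individually to see the text pieces the sentence
# was split into. The exact pieces vary by tokenizer.
pieces = [enc.decode([tid]) for tid in token_ids]
print(token_ids)
print(pieces)
```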

The model uses these tokens to understand and generate text. Most models have a token limit, often called a context window, which caps how much text they can process at once. As a rough rule of thumb, 1,000 tokens is about 750 words of English.
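Counting tokens before sending a prompt is the practical payoff. The sketch below (again assuming tiktoken, with an illustrative 8,000-token limit, since real limits vary by model) checks whether a prompt fits and applies the rough 750-words-per-1,000-tokens rule:

```python
# pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

TOKEN_LIMIT = 8000  # illustrative limit; actual limits differ by model


def fits_in_context(prompt: str, limit: int = TOKEN_LIMIT) -> bool:
    """Return True if the prompt's token count is within the limit."""
    n_tokens = len(enc.encode(prompt))
    # Rule of thumb from above: 1,000 tokens is roughly 750 English words.
    print(f"{n_tokens} tokens (~{int(n_tokens * 0.75)} words)")
    return n_tokens <= limit


fits_in_context("ChatGPT is smart.")
```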

Understanding tokens helps explain why long prompts sometimes get cut off or why models occasionally forget earlier parts of a conversation.
