Glossary

Token

A token is the fundamental unit of text that large language models use to process and generate language. Tokens are typically parts of words, whole words, or punctuation marks. On average, one token equals about 3-4 characters or roughly three-quarters of an English word. Models have a context window — a maximum number of tokens they can process at once. For example, Claude's context window can handle up to 200,000 tokens (roughly 150,000 words). Token count affects both the cost of API calls and the amount of context the model can consider.
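The character-based rule of thumb above can be turned into a quick back-of-the-envelope estimator. This is a minimal sketch, not a real tokenizer: the function name, the 4-characters-per-token ratio, and the example text are illustrative assumptions, and actual token counts vary by model and tokenizer.

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token-count estimate using the ~4-characters-per-token heuristic.

    Real tokenizers split on subword units, so this is only an approximation.
    """
    return max(1, round(len(text) / chars_per_token))


prompt = "Large language models process text as tokens."
print(estimate_tokens(prompt))  # 45 characters / 4 -> about 11 tokens
```

For exact counts you would use the tokenizer shipped with the specific model or its API, since different models segment the same text differently.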
