Tokens in Large Language Models

Tokens are the fundamental units of text that LLMs process, created by breaking text into smaller components such as words, subwords, or individual characters. Understanding tokens matters for three practical reasons: models generate text by predicting the next token in a sequence, API usage is typically billed per token, and every model has a maximum context window, measured in tokens, that limits the combined length of input and output.
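The idea of splitting text into subword pieces can be illustrated with a toy greedy longest-match tokenizer over a small, hypothetical vocabulary. Real LLM tokenizers (e.g. BPE-based ones) learn their vocabularies from large corpora, but the matching behavior sketched here is similar:

```python
def tokenize(text, vocab):
    """Greedy longest-match subword tokenization over a toy vocabulary.

    Tries the longest possible piece at each position, falling back to
    a single character when nothing in the vocabulary matches.
    """
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab or j == i + 1:
                tokens.append(piece)
                i = j
                break
    return tokens

# Hypothetical subword vocabulary, chosen only for illustration.
vocab = {"token", "iz", "ation", "un", "believ", "able", " "}

print(tokenize("tokenization", vocab))  # ['token', 'iz', 'ation']
print(tokenize("unbelievable", vocab))  # ['un', 'believ', 'able']
```

Note how a single word can cost several tokens: "tokenization" here splits into three pieces, which is why token counts (and therefore API costs and context-window usage) are usually higher than raw word counts.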

Visit the following resources to learn more: