CertNova

Token Limits

noun

Definition

  1. Token limits define the maximum number of tokens an AI model can process in a single request, constraining how long and detailed the input data can be.

Example

Exceeding the token limit in a text generation task may cause the input to be rejected or silently truncated, producing incomplete or incoherent outputs.
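A minimal sketch of guarding against this: count tokens before sending a request and truncate if needed. The function below is hypothetical and uses whitespace splitting as a rough stand-in for a real tokenizer (production code would use the model provider's tokenizer, which counts subword tokens, not words).

```python
def truncate_to_limit(text: str, max_tokens: int) -> tuple[str, bool]:
    """Truncate text to at most max_tokens tokens.

    Uses naive whitespace tokenization as an approximation;
    real tokenizers produce subword tokens and may count more
    tokens than there are words.

    Returns the (possibly truncated) text and a flag indicating
    whether truncation occurred.
    """
    tokens = text.split()
    if len(tokens) <= max_tokens:
        return text, False
    return " ".join(tokens[:max_tokens]), True


prompt = "Summarize the following report in three sentences: " + "data " * 100
safe_prompt, was_truncated = truncate_to_limit(prompt, max_tokens=50)
if was_truncated:
    print("Warning: input exceeded the token limit and was truncated.")
```

Checking the count up front lets the application warn the user or split the input into chunks, rather than receiving a truncated response from the model.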
