What is a Context Window?
The context window is the maximum number of tokens an LLM can process in a single request, covering both the input prompt and the generated output. Claude's context window is 200K tokens (~150K words); GPT-4 Turbo's is 128K. Larger windows let agents "remember" more conversation history and reference material, but processing more tokens costs more.
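In practice, agents must trim older messages once a conversation outgrows the window. A minimal sketch of that idea, assuming a rough heuristic of ~4 characters per token (a common approximation for English text, not an exact tokenizer):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token (heuristic, not exact)."""
    return max(1, len(text) // 4)

def trim_history(messages: list[str], budget: int) -> list[str]:
    """Keep the newest messages whose combined token estimate fits the budget."""
    kept, used = [], 0
    for msg in reversed(messages):        # walk from newest to oldest
        cost = estimate_tokens(msg)
        if used + cost > budget:          # next message would exceed the budget
            break
        kept.append(msg)
        used += cost
    return list(reversed(kept))           # restore chronological order

history = ["hello " * 100, "short question", "another short reply"]
# The long first message is dropped; the two recent short ones fit in 20 tokens.
print(trim_history(history, budget=20))
```

Real agent frameworks use the model's actual tokenizer for the count, but the budgeting logic is the same: newest messages are kept, oldest are evicted first.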