LLM Context Window

An LLM Context Window is an LLM operational parameter that defines the maximum number of tokens a large language model can process in a single LLM inference request, typically counting both the input prompt and the generated output.
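
The following is a minimal sketch of how a caller might check whether a request fits within a model's context window, assuming a hypothetical limit of 8,192 tokens and the tiktoken tokenizer; the constant, encoding choice, and function name are illustrative, not tied to any particular model.

```python
import tiktoken

# Assumed context window size for illustration only; actual limits
# vary by model (e.g. 4k, 8k, 128k tokens).
CONTEXT_WINDOW = 8192

def fits_in_context(prompt: str, max_output_tokens: int,
                    encoding_name: str = "cl100k_base") -> bool:
    """Check whether a prompt plus a reserved output budget fits the window."""
    encoding = tiktoken.get_encoding(encoding_name)
    prompt_tokens = len(encoding.encode(prompt))
    # The window must accommodate both the input prompt and the tokens
    # the model is allowed to generate in the same inference request.
    return prompt_tokens + max_output_tokens <= CONTEXT_WINDOW

# Example usage: reserve 512 tokens for the model's answer.
print(fits_in_context("Summarize the following document: ...", max_output_tokens=512))
```

In practice, callers that exceed the window must truncate or summarize the input, or split it across multiple inference requests.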