Large language models can translate natural language into database queries, yet context window limitations prevent their direct deployment in reporting systems, where complete result sets exhaust the available tokens. The Model Context Protocol specification defines ResourceLink for referencing external resources, but practical patterns for building scalable reporting architectures on top of it remain undocumented. This paper presents patterns for building LLM-powered reporting systems that decouple query generation from data retrieval. We introduce a dual-response pattern that extends ResourceLink to support both iterative query refinement and out-of-band data access, accompanied by patterns for multi-tenant security and resource lifecycle management. Together, these patterns address fundamental challenges in LLM-driven reporting applications and provide practical guidance for the developers building them.
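As a rough illustration of the dual-response pattern summarized above, the sketch below shows a tool result that pairs a small inline row preview (so the model can iteratively refine its query) with a resource link the client can dereference out of band. The interfaces, the buildDualResponse helper, and the report:// URI scheme are illustrative assumptions, not the paper's implementation or the MCP SDK.

```typescript
// Hypothetical shapes approximating MCP content blocks; field names mirror the
// "text" and "resource_link" content types, but this is a sketch, not the SDK.
interface TextContent {
  type: "text";
  text: string;
}

interface ResourceLinkContent {
  type: "resource_link";
  uri: string;          // where the client can fetch the full result set
  name: string;
  mimeType?: string;
  description?: string;
}

type ToolContent = TextContent | ResourceLinkContent;

interface QueryResult {
  rows: Record<string, unknown>[];
  totalRowCount: number;
}

// Dual response: a small inline preview lets the model refine the query
// iteratively, while the resource link lets the client pull the complete
// dataset out of band without it ever entering the context window.
function buildDualResponse(
  result: QueryResult,
  resourceUri: string,   // assumed to be minted by the server per result set
  previewRows = 5
): ToolContent[] {
  const preview = result.rows.slice(0, previewRows);
  return [
    {
      type: "text",
      text:
        `Returned ${result.totalRowCount} rows. Preview of first ` +
        `${preview.length}:\n${JSON.stringify(preview, null, 2)}`,
    },
    {
      type: "resource_link",
      uri: resourceUri,
      name: "full-result-set",
      mimeType: "application/json",
      description: "Complete query results, retrievable out of band.",
    },
  ];
}

// Example usage with made-up data and a hypothetical URI scheme.
const content = buildDualResponse(
  { rows: [{ region: "EMEA", revenue: 1.2e6 }], totalRowCount: 48210 },
  "report://results/3f9c2a"
);
console.log(JSON.stringify(content, null, 2));
```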