Enhancing Your Tools with Chat Context in Theia AI

March 4, 2025 | 7 min Read

In today’s AI-powered tools, giving users the ability to scope their requests with relevant context is essential. Whether it’s a file, a symbol, a hardware component, or any other domain-specific element of your tool, letting users attach specific context helps the AI generate more accurate and timely responses. With chat context variables in Theia AI, tool builders can offer a seamless way for users to provide essential context—whether by dragging items from custom editors or by receiving smart suggestions as they type. This feature not only boosts the precision of AI responses without relying on advanced, time-consuming context retrieval methods, but also makes it more convenient for users to specify exactly what they are working on.

Attaching Context Elements in the AI Chat

Advanced context retrieval methods like workspace maps or retrieval-augmented generation (RAG) can extract relevant context based solely on the user query and tool state (see also context management in domain-specific tools). However, enabling users to directly specify important context remains crucial for many use cases. User involvement in context selection offers several advantages: faster response times by reducing the need for heavy context retrieval mechanisms, and improved precision since user-specified context is reliable and minimizes ambiguity. For instance, querying the entire workspace becomes unnecessary when users can directly indicate the relevant files they already know. For certain types of applications, allowing users to manually select the context provides transparency about what information the LLM will receive, which in our experience increases user acceptance and helps them optimize their prompts for better results.

To maximize benefits, making context attachment simple and convenient for users is crucial. Theia AI provides tool builders with several integrated mechanisms for users to add context elements:

  • Drag and drop: Users can simply drag elements from any view or custom editor directly into the chat
  • Guided selection: Users can select from a list of context element types and then search for the appropriate element of that type
  • Smart suggestions: Users receive matching context element suggestions via auto-completion while typing in the chat input

Processing the Attached Context

Once the user has attached context elements, Theia AI provides a flexible framework for the agent to decide how the attached context data is processed. Common processing approaches include:

  1. Summarization: The agent may summarize the provided context (e.g., listing file names) in its communication with the LLM.
  2. Context Window Management: The agent may decide how much context to include based on dynamic context management—adding the entire context if it fits, applying ranking/summarization if it’s too large, or using multi-turn prompt flows to incrementally identify and refine the relevant parts.
  3. On-Demand Retrieval: Instead of sending or choosing the relevant context upfront, the agent may expose tool functions so that the LLM can fetch specific elements when needed autonomously.

This flexibility allows tool providers to implement agents with context processing mechanisms ranging from simple to very sophisticated, tailored to the needs of the specific tool.
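
As a rough illustration of the second approach, dynamic context window management can start out as a simple size check. The following sketch is purely illustrative; the element shape, the character budget, and the label-only summary are assumptions rather than part of the Theia AI API:

```typescript
// Illustrative sketch of context window management in an agent.
// ContextElement, MAX_CONTEXT_CHARS, and the label-only summary are assumptions
// for demonstration and not part of the Theia AI API.
interface ContextElement {
    label: string;    // short human-readable identifier, e.g. a file path
    content: string;  // full content of the element
}

const MAX_CONTEXT_CHARS = 16_000; // rough budget; a real agent would count tokens

function prepareContext(elements: ContextElement[]): string {
    const full = elements.map(e => e.content).join('\n\n');
    if (full.length <= MAX_CONTEXT_CHARS) {
        return full; // fits: include the entire context verbatim
    }
    // too large: fall back to a summary, here simply the list of labels
    return 'Attached context (content omitted):\n'
        + elements.map(e => `- ${e.label}`).join('\n');
}
```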

What This Means for Tool Builders

At the heart of Theia AI’s flexibility lies the generic context variable mechanism. Theia AI empowers tool builders to define custom context element types—whether those are files, symbols, or specialized domain concepts like configuration blocks, components or model elements of their domain-specific modeling language.

When introducing custom context variables, tool builders have full control over:

  • Which context elements are available: Define items such as files, symbols, or custom domain-specific elements.
  • How context is displayed: Customize label providers to control what the user sees.
  • User support during selection: Implement quick pick dialogs, drag-and-drop support, and auto-completion.
  • The context integration strategy: Control how the data is incorporated into the LLM communication.
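
To make this concrete, a custom context element type for, say, a hardware component could be declared roughly as follows. The field names mirror the variable pattern described in the Theia AI documentation, but treat them as assumptions rather than a verbatim API reference:

```typescript
// Hypothetical custom context variable for a domain-specific element type.
// Field names follow the variable pattern described in the Theia AI documentation;
// treat them as assumptions, not a verbatim API reference.
export const COMPONENT_VARIABLE = {
    id: 'component',
    name: 'component',        // users reference it as #component in the chat input
    description: 'Attaches a hardware component definition to the chat context',
    isContextVariable: true   // attachable to the chat request context, not just the prompt
} as const;
```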

Implementing User-friendly Context Selection

Theia AI simplifies the development of user-friendly context selection, even for tool-specific context element types. Implementing the following contribution points is all that’s required:

  • Variable Provider: Register a provider that resolves both a value (inserted into the chat prompt) and a contextValue (attached to the chat request)
  • Drag-and-Drop Handler: Register a handler enabling users to drag files directly from various views into the chat
  • Auto-Completion and Quick Pick Providers: Register providers that add auto-completion suggestions to the chat input as users type and guide users in selecting context elements by context element type (e.g., opening a quick pick dialog when a user enters #file in the chat input)
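
As a rough sketch, a variable provider for the hypothetical COMPONENT_VARIABLE introduced above could look like the following; the contribution and service names follow the Theia AI documentation, while the exact signatures, import paths, and the loadComponentDefinition helper are assumptions:

```typescript
// Sketch of a variable provider; exact interfaces and import paths are assumptions.
import { injectable } from '@theia/core/shared/inversify';
import { AIVariableContribution, AIVariableService } from '@theia/ai-core';

@injectable()
export class ComponentVariableContribution implements AIVariableContribution {
    registerVariables(service: AIVariableService): void {
        service.registerResolver(COMPONENT_VARIABLE, {
            // priority: higher numbers win if multiple resolvers can handle the variable
            canResolve: () => 1,
            resolve: async request => ({
                variable: COMPONENT_VARIABLE,
                // value replaces '#component:<id>' in the user message (the identifier)
                value: request.arg ?? '',
                // contextValue is attached to the chat request for the agent to process
                contextValue: await loadComponentDefinition(request.arg)
            })
        });
    }
}

// Hypothetical helper that looks up the component definition in the tool's own model.
async function loadComponentDefinition(id?: string): Promise<string> {
    return `<definition of component ${id}>`;
}
```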

Please refer to the Theia AI documentation for more details on these contribution points, including concrete code examples.

Accessing the Attached Context in Your Agents

When users submit requests with selected context, Theia AI attaches it to the chat request model at request.context.variables. Your chat agents can then process this context data in several ways:

  1. Programmatic processing: Chat agents can access context directly from the request parameter provided during each invocation. This is useful when agents need to evaluate context characteristics (such as size) to determine whether to include it entirely or provide only a summary to the LLM.

  2. Using variables: Chat agents can incorporate context elements using Theia AI framework variables like #contextSummary or #contextDetails in their system message. Theia AI resolves these to either a context summary or their verbatim content.

  3. Using tool calls: Since context elements are available in request models (accessible to tool call handlers), agents can add tools to their LLM requests for on-demand context retrieval. Agents can leverage Theia AI’s ~context_ListChatContext and ~context_ResolveChatContext tools for this approach.

These approaches can be combined flexibly. A typical usage involves checking context size to determine whether to include the context directly using #contextDetails (for smaller context) or provide #contextSummary with the ~context_ResolveChatContext tool, allowing the LLM to decide which context elements to resolve on demand.
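
A minimal sketch of that decision inside an agent, assuming the request shape described above (the threshold and strategy names are illustrative, not prescribed by Theia AI):

```typescript
// Sketch of the size-based decision described above. The request shape
// (request.context.variables with value/contextValue) follows this post;
// the threshold and strategy names are illustrative assumptions.
type ContextStrategy = 'inline-details' | 'summary-plus-tool';

interface ResolvedContextVariable {
    value?: string;
    contextValue?: string;
}

function chooseContextStrategy(
    variables: ResolvedContextVariable[],
    maxChars = 8_000
): ContextStrategy {
    const totalSize = variables
        .map(v => (v.contextValue ?? v.value ?? '').length)
        .reduce((sum, len) => sum + len, 0);

    // Small context: reference #contextDetails in the system message to include it verbatim.
    // Large context: reference #contextSummary and register ~context_ResolveChatContext as a
    // tool so the LLM can fetch individual elements on demand.
    return totalSize <= maxChars ? 'inline-details' : 'summary-plus-tool';
}
```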

An important capability is supporting user references to attached context. For example, when a user attaches multiple files and asks to refactor one of them in their user message, Theia AI’s variable context system proves valuable. A context variable provides both a value (replacing the variable in the user input) and a contextValue (available only in the request context). We recommend providing a context element identifier in the value. For files (where the path serves as identifier), a user request like Refactor #file:src/implementation.ts would add the file content to the context while resolving the user message to Refactor src/implementation.ts. The LLM can then put special emphasis on the context element referenced directly in the user message and access its full content using the path identifier via the ~context_ResolveChatContext tool.
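
To illustrate this recommendation for the file case, a resolved file variable might look roughly like this; the object shape follows the value/contextValue description above, and readFileContent is a hypothetical helper of your tool:

```typescript
// Illustrative resolution of '#file:src/implementation.ts'; shapes and helpers are assumptions.
async function resolveFileVariable(path: string) {
    return {
        variable: { id: 'file', name: 'file' },
        // value replaces the variable in the user message:
        // 'Refactor #file:src/implementation.ts' resolves to 'Refactor src/implementation.ts'
        value: path,
        // contextValue is attached to the request context only and carries the full file
        // content, retrievable later via the path identifier and ~context_ResolveChatContext
        contextValue: await readFileContent(path)
    };
}

// Hypothetical helper; a real Theia extension would use the workspace file service here.
async function readFileContent(path: string): Promise<string> {
    return `<content of ${path}>`;
}
```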

This approach enables users to efficiently add context and reference it in their messages while allowing agents to leverage built-in mechanisms like variables and tool functions to effectively transfer and process context for the LLM.

For more information on writing custom agents or using variables and tool functions in LLM requests, please refer to the Theia AI documentation.

Final Thoughts

Chat context variables in Theia AI simplify integrating user-provided context directly into AI chat requests. By allowing users to drag files, symbols, or other elements into the chat or receive smart suggestions while typing, you can significantly improve both the precision of AI responses and the overall user experience. Theia AI gives you full control over which context element types are available, how they’re displayed, and how users interact with them—making it a powerful mechanism for building user-friendly, AI-enhanced tools.

Learn More

To learn more about Theia AI, check out:

Interested in building AI-powered tools? EclipseSource provides consulting and implementation services backed by our extensive experience in tool development. We also specialize in tailored AI assistance for web- and cloud-based tools and professional support for Eclipse Theia and VS Code.

Get in touch with us to discuss your specific use case!

Jonas, Maximilian & Philip

Jonas Helming, Maximilian Koegel and Philip Langer co-lead EclipseSource, specializing in consulting and engineering innovative, customized tools and IDEs, with a strong …