Beyond AI Chat Agents with Theia AI

March 12, 2025 | 8 min read

When discussing AI integration in development environments, the conversation often centers on chat interfaces. While chat-based interactions with AI are powerful, they represent just one dimension of how AI can enhance your custom tools and specialized IDEs. In this article, we highlight how Theia AI – part of the open Eclipse Theia tool platform – enables tool builders to move beyond chat agents and create truly integrated, context-aware AI assistants that work seamlessly within custom tools and specialized IDEs.

If you don’t know Theia AI and the Eclipse Theia platform yet, please refer to the Theia AI introduction and learn about the Eclipse Theia IDE platform that allows you to build custom web-based tools and IDEs. To try it out firsthand, you can download the AI-powered Theia IDE built with Theia AI today.

Why We Need to Think Beyond the Chat Box

Many AI-powered tools are centered on a chat interface that allows users to interact with AI agents and LLMs. Theia AI likewise provides a sophisticated AI chat interface with impressive capabilities.

However, for most of the AI-powered custom tools and domain-specific tool environments we’re developing with our customers, the chat is just one way of interacting with an AI, and for many use cases, other interaction patterns are even more efficient and precise.

The Untapped Potential: AI That Lives and Breathes Within Your Tool

Integrating AI directly into the tool, beyond a chat interface, is an often heavily underestimated feature in terms of value and efficiency. Triggering AI requests as users interact with the tool—for instance, while they type or click a button—creates opportunities to support users without requiring verbose input, making AI feel less like a separate entity and more like a natural extension of the developer’s own capabilities.

The example below shows a GLSP diagram editor, a custom editor in which AI is seamlessly integrated to enhance the user experience. Instead of manually fixing validation errors, users can simply click “Fix with AI”, allowing the system to understand the context and apply corrections instantly. This demonstrates how AI can act as an invisible assistant, streamlining workflows without disrupting the developer’s focus.

As the video above shows, directly integrating AI into a custom editor unlocks several benefits:

  1. No verbose interaction: Instead of typing lengthy requests, the user just clicks “Fix with AI”.
  2. Excellent context: Triggering the AI directly from your custom editors or views allows you to carry valuable context along. A chat agent would instead need to apply advanced techniques to derive relevant context from the tool state, such as the workspace, open editors, and previous user actions. (For more on this, see our article on AI context management in domain-specific tools).
  3. No extra friction in applying the action: Because it lives directly in the editor, your AI agent can perform the specific, isolated change without requiring an extra step from the user. In a separate chat, making chat responses actionable requires extra steps: while Theia AI provides several means to simplify this, such as Change Sets or interactive chat response user interfaces, users still need to transfer suggestions into editors or project artifacts.

Compelling Use Cases for Direct AI Integration

Here are some typical scenarios where integrating AI directly into your tool can significantly enhance the user experience:

  • In-Editor Auto-completion: Provide intelligent suggestions while users make changes in textual or non-textual editors, such as diagrams or form-based interfaces.

  • Context-Aware Actions: Implement dedicated buttons to request specific AI assistance, such as fixing validation errors or reviewing artifacts for improvements.

  • Specialized View Assistants: Create targeted assistants that respond to short, precise commands in specific views, such as a Terminal assistant of the AI-powered Theia IDE that can help users formulate the exact command they need (see video below).
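The context-aware action pattern from the list above can be sketched in a few lines. The types below are hypothetical simplifications, not the real Theia or GLSP APIs; they only illustrate how a button-triggered command can gather context the tool already owns and apply the result in place:

```typescript
// Hypothetical, simplified types for illustration only.
interface Diagnostic { message: string; range: string; }

interface Editor {
    getDiagnostics(): Diagnostic[];
    getContent(): string;
    applyEdit(newContent: string): void;
}

interface FixAgent {
    fix(content: string, diagnostics: Diagnostic[]): Promise<string>;
}

// A context-aware action: the command already knows the editor state,
// so the user only clicks a button — no prompt writing, no copy/paste.
async function fixWithAi(editor: Editor, agent: FixAgent): Promise<void> {
    const diagnostics = editor.getDiagnostics();
    if (diagnostics.length === 0) {
        return; // nothing to fix
    }
    const fixed = await agent.fix(editor.getContent(), diagnostics);
    // The result is applied directly, instead of being pasted from a chat.
    editor.applyEdit(fixed);
}
```

Note how the command, not the user, assembles the context and applies the outcome; that is exactly what removes the verbosity and friction discussed above.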

Creating Custom AI Agents with Theia AI

In Theia AI, providing custom agents tailored for integration at specific points of a user flow, such as a custom editor, a menu item, or a custom view action, is remarkably straightforward. Essentially, an agent is just a class that provides a dedicated API to its clients (the view, editor, or command implementation calling the agent). It can therefore streamline its API for the most convenient use and take in the context that those clients can easily provide from their implementation.

Custom AI agents in Theia AI have full access to the entire Theia tool API, allowing them to retrieve relevant tool state information—such as access to open editors (including custom ones), the current workspace, or ongoing user interactions. Unlike agents built on the more restricted VS Code Extension API (see our comparison of VS Code vs Eclipse Theia), Theia AI agents can consequently go beyond giving simple text responses. They can be designed to actively trigger tool actions, modify project files, or even execute complex workflows. This deep integration makes them powerful embedded assistants, capable of providing context-aware suggestions, automating repetitive tasks, and streamlining tool workflows—seamlessly enhancing the user experience without interrupting the flow of work.
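To make this concrete, here is a minimal, self-contained sketch of such a non-chat agent. All interfaces and names below are hypothetical simplifications for illustration, not the actual Theia AI API:

```typescript
// Hypothetical, simplified stand-in for a language model interface.
interface LanguageModel {
    request(prompt: string): Promise<string>;
}

// Context that the client (a diagram editor) can provide trivially.
interface DiagramValidationError {
    elementId: string;
    message: string;
}

// The agent is just a class exposing a narrow, purpose-built API to its
// client, instead of a generic free-form chat entry point.
class DiagramFixAgent {
    readonly id = 'diagram-fix-agent';

    constructor(protected readonly lm: LanguageModel) {}

    async suggestFix(error: DiagramValidationError, serializedDiagram: string): Promise<string> {
        const prompt =
            `Fix the following validation error in element ${error.elementId}: ` +
            `${error.message}\nDiagram:\n${serializedDiagram}`;
        return this.lm.request(prompt);
    }
}
```

The client (e.g. a “Fix with AI” button) calls `suggestFix` with context it already owns; the user never has to formulate a request.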

In the following, we summarize the most important steps when defining non-chat agents. For more information, please visit the Theia AI documentation.

Step 1: Access Language Models

The first step in implementing your agent is to request access to a language model (note that the registry may not find a matching model, so callers should be prepared to handle an undefined result):

const llm = await this.languageModelRegistry.selectLanguageModel({
    agent: this.id,
    purpose: 'suggest-terminal-commands',
    identifier: 'openai/gpt-4o',
});

Step 2: Structure and Process LLM Responses

As these agents often need to post-process the LLM’s response to translate it into tool actions or structured information, they frequently use structured outputs. This capability enforces a specific response structure from the LLM, such as conforming to a JSON schema:

// Define the structured output type expected from the LLM
const Commands = z.object({
    commands: z.array(z.string()),
});

// Prepare the prompt for the language model
const llmRequest: LanguageModelRequest = {
    messages: [
        {
            actor: 'system',
            type: 'text',
            query: systemMessage
        },
        {
            actor: 'user',
            type: 'text',
            query: request
        }
    ],
    // Specify the expected response structure based on the aforementioned `Commands` schema
    response_format: {
        type: 'json_schema',
        json_schema: {
            name: 'terminal-commands',
            description: 'Suggested terminal commands based on the user request',
            schema: zodToJsonSchema(Commands)
        }
    }
};

// Send the request to the LLM
const result = await llm.request(llmRequest);

if (isLanguageModelParsedResponse(result)) {
    // model returned structured output
    const parsedResult = Commands.safeParse(result.parsed);
    if (parsedResult.success) {
        const response = JSON.stringify(parsedResult.data.commands);
        this.recordingService.recordResponse({ agentId: this.id, sessionId, requestId, response, ...result });
        return parsedResult.data.commands;
    }
}

This approach ensures that the LLM’s output is immediately usable as structured, typed data that your application can work with directly.
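To illustrate the same post-processing pattern without the Theia and zod dependencies, here is a self-contained sketch using a hand-written type guard. The response shape mirrors the `Commands` schema above, but everything else is our own simplification:

```typescript
// Expected shape, mirroring the `Commands` schema from the example above.
interface Commands {
    commands: string[];
}

// Type guard that checks the parsed JSON actually matches the schema.
function isCommands(value: unknown): value is Commands {
    return typeof value === 'object' && value !== null
        && Array.isArray((value as Commands).commands)
        && (value as Commands).commands.every(c => typeof c === 'string');
}

// Post-process a raw LLM response into typed data, falling back to an
// empty suggestion list if the model did not follow the schema.
function parseCommands(rawResponse: string): string[] {
    try {
        const parsed: unknown = JSON.parse(rawResponse);
        return isCommands(parsed) ? parsed.commands : [];
    } catch {
        return []; // not valid JSON at all
    }
}
```

Whether via zod or a hand-written guard, the key point is the same: the agent never hands unvalidated model output to the rest of the tool.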

Step 3 (Optional): Enabling Continuation in Chat

A powerful pattern is allowing users to continue an original request from within a specific context, such as a custom editor or view, in the chat interface. This transfers both the context and the original request to the chat session, enabling users to explore and elaborate on their request more freely:

commands.registerCommand(TransferToChatCommand, {
    execute: async () => {
        // creates a clean session
        const session = this.chatService.createSession(ChatAgentLocation.Panel, { focus: true });
        // submit a request
        await this.chatService.sendRequest(session.id, {
            text: '@YourAgent <your message that carries over the context>'
        });
    }
});

The video above demonstrates how a simple Continue in chat command in the terminal enables users to seamlessly transfer the entire terminal context, including its current state, into a new chat session with a single click. This allows them to request deeper explanations of an executed command, analyze its results, or ask more complex follow-up questions—all while preserving the original context for a smooth and intuitive experience.
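Stripped of the Theia services, the continuation pattern boils down to composing a chat request that carries the local context into a fresh session. The `ChatService` stand-in and the `@TerminalAgent` mention below are hypothetical simplifications, not the real Theia AI API:

```typescript
// Hypothetical, minimal stand-in for the chat service used above.
interface ChatService {
    createSession(): { id: string };
    sendRequest(sessionId: string, request: { text: string }): Promise<void>;
}

// Build the chat request so that it mentions the target agent and carries
// the captured context (here: recent terminal output) into the session.
async function continueInChat(chat: ChatService, terminalOutput: string): Promise<string> {
    const session = chat.createSession();
    await chat.sendRequest(session.id, {
        text: `@TerminalAgent Please explain this terminal output:\n${terminalOutput}`
    });
    return session.id;
}
```

The design choice worth noting: the context transfer happens in code, at the moment the user clicks, so nothing needs to be re-described or pasted by hand.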

Benefits of Direct AI Integration

By integrating AI capabilities directly into your tools beyond just chat interfaces, you unlock a whole new level of user experience and productivity. Here’s what you gain:

  1. Reduced Cognitive Load: Users don’t need to formulate verbose requests; instead, they can trigger AI assistance with familiar UI interactions.

  2. Contextual Precision: The AI has direct access to the exact context of the user’s work, eliminating the need to infer or reconstruct context.

  3. Seamless Workflow: AI suggestions and actions become a natural extension of the tool’s functionality rather than a separate interaction mode.

  4. Domain-Specific Assistance: Agents can be highly specialized for particular views, editors, or workflows, providing more targeted and efficient help.

Conclusion

While chat-based AI interfaces have become the standard, there are several use cases in almost all custom tools and IDEs that are better served when we move beyond the chat paradigm to integrate intelligence directly into every aspect of the tool experience. Theia AI provides the framework and flexibility to create these deep integrations, enabling tool builders to craft truly intelligent, context-aware assistants that enhance productivity without disrupting workflow.

Think of it as the difference between having a helpful colleague you have to call over every time you need assistance versus having that same expertise magically embedded within your tools themselves. By implementing AI capabilities that respond to natural user interactions within the tool itself, we can create more intuitive, efficient, and powerful experiences that feel less like interacting with an AI assistant and more like working with a tool that simply understands what you need.

Learn More

To explore more about Theia AI and its capabilities:

Interested in implementing Theia AI in your tools? EclipseSource provides consulting and implementation services backed by our extensive experience in tool development. We also specialize in tailored AI assistance for web- and cloud-based tools and professional support for Eclipse Theia and VS Code.

Get in touch with us to discuss your specific use case!

Jonas, Maximilian & Philip

Jonas Helming, Maximilian Koegel and Philip Langer co-lead EclipseSource, specializing in consulting and engineering innovative, customized tools and IDEs, with a strong …