Why Theia supports any LLM!

February 27, 2025 | 8 min read

The landscape of large language models (LLMs) is rapidly evolving, with numerous models available and new ones released every day. These models can be accessed through cloud-based services, deployed on premise, or even run locally. With Theia AI and the AI-powered Theia IDE, we give users and adopters complete freedom to choose their preferred hosting solution and LLM. Whether they require a cloud-based API, a self-hosted enterprise-grade model, or a local instance, Theia AI and the AI-powered Theia IDE are designed to accommodate any scenario.

If you don’t know Theia AI or the AI-powered Theia IDE yet, visit the Theia AI introduction and the AI-powered Theia IDE overview, and download the AI-powered Theia IDE here.

Why Universal LLM Compatibility Matters for Users of the AI-powered Theia IDE

The AI-powered Theia IDE ships with a comprehensive set of LLM providers and preconfigured LLMs, including industry leaders like OpenAI, Anthropic, and Azure; OpenAI-compatible APIs; and local options such as Ollama. See the LLM Provider Documentation for more information. Theia’s AI support goes beyond a fixed list of supported models: users can configure additional LLMs directly in the IDE at runtime, ensuring access even to the newest models released after your Theia IDE version. This flexibility extends to granular control, allowing different LLMs to be assigned to specific IDE functions like code completion, terminal assistance, and chat interfaces. By matching the right model to each task, users can optimize their development experience with their preferred hosting options and models. See how our AI-powered IDE seamlessly integrates with various options, including those that didn’t exist when the latest Theia IDE was released:

👉 Using StarCoder for AI-Powered Auto Completion in the Theia IDE

👉 Integrating DeepSeek into the Theia IDE and Theia AI

👉 Theia AI and Theia IDE support o3-mini and o1
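To illustrate this granular control, here is a rough sketch of how different models could be assigned to different AI agents via the Theia IDE’s settings. The preference key, agent names, and model identifiers below are illustrative examples only; the exact schema may differ between Theia versions, so consult the LLM Provider Documentation for the authoritative format:

```json
{
  "ai-features.agentSettings": {
    "Code Completion": {
      "languageModelRequirements": [
        { "purpose": "code-completion", "identifier": "ollama/starcoder2" }
      ]
    },
    "Universal": {
      "languageModelRequirements": [
        { "purpose": "chat", "identifier": "openai/o3-mini" }
      ]
    }
  }
}
```

With a mapping like this, a lightweight local model can handle latency-sensitive code completion while a more capable hosted model powers the chat agent.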

For users, the ability to choose any LLM for each individual AI capability in the IDE provides significant advantages:

  1. Avoiding Vendor Lock-in – Users should not be tied to specific subscriptions or pricing models. They must retain the flexibility to switch providers based on their needs without having to switch to an entirely different IDE product.

  2. Privacy, Security and Enterprise Compliance – Some users and organizations work in environments where sensitive data cannot be sent to third-party cloud services. For these cases, local or self-hosted models provide a crucial alternative.

  3. Optimal Model Selection – Different LLMs have different strengths. Users need the ability to select the best LLM for their specific use cases. Additionally, as new models emerge, users should be able to adopt them quickly without being restricted by rigid tooling.

  4. Future-proofing AI Integration – New LLMs are released frequently. The Theia IDE was among the first to support models like OpenAI’s o3-mini and DeepSeek, simply because users can configure their LLMs dynamically.

While the Theia IDE enables connections to various models (e.g., Hugging Face, custom OpenAI-compatible models, LlamaFile), not every model works out of the box; some require specific customizations or optimizations. If you encounter issues, please provide feedback, keeping in mind that this is an alpha-phase feature. As the AI-powered Theia IDE is fully open source, we rely on the power of our community to support as many LLM options as users and contributors desire!

Why This Matters for Tool Builders Adopting the Theia AI Framework

Theia AI is a comprehensive framework, as part of the Theia tool platform, designed for building custom AI-powered development tools and specialized IDEs. While the AI-powered Theia IDE represents one public adoption of this framework, Theia AI’s core strength lies in enabling organizations to create domain-specific tools and specialized IDEs with embedded AI capabilities. For these tool builders, flexible LLM support delivers strategic advantages—even in scenarios where their end-users have no control over the underlying AI models.

As a tool builder leveraging Theia AI, your freedom to select and control LLMs provides critical benefits:

  1. Strategic Provider Independence – While you may optimize your tool for specific models, Theia AI ensures you’re never locked into a single vendor. This independence protects your investment against provider policy changes, price increases, or service disruptions.

  2. Custom Model Integration – Deploy proprietary fine-tuned models optimized for your industry’s terminology, workflows, and requirements. Theia AI’s flexibility allows seamless integration of these specialized models alongside general-purpose LLMs.

  3. Complete Business Model Control – Manage your own subscription and pricing structure without dependency on third-party LLM provider billing. This enables various monetization approaches, from bundled AI features to premium AI-powered capabilities.

  4. Data Governance and Sovereignty – Address enterprise security requirements by controlling where AI data processing occurs, enabling on-premises solutions or region-specific deployments to meet strict compliance standards.

  5. Version Control and Consistency – Pin your application to specific model versions to ensure predictable performance and behavior, particularly important for tools requiring regulatory compliance or consistent outputs.

  6. Performance Optimization – Select different specialized models for different tool features, using lightweight models for real-time assistance and more powerful models for complex tasks.

  7. Future-Proof Architecture – Rapidly integrate emerging models and capabilities without architectural changes, ensuring your tool remains competitive as AI technology evolves.

This flexibility empowers you to build sophisticated AI-enhanced tools while maintaining complete control over the intelligence that powers them.

How Theia AI Supports Various LLMs and Providers

A common question is whether Theia AI truly supports any LLM and any provider. The conceptual answer is yes, but practical implementation depends on two structured layers:

Supporting LLM Providers

Theia AI operates with an LLM provider architecture, where each provider supports a specific API. For instance, Theia AI currently includes built-in support for the following providers/APIs, which have been contributed by the community (see the LLM provider documentation for more details):

  • OpenAI official API

  • OpenAI-compatible API (e.g. Azure or DeepSeek)

  • Anthropic API

  • Hugging Face API

  • Ollama

  • LlamaFile

Support for Gemini is on the roadmap. Through these provider integrations, users of the Theia IDE and tool builders using Theia AI can select any compatible LLM with simple configuration changes. While Theia AI does not natively support every provider out of the box, the system is designed to be extensible: if the provider you need is not yet supported, you can easily add an LLM provider for it. This can be done as a public contribution to Theia AI, or within your proprietary tool repository (e.g. if you add support for a proprietary API).
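To make the provider concept more concrete, here is a heavily simplified TypeScript sketch of what a custom LLM provider could look like. The real interfaces live in Theia AI’s framework packages and differ in detail; the names `LanguageModel`, `AcmeLanguageModel`, and the registry below are illustrative assumptions, not the actual Theia API:

```typescript
// Heavily simplified sketch of Theia AI's provider concept.
// All names below are illustrative assumptions, not the real API.

interface LanguageModelRequest {
  messages: { role: "system" | "user" | "assistant"; content: string }[];
}

interface LanguageModelResponse {
  text: string;
}

// A language model is identified by an id and can answer requests.
interface LanguageModel {
  readonly id: string; // e.g. "acme/secure-llm"
  request(req: LanguageModelRequest): Promise<LanguageModelResponse>;
}

// Provider for a hypothetical proprietary API. The transport function
// abstracts the actual HTTP call, which keeps this sketch self-contained.
class AcmeLanguageModel implements LanguageModel {
  constructor(
    readonly id: string,
    private readonly transport: (body: string) => Promise<string>
  ) {}

  async request(req: LanguageModelRequest): Promise<LanguageModelResponse> {
    // A real provider would POST req.messages to its endpoint and map
    // the provider-specific response onto LanguageModelResponse.
    const completion = await this.transport(JSON.stringify(req.messages));
    return { text: completion };
  }
}

// A registry keyed by model id: agents can then be pointed at any
// registered model purely through configuration.
const registry = new Map<string, LanguageModel>();
registry.set(
  "acme/secure-llm",
  new AcmeLanguageModel("acme/secure-llm", async body => `received ${body.length} bytes`)
);
```

Keeping the transport pluggable is a deliberate choice here: it makes the provider easy to test and to swap between cloud, self-hosted, and local endpoints without touching the rest of the tool.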

Supporting Specific LLMs

Most LLMs within a given API work without modification. However, some models require specific settings, such as:

  • Custom stop words or token length adjustments.

  • Variations in system message formatting (e.g., OpenAI’s new “developer” role vs. Azure’s legacy “system” role).

Thanks to Theia AI’s extensibility, most of these parameters can be configured at runtime without any code changes. If a setting is found to be missing, it can easily be added to the open source framework.
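As a hedged example, configuring such model-specific settings for a self-hosted OpenAI-compatible model might look roughly like the following. The preference key and field names are illustrative; check the LLM provider documentation for the exact schema supported by your Theia version:

```json
{
  "ai-features.openAiCustom.customOpenAiModels": [
    {
      "model": "my-internal-model",
      "url": "https://llm.example.internal/v1",
      "id": "internal/my-internal-model",
      "developerMessageSettings": "system"
    }
  ]
}
```

A field like `developerMessageSettings` would address exactly the system-message variation mentioned above, telling the provider whether to send instructions under the “developer” or the legacy “system” role.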

The more configuration options we support today, the higher the likelihood that future LLMs will work seamlessly out of the box. While we cannot guarantee immediate support for every new LLM, Theia AI’s open-source nature ensures that the community can rapidly contribute integrations for the most requested models and providers. So if you find something missing, please provide feedback and consider a contribution! EclipseSource also provides:

👉 Sponsored Open Source Development for Theia and Theia AI (e.g. to add missing LLM providers)

Another dimension of integrating new LLMs is, of course, that they might require tweaks to your existing prompts. Luckily, Theia AI provides a flexible prompt management system. In the Theia IDE, we even allow users to access and modify all prompts. See, for example, how to adapt code completion in the AI-powered Theia IDE to StarCoder.
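As an example of such a prompt tweak: fill-in-the-middle code models like StarCoder expect their special FIM tokens in the completion prompt. An adapted code-completion prompt template could then look roughly like this, where `{{prefix}}` and `{{suffix}}` stand for the text before and after the cursor (the actual template variable names in Theia may differ):

```
<fim_prefix>{{prefix}}<fim_suffix>{{suffix}}<fim_middle>
```

The model then generates the text that belongs between prefix and suffix, which the IDE inserts at the cursor position.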

Conclusion

Theia AI and the AI-powered Theia IDE prioritize flexibility and user control when integrating LLMs. Whether for individual users seeking privacy and vendor independence or tool builders designing their own AI-driven solutions, Theia AI ensures complete freedom in selecting and configuring AI models. With a community-driven approach and an extensible architecture, Theia AI is prepared to support both today’s and tomorrow’s AI landscape.

Of course, every new model brings unique characteristics that may require prompt adjustments, workflow refinements, or adjusted response handling to achieve optimal results. Theia AI simplifies this process, empowering users to experiment, adapt, and fine-tune their setups effortlessly.

Now is the perfect time to discover Theia AI’s capabilities or customize your environment for specific LLMs. Whether you’re developing AI-powered applications, enhancing Theia IDE, or contributing to open-source initiatives, Theia AI delivers the flexibility and control needed to harness AI effectively.

Learn more in our blog posts: “Introducing AI Support in Theia IDE” and “Introducing Theia AI” (the framework powering tool builders)

If you are interested in building custom AI-powered tools, EclipseSource provides consulting and implementation services backed by our extensive experience with successful AI tool projects. We also specialize in web- and cloud-based tools and support for popular platforms like Eclipse Theia and VS Code.

👉 Get in touch with us to learn more.

Jonas, Maximilian & Philip

Jonas Helming, Maximilian Koegel and Philip Langer co-lead EclipseSource, specializing in consulting and engineering innovative, customized tools and IDEs, with a strong …