
Observe is an AI-powered observability platform engineered for scale. It delivers better performance, faster troubleshooting, and greater cost efficiency by correlating logs, metrics, and traces in an open data lake.
At Observe, we want as many users as possible to have access to our powerful query and scaling capabilities in their own AI experiences, whether it’s coding tools like Augment or Cursor, or more general clients like Claude or Gemini. Model Context Protocol (MCP) servers are the emerging standard to achieve this via a pluggable mechanism, and we’re very excited to make the Observe MCP server available to broaden access to Observe.
An MCP server acts as a bridge between AI agents and external resources such as databases, APIs, or other tools. In the case of Observe’s MCP server, it lets your agents access your observability data so they can better accomplish their tasks. Typically that means gathering more context to quickly investigate errors, generate business insights, or even produce code changes.
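To make that concrete, here is a rough sketch of what wiring a custom agent to a remote MCP server can look like with the official MCP Python SDK. The endpoint URL and token handling below are placeholders for illustration, not Observe’s documented interface:

```python
# Minimal sketch, assuming a streamable-HTTP MCP endpoint and a bearer token.
# The URL and token below are placeholders, not Observe's actual values.
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

OBSERVE_MCP_URL = "https://<your-tenant>.observeinc.com/mcp"  # hypothetical endpoint
API_TOKEN = "..."  # credentials issued for your Observe tenant

async def main() -> None:
    async with streamablehttp_client(
        OBSERVE_MCP_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
    ) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # Discover the tools the server exposes; the agent can then invoke
            # them with session.call_tool(name, arguments) as it works on a task.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)

asyncio.run(main())
```

Most users won’t write this by hand; clients like Claude or Cursor handle the connection once the server is added to their configuration.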
We’ve looked at many MCP server implementations in the wild and found that most companies simply wrap their existing APIs behind tools in their MCP server and call it done. While that’s an acceptable starting point, the approach can prove both brittle and inflexible. We decided to implement something much more expansive.
What makes Observe’s MCP server different?
AI First
Rather than simply exposing our API calls and results behind a series of MCP tools, we started from the ground up by designing AI-first tools that would supercharge our MCP server. In our initial iterations, our team wasn’t satisfied with the results we were getting; we were missing a piece of the puzzle.
While an experienced SRE knows intuitively what to search for and where, AI agents with less intuition were not as quick to deliver. We found that for agents to successfully leverage Observe, they needed more context. This is why our MCP server uses the Knowledge Graph, Observe’s AI-first vector store, to help agents quickly gather more context from the massive volume of observability data in the data lake.
The Observe Knowledge Graph
Behind the scenes, Observe maintains a Knowledge Graph of “things” and the relationships between them for each customer. People frequently ask questions about the things in their infrastructure or business that they want to observe and monitor, whether it’s the state of a deployment or pod, how daily active users are changing, or the correlations between them.
For agents to effectively answer these questions, they need access to those definitions as well as their relationships. The Knowledge Graph indexes both the definitions and their relationships, enriched with extra metadata such as rich text descriptions, data shapes, field definitions, metric tags, and use cases, allowing the right data to be surfaced when the agent needs it.
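As a purely illustrative example (this is not Observe’s actual schema), a single indexed entry might carry something like the following shape:

```python
# Hypothetical shape of one Knowledge Graph entry, loosely following the
# enrichment fields described above; invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class GraphEntry:
    name: str                     # the "thing" being observed, e.g. a dataset or entity type
    description: str              # rich text description used for semantic retrieval
    fields: dict[str, str]        # field name -> type, i.e. the data shape
    metric_tags: list[str] = field(default_factory=list)
    use_cases: list[str] = field(default_factory=list)
    related_to: list[str] = field(default_factory=list)  # edges to other entries

pod_entry = GraphEntry(
    name="Kubernetes Pod",
    description="Pods running in each cluster, with status, restarts, and owner references.",
    fields={"cluster_uid": "string", "namespace": "string", "status": "string"},
    metric_tags=["kubernetes", "workload"],
    use_cases=["deployment health", "crash-loop investigation"],
    related_to=["Kubernetes Cluster", "Container Logs"],
)
```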
Whether a tool sits inside a toolchain that spans multiple MCP servers, or is simply facing ambiguity in a user’s prompt, we leverage this graph to get the right information for query generation. For instance, when someone says “prod” in the context of a Kubernetes cluster, we can identify the right cluster uid. As more tools and data are made available to an agent, knowing how to turn phrases and text outside of our control into useful information is foundational.
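Conceptually, that resolution step looks something like the toy lookup below. A real implementation would lean on the Knowledge Graph’s vector search rather than a hand-rolled alias match, and the uids and names here are invented:

```python
# Toy stand-in for semantic resolution: map a loose phrase like "prod"
# to a concrete cluster uid using metadata attached to graph entries.
CLUSTERS = [
    {"uid": "a1b2c3", "name": "prod-us-east-1", "aliases": ["prod", "production"]},
    {"uid": "d4e5f6", "name": "staging-us-east-1", "aliases": ["staging", "stage"]},
]

def resolve_cluster(phrase: str) -> str | None:
    """Return the uid of the cluster that best matches the user's phrase."""
    phrase = phrase.lower()
    for cluster in CLUSTERS:
        if phrase in cluster["aliases"] or phrase in cluster["name"]:
            return cluster["uid"]
    return None

assert resolve_cluster("prod") == "a1b2c3"
```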

Comprehensive and correct query generation
OPAL, Observe’s query language, enables advanced users to write queries capable of answering complex temporal questions. Creating advanced OPAL queries can be challenging, so to address this we’ve enabled agents to generate them…in a rather unusual way. We don’t have agents generate OPAL directly; instead, we’ve introduced a stripped-down JSON schema that represents how a query is structured and how its result can be visualized.
AI agents can easily generate objects that conform to this well-defined JSON schema, and our MCP server validates that the provided JSON matches the schema. To get the semantics right, the template generation step leverages the data provided by the Knowledge Graph. Once validated, we can easily convert the object into OPAL on behalf of the agent. The agent can also provide a similar JSON object to describe the desired visualization or tabular output format.
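To illustrate the flow with a schema and field names we have invented for this post (not Observe’s published format), the agent might emit a compact JSON object that the server validates before translating it into OPAL:

```python
# Hypothetical sketch: validate an agent-generated query description against a
# JSON schema before converting it to OPAL. Field names are invented.
from jsonschema import validate  # pip install jsonschema

QUERY_SCHEMA = {
    "type": "object",
    "required": ["dataset", "time_range", "operations"],
    "properties": {
        "dataset": {"type": "string"},
        "time_range": {"type": "string"},
        "operations": {
            "type": "array",
            "items": {
                "type": "object",
                "required": ["op"],
                "properties": {
                    "op": {"enum": ["filter", "aggregate", "top_k"]},
                    "field": {"type": "string"},
                    "value": {},
                },
            },
        },
        "visualization": {"enum": ["table", "line_chart", "bar_chart"]},
    },
}

agent_query = {
    "dataset": "Kubernetes Pod Logs",
    "time_range": "last 1h",
    "operations": [
        {"op": "filter", "field": "cluster_uid", "value": "a1b2c3"},
        {"op": "aggregate", "field": "error_count"},
    ],
    "visualization": "line_chart",
}

# Raises jsonschema.ValidationError if the agent produced a malformed object;
# once it passes, the server can deterministically translate it into an OPAL pipeline.
validate(instance=agent_query, schema=QUERY_SCHEMA)
```

Because the conversion from the validated object to OPAL happens on the server, the agent never has to learn OPAL’s syntax, only a much smaller JSON vocabulary.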
Beyond text: Images and tables
Early in our testing, we found that returning the raw data from queries was often not enough: the answers to many observability questions are better represented as charts and tables rather than a long list of text. In the spirit of MCP, the agent would ideally pick up the raw query result and plot the desired visualization automatically, but agents aren’t there yet. Since we don’t control the “other” capabilities of the hosting agent or sibling MCP tools, we emit markdown tables and PNG charts directly. Images and tables may not be rendered optimally in many agents today, but they are still much better than a wall of text for displaying results.
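At the protocol level, that roughly means a tool result carrying a text block with a markdown table alongside an image block with base64-encoded PNG data. Here is a hedged sketch using content types from the MCP Python SDK; the table values and the chart file are placeholders:

```python
# Sketch of an MCP tool result that pairs a markdown table with a PNG chart.
# The chart bytes are read from a placeholder file; in practice the server
# renders the image from the query result on the user's behalf.
import base64

from mcp.types import ImageContent, TextContent

markdown_table = (
    "| service  | error rate |\n"
    "|----------|------------|\n"
    "| checkout | 2.4%       |\n"
    "| search   | 0.3%       |\n"
)

with open("error_rate_chart.png", "rb") as f:  # placeholder chart image
    chart_b64 = base64.b64encode(f.read()).decode("ascii")

result_content = [
    TextContent(type="text", text=markdown_table),
    ImageContent(type="image", data=chart_b64, mimeType="image/png"),
]
```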

Hosted by Observe
Unlike many MCP server providers, we decided to host our MCP server in our own SaaS. This approach provides four key benefits:
- Easier to get started - No need to pull and compile a local MCP server before you add it to your agent of choice.
- Controlled execution - We can more finely control the evaluation and refinement of many of the tools we provide by choosing which LLM we think performs best, rather than being at the mercy of the hosting agent. We do offer a variation which splits up the tools to allow the agent to manage the LLM work itself (for customers who do not want their data to hit an external LLM), but these results can be more unpredictable.
- Remote tool configuration - When businesses don’t want certain functionality exposed by our MCP tools, such as calls to 3rd-party LLMs, those tools can be dynamically removed from the server, or added back, as the customer desires (see the sketch after this list).
- Fix and iterate fast - By hosting the MCP server, Observe can automatically get the latest version to our users.
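As a rough illustration of the remote tool configuration point above, per-customer tool filtering might conceptually look like the snippet below. The tool and setting names are invented for this example and are not Observe’s actual configuration surface:

```python
# Conceptual sketch of per-customer tool filtering; names are hypothetical.
ALL_TOOLS = ["search_knowledge_graph", "generate_query", "render_chart", "summarize_with_llm"]

def enabled_tools(settings: dict) -> list[str]:
    """Hide tools the customer has opted out of, e.g. any that call a 3rd-party LLM."""
    tools = list(ALL_TOOLS)
    if not settings.get("allow_external_llm", True):
        tools.remove("summarize_with_llm")
    return tools

print(enabled_tools({"allow_external_llm": False}))  # summarize_with_llm is withheld
```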
Enterprise ready: Access to Observe and private data
Many of our customers are very large enterprises with business data that must remain within their own data lakes. On its own, Observe cannot answer the questions that require access to both the SaaS data in our platform and the private data in theirs. Providing an MCP server with access to the Knowledge Graph built from observations of their infrastructure and applications, combined with their own private information, lets them create agents that leverage both types of data together.
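Conceptually, this can be as simple as an agent merging tool catalogs from the Observe MCP server and an MCP server fronting the customer’s own data lake. Both URLs below are placeholders, and the pattern mirrors the client sketch earlier in this post:

```python
# Sketch: connect to two MCP servers and namespace their tools by origin so the
# agent can route calls to the right one. URLs are placeholders.
import asyncio
from contextlib import AsyncExitStack

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SERVERS = {
    "observe": "https://<your-tenant>.observeinc.com/mcp",     # hypothetical
    "private": "https://mcp.internal.example.com/warehouse",   # hypothetical
}

async def build_toolbox() -> None:
    async with AsyncExitStack() as stack:
        for origin, url in SERVERS.items():
            read, write, _ = await stack.enter_async_context(streamablehttp_client(url))
            session = await stack.enter_async_context(ClientSession(read, write))
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{origin}::{tool.name}")  # one merged, namespaced catalog

asyncio.run(build_toolbox())
```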
Using the Observe MCP server
Although this article has focused on external usage of our MCP server, we dogfood the MCP server within the Observe platform as well. The ability to generate queries and charts based on the unique configuration of a customer’s Knowledge Graph is foundational to all of Observe’s AI features. Instead of building a dashboard, we can create cards and visualizations for users with the same MCP server inside our own interface. Do you have use cases where the Observe MCP server could be valuable? If you’re interested in bringing Observe into your AI workflows, contact your account team to get access.