MCP Server Overview

The ContextKit MCP server exposes your Open Semantic Interchange (OSI)-based semantic layer — models, glossary terms, business rules, golden queries, guardrails, and data product blueprints — to any AI agent that speaks the Model Context Protocol.

Giving an AI agent a database connection string is not enough. Without semantic context, agents will:

  • Guess at column meanings and get them wrong
  • Join tables incorrectly or choose the wrong grain
  • Produce metrics that contradict your business definitions
  • Generate SQL that violates data governance guardrails

ContextKit solves this by publishing a machine-readable semantic layer that agents can query before they write a single line of SQL. The MCP server is the bridge between your curated context and any MCP-compatible agent.

The MCP server supports two transport modes: stdio and Streamable HTTP.

In stdio mode, the server runs as a child process of your AI tool, and communication happens over standard input/output. This is the simplest mode and works out of the box with Claude Code, Cursor, and other MCP clients.

npx @runcontext/cli serve
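Most MCP clients register a stdio server with a small JSON entry naming the command to launch. A minimal sketch of such an entry is below; the exact file location and top-level key vary by client (the `mcpServers` shape shown here is an assumption — check the Configuration page for your tool), and the server name `contextkit` is an arbitrary label:

```json
{
  "mcpServers": {
    "contextkit": {
      "command": "npx",
      "args": ["@runcontext/cli", "serve"]
    }
  }
}
```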

In Streamable HTTP mode, the server runs as a standalone HTTP service, exposing an endpoint at /mcp. This is useful for shared environments, remote access, or when multiple clients need to connect simultaneously.

npx @runcontext/cli serve --http --port 3000
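Whichever transport is used, an MCP session starts with a JSON-RPC 2.0 `initialize` request from the client; over Streamable HTTP this is POSTed to the /mcp endpoint. A minimal sketch of that payload follows — the protocol version and client info values are illustrative, not ContextKit-specific:

```python
import json

# JSON-RPC 2.0 "initialize" request, the first message an MCP client sends.
# The protocolVersion and clientInfo values here are illustrative.
initialize_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

print(json.dumps(initialize_request))
```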

The MCP server provides two complementary interfaces:

  • Resources — read-only URIs that return structured context (the manifest, individual models, glossary terms, tier scorecards, and data product YAML exports). See Resources.
  • Tools — callable functions that let agents search, explain, validate, and query your semantic layer. See Tools.
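Under the hood, an agent invokes a tool with a JSON-RPC `tools/call` request. The sketch below shows the shape of such a request; the tool name `search_glossary` and its arguments are hypothetical — an agent would discover the real tool names via `tools/list` first:

```python
import json

# JSON-RPC "tools/call" request. The tool name "search_glossary" and its
# arguments are hypothetical; discover actual tools with "tools/list".
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "search_glossary",
        "arguments": {"query": "net revenue"},
    },
}

print(json.dumps(call_request))
```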

Agents can also request a blank data product template (context://data-product/template) or export any model as a portable OSI YAML blueprint (context://data-product/{name}).
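Resources are fetched with a JSON-RPC `resources/read` request carrying the resource URI. A minimal sketch, using the template URI from above (the request id is arbitrary):

```python
import json

# JSON-RPC "resources/read" request for the blank data product template.
# The context:// URI is taken from the resource list above.
read_request = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "resources/read",
    "params": {"uri": "context://data-product/template"},
}

print(json.dumps(read_request))
```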

Together, these give an agent everything it needs to understand your data before generating SQL or answering business questions.

  1. Make sure you have a context directory with your metadata files.
  2. Start the server:
     npx @runcontext/cli serve
  3. Configure your AI tool to connect. See Configuration for tool-specific setup.