ContextKit connects to your database to introspect schemas, validate metadata against live data, and verify golden queries. It supports a wide range of databases through optional peer dependencies — install only the driver you need.
- **PostgreSQL**: Also works with Neon, Supabase, Redshift, CockroachDB, AlloyDB, and any Postgres wire-compatible database.
- **DuckDB**: Local `.duckdb` files, in-memory databases, and Parquet/CSV/JSON via DuckDB views. Also works with MotherDuck.
- **MySQL / MariaDB**: MySQL 5.7+, 8.x, and MariaDB 10.x+. Works with PlanetScale, TiDB, and other MySQL-compatible services.
- **SQL Server**: Microsoft SQL Server 2016+. Also works with Azure SQL Database.
- **Snowflake**: Full support for Snowflake accounts with warehouse/database/schema selection.
- **BigQuery**: Google BigQuery with service account authentication and dataset selection.
- **ClickHouse**: ClickHouse HTTP interface. Works with ClickHouse Cloud.
- **Databricks**: Databricks SQL warehouses via server hostname and HTTP path.
- **SQLite**: Local `.sqlite` / `.db` files for lightweight or embedded databases.
The fastest way to connect is the interactive wizard:
```shell
context setup
```

It will auto-detect databases from your environment, MCP configs (Claude Code, Cursor, VS Code, Windsurf), and `contextkit.config.yaml`.
PostgreSQL, MySQL, and SQL Server use standard connection strings:
```yaml
data_sources:
  warehouse:
    adapter: postgres
    connection: postgresql://user:pass@host:5432/dbname
```

Also works for Neon, Supabase, Redshift, and CockroachDB — they all speak the PostgreSQL wire protocol. Install the driver: `npm install pg`
```yaml
data_sources:
  warehouse:
    adapter: mysql
    connection: mysql://user:pass@host:3306/dbname
```

Install the driver: `npm install mysql2`
```yaml
data_sources:
  warehouse:
    adapter: mssql
    connection: mssql://user:pass@host:1433/dbname
```

Install the driver: `npm install mssql`
```yaml
data_sources:
  analytics:
    adapter: duckdb
    path: ./data/warehouse.duckdb
```

DuckDB is included by default — no extra install needed.
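DuckDB can also expose external files as tables through views, which ContextKit then introspects like any other table. A minimal sketch (the view name and file path are illustrative), run inside the `.duckdb` database:

```sql
-- Register a Parquet file as a queryable view;
-- read_parquet is DuckDB's built-in Parquet reader
CREATE VIEW orders AS
SELECT * FROM read_parquet('./data/orders.parquet');
```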
```yaml
data_sources:
  local:
    adapter: sqlite
    path: ./data/app.sqlite
```

Install the driver: `npm install better-sqlite3`
```yaml
data_sources:
  snowflake:
    adapter: snowflake
    account: xy12345.us-east-1
    username: analyst
    password: ${SNOWFLAKE_PASSWORD}
    warehouse: COMPUTE_WH
    database: ANALYTICS
    schema: PUBLIC
```

Install the driver: `npm install snowflake-sdk`
```yaml
data_sources:
  bigquery:
    adapter: bigquery
    project: my-gcp-project
    dataset: analytics
    keyFilename: ./service-account.json
```

Install the driver: `npm install @google-cloud/bigquery`
```yaml
data_sources:
  clickhouse:
    adapter: clickhouse
    host: localhost
    port: 8123
    database: analytics
    username: default
    password: ${CLICKHOUSE_PASSWORD}
```

Install the driver: `npm install @clickhouse/client`
```yaml
data_sources:
  databricks:
    adapter: databricks
    serverHostname: dbc-abc123.cloud.databricks.com
    httpPath: /sql/1.0/warehouses/abcdef
    token: ${DATABRICKS_TOKEN}
```

Install the driver: `npm install @databricks/sql`
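The `${...}` placeholders in the configs above are resolved from environment variables, keeping secrets out of the config file. A minimal sketch (the values here are placeholders, not real credentials):

```shell
# Export the secrets referenced by ${...} placeholders before running ContextKit
export SNOWFLAKE_PASSWORD='example-password'
export CLICKHOUSE_PASSWORD='example-password'
export DATABRICKS_TOKEN='example-token'
```

Most CI systems and secret managers can inject these variables the same way, so the config file itself can be committed safely.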
ContextKit uses optional peer dependencies — install only the driver for your database:
| Database | Driver Package | Install Command |
|---|---|---|
| DuckDB | duckdb | Included by default |
| PostgreSQL | pg | npm install pg |
| MySQL / MariaDB | mysql2 | npm install mysql2 |
| SQL Server | mssql | npm install mssql |
| Snowflake | snowflake-sdk | npm install snowflake-sdk |
| BigQuery | @google-cloud/bigquery | npm install @google-cloud/bigquery |
| ClickHouse | @clickhouse/client | npm install @clickhouse/client |
| Databricks | @databricks/sql | npm install @databricks/sql |
| SQLite | better-sqlite3 | npm install better-sqlite3 |
If a driver is missing, ContextKit will tell you exactly which package to install.
When you run context setup, ContextKit scans your IDE MCP configs to find databases you’ve already connected:
| IDE | Config Location |
|---|---|
| Claude Code | ~/.claude.json, <project>/.mcp.json |
| Cursor | ~/.cursor/mcp.json, <project>/.cursor/mcp.json |
| VS Code / Copilot | <project>/.vscode/mcp.json |
| Windsurf | ~/.codeium/windsurf/mcp_config.json |
| Claude Desktop | ~/Library/Application Support/Claude/claude_desktop_config.json |
If a database MCP server is found, context setup offers it as the first option — no connection string needed.
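For illustration, a project-level `.mcp.json` that `context setup` could pick up might look like the following sketch (the server name, package, and connection string are examples, not requirements):

```json
{
  "mcpServers": {
    "postgres": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-postgres",
        "postgresql://user:pass@host:5432/db"
      ]
    }
  }
}
```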
For non-interactive use, pass a connection string directly:
```shell
# PostgreSQL (also Neon, Supabase, Redshift)
context introspect --db postgres://user:pass@host:5432/db

# DuckDB
context introspect --db ./warehouse.duckdb

# MySQL
context introspect --db mysql://user:pass@host:3306/db

# SQL Server
context introspect --db mssql://user:pass@host:1433/db

# Snowflake (via config)
context introspect --source snowflake

# BigQuery (via config)
context introspect --source bigquery
```

Cloud warehouses (Snowflake, BigQuery, ClickHouse, Databricks) require multiple credentials and should be configured in `contextkit.config.yaml` rather than passed as a single URL.
Configure multiple databases if your models span different systems:
```yaml
data_sources:
  events:
    adapter: clickhouse
    host: click.example.com
    database: events
  users:
    adapter: postgres
    connection: ${USERS_DB_URL}
  analytics:
    adapter: duckdb
    path: ./data/analytics.duckdb
```

Each model's `source` field determines which data source is used for validation and introspection.
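As an illustration of the `source` field, a model pinned to the `events` source might look like this sketch — the model file layout and field names other than `source` are hypothetical:

```yaml
# Hypothetical model definition; `source` selects a data source by name
models:
  - name: daily_events
    source: events
    sql: SELECT toDate(ts) AS day, count() AS n FROM events GROUP BY day
```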