Model Context Protocol

Hyperterse implements the Model Context Protocol (MCP) to expose your database queries as tools for AI assistants like Claude, ChatGPT, and custom agents.

MCP is a standard protocol for connecting AI assistants to external data sources and tools. With Hyperterse, your database queries become discoverable, callable tools that AI systems can use autonomously.

Benefits:

  • No exposed SQL — AI assistants call tools, not raw queries
  • Type-safe — Input validation before execution
  • Discoverable — Assistants can list and understand available tools
  • Secure — Connection strings and schemas stay hidden

Hyperterse exposes a single MCP endpoint at POST /mcp that implements JSON-RPC 2.0:

┌──────────────┐          ┌────────────┐       ┌──────────┐
│ AI Assistant │─────────▶│ Hyperterse │──────▶│ Database │
│   (Claude)   │ JSON-RPC │ POST /mcp  │  SQL  │          │
└──────────────┘          └────────────┘       └──────────┘

Each query you define becomes an MCP tool that AI assistants can discover and call.

List all available tools (queries):

curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/list",
    "id": 1
  }'

Example response:

{
  "jsonrpc": "2.0",
  "result": {
    "tools": [
      {
        "name": "get-user-by-id",
        "description": "Retrieve user by ID",
        "inputSchema": {
          "type": "object",
          "properties": {
            "userId": {
              "type": "integer",
              "description": "User ID"
            }
          },
          "required": ["userId"]
        }
      },
      {
        "name": "list-users",
        "description": "List all users with pagination",
        "inputSchema": {
          "type": "object",
          "properties": {
            "limit": {
              "type": "integer",
              "description": "Maximum results"
            },
            "offset": {
              "type": "integer",
              "description": "Results to skip"
            }
          },
          "required": []
        }
      }
    ]
  },
  "id": 1
}
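As a sketch of how a client might consume a tools/list response, the tool names and their required inputs can be pulled out of the result. The response dict below mirrors the example above (trimmed for brevity); the keys follow the MCP tool schema:

```python
# Summarize a tools/list response: map each tool name to its required inputs.

def summarize_tools(response):
    """Map each tool name to its list of required input names."""
    tools = response["result"]["tools"]
    return {t["name"]: t["inputSchema"].get("required", []) for t in tools}

response = {
    "jsonrpc": "2.0",
    "result": {
        "tools": [
            {
                "name": "get-user-by-id",
                "description": "Retrieve user by ID",
                "inputSchema": {
                    "type": "object",
                    "properties": {"userId": {"type": "integer"}},
                    "required": ["userId"],
                },
            },
            {
                "name": "list-users",
                "description": "List all users with pagination",
                "inputSchema": {
                    "type": "object",
                    "properties": {},
                    "required": [],
                },
            },
        ]
    },
    "id": 1,
}

print(summarize_tools(response))
# {'get-user-by-id': ['userId'], 'list-users': []}
```

An assistant-side client would typically do exactly this once at startup to build its tool registry.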

Execute a tool (query):

curl -X POST http://localhost:8080/mcp \
  -H "Content-Type: application/json" \
  -d '{
    "jsonrpc": "2.0",
    "method": "tools/call",
    "params": {
      "name": "get-user-by-id",
      "arguments": {
        "userId": 123
      }
    },
    "id": 1
  }'

Example response:

{
  "jsonrpc": "2.0",
  "result": {
    "content": [
      {
        "type": "text",
        "text": "[{\"id\":123,\"name\":\"Alice\",\"email\":\"alice@example.com\"}]"
      }
    ],
    "isError": false
  },
  "id": 1
}
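Note that the text field carries the query rows as a JSON-encoded string, so a client decodes twice: once for the JSON-RPC envelope and once for the rows. A minimal sketch, using the response shape from the example above:

```python
import json

# The "text" field of each content item holds the query rows as a JSON
# string, so decoding happens in two steps: envelope, then rows.
response = {
    "jsonrpc": "2.0",
    "result": {
        "content": [
            {
                "type": "text",
                "text": "[{\"id\":123,\"name\":\"Alice\",\"email\":\"alice@example.com\"}]",
            }
        ],
        "isError": False,
    },
    "id": 1,
}

rows = json.loads(response["result"]["content"][0]["text"])
print(rows[0]["name"])  # Alice
```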
Start the Hyperterse server:

hyperterse run -f config.terse

Add to Claude’s MCP configuration:

{
  "mcpServers": {
    "hyperterse": {
      "url": "http://localhost:8080/mcp"
    }
  }
}

Claude can now discover and use your queries:

“Find the user with email alice@example.com”

Claude will automatically call the appropriate query tool.

Use any MCP-compatible client:

import requests

def call_mcp_tool(tool_name, arguments):
    response = requests.post(
        "http://localhost:8080/mcp",
        json={
            "jsonrpc": "2.0",
            "method": "tools/call",
            "params": {
                "name": tool_name,
                "arguments": arguments,
            },
            "id": 1,
        },
    )
    return response.json()

# Example usage
result = call_mcp_tool("get-user-by-id", {"userId": 123})
print(result["result"]["content"])

Use clear, action-oriented names:

# Good
queries:
  get-user-by-id:
  list-active-users:
  search-products-by-category:

# Less clear
queries:
  user:
  users:
  products:

Write descriptions that help AI understand when to use each tool:

queries:
  get-user-by-email:
    description: "Find a user by their email address. Use this when you know the user's email but not their ID."
    # ...

  search-users:
    description: "Search users by name. Supports partial matches. Use this for finding users when you don't know their exact email."
    # ...

Add sensible limits to prevent overwhelming responses:

queries:
  list-orders:
    description: 'List recent orders. Returns up to 50 results.'
    statement: |
      SELECT * FROM orders
      ORDER BY created_at DESC
      LIMIT 50

Tool errors are returned inside a standard JSON-RPC 2.0 response, with isError set to true in the result:

{
  "jsonrpc": "2.0",
  "result": {
    "content": [
      {
        "type": "text",
        "text": "validation error: required input 'userId' is missing"
      }
    ],
    "isError": true
  },
  "id": 1
}

AI assistants can interpret these errors and retry with correct inputs.
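A non-AI client can do the same by checking isError before trusting the content. A minimal sketch, assuming only the response shape shown above (ToolError and unwrap_result are illustrative names, not an official client API):

```python
# Client-side error handling sketch: treat isError=true results as
# failures and surface the embedded message to the caller.

class ToolError(Exception):
    """Raised when a tools/call result reports isError=true."""

def unwrap_result(response):
    """Return the result text, or raise ToolError on a failed call."""
    result = response["result"]
    text = result["content"][0]["text"]
    if result.get("isError"):
        raise ToolError(text)
    return text

error_response = {
    "jsonrpc": "2.0",
    "result": {
        "content": [
            {
                "type": "text",
                "text": "validation error: required input 'userId' is missing",
            }
        ],
        "isError": True,
    },
    "id": 1,
}

try:
    unwrap_result(error_response)
except ToolError as e:
    print(e)  # validation error: required input 'userId' is missing
```

The caller can then correct its arguments and retry, which is exactly what an AI assistant does with the same message.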

For AI systems that don’t use MCP, Hyperterse also generates LLM-friendly documentation:

# Access at runtime
curl http://localhost:8080/llms.txt
# Or generate a file
hyperterse generate llms -f config.terse -o llms.txt

The generated documentation includes:

  • Query descriptions
  • Input schemas
  • Example usage
  • Base URL for API calls