
@rgarcia
Created March 2, 2026 22:51
Save rgarcia/7921109f04dc41b9e816e7e2cd8c8081 to your computer and use it in GitHub Desktop.
Cursor + GPT models: Provider Error caused by overly-complex MCP tool JSON schemas (OpenAI nesting depth limit)


Cursor "Provider Error" with GPT models: MCP tool schema nesting depth

Summary

When using Cursor with MCP (Model Context Protocol) servers, GPT models (gpt-5.3-codex, gpt-5.2-codex, etc.) return a generic "Provider Error" that gives no actionable information:

Provider Error: We're having trouble connecting to the model provider. This might be temporary - please try again in a moment.

Root cause: OpenAI's API rejects requests when any tool's JSON schema exceeds its nesting depth limit (approximately 5 levels). MCP servers that auto-generate schemas from deeply recursive data structures can produce schemas with 20+ levels of nesting, which silently causes this error for all requests — even simple ones like "say hello" — as long as those tools are loaded.


Minimum Viable Reproduction

What you need

  • Cursor CLI (cursor-agent)
  • A GPT model (e.g. gpt-5.3-codex)
  • Any MCP server that serves a tool with a deeply nested JSON schema

Reproducing in Docker

# 1. Pull a Debian image and install Node/Python
docker run -it --name cursor-repro debian:bookworm bash
apt-get update && apt-get install -y python3 nodejs curl

# 2. Copy your Cursor auth
docker cp ~/.config/cursor/auth.json cursor-repro:/root/.config/cursor/auth.json
docker cp ~/.cursor/mcp.json cursor-repro:/root/.cursor/mcp.json
# Copy the project's .cursor dir (contains mcp-approvals.json and mcp-cache.json)
docker cp ~/.cursor/projects/<your-project>/ cursor-repro:/root/.cursor/projects/<your-project>/

# 3. Install cursor-agent binary (get path from: which cursor-agent or find ~/.local -name cursor-agent)
# Copy the binary and its node runtime to the same paths inside Docker

# 4. Create a mock MCP server with a deeply nested schema
cat > /tmp/deep-schema-mcp.py << 'PYEOF'
#!/usr/bin/env python3
import sys, json

# Build a schema with 20 levels of nesting
def make_nested(depth):
    if depth == 0:
        return {"type": "string"}
    return {"type": "object", "properties": {"child": make_nested(depth - 1)}}

TOOLS = [{
    "name": "complex_tool",
    "description": "A tool with a deeply nested schema",
    "inputSchema": {
        "type": "object",
        "properties": {"data": make_nested(20)},
    }
}]

for line in sys.stdin:
    line = line.strip()
    if not line:  # skip blank lines so json.loads doesn't raise
        continue
    req = json.loads(line)
    method, req_id = req.get("method"), req.get("id")
    if method == "initialize":
        resp = {"jsonrpc":"2.0","id":req_id,"result":{"protocolVersion":"2024-11-05","capabilities":{"tools":{}},"serverInfo":{"name":"deep-schema","version":"1.0.0"}}}
    elif method == "notifications/initialized":
        continue
    elif method == "tools/list":
        resp = {"jsonrpc":"2.0","id":req_id,"result":{"tools":TOOLS}}
    else:
        resp = {"jsonrpc":"2.0","id":req_id,"result":{}}
    print(json.dumps(resp), flush=True)
PYEOF
chmod +x /tmp/deep-schema-mcp.py

# 5. Configure the MCP server in ~/.cursor/mcp.json
cat > /root/.cursor/mcp.json << 'MCPEOF'
{
  "mcpServers": {
    "deep-schema": {
      "type": "stdio",
      "command": "python3",
      "args": ["/tmp/deep-schema-mcp.py"]
    }
  }
}
MCPEOF

# 6. Compute the approval fingerprint so it's pre-approved (no interactive prompt)
node -e "
const crypto = require('crypto');
const cwd = '/your/project/path';
// NOTE: Zod schema strips the 'type' field before fingerprinting!
const config = {command: 'python3', args: ['/tmp/deep-schema-mcp.py']};
const s = {path: cwd, server: config};
const h = crypto.createHash('sha256').update(JSON.stringify(s)).digest('hex').substring(0,16);
console.log(JSON.stringify(['deep-schema-' + h]));
"
# Put the output into the project's mcp-approvals.json

# 7. Run with GPT — observe the Provider Error
cursor-agent --api-key <your-key> --model gpt-5.3-codex -p "say hello"
# Output: "Provider Error: We're having trouble connecting..."

# 8. Fix: reduce schema nesting depth to <= 5 levels
# Update the mock to use a flat schema and re-run — it works.

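To see why step 8 works, the deep and flat variants can be compared with the same raw-JSON depth metric the diagnosis script below uses (a quick sketch; `make_nested` matches the mock server above, and the raw-JSON count is higher than the "schema level" count because each level adds both an object and a `properties` wrapper):

```python
def make_nested(depth):
    # Same recursive builder as the mock server above.
    if depth == 0:
        return {"type": "string"}
    return {"type": "object", "properties": {"child": make_nested(depth - 1)}}

def max_depth(obj, d=0):
    # Raw JSON nesting depth, as in the diagnosis script below.
    if isinstance(obj, dict):
        return max([max_depth(v, d + 1) for v in obj.values()], default=d)
    if isinstance(obj, list):
        return max([max_depth(v, d + 1) for v in obj], default=d)
    return d

deep_schema = {"type": "object", "properties": {"data": make_nested(20)}}
flat_schema = {"type": "object", "properties": {"data": {"type": "string"}}}

print(max_depth(deep_schema))  # 43 -- raw JSON depth of 20 schema levels
print(max_depth(flat_schema))  # 3
```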
Key Technical Details

How MCP approval fingerprints work

Cursor computes an approval fingerprint for each MCP server:

// From cursor-agent index.js (minified)
function l(serverName, serverConfig, cwd) {
  const s = { path: cwd, server: serverConfig };
  return `${serverName}-${crypto.createHash("sha256")
    .update(JSON.stringify(s))
    .digest("hex")
    .substring(0, 16)}`;
}

Critical: The serverConfig object is parsed through a Zod schema that strips the type field ("type": "stdio" / "type": "http"). So when computing fingerprints manually, omit type:

// WRONG - includes type field, fingerprint won't match
const config = { type: "stdio", command: "python3", args: [...] };

// CORRECT - type is stripped by Zod before fingerprinting
const config = { command: "python3", args: [...] };

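For scripting outside Node, the same fingerprint can be reproduced in Python (a sketch; it relies on `JSON.stringify` emitting insertion-ordered keys with no whitespace, which `json.dumps` with compact separators matches byte-for-byte):

```python
import hashlib
import json

def approval_fingerprint(server_name, server_config, cwd):
    # Mirror JSON.stringify: insertion-ordered keys, no whitespace.
    payload = json.dumps({"path": cwd, "server": server_config},
                         separators=(",", ":"))
    digest = hashlib.sha256(payload.encode("utf-8")).hexdigest()[:16]
    return f"{server_name}-{digest}"

# Omit "type" -- Cursor's Zod schema strips it before fingerprinting.
config = {"command": "python3", "args": ["/tmp/deep-schema-mcp.py"]}
print(approval_fingerprint("deep-schema", config, "/your/project/path"))
```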
Why the error gives no information

Cursor catches the OpenAI API error and maps it to the generic "Provider Error" message regardless of the underlying cause. The actual error from OpenAI is something like "Invalid schema: maximum nesting depth exceeded."

Affected models

All OpenAI-backed models fail (GPT-5.3-codex, GPT-5.2-codex, GPT-4o, etc.). Claude and Gemini models appear more tolerant of deep schemas and may not fail.

How to diagnose

  1. Check if the error happens even for trivial prompts ("say hello")
  2. Try with zero approved MCPs ([] in mcp-approvals.json) — if it works, MCPs are the cause
  3. Binary search: approve MCPs one at a time to find the culprit
  4. For each suspicious MCP, check max schema nesting depth:
import json
import os

def max_depth(obj, d=0):
    if isinstance(obj, dict):
        return max([max_depth(v, d+1) for v in obj.values()], default=d)
    if isinstance(obj, list):
        return max([max_depth(v, d+1) for v in obj], default=d)
    return d

# open() does not expand "~", so expand the path explicitly
cache = json.load(open(os.path.expanduser('~/.cursor/projects/<project>/mcp-cache.json')))
for server, data in cache.items():
    for tool in data.get('tools', []):
        depth = max_depth(tool.get('inputSchema', {}))
        if depth > 5:
            print(f"{server}/{tool['name']}: depth={depth}, size={len(json.dumps(tool['inputSchema']))} bytes")

Fix

Flatten the problematic tool schemas to <= 5 nesting levels. For schemas that represent complex nested objects, use type: object without deep property nesting — the LLM can be guided by documentation/resources rather than schema structure.
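One mechanical way to apply this flattening (a sketch; `truncate_schema` is a hypothetical helper, not part of any SDK) is to replace object subtrees past a cutoff depth with a generic object plus a description:

```python
def truncate_schema(schema, max_levels=2):
    """Replace object-property subtrees deeper than `max_levels` with a
    generic object, keeping any description for the LLM to work from."""
    if not isinstance(schema, dict):
        return schema
    if schema.get("type") == "object" and "properties" in schema:
        if max_levels <= 0:
            return {"type": "object",
                    "description": schema.get("description",
                                              "Nested object; see tool docs")}
        return {**schema,
                "properties": {k: truncate_schema(v, max_levels - 1)
                               for k, v in schema["properties"].items()}}
    if isinstance(schema.get("items"), dict):
        # Descend into array item schemas too (simplified depth budget).
        return {**schema, "items": truncate_schema(schema["items"], max_levels)}
    return schema

def max_depth(obj, d=0):
    if isinstance(obj, dict):
        return max([max_depth(v, d + 1) for v in obj.values()], default=d)
    if isinstance(obj, list):
        return max([max_depth(v, d + 1) for v in obj], default=d)
    return d

# Build the same 20-level schema as the mock server, then flatten it.
deep = {"type": "string"}
for _ in range(20):
    deep = {"type": "object", "properties": {"child": deep}}

print(max_depth(deep))                   # 41
print(max_depth(truncate_schema(deep)))  # 5
```

The truncated subtree keeps `type: object`, so the tool still accepts arbitrary nested payloads; only the schema the provider sees gets shallower.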

Real-world example: SigNoz signoz-mcp-server PR #60: signoz_create_dashboard had 20 levels of nesting (20,461 bytes), fixed to 3 levels (780 bytes).


Feature request for Cursor

It would help if Cursor:

  1. Detected schema nesting depth issues before sending to the provider and warned the user
  2. Surfaced the underlying API error message instead of the generic "Provider Error"
  3. Indicated which MCP server/tool is causing the rejection