Skill v3.0 — Decision tree, elicitation, push notifications, MCPB, Docker, GitHub Packages/GHCR, OSS docs, DO/Cloudflare deployment
name: mcp-server-scaffold
description: |
  Scaffold and build a production-ready MCP (Model Context Protocol) server for any REST API.
  Use when: (1) building a new MCP server from scratch, (2) wrapping an existing REST API as
  MCP tools, (3) creating a Claude Desktop or hosted MCP integration, (4) user says "create an
  MCP server", "build MCP tools for [API]", "wrap [service] as MCP". Covers: project structure,
  decision-tree navigation (mandatory), dual transport (stdio + HTTP streaming), elicitation,
  push notifications, domain handler pattern, client/credential management, structured logging,
  error handling, testing with vitest, Docker setup, MCPB bundling, CI/CD with semantic-release,
  GitHub Packages (npm) publishing for client libraries, GitHub Container Registry (GHCR)
  publishing for Docker images, open source documentation (README, CONTRIBUTING, LICENSE,
  CODE_OF_CONDUCT), and deployment on DigitalOcean App Platform or Cloudflare Workers.
author: Claude Code
version: 3.0.0
date: 2026-02-26

Build a production-ready MCP server that wraps any REST API as Claude-compatible tools.
Creating an MCP server from scratch requires understanding multiple concerns: the MCP protocol, transport layers (stdio + HTTP streaming), tool registration, decision-tree navigation, elicitation (requesting user input mid-tool-call), push notifications, credential management, error handling, logging (must use stderr, not stdout), testing, Docker deployment, MCPB bundling, package publishing (GitHub Packages for npm, GHCR for Docker images), open source project hygiene (README, CONTRIBUTING, LICENSE, CODE_OF_CONDUCT), semantic versioning, and deployment to cloud platforms. This skill provides a battle-tested architecture and step-by-step guide.
The architecture has two layers:
- API Client Library (`node-{service}`) — Standalone npm package that wraps the REST API
- MCP Server (`{service}-mcp`) — Thin layer that exposes the client library as MCP tools
This separation means the client library can be used independently (scripts, other apps), and the MCP server stays focused on tool definitions and protocol handling.
┌───────────────────────────────────────────────────────┐
│ MCP Server ({service}-mcp) │
│ ├── Transport (stdio or HTTP streaming) │
│ ├── Decision Tree (navigate → domain → tools) │
│ ├── Tool Registry (navigation + domain tools) │
│ ├── Domain Handlers (devices, tickets, etc.) │
│ ├── Elicitation (form + URL modes) │
│ ├── Push Notifications (resource/tool changes, logs) │
│ ├── Client Manager (lazy-loaded, cached) │
│ └── Logger (structured, stderr-only) │
├───────────────────────────────────────────────────────┤
│ API Client Library (node-{service}) │
│ ├── HttpClient (auth, retry, rate limit) │
│ ├── Resources (CRUD per entity) │
│ ├── Auth Manager (OAuth/API key) │
│ └── Types (full TypeScript definitions) │
└───────────────────────────────────────────────────────┘
node-{service}/
├── src/
│ ├── index.ts # Public exports
│ ├── client.ts # Main client class
│ ├── config.ts # Configuration & defaults
│ ├── auth.ts # Token management
│ ├── http.ts # HTTP layer with retry
│ ├── rate-limiter.ts # Rate limiting
│ ├── errors.ts # Error class hierarchy
│ ├── resources/ # One file per API resource
│ │ ├── organizations.ts
│ │ ├── devices.ts
│ │ └── tickets.ts
│ └── types/ # TypeScript definitions
│ ├── index.ts
│ ├── common.ts
│ ├── organizations.ts
│ ├── devices.ts
│ └── tickets.ts
├── tests/
│ ├── setup.ts # MSW setup
│ ├── mocks/
│ │ ├── server.ts # MSW server
│ │ └── handlers.ts # Request handlers
│ ├── fixtures/ # Test data
│ ├── unit/ # Unit tests
│ └── integration/ # Integration tests
├── .github/workflows/
│ └── release.yml # Test → Release → Publish to GitHub Packages
├── .npmrc # GitHub Packages registry config
├── tsconfig.json
├── tsup.config.ts # Dual ESM/CJS build
├── vitest.config.ts
├── .releaserc.json # semantic-release → GitHub Packages npm
├── README.md # Project overview, install, usage, API
├── CONTRIBUTING.md # How to contribute
├── LICENSE # MIT license
├── CODE_OF_CONDUCT.md # Contributor Covenant
├── CHANGELOG.md # Auto-generated by semantic-release
└── package.json
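The retry behavior referenced by `http.ts` and `rate-limiter.ts` reduces to a small delay policy. A minimal sketch — the function name `computeRetryDelayMs` is illustrative, not part of any published API:

```typescript
// Retry-delay policy: exponential backoff with a cap, overridden by the
// server's Retry-After header (seconds) when one is present.
const BASE_DELAY_MS = 1000;
const MAX_DELAY_MS = 30_000;

function computeRetryDelayMs(attempt: number, retryAfterSeconds?: number): number {
  // An explicit Retry-After (e.g. from a 429 response) wins over the curve
  if (retryAfterSeconds !== undefined) return retryAfterSeconds * 1000;
  // 1s, 2s, 4s, 8s, ... capped at 30s
  return Math.min(BASE_DELAY_MS * 2 ** attempt, MAX_DELAY_MS);
}
```

The cap keeps a long outage from producing multi-minute sleeps, while honoring Retry-After keeps the client polite toward rate-limited APIs.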
// src/errors.ts
export class ServiceError extends Error {
constructor(message: string, public statusCode: number, public response: unknown) {
super(message);
Object.setPrototypeOf(this, new.target.prototype); // Fix instanceof
}
}
export class AuthenticationError extends ServiceError {}
export class ForbiddenError extends ServiceError {}
export class NotFoundError extends ServiceError {}
export class ValidationError extends ServiceError {
constructor(message: string, public errors: Array<{field: string; message: string}>, response: unknown) {
super(message, 400, response);
}
}
export class RateLimitError extends ServiceError {
constructor(message: string, public retryAfter: number, response: unknown) {
super(message, 429, response);
}
}
export class ServerError extends ServiceError {}

// src/http.ts — CRITICAL: read the body as text first, then parse.
// Calling response.json() and then response.text() in a catch throws "Body already read".
private async handleResponse<T>(response: Response, ...): Promise<T> {
if (response.ok) {
if (response.status === 204) return {} as T;
const contentType = response.headers.get('content-type');
if (contentType?.includes('application/json')) return response.json() as Promise<T>;
return {} as T;
}
// SAFE: Read text once, then try JSON.parse
let responseBody: unknown;
const rawText = await response.text();
try { responseBody = JSON.parse(rawText); }
catch { responseBody = rawText; }
switch (response.status) {
  case 401: /* refresh token, retry once */ break;
  case 429: /* exponential backoff retry */ break;
  case 404: throw new NotFoundError('Resource not found', 404, responseBody);
  // ... etc
}
}

// src/resources/organizations.ts
export class OrganizationsResource {
constructor(private readonly httpClient: HttpClient) {}
async list(params?: OrgListParams): Promise<Organization[]> {
return this.httpClient.request<Organization[]>('/api/v2/organizations', {
params: { pageSize: params?.pageSize, cursor: params?.cursor },
});
}
async get(id: number): Promise<Organization> {
return this.httpClient.request<Organization>(`/api/v2/organization/${id}`);
}
async create(data: OrgCreateData): Promise<Organization> {
return this.httpClient.request<Organization>('/api/v2/organizations', {
method: 'POST', body: data,
});
}
}

// tests/setup.ts
import { beforeAll, afterAll, afterEach } from 'vitest';
import { server } from './mocks/server.js';
beforeAll(() => server.listen({ onUnhandledRequest: 'error' }));
afterEach(() => server.resetHandlers());
afterAll(() => server.close());
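The setup file only takes effect if vitest is told to load it. A minimal `vitest.config.ts` sketch wiring it in — paths are assumed to match the tree above:

```typescript
// vitest.config.ts — registers tests/setup.ts so MSW starts before any test
import { defineConfig } from 'vitest/config';

export default defineConfig({
  test: {
    environment: 'node',
    setupFiles: ['./tests/setup.ts'],
  },
});
```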
// tests/mocks/handlers.ts
import { http, HttpResponse } from 'msw';
import * as fixtures from '../fixtures/index.js';
export const handlers = [
http.post(`${BASE_URL}/oauth/token`, async ({ request }) => {
const body = await request.text();
if (body.includes('bad-client-id')) {
return HttpResponse.json({ error: 'invalid_client' }, { status: 400 });
}
return HttpResponse.json(fixtures.auth.tokenSuccess);
}),
http.get(`${BASE_URL}/api/v2/organizations`, () =>
HttpResponse.json(fixtures.organizations.list)),
// ... one handler per endpoint
];

{
"name": "@org/node-{service}",
"type": "module",
"main": "./dist/index.cjs",
"module": "./dist/index.js",
"types": "./dist/index.d.ts",
"exports": {
".": {
"import": { "types": "./dist/index.d.ts", "default": "./dist/index.js" },
"require": { "types": "./dist/index.d.cts", "default": "./dist/index.cjs" }
}
},
"dependencies": {},
"engines": { "node": ">=18.0.0" }
}

Zero production dependencies — uses native fetch (Node 18+).
The client library publishes to GitHub Packages npm registry so the MCP server (and any
other consumers) can install it as @org/node-{service}.
@org:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${NODE_AUTH_TOKEN}
{
"branches": ["main"],
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/changelog",
["@semantic-release/npm", {
"npmPublish": true,
"pkgRoot": "."
}],
["@semantic-release/git", {
"assets": ["CHANGELOG.md", "package.json", "package-lock.json"],
"message": "chore(release): ${nextRelease.version} [skip ci]"
}],
"@semantic-release/github"
]
}

Key difference from the MCP server config: `"npmPublish": true` — the library gets published to GitHub Packages. The MCP server uses `"npmPublish": false` because it is distributed via Docker/MCPB, not npm.
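For contrast, a sketch of the MCP server's `.releaserc.json`: the same plugin chain, but with npm publishing disabled since the release artifacts are a Docker image and an MCPB bundle:

```json
{
  "branches": ["main"],
  "plugins": [
    "@semantic-release/commit-analyzer",
    "@semantic-release/release-notes-generator",
    "@semantic-release/changelog",
    ["@semantic-release/npm", { "npmPublish": false }],
    ["@semantic-release/git", {
      "assets": ["CHANGELOG.md", "package.json", "package-lock.json"],
      "message": "chore(release): ${nextRelease.version} [skip ci]"
    }],
    "@semantic-release/github"
  ]
}
```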
name: Release
on:
  push: { branches: [main] }
  pull_request: { branches: [main] }
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix: { node-version: [20, 22] }
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: '${{ matrix.node-version }}'
          cache: npm
      - run: npm ci && npm run build && npm test
  release:
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions:
      contents: write
      issues: write
      packages: write
    steps:
      - uses: actions/checkout@v4
        with: { fetch-depth: 0 }
      - uses: actions/setup-node@v4
        with:
          node-version: 22
          registry-url: 'https://npm.pkg.github.com'
          scope: '@org'
      - run: npm ci && npm run build
      - run: npx semantic-release
        env:
          GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
          NODE_AUTH_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
          NPM_TOKEN: '${{ secrets.GITHUB_TOKEN }}'

CRITICAL: The registry-url and NODE_AUTH_TOKEN env vars tell npm publish to push to GitHub Packages. GITHUB_TOKEN has packages: write permission, which is sufficient — no separate npm token needed.
{
"name": "@org/node-{service}",
"publishConfig": {
"registry": "https://npm.pkg.github.com"
},
"repository": {
"type": "git",
"url": "https://github.com/org/node-{service}.git"
}
}

The publishConfig.registry field is REQUIRED — without it, npm publish defaults to the public npm registry. The repository field MUST match the GitHub org/user that owns the package scope.
{service}-mcp/
├── src/
│ ├── index.ts # Server entry (stdio transport)
│ ├── server.ts # Server setup, tool routing, capabilities
│ ├── worker.ts # Cloudflare Worker entry point
│ ├── http.ts # HTTP streaming transport (Node.js)
│ ├── domains/
│ │ ├── index.ts # Domain registry with lazy loading
│ │ ├── navigation.ts # Decision-tree navigation tools
│ │ ├── devices.ts # Device domain tools
│ │ ├── organizations.ts # Organization domain tools
│ │ ├── alerts.ts # Alert domain tools
│ │ └── tickets.ts # Ticket domain tools
│ ├── elicitation/
│ │ ├── forms.ts # Form elicitation helpers
│ │ └── url.ts # URL elicitation + completion tracking
│ ├── utils/
│ │ ├── client.ts # Lazy client singleton
│ │ ├── logger.ts # Structured stderr logger
│ │ └── types.ts # DomainHandler interface
│ └── __tests__/
│ ├── client.test.ts
│ ├── navigation.test.ts
│ ├── http-transport.test.ts
│ └── domains/
│ ├── devices.test.ts
│ └── organizations.test.ts
├── Dockerfile # Multi-stage build → GHCR
├── docker-compose.yml
├── manifest.json # MCPB manifest for Claude Desktop
├── wrangler.json # Cloudflare Worker config
├── .do/
│ └── app.yaml # DigitalOcean App Platform spec
├── .github/workflows/
│ └── release.yml # Test → Release → GHCR Docker → MCPB
├── .releaserc.json
├── tsup.config.ts
├── vitest.config.ts
├── README.md # Project overview, install, usage
├── CONTRIBUTING.md # How to contribute
├── LICENSE # MIT license
├── CODE_OF_CONDUCT.md # Contributor Covenant
├── CHANGELOG.md # Auto-generated by semantic-release
└── package.json
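Both projects list a `tsup.config.ts` for the dual ESM/CJS build. A sketch of one that matches the `exports` map shown earlier — entry points and targets may differ per project:

```typescript
// tsup.config.ts — dual ESM/CJS build with type declarations
import { defineConfig } from 'tsup';

export default defineConfig({
  entry: ['src/index.ts'],
  format: ['esm', 'cjs'], // emits dist/index.js and dist/index.cjs
  dts: true,              // emits .d.ts / .d.cts alongside each format
  sourcemap: true,
  clean: true,
  target: 'node18',
});
```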
// src/utils/types.ts
import type { Tool } from '@modelcontextprotocol/sdk/types.js';
import type { RequestHandlerExtra } from '@modelcontextprotocol/sdk/shared/protocol.js';
export type DomainName = 'devices' | 'organizations' | 'alerts' | 'tickets';
export type CallToolResult = {
content: Array<{ type: 'text'; text: string }>;
isError?: boolean;
};
export interface DomainHandler {
getTools(): Tool[];
handleCall(
toolName: string,
args: Record<string, unknown>,
extra?: RequestHandlerExtra
): Promise<CallToolResult>;
}
// Navigation state tracks which domain the user is exploring
export type NavigationState = {
currentDomain: DomainName | null;
};

Every MCP server MUST use decision-tree navigation. This is not optional. The pattern reduces cognitive load when a server exposes many tools by presenting a navigate → domain → tools flow: the LLM first sees only navigation tools, picks a domain, then sees only that domain's tools.
- Initial state: Server exposes only navigation tools (`{service}_navigate`, `{service}_status`)
- Navigate: LLM calls `{service}_navigate` with a domain name
- Domain tools: Server now exposes that domain's tools + a `{service}_back` tool
- Back: `{service}_back` returns to the navigation state
// src/domains/navigation.ts
import type { Tool } from '@modelcontextprotocol/sdk/types.js';
import type { DomainHandler, DomainName, NavigationState } from '../utils/types.js';
// Module-level navigation state (per-process for stdio, per-session for HTTP)
const sessionStates = new Map<string, NavigationState>();
export function getState(sessionId: string = 'default'): NavigationState {
if (!sessionStates.has(sessionId)) {
sessionStates.set(sessionId, { currentDomain: null });
}
return sessionStates.get(sessionId)!;
}
export function getNavigationTools(): Tool[] {
return [
{
name: '{service}_navigate',
description: 'Navigate to a domain to see its available tools. Domains: devices, organizations, alerts, tickets.',
inputSchema: {
type: 'object',
properties: {
domain: {
type: 'string',
enum: ['devices', 'organizations', 'alerts', 'tickets'],
description: 'The domain to navigate to',
},
},
required: ['domain'],
},
},
{
name: '{service}_status',
description: 'Check API connection status and available domains.',
inputSchema: { type: 'object', properties: {} },
},
];
}
export function getBackTool(): Tool {
return {
name: '{service}_back',
description: 'Return to the domain navigation menu.',
inputSchema: { type: 'object', properties: {} },
};
};

// src/server.ts
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
import { ListToolsRequestSchema, CallToolRequestSchema } from '@modelcontextprotocol/sdk/types.js';
import { getState, getNavigationTools, getBackTool } from './domains/navigation.js';
import { getDomainHandler } from './domains/index.js';
import type { DomainName } from './utils/types.js';
import { logger } from './utils/logger.js';
export function createServer(): Server {
const server = new Server(
{ name: '{service}-mcp', version: '1.0.0' },
{
capabilities: {
tools: {},
logging: {},
},
}
);
// Dynamic tool list based on navigation state
server.setRequestHandler(ListToolsRequestSchema, async (request, extra) => {
const sessionId = extra.sessionId || 'default';
const state = getState(sessionId);
if (!state.currentDomain) {
// Show navigation tools only
return { tools: getNavigationTools() };
}
// Show domain tools + back button
const handler = await getDomainHandler(state.currentDomain);
return { tools: [...handler.getTools(), getBackTool()] };
});
// Route tool calls through the decision tree
server.setRequestHandler(CallToolRequestSchema, async (request, extra) => {
const { name, arguments: args } = request.params;
const sessionId = extra.sessionId || 'default';
const state = getState(sessionId);
// Navigation tools
if (name === '{service}_navigate') {
const domain = args?.domain as DomainName;
state.currentDomain = domain;
const handler = await getDomainHandler(domain);
const tools = handler.getTools().map(t => t.name);
// Notify client that the tool list changed
await server.sendToolListChanged();
return {
content: [{
type: 'text',
text: `Navigated to ${domain}. Available tools: ${tools.join(', ')}`,
}],
};
}
if (name === '{service}_back') {
state.currentDomain = null;
await server.sendToolListChanged();
return {
content: [{ type: 'text', text: 'Returned to domain navigation.' }],
};
}
if (name === '{service}_status') {
const { getCredentials } = await import('./utils/client.js');
const creds = getCredentials();
return {
content: [{
type: 'text',
text: JSON.stringify({
connected: !!creds,
domains: ['devices', 'organizations', 'alerts', 'tickets'],
currentDomain: state.currentDomain,
}, null, 2),
}],
};
}
// Domain tool calls
if (!state.currentDomain) {
return {
content: [{ type: 'text', text: `Unknown tool: ${name}. Use {service}_navigate first.` }],
isError: true,
};
}
const handler = await getDomainHandler(state.currentDomain);
try {
return await handler.handleCall(name, args || {}, extra);
} catch (error) {
logger.error('Tool call failed', { tool: name, error: (error as Error).message });
return {
content: [{ type: 'text', text: `Error: ${(error as Error).message}` }],
isError: true,
};
}
});
return server;
}

Each domain file follows this exact structure:
// src/domains/organizations.ts
import type { Tool } from '@modelcontextprotocol/sdk/types.js';
import type { DomainHandler, CallToolResult } from '../utils/types.js';
import type { RequestHandlerExtra } from '@modelcontextprotocol/sdk/shared/protocol.js';
import { getClient } from '../utils/client.js';
import { logger } from '../utils/logger.js';
function getTools(): Tool[] {
return [
{
name: '{service}_organizations_list',
description: 'List organizations. Returns paginated results.',
inputSchema: {
type: 'object',
properties: {
limit: { type: 'number', description: 'Max results (default: 50)' },
cursor: { type: 'string', description: 'Pagination cursor' },
},
},
},
{
name: '{service}_organizations_get',
description: 'Get organization details by ID.',
inputSchema: {
type: 'object',
properties: {
organization_id: { type: 'number', description: 'Organization ID' },
},
required: ['organization_id'],
},
},
];
}
async function handleCall(
toolName: string,
args: Record<string, unknown>,
extra?: RequestHandlerExtra
): Promise<CallToolResult> {
const client = await getClient();
switch (toolName) {
case '{service}_organizations_list': {
const limit = (args.limit as number) || 50;
const cursor = args.cursor as string | undefined;
logger.info('API call: organizations.list', { limit, cursor });
const response = await client.organizations.list({ pageSize: limit, cursor });
// IMPORTANT: Handle both raw array and wrapped object responses
const organizations = Array.isArray(response)
? response
: (response?.organizations ?? []);
const nextCursor = Array.isArray(response) ? undefined : response?.cursor;
return {
content: [{
type: 'text',
text: JSON.stringify({ organizations, cursor: nextCursor }, null, 2),
}],
};
}
case '{service}_organizations_get': {
const orgId = args.organization_id as number;
logger.info('API call: organizations.get', { orgId });
const organization = await client.organizations.get(orgId);
return {
content: [{ type: 'text', text: JSON.stringify(organization, null, 2) }],
};
}
default:
return {
content: [{ type: 'text', text: `Unknown tool: ${toolName}` }],
isError: true,
};
}
}
export const organizationsHandler: DomainHandler = { getTools, handleCall };

// src/domains/index.ts
import type { DomainHandler, DomainName } from '../utils/types.js';
const domainCache = new Map<DomainName, DomainHandler>();
export async function getDomainHandler(domain: DomainName): Promise<DomainHandler> {
const cached = domainCache.get(domain);
if (cached) return cached;
let handler: DomainHandler;
switch (domain) {
case 'devices': {
const { devicesHandler } = await import('./devices.js');
handler = devicesHandler;
break;
}
case 'organizations': {
const { organizationsHandler } = await import('./organizations.js');
handler = organizationsHandler;
break;
}
// ... more domains
default:
throw new Error(`Unknown domain: ${domain}`);
}
domainCache.set(domain, handler);
return handler;
}

Every MCP server MUST support elicitation — requesting user input during tool execution. Two modes are available: form (structured input) and URL (sensitive input via browser).
Used to collect structured, non-sensitive data through a schema-driven form rendered by the client. The server pauses tool execution until the user responds.
// src/elicitation/forms.ts
import type { Server } from '@modelcontextprotocol/sdk/server/index.js';
import type { ElicitResult } from '@modelcontextprotocol/sdk/types.js';
/**
* Request structured input from the user via a form.
* Returns the user's response or null if they declined/cancelled.
*/
export async function elicitForm(
server: Server,
message: string,
schema: {
type: 'object';
properties: Record<string, unknown>;
required?: string[];
}
): Promise<Record<string, unknown> | null> {
const result: ElicitResult = await server.elicitInput({
mode: 'form',
message,
requestedSchema: schema,
});
if (result.action === 'accept' && result.content) {
return result.content;
}
return null; // user declined or cancelled
}

// Inside a domain handler — collect missing credentials at runtime
async function handleCall(
toolName: string,
args: Record<string, unknown>,
extra?: RequestHandlerExtra
): Promise<CallToolResult> {
const creds = getCredentials();
if (!creds) {
// Elicit credentials from the user
const result = await elicitForm(server, 'API credentials are required. Please provide them:', {
type: 'object',
properties: {
client_id: {
type: 'string',
title: 'Client ID',
description: 'Your API client ID',
},
client_secret: {
type: 'string',
title: 'Client Secret',
description: 'Your API client secret',
},
region: {
type: 'string',
title: 'Region',
oneOf: [
{ const: 'us', title: 'United States' },
{ const: 'eu', title: 'Europe' },
{ const: 'oc', title: 'Oceania' },
],
default: 'us',
},
},
required: ['client_id', 'client_secret'],
});
if (!result) {
return {
content: [{ type: 'text', text: 'Credentials are required to use this tool.' }],
isError: true,
};
}
// Store credentials for this session
process.env.SERVICE_CLIENT_ID = result.client_id as string;
process.env.SERVICE_CLIENT_SECRET = result.client_secret as string;
if (result.region) process.env.SERVICE_REGION = result.region as string;
}
// Proceed with the actual tool call...
}

Used when the server needs to redirect the user to a URL (OAuth flow, payment confirmation, etc.) to securely collect sensitive data outside the LLM context.
// src/elicitation/url.ts
import { randomUUID } from 'node:crypto';
import { UrlElicitationRequiredError } from '@modelcontextprotocol/sdk/types.js';
import type { Server } from '@modelcontextprotocol/sdk/server/index.js';
type TrackedElicitation = {
status: 'pending' | 'complete';
completionNotifier: (() => Promise<void>) | undefined;
completeResolver: () => void;
completedPromise: Promise<void>;
createdAt: Date;
sessionId: string;
};
const elicitationsMap = new Map<string, TrackedElicitation>();
/**
* Create a tracked URL elicitation that can be completed out-of-band.
*/
export function createUrlElicitation(
server: Server,
sessionId: string,
baseUrl: string,
path: string,
message: string
): never {
const elicitationId = randomUUID();
let completeResolver!: () => void;
const completedPromise = new Promise<void>(resolve => {
completeResolver = resolve;
});
const completionNotifier = server.createElicitationCompletionNotifier(elicitationId);
elicitationsMap.set(elicitationId, {
status: 'pending',
completedPromise,
completeResolver,
createdAt: new Date(),
sessionId,
completionNotifier,
});
// Throw to signal the client to open the URL
throw new UrlElicitationRequiredError([{
mode: 'url',
message,
url: `${baseUrl}${path}?elicitation=${elicitationId}`,
elicitationId,
}]);
}
/**
* Complete a URL elicitation (called from your HTTP route handler
* after the user submits the form at the URL).
*/
export async function completeUrlElicitation(elicitationId: string): Promise<void> {
const elicitation = elicitationsMap.get(elicitationId);
if (!elicitation) throw new Error(`Unknown elicitation: ${elicitationId}`);
elicitation.status = 'complete';
await elicitation.completionNotifier?.();
elicitation.completeResolver();
}
/**
* Wait for a URL elicitation to complete (blocks tool execution).
*/
export async function waitForElicitation(elicitationId: string): Promise<void> {
const elicitation = elicitationsMap.get(elicitationId);
if (!elicitation) throw new Error(`Unknown elicitation: ${elicitationId}`);
await elicitation.completedPromise;
}

| Type | Extra Fields |
|---|---|
| `string` | `minLength`, `maxLength`, `format` (`'date'`, `'uri'`, `'email'`, `'date-time'`), `enum[]`, `enumNames[]`, `oneOf[]` of `{const, title}` |
| `boolean` | `default` |
| `number` / `integer` | `minimum`, `maximum`, `default` |
| `array` | `items.enum[]` or `items.anyOf[]` of `{const, title}`, `minItems`, `maxItems` |
Every MCP server MUST support push notifications — server-initiated messages sent to clients over the SSE stream. These are critical for keeping clients in sync when tools, resources, or server state changes.
// On the Server instance — available everywhere
server.sendLoggingMessage({ level: 'info', data: 'Processing...' }, sessionId);
server.sendToolListChanged(); // After navigation state changes
server.sendResourceListChanged(); // After resource additions/removals
server.sendResourceUpdated({ uri: 'resource://...' }); // Specific resource changed
server.sendPromptListChanged(); // After prompt additions/removals
// Inside a request handler via extra object
extra.sendNotification({
method: 'notifications/progress',
params: {
progressToken: request.params._meta?.progressToken,
progress: 50,
total: 100,
},
});

The decision-tree navigation MUST call sendToolListChanged() after every navigation state change. This ensures the client refreshes its tool list and sees the correct tools for the current domain. See the {service}_navigate and {service}_back handlers above.
async function handleCall(
toolName: string,
args: Record<string, unknown>,
extra?: RequestHandlerExtra
): Promise<CallToolResult> {
// Send progress updates for long-running operations
const progressToken = extra?.request?.params?._meta?.progressToken;
const items = await fetchAllPages(client, args);
for (let i = 0; i < items.length; i++) {
await processItem(items[i]);
// Push progress notification
if (progressToken && extra?.sendNotification) {
await extra.sendNotification({
method: 'notifications/progress',
params: { progressToken, progress: i + 1, total: items.length },
});
}
}
return { content: [{ type: 'text', text: JSON.stringify(items, null, 2) }] };
}

// src/index.ts
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
import { createServer } from './server.js';
const server = createServer();
const transport = new StdioServerTransport();
await server.connect(transport);

// src/http.ts
import { createServer as createHttpServer } from 'node:http';
import { randomUUID } from 'node:crypto';
import { StreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/streamableHttp.js';
import { isInitializeRequest } from '@modelcontextprotocol/sdk/types.js';
import { createServer } from './server.js';
import { getCredentials } from './utils/client.js';
import { logger } from './utils/logger.js';
const transports: Record<string, StreamableHTTPServerTransport> = {};
function startHttpServer(): void {
const port = parseInt(process.env.MCP_HTTP_PORT || '8080', 10);
const host = process.env.MCP_HTTP_HOST || '0.0.0.0';
const isGatewayMode = process.env.AUTH_MODE === 'gateway';
const httpServer = createHttpServer(async (req, res) => {
const url = new URL(req.url || '/', `http://${req.headers.host || 'localhost'}`);
// Health check — unauthenticated
if (url.pathname === '/health') {
const creds = getCredentials();
const statusCode = creds ? 200 : 503;
res.writeHead(statusCode, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({
status: creds ? 'ok' : 'degraded',
transport: 'http',
credentials: { configured: !!creds },
timestamp: new Date().toISOString(),
}));
return;
}
if (url.pathname !== '/mcp') {
res.writeHead(404, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Not found', endpoints: ['/mcp', '/health'] }));
return;
}
// Gateway mode: extract credentials from headers
if (isGatewayMode) {
const clientId = req.headers['x-service-client-id'] as string;
const clientSecret = req.headers['x-service-client-secret'] as string;
if (!clientId || !clientSecret) {
res.writeHead(401, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({ error: 'Missing credentials' }));
return;
}
process.env.SERVICE_CLIENT_ID = clientId;
process.env.SERVICE_CLIENT_SECRET = clientSecret;
}
const sessionId = req.headers['mcp-session-id'] as string | undefined;
// POST — handle JSON-RPC messages
if (req.method === 'POST') {
const body = await readBody(req);
const parsed = JSON.parse(body);
if (sessionId && transports[sessionId]) {
await transports[sessionId].handleRequest(req, res, parsed);
return;
}
if (!sessionId && isInitializeRequest(parsed)) {
const transport = new StreamableHTTPServerTransport({
sessionIdGenerator: () => randomUUID(),
enableJsonResponse: true,
onsessioninitialized: (sid) => { transports[sid] = transport; },
});
transport.onclose = () => {
const sid = transport.sessionId;
if (sid) delete transports[sid];
};
const server = createServer();
await server.connect(transport);
await transport.handleRequest(req, res, parsed);
return;
}
res.writeHead(400, { 'Content-Type': 'application/json' });
res.end(JSON.stringify({
jsonrpc: '2.0',
error: { code: -32000, message: 'Bad Request: missing or invalid session' },
id: null,
}));
return;
}
// GET — SSE stream for server-initiated notifications
if (req.method === 'GET') {
if (!sessionId || !transports[sessionId]) {
res.writeHead(400, { 'Content-Type': 'text/plain' });
res.end('Invalid or missing session ID');
return;
}
await transports[sessionId].handleRequest(req, res);
return;
}
// DELETE — terminate session
if (req.method === 'DELETE') {
if (!sessionId || !transports[sessionId]) {
res.writeHead(400, { 'Content-Type': 'text/plain' });
res.end('Invalid or missing session ID');
return;
}
await transports[sessionId].handleRequest(req, res);
return;
}
res.writeHead(405).end();
});
httpServer.listen(port, host, () => {
logger.info(`HTTP streaming server listening on ${host}:${port}`);
});
}
function readBody(req: import('node:http').IncomingMessage): Promise<string> {
return new Promise((resolve, reject) => {
const chunks: Buffer[] = [];
req.on('data', (chunk) => chunks.push(chunk));
req.on('end', () => resolve(Buffer.concat(chunks).toString()));
req.on('error', reject);
});
}
// Determine transport from environment
const transport = process.env.MCP_TRANSPORT;
if (transport === 'http') {
startHttpServer();
} else {
// Default: stdio
import('./index.js');
}

// src/worker.ts
import { WebStandardStreamableHTTPServerTransport } from '@modelcontextprotocol/sdk/server/webStandardStreamableHttp.js';
import { createServer } from './server.js';
export default {
async fetch(request: Request, env: Env): Promise<Response> {
const url = new URL(request.url);
// Health check
if (url.pathname === '/health') {
return new Response(JSON.stringify({
status: 'ok',
transport: 'cloudflare-worker',
timestamp: new Date().toISOString(),
}), { headers: { 'Content-Type': 'application/json' } });
}
if (url.pathname !== '/mcp') {
return new Response(JSON.stringify({ error: 'Not found' }), {
status: 404,
headers: { 'Content-Type': 'application/json' },
});
}
// Stateless: one transport per request
const transport = new WebStandardStreamableHTTPServerTransport({
sessionIdGenerator: undefined, // stateless
});
// Inject credentials from environment bindings
process.env.SERVICE_CLIENT_ID = env.SERVICE_CLIENT_ID;
process.env.SERVICE_CLIENT_SECRET = env.SERVICE_CLIENT_SECRET;
if (env.SERVICE_REGION) process.env.SERVICE_REGION = env.SERVICE_REGION;
const server = createServer();
await server.connect(transport);
return transport.handleRequest(request);
},
} satisfies ExportedHandler<Env>;
interface Env {
SERVICE_CLIENT_ID: string;
SERVICE_CLIENT_SECRET: string;
SERVICE_REGION?: string;
}

// src/utils/logger.ts — ALL output to stderr (stdout = MCP protocol)
const LEVELS = { debug: 0, info: 1, warn: 2, error: 3 } as const;
type LogLevel = keyof typeof LEVELS;

function getConfiguredLevel(): LogLevel {
  const env = (process.env.LOG_LEVEL || 'info').toLowerCase();
  return (env in LEVELS) ? env as LogLevel : 'info';
}

function log(level: LogLevel, message: string, context?: unknown): void {
  if (LEVELS[level] < LEVELS[getConfiguredLevel()]) return;
  const timestamp = new Date().toISOString();
  const prefix = `${timestamp} [${level.toUpperCase()}]`;
  if (context !== undefined) {
    let contextStr: string;
    try { contextStr = JSON.stringify(context); }
    catch { contextStr = String(context); }
    console.error(`${prefix} ${message} ${contextStr}`);
  } else {
    console.error(`${prefix} ${message}`);
  }
}

export const logger = {
  debug: (msg: string, ctx?: unknown) => log('debug', msg, ctx),
  info: (msg: string, ctx?: unknown) => log('info', msg, ctx),
  warn: (msg: string, ctx?: unknown) => log('warn', msg, ctx),
  error: (msg: string, ctx?: unknown) => log('error', msg, ctx),
};

// src/utils/client.ts
import type { ServiceClient } from '@org/node-{service}';
import { logger } from './logger.js';

interface Credentials {
  clientId: string;
  clientSecret: string;
  region: string;
}

let _client: ServiceClient | null = null;
let _credentials: Credentials | null = null;

export function getCredentials(): Credentials | null {
  const clientId = process.env.SERVICE_CLIENT_ID;
  const clientSecret = process.env.SERVICE_CLIENT_SECRET;
  if (!clientId || !clientSecret) {
    logger.warn('Missing credentials', { hasClientId: !!clientId, hasClientSecret: !!clientSecret });
    return null;
  }
  return { clientId, clientSecret, region: process.env.SERVICE_REGION || 'us' };
}

export async function getClient(): Promise<ServiceClient> {
  const creds = getCredentials();
  if (!creds) throw new Error('No API credentials configured');

  // Invalidate cache if credentials changed
  if (_client && _credentials &&
      (creds.clientId !== _credentials.clientId || creds.clientSecret !== _credentials.clientSecret)) {
    _client = null;
  }

  if (!_client) {
    const { ServiceClient } = await import('@org/node-{service}');
    _client = new ServiceClient(creds);
    _credentials = creds;
    logger.info('Created API client', { region: creds.region });
  }
  return _client;
}

# Dockerfile — Multi-stage build
FROM node:22-alpine AS builder
ARG GITHUB_TOKEN
WORKDIR /app
COPY package*.json ./
RUN echo "@org:registry=https://npm.pkg.github.com" > .npmrc && \
    echo "//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}" >> .npmrc && \
    npm ci --ignore-scripts && rm -f .npmrc
COPY . .
RUN npm run build

FROM node:22-alpine AS production
RUN addgroup -g 1001 -S appuser && adduser -S appuser -u 1001 -G appuser
WORKDIR /app
COPY package*.json ./
COPY --from=builder /app/dist ./dist
COPY --from=builder /app/node_modules ./node_modules
RUN npm prune --omit=dev && npm cache clean --force
USER appuser
EXPOSE 8080
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
  CMD wget --no-verbose --tries=1 --spider http://localhost:8080/health || exit 1
ENV NODE_ENV=production MCP_TRANSPORT=http MCP_HTTP_PORT=8080 LOG_LEVEL=info
CMD ["node", "dist/http.js"]

# docker-compose.yml
services:
  mcp:
    build:
      context: .
      args:
        GITHUB_TOKEN: ${GITHUB_TOKEN}
    ports:
      - "8080:8080"
    environment:
      - MCP_TRANSPORT=http
      - MCP_HTTP_PORT=8080
      - AUTH_MODE=${AUTH_MODE:-env}
      - SERVICE_CLIENT_ID=${SERVICE_CLIENT_ID}
      - SERVICE_CLIENT_SECRET=${SERVICE_CLIENT_SECRET}
      - SERVICE_REGION=${SERVICE_REGION:-us}
      - LOG_LEVEL=${LOG_LEVEL:-info}
    healthcheck:
      test: ["CMD", "wget", "--no-verbose", "--tries=1", "--spider", "http://localhost:8080/health"]
      interval: 30s
      timeout: 10s
      retries: 3

Every MCP server MUST be packaged as an MCPB bundle for single-click installation in Claude Desktop.

# manifest.json
{
"manifest_version": "0.3",
"name": "{service}-mcp",
"display_name": "{Service Name} MCP Server",
"version": "1.0.0",
"description": "MCP server for {Service Name} — manage devices, organizations, tickets, and more.",
"long_description": "Provides Claude with tools to interact with the {Service Name} API...",
"author": {
"name": "Your Name",
"url": "https://github.com/yourorg"
},
"repository": {
"type": "git",
"url": "https://github.com/yourorg/{service}-mcp"
},
"license": "MIT",
"keywords": ["{service}", "mcp"],
"icon": "icon.png",
"server": {
"type": "node",
"entry_point": "dist/index.js",
"mcp_config": {
"command": "node",
"args": ["${__dirname}/dist/index.js"],
"env": {
"SERVICE_CLIENT_ID": "${user_config.client_id}",
"SERVICE_CLIENT_SECRET": "${user_config.client_secret}",
"SERVICE_REGION": "${user_config.region}"
}
}
},
"tools_generated": true,
"user_config": {
"client_id": {
"type": "string",
"title": "Client ID",
"description": "Your API client ID",
"required": true
},
"client_secret": {
"type": "string",
"title": "Client Secret",
"description": "Your API client secret",
"required": true,
"sensitive": true
},
"region": {
"type": "string",
"title": "Region",
"description": "API region (us, eu, oc)",
"required": false
}
},
"compatibility": {
"platforms": ["darwin", "win32", "linux"],
"runtimes": { "node": ">=18.0.0" },
"claude_desktop": ">=0.10.0"
}
}

package.json additions for MCPB packing:

{
"scripts": {
"build": "tsup",
"mcpb:pack": "npm run build && npm install --production && mcpb pack"
},
"devDependencies": {
"@anthropic-ai/mcpb": "^2.1.0"
}
}

The `mcpb pack` command creates a `{service}-mcp.mcpb` ZIP file containing `manifest.json`,
`dist/`, `node_modules/` (production only), and `package.json`.

# .do/app.yaml
name: {service}-mcp
region: nyc
services:
  - name: {service}-mcp
    github:
      repo: yourorg/{service}-mcp
      branch: main
      deploy_on_push: true
    dockerfile_path: Dockerfile
    http_port: 8080
    instance_count: 1
    instance_size_slug: basic-xxs
    routes:
      - path: /
    health_check:
      http_path: /health
      initial_delay_seconds: 10
      period_seconds: 30
      timeout_seconds: 5
    envs:
      - key: MCP_TRANSPORT
        value: http
      - key: AUTH_MODE
        value: gateway
      - key: SERVICE_CLIENT_ID
        type: SECRET
        scope: RUN_TIME
      - key: SERVICE_CLIENT_SECRET
        type: SECRET
        scope: RUN_TIME
      - key: SERVICE_REGION
        value: us
      - key: LOG_LEVEL
        value: info

doctl apps create --spec .do/app.yaml
# Set secrets
doctl apps update <app-id> --spec .do/app.yaml

// wrangler.jsonc
{
"name": "{service}-mcp",
"main": "dist/worker.js",
"compatibility_date": "2026-02-01",
"node_compat": true
}

// tsup.config.ts
import { defineConfig } from 'tsup';

export default defineConfig([
  // Node.js builds (stdio + HTTP)
  {
    entry: { index: 'src/index.ts', http: 'src/http.ts' },
    format: ['esm'],
    target: 'node22',
    outDir: 'dist',
    clean: true,
    dts: true,
    sourcemap: true,
  },
  // Cloudflare Worker build
  {
    entry: { worker: 'src/worker.ts' },
    format: ['esm'],
    target: 'esnext',
    platform: 'browser', // Cloudflare Worker runtime
    outDir: 'dist',
    clean: false, // Don't clean — append to existing dist
    noExternal: [/.*/], // Bundle all dependencies
    sourcemap: true,
  },
]);

# Set secrets
wrangler secret put SERVICE_CLIENT_ID
wrangler secret put SERVICE_CLIENT_SECRET
# Deploy
wrangler deploy

# .github/workflows/release.yml
name: Release
on:
  push: { branches: [main] }
  pull_request: { branches: [main] }
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix: { node-version: [20, 22] }
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with: { node-version: '${{ matrix.node-version }}', cache: npm }
      - run: npm ci && npm run build && npm test
  release:
    needs: test
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions: { contents: write, issues: write, packages: write }
    outputs:
      version: ${{ steps.version.outputs.version }}
    steps:
      - uses: actions/checkout@v4
        with: { fetch-depth: 0 }
      - run: npx semantic-release
        env: { GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}' }
      - id: version
        run: echo "version=$(node -p "require('./package.json').version")" >> $GITHUB_OUTPUT
  docker:
    needs: [test, release]  # MUST depend on release to avoid version race
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions: { contents: read, packages: write }
    steps:
      - uses: actions/checkout@v4
        with: { ref: main }  # Get version-bumped commit
      - uses: docker/login-action@v3
        with: { registry: ghcr.io, username: '${{ github.actor }}', password: '${{ secrets.GITHUB_TOKEN }}' }
      - uses: docker/build-push-action@v6
        with:
          push: true
          build-args: GITHUB_TOKEN=${{ secrets.GITHUB_TOKEN }}
          tags: |
            ghcr.io/${{ github.repository }}:latest
            ghcr.io/${{ github.repository }}:v${{ needs.release.outputs.version }}
  mcpb:
    needs: [test, release]
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    permissions: { contents: write }
    steps:
      - uses: actions/checkout@v4
        with: { ref: main }
      - uses: actions/setup-node@v4
        with: { node-version: 22, cache: npm }
      - run: npm ci && npm run build && npm install --production
      - run: npx mcpb pack
      - uses: softprops/action-gh-release@v2
        with:
          tag_name: v${{ needs.release.outputs.version }}
          files: '*.mcpb'
  cloudflare:
    needs: [test, release]
    if: github.event_name == 'push' && github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with: { ref: main }
      - uses: actions/setup-node@v4
        with: { node-version: 22, cache: npm }
      - run: npm ci && npm run build
      - run: npx wrangler deploy
        env: { CLOUDFLARE_API_TOKEN: '${{ secrets.CLOUDFLARE_API_TOKEN }}' }

.releaserc.json:

{
"branches": ["main"],
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/changelog",
["@semantic-release/npm", { "npmPublish": false }],
["@semantic-release/git", {
"assets": ["CHANGELOG.md", "package.json"],
"message": "chore(release): ${nextRelease.version} [skip ci]"
}],
"@semantic-release/github"
]
}

package.json (MCP server):

{
"name": "{service}-mcp",
"version": "1.0.0",
"type": "module",
"main": "./dist/index.js",
"scripts": {
"build": "tsup",
"dev": "tsup --watch",
"start": "node dist/index.js",
"start:http": "MCP_TRANSPORT=http node dist/http.js",
"test": "vitest run",
"test:watch": "vitest",
"lint": "eslint src/",
"typecheck": "tsc --noEmit",
"mcpb:pack": "npm run build && npm install --production && mcpb pack",
"cf:deploy": "npm run build && wrangler deploy",
"cf:dev": "npm run build && wrangler dev"
},
"dependencies": {
"@modelcontextprotocol/sdk": "^1.27.0",
"@org/node-{service}": "^1.0.0"
},
"devDependencies": {
"@anthropic-ai/mcpb": "^2.1.0",
"tsup": "^8.0.0",
"typescript": "^5.5.0",
"vitest": "^2.0.0",
"msw": "^2.4.0",
"wrangler": "^4.0.0"
},
"engines": { "node": ">=18.0.0" }
}

Every project (both the client library and the MCP server) MUST include standard open source documentation files. These are scaffolded at project creation time.
# {Project Name}
{One-line description of what this project does.}
## Installation
\`\`\`bash
npm install @org/{package-name}
\`\`\`
For GitHub Packages authentication, add to your `.npmrc`:
\`\`\`
@org:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=YOUR_GITHUB_TOKEN
\`\`\`
## Usage
\`\`\`typescript
import { ServiceClient } from '@org/{package-name}';
const client = new ServiceClient({
clientId: process.env.SERVICE_CLIENT_ID,
clientSecret: process.env.SERVICE_CLIENT_SECRET,
});
const devices = await client.devices.list();
\`\`\`
## API Reference
See the [API documentation](docs/api.md) for full details.
## Development
\`\`\`bash
git clone https://github.com/org/{repo-name}.git
cd {repo-name}
npm install
npm run build
npm test
\`\`\`
## Contributing
See [CONTRIBUTING.md](CONTRIBUTING.md) for guidelines.
## License
[MIT](LICENSE)

Adapt per project: the MCP server README should include Docker usage, MCPB installation, and Claude Desktop configuration examples instead of the npm install section.
# Contributing to {Project Name}
Thank you for your interest in contributing!
## Getting Started
1. Fork the repository
2. Clone your fork: `git clone https://github.com/YOUR_USERNAME/{repo-name}.git`
3. Create a branch: `git checkout -b feature/your-feature`
4. Install dependencies: `npm install`
5. Make your changes
6. Run tests: `npm test`
7. Run linting: `npm run lint`
8. Commit using [Conventional Commits](https://www.conventionalcommits.org/):
- `feat: add new feature`
- `fix: resolve bug`
- `docs: update documentation`
- `chore: maintenance tasks`
9. Push and open a Pull Request
## Development Setup
\`\`\`bash
npm install # Install dependencies
npm run build # Build the project
npm test # Run tests
npm run lint # Run linter
npm run typecheck # Type check
\`\`\`
## Commit Convention
This project uses [Conventional Commits](https://www.conventionalcommits.org/) and
[semantic-release](https://semantic-release.gitbook.io/) for automated versioning.
Your commit messages determine the next version number:
- `fix:` → patch release (1.0.x)
- `feat:` → minor release (1.x.0)
- `feat!:` or `BREAKING CHANGE:` → major release (x.0.0)
## Pull Request Guidelines
- Keep PRs focused on a single change
- Include tests for new functionality
- Ensure all CI checks pass
- Update documentation if needed
## Reporting Issues
Use [GitHub Issues](https://github.com/org/{repo-name}/issues) to report bugs or
request features. Include reproduction steps for bugs.

MIT License
Copyright (c) {year} {org or author name}
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
CODE_OF_CONDUCT.md: use the Contributor Covenant v2.1.
Do not write it inline — instead, scaffold it at project creation time:

npx covgen YOUR_EMAIL@example.com

Or download directly:
curl -sL https://www.contributor-covenant.org/version/2/1/code_of_conduct/code_of_conduct.md \
  -o CODE_OF_CONDUCT.md

CHANGELOG.md: do not write it manually. semantic-release generates and maintains CHANGELOG.md
automatically via the @semantic-release/changelog plugin. Initialize with:
# Changelog
All notable changes to this project will be documented in this file.
See [Conventional Commits](https://conventionalcommits.org) for commit guidelines.

All logging MUST go to stderr (console.error). Any console.log will corrupt the
JSON-RPC stream on stdio transport. Use the structured logger pattern above.
Many APIs return raw arrays for list endpoints, not wrapped objects. Always handle both:
const items = Array.isArray(response) ? response : (response?.items ?? []);

If you skip this, `JSON.stringify({ items: undefined })` silently produces `{}`.
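A small normalizer makes the pattern reusable across handlers (the helper name `toItems` is illustrative, not part of any SDK):

```typescript
// Hypothetical helper: normalize list responses that may arrive as a raw
// array, as a { items: [...] } wrapper, or as null/undefined.
function toItems<T>(response: T[] | { items?: T[] } | null | undefined): T[] {
  if (Array.isArray(response)) return response;
  return response?.items ?? []; // missing wrapper or field → safe empty array
}
```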
Never `response.json()` in try + `response.text()` in catch. Read `.text()` first,
then `JSON.parse()`. See the fetch-response-body-double-read skill.
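A sketch of the safe pattern (the function name `parseJsonSafely` is ours; `Response` is the standard Fetch API class available in Node 18+):

```typescript
// Consume the body exactly once as text, then parse. A Response body is a
// one-shot stream, so calling .json() and then .text() on failure throws.
async function parseJsonSafely(response: Response): Promise<unknown> {
  const text = await response.text(); // the only body read
  try {
    return JSON.parse(text);
  } catch {
    // Non-JSON body (e.g. an HTML error page): surface status plus a snippet
    throw new Error(`Non-JSON response (status ${response.status}): ${text.slice(0, 200)}`);
  }
}
```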
Docker job MUST depend on the Release job, not just Test. Otherwise Docker reads
`package.json` before semantic-release bumps the version. See the
ci-docker-semantic-release-version-race skill.
Use `{service}_{domain}_{action}` (e.g., `ninjaone_devices_list`). This prevents
collisions when multiple MCP servers are loaded simultaneously in Claude.
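The convention as a one-line helper (illustrative only; tool names are usually written out literally at registration time):

```typescript
// Build a collision-safe tool name: {service}_{domain}_{action}.
function toolName(service: string, domain: string, action: string): string {
  return `${service}_${domain}_${action}`;
}

toolName("ninjaone", "devices", "list"); // → "ninjaone_devices_list"
```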
Every server MUST use the navigate → domain → tools flow. This is NOT optional.
Call `server.sendToolListChanged()` after every navigation state change.
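A structural sketch of that flow — the `ToolListNotifier` interface and `currentDomain` variable are illustrative stand-ins for the SDK's `McpServer` and your own navigation state:

```typescript
// Typed structurally so the sketch does not depend on the SDK package.
interface ToolListNotifier {
  sendToolListChanged(): void;
}

let currentDomain: string | null = null; // hypothetical navigation state

function navigateTo(server: ToolListNotifier, domain: string): void {
  currentDomain = domain;       // subsequent tools/list calls expose domain tools
  server.sendToolListChanged(); // push notification: client should re-fetch tools
}

function navigateBack(server: ToolListNotifier): void {
  currentDomain = null;         // back to navigation-only tools
  server.sendToolListChanged();
}
```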
Check credentials early (before exposing domain tools) to give clear error messages rather than failing deep inside an API call. Use elicitation to collect missing credentials.
The Worker entry point uses `sessionIdGenerator: undefined` (stateless mode). Each
request creates a fresh transport + server. State like navigation must be passed in
request context or stored in KV/Durable Objects for stateful Workers.
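A hedged sketch of persisting navigation state in Workers KV for a stateful deployment — the `KVLike` shape mirrors the KV binding's `get`/`put` methods, and the `nav:` key scheme is an assumption:

```typescript
// Minimal structural interface so the sketch needs no Cloudflare types.
interface KVLike {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

// Load the active domain for a session, or null if it has never navigated.
async function loadDomain(kv: KVLike, sessionId: string): Promise<string | null> {
  return kv.get(`nav:${sessionId}`);
}

// Persist the active domain so the next stateless request can restore it.
async function saveDomain(kv: KVLike, sessionId: string, domain: string): Promise<void> {
  await kv.put(`nav:${sessionId}`, domain);
}
```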
Form and URL elicitation only work if the client supports them. Always have a fallback (environment variables, error messages) for clients that don't support elicitation.
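A hedged sketch of that fallback chain — environment variable first, elicitation only if the client supports it. The `elicit` callback is a hypothetical wrapper around the SDK's elicitation request, not an SDK API:

```typescript
// Resolve a credential with graceful degradation for non-elicitation clients.
async function resolveClientId(
  elicit: ((prompt: string) => Promise<string | null>) | null,
): Promise<string> {
  const fromEnv = process.env.SERVICE_CLIENT_ID;
  if (fromEnv) return fromEnv; // works with every client
  if (elicit) {
    const answer = await elicit("Enter your API client ID");
    if (answer) return answer; // client supported the credential form
  }
  // A clear error beats failing deep inside an API call later
  throw new Error("SERVICE_CLIENT_ID is not configured; set it in the environment");
}
```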
Cloudflare Workers MUST use `WebStandardStreamableHTTPServerTransport` (not the Node.js
`StreamableHTTPServerTransport`). Import from `@modelcontextprotocol/sdk/server/webStandardStreamableHttp.js`.
- `npm run build` — TypeScript compiles cleanly (all 3 targets: index, http, worker)
- `npm test` — All tests pass
- `npm run lint` — No lint errors
- stdio transport: `echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | node dist/index.js`
- HTTP transport: `MCP_TRANSPORT=http node dist/http.js` → `curl localhost:8080/health`
- Decision tree: `tools/list` returns only navigation tools initially
- Navigate: calling `{service}_navigate` changes the `tools/list` response to domain tools
- Back: calling `{service}_back` restores navigation tools
- Push notifications: `sendToolListChanged()` fires on navigate/back
- Elicitation: credential form appears when credentials are missing
- Docker: `docker compose up` → health check passes
- MCPB: `npm run mcpb:pack` produces a valid `.mcpb` file
- Cloudflare: `npm run cf:dev` → `curl localhost:8787/health`
- `LOG_LEVEL=debug` shows API calls and responses in stderr
- Missing credentials returns a clear error or triggers elicitation
- GitHub Packages: Client library `.releaserc.json` has `"npmPublish": true`
- GitHub Packages: Client library `package.json` has `publishConfig.registry` set to `https://npm.pkg.github.com`
- GitHub Packages: Client library CI workflow uses `registry-url: 'https://npm.pkg.github.com'`
- GHCR: MCP server CI pushes Docker images to `ghcr.io/${{ github.repository }}`
- OSS Docs: README.md exists with install, usage, contributing, and license sections
- OSS Docs: CONTRIBUTING.md exists with dev setup, commit convention, and PR guidelines
- OSS Docs: LICENSE file exists (MIT)
- OSS Docs: CODE_OF_CONDUCT.md exists (Contributor Covenant v2.1)
- OSS Docs: CHANGELOG.md is auto-generated by semantic-release
- Keep the MCP server thin — business logic belongs in the client library
- Use `vi.mock()` to mock the client library in MCP server tests
- For APIs with non-standard auth (API keys vs OAuth), adapt the auth.ts pattern
- Consider adding a `{service}_status` tool that shows connection health and credential state
- MCPB uses `@anthropic-ai/mcpb` (v2.1+) — install globally or use npx
- DO App Platform uses the `basic-xxs` instance size by default — scale as needed
- Cloudflare Worker builds must bundle ALL dependencies (`noExternal: [/.*/]`)
- GitHub Packages requires the `repository` field in `package.json` to match the GitHub org that owns the `@org` scope
- `GITHUB_TOKEN` with `packages: write` permission is sufficient for both npm publish and Docker push — no separate tokens needed
- Use `npx covgen` or download the Contributor Covenant directly — do not copy-paste the full text into the skill