| name | description | user-invocable | disable-model-invocation | allowed-tools | argument-hint |
|---|---|---|---|---|---|
| consult | Send project context and a question to GPT-5.2 Pro for an external consultation. Gathers docs, code, and context automatically, submits via the OpenAI Responses API in background mode, and notifies when the response arrives. Use /consult-results to read the response. | true | true | Read, Glob, Grep, Bash(uv run *) | <your question or request for the consultant> |
You are helping the user send a consultation request to GPT-5.2 Pro. Follow these steps exactly:
The user's consultation prompt is: $ARGUMENTS
Automatically detect and read relevant project files. Use your judgment based on the user's prompt:
- Always include: README, CLAUDE.md, docs/ directory contents, package.json / pyproject.toml / Cargo.toml (whatever exists)
- If about security/architecture/design: Include all docs, config files, infrastructure files (terraform, CDK, docker), API route definitions
- If about code quality/patterns: Include key source files, test files, linting configs
- If about specific files: The user may mention files — include those plus their imports/dependencies
If the user's prompt references specific files or directories, prioritize those.
Read the files using the Read tool. Concatenate all gathered context into a single block.
Once you have gathered all relevant context, write it to a temporary file and invoke the consult script.
Create a file at /tmp/consult-input-{timestamp}.md with this format:
# Consultation Request
## Project Summary
{Write a 3-8 sentence summary of the project based on the context files you read. Cover: what the project is, what it does, who it's for, key tech choices, and current state. This gives the consultant essential context before reading the detailed files.}
## User's Question
{the user's prompt from $ARGUMENTS}
## Project Context
### {filename1}
{file contents}
### {filename2}
{file contents}
... (all gathered files)
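The timestamped temp file described above could be created with a short shell sketch like the following (the timestamp format and heredoc contents are illustrative, not prescribed by this command):

```shell
# Build a unique timestamped path for the consultation input file
ts=$(date +%Y%m%d-%H%M%S)
input_file="/tmp/consult-input-${ts}.md"

# Write the assembled context (placeholder body shown here)
cat > "$input_file" <<'EOF'
# Consultation Request
## Project Summary
...
EOF

# Print the path so it can be passed to the consult script
echo "$input_file"
```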
Then run:
uv run ~/.claude/skills/consult/consult.py /tmp/consult-input-{timestamp}.md

This submits the consultation in background mode and returns immediately.
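For context, the submission step inside consult.py might look roughly like this minimal sketch. It assumes the official `openai` Python SDK's Responses API with its `background=True` option; the model name `gpt-5.2-pro` and the idea of returning the response id for later polling by /consult-results are assumptions, not confirmed details of the actual script:

```python
from pathlib import Path


def build_request(input_path: str, model: str = "gpt-5.2-pro") -> dict:
    """Assemble kwargs for a background Responses API call (sketch).

    The model name is an assumption; swap in whatever the script uses.
    """
    prompt = Path(input_path).read_text()
    return {
        "model": model,
        "input": prompt,
        "background": True,  # return immediately; the result is polled later
    }


def submit(input_path: str) -> str:
    """Submit the consultation and return the response id (hypothetical)."""
    from openai import OpenAI  # requires OPENAI_API_KEY in the environment

    client = OpenAI()
    resp = client.responses.create(**build_request(input_path))
    return resp.id  # /consult-results would poll this id for completion
```

Because the call is made with `background=True`, the API returns as soon as the request is queued, which is what lets the user keep working until the notification arrives.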
Tell the user:
- What context you gathered (list the files)
- That the consultation has been submitted to GPT-5.2 Pro
- That they can continue working and run /consult-results when they see the notification that the response has arrived