/implement command
Implement the spec at: $ARGUMENTS
You are the orchestrator for this implementation. Your role is to:
- Hold global context across the entire implementation
- Break down the spec into logical, sequential steps
- Delegate each step to a builder subagent
- Track progress and ensure quality gates pass
Before delegating, ground the work:
- Read the spec file completely
- Identify all implementation phases/tasks from the spec
- Create a TodoWrite checklist with one entry per logical step
- Order tasks by dependencies (what must be built first)
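Ordering tasks by dependencies is a topological sort. A minimal sketch using Python's standard library (the task names below are hypothetical, for illustration only):

```python
from graphlib import TopologicalSorter

# Hypothetical task graph: each task maps to the tasks it depends on.
tasks = {
    "schema migration": [],
    "service layer": ["schema migration"],
    "RPC handlers": ["service layer"],
    "frontend wiring": ["RPC handlers"],
}

# static_order() yields each task only after its dependencies,
# giving the build order for the TodoWrite checklist.
order = list(TopologicalSorter(tasks).static_order())
print(order)
```

For a linear chain like this, the order is simply the chain itself; the sort matters once the spec contains independent tasks that can be sequenced in any valid order.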
For each todo item, execute this delegation cycle:
Spawn a builder agent (opus model) with this structured prompt:
## Task
Implement: [SPECIFIC TASK DESCRIPTION]
## Context
Spec: [FULL PATH TO SPEC FILE]
Related files: [list 3-5 key files the builder should read first]
Patterns to follow: [reference to similar existing code, e.g., "see services/controls_service.go"]
## Phase 1: Clarification (REQUIRED)
Before writing ANY code, formulate clarifying questions about this task.
Use the codex-cli skill to get answers:
/codex-cli
I am implementing [SPECIFIC TASK] from the spec at: [PATH]
I have the following clarifying questions:
1. [Question about requirements]
2. [Question about edge cases]
3. [Question about integration points]
Do not make any modifications to the codebase.
Incorporate the answers into your implementation approach.
## Phase 2: Implementation
After clarification:
1. Read the relevant files to understand existing patterns
2. Implement the feature following Cable patterns
3. Write tests for new functionality
4. Run validation: `just codegen && go build ./server/...`
## Phase 3: Self-Review with Codex
Before declaring completion, validate your work:
/codex-cli
I have implemented [SPECIFIC TASK] from the spec at: [PATH]
Please review my implementation for:
1. Alignment with the spec requirements
2. Organization hierarchy rules (if touching data access)
3. Missing edge cases
4. Test coverage gaps
Files changed: [list files you modified]
Do not make any modifications to the codebase.
If codex identifies issues, fix them and re-validate.
## Requirements
- Follow existing code patterns in the codebase
- Write tests for new functionality
- Do NOT refactor unrelated code
- Organization isolation rules must be followed
- Use PrepareConnectError for error handling
## Completion
When done, report:
1. What was implemented
2. Files changed
3. Tests added
4. Any concerns or follow-up items
After the builder completes, run validation:
# Always run
just codegen && go build ./server/...
# If backend changes
go test -race ./server/...
# If frontend changes
just lint-frontend && just typecheck-frontend && just test-frontend

If validation fails, return the task to the builder with the error context.
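The validation gate above can be sketched as a small helper that runs each check in sequence and, on the first failure, captures the output to hand back to the builder (the gate mechanics here are illustrative, not prescribed by this workflow):

```python
import subprocess

# Check commands from this workflow; the conditional ones are
# included only when the step touched backend or frontend code.
ALWAYS = ["just codegen && go build ./server/..."]
BACKEND = ["go test -race ./server/..."]
FRONTEND = ["just lint-frontend && just typecheck-frontend && just test-frontend"]

def run_gate(commands):
    """Run each check; return (passed, error_context_for_builder)."""
    for cmd in commands:
        result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        if result.returncode != 0:
            return False, f"`{cmd}` failed:\n{result.stdout}{result.stderr}"
    return True, ""
```

On failure, the second element of the tuple is exactly the error context to include when the task is returned to the builder.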
For tasks that touch security, data access, or core business logic, spawn a reviewer agent:
## Review Task
Review the implementation of [SPECIFIC TASK] against spec at [PATH].
## Pre-Review Clarification
Use codex-cli to clarify any spec ambiguities before reviewing:
/codex-cli
I am reviewing the implementation of [SPECIFIC TASK] from spec at: [PATH]
Questions about requirements:
1. [Any unclear acceptance criteria]
2. [Boundary conditions]
Do not make any modifications to the codebase.
## Review Checklist
- [ ] Implementation matches spec requirements
- [ ] Organization isolation rules followed (parent→child only)
- [ ] Tests cover new functionality and edge cases
- [ ] Error handling uses PrepareConnectError
- [ ] No unrelated changes included
- [ ] RBAC permissions checked where needed
## Validation
Use codex-cli to validate architectural decisions:
/codex-cli
Reviewing [TASK] implementation. Validating:
1. [Specific architectural choice made]
2. [Data access pattern used]
Is this aligned with the spec and Cable patterns?
Do not make any modifications to the codebase.
## Output
Report: APPROVED, CHANGES_REQUESTED, or BLOCKED
If changes needed, specify exactly what must be fixed.
If reviewer requests changes:
- Update todo with findings
- Re-spawn builder with reviewer feedback as additional context
- Maximum 3 iterations before escalating to user with summary of blockers
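The bounded build-review loop above can be sketched as follows; `build` and `review` are stand-ins for spawning the builder and reviewer subagents (hypothetical callables, not real APIs):

```python
MAX_ITERATIONS = 3  # from the workflow: escalate after 3 failed rounds

def delegate(task, build, review):
    """Run the build -> review cycle, feeding reviewer findings back
    into the next builder attempt, up to MAX_ITERATIONS rounds."""
    feedback = ""
    for attempt in range(1, MAX_ITERATIONS + 1):
        result = build(task, feedback)
        verdict, feedback = review(result)
        if verdict == "APPROVED":
            return f"completed on attempt {attempt}"
    return "BLOCKED: escalate to user with summary of blockers"
```

The key property is that reviewer feedback becomes additional context for the next builder attempt, and the loop is bounded so a stuck task surfaces to the user instead of spinning.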
Once step passes all gates:
- Mark todo as completed
- Note what was built and any context needed for subsequent steps
- Move to next todo item
- Spawn fresh builder subagent with cumulative context
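One way to carry cumulative context is to append a short note after each completed step and splice the accumulated notes into the next builder's prompt. The prompt format below is illustrative, not prescribed:

```python
def builder_prompt(task, completed_notes):
    """Compose a builder prompt that includes what earlier steps built."""
    context = "\n".join(f"- {note}" for note in completed_notes)
    return (
        f"## Task\nImplement: {task}\n\n"
        f"## Already built in previous steps\n{context or '- (first step)'}"
    )

notes = []
# After each step passes its gates, record what was built:
notes.append("step 1: added controls table migration")
prompt = builder_prompt("service layer", notes)
```

Each fresh builder starts with no memory of its own, so these notes are the only channel through which earlier decisions reach it.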
After all todos are complete:
- Run the full validation suite:
  just codegen && go build ./server/...
  go test -race ./server/...
  just lint-frontend && just typecheck-frontend && just test-frontend
- Summarize the implementation:
  - What was built (per spec section)
  - Files created/modified
  - Tests added
  - Any deferred items or known limitations
- List any follow-up work identified during implementation
Your responsibilities as orchestrator:
- Hold global context: Remember what was built in previous steps
- Provide cumulative context: Each new builder should know what exists
- Track dependencies: If step 3 depends on step 2's output, ensure proper sequencing
- Escalate blockers: If a step fails 3 iterations, stop and ask the user
- Maintain quality: Every step must pass validation before advancing
Do NOT:
- Implement features yourself (delegate to builder)
- Skip the codex consultation phase
- Proceed if validation fails
- Allow scope creep beyond the spec
Success criteria:
- All spec requirements implemented
- Each step passed builder + reviewer validation
- Tests pass: `go test -race ./server/...`
- Lint passes: `just lint-frontend`
- Types pass: `just typecheck-frontend`
- No unrelated changes introduced