@decagondev
Created February 18, 2026 18:42
Backend Audit Report -- CollabBoard MVP

Date: 2026-02-18
Scope: backend/ directory, deployment configs, database schema
Purpose: Evaluate backend readiness against the CollabBoard MVP requirements (G4 Week 1)


Architecture Summary

The backend is a Node.js 20 (ESM) Express + Socket.IO server backed by Supabase (PostgreSQL + Auth). AI commands use Anthropic Claude with tool calling. Caching is in-memory (no external Redis dependency).

backend/
├── src/
│   ├── config/
│   │   ├── supabase.js        ← Supabase client + all DB operations
│   │   └── redis.js           ← In-memory cache (Redis removed)
│   ├── middleware/
│   │   ├── auth.js            ← JWT auth, board access, RBAC
│   │   └── errorHandler.js    ← AppError class, asyncHandler
│   ├── routes/
│   │   ├── auth.js            ← Register, verify, get current user
│   │   └── boards.js          ← CRUD boards, collaborators, objects
│   ├── services/
│   │   └── aiAgent.js         ← Anthropic Claude tool-calling agent
│   ├── utils/
│   │   └── logger.js          ← Winston logger
│   ├── websocket/
│   │   └── socketHandler.js   ← Socket.IO event handlers
│   └── server.js              ← Entry point, Express + Socket.IO setup
├── Dockerfile                 ← Multi-stage production build
├── .dockerignore
├── .env.example
└── package.json

MVP Requirements Checklist

| MVP Requirement | Backend Support | Status |
| --- | --- | --- |
| Infinite board with pan/zoom | Frontend concern; backend stores/syncs objects | N/A |
| Sticky notes with editable text | Object CRUD via WebSocket + REST | PARTIAL (see Issue 1) |
| At least one shape type | Rectangle, circle, arrow, text supported | PASS |
| Create, move, edit objects | create_object, update_object, delete_object events | PASS |
| Real-time sync between 2+ users | Socket.IO rooms per board, broadcasts on mutation | PASS |
| Multiplayer cursors with name labels | cursor_move / cursor_moved events with userId + email | PASS |
| Presence awareness | user_joined, user_left, active_users events | PASS |
| User authentication | Supabase Auth with JWT verification | PASS |
| Deployed and publicly accessible | Dockerfile + render.yaml present | READY (see Issues) |

What Is Working Well

1. Authentication and Authorization

Files: src/middleware/auth.js, src/config/supabase.js (lines 273-285)

The auth middleware chain is well-structured with four layers:

  • authenticate -- requires valid JWT
  • optionalAuth -- attaches user if token present, continues either way
  • checkBoardAccess -- verifies ownership, collaboration, or public board
  • requireRole -- enforces a minimum role level (viewer < editor < admin < owner)

Token verification uses supabase.auth.getUser(token) which is the correct server-side approach. The frontend auth context matches (Supabase Auth with email/password and Google OAuth).
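The role hierarchy behind requireRole can be sketched as a middleware factory. This is an illustrative sketch, not the actual implementation in src/middleware/auth.js: it assumes checkBoardAccess has already attached the caller's role as req.boardRole, and the property name may differ in the real code.

```javascript
// Sketch of a minimum-role middleware factory. Assumes an earlier
// middleware (e.g. checkBoardAccess) has set req.boardRole; the
// property name here is an assumption, not the audited code's API.
const ROLE_LEVELS = { viewer: 0, editor: 1, admin: 2, owner: 3 };

function requireRole(minRole) {
  return (req, res, next) => {
    const level = ROLE_LEVELS[req.boardRole];
    // Unknown or insufficient role -> 403; otherwise continue the chain.
    if (level === undefined || level < ROLE_LEVELS[minRole]) {
      return res.status(403).json({ error: 'Insufficient role' });
    }
    next();
  };
}
```

A route would then stack it after the earlier layers, e.g. `router.delete('/:id', authenticate, checkBoardAccess, requireRole('owner'), handler)`.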

2. WebSocket Real-Time Infrastructure

File: src/websocket/socketHandler.js

All core real-time events are implemented:

  • Board join/leave with room management
  • Object CRUD with broadcast to all room members
  • Cursor position broadcasting
  • Active user tracking with counts
  • AI command handling with response broadcast
  • Disconnect cleanup (removes user from tracking, notifies room)

Socket authentication extracts the JWT from handshake auth or headers and verifies it. Unauthenticated sockets can still connect but are blocked from AI features.
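The token extraction described above boils down to checking two places on the handshake. A minimal sketch (the real logic lives in src/websocket/socketHandler.js and may differ in detail):

```javascript
// Sketch: pull a JWT out of a Socket.IO handshake. Clients can pass
// { auth: { token } } at connect time, or a standard
// Authorization: Bearer <jwt> header. Illustrative, not the audited code.
function extractSocketToken(handshake) {
  if (handshake.auth && handshake.auth.token) return handshake.auth.token;
  const header = handshake.headers && handshake.headers.authorization;
  if (header && header.startsWith('Bearer ')) return header.slice('Bearer '.length);
  return null; // unauthenticated socket: allowed to connect, blocked from AI
}
```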

3. AI Agent

File: src/services/aiAgent.js

  • Uses Anthropic tool calling with 7 defined tools
  • AI-created objects are persisted to Supabase and broadcast via Socket.IO
  • Locking prevents concurrent AI operations on the same board
  • AI commands are logged to the ai_commands table with execution time
  • Meets the 6+ command types requirement from the spec

4. Database Schema

File: supabase-schema.sql

Comprehensive and well-designed:

  • 5 tables: boards, board_objects, board_collaborators, ai_commands, users
  • Proper foreign keys with ON DELETE CASCADE
  • Performance indexes on all lookup columns
  • RLS policies for every table with appropriate access rules
  • updated_at triggers
  • Auto user-profile creation trigger on signup

5. Dockerfile

File: backend/Dockerfile

Production-quality:

  • Multi-stage build (builder + production)
  • node:20-alpine base
  • Production-only dependencies (npm ci --only=production)
  • Non-root user (nodejs:1001)
  • Health check configured
  • Exposes port 8080

6. Error Handling

File: src/middleware/errorHandler.js

  • Custom AppError class with status codes and operational flag
  • asyncHandler wrapper catches rejected promises and forwards to error middleware
  • Structured error logging with different severity for 4xx vs 5xx
  • Stack trace only exposed in development mode
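The asyncHandler pattern is small enough to show in full. This is the standard shape of the idiom; the file's actual implementation may vary slightly:

```javascript
// Wrap an async route handler so a rejected promise is forwarded to
// Express's error middleware via next(err) instead of being swallowed.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);
```

Without the wrapper, a `throw new AppError(...)` inside an `async` handler would produce an unhandled rejection rather than a clean error response.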

Critical Bugs

BUG 1: GET /api/boards uses wrong database API

File: src/routes/boards.js, lines 36-73
Severity: CRASH -- this route will throw a runtime error

The "list all boards for current user" route uses a database API that does not exist on the Supabase client:

```javascript
const ownedBoards = await supabaseConfig.db
  .collection('boards')
  .where('ownerId', '==', userId)
  .orderBy('updatedAt', 'desc')
  .get();
```

supabaseConfig has no .db property, no .collection() method, and no .where() chaining. This syntax belongs to a different database SDK (it is Firestore-style), so the route throws as soon as it is hit.

What to do: Rewrite this route to use the Supabase query builder pattern that is already used everywhere else in the codebase. Look at how getBoard(), getObjects(), and other methods in src/config/supabase.js work -- they all use this.adminClient.from('table').select('*').eq('column', value). You likely need to:

  1. Add a getBoardsByOwner(userId) method to SupabaseConfig in src/config/supabase.js
  2. Add a getBoardsByCollaborator(userId) method to the same file
  3. Replace the Firestore-style code in the route with calls to these new methods
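Step 1 might look like the following sketch. The client is injected here so the query shape is the focus; the column names owner_id and updated_at are assumptions -- check supabase-schema.sql for the actual names.

```javascript
// Sketch of getBoardsByOwner using the Supabase query-builder pattern
// already used elsewhere in src/config/supabase.js. Column names
// (owner_id, updated_at) are assumptions; verify against the schema.
async function getBoardsByOwner(client, userId) {
  const { data, error } = await client
    .from('boards')
    .select('*')
    .eq('owner_id', userId)
    .order('updated_at', { ascending: false });
  if (error) throw error;
  return data;
}
```

In the real class this would be a method calling `this.adminClient` instead of taking `client` as a parameter, matching getBoard() and getObjects().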

BUG 2: docker-compose.yml references wrong environment variables

File: docker-compose.yml (root)
Severity: BROKEN -- local Docker development will not work

The compose file sets environment variables that the backend code does not read. The backend expects SUPABASE_URL, SUPABASE_ANON_KEY, and SUPABASE_SERVICE_ROLE_KEY (see src/config/supabase.js lines 13-15 and .env.example).

What to do: Update the environment block in docker-compose.yml for both services to match what .env.example defines. The backend needs Supabase vars; the frontend needs VITE_SUPABASE_URL and VITE_SUPABASE_ANON_KEY.
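A sketch of the corrected environment blocks, assuming the compose file names its services backend and frontend (match these to the actual docker-compose.yml):

```yaml
# Sketch only -- service names are illustrative.
services:
  backend:
    environment:
      - SUPABASE_URL=${SUPABASE_URL}
      - SUPABASE_ANON_KEY=${SUPABASE_ANON_KEY}
      - SUPABASE_SERVICE_ROLE_KEY=${SUPABASE_SERVICE_ROLE_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
  frontend:
    environment:
      - VITE_SUPABASE_URL=${VITE_SUPABASE_URL}
      - VITE_SUPABASE_ANON_KEY=${VITE_SUPABASE_ANON_KEY}
```

Keeping the values as `${VAR}` interpolation lets a local .env file (based on .env.example) supply the secrets without committing them.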

BUG 3: Stale deployment config in infrastructure/render/render.yaml

File: infrastructure/render/render.yaml
Severity: DEPLOYMENT TRAP -- easy to deploy from the wrong config file

There are two render.yaml files:

  • render.yaml (root) -- correct, uses Supabase env vars
  • infrastructure/render/render.yaml -- stale, references wrong env vars and provisions a Redis service the code no longer uses

What to do: Delete infrastructure/render/render.yaml or update it to match the root render.yaml. Render auto-detects render.yaml at the repo root, so the root file is the one that matters.


Issues That Need Attention

ISSUE 1: No sticky note type in AI agent tools

File: src/services/aiAgent.js
Impact: The MVP spec requires "Sticky notes with editable text" and its tool schema lists createStickyNote(text, x, y, color) as a minimum tool.

The AI agent defines create_rectangle, create_circle, create_text, and create_arrow -- but no create_sticky_note. Sticky notes are a core MVP deliverable.

What to do: Add a create_sticky_note tool to BOARD_TOOLS with parameters text, x, y, color, and width/height. In executeToolCall, map it to a createShape('sticky_note', ...) call. The frontend will need a corresponding renderer for the sticky_note object type.
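The new tool definition would follow the shape of the existing BOARD_TOOLS entries. A sketch (exact property names and defaults in aiAgent.js may differ):

```javascript
// Sketch of an Anthropic tool definition for sticky notes, mirroring
// the input_schema pattern of the existing create_rectangle et al.
const createStickyNoteTool = {
  name: 'create_sticky_note',
  description: 'Create a sticky note with editable text on the board',
  input_schema: {
    type: 'object',
    properties: {
      text: { type: 'string', description: 'Note contents' },
      x: { type: 'number', description: 'X position on the board' },
      y: { type: 'number', description: 'Y position on the board' },
      color: { type: 'string', description: 'e.g. "yellow" or a hex value' },
      width: { type: 'number' },
      height: { type: 'number' },
    },
    required: ['text', 'x', 'y'],
  },
};
```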

ISSUE 2: AI agent does not handle multi-turn tool use

File: src/services/aiAgent.js, processToolCalls function (line 176)
Impact: Complex commands like "Create a SWOT analysis" will only partially execute.

When the Anthropic API returns a response with stop_reason: "tool_use", the caller is expected to send tool results back in a follow-up message so the model can continue. The current code processes the first batch of tool calls but never sends results back. This means:

  • Single-step commands (create one rectangle) work fine
  • Multi-step commands (create a 4-quadrant layout) will only create the first batch of objects

What to do: After processing tool calls, check response.stop_reason. If it is "tool_use", build a new messages array that includes the assistant response and tool results, then call anthropic.messages.create(...) again. Repeat until stop_reason is "end_turn". Be careful to set a maximum iteration count (e.g., 10) to prevent infinite loops.
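The loop described above can be sketched as follows. The client, model name, tool list, and executeToolCall are injected so the control flow is the focus; reuse the ones already configured in aiAgent.js. Message shapes follow the Anthropic Messages API.

```javascript
// Sketch of a multi-turn tool-use loop. Keeps calling the model with
// tool_result blocks until it stops asking for tools, capped at maxTurns.
async function runAgentLoop({ anthropic, model, tools, executeToolCall }, userPrompt, maxTurns = 10) {
  const messages = [{ role: 'user', content: userPrompt }];
  for (let turn = 0; turn < maxTurns; turn++) {
    const response = await anthropic.messages.create({ model, max_tokens: 1024, tools, messages });
    if (response.stop_reason !== 'tool_use') return response; // e.g. "end_turn"
    // Echo the assistant turn, then answer each tool_use block with a tool_result.
    messages.push({ role: 'assistant', content: response.content });
    const results = [];
    for (const block of response.content) {
      if (block.type !== 'tool_use') continue;
      const result = await executeToolCall(block.name, block.input);
      results.push({ type: 'tool_result', tool_use_id: block.id, content: JSON.stringify(result) });
    }
    messages.push({ role: 'user', content: results });
  }
  throw new Error('Agent exceeded max tool-use turns');
}
```

The cap matters: a model stuck requesting tools would otherwise loop (and bill) forever.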

ISSUE 3: Logger writes to filesystem on platforms with ephemeral storage

File: src/utils/logger.js
Impact: On Railway and Render, log files are lost on every deploy and may cause write errors.

The logger only switches to stdout-only when K_SERVICE is set (Google Cloud Run). On Railway or Render, it will try to write to logs/error.log, logs/combined.log, etc.

What to do: Expand the production check at line 57 to include Railway and Render:

```javascript
if (process.env.NODE_ENV === 'production') {
  // Use stdout-only on any cloud platform
  logger.clear();
  logger.add(new winston.transports.Console({ ... }));
}
```

ISSUE 4: joi validation is declared but never used

File: package.json (dependency), all route files
Impact: No input validation on request bodies beyond basic truthy checks.

joi is listed as a production dependency but is never imported in any file. Route handlers like POST /api/auth/register only check if (!email || !password) without validating email format, password strength, or field types.

What to do: Either add Joi validation schemas to the routes (recommended) or remove joi from package.json to reduce bundle size. For MVP, at minimum validate email format and password length on registration.
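Until Joi schemas are wired in, even a hand-rolled check covers the minimum. A stopgap sketch mirroring what a Joi schema would enforce on POST /api/auth/register (the field names come from the route; the regex and length threshold are illustrative choices):

```javascript
// Minimal registration validation -- a stopgap, not a Joi replacement.
// The email regex is deliberately loose (format sanity, not RFC 5322).
function validateRegistration({ email, password } = {}) {
  const errors = [];
  if (typeof email !== 'string' || !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) {
    errors.push('email must be a valid email address');
  }
  if (typeof password !== 'string' || password.length < 8) {
    errors.push('password must be at least 8 characters');
  }
  return { valid: errors.length === 0, errors };
}
```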

ISSUE 5: In-memory cache limitations

File: src/config/redis.js
Impact: Acceptable for single-instance MVP deployment; will not work for horizontal scaling.

The InMemoryCache class replaces Redis. This means:

  • Cache is lost on restart/redeploy
  • AI command locking only works within a single process
  • If you ever scale to multiple instances, concurrent AI commands on the same board could conflict

What to do: For MVP on a single Render/Railway instance, this is fine. Document the limitation. If you scale later, add Redis (Render offers a free Redis plan, Railway has Redis as a plugin).
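For context, the single-process locking described above reduces to something like this sketch (illustrative; the actual InMemoryCache in src/config/redis.js will differ):

```javascript
// Sketch of a TTL-based in-process lock. The Map lives in one process,
// which is exactly why it cannot coordinate multiple instances.
class InMemoryLock {
  constructor() {
    this.locks = new Map(); // key -> expiry timestamp (ms)
  }
  acquire(key, ttlMs = 30000) {
    const now = Date.now();
    const expiry = this.locks.get(key);
    if (expiry !== undefined && expiry > now) return false; // still held
    this.locks.set(key, now + ttlMs);
    return true;
  }
  release(key) {
    this.locks.delete(key);
  }
}
```

Swapping this for Redis later means replacing the Map with `SET key value NX PX ttl`, which gives the same semantics across instances.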

ISSUE 6: No tests

Impact: No automated verification of any route or service. jest and supertest are in devDependencies and a test script exists, but there are zero test files.

What to do: At minimum, add integration tests for the health endpoint, auth routes, and board CRUD. The supertest dependency is already available. Even a small test suite demonstrates code quality.

ISSUE 7: Missing createFrame and createConnector AI tools

File: src/services/aiAgent.js
Impact: The spec's "Tool Schema (Minimum)" section lists createFrame(title, x, y, width, height) and createConnector(fromId, toId, style). These are missing.

What to do: Add both tools following the same pattern as existing tools. Frames and connectors are explicitly in the minimum tool schema in the spec.
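For example, the frame tool would look like this sketch, following the same input_schema pattern as the existing tools (property names in aiAgent.js may differ):

```javascript
// Sketch of the createFrame tool definition; the connector tool
// follows the same pattern with fromId, toId, and style inputs.
const createFrameTool = {
  name: 'create_frame',
  description: 'Create a titled frame (container region) on the board',
  input_schema: {
    type: 'object',
    properties: {
      title: { type: 'string' },
      x: { type: 'number' },
      y: { type: 'number' },
      width: { type: 'number' },
      height: { type: 'number' },
    },
    required: ['title', 'x', 'y', 'width', 'height'],
  },
};
```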


Deployment Readiness

Render (Recommended -- simplest path)

The root render.yaml is correctly configured for Docker deployment with Supabase. To deploy:

  1. Fix BUG 1 (boards route) -- the app will crash on this route otherwise
  2. Fix ISSUE 3 (logger) -- or the app may error on missing log directory
  3. Set these env vars in the Render dashboard:
    • SUPABASE_URL
    • SUPABASE_ANON_KEY
    • SUPABASE_SERVICE_ROLE_KEY
    • ANTHROPIC_API_KEY
    • ALLOWED_ORIGINS (set to your deployed frontend URL)
  4. Connect your GitHub repo and deploy

WebSocket note: Render free tier supports WebSockets. The Socket.IO config in server.js (25s ping interval) will keep connections alive. Render has a 15-minute idle timeout, but active WebSocket pings prevent that.

Railway (Alternative)

The infrastructure/railway/railway.toml needs the dockerfilePath adjusted to backend/Dockerfile (relative to repo root), or configure Railway to use the backend/ directory as the root. Env vars must be set in the Railway dashboard.

Docker Compose (Local Development)

Currently broken due to BUG 2. After fixing the environment variables, Docker Compose provides a good local dev experience with hot reload via volume mounts.


Priority Summary

Must Fix Before Deployment (Blockers)

  1. BUG 1: Rewrite GET /api/boards to use Supabase queries
  2. ISSUE 3: Fix logger for non-GCP cloud platforms
  3. BUG 3: Remove stale render config to avoid deployment confusion

Should Fix for MVP Compliance

  1. ISSUE 1: Add createStickyNote AI tool (core MVP requirement)
  2. ISSUE 2: Add multi-turn AI tool use for complex commands
  3. ISSUE 7: Add createFrame and createConnector AI tools

Nice to Have

  1. BUG 2: Fix docker-compose.yml for local dev
  2. ISSUE 4: Add Joi validation or remove unused dependency
  3. ISSUE 6: Add basic test coverage
  4. ISSUE 5: Document in-memory cache limitation

Code Quality Notes

Strengths:

  • Clean modular structure (config, middleware, routes, services, utils, websocket)
  • Consistent error handling with asyncHandler and AppError
  • Role-based access control is well-implemented
  • Supabase RLS policies add a second layer of security
  • Dockerfile follows production best practices
  • Winston logger is well-configured (aside from the platform detection issue)

Areas for improvement:

  • Missing JSDoc/documentation on exported functions
  • No input validation middleware despite having joi installed
  • Single-line comments used in several files instead of structured documentation
  • No test coverage