
@antenore
Last active January 14, 2026 14:02
Configure OpenAI Codex CLI to use DeepSeek models

Configure OpenAI Codex CLI with DeepSeek Support

⚠️ No Longer Working (January 2026): This configuration worked when it was originally created, but it is now broken because Codex is deprecating the Chat Completions API (wire_api = "chat") in favor of OpenAI's Responses API. Since DeepSeek only supports Chat Completions, the integration no longer works reliably; tool calls fail with message-format errors.

What Happened

  • Before: DeepSeek worked with Codex using wire_api = "chat"
  • Now: Codex is deprecating wire_api = "chat", and the code path has bugs that won't be fixed
  • Result: Tool calls fail with errors like "insufficient tool messages following tool_calls message" (see the request sketch below for the wire format DeepSeek does accept)
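
A minimal sketch of the only wire format DeepSeek exposes, the OpenAI-compatible Chat Completions endpoint (the prompt below is arbitrary):

# Chat Completions request against DeepSeek's OpenAI-compatible endpoint; this is
# the wire format behind wire_api = "chat", the path Codex is removing.
curl https://api.deepseek.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Say hello"}]
  }'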

Original Configuration (was working)

~/.codex/config.toml

# Default model - OpenAI
model = "gpt-5.1-codex"

# DeepSeek provider definitions
[model_providers.deepseek]
name = "DeepSeek"
base_url = "https://api.deepseek.com/v1"
env_key = "DEEPSEEK_API_KEY"
wire_api = "chat"

[model_providers.deepseek-reasoner]
name = "DeepSeek Reasoner"
base_url = "https://api.deepseek.com/v1"
env_key = "DEEPSEEK_API_KEY"
wire_api = "chat"

# Profile for DeepSeek Chat (V3.2 non-thinking)
[profiles.deepseek]
model = "deepseek-chat"
model_provider = "deepseek"

# Profile for DeepSeek Reasoner (V3.2 thinking mode)
[profiles.deepseek-r1]
model = "deepseek-reasoner"
model_provider = "deepseek-reasoner"

Usage (when it was working)

# Default - uses OpenAI gpt-5.1-codex
codex

# Switch to DeepSeek V3.2 Chat
codex --profile deepseek
codex -p deepseek

# Switch to DeepSeek V3.2 Reasoner
codex --profile deepseek-r1
codex -p deepseek-r1

Setup

  1. Set your DeepSeek API key in the current shell (a quick key check follows this list):
export DEEPSEEK_API_KEY="your-deepseek-api-key"
  2. Add the same export to your shell profile (~/.bashrc, ~/.zshrc, etc.) so it persists:
export DEEPSEEK_API_KEY="your-deepseek-api-key"
  3. Place the config in ~/.codex/config.toml
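
Optionally, verify the key before launching Codex. This sketch assumes DeepSeek's OpenAI-compatible model-listing route, as described in its API docs:

# Lists available models; a JSON response naming deepseek-chat and deepseek-reasoner
# means the key is exported and valid (endpoint assumed from DeepSeek's OpenAI-compatible API).
curl -s https://api.deepseek.com/v1/models \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY"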

Current Errors (January 2026)

# Deprecation warning
Support for the "chat" wire API is deprecated and will soon be removed.

# Tool call error
An assistant message with 'tool_calls' must be followed by tool messages 
responding to each 'tool_call_id'. (insufficient tool messages following 
tool_calls message)

# Reasoner error
Missing `reasoning_content` field in the assistant message
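
For context on the tool-call error, this is the ordering DeepSeek's Chat Completions API expects when a tool call is replayed: every assistant message carrying tool_calls must be followed by one role "tool" message per tool_call_id. The sketch below is illustrative only; the get_weather tool and its payload are made up and not part of Codex.

# Illustrative only: the tool-call replay shape DeepSeek's Chat Completions API expects.
# The get_weather tool and its arguments are hypothetical.
curl https://api.deepseek.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
    "model": "deepseek-chat",
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {"city": {"type": "string"}},
          "required": ["city"]
        }
      }
    }],
    "messages": [
      {"role": "user", "content": "What is the weather in Paris?"},
      {"role": "assistant", "content": null, "tool_calls": [{
        "id": "call_0",
        "type": "function",
        "function": {"name": "get_weather", "arguments": "{\"city\": \"Paris\"}"}
      }]},
      {"role": "tool", "tool_call_id": "call_0", "content": "18C, sunny"}
    ]
  }'

The first two errors suggest Codex's deprecated chat path no longer sends those trailing role "tool" messages in every case; the third suggests that for deepseek-reasoner the replayed assistant message must additionally carry its reasoning_content, per the thinking-mode documentation DeepSeek references in the full error message.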

Alternatives

Since Codex + DeepSeek no longer works, consider:

  • OpenAI models with Codex - fully supported
  • Aider - native DeepSeek support (see the sketch below)
  • Continue - native DeepSeek support
  • DeepSeek API directly - full control
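
For the Aider route, a rough sketch; the deepseek/ model prefix follows aider's litellm-style naming and should be double-checked against aider's documentation for your version:

# Assumes aider is already installed and reads DEEPSEEK_API_KEY from the environment
# (model identifiers taken from aider's docs; verify for your aider version).
export DEEPSEEK_API_KEY="your-deepseek-api-key"
aider --model deepseek/deepseek-chat        # V3.2 non-thinking
aider --model deepseek/deepseek-reasoner    # thinking mode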

References

@andrewbell73

Looks like many of us are seeing the DeepSeek API down — I’ve experienced the same. Maybe we can follow a systematic troubleshooting guide whenever the API goes down:

  1. First check the official status page to confirm whether the downtime is server-side or just on our end.
  2. Verify your API key: ensure it's active, correctly typed, and hasn't expired.
  3. Check your network: unstable internet, VPN, firewall, or DNS issues can also cause failures.
  4. Review the request payload: invalid JSON, wrong parameters, or overly large requests may cause errors.

If everything above looks correct but it’s still down, wait a while — sometimes the service may be overloaded or under maintenance (others have reported similar outages).

Optionally, implement fallback or retry logic in your code so your application can handle temporary downtime gracefully, or follow the step-by-step instructions in a blog post covering DeepSeek API downtime.

If others have additional tips — logging, rate‑limit checks or alternate endpoints — sharing them here might help the community avoid hours of frustration 🙂

@antenore
Author


Nice, thanks for pointing this out...
I didn't check whether this is possible; can you propose a patch to the gist?

Otherwise I'll check later

@megtog

megtog commented Dec 18, 2025

seeing this error

{"error":{"message":"Missing `reasoning_content` field in the assistant message at message index 7. For more information, please refer to https://
api-docs.deepseek.com/guides/thinking_mode#tool-calls","type":"invalid_request_error","param":null,"code":"invalid_request_error"}}

@CharlieBytesX

Same as @megtog

@yyy1mu

yyy1mu commented Jan 4, 2026

Same as @megtog. So which one actually works?

@antenore
Author

I've updated the gist @megtog @CharlieBytesX @yyy1mu
There was a deprecation on the OpenAI side; let me know if it works for you.

Also, now the config is cleaner

@antenore
Author

My bad @megtog @CharlieBytesX @yyy1mu
It seems Codex is no longer compatible with DeepSeek... Maybe we should file a bug with OpenAI.
