OpenClaw + GitHub Copilot GPT-5.4 Technical Fix Guide

Date: 2026-03-07

Overview

This guide documents how to make github-copilot/gpt-5.4 work inside OpenClaw when the model already works in OpenCode but fails in OpenClaw.
The final solution requires both:
a config fix in ~/.openclaw/openclaw.json
a runtime patch in the installed OpenClaw bundle
This is necessary because the problem is not only model registration. It is also an OpenClaw transport-routing issue for GitHub Copilot Responses API traffic.
HTTP 400: model "gpt-5.4" is not accessible via the /chat/completions endpoint
Symptom 5: gateway instability
gateway disconnected:closed|idle
Root Cause Analysis
There are four distinct problems.
1. Model config and allowlist mismatch
OpenClaw could see the provider, but github-copilot/gpt-5.4 was not fully wired into the active model config path used by the agent defaults.
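As a sketch, wiring the model into the active config path might look like the following fragment of ~/.openclaw/openclaw.json. The key names (providers, models, agent.defaults) are illustrative assumptions, not OpenClaw's confirmed schema; only the provider/model identifier and the api value come from this guide:

```json
{
  "providers": {
    "github-copilot": {
      "models": {
        "gpt-5.4": {
          "api": "openai-responses"
        }
      }
    }
  },
  "agent": {
    "defaults": {
      "model": "github-copilot/gpt-5.4"
    }
  }
}
```

The point of the fix is that the model must appear both in the provider's model list and in whatever defaults path the agent actually reads, not just be visible on the provider.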
2. Missing GitHub Copilot IDE headers
GitHub Copilot requires IDE-style headers for auth. OpenClaw was sending requests through a generic OpenAI-compatible path, so required headers were not included.
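A minimal sketch of what "IDE-style headers" means in practice. The Editor-Version header name appears in the error output later in this guide; the other header names and all version strings here are assumptions modeled on typical Copilot clients, so treat them as placeholders:

```javascript
// Sketch: build the IDE-style headers Copilot auth expects.
// Only "Editor-Version" is confirmed by the errors in this guide;
// the other names and all version values are illustrative assumptions.
function copilotHeaders(token) {
  return {
    "Authorization": `Bearer ${token}`,
    "Editor-Version": "vscode/1.96.0",          // hypothetical IDE identifier
    "Editor-Plugin-Version": "copilot/1.250.0", // hypothetical plugin version
    "Copilot-Integration-Id": "vscode-chat",    // hypothetical integration id
    "Content-Type": "application/json",
  };
}
```

The generic OpenAI-compatible path sends only Authorization and Content-Type, which is why Copilot's IDE auth rejects it.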
3. Wrong API mode for gpt-5.4

gpt-5.4 must use the Responses API, not /chat/completions.

So this is wrong for gpt-5.4:

"api": "openai-completions"

This is required instead:

"api": "openai-responses"
4. OpenClaw transport routing only handled openai, not github-copilot
Even after changing gpt-5.4 to openai-responses, OpenClaw still fell back to the generic stream path because its embedded runner only activated the Responses transport for provider openai.
That caused OpenClaw to keep hitting /chat/completions for GitHub Copilot GPT-5.4.
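The runtime patch, in essence, widens the provider check that gates the Responses transport. The function and variable names below are assumptions for illustration (the real patch edits OpenClaw's bundled runner in place), but the logic is the shape of the fix:

```javascript
// Sketch of the transport-routing fix. The stock runner only routed
// provider "openai" to the Responses transport; the patched set also
// includes "github-copilot", so gpt-5.4 stops falling back to the
// generic /chat/completions stream path.
const RESPONSES_PROVIDERS = new Set(["openai", "github-copilot"]); // patched set

function selectTransport(provider, apiMode) {
  if (apiMode === "openai-responses" && RESPONSES_PROVIDERS.has(provider)) {
    return "responses";        // Responses API transport
  }
  return "chat-completions";   // generic fallback transport
}
```

Before the patch the set contained only "openai", so github-copilot/gpt-5.4 requests fell through to the generic path even with "api": "openai-responses" configured.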
These problems surface as errors like:

missing Editor-Version header for IDE auth
model "gpt-5.4" is not accessible via the /chat/completions endpoint
No API provider registered for api: github-copilot
Reapply After OpenClaw Updates
Because the runtime fix patches the installed OpenClaw bundle, upgrades or reinstalls may overwrite it.
Why not switch the whole provider to api: "github-copilot"?
That looked tempting, but OpenClaw's runtime path did not have a compatible registered streaming provider for that mode in this setup, which caused runtime/provider registration failures.
Why not keep GPT-5.4 on openai-completions?
Because GitHub Copilot GPT-5.4 is not accessible on /chat/completions. It must go through the Responses API.
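For orientation, the two call shapes differ roughly as follows. The base URL and payload field names are assumptions (modeled on OpenAI-style APIs); the only thing this guide asserts is that gpt-5.4 works on the Responses endpoint and not on /chat/completions:

```javascript
// Sketch of the two request shapes. Host and body fields are assumptions;
// per this guide, gpt-5.4 only works against the Responses-style endpoint.
function buildRequest(apiMode, model, prompt) {
  const base = "https://api.githubcopilot.com"; // assumed Copilot API host
  if (apiMode === "openai-responses") {
    return { url: `${base}/responses`, body: { model, input: prompt } };
  }
  return {
    url: `${base}/chat/completions`,
    body: { model, messages: [{ role: "user", content: prompt }] },
  };
}
```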
Why did OpenCode work earlier?
OpenCode already handled the GitHub Copilot transport path correctly, including the required Copilot headers and the proper API mode, while OpenClaw needed both config and runtime fixes.
Recommended Maintenance Notes
Keep this guide alongside the reapply script, with the script path documented
After any OpenClaw upgrade, rerun the patch script
If OpenClaw changes its bundle file name, update the script path target accordingly
If GitHub Copilot changes required IDE header versions, update both the runtime patch and reapply script
Quick Recovery Commands
node ~/.openclaw/workspace/ken-patchs/reapply-openclaw-copilot-gpt54-patches.mjs
openclaw gateway restart
openclaw status
Final State
With the config fix and runtime patches in place, github-copilot/gpt-5.4 works in OpenClaw and the gateway remains stable.