azure-proxy

Enable Azure OpenAI integration with OpenClaw

Azure OpenAI Proxy for OpenClaw

A lightweight Node.js proxy that bridges Azure OpenAI with OpenClaw.

The Problem

OpenClaw constructs API URLs like this:

const endpoint = `${baseUrl}/chat/completions`;

Azure OpenAI requires:

https://{resource}.openai.azure.com/openai/deployments/{model}/chat/completions?api-version=2025-01-01-preview

Baking the api-version query string into the baseUrl does not work either: OpenClaw appends /chat/completions after the query string and produces a broken URL.
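
The proxy's job is essentially to rebuild that path. A minimal sketch of the rewrite, using the environment variables from the setup below (toAzureUrl is illustrative, not the actual code in scripts/server.js):

// Illustrative only: rebuild the Azure URL from the path OpenClaw sends.
const endpoint   = process.env.AZURE_OPENAI_ENDPOINT;             // your-resource.openai.azure.com
const deployment = process.env.AZURE_OPENAI_DEPLOYMENT || "gpt-4o";
const apiVersion = process.env.AZURE_OPENAI_API_VERSION || "2025-01-01-preview";

function toAzureUrl(incomingPath) {
  return `https://${endpoint}/openai/deployments/${deployment}${incomingPath}?api-version=${apiVersion}`;
}

// toAzureUrl("/chat/completions")
// => https://your-resource.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2025-01-01-preview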

Quick Setup

1. Configure and Run the Proxy

# Set your Azure details
export AZURE_OPENAI_ENDPOINT="your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="gpt-4o"
export AZURE_OPENAI_API_VERSION="2025-01-01-preview"

# Run the proxy
node scripts/server.js

2. Configure OpenClaw Provider

Add to ~/.openclaw/openclaw.json:

{
  "models": {
    "providers": {
      "azure-gpt4o": {
        "baseUrl": "http://127.0.0.1:18790",
        "apiKey": "YOUR_AZURE_API_KEY",
        "api": "openai-completions",
        "authHeader": false,
        "headers": {
          "api-key": "YOUR_AZURE_API_KEY"
        },
        "models": [
          { "id": "gpt-4o", "name": "GPT-4o (Azure)" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "models": {
        "azure-gpt4o/gpt-4o": {}
      }
    }
  }
}

Important: Set authHeader: false. Azure authenticates with an api-key header, not a Bearer token.
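
For illustration, a request through the proxy carries that header like this (Node 18+ global fetch; the AZURE_OPENAI_API_KEY variable name is only for this example, OpenClaw sends the headers block from the config above):

// Illustrative request, as OpenClaw would send it through the proxy.
fetch("http://127.0.0.1:18790/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "api-key": process.env.AZURE_OPENAI_API_KEY, // Azure-style auth, no "Authorization: Bearer ..."
  },
  body: JSON.stringify({
    model: "gpt-4o",
    messages: [{ role: "user", content: "ping" }],
  }),
})
  .then((res) => res.json())
  .then(console.log);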

3. (Optional) Use for Subagents

Make use of your Azure credits by routing automated subagent tasks through Azure:

{
  "agents": {
    "defaults": {
      "subagents": {
        "model": "azure-gpt4o/gpt-4o"
      }
    }
  }
}

Run as systemd Service

Copy the template and configure:

mkdir -p ~/.config/systemd/user
cp scripts/azure-proxy.service ~/.config/systemd/user/

# Edit the service file with your Azure details
nano ~/.config/systemd/user/azure-proxy.service

# Enable and start
systemctl --user daemon-reload
systemctl --user enable azure-proxy
systemctl --user start azure-proxy
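
The template shipped as scripts/azure-proxy.service is not reproduced here; a user unit for the proxy would look roughly like this (the ExecStart path and the Azure values are placeholders, adjust them to your checkout and resource):

[Unit]
Description=Azure OpenAI proxy for OpenClaw
After=network.target

[Service]
Environment=AZURE_OPENAI_ENDPOINT=your-resource.openai.azure.com
Environment=AZURE_OPENAI_DEPLOYMENT=gpt-4o
Environment=AZURE_OPENAI_API_VERSION=2025-01-01-preview
ExecStart=/usr/bin/node %h/azure-proxy/scripts/server.js
Restart=on-failure

[Install]
WantedBy=default.target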

Environment Variables

Variable                   Default              Description
AZURE_PROXY_PORT           18790                Local proxy port
AZURE_PROXY_BIND           127.0.0.1            Bind address
AZURE_OPENAI_ENDPOINT      (none)               Azure resource hostname
AZURE_OPENAI_DEPLOYMENT    gpt-4o               Deployment name
AZURE_OPENAI_API_VERSION   2025-01-01-preview   API version
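
As a rough sketch (again illustrative, not the actual scripts/server.js), the proxy would pick up the port and bind address with these defaults:

const http = require("http");

// Defaults mirror the table above; override via environment variables.
const PORT = Number(process.env.AZURE_PROXY_PORT || 18790);
const BIND = process.env.AZURE_PROXY_BIND || "127.0.0.1";

// Binding to 127.0.0.1 keeps the proxy (and the API key passing through it)
// unreachable from other machines.
http
  .createServer((req, res) => {
    // ... forward the incoming request to Azure OpenAI here ...
    res.end();
  })
  .listen(PORT, BIND);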

Health Check

curl http://localhost:18790/health
# {"status":"ok","deployment":"gpt-4o"}

Troubleshooting

404 Resource not found: Check that the endpoint hostname and deployment name match what the Azure Portal shows.

401 Unauthorized: The API key is wrong or expired.

Content Filter Errors: Azure applies aggressive content filtering; some prompts that work against OpenAI's API may be blocked.