azure-proxy
Enable Azure OpenAI integration with OpenClaw via a lightweight local proxy. Use when configuring Azure OpenAI as a model provider, when encountering 404 errors with Azure OpenAI in OpenClaw, or when needing to use Azure credits (e.g. Visual Studio subscription) with OpenClaw subagents. Solves the api-version query parameter issue that prevents direct Azure OpenAI integration.
Packaged view
This page reorganizes the original catalog entry around fit, installability, and workflow context first. The original raw source lives below.
Install command
npx @skill-hub/cli install openclaw-skills-azure-proxy
Repository
Skill path: skills/benediktschackenberg/azure-proxy
Best for
Primary workflow: Ship Full Stack.
Technical facets: Full Stack, Backend, Designer, Integration.
Target audience: everyone.
License: Unknown.
Original source
Catalog source: SkillHub Club.
Repository owner: openclaw.
This is still a mirrored public skill entry. Review the repository before installing into production workflows.
What it helps with
- Install azure-proxy into Claude Code, Codex CLI, Gemini CLI, or OpenCode workflows
- Review https://github.com/openclaw/skills before adding azure-proxy to shared team environments
- Use azure-proxy for development workflows
Works across
Favorites: 0.
Sub-skills: 0.
Aggregator: No.
Original source / Raw SKILL.md
---
name: azure-proxy
description: Enable Azure OpenAI integration with OpenClaw via a lightweight local proxy. Use when configuring Azure OpenAI as a model provider, when encountering 404 errors with Azure OpenAI in OpenClaw, or when needing to use Azure credits (e.g. Visual Studio subscription) with OpenClaw subagents. Solves the api-version query parameter issue that prevents direct Azure OpenAI integration.
---
# Azure OpenAI Proxy for OpenClaw
A lightweight Node.js proxy that bridges Azure OpenAI with OpenClaw.
## The Problem
OpenClaw constructs API URLs like this:
```javascript
const endpoint = `${baseUrl}/chat/completions`;
```
Azure OpenAI requires:
```
https://{resource}.openai.azure.com/openai/deployments/{model}/chat/completions?api-version=2025-01-01-preview
```
Embedding `?api-version=...` in the `baseUrl` doesn't work: OpenClaw appends `/chat/completions` after the query string, producing an invalid URL that Azure rejects with a 404.
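A minimal sketch of the failure and the fix (the resource name is illustrative):

```javascript
// What happens if you embed the query string in baseUrl: OpenClaw
// appends the path AFTER the query parameter, yielding an invalid URL.
const baseUrl = 'https://my-resource.openai.azure.com/openai/deployments/gpt-4o?api-version=2025-01-01-preview';
const endpoint = `${baseUrl}/chat/completions`;
console.log(endpoint);
// https://my-resource.openai.azure.com/openai/deployments/gpt-4o?api-version=2025-01-01-preview/chat/completions

// The proxy instead rebuilds the path the way Azure expects,
// with api-version at the very end:
const fixed = `/openai/deployments/gpt-4o/chat/completions?api-version=2025-01-01-preview`;
console.log(fixed);
// /openai/deployments/gpt-4o/chat/completions?api-version=2025-01-01-preview
```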
## Quick Setup
### 1. Configure and Run the Proxy
```bash
# Set your Azure details
export AZURE_OPENAI_ENDPOINT="your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="gpt-4o"
export AZURE_OPENAI_API_VERSION="2025-01-01-preview"
# Run the proxy
node scripts/server.js
```
### 2. Configure OpenClaw Provider
Add to `~/.openclaw/openclaw.json`:
```json
{
  "models": {
    "providers": {
      "azure-gpt4o": {
        "baseUrl": "http://127.0.0.1:18790",
        "apiKey": "YOUR_AZURE_API_KEY",
        "api": "openai-completions",
        "authHeader": false,
        "headers": {
          "api-key": "YOUR_AZURE_API_KEY"
        },
        "models": [
          { "id": "gpt-4o", "name": "GPT-4o (Azure)" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "models": {
        "azure-gpt4o/gpt-4o": {}
      }
    }
  }
}
```
**Important:** Set `authHeader: false` — Azure uses `api-key` header, not Bearer tokens.
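For illustration, this mirrors the normalization the proxy applies in `scripts/server.js`: it accepts the key from either header form and always forwards it to Azure as `api-key`.

```javascript
// Accept Azure's api-key header or an OpenAI-style Bearer token,
// and normalize to the api-key form that Azure requires.
function extractKey(headers) {
  return headers['api-key'] || headers['authorization']?.replace('Bearer ', '') || '';
}

console.log(extractKey({ 'api-key': 'abc123' }));            // abc123
console.log(extractKey({ authorization: 'Bearer abc123' })); // abc123
```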
### 3. (Optional) Use for Subagents
Put your Azure credits to work by routing automated subagent tasks through Azure:
```json
{
  "agents": {
    "defaults": {
      "subagents": {
        "model": "azure-gpt4o/gpt-4o"
      }
    }
  }
}
```
## Run as systemd Service
Copy the template and configure:
```bash
mkdir -p ~/.config/systemd/user
cp scripts/azure-proxy.service ~/.config/systemd/user/
# Edit the service file with your Azure details
nano ~/.config/systemd/user/azure-proxy.service
# Enable and start
systemctl --user daemon-reload
systemctl --user enable azure-proxy
systemctl --user start azure-proxy
```
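The shipped template is not reproduced here; as a rough sketch, a user unit might look like the following (the `ExecStart` path and environment values are assumptions — match them to where you installed the skill):

```ini
[Unit]
Description=Azure OpenAI Proxy for OpenClaw
After=network-online.target

[Service]
Environment=AZURE_OPENAI_ENDPOINT=your-resource.openai.azure.com
Environment=AZURE_OPENAI_DEPLOYMENT=gpt-4o
Environment=AZURE_OPENAI_API_VERSION=2025-01-01-preview
ExecStart=/usr/bin/node %h/.openclaw/skills/azure-proxy/scripts/server.js
Restart=on-failure

[Install]
WantedBy=default.target
```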
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| `AZURE_PROXY_PORT` | `18790` | Local proxy port |
| `AZURE_PROXY_BIND` | `127.0.0.1` | Bind address |
| `AZURE_OPENAI_ENDPOINT` | — | Azure resource hostname |
| `AZURE_OPENAI_DEPLOYMENT` | `gpt-4o` | Deployment name |
| `AZURE_OPENAI_API_VERSION` | `2025-01-01-preview` | API version |
## Health Check
```bash
curl http://localhost:18790/health
# {"status":"ok","deployment":"gpt-4o"}
```
## Troubleshooting
**404 Resource not found:** Check endpoint hostname and deployment name match Azure Portal.
**401 Unauthorized:** API key is wrong or expired — verify it under your resource's Keys and Endpoint blade in the Azure Portal.
**Content Filter Errors:** Azure has aggressive content filtering — some prompts that work on OpenAI may get blocked.
---
## Referenced Files
> The following files are referenced in this skill and included for context.
### scripts/server.js
```javascript
#!/usr/bin/env node
/**
* Azure OpenAI Proxy for OpenClaw
*
* Bridges Azure OpenAI with OpenClaw by handling the api-version
* query parameter that OpenClaw's URL construction doesn't support.
*
* @author Clawfinger (github.com/BenediktSchackenberg)
* @license MIT
*/
const http = require('http');
const https = require('https');

// ═══════════════════════════════════════════════════════════════
// CONFIGURATION
// ═══════════════════════════════════════════════════════════════

const config = {
  port: parseInt(process.env.AZURE_PROXY_PORT || '18790', 10),
  bind: process.env.AZURE_PROXY_BIND || '127.0.0.1',
  azure: {
    endpoint: process.env.AZURE_OPENAI_ENDPOINT || 'YOUR_RESOURCE.openai.azure.com',
    deployment: process.env.AZURE_OPENAI_DEPLOYMENT || 'gpt-4o',
    apiVersion: process.env.AZURE_OPENAI_API_VERSION || '2025-01-01-preview'
  }
};

// ═══════════════════════════════════════════════════════════════
// PROXY SERVER
// ═══════════════════════════════════════════════════════════════

const server = http.createServer((req, res) => {
  // Health check
  if (req.method === 'GET' && req.url === '/health') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok', deployment: config.azure.deployment }));
    return;
  }

  // Only POST /chat/completions is proxied
  if (req.method !== 'POST' || !req.url.includes('/chat/completions')) {
    res.writeHead(404, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ error: 'Not found', hint: 'POST /chat/completions only' }));
    return;
  }

  let body = '';
  req.on('data', chunk => { body += chunk; });
  req.on('end', () => {
    // Rebuild the URL the way Azure expects: deployment in the path,
    // api-version as a trailing query parameter.
    const azurePath = `/openai/deployments/${config.azure.deployment}/chat/completions?api-version=${config.azure.apiVersion}`;
    // Accept the key from Azure's api-key header or a Bearer token.
    const apiKey = req.headers['api-key'] || req.headers['authorization']?.replace('Bearer ', '') || '';
    const options = {
      hostname: config.azure.endpoint,
      port: 443,
      path: azurePath,
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'api-key': apiKey,
        'Content-Length': Buffer.byteLength(body)
      }
    };
    const ts = new Date().toISOString();
    console.log(`[${ts}] → ${config.azure.deployment}`);
    const proxyReq = https.request(options, (proxyRes) => {
      console.log(`[${ts}] ${proxyRes.statusCode >= 400 ? '✗' : '✓'} ${proxyRes.statusCode}`);
      res.writeHead(proxyRes.statusCode, proxyRes.headers);
      proxyRes.pipe(res);
    });
    proxyReq.on('error', (err) => {
      console.error(`[${ts}] ✗ ${err.message}`);
      res.writeHead(502, { 'Content-Type': 'application/json' });
      res.end(JSON.stringify({ error: 'Bad Gateway', message: err.message }));
    });
    proxyReq.write(body);
    proxyReq.end();
  });
});

// ═══════════════════════════════════════════════════════════════
// STARTUP
// ═══════════════════════════════════════════════════════════════

server.listen(config.port, config.bind, () => {
  // Pad each row individually so the box stays aligned.
  const row = (label, value) => `║ ${label.padEnd(12)}${value}`.padEnd(59) + '║';
  console.log([
    '╔══════════════════════════════════════════════════════════╗',
    '║ 🚀 Azure OpenAI Proxy for OpenClaw                       ║',
    '╠══════════════════════════════════════════════════════════╣',
    row('Proxy:', `http://${config.bind}:${config.port}`),
    row('Deployment:', config.azure.deployment),
    row('API Ver:', config.azure.apiVersion),
    row('Target:', config.azure.endpoint),
    '╚══════════════════════════════════════════════════════════╝'
  ].join('\n'));
});

process.on('SIGTERM', () => { server.close(() => process.exit(0)); });
process.on('SIGINT', () => { server.close(() => process.exit(0)); });
```
---
## Skill Companion Files
> Additional files collected from the skill directory layout.
### README.md
```markdown
# Azure OpenAI Proxy for OpenClaw
A lightweight Node.js proxy that enables Azure OpenAI integration with OpenClaw.
## Why This Exists
OpenClaw constructs API URLs by appending `/chat/completions` to the provider's `baseUrl`, while Azure OpenAI requires a `?api-version=YYYY-MM-DD` query parameter at the end of the URL. Embedding the query parameter in the `baseUrl` therefore produces an invalid URL once OpenClaw appends the path.
This proxy sits between OpenClaw and Azure, forwarding requests with the correct URL structure.
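As a sketch, the rewrite amounts to the following (deployment and api-version values are illustrative):

```javascript
// Map OpenClaw's plain path onto Azure's deployment-scoped URL and
// re-attach the api-version query parameter at the end.
const deployment = 'gpt-4o';
const apiVersion = '2025-01-01-preview';
const incoming = '/chat/completions'; // what OpenClaw sends
const outgoing = `/openai/deployments/${deployment}${incoming}?api-version=${apiVersion}`;
console.log(outgoing);
// /openai/deployments/gpt-4o/chat/completions?api-version=2025-01-01-preview
```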
## Installation
### Via ClawHub (Recommended)
```bash
clawhub install BenediktSchackenberg/openclaw-tools/azure-proxy
```
### Manual
Clone this repo and copy `azure-proxy/` to your skills directory.
## Quick Start
See [SKILL.md](./SKILL.md) for complete setup instructions.
```bash
export AZURE_OPENAI_ENDPOINT="your-resource.openai.azure.com"
export AZURE_OPENAI_DEPLOYMENT="gpt-4o"
node scripts/server.js
```
Then configure OpenClaw to point to `http://127.0.0.1:18790`.
## License
MIT
```
### _meta.json
```json
{
  "owner": "benediktschackenberg",
  "slug": "azure-proxy",
  "displayName": "Azure OpenAI Proxy",
  "latest": {
    "version": "1.0.0",
    "publishedAt": 1770388661226,
    "commit": "https://github.com/openclaw/skills/commit/84d08636e7e4dbb3ee35e670347b7a1335084744"
  },
  "history": []
}
```