Use one OpenAI-compatible endpoint for Python, Node.js, curl, n8n, Dify, and Open WebUI.

- Use Python, Node.js, or curl and change only the base URL and API key.
- Use AIPower in n8n, Dify, and Open WebUI with OpenAI-compatible settings.
- Route common questions to DeepSeek/Qwen and escalate complex cases to Claude/GPT/Gemini.
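The routing rule above can be sketched in a few lines of Python. This is a hedged sketch, not part of the AIPower API: the `pick_model` helper, the keyword list, and the length threshold are illustrative assumptions; swap in your own classifier. The model IDs match the ones used elsewhere on this page.

```python
# Two-tier router: cheap model for routine questions, premium model for
# complex cases. The heuristic below (keywords + length) is illustrative.
CHEAP_MODEL = "deepseek/deepseek-chat"
PREMIUM_MODEL = "anthropic/claude-sonnet"

COMPLEX_HINTS = ("contract", "legal", "refund dispute", "architecture", "multi-step")

def pick_model(question: str) -> str:
    text = question.lower()
    if len(text) > 400 or any(hint in text for hint in COMPLEX_HINTS):
        return PREMIUM_MODEL
    return CHEAP_MODEL

print(pick_model("Where is my order?"))        # routine -> cheap tier
print(pick_model("Help with a refund dispute"))  # complex -> premium tier
```

The returned string slots straight into the `model` field of the chat-completions calls below; nothing else in the request changes.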
Python:

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.aipower.me/v1",
    api_key="YOUR_AIPOWER_KEY",
)

response = client.chat.completions.create(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "Summarize this customer support ticket in 3 bullets."}
    ],
)
print(response.choices[0].message.content)
```

Node.js:

```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.aipower.me/v1",
  apiKey: process.env.AIPOWER_API_KEY,
});

const response = await client.chat.completions.create({
  model: "qwen/qwen-plus",
  messages: [
    { role: "user", content: "Write a Chinese customer-service reply for a shipping delay." },
  ],
});
console.log(response.choices[0].message.content);
```

curl:

```shell
curl https://api.aipower.me/v1/chat/completions \
  -H "Authorization: Bearer YOUR_AIPOWER_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek/deepseek-chat",
    "messages": [
      {"role": "user", "content": "Classify this lead as hot, warm, or cold."}
    ],
    "temperature": 0.2
  }'
```

n8n HTTP Request body (`={{ ... }}` is an n8n expression):

```json
{
  "model": "deepseek/deepseek-chat",
  "messages": [
    {
      "role": "system",
      "content": "You classify support tickets for an operations team."
    },
    {
      "role": "user",
      "content": "={{ $json.message }}"
    }
  ],
  "temperature": 0.2
}
```

OpenAI-compatible settings:

- API Base URL: https://api.aipower.me/v1
- API Key: your-aipower-key
- Starter model: deepseek/deepseek-chat
- Chinese model: qwen/qwen-plus
- Low-cost routing model: zhipu/glm-4-flash
- Premium escalation model: anthropic/glm-4-flash
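Because every tier sits behind the same endpoint, escalating from the low-cost model to the premium one is a one-string change in the request body. A minimal sketch using the model names from the settings above; `choose_model` and `build_request` are illustrative helpers, not part of the AIPower API.

```python
def choose_model(escalate: bool) -> str:
    # Low-cost routing tier for routine traffic, premium tier on escalation.
    return "anthropic/claude-sonnet" if escalate else "zhipu/glm-4-flash"

def build_request(question: str, escalate: bool = False) -> dict:
    # Body for POST https://api.aipower.me/v1/chat/completions —
    # the same shape works for every model behind the endpoint.
    return {
        "model": choose_model(escalate),
        "messages": [{"role": "user", "content": question}],
        "temperature": 0.2,
    }

print(build_request("Summarize this ticket")["model"])                  # low-cost tier
print(build_request("Review this contract clause", escalate=True)["model"])  # premium tier
```

Only the `model` field differs between the two requests; headers, URL, and message shape stay identical.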
10 trial calls. One OpenAI-compatible endpoint. China + global models.
Get API key