Use AIPower as an OpenAI-compatible provider in Dify, then run GPT, Claude, Gemini, DeepSeek, Qwen, GLM, Kimi, and Doubao through one account.
Route common questions to DeepSeek/Qwen, and escalate complex cases to Claude/GPT/Gemini.
Use one Dify workspace with shared model access for China and global teams.
Answer product, shipping, refund, and policy questions with low-cost starter models.
Give product and ops teams one billing layer instead of separate provider accounts.
API Base URL: https://api.aipower.me/v1
API Key: your-aipower-key
Start with deepseek/deepseek-chat, qwen/qwen-plus, or zhipu/glm-4-flash for trial-friendly workflows. Dify's model provider docs describe provider setup and custom OpenAI base URL configuration; AIPower supplies the OpenAI-compatible endpoint and billing layer.
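As a sketch of what "OpenAI-compatible" means here: a request against the base URL above follows the standard OpenAI chat-completions shape. The helper below is illustrative (not part of Dify or AIPower), and the key is a placeholder:

```python
API_BASE_URL = "https://api.aipower.me/v1"  # base URL from the config above
API_KEY = "your-aipower-key"                # placeholder key

def build_chat_request(model, user_message):
    """Build the URL, headers, and JSON body for an OpenAI-style
    chat completion call. (Illustrative helper, not a Dify API.)"""
    url = f"{API_BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }
    return url, headers, body

url, headers, body = build_chat_request(
    "deepseek/deepseek-chat", "What is your refund policy?"
)
# Send with any HTTP client, e.g.:
# requests.post(url, headers=headers, json=body)
print(url)  # https://api.aipower.me/v1/chat/completions
```

Because the payload shape is plain OpenAI, any OpenAI SDK pointed at this base URL with your AIPower key should work the same way.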
| Dify job | AIPower model | Why |
|---|---|---|
| Support / FAQ bot | deepseek/deepseek-chat | Good default for customer support, routing, and bilingual answers. |
| Chinese customer service | qwen/qwen-plus | Strong Chinese answers and practical cost control. |
| Workflow classification | zhipu/glm-4-flash | Very low-cost intent routing, tags, and short structured outputs. |
| Complex troubleshooting | anthropic/claude-sonnet | Use after top-up for long replies, coding, and deeper analysis. |
| Long documents | google/gemini-2.5-flash | Use for large docs, policies, and long-context workflows. |
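The escalation pattern in the table can be sketched as a simple router. The keyword/length heuristic below is a hypothetical placeholder for a real classification step (e.g. a zhipu/glm-4-flash intent-routing node in Dify); only the model names come from the table:

```python
# Route to the low-cost model by default; escalate hard or very long cases.
CHEAP_MODEL = "deepseek/deepseek-chat"          # support / FAQ default
ESCALATION_MODEL = "anthropic/claude-sonnet"    # complex troubleshooting
LONG_CONTEXT_MODEL = "google/gemini-2.5-flash"  # long documents

# Placeholder signal for "complex troubleshooting" -- a real workflow
# would use a classifier, not a keyword list.
ESCALATION_KEYWORDS = {"error", "traceback", "debug", "integration"}

def pick_model(question: str) -> str:
    words = question.lower().split()
    if len(words) > 300:                     # very long input
        return LONG_CONTEXT_MODEL
    if ESCALATION_KEYWORDS & set(words):     # troubleshooting signals
        return ESCALATION_MODEL
    return CHEAP_MODEL                       # default: low-cost support

print(pick_model("Where is my order?"))                 # deepseek/deepseek-chat
print(pick_model("I get an error in the integration"))  # anthropic/claude-sonnet
```

In Dify this maps to a classification node feeding a model-selection branch, with all branches billed through the same AIPower key.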
10 trial calls. One OpenAI-compatible endpoint. China + global models.
Get API key