Integration Guide

Dify + AIPower

Use AIPower as an OpenAI-compatible provider in Dify, then run GPT, Claude, Gemini, DeepSeek, Qwen, GLM, Kimi, and Doubao through one account.

Customer support app

Route common questions to DeepSeek/Qwen, escalate complex cases to Claude/GPT/Gemini.

Internal knowledge base

Use one Dify workspace with shared model access for China and global teams.

E-commerce pre-sales bot

Answer product, shipping, refund, and policy questions with low-cost starter models.

Team prototype

Give product and ops teams one billing layer instead of separate provider accounts.

Setup in Dify

  1. Open your Dify workspace and go to Settings → Model Providers.
  2. Select OpenAI or an OpenAI-compatible provider configuration.
  3. Paste your AIPower key and set the API base URL:

    API Base URL: https://api.aipower.me/v1

    API Key: your-aipower-key

  4. Start with deepseek/deepseek-chat, qwen/qwen-plus, or zhipu/glm-4-flash for trial-friendly workflows.
  5. After your first top-up, add GPT, Claude, Gemini, Kimi, MiniMax, and premium routing models to the same Dify workspace.
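Once the provider is configured, any OpenAI-compatible client can hit the same endpoint Dify uses. A minimal sketch, assuming the standard OpenAI chat-completions path (`/chat/completions`) and a placeholder key; swap in your real AIPower key before sending:

```python
import json
import urllib.request

API_BASE = "https://api.aipower.me/v1"  # base URL from step 3
API_KEY = "your-aipower-key"            # placeholder from step 3; replace with your key

def build_chat_request(model, user_message):
    """Build an OpenAI-compatible chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }

def send_chat(payload):
    """POST the payload to the chat-completions endpoint (standard OpenAI path)."""
    req = urllib.request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_chat_request("deepseek/deepseek-chat", "What is your refund policy?")
# send_chat(payload)  # uncomment with a real key; returns an OpenAI-style response
```

Because the endpoint follows the OpenAI wire format, the official OpenAI SDKs also work if you point their base URL at `https://api.aipower.me/v1`.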

Dify's model-provider docs cover provider setup and custom OpenAI base URL configuration; AIPower supplies the OpenAI-compatible endpoint and the billing layer.

Recommended model routing for Dify apps

Dify job                 | AIPower model            | Why
Support / FAQ bot        | deepseek/deepseek-chat   | Good default for customer support, routing, and bilingual answers.
Chinese customer service | qwen/qwen-plus           | Strong Chinese answers and practical cost control.
Workflow classification  | zhipu/glm-4-flash        | Very low-cost intent routing, tags, and short structured outputs.
Complex troubleshooting  | anthropic/claude-sonnet  | Use after top-up for long replies, coding, and deeper analysis.
Long documents           | google/gemini-2.5-flash  | Use for large docs, policies, and long-context workflows.
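The routing above can be sketched as a simple lookup in a pre-processing step or Dify code node. The task labels here are illustrative, not Dify API names; the model IDs come from the table:

```python
# Map job types to AIPower model IDs (task labels are illustrative).
MODEL_ROUTES = {
    "faq": "deepseek/deepseek-chat",
    "chinese_support": "qwen/qwen-plus",
    "classification": "zhipu/glm-4-flash",
    "troubleshooting": "anthropic/claude-sonnet",
    "long_documents": "google/gemini-2.5-flash",
}

def pick_model(task, default="deepseek/deepseek-chat"):
    """Return the model ID for a task, falling back to the FAQ default."""
    return MODEL_ROUTES.get(task, default)

print(pick_model("classification"))  # zhipu/glm-4-flash
print(pick_model("unknown_task"))    # deepseek/deepseek-chat
```

Because every model sits behind the same endpoint and key, changing a route is a one-line edit rather than a new provider account.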

Why this works well for Dify teams

Power your next Dify app with AIPower

10 trial calls. One OpenAI-compatible endpoint. China + global models.

Get API key