Integration Guide

LangChain + AIPower

Access 16 AI models through LangChain with one API key. GPT-5.4, Claude, DeepSeek, Qwen, GLM, Kimi, Doubao — all work with langchain-openai.

Get free API key

Setup (2 minutes)

1. Install packages

pip install langchain langchain-openai

2. Use ChatOpenAI with AIPower base URL

from langchain_openai import ChatOpenAI

# Point LangChain to AIPower
llm = ChatOpenAI(
    base_url="https://api.aipower.me/v1",
    api_key="YOUR_AIPOWER_KEY",
    model="deepseek/deepseek-chat",  # or auto, gpt-5.4, claude-sonnet, etc.
)

# Use like any LangChain LLM
response = llm.invoke("Explain quantum computing")
print(response.content)

3. Use any of 16 models

# Try different models — just change the model ID
models = [
    "auto",                          # Smart routing (recommended)
    "deepseek/deepseek-chat",        # Best value
    "anthropic/claude-sonnet",       # Best code
    "openai/gpt-5.4",                # Latest GPT
    "google/gemini-2.5-pro",         # 1M context
    "qwen/qwen-plus",                # Multilingual
    "zhipu/glm-4-flash",             # Nearly free ($0.01/M)
]

for m in models:
    llm = ChatOpenAI(
        base_url="https://api.aipower.me/v1",
        api_key="YOUR_AIPOWER_KEY",
        model=m,
    )
    print(f"{m}: {llm.invoke('What is 2+2?').content}")
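If you switch models per task, a small helper keeps the model IDs in one place. The task-to-model mapping below is illustrative (drawn from the list above); check the current AIPower model list before relying on specific IDs:

```python
# Illustrative mapping from task type to a model ID from the list above.
TASK_MODELS = {
    "general": "auto",                       # smart routing
    "cheap": "deepseek/deepseek-chat",       # best value
    "code": "anthropic/claude-sonnet",       # best code
    "long_context": "google/gemini-2.5-pro", # 1M context
}

def model_for(task: str) -> str:
    """Return a model ID for the task, falling back to smart routing."""
    return TASK_MODELS.get(task, "auto")
```

Then pass `model=model_for("code")` when constructing ChatOpenAI.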

LangChain chains, agents, and RAG

AIPower works with all LangChain features — chains, agents, retrievers, memory, tool calling.

Example: RAG with Chinese AI

from langchain_openai import ChatOpenAI
from langchain.chains import RetrievalQA
from langchain_community.vectorstores import Chroma  # requires: pip install langchain-community chromadb

# Use DeepSeek V3 for RAG — rivals GPT-4o quality at 1/10th the cost
llm = ChatOpenAI(
    base_url="https://api.aipower.me/v1",
    api_key="YOUR_AIPOWER_KEY",
    model="deepseek/deepseek-chat",
)

# Your vector store (assume documents loaded)
retriever = Chroma(...).as_retriever()

qa = RetrievalQA.from_chain_type(
    llm=llm,
    chain_type="stuff",
    retriever=retriever,
)

answer = qa.invoke({"query": "What's in the docs about embeddings?"})
print(answer["result"])

Example: Agent with tools

from langchain_openai import ChatOpenAI
from langchain.agents import create_openai_tools_agent, AgentExecutor
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.tools import tool

# Use Claude Sonnet for agentic tasks — best tool-use capabilities
llm = ChatOpenAI(
    base_url="https://api.aipower.me/v1",
    api_key="YOUR_AIPOWER_KEY",
    model="anthropic/claude-sonnet",
)

# Define your tools. A stub tool is shown for illustration — replace
# the body with a real weather lookup.
@tool
def get_weather(city: str) -> str:
    """Look up the current weather for a city."""
    return f"Sunny, 22°C in {city}"

tools = [get_weather]

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant with access to tools."),
    ("human", "{input}"),
    ("placeholder", "{agent_scratchpad}"),
])

agent = create_openai_tools_agent(llm, tools, prompt)
executor = AgentExecutor(agent=agent, tools=tools)
executor.invoke({"input": "What's the weather in Beijing?"})

Start building with LangChain + AIPower

50 free API calls. No credit card. Works with all LangChain features.

Get free API key