Prerequisite: You need an AI Gateway endpoint before using these SDK guides. Create one using the dashboard quickstart or follow the manual setup guide.
The AI Gateway works with official and third-party SDKs. Simply change the base URL configuration option for your SDK and you’re connected.

Supported SDKs

Quick start

The pattern is the same for any SDK: just change the base URL:
from openai import OpenAI

client = OpenAI(
    base_url="https://your-ai-gateway.ngrok.app/v1",
    api_key="your-api-key"
)

What works through the gateway

Everything your SDK supports works through the gateway:
Feature | Supported
Chat Completions API | ✅
Messages API | ✅ (for Anthropic SDKs)
Responses API | ✅
Streaming | ✅
Function/tool calling | ✅
Embeddings | ✅
Async clients | ✅
Retries | ✅ (enhanced by gateway)
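Streaming works because the gateway passes server-sent events through unchanged. Your SDK handles this for you, but as an illustration, here is a minimal sketch of parsing an OpenAI-style SSE stream body by hand (the chunk shape is assumed from the OpenAI-compatible streaming format):

```python
import json

def parse_sse_chunks(raw: str):
    """Yield parsed JSON chunks from an OpenAI-style SSE stream body."""
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break  # sentinel marking the end of the stream
        yield json.loads(data)

# Example stream body with one content delta followed by the sentinel
sample = 'data: {"choices": [{"delta": {"content": "Hi"}}]}\n\ndata: [DONE]\n'
chunks = list(parse_sse_chunks(sample))
```

In practice you would just set `stream=True` on your SDK call and iterate the response; the gateway is transparent to that loop.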

Gateway benefits

When you route SDK requests through the AI Gateway:
  • Automatic failover - If one provider fails, the gateway tries another
  • Key rotation - Use multiple provider API keys to avoid rate limits
  • Provider switching - Change providers without changing code
  • Observability - Track usage, latency, and errors across all requests

Using different providers

Use model prefixes to route to specific providers:
# OpenAI
client.chat.completions.create(model="openai:gpt-4o", ...)

# Anthropic
client.chat.completions.create(model="anthropic:claude-3-5-sonnet-latest", ...)

# Your self-hosted Ollama
client.chat.completions.create(model="ollama:llama3.2", ...)
Or let the gateway choose with ngrok/auto:
client.chat.completions.create(model="ngrok/auto", ...)
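Model strings follow a `provider:model` convention, with `ngrok/auto` as the special case. If your application selects providers dynamically, a small helper can keep that prefix logic in one place. This is a sketch; the helper name is hypothetical, not part of any SDK:

```python
def split_model(model: str) -> tuple[str, str]:
    """Split a gateway model string into (provider, model_name).

    "openai:gpt-4o" -> ("openai", "gpt-4o")
    "ngrok/auto"    -> ("ngrok", "auto")   # gateway picks the provider
    """
    if model == "ngrok/auto":
        return ("ngrok", "auto")
    provider, _, name = model.partition(":")
    return (provider, name)
```

Because routing lives in the model string rather than in client configuration, switching providers is a one-line change to the request, not a code change.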

Next steps

Choose your SDK guide to get started: