Most provider flags work with both Claude Code and Codex CLI. Each flag entry notes which runtimes support it. See Runtimes for the full compatibility matrix.
Available Providers
Local Providers (Free)
Ollama - Run models locally or on Ollama's cloud
Short form: --ol
Runtimes: Claude Code, Codex CLI
Requirements:
- Ollama installed and running at http://localhost:11434
- 24GB+ VRAM for local coding models (cloud models work on any hardware)
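As an illustrative sketch, a one-off run against local Ollama might look like this (only the `--ol` flag comes from this page; the `ai-runner` command name and the prompt are assumptions):

```shell
# Run a single prompt through the local Ollama provider (hypothetical invocation)
ai-runner --ol "Explain this stack trace"
```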
LM Studio - Local models with MLX support (fast on Apple Silicon)
Short form: --lm
Runtimes: Claude Code, Codex CLI
Setup:
- Download LM Studio from lmstudio.ai
- Load a model in the UI
- Start server: lms server start --port 1234
Requirements:
- LM Studio running at http://localhost:1234
- 24GB+ unified memory for coding models
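Once the server is up, a quick smoke test is to list the loaded models over LM Studio's OpenAI-compatible HTTP API (a sketch, assuming the default port 1234 from the setup steps above):

```shell
# List models served by LM Studio's OpenAI-compatible endpoint
curl http://localhost:1234/v1/models
```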
Cloud Providers
AWS Bedrock - Claude models on Amazon Web Services
Runtimes: Claude Code only
Setup: ~/.ai-runner/secrets.sh
Authentication:
- AWS credentials file (~/.aws/credentials)
- Or IAM role (for EC2/ECS)
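A minimal secrets.sh sketch for Bedrock, assuming you authenticate with AWS's standard environment variables (the variable names are AWS's, not confirmed by this page; with a credentials file or IAM role you can omit them):

```shell
# ~/.ai-runner/secrets.sh - AWS Bedrock (sketch)
export AWS_ACCESS_KEY_ID="AKIA..."       # or rely on ~/.aws/credentials / an IAM role
export AWS_SECRET_ACCESS_KEY="..."
export AWS_REGION="us-east-1"            # region where Bedrock Claude models are enabled
```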
Google Vertex AI - Claude models on Google Cloud Platform
Runtimes: Claude Code only
Setup: ~/.ai-runner/secrets.sh
Authentication:
- Application default credentials (gcloud auth application-default login)
- Or service account key file
Anthropic API - Direct access to Anthropic's API
Runtimes: Claude Code (Anthropic API), Codex CLI (forces OpenAI API key auth)
Setup: ~/.ai-runner/secrets.sh
Authentication:
- API key from console.anthropic.com
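A minimal secrets.sh entry for direct API access (the ANTHROPIC_API_KEY variable name appears in the Provider Precedence section below; the key value is a placeholder):

```shell
# ~/.ai-runner/secrets.sh - Anthropic API
export ANTHROPIC_API_KEY="sk-ant-..."   # key from console.anthropic.com
```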
Microsoft Azure - Claude Code: Azure Foundry (Anthropic models) / Codex CLI: Azure OpenAI (GPT models)
Runtimes: Claude Code, Codex CLI (different services)
Setup: ~/.ai-runner/secrets.sh
Authentication:
- Azure Foundry API key
Vercel AI Gateway - Access 100+ models (OpenAI, xAI, Google, Meta, and more)
Runtimes: Claude Code only
Setup: ~/.ai-runner/secrets.sh
Authentication:
- Vercel AI Gateway token from vercel.com/ai-gateway
Available model families:
- OpenAI (GPT-4, GPT-5, etc.)
- xAI (Grok models)
- Google (Gemini models)
- Meta (Llama models)
- Anthropic (Claude models)
- And many more
Claude Pro/Max - Your regular Claude subscription
Runtimes: Claude Code only
Authentication:
- Claude subscription (log in with claude)
Provider Precedence
When no provider is specified, AI Runner auto-detects in this order:
1. Saved default from --set-default
2. Claude subscription (if logged in with claude)
3. Ollama (if running locally)
4. Anthropic API (if ANTHROPIC_API_KEY is configured)
5. Cloud providers (AWS, Vertex, Vercel, Azure - if configured)
Examples
Basic Usage
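A sketch of basic invocations using the short-form provider flags documented above (the `ai-runner` command name and prompts are assumptions):

```shell
# Run the same prompt against each local provider (illustrative)
ai-runner --ol "Summarize this repo"   # local Ollama
ai-runner --lm "Summarize this repo"   # LM Studio
```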
Combining Providers and Models
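Model selection is covered on the Model Flags page; as an illustrative sketch only, a provider flag might be paired with a model flag like this (the --model flag name and model ID are hypothetical):

```shell
# Pair a provider flag with a specific model (flag name hypothetical)
ai-runner --ol --model qwen2.5-coder "Refactor this function"
```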
Interactive Sessions
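One plausible pattern, assuming that omitting the prompt starts an interactive session (not confirmed by this page):

```shell
# Open an interactive session against the LM Studio provider (sketch)
ai-runner --lm
```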
Resume Conversations
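A sketch assuming a resume-style flag exists (the --resume flag name is hypothetical and not documented on this page):

```shell
# Pick up the previous conversation with the same provider (flag hypothetical)
ai-runner --ol --resume
```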
Shebang Scripts
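A shebang script might look like this, assuming `ai-runner` is on PATH and accepts a script file whose body is the prompt (both are assumptions):

```shell
#!/usr/bin/env ai-runner
Explain what each service in docker-compose.yml does.
```

Mark the file executable (`chmod +x`) to run it directly.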
Set Default Provider
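The --set-default flag appears under Provider Precedence below; its usage might look like this (the way the provider is specified alongside it is assumed):

```shell
# Save Ollama as the default provider (argument form assumed)
ai-runner --ol --set-default
```

Afterward, runs without a provider flag would use the saved default.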
Provider Status
Check which providers are configured:
Multiple Providers Workflow
Switch between providers based on your needs:
Provider-Gated Flags
Some passthrough flags only work with specific providers:
Related Pages
Model Flags
Select model tiers and specific model IDs
Provider Guides
Detailed setup for each provider