Cloud providers offer powerful AI models through APIs, with pay-as-you-go pricing and no local hardware requirements.
AWS Bedrock
AWS Bedrock provides Claude models through Amazon’s infrastructure.
Configuration
Add to ~/.ai-runner/secrets.sh:
export AWS_PROFILE="your-profile-name"
export AWS_REGION="us-west-2"
Authentication Methods
AWS Bedrock supports three authentication methods:
1. AWS Bedrock API Key (Recommended)
export AWS_BEARER_TOKEN_BEDROCK="your-bedrock-api-key"
export AWS_REGION="us-west-2"
2. AWS Access Keys
export AWS_ACCESS_KEY_ID="your_access_key"
export AWS_SECRET_ACCESS_KEY="your_secret_key"
export AWS_SESSION_TOKEN="your_session_token" # Optional
export AWS_REGION="us-west-2"
3. AWS Profile
export AWS_PROFILE="your-profile-name"
export AWS_REGION="us-west-2"
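The three methods above can be sketched as a simple precedence check. This is an illustrative sketch only (the function name is made up, and it assumes the runner tries the methods in the order listed):

```shell
# Illustrative sketch (not part of ai-runner): pick a Bedrock auth
# method, assuming they are tried in the order documented above.
pick_bedrock_auth() {
  if [ -n "${AWS_BEARER_TOKEN_BEDROCK:-}" ]; then
    echo "bearer-token"
  elif [ -n "${AWS_ACCESS_KEY_ID:-}" ] && [ -n "${AWS_SECRET_ACCESS_KEY:-}" ]; then
    echo "access-keys"
  else
    echo "profile"
  fi
}

AWS_BEARER_TOKEN_BEDROCK="example" pick_bedrock_auth   # prints "bearer-token"
```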
Usage
# Use default tier (Sonnet)
ai --aws
# Use specific tier
ai --aws --opus task.md
ai --aws --sonnet task.md
ai --aws --haiku quick-fix.md
Custom Models
Override default models in secrets.sh:
export CLAUDE_MODEL_OPUS_AWS="global.anthropic.claude-opus-4-6-v1"
export CLAUDE_MODEL_SONNET_AWS="global.anthropic.claude-sonnet-4-6"
export CLAUDE_MODEL_HAIKU_AWS="us.anthropic.claude-haiku-4-5-20251001-v1:0"
export CLAUDE_SMALL_FAST_MODEL_AWS="us.anthropic.claude-haiku-4-5-20251001-v1:0"
Read the complete AWS Bedrock setup guide.
Google Vertex AI
Google Vertex AI provides Claude models through Google Cloud Platform.
Configuration
Add to ~/.ai-runner/secrets.sh:
export ANTHROPIC_VERTEX_PROJECT_ID="your-gcp-project-id"
export CLOUD_ML_REGION="global"
Authentication
Vertex AI supports three authentication methods (in precedence order):
1. Service Account Key File
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"
2. Application Default Credentials
gcloud auth application-default login
3. User Credentials
gcloud auth login
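That precedence can be sketched as follows. The function is illustrative only, and the ADC file path is an assumption (it is the default gcloud location on Linux/macOS):

```shell
# Illustrative sketch of the documented precedence: an explicit key file
# wins, then application default credentials, then user credentials.
pick_vertex_auth() {
  if [ -n "${GOOGLE_APPLICATION_CREDENTIALS:-}" ]; then
    echo "service-account-key"
  elif [ -f "${HOME}/.config/gcloud/application_default_credentials.json" ]; then
    echo "application-default"
  else
    echo "user-credentials"
  fi
}
```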
Prerequisites
Install Google Cloud SDK: https://cloud.google.com/sdk/docs/install
Authenticate with one of the methods above
Enable Claude models in Vertex AI Model Garden
Usage
# Use default tier (Sonnet)
ai --vertex
# Use specific tier
ai --vertex --opus task.md
ai --vertex --sonnet task.md
ai --vertex --haiku quick-fix.md
Custom Models
Override default models in secrets.sh:
export CLAUDE_MODEL_OPUS_VERTEX="claude-opus-4-6"
export CLAUDE_MODEL_SONNET_VERTEX="claude-sonnet-4-6"
export CLAUDE_MODEL_HAIKU_VERTEX="claude-haiku-4-5@20251001"
export CLAUDE_SMALL_FAST_MODEL_VERTEX="claude-haiku-4-5@20251001"
Regional Overrides
For specific model regional availability:
export VERTEX_REGION_CLAUDE_4_6_OPUS="us-east5"
export VERTEX_REGION_CLAUDE_4_6_SONNET="us-east5"
Read the complete Vertex AI setup guide.
Anthropic API
Direct API access to Anthropic’s Claude models.
Configuration
Add to ~/.ai-runner/secrets.sh:
export ANTHROPIC_API_KEY="sk-ant-..."
Getting an API Key
Sign up at console.anthropic.com
Navigate to API Keys
Create a new API key
Add it to your secrets.sh
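Before launching, you can sanity-check that the key has the expected "sk-ant-" prefix. The helper below is illustrative, not part of ai-runner:

```shell
# Illustrative helper: Anthropic API keys start with "sk-ant-".
check_anthropic_key() {
  case "${1:-}" in
    sk-ant-*) echo "ok" ;;
    *)        echo "unexpected key format" ;;
  esac
}

check_anthropic_key "$ANTHROPIC_API_KEY"
```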
Usage
# Use default tier (Sonnet)
ai --apikey
# Use specific tier
ai --apikey --opus task.md
ai --apikey --sonnet task.md
ai --apikey --haiku quick-fix.md
Custom Models
Override default models in secrets.sh:
export CLAUDE_MODEL_OPUS_ANTHROPIC="claude-opus-4-6"
export CLAUDE_MODEL_SONNET_ANTHROPIC="claude-sonnet-4-6"
export CLAUDE_MODEL_HAIKU_ANTHROPIC="claude-haiku-4-5"
export CLAUDE_SMALL_FAST_MODEL_ANTHROPIC="claude-haiku-4-5"
If you're also logged in to Claude Pro, Claude Code will show an "Auth conflict" warning. This is expected: Claude Code uses the API key for billing, and the warning is purely informational.
Read the Anthropic API documentation.
Microsoft Azure
Microsoft Foundry provides Claude models through Azure.
Configuration
Add to ~/.ai-runner/secrets.sh:
export ANTHROPIC_FOUNDRY_API_KEY="your-azure-api-key"
export ANTHROPIC_FOUNDRY_RESOURCE="your-resource-name"
Or provide the full URL:
export ANTHROPIC_FOUNDRY_BASE_URL="https://your-resource-name.services.ai.azure.com"
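The relationship between the two settings can be sketched as below. This is an assumption about how the endpoint is derived, based on the URL pattern shown above (the function name is made up):

```shell
# Illustrative: derive the Foundry endpoint from the resource name
# when no explicit base URL is set (pattern from the example above).
foundry_base_url() {
  if [ -n "${ANTHROPIC_FOUNDRY_BASE_URL:-}" ]; then
    echo "$ANTHROPIC_FOUNDRY_BASE_URL"
  else
    echo "https://${ANTHROPIC_FOUNDRY_RESOURCE}.services.ai.azure.com"
  fi
}
```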
Authentication Methods
1. API Key (Recommended)
export ANTHROPIC_FOUNDRY_API_KEY="your-azure-api-key"
export ANTHROPIC_FOUNDRY_RESOURCE="your-resource-name"
2. Azure Default Credentials
az login
export ANTHROPIC_FOUNDRY_RESOURCE="your-resource-name"
Usage
# Use default tier (Sonnet)
ai --azure
# Use specific tier
ai --azure --opus task.md
ai --azure --sonnet task.md
ai --azure --haiku quick-fix.md
Custom Models
Azure model names are deployment names (user-defined). Override in secrets.sh:
export CLAUDE_MODEL_OPUS_AZURE="claude-opus-4-6"
export CLAUDE_MODEL_SONNET_AZURE="claude-sonnet-4-6"
export CLAUDE_MODEL_HAIKU_AZURE="claude-haiku-4-5"
export CLAUDE_SMALL_FAST_MODEL_AZURE="claude-haiku-4-5"
Read the complete Microsoft Foundry setup guide.
Vercel AI Gateway
Vercel AI Gateway provides unified access to 100+ models from multiple providers.
Configuration
Add to ~/.ai-runner/secrets.sh:
export VERCEL_AI_GATEWAY_TOKEN="vck_..."
export VERCEL_AI_GATEWAY_URL="https://ai-gateway.vercel.sh" # Optional
Getting a Token
Visit vercel.com/dashboard/~/ai
Create a new AI Gateway token
Add it to your secrets.sh
Usage with Claude Models
# Use default tier (Sonnet)
ai --vercel
# Use specific tier
ai --vercel --opus task.md
ai --vercel --sonnet task.md
ai --vercel --haiku quick-fix.md
Use Any Model
Vercel AI Gateway supports 100+ models from OpenAI, xAI, Google, Meta, Anthropic, Mistral, DeepSeek, and more — all through one API.
Use --model provider/model to run Claude Code with any supported model:
ai --vercel --model xai/grok-code-fast-1 # xAI coding model
ai --vercel --model openai/gpt-5.2-codex # OpenAI coding model
ai --vercel --model google/gemini-3-pro-preview # Google reasoning model
ai --vercel --model alibaba/qwen3-coder # Alibaba coding model
ai --vercel --model zai/glm-5 # Zhipu AI GLM-5 198K context
Example Coding Models
| Model ID | Provider | Description |
| --- | --- | --- |
| xai/grok-code-fast-1 | xAI | Fast coding model |
| openai/gpt-5.2-codex | OpenAI | Coding-optimized GPT (also openai/gpt-5.3-codex) |
| google/gemini-3-pro-preview | Google | Latest reasoning model |
| alibaba/qwen3-coder | Alibaba | Open-source coding model |
| zai/glm-5 | Zhipu AI | GLM-5, 198K context, MIT license |
View all available models on Vercel AI Gateway.
Configuration for Non-Anthropic Models
When using non-Anthropic models, configure defaults in secrets.sh:
# Use a non-Anthropic model as default for Vercel
export CLAUDE_MODEL_SONNET_VERCEL="xai/grok-code-fast-1"
# Set a specific background/small-fast model
export CLAUDE_SMALL_FAST_MODEL_VERCEL="xai/grok-code-fast-1"
Automatic small/fast model: When you use --model with a non-Anthropic model (e.g., xai/grok-code-fast-1), the background model is automatically set to the same model. This avoids mixing providers (e.g., xAI for main work + Anthropic for background). For Anthropic models on Vercel, the background model defaults to Haiku as usual.
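That rule can be sketched as follows. The function is illustrative, and it assumes Vercel model IDs use the provider/model form shown above, with Anthropic models under anthropic/:

```shell
# Illustrative sketch of the rule above: non-Anthropic models become
# their own background model; Anthropic models keep Haiku.
background_model_for() {
  case "$1" in
    anthropic/*) echo "claude-haiku-4-5" ;;
    *)           echo "$1" ;;
  esac
}

background_model_for "xai/grok-code-fast-1"   # prints the same model ID
```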
Set as Default Provider
# Make Vercel with this model the default provider
ai --vercel --model xai/grok-code-fast-1 --set-default
# Clear the saved default
ai --clear-default
Claude Pro
Uses your Claude Pro/Max subscription. No API keys needed.
Prerequisites
Claude Code installed
Logged in with your Claude subscription (run claude and use the /login command)
Usage
# Use default tier (Sonnet)
ai --pro
# Use specific tier
ai --pro --opus task.md
ai --pro --sonnet task.md
ai --pro --haiku quick-fix.md
This is the default provider if you’re logged into Claude Code with a subscription.
Claude Pro has rate limits. When you hit a limit, switch to an API provider with --resume to continue your work.
Provider Comparison
| Provider | Setup Complexity | Cost Model | Model Selection | Best For |
| --- | --- | --- | --- | --- |
| Ollama | Easy | Free | Open-source models | Local, privacy, cloud fallback |
| LM Studio | Easy | Free | Custom models | Apple Silicon, custom models |
| AWS Bedrock | Medium | Pay-per-use | Claude models | AWS integration |
| Vertex AI | Medium | Pay-per-use | Claude models | GCP integration |
| Anthropic API | Easy | Pay-per-use | Claude models | Direct access |
| Azure | Medium | Pay-per-use | Claude models | Azure integration |
| Vercel | Easy | Pay-per-use | 100+ models | Multi-provider access |
| Claude Pro | Easiest | Subscription | Claude models | Rate limits exist |
Next Steps