
Andi AIRun

Run AI prompts like programs. Executable markdown with shebang, Unix pipes, and output redirection. Extends Claude Code with cross-cloud provider switching and any-model support: free local models or 100+ cloud models.
# Run Claude Code interactively: any model or provider
ai                                        # Regular Claude subscription (Pro, Max)
ai --aws --opus --team --resume           # Resume chats on AWS w/ Opus 4.6 + Agent Teams
ai --ollama --bypass --model qwen3-coder  # Ollama local model with bypassPermissions set
ai --vercel --model openai/gpt-5.2-codex  # Vercel AI Gateway with 100+ models

# Run prompts like programs
ai --azure --haiku script.md

# Script automation
cat data.json | ./analyze.md > results.txt

Overview

Switch between your Claude Code subscription and other clouds and models: AWS Bedrock, Google Vertex AI, Azure, Vercel, and the Anthropic API. Supports free local models (Ollama, LM Studio) and 100+ alternative cloud models via Vercel AI Gateway or Ollama Cloud. Swap providers and resume conversations mid-task to avoid rate limits and keep working.

Key Features

Create markdown files with #!/usr/bin/env ai shebang for script automation. Run them directly like any executable program.
#!/usr/bin/env ai
Analyze my codebase and summarize the architecture.
Make it executable and run:
chmod +x task.md
./task.md
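One caveat worth knowing: a shebang line passes everything after the interpreter path as a single argument, so flags such as --haiku need `env -S`, which splits that string into separate words (supported by GNU coreutils and BSD/macOS env). A quick sketch of the splitting behavior:

```shell
# A shebang passes everything after the interpreter as ONE argument, so
# flags require `env -S`, which splits its argument into separate words:
#   #!/usr/bin/env -S ai --haiku
# Demonstration of the splitting itself, outside a shebang:
env -S 'echo hello world'   # runs: echo hello world
```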

Pipe data into scripts, redirect output, chain in pipelines — standard Unix semantics for AI automation.
cat data.json | ./analyze.md > results.txt    # Pipe in, redirect out
git log -10 | ./summarize.md                  # Feed git history to AI
./generate.md | ./review.md > final.txt       # Chain scripts together

Use Claude on AWS, Vertex, Azure, or the Anthropic API, and switch mid-conversation to bypass rate limits. Local models and Vercel AI Gateway are also supported.
ai --aws                          # AWS Bedrock
ai --vertex                       # Google Vertex AI
ai --ollama                       # Ollama (local, free)
ai --vercel                       # Vercel AI Gateway

Simple tier selection across all providers: --opus/--high, --sonnet/--mid, --haiku/--low
ai --opus task.md                 # Opus 4.6 (most capable)
ai --sonnet task.md               # Sonnet 4.6
ai --haiku task.md                # Haiku 4.5 (fastest)

--resume picks up your previous chats with any model/provider. Switch providers mid-conversation when you hit rate limits.
# Hit rate limit on Claude Pro
claude
# "Rate limit exceeded. Try again in 4 hours."

# Continue immediately with AWS
ai --aws --resume
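In a script, the same fallback can be automated with the shell's `||` operator, which runs its right-hand side only when the left command fails. A minimal sketch, with `false` standing in for a rate-limited run (in practice the left side would be the `ai` call itself):

```shell
# `||` runs the second command only when the first one fails, giving a
# one-line rate-limit fallback. `false` simulates the failing run here.
false || echo "retrying with: ai --aws --resume"
```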

Declare variables with defaults in YAML front-matter. Users override them from the CLI without editing the script.
#!/usr/bin/env -S ai --haiku
---
vars:
  topic: "machine learning"
  style: casual
  length: short
---
Write a {{length}} summary of {{topic}} in a {{style}} tone.
./summarize.md                          # uses defaults
./summarize.md --topic "AI safety"      # overrides one variable
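Conceptually, the front-matter defaults behave like template variables that CLI flags override. A plain-shell sketch of that behavior (illustration only, not AIRun's actual implementation):

```shell
# Conceptual sketch (NOT AIRun's implementation): defaults from the
# front-matter are used unless a matching --var flag overrides them.
render_prompt() {
  topic="machine learning"   # defaults, as declared in the front-matter
  style="casual"
  while [ $# -gt 0 ]; do     # any --var VALUE pair overrides its default
    case "$1" in
      --topic) topic=$2; shift 2 ;;
      --style) style=$2; shift 2 ;;
      *) shift ;;
    esac
  done
  echo "Write a summary of ${topic} in a ${style} tone."
}
render_prompt                        # uses defaults
render_prompt --topic "AI safety"   # one override, other defaults kept
```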

What It Does

  • Executable markdown with #!/usr/bin/env ai shebang for script automation
  • Unix pipe support: pipe data into scripts, redirect output, chain in pipelines
  • Cross-cloud provider switching: use Claude on AWS, Vertex, Azure, or the Anthropic API, and switch mid-conversation to bypass rate limits; local models and Vercel AI Gateway are also supported
  • Model tiers: --opus/--high, --sonnet/--mid, --haiku/--low
  • Session continuity: --resume picks up your previous chats with any model/provider
  • Non-destructive: the plain claude command keeps working exactly as before
From Andi AI Search. Star the GitHub repo if it helps!

Supported Platforms

  • macOS 13.0+
  • Linux (Ubuntu 20.04+, Debian 10+)
  • Windows 10+ via WSL

Providers

| Provider | Flag | Type | Notes |
|---|---|---|---|
| Ollama | --ollama / --ol | Local | Free, no API costs, cloud option |
| LM Studio | --lmstudio / --lm | Local | MLX models (fast on Apple Silicon) |
| AWS Bedrock | --aws | Cloud | Requires AWS credentials |
| Google Vertex AI | --vertex | Cloud | Requires GCP project |
| Anthropic API | --apikey | Cloud | Direct API access |
| Microsoft Azure | --azure | Cloud | Azure Foundry |
| Vercel AI Gateway | --vercel | Cloud | Any model: OpenAI, xAI, Google, Meta, more |
| Claude Pro | --pro | Subscription | Default if logged in |
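Since cloud providers need credentials while local ones don't, a small wrapper can pick the flag automatically from the environment. A hypothetical helper (not part of AIRun; the credential variable names are the standard ones for each vendor's SDK):

```shell
# Hypothetical helper (NOT part of AIRun): choose a provider flag based
# on which standard credentials are already set in the environment.
pick_provider() {
  if [ -n "$AWS_ACCESS_KEY_ID" ]; then echo "--aws"
  elif [ -n "$GOOGLE_CLOUD_PROJECT" ]; then echo "--vertex"
  elif [ -n "$ANTHROPIC_API_KEY" ]; then echo "--apikey"
  else echo "--ollama"   # no cloud credentials: fall back to free local
  fi
}
# Usage sketch:  ai "$(pick_provider)" task.md
```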

Next Steps