Andi AIRun
Run AI prompts like programs. Executable markdown with shebang support, Unix pipes, and output redirection. Extends Claude Code with cross-cloud provider switching and any-model support: free local or 100+ cloud models.

Quickstart
Get started in 5 minutes with your first executable AI script
Installation
Detailed installation guide for all platforms
Executable Markdown
Create markdown files that run as AI programs
Provider Switching
Switch between AWS, Azure, Vertex, Ollama, and more
Overview
Switch between your Claude Code subscription and different clouds and models: AWS Bedrock, Google Vertex, Azure, Vercel, and the Anthropic API. Supports free local models (Ollama, LM Studio) and 100+ alternate cloud models via Vercel AI Gateway or Ollama Cloud. Swap providers and resume conversations mid-task to avoid rate limits and keep working.

Key Features
Executable Markdown
Create markdown files with a `#!/usr/bin/env ai` shebang for script automation. Make each file executable and run it directly, like any program.
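A minimal sketch (the filename and prompt text are illustrative; the shebang line is the part AIRun requires):

```shell
# Create a small executable AI script. The prompt body is free-form
# markdown; only the shebang is required by AIRun.
cat > hello.md <<'EOF'
#!/usr/bin/env ai
Summarize the text on stdin in one sentence.
EOF

# Make it executable; it can then be run directly like any program
# (the run step needs the `ai` launcher on PATH):
chmod +x hello.md
#   ./hello.md < notes.txt
```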
Unix Pipe Support
Pipe data into scripts, redirect output, chain in pipelines — standard Unix semantics for AI automation.
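For example, with a hypothetical `triage.md` script built as above, the usual shell plumbing applies:

```shell
# Script and file names here are hypothetical; the point is that an
# executable markdown script behaves like any other command.
#   cat access.log | ./triage.md > report.txt    # pipe in, redirect out
#   ./triage.md < access.log | tee report.txt    # chain in a pipeline
```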
Cross-Cloud Provider Switching
Use Claude on AWS, Vertex, Azure, or the Anthropic API, and switch mid-conversation to bypass rate limits. Also supports local models and Vercel AI Gateway.
Model Tiers
Simple tier selection across all providers: `--opus`/`--high`, `--sonnet`/`--mid`, `--haiku`/`--low`.
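Assuming the CLI entry point is `ai` (as in the shebang), tier selection looks like this; the prompts are illustrative:

```shell
# Each pair of flags is an alias for the same tier.
#   ai --opus   "review this design doc"   # or --high: most capable
#   ai --sonnet "summarize this diff"      # or --mid: balanced
#   ai --haiku  "title this commit"        # or --low: fastest, cheapest
```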
Session Continuity
`--resume` picks up your previous chats with any model/provider. Switch providers mid-conversation when you hit rate limits.
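For instance, combining `--resume` with the provider flags from the table below (commands illustrative):

```shell
# Hit a rate limit mid-conversation? Resume the same session elsewhere.
#   ai --aws --resume       # continue the chat on AWS Bedrock
#   ai --ollama --resume    # or continue on a free local model
```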
Script Variables
Declare variables with defaults in YAML front-matter. Users override them from the CLI without editing the script.
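A sketch of a script with front-matter variables. The key names, `{{placeholder}}` syntax, and override flags are assumptions for illustration; check the AIRun docs for the exact format:

```shell
# Hypothetical variable syntax: defaults declared in YAML front-matter,
# referenced in the prompt body via {{placeholders}}.
cat > greet.md <<'EOF'
#!/usr/bin/env ai
---
name: world
tone: friendly
---
Write a short greeting to {{name}} in a {{tone}} tone.
EOF
chmod +x greet.md

# Override the defaults from the CLI without editing the script:
#   ./greet.md --name Alice --tone formal
```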
What it Does
- Executable markdown with a `#!/usr/bin/env ai` shebang for script automation
- Unix pipe support: pipe data into scripts, redirect output, chain in pipelines
- Cross-cloud provider switching: use Claude on AWS, Vertex, Azure, or the Anthropic API, and switch mid-conversation to bypass rate limits; also supports local models and Vercel AI Gateway
- Model tiers: `--opus`/`--high`, `--sonnet`/`--mid`, `--haiku`/`--low`
- Session continuity: `--resume` picks up your previous chats with any model/provider
- Non-destructive: plain `claude` always works untouched, as before
From Andi AI Search. Star the GitHub repo if it helps!
Supported Platforms
- macOS 13.0+
- Linux (Ubuntu 20.04+, Debian 10+)
- Windows 10+ via WSL
Providers
| Provider | Flag | Type | Notes |
|---|---|---|---|
| Ollama | `--ollama` / `--ol` | Local | Free, no API costs; cloud option available |
| LM Studio | `--lmstudio` / `--lm` | Local | MLX models (fast on Apple Silicon) |
| AWS Bedrock | `--aws` | Cloud | Requires AWS credentials |
| Google Vertex AI | `--vertex` | Cloud | Requires GCP project |
| Anthropic API | `--apikey` | Cloud | Direct API access |
| Microsoft Azure | `--azure` | Cloud | Azure Foundry |
| Vercel AI Gateway | `--vercel` | Cloud | Any model: OpenAI, xAI, Google, Meta, more |
| Claude Pro | `--pro` | Subscription | Default if logged in |
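Picking a provider per invocation then looks like this (flags from the table above; prompts illustrative, credentials as noted):

```shell
# One flag per provider; switch freely between runs, or mid-conversation
# when combined with --resume.
#   ai --ollama "explain this stack trace"   # free, local
#   ai --aws    "explain this stack trace"   # Claude on AWS Bedrock
#   ai --vercel "explain this stack trace"   # any model via Vercel AI Gateway
```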