Local-First
Run agents on your own hardware with Ollama, LiteLLM, or any OpenAI-compatible API. No cloud lock-in.
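Because Ollama serves an OpenAI-compatible API (by default at `http://localhost:11434/v1`), pointing an agent at local hardware is just a matter of base URL and request shape. A minimal sketch of that request body, assuming the standard `/chat/completions` route; the model name `llama3.2` is only an example of a locally pulled model:

```python
import json

# Any OpenAI-compatible endpoint works; Ollama's default is shown here.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body for a POST to f"{BASE_URL}/chat/completions"."""
    return {
        "model": model,  # example model name -- use whatever you have pulled
        "messages": [{"role": "user", "content": prompt}],
    }

payload = build_chat_request("llama3.2", "Summarize this file.")
print(json.dumps(payload, indent=2))
```

Swapping providers (Ollama, LiteLLM, a hosted API) means changing `BASE_URL` and the model name; the request shape stays the same.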
Tool-Native
Agents call tools, read files, query APIs, and execute code. Define tools in Python or TypeScript.
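A tool is typically an ordinary function plus a machine-readable spec the model can see. The exact registration API depends on the framework; the sketch below uses the widely adopted OpenAI function-calling schema as the spec format, with a hypothetical `read_file` tool as the example:

```python
import json

def read_file(path: str) -> str:
    """Return the contents of a UTF-8 text file so the agent can inspect it."""
    with open(path, encoding="utf-8") as f:
        return f.read()

# Tool spec in the common OpenAI function-calling format; how you register it
# with your agent framework is framework-specific.
READ_FILE_TOOL = {
    "type": "function",
    "function": {
        "name": "read_file",
        "description": "Read a UTF-8 text file and return its contents.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Path of the file to read"},
            },
            "required": ["path"],
        },
    },
}

print(json.dumps(READ_FILE_TOOL, indent=2))
```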
Production-Ready
Built-in routing, fallbacks, cost tracking, and monitoring. From prototype to production without rewrites.
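Fallback routing generally means trying providers in priority order and returning the first success. A minimal sketch of that pattern with stub providers standing in for real model calls (the provider functions here are hypothetical placeholders):

```python
def call_with_fallback(prompt, providers):
    """Try each (name, callable) provider in order; return the first reply."""
    errors = []
    for name, call in providers:
        try:
            return name, call(prompt)
        except Exception as exc:
            errors.append((name, repr(exc)))  # record failure, try the next
    raise RuntimeError(f"all providers failed: {errors}")

# Stub providers for illustration -- real ones would issue model requests.
def local_model(prompt):
    raise ConnectionError("local model offline")

def hosted_backup(prompt):
    return f"echo: {prompt}"

provider, reply = call_with_fallback(
    "hello", [("local", local_model), ("backup", hosted_backup)]
)
print(provider, reply)
```

The same loop is the natural place to hang cost tracking and monitoring, since every attempt and failure passes through it.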
Open Source
MIT licensed. Self-host, modify, contribute. Community-driven development.