Build AI Agents by Just Talking to AI
One file. 25 tools. Skills, MCP, and a container sandbox built in. Zero dependencies. Describe what you need, your AI assistant builds it, and the harness runs it — locally, for free.
Setup takes 5 minutes • Runs fully offline with Ollama
No coding required • No npm install • You own your code
From Idea to Running Agent
No frameworks to learn. No dependencies to manage. Just say what you want.
1. No more framework churn and dependency hell
   No LangChain. No pip install. No node_modules. A single .js file with everything an agent needs to do real work.
2. Describe your agent in plain language
   "Build a research agent that searches the web, cross-references sources, and writes reports." Your AI assistant handles the rest.
3. Preview every action before it runs
   See exactly which tools your agent will call. Add `--yolo` when you trust it to run for real.
4. Runs locally. Free forever.
   Powered by Ollama — your data never leaves your machine. Or connect OpenAI, Anthropic, or Google when you need cloud models.
Not a Template. A Harness.
A complete runtime that handles tool dispatch, turn management, context compression, and safety — so your agent focuses on the task.
25 Built-in Tools
Files, web search, scraping, downloads, knowledge base, orchestration. Plus agents create new tools at runtime.
Subagent Spawning
Big tasks decompose into focused subtasks. Each subagent gets a fresh context window — critical for local models with limited memory.
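Conceptually, a subagent is just a fresh conversation seeded only with its subtask — the parent's history never enters its context window. A hypothetical sketch (function and message shapes are invented for illustration):

```javascript
// Illustrative only: spawn a subagent as a brand-new message history.
// `callModel` stands in for whatever model backend is in use.
async function spawnSubagent(task, callModel) {
  const messages = [
    { role: "system", content: "You are a focused subagent. Complete one task and report back." },
    { role: "user", content: task },
  ];
  // The parent receives only the final answer, not the subagent's transcript.
  return callModel(messages);
}
```

Because each subagent starts from an empty window, a 16GB local model can chew through a task list that would overflow a single long conversation.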
Knowledge System
Ingest articles, compile a wiki, search with rg/grep. Every interaction makes the knowledge base richer. Pure markdown, no database.
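Since the knowledge base is plain markdown on disk, searching it is an ordinary grep. A self-contained sketch (directory and file names are illustrative):

```shell
# Illustrative: the knowledge base is just markdown files in a folder.
mkdir -p knowledge
printf '# Notes\nContext compression keeps long sessions usable.\n' > knowledge/notes.md

# Case-insensitive search for files mentioning a topic (rg works the same way).
grep -ril "context compression" knowledge/
```

No database means no schema migrations and no index to corrupt — the same files are readable by you, the agent, and version control.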
Skills
Teach agents a new domain with a SKILL.md bundle. Drop one into ./skills/ and the agent opens it only when needed. Works with the Anthropic & Vercel skill ecosystems.
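In the Anthropic skill format, a SKILL.md is a markdown file with a small YAML frontmatter that tells the agent when to open it. This hypothetical example shows the shape — the name, description, and steps are invented for illustration:

```markdown
---
name: changelog-writer
description: Use when the user asks to draft or update a CHANGELOG from git history.
---

# Changelog Writer

1. Run `git log --oneline` since the last tag.
2. Group commits into Added / Changed / Fixed.
3. Write the entry in Keep a Changelog style.
```

Only the frontmatter is loaded up front; the body stays out of context until the skill is actually needed.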
Connect Any Service (MCP)
Drop a mcp.json and your agent can use GitHub, Postgres, Slack, filesystem — the whole Model Context Protocol ecosystem. Servers appear as tools automatically.
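A minimal mcp.json following the common MCP client convention might look like this — the server shown is the reference GitHub server, and the exact schema this harness reads is an assumption:

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": { "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_..." }
    }
  }
}
```

Each configured server is launched as a subprocess, and the tools it advertises show up alongside the built-in 25.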
Safe by Default
Network allowlist on HTTP tools, container sandbox for untrusted shells, workspace-confined file ops, preview-before-execute. Safety isn't bolted on — it's the default.
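An allowlist check like the one described can be pictured as a hostname lookup before any HTTP tool fires. A minimal sketch, with hypothetical host names and function names — not the harness's actual implementation:

```javascript
// Illustrative allowlist: only these hosts may be fetched.
const ALLOWED_HOSTS = new Set(["api.duckduckgo.com", "raw.githubusercontent.com"]);

function isAllowed(url) {
  try {
    return ALLOWED_HOSTS.has(new URL(url).hostname);
  } catch {
    return false; // malformed URLs are rejected outright
  }
}

console.log(isAllowed("https://api.duckduckgo.com/?q=test")); // true
console.log(isAllowed("https://evil.example.com/"));          // false
```

Checking the parsed hostname (rather than substring-matching the URL) closes the classic `https://api.duckduckgo.com.evil.example.com` bypass.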
Choose Your AI Assistant
Any of these can build your agent. Pick one and start talking.
Claude Code
Anthropic's agentic coding tool. Strong natural-language understanding and multi-file edits. Also runs fully offline through Ollama.
OpenAI Codex CLI
Fast, capable completions. Great all-rounder. Requires OpenAI API key or Plus subscription.
Google Gemini CLI
Generous free tier — 60 requests/min, 1,000/day. Just sign in with your Google account.
Fully Offline. Everything Local.
Run Claude Code itself through Ollama — both the AI assistant that builds your agents and the agents themselves stay on your machine. No API keys. No cloud. No data leaving your network.
```shell
# Quick setup — launches Claude Code with a local model
ollama launch claude

# Or pick a specific model
ollama launch claude --model qwen3.5
```
- 16GB RAM: qwen3.5, glm-4.7-flash
- 32GB RAM: kimi-k2.5:cloud, glm-5:cloud
- 64GB+ RAM: minimax-m2.7, qwen3-coder-next, gemma4
More models at ollama.com/search. Claude Code needs 64k+ context — see Ollama docs to adjust.
Stop Building Frameworks. Start Building Agents.
One file. Zero dependencies. Real work.
Open source. MIT license. You own everything.
Why Agent Builder?
How Agent Builder compares to frameworks and building from scratch.
| | Frameworks | From Scratch | Agent Builder |
|---|---|---|---|
| Dependencies | 50–200+ packages | You manage everything | Zero npm dependencies |
| Setup time | pip/npm install + config | Days of scaffolding | 5 minutes, one script |
| Learning curve | Framework-specific APIs | Everything from zero | Describe in plain language |
| Local models | Often unsupported | You implement it | Ollama by default, free |
| Codebase size | Hundreds of files | Grows uncontrolled | One .js file |
| Skills & MCP | Framework-specific plugins | You build the loader | Drop-in SKILL.md & mcp.json |
| Sandboxing | External wrappers needed | Roll your own | Apple container or Docker, built in |