# orb.toml Config Reference

The `orb.toml` file describes your agent — what it is, how to build it, what ports to expose, and which LLM providers it uses.
## Full Example
```toml
[agent]
name = "my-agent"
lang = "node"
entry = "index.js"
args = ["--port", "3000"]

[agent.env]
HOME = "/root"
NODE_ENV = "development"

[source]
git = "https://github.com/you/your-agent"
branch = "main"

[build]
steps = [
  "npm install",
  "npm run build",
]
working_dir = "/agent/code"

[backend]
provider = "anthropic"

[ports]
expose = [3000]

[resources]
runtime = "2GB"
disk = "8GB"
```
## Sections

### `[agent]` (required)
| Field | Type | Required | Description |
|---|---|---|---|
| `name` | string | yes | Agent display name |
| `lang` | string | yes | One of `node`, `python`, `binary`, `go`, `rust` |
| `entry` | string | yes | Entry point (script path or binary path) |
| `args` | [string] | no | CLI arguments passed to the agent |
Language behavior:

- `node` — runs `node {entry} {args...}`
- `python` — runs `python3 {entry} {args...}`
- `binary` — runs `{entry} {args...}` directly
- `go`, `rust` — same as `binary`
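As a sketch, a Python agent (names and arguments here are hypothetical) sets `lang` and `entry` so the launch command becomes `python3 main.py --verbose`:

```toml
# Hypothetical Python agent: launches as `python3 main.py --verbose`
[agent]
name = "example-python-agent"
lang = "python"
entry = "main.py"
args = ["--verbose"]
```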
### `[agent.env]`
Key-value environment variables set for the agent process.
```toml
[agent.env]
HOME = "/root"
API_KEY = "${MY_SECRET}"
```
The `${VAR}` syntax resolves from org secrets (passed at deploy time).
### `[source]`
| Field | Type | Required | Description |
|---|---|---|---|
| `git` | string | yes | HTTPS URL to git repo |
| `branch` | string | no | Branch to clone (default: `main`) |
Only HTTPS URLs are allowed. The repo is cloned into `/agent/code` inside the computer.
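For example, to build from a non-default branch (the repo URL is a placeholder):

```toml
# Placeholder repo; only HTTPS URLs are accepted
[source]
git = "https://github.com/you/your-agent"
branch = "release"  # defaults to "main" when omitted
```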
### `[build]`
| Field | Type | Required | Description |
|---|---|---|---|
| `steps` | [string] | yes | Shell commands to run in order |
| `working_dir` | string | no | Directory to run build steps in (default: `/agent/code`) |
Each step runs via `bash -c` inside the sandbox with network access. Each step has its own 10-minute timeout.
Available during build:

- Full internet access (npm, pip, apt-get via build steps)
- `/usr/bin`, `/usr/local/bin` — host binaries available
- `npm`, `node`, `python3`, `pip`, `git`, `make`, `gcc` — all available
- `/root/.npm-global/bin` — for globally installed npm packages
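As an illustrative sketch, a Python agent's build section (the package file and commands are assumptions, not part of any preset) might look like:

```toml
# Illustrative Python build: each command runs via `bash -c`, in order
[build]
steps = [
  "pip install -r requirements.txt",
  "python3 -m compileall .",
]
# working_dir defaults to /agent/code, so it can be omitted
```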
### `[backend]`
| Field | Type | Required | Description |
|---|---|---|---|
| `provider` | string | yes | One of `anthropic`, `openai`, `google`, `multi`, `custom` |
Provider presets:

- `anthropic` — routes `/v1/messages` to `api.anthropic.com`
- `openai` — routes `/v1/chat/completions` to `api.openai.com`
- `google` — routes to `generativelanguage.googleapis.com`
- `multi` — Anthropic + OpenAI + Google (for multi-provider agents)
- `custom` — define your own routes (see below)
For custom routing, add `api_base` and `routes`:
```toml
[backend]
provider = "custom"
api_base = "https://openrouter.ai/api"
routes = ["POST /v1/chat/completions"]
forward_headers = ["authorization", "content-type"]
```
### `[ports]`
```toml
[ports]
expose = [3000, 8080]
```
Ports listed here are mapped to public URLs at `{computer-short-id}.orbcloud.dev`. The first port in the list is the default when accessing the subdomain.
### `[resources]`
```toml
[resources]
runtime = "2GB"
disk = "8GB"
```
- `runtime` — RAM budget for the computer (agents are checkpointed to disk when memory pressure is high)
- `disk` — storage budget for code, builds, and data
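For a memory-heavy agent, a larger budget could be specified; this sketch assumes sizes use the same `"<n>GB"` string format shown above:

```toml
# Hypothetical larger budgets; assumes the "<n>GB" string format above
[resources]
runtime = "4GB"
disk = "16GB"
```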