What is BotINFRA?

BotINFRA is compute for agents. You bring a Docker image — we run it in production with hard spend caps, zero shared tenancy, and a remote kill switch.

It is runtime-agnostic. OpenClaw, Manus, LangChain, AutoGPT, CrewAI, or your own Dockerfile — if it runs in Docker, it runs on BotINFRA.

Deploy in 60 seconds

Install the CLI and point it at your image.

# Install
npm install -g @botinfra/cli

# Authenticate
botinfra login

# Deploy your agent
botinfra deploy --image my-agent:latest --budget 500

Live at https://agent-7x2k.botinfra.ai

--budget 500 sets a hard monthly cap of $500. Your agent stops if it hits the limit — not your credit card.

Supported runtimes

Managed profiles are pre-configured Docker images with sensible defaults. You can also bring any image via the Custom tier.

OpenClaw (Managed)

Zero config. Best for shipping today.

Manus (Autonomous)

Persistent multi-step execution for revenue workflows.

NanoClaw (Container)

Tight ceilings and strict isolation.

IronClaw (Enterprise)

Versioned, auditable. Rollback on demand.

ZeroClaw (Privacy)

Zero cross-instance exposure. Regulated workloads.

LangChain (Framework)

Bring your existing LangChain stack.

AutoGPT (Autonomous)

Goal-driven. Set the objective, set the guardrails.

CrewAI (Multi-agent)

Coordinate crews of agents toward a shared goal.

Hermes (Messaging)

Event-driven agent communication and routing.

Custom (BYO)

Your Dockerfile. Your IP. Dedicated infrastructure.

Budget caps

Every deployment requires a --budget flag. This sets the maximum monthly spend in USD. When your agent hits the cap, it receives a SIGTERM and stops.

Budget enforcement runs at the proxy layer — token counts from every LLM call are written to Redis and compared against the cap in real time.

# $100/mo budget
botinfra deploy --image agent:latest --budget 100

# Update budget on a running agent
botinfra budget --agent agent-7x2k --set 250
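The proxy-layer enforcement described above can be sketched as a per-agent running total compared against the cap on every LLM call. This is an illustrative sketch only: an in-memory map stands in for Redis, and the agent IDs, field names, and dollar costs are assumptions, not BotINFRA internals.

```typescript
// Sketch of proxy-side budget enforcement. An in-memory Map stands in
// for the shared Redis ledger; costs are illustrative.
type Usage = { spentUsd: number; capUsd: number };

const ledger = new Map<string, Usage>();

// Record the cost of one LLM call and decide whether the agent may continue.
function recordCall(agentId: string, costUsd: number): "ok" | "over_budget" {
  const usage = ledger.get(agentId);
  if (!usage) throw new Error(`unknown agent: ${agentId}`);
  usage.spentUsd += costUsd;
  return usage.spentUsd >= usage.capUsd ? "over_budget" : "ok";
}

// Example: an agent deployed with --budget 100.
ledger.set("agent-7x2k", { spentUsd: 0, capUsd: 100 });
```

In a real multi-replica proxy the counter would live in shared storage (e.g. a Redis `INCRBYFLOAT` per agent) so every replica sees one ledger, and an `"over_budget"` result would trigger the SIGTERM described above.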

Agent Spec v1

Any HTTP server that implements two endpoints can run on BotINFRA.

GET /health

Returns 200 when the agent is ready to accept messages.

// response
{ "status": "ok", "version": "1.0.0" }

POST /message

Accepts a task and returns a response. Supports streaming via Server-Sent Events.

// request
{ "task": "Summarize this document", "context": {...} }

// response
{ "output": "...", "done": true }
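The two endpoints above can be implemented by any HTTP server. Here is a minimal sketch using Node's built-in http module; the echo logic, port, and version string are placeholders, not part of the spec, and streaming via Server-Sent Events is omitted in favor of a single JSON response.

```typescript
// Minimal Agent Spec v1 server sketch. Handler logic is kept separate
// from the HTTP wiring so it can be tested directly.
import * as http from "node:http";

// GET /health response body.
function health(): { status: string; version: string } {
  return { status: "ok", version: "1.0.0" };
}

// POST /message: accept a task, return a response. Replace this stub
// with your agent's actual reasoning loop.
function handleMessage(body: { task: string; context?: unknown }) {
  return { output: `echo: ${body.task}`, done: true };
}

const server = http.createServer((req, res) => {
  if (req.method === "GET" && req.url === "/health") {
    res.writeHead(200, { "content-type": "application/json" });
    res.end(JSON.stringify(health()));
  } else if (req.method === "POST" && req.url === "/message") {
    let data = "";
    req.on("data", (chunk) => (data += chunk));
    req.on("end", () => {
      res.writeHead(200, { "content-type": "application/json" });
      res.end(JSON.stringify(handleMessage(JSON.parse(data))));
    });
  } else {
    res.writeHead(404);
    res.end();
  }
});

server.listen(Number(process.env.PORT ?? 8080));
```

Package this behind a Dockerfile that exposes the port and `botinfra deploy` can run it like any other image.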