Fastest, hardware-isolated cloud sandbox for AI agents.
Fork in 100 ms. Persistent REPL.
Secure, isolated environments your agents can branch, run code in, and tear down. State carries across calls; forks diverge in milliseconds.
6 ms server-side create · 42 ms hot run_code · 43 ms end-to-end p50 · 78 ms end-to-end p95 · 100 ms fork(n) (measured in-cloud, vs E2B / Daytona / Blaxel)

from podflare import Sandbox
# Create a sandbox
sbx = Sandbox(template="python-datasci")
# Execute code (state persists across calls)
sbx.run_code("print('hello')")
# Fork — spawn N copies of the running sandbox
children = sbx.fork(n=5)
# Merge a branch back into the parent
sbx.merge_into(children[0])
# Destroy the sandbox
sbx.close()

Install the SDK
Or use the MCP server →
Source + adapter packages: GitHub · npm: podflare (with /anthropic, /openai-agents, /ai-sdk subpaths) · PyPI: podflare
fork(n)
Snapshot a running sandbox and spawn N copies; each diverges for free. The primitive that tree-search agents have always wanted.
Persistent REPL
Variables, imports, open files carry across run_code calls. Data-science templates preload pandas, numpy, scipy, and matplotlib.
Every framework
OpenAI Agents SDK, Vercel AI SDK, Anthropic code_execution, MCP. Drop-in for any tool-use pattern.
One sandbox = one full Linux box
Not a container. Not a function. Every Podflare sandbox is a hardware-isolated microVM with a dedicated kernel. Your agent gets root, full internet, a writable filesystem, and a persistent Python REPL that survives across tool calls.
Install anything
pip install, npm install, apt install, git clone. Outbound internet on by default, no allowlist, no proxy. Opt out with Sandbox(egress=False) for untrusted code.
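The opt-out pattern can be sketched as routing untrusted code to an egress-disabled sandbox. The `Sandbox(egress=False)` flag is taken from the text above; the `Sandbox` class here is a minimal local stand-in (not the real SDK) so the sketch runs anywhere, and `sandbox_for` is a hypothetical helper:

```python
# Minimal stand-in for the SDK's Sandbox, recording the sourced
# egress flag; outbound internet is on by default.
class Sandbox:
    def __init__(self, template="python-datasci", egress=True):
        self.template = template
        self.egress = egress

def sandbox_for(code_is_trusted: bool) -> Sandbox:
    # Untrusted code gets no outbound network; trusted code keeps
    # the default-open egress for pip / npm / apt / git.
    return Sandbox(egress=code_is_trusted)

assert sandbox_for(True).egress is True
assert sandbox_for(False).egress is False
```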
Persistent Python REPL
Variables, imports, loaded DataFrames carry across run_code calls. No re-parsing CSVs. No re-importing pandas. Hot exec ~60 ms.
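The mechanics of a persistent REPL can be illustrated locally with Python's own `exec` and a shared namespace. This is a simplified analogy, not the Podflare implementation: each `run_code`-style call executes against the same namespace, so imports and variables from one call remain visible to the next.

```python
# One shared namespace stands in for the sandbox's long-lived REPL.
namespace = {}

def run_code(src: str) -> None:
    # Every call executes in the same globals dict, so state persists.
    exec(src, namespace)

run_code("import math")         # the import persists...
run_code("x = math.sqrt(16)")   # ...so this call can use it
run_code("y = x + 1")           # and this one sees x from the prior call
print(namespace["y"])           # -> 5.0
```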
Writable filesystem
4 GB rootfs default, up to 64 GB. Write files, clone repos, build images. upload() + download() for moving bytes in and out.
fork(n) in ~80 ms
Snapshot a running sandbox and spawn N copy-on-write clones. Each child inherits memory, files, and env. Tree-search agents actually work.
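The fork semantics, where every child starts from an identical snapshot and then diverges independently, can be sketched with plain Python state copies. This is an analogy for the inherited memory/files/env, not the microVM copy-on-write itself:

```python
import copy

# Parent state stands in for a running sandbox's memory, files, env.
parent = {"env": {"TEMPLATE": "python-datasci"}, "visited": ["root"]}

# "fork(n=3)": each child begins from an identical snapshot...
children = [copy.deepcopy(parent) for _ in range(3)]

# ...then each diverges without touching its siblings or the parent.
for i, child in enumerate(children):
    child["visited"].append(f"branch-{i}")

print(parent["visited"])       # -> ['root']  (unchanged)
print(children[1]["visited"])  # -> ['root', 'branch-1']
```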
Persistent Spaces
Freeze to disk on idle, resume into a fresh sandbox later. The running Python process survives — models, caches, REPL state, all preserved.
Tier-scalable
1–16 GB RAM, 2–16 vCPUs, 4–64 GB rootfs, 30 min–24 h lifetimes. Silently clamped to your tier — your code never errors on capacity.
Hardware isolation
KVM-backed virtual machine with its own kernel and page tables. Container escapes don't apply. No cross-tenant L2, memory, or filesystem visibility.
Framework-native
Drop-in for Claude code_execution, OpenAI Agents SDK, Vercel AI SDK, and MCP. One helper per framework.
Multi-region, ~100 ms
us-west + eu today. Full round-trip around 100 ms end-to-end from a fiber uplink. Cloudflare-routed by geo with automatic failover, or pin a region for minimum latency.
Five minutes from sign-up to first sandbox.
Mint a key, install the SDK, you're running.