DockerWorkspace runs the agent server inside a Docker container.
This provides complete isolation from the host system, making it ideal for production deployments, testing, and executing untrusted code safely.
Use DockerWorkspace with a pre-built agent server image for the fastest startup. When you need to build your own image from a base image, switch to DockerDevWorkspace.
The Docker sandbox image ships with features configured in the Dockerfile (e.g., secure defaults and services like VSCode and VNC exposed behind well-defined ports), which are not available in the local (non-Docker) agent server.
1) Basic Docker Sandbox
Key Concepts
DockerWorkspace Context Manager
The DockerWorkspace uses a context manager to automatically handle the container lifecycle (see the sketch after this list):
- Pulls or builds the Docker image
- Starts the container with an agent server
- Waits for the server to be ready
- Cleans up the container when done
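A minimal usage sketch of that lifecycle; the import path and constructor parameters (server_image, host_port) are assumptions, so check the linked example for the exact signature.

```python
from openhands.workspace import DockerWorkspace  # assumed import path

# Entering the context pulls the image, starts the container with the agent
# server, and waits for the server to be ready; exiting removes the container.
with DockerWorkspace(
    server_image="ghcr.io/openhands/agent-server:latest-python",  # assumed pre-built image tag
    host_port=8010,                                               # assumed host port mapping
) as workspace:
    ...  # create your Conversation against `workspace` here
```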
Platform Detection
The example includes platform detection to ensure the correct Docker image is built and used.
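A rough sketch of what such a check can look like; how the detected platform string is passed to the workspace is left out here and depends on the example.

```python
import platform

# Map the host architecture to a Docker platform string so the matching
# image variant is used (e.g., Apple Silicon vs. x86_64 hosts).
arch = platform.machine().lower()
docker_platform = "linux/arm64" if arch in ("arm64", "aarch64") else "linux/amd64"
print(f"Detected host arch {arch!r}; using Docker platform {docker_platform}")
```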
Testing the Workspace
Before creating a conversation, the example tests the workspace connection.
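Continuing the workspace block above, a hedged sketch of such a smoke test; the execute_command helper and the fields on its result are assumptions rather than a confirmed API.

```python
# Run a trivial command inside the container to confirm the agent server is
# reachable before starting a conversation.
result = workspace.execute_command("echo 'workspace ready'")  # assumed helper
if result.exit_code != 0:
    raise RuntimeError(f"Workspace check failed: {result.stderr}")
print(result.stdout.strip())
```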
Automatic RemoteConversation
When you use a DockerWorkspace, the Conversation automatically becomes a RemoteConversation.
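Roughly, with `agent` and `workspace` coming from the earlier steps; the import paths are assumptions, and the isinstance check is only there to make the point explicit.

```python
from openhands.sdk import Conversation                     # assumed import path
from openhands.sdk.conversation import RemoteConversation  # assumed import path

# Because `workspace` is a DockerWorkspace, the factory hands back a
# RemoteConversation that proxies every action to the agent server
# running inside the container.
conversation = Conversation(agent=agent, workspace=workspace)
assert isinstance(conversation, RemoteConversation)
```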
DockerWorkspace vs DockerDevWorkspace
Use DockerWorkspace when you can rely on the official pre-built images for the agent server. Switch to DockerDevWorkspace when you need to build or customize the image on demand (slower startup; requires the SDK source tree and Docker build support).
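For the on-demand build path, a sketch along these lines; base_image comes from the description above, while the import path, the image name, and the other parameters are assumptions.

```python
from openhands.workspace import DockerDevWorkspace  # assumed import path

# Builds an agent-server image on top of your own base image before starting
# the container -- slower than DockerWorkspace, but fully customizable.
with DockerDevWorkspace(
    base_image="nikolaik/python-nodejs:python3.12-nodejs22",  # example base image
    host_port=8010,
) as workspace:
    ...  # use `workspace` exactly as you would a DockerWorkspace
```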
Ready-to-run Example Docker Sandbox
This example is available on GitHub: examples/02_remote_agent_server/02_convo_with_docker_sandboxed_server.py
The model name should follow the LiteLLM convention:
provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.
2) VS Code in Docker Sandbox
VS Code with Docker demonstrates how to enable VS Code Web integration in a Docker-sandboxed environment. This allows you to access a full VS Code editor running in the container, making it easy to inspect, edit, and manage the files the agent is working with.
Key Concepts
VS Code-Enabled DockerWorkspace
The workspace is configured with extra ports for VS Code access. The extra_ports=True setting exposes (see the sketch after this list):
- Port host_port + 1: VS Code Web interface
- Port host_port + 2: VNC viewer for visual access
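A sketch of that port layout, reusing the assumed constructor from earlier plus the extra_ports flag:

```python
from openhands.workspace import DockerWorkspace  # assumed import path

host_port = 8010
with DockerWorkspace(
    server_image="ghcr.io/openhands/agent-server:latest-python",  # assumed image tag
    host_port=host_port,
    extra_ports=True,  # also publish host_port + 1 (VS Code Web) and host_port + 2 (VNC)
) as workspace:
    print(f"Agent server: http://localhost:{host_port}")
    print(f"VS Code Web:  http://localhost:{host_port + 1}")
    print(f"VNC viewer:   http://localhost:{host_port + 2}")
```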
If you need to build or customize the image, use DockerDevWorkspace with the same parameters and provide base_image/target to build it on demand.
VS Code URL Generation
The example retrieves the VS Code URL with an authentication token.
VS Code URL Format
- vscode_port: Usually host_port + 1 (e.g., 8011)
- token: Authentication token for security
- workspace_dir: Workspace directory to open
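A purely illustrative sketch of how those pieces combine into a URL; the query parameter names are assumptions, and in practice you print the URL the example retrieves from the workspace.

```python
host_port = 8010
vscode_port = host_port + 1   # VS Code Web is exposed one port above the agent server
token = "<auth-token>"        # placeholder for the generated authentication token
workspace_dir = "/workspace"  # assumed workspace directory inside the container

# The parameter names `tkn` and `folder` are assumptions for illustration only.
vscode_url = f"http://localhost:{vscode_port}/?tkn={token}&folder={workspace_dir}"
print(vscode_url)
```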
Ready-to-run Example VS Code
This example is available on GitHub: examples/02_remote_agent_server/05_vscode_with_docker_sandboxed_server.py
The model name should follow the LiteLLM convention:
provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.
3) Browser in Docker Sandbox
Browser with Docker demonstrates how to enable browser automation capabilities in a Docker-sandboxed environment. This allows agents to browse websites, interact with web content, and perform web automation tasks while maintaining complete isolation from your host system.
Key Concepts
Browser-Enabled DockerWorkspace
The workspace is configured with extra ports for browser access. The extra_ports=True setting exposes additional ports for:
- Port host_port + 1: VS Code Web interface
- Port host_port + 2: VNC viewer for browser visualization
To build or customize the image, replace DockerWorkspace with DockerDevWorkspace and provide base_image/target to build it before launch.
Enabling Browser Tools
Browser tools are enabled by setting cli_mode=False. With cli_mode=False, the agent gains access to browser automation tools for web interaction (see the sketch below).
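A sketch assuming the SDK exposes a default-agent helper that accepts a cli_mode flag; both the helper and the LLM constructor arguments shown here are assumptions, so follow the linked example for the exact calls.

```python
from openhands.sdk import LLM                                # assumed import path
from openhands.sdk.preset.default import get_default_agent   # assumed helper

llm = LLM(model="anthropic/claude-sonnet-4-5-20250929", api_key="...")  # placeholder key
# cli_mode=False keeps the browser toolset enabled; cli_mode=True would limit
# the agent to terminal/file tools only.
agent = get_default_agent(llm=llm, cli_mode=False)
```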
When VNC is available and extra_ports=True, the browser is opened in the VNC desktop so you can watch the agent's work in real time.
VNC Access
The VNC interface provides real-time visual access to the browser (see the sketch below):
- autoconnect=1: Automatically connect to the VNC server
- resize=remote: Automatically adjust the resolution
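A small sketch that assembles such a VNC URL; the /vnc.html path is an assumption (the linked example prints the real URL).

```python
host_port = 8010
vnc_port = host_port + 2  # VNC is exposed two ports above the agent server
# autoconnect=1 connects immediately; resize=remote adapts the resolution.
vnc_url = f"http://localhost:{vnc_port}/vnc.html?autoconnect=1&resize=remote"
print(vnc_url)
```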
Ready-to-run Example Browser
This example is available on GitHub: examples/02_remote_agent_server/03_browser_use_with_docker_sandboxed_server.py
The example configures a DockerWorkspace with browser capabilities and VNC access.
The model name should follow the LiteLLM convention:
provider/model_name (e.g., anthropic/claude-sonnet-4-5-20250929, openai/gpt-4o).
The LLM_API_KEY should be the API key for your chosen provider.
Next Steps
- Local Agent Server
- Agent Server Overview - Architecture and implementation details
- API Sandboxed Server - Connect to hosted API service
- Agent Server Package Architecture - Remote execution architecture

