Distributed Workers
Scale Your AI Agent Fleet

Connect remote machines to distribute AI agent workloads. One unified dashboard to manage agents running across your entire infrastructure—from your laptop to cloud servers.

Enterprise-Grade Distribution

Manage agents across unlimited machines from one dashboard

Automatic Discovery

Workers connect via WebSocket and automatically register themselves. No complex configuration—just add the SSH connection and you're done.
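To make the registration step concrete, here is a minimal TypeScript sketch of what a worker might send once its connection opens. The message shape, endpoint path, and port are illustrative assumptions, not AI Maestro's actual wire format.

// Sketch of a worker announcing itself after the WebSocket opens.
// Message shape, endpoint path, and port are illustrative assumptions.
import { hostname } from "node:os";
import WebSocket from "ws"; // assumes the `ws` package on the worker

interface WorkerRegistration {
  type: "register";
  hostname: string;
  platform: NodeJS.Platform;
  capabilities: string[];
}

const ws = new WebSocket("ws://dashboard.example.com:8080/workers");

ws.on("open", () => {
  const registration: WorkerRegistration = {
    type: "register",
    hostname: hostname(),
    platform: process.platform,
    capabilities: ["tmux"],
  };
  ws.send(JSON.stringify(registration));
});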

Health Monitoring

Real-time health checks show worker status, CPU usage, memory consumption, and network connectivity. Know immediately when something goes wrong.
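A health report could be modeled as a small typed payload like the sketch below. The field names and the 30-second staleness threshold are assumptions for illustration, not the dashboard's real schema.

// Illustrative shape of a periodic worker health report.
interface WorkerHealth {
  hostname: string;
  status: "online" | "degraded" | "offline";
  cpuPercent: number;     // 0-100
  memoryUsedMb: number;
  memoryTotalMb: number;
  pingMs: number;         // round-trip latency of the last health check
  reportedAt: string;     // ISO 8601 timestamp
}

// Flag workers that have gone quiet so the dashboard can surface them.
function isStale(report: WorkerHealth, maxAgeMs = 30_000): boolean {
  return Date.now() - Date.parse(report.reportedAt) > maxAgeMs;
}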

Unified Dashboard

All worker sessions appear in one hierarchical view. No switching between dashboards—manage everything from a single interface.
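One plausible way to back that hierarchy is a simple worker-to-sessions tree, sketched below with illustrative type names rather than AI Maestro's actual data model.

// Hypothetical model for the hierarchical view: each worker node owns
// its sessions, and the sidebar renders the flattened labels.
interface SessionSummary {
  name: string;        // tmux session name
  attached: boolean;
}

interface WorkerNode {
  hostname: string;    // e.g. "local" for the dashboard machine itself
  sessions: SessionSummary[];
}

function sessionLabels(workers: WorkerNode[]): string[] {
  return workers.flatMap((worker) =>
    worker.sessions.map((session) => `${worker.hostname}/${session.name}`)
  );
}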

Remote Session Management

Create, attach, and manage tmux sessions on remote workers just like local sessions. Full terminal streaming over SSH with low latency.
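Under the hood, remote session management presumably reduces to ordinary tmux commands driven by control messages. The sketch below shows one possible mapping; the message types are made up for illustration, not AI Maestro's actual protocol.

// Hypothetical control messages and how a worker might map them onto
// plain tmux invocations.
import { execFile } from "node:child_process";

type SessionCommand =
  | { type: "create"; session: string }
  | { type: "kill"; session: string };

function handleSessionCommand(cmd: SessionCommand): void {
  if (cmd.type === "create") {
    // Start a detached session the dashboard can attach to later.
    execFile("tmux", ["new-session", "-d", "-s", cmd.session]);
  } else {
    execFile("tmux", ["kill-session", "-t", cmd.session]);
  }
  // Attaching is handled separately by streaming the pane over the WebSocket.
}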

Resource Tracking

Monitor CPU, memory, disk usage, and active sessions per worker. Identify bottlenecks and optimize workload distribution.
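For example, a per-worker snapshot plus a simple load heuristic is enough to spot the busiest machine. The shape below is illustrative, and the heuristic is a naive example rather than AI Maestro's actual scheduling logic.

// Illustrative per-worker resource snapshot.
interface WorkerMetrics {
  hostname: string;
  cpuPercent: number;
  memoryPercent: number;
  diskPercent: number;
  activeSessions: number;
}

// A naive way to choose where the next agent should run: pick the
// worker with the lowest combined CPU and memory load.
function leastLoaded(workers: WorkerMetrics[]): WorkerMetrics | undefined {
  return [...workers].sort(
    (a, b) => a.cpuPercent + a.memoryPercent - (b.cpuPercent + b.memoryPercent)
  )[0];
}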

Cross-Platform

Works on macOS, Linux, and Windows WSL2. Mix and match different platforms in your worker fleet—the dashboard handles them all.

How It Works

Simple architecture, powerful capabilities

Worker Connection Flow

1 Configure Worker

Add SSH connection details in Settings (hostname, port, username). AI Maestro must be installed on the remote machine.

2 WebSocket Handshake

Dashboard establishes WebSocket connection via SSH tunnel. Worker registers itself with hostname, platform, and capabilities.

3 Session Sync

Worker sends list of all tmux sessions. Sessions appear in your dashboard with worker hostname prefix.

4 Real-Time Streaming

Click any remote session to attach. Terminal I/O streams over the WebSocket with full xterm.js rendering, as sketched below.
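To make steps 2 through 4 concrete, here is a minimal browser-side sketch that binds a remote session's socket to an xterm.js terminal. The endpoint URL and its query parameters are assumptions; only the Terminal and AttachAddon calls reflect xterm.js's public API.

// Attach an xterm.js terminal to a remote session's WebSocket.
// The URL and its query parameters are illustrative assumptions.
import { Terminal } from "@xterm/xterm";
import { AttachAddon } from "@xterm/addon-attach";

const socket = new WebSocket(
  "ws://localhost:8080/sessions?worker=worker-1&session=agent-3"
);

const term = new Terminal();
term.loadAddon(new AttachAddon(socket)); // pipes terminal I/O both ways
term.open(document.getElementById("terminal")!);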

Example Architecture

┌─────────────────────┐
│  Your Laptop        │
│  AI Maestro         │
│  Dashboard          │
└──────┬──────────────┘
       │
       ├──[WebSocket]───► Worker 1 (macOS)
       │                  10 agents running
       │
       ├──[WebSocket]───► Worker 2 (Linux)
       │                  8 agents running
       │
       └──[WebSocket]───► Worker 3 (Cloud)
                          15 agents running

Total: 33 agents across 3 workers
All manageable from one dashboard

When to Use Distributed Workers

Scale beyond a single machine

Resource-Intensive Projects

Large codebases or compute-heavy tasks can overwhelm a single machine. Distribute agents across multiple workers to parallelize work and speed up development.

Platform-Specific Testing

Test cross-platform code by running agents on macOS, Linux, and Windows workers simultaneously. Catch platform-specific bugs before they reach production.

Secure Environments

Keep sensitive code on secure machines while controlling agents from your laptop. Workers stay behind firewalls; only the SSH-tunneled WebSocket connection is needed.

Team Collaboration

Share a pool of powerful worker machines with your team. Everyone connects to the same workers but manages their own agents independently.

Quick Start

Set up your first worker in 5 minutes

1 Install AI Maestro on Worker

# SSH into your remote machine
ssh user@remote-worker

# Clone and install AI Maestro
git clone https://github.com/23blocks-OS/ai-maestro.git
cd ai-maestro
./install.sh

2 Configure Worker in Dashboard

# In AI Maestro dashboard Settings page:
# Add new worker:
#   - Hostname: remote-worker.example.com
#   - Port: 22
#   - Username: your-username
#   - SSH Key: ~/.ssh/id_rsa (optional)

3 Start Using Remote Sessions

Worker sessions automatically appear in your dashboard. Click any session to attach and start coding!

Scale Your AI Fleet
Across Unlimited Machines

Start with one worker, scale to hundreds. The dashboard stays the same.