Intermediate • 15-Minute Setup

Multiple Computers
Distributed AI Agent Architecture

Scale beyond one machine. Run AI agents across your laptop, desktop, cloud servers, and team members' computers with SSH-powered auto-discovery and seamless orchestration.

Why Distributed Architecture?

Unlock the power of multiple machines working together

Workload Distribution

Split heavy tasks across machines. Run frontend agents on your laptop while backend services run on a powerful desktop or cloud instance.

Cloud Resource Access

Connect to AWS, GCP, or Azure instances for GPU-accelerated tasks, large-scale testing, or data-intensive operations without leaving your dashboard.

Team Collaboration

Share worker machines with team members. Everyone sees the same dashboard, same agents, same real-time terminals. Perfect for remote teams.

Resource Monitoring

See CPU, memory, and network usage across all worker machines from one dashboard. Optimize agent placement based on real-time metrics.

Secure SSH Connections

All communication runs over SSH with key-based authentication: no extra ports to expose, no additional services to secure. Works with existing SSH configs and jump hosts.

Auto-Discovery

Configure SSH hosts once. AI Maestro automatically discovers tmux sessions on all worker machines and displays them in your dashboard.
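
Under the hood, discovery amounts to listing tmux sessions over SSH. The exact command AI Maestro runs isn't documented here, but conceptually it's equivalent to something like this, per configured worker:

# Roughly what auto-discovery does for each worker (hostname from the example config below)
ssh desktop.local "tmux list-sessions -F '#{session_name}'"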

How It Works

Manager/Worker architecture powered by SSH

┌─────────────────────────────────────────────────────────────────┐
│                     MANAGER MACHINE (Your Laptop)               │
│                                                                 │
│  ┌─────────────────────────────────────────────────────────┐  │
│  │              AI Maestro Dashboard                       │  │
│  │          http://localhost:23000                         │  │
│  └─────────────────────────────────────────────────────────┘  │
│                           │                                    │
│                           │ SSH Auto-Discovery                 │
│                           ▼                                    │
│  ┌──────────────┐  ┌──────────────┐  ┌──────────────┐        │
│  │ Local Agents │  │ Worker 1     │  │ Worker 2     │        │
│  │ (localhost)  │  │ (desktop)    │  │ (cloud)      │        │
│  └──────────────┘  └──────────────┘  └──────────────┘        │
└─────────────────────────────────────────────────────────────────┘
                           │                 │
                           │                 │
          ┌────────────────┴─────┬───────────┴─────────────┐
          │                      │                         │
          ▼                      ▼                         ▼
┌──────────────────┐   ┌──────────────────┐   ┌──────────────────┐
│ WORKER 1         │   │ WORKER 2         │   │ WORKER 3         │
│ Desktop (Linux)  │   │ Cloud (AWS EC2)  │   │ Team Machine     │
│                  │   │                  │   │                  │
│ tmux sessions:   │   │ tmux sessions:   │   │ tmux sessions:   │
│ • backend-api    │   │ • ml-training    │   │ • integration    │
│ • database       │   │ • data-proc      │   │ • e2e-tests      │
│ • redis          │   │ • gpu-tasks      │   │ • perf-tests     │
└──────────────────┘   └──────────────────┘   └──────────────────┘

1. Install on Manager

git clone https://github.com/23blocks-OS/ai-maestro.git
cd ai-maestro
./install.sh

Install AI Maestro on your main machine (laptop/desktop). This becomes your manager.
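
To confirm the manager is up, you can hit the dashboard URL shown in the diagram above (port 23000 by default):

# Quick sanity check - should print an HTTP status line
curl -sI http://localhost:23000 | head -n 1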

2. Install on Workers

# On each worker machine
git clone https://github.com/23blocks-OS/ai-maestro.git
cd ai-maestro
./install-worker.sh

Minimal installation on worker machines - just tmux and SSH.
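
Before adding a worker to the dashboard, it's worth confirming the prerequisites are in place. A minimal check from the manager, assuming the worker is reachable as desktop.local:

# Verify SSH access and tmux on a worker (hostname is an example)
ssh desktop.local 'tmux -V'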

3. Configure SSH

# ~/.aimaestro/workers.yml
workers:
  - name: desktop
    host: desktop.local
  - name: cloud
    host: ec2-user@aws-instance
  - name: team-server
    host: team@company.com

Define workers once. Auto-discovery handles the rest.
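
Auto-discovery depends on key-based SSH authentication, so each worker needs your public key. If that isn't already set up, standard OpenSSH tooling covers it:

# One-time key setup (skip if key-based SSH already works)
ssh-keygen -t ed25519
ssh-copy-id desktop.local
ssh-copy-id ec2-user@aws-instance
ssh-copy-id team@company.com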

That's It!

All tmux sessions from all workers appear in your dashboard automatically.

Pro Tip: Workers can be anywhere - local network, cloud, behind VPN, or even team members' machines. As long as SSH works, AI Maestro works.

Perfect For

Common scenarios where distributed architecture shines

Power Users

You have a laptop for mobility and a desktop for heavy workloads. Run lightweight agents on your laptop, but offload database migrations, large builds, or ML training to your desktop.

Laptop: UI agents, API testing
Desktop: Database, Redis, ML training
Control: All from laptop dashboard

Remote Work

Working from coffee shop with limited bandwidth? Keep heavy tasks running on your home office machine. Access them securely via SSH from anywhere.

Coffee Shop: Active development
Home Office: Background jobs, builds
Same workflow: Seamless switching

Team Collaboration

Share access to worker machines across your team. Everyone can monitor integration tests, access shared development databases, or collaborate on pair programming sessions.

Shared Workers: CI/CD, staging env
Personal Workers: Individual dev
One Dashboard: Unified team view

Cloud + Local Hybrid

Keep fast feedback loops local, but leverage cloud GPUs or large instances for specialized tasks. Spin up/down cloud workers as needed while maintaining local control.

Local: Fast iteration cycles
AWS/GCP: GPU training, big data
Cost: Only pay when cloud is used

Advanced Features

Power user capabilities

Jump Hosts & Bastion Servers

Access workers behind corporate firewalls or VPNs using SSH ProxyJump configuration.

# ~/.ssh/config
Host production-worker
  HostName 10.0.1.50
  ProxyJump bastion.company.com
  User deployer
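
The matching workers.yml entry simply references the Host alias; given the full SSH config support described below, AI Maestro should resolve it through ~/.ssh/config:

# ~/.aimaestro/workers.yml
workers:
  - name: production
    host: production-worker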

Custom SSH Ports & Keys

Full SSH config support - custom ports, specific identity files, connection options.

# ~/.ssh/config
Host cloud-gpu
  HostName gpu.example.com
  Port 2222
  IdentityFile ~/.ssh/gpu_key
  StrictHostKeyChecking no
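
A quick way to confirm the alias works end to end before listing it in workers.yml:

# If this succeeds, the worker entry should as well
ssh cloud-gpu 'echo connected'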

Resource Limits

Set CPU/memory limits per worker to prevent any single agent from consuming all resources.

# workers.yml
workers:
  - name: desktop
    host: desktop.local
    limits:
      cpu: 50%
      memory: 8GB
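
How these limits are enforced isn't detailed here. For a sense of what a 50% CPU / 8 GB cap means in practice, this is the rough manual equivalent on a Linux worker (illustrative only, not a statement about AI Maestro's internals):

# Capping a tmux session by hand with systemd-run (Linux, user session)
systemd-run --user --scope -p CPUQuota=50% -p MemoryMax=8G tmux new-session -d -s backend-api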

Auto-Reconnection

Workers go offline? No problem. AI Maestro automatically reconnects when they're back online.

# Automatic behavior:
# - Retry, starting at 30-second intervals
# - Exponential backoff on repeated failures
# - Mark offline in dashboard
# - Reconnect when available
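
For reference, the retry-with-backoff pattern those comments describe looks roughly like this in shell (a sketch only; AI Maestro handles this internally, and the hostname and timings are examples):

# Reconnect loop: fixed starting interval, doubling on each failure
delay=30
until ssh -o ConnectTimeout=5 desktop.local true; do
  echo "worker offline, retrying in ${delay}s"
  sleep "$delay"
  delay=$(( delay * 2 ))   # exponential backoff
done
echo "worker reachable again"
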
NEW in v0.15.0

Portable Agents

Export, transfer, clone, and back up your AI agents across any of your machines

See full Agent Management guide →

Export Agents

Export any agent to a portable .zip file. Includes configuration, memory database, message history, and git repository references.

Agent Settings → Export → Downloads .zip file

Import Anywhere

Drop .zip files into AI Maestro on any machine. Automatically restores configuration, memory, and messages. Offers to clone git repositories.

New Host → Import Agent → Select file → Restored!

Clone Agents

Create copies of agents on the same machine or across hosts. Perfect for spinning up multiple similar agents or creating backups before experiments.

Export → Import on same machine → New agent clone

Backup & Restore

Keep backups of your critical agents. Restore from any saved export. Perfect for team onboarding or disaster recovery.

Weekly exports → Cloud storage → Instant recovery
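
If you want to script this, a weekly cron entry that syncs your exported .zip files to object storage is enough; the export directory and bucket below are placeholders:

# crontab: every Sunday at 03:00, push agent exports to S3 (paths are hypothetical)
0 3 * * 0  aws s3 sync ~/ai-maestro-exports/ s3://your-backup-bucket/ai-maestro/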

What's in an Export?

Configuration: Registry settings
Memory DB: CozoDB database
Messages: Inbox & sent
Git Repos: Remote URLs

Machine Migration

Moving to a new laptop? Export all agents, import them on the new machine, and you're back to work in minutes.

Team Onboarding

Share pre-configured agent setups with new team members. Everyone starts with the same optimized configurations.

Cross-Host Transfer

Move agents between local machine and worker hosts. Optimize placement based on workload requirements.

When to Consider Docker?

Multi-computer mode is excellent for distributed workloads. Consider Docker when you need:

→ Environment Isolation

Complete dependency isolation without affecting the host system

→ Reproducible Builds

Identical environments across development, staging, and production

Scale Your AI Agents
Across All Your Machines

Install in 15 minutes. Works with existing SSH setup. Unlimited workers.