

🚀 Deployment Guide

Randal runs anywhere Bun runs. This guide covers three deployment patterns: a local Mac Mini, Railway (cloud), and importing Randal as a library into an existing project.

The marketing/docs site is deployed separately as static Astro output on Vercel. See Public Site Deployment for Vercel previews, PostHog, domains, and public-site setup. That path is not required for self-hosting randal serve or deploying the Railway runtime.


📋 Prerequisites

All deployments require:

  • Bun >= 1.1
  • OpenCode agent CLI installed and on PATH (opencode)
  • Either provider API keys or a local OpenCode/OpenAI login you can bootstrap once with opencode auth login
  • Meilisearch (optional but recommended for production memory/search)
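
A quick pre-flight sketch to confirm the required tools are on PATH before you start (docker matters only if you want the Dockerized Meilisearch):

```shell
# Check each required tool; print a warning instead of failing outright.
for cmd in bun opencode docker; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok: $cmd"
  else
    echo "missing: $cmd"
  fi
done
```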

🍎 Mac Mini (Local)

A Mac Mini is the simplest deployment: Randal runs as a background process with launchd or a process manager.

1. Install

curl -fsSL https://raw.githubusercontent.com/drewbietron/randal/main/install.sh | bash

This single command:

  • Installs Bun (if not present)
  • Clones the Randal repo to ~/randal
  • Installs dependencies and links the randal CLI
  • Runs the interactive setup wizard
  • Starts Meilisearch via Docker (if selected and Docker is available)

Or manually:

git clone <repo-url> ~/randal
cd ~/randal
bun install && bun link
randal init
randal setup

2. Configure

cd ~/randal  # use examples/local-mac/ as a starting point
cp .env.example .env
# Edit .env with your API keys
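
The configs in this guide reference RANDAL_API_TOKEN and MEILI_MASTER_KEY; one way to generate strong values and append them to .env, assuming openssl is installed:

```shell
# Generate 64-char hex secrets for the gateway and Meilisearch.
echo "RANDAL_API_TOKEN=$(openssl rand -hex 32)" >> .env
echo "MEILI_MASTER_KEY=$(openssl rand -hex 32)" >> .env
```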

Example randal.config.yaml:

name: home-agent
runner:
  defaultAgent: opencode
  defaultModel: anthropic/claude-sonnet-4
  workdir: ~/dev
credentials:
  envFile: ./.env
  allow: [ANTHROPIC_API_KEY]
gateway:
  channels:
    - type: http
      port: 7600
      auth: "${RANDAL_API_TOKEN}"
memory:
  store: meilisearch
  url: http://localhost:7701
  apiKey: "${MEILI_MASTER_KEY}"

3. Meilisearch

Note: If you used install.sh or randal init, Meilisearch is already running via Docker with persistent storage at ~/.randal/meili-data/.

To manage manually:

# Start via Docker (recommended — data persists at ~/.randal/meili-data/)
docker run -d --name randal-meilisearch --restart unless-stopped \
  -p 7701:7700 \
  -v ~/.randal/meili-data:/meili_data \
  -e MEILI_MASTER_KEY="${MEILI_MASTER_KEY}" \
  getmeili/meilisearch:v1.12

# Or via Homebrew (use --db-path for persistence)
meilisearch --master-key="${MEILI_MASTER_KEY}" --db-path ~/.randal/meili-data

4. Start Randal

cd ~/randal
randal serve

Optional published-artifact bootstrap on any deployment target:

export RANDAL_MANAGED_CONTROL_PLANE_URL=https://control-plane.example
export RANDAL_MANAGED_WORKSPACE_ID=workspace-123
export RANDAL_MANAGED_VERSION_ID=version-7
randal serve

Failure modes:

  • bootstrap endpoint unreachable: startup fails
  • version export missing: startup fails with published-artifact lookup error
  • malformed YAML export: startup fails before the runtime begins serving

If you do not set managed bootstrap env vars, startup remains local-first.
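
A hypothetical pre-flight check (not part of Randal itself) that catches a half-configured managed bootstrap before randal serve fails at startup:

```shell
# If a control-plane URL is set, a workspace ID must accompany it.
if [ -n "${RANDAL_MANAGED_CONTROL_PLANE_URL:-}" ] && [ -z "${RANDAL_MANAGED_WORKSPACE_ID:-}" ]; then
  echo "error: RANDAL_MANAGED_WORKSPACE_ID is required with a control-plane URL" >&2
  exit 1
fi
echo "managed bootstrap env looks consistent"
```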

To run as a persistent background service, create a launchd plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>Label</key>
  <string>com.randal.agent</string>
  <key>ProgramArguments</key>
  <array>
    <string>/Users/you/.bun/bin/bun</string>
    <string>/Users/you/randal/packages/cli/src/index.ts</string>
    <string>serve</string>
  </array>
  <key>WorkingDirectory</key>
  <string>/Users/you/randal</string>
  <key>RunAtLoad</key>
  <true/>
  <key>KeepAlive</key>
  <true/>
  <key>StandardOutPath</key>
  <string>/tmp/randal.out.log</string>
  <key>StandardErrorPath</key>
  <string>/tmp/randal.err.log</string>
</dict>
</plist>
Save this as com.randal.agent.plist, then load it:

cp com.randal.agent.plist ~/Library/LaunchAgents/
launchctl load ~/Library/LaunchAgents/com.randal.agent.plist

5. Verify

curl http://localhost:7600/health
# {"status":"ok","uptime":...,"version":"0.1.0"}

# Open dashboard
open http://localhost:7600/

🚂 Railway (Cloud)

Railway provides a simple container hosting platform. The official Randal Docker image bundles everything — Bun, Meilisearch, OpenCode, and Randal — so you only need a single service.

For voice on Railway, the intended first working path is PSTN/Twilio. Browser voice remains supported, but is secondary and stays behind authenticated HTTP admin routes.

For voice deployments, treat Railway as a public gateway unless you explicitly put the service behind private networking or another authenticated edge. Only /, /health, and /assets/* are intentionally public. Browser token issuance (POST /api/voice/token) and voice status (GET /voice/status) stay behind normal HTTP auth.

1. Project Structure

Create a deployment directory with:

my-deployment/
  randal.config.yaml
  Dockerfile
  railway.toml

2. Dockerfile

FROM ghcr.io/drewbietron/randal:latest

# Copy your config
COPY randal.config.yaml /app/randal.config.yaml

# Copy knowledge files (if any)
# COPY knowledge/ /app/knowledge/

The official image includes an embedded Meilisearch instance for agent memory. No separate Meilisearch service is needed.

3. Railway Configuration

In the Railway dashboard or the GitHub Actions deploy workflow:

  1. Create a new project.
  2. Add a custom service pointing to your repo.
  3. Set environment variables.

If you use .github/workflows/railway-deploy.yml, set these as GitHub Actions repository secrets because the workflow copies secrets into Railway. Your local .env is not copied into Railway by that workflow.

| Variable | Value |
| --- | --- |
| OPENCODE_AUTH_JSON | Preferred for GPT Pro / OpenCode OAuth on Railway. Paste the contents of ~/.local/share/opencode/auth.json after a local opencode auth login. |
| OPENROUTER_API_KEY | Optional alternative for OpenRouter-based model access. |
| OPENAI_API_KEY | Optional alternative for direct OpenAI API billing. |
| ANTHROPIC_API_KEY | Optional alternative for direct Anthropic API billing. |
| RANDAL_API_TOKEN | A generated secret for API auth. |
| MEILI_MASTER_KEY | A generated secret for the embedded Meilisearch. |
| RANDAL_VOICE_PUBLIC_URL | Public HTTPS/WSS base URL for Randal's /voice/* routes. |
| LIVEKIT_URL | LiveKit server URL. |
| LIVEKIT_API_KEY | LiveKit API key. |
| LIVEKIT_API_SECRET | LiveKit API secret. |
| DEEPGRAM_API_KEY | Deepgram STT key. |
| ELEVENLABS_API_KEY | ElevenLabs TTS key. |
| ELEVENLABS_VOICE_ID | ElevenLabs voice ID. |
| TWILIO_ACCOUNT_SID | Twilio account SID. |
| TWILIO_AUTH_TOKEN | Twilio auth token. |
| TWILIO_PHONE_NUMBER | Twilio phone number in E.164 format. |
  4. Set the deploy config to use your Dockerfile.

Use a dedicated Twilio subaccount for this deployment. The current PSTN runtime expects TWILIO_ACCOUNT_SID + TWILIO_AUTH_TOKEN + TWILIO_PHONE_NUMBER, not Twilio API keys.

4. Preferred GPT Pro / OpenCode OAuth Flow

This is the shortest path if you want Railway to run through an existing OpenCode login instead of API billing.

  1. Bootstrap locally once:
opencode auth login
  2. Confirm the local auth file exists:
ls ~/.local/share/opencode/auth.json
  3. Copy the file contents into the Railway secret named OPENCODE_AUTH_JSON.
  4. Deploy with runner.opencodeAuth.mode: openai-oauth, runner.opencodeAuth.authFileEnv: OPENCODE_AUTH_JSON, runner.defaultModel: openai/gpt-5.4, and runner.opencode.variant: xhigh.

Concrete Railway config:

name: my-cloud-agent
runner:
  defaultAgent: opencode
  defaultModel: openai/gpt-5.4
  workdir: /app/workspace
  opencodeAuth:
    mode: openai-oauth
    authFileEnv: OPENCODE_AUTH_JSON
  opencode:
    variant: xhigh
credentials:
  envFile: ./.env
  allow: [OPENCODE_AUTH_JSON, OPENROUTER_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, RANDAL_API_TOKEN, MEILI_MASTER_KEY]
  inherit: [PATH, HOME, USER, SHELL, TERM, OPENCODE_AUTH_JSON, OPENROUTER_API_KEY, OPENAI_API_KEY, ANTHROPIC_API_KEY, MEILI_MASTER_KEY]
gateway:
  channels:
    - type: http
      port: 7600
      auth: "${RANDAL_API_TOKEN}"
memory:
  store: meilisearch
  url: http://127.0.0.1:7700
  apiKey: "${MEILI_MASTER_KEY}"
  index: memory-cloud-agent

Operational caveat: the initial OpenAI/OpenCode login is local and interactive, but steady-state Railway runtime is headless after the secret is in place.

Security note: OPENCODE_AUTH_JSON is reusable session state. Store it only as a Railway secret, do not commit it, do not print it in logs, and rotate it by re-running opencode auth login if you suspect exposure.
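
Before creating the secret, it can also help to confirm the file is valid JSON; this sketch uses Python's stdlib json.tool, but any JSON validator works:

```shell
# Validate the OpenCode auth file locally before pasting it into Railway.
python3 -m json.tool ~/.local/share/opencode/auth.json > /dev/null \
  && echo "auth.json is valid JSON"
```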

API-based paths remain supported. If you prefer explicit provider billing instead of subscription auth, keep using OPENROUTER_API_KEY, OPENAI_API_KEY, or ANTHROPIC_API_KEY with the same Railway deployment flow.

5. Alternative API-Key Config for Railway

If you do not want to use the OAuth/session-based flow above, this API-key example remains supported:

name: my-cloud-agent
runner:
  defaultAgent: opencode
  defaultModel: anthropic/claude-sonnet-4
  workdir: /app/workspace
credentials:
  envFile: ./.env
  allow: [ANTHROPIC_API_KEY]
gateway:
  channels:
    - type: http
      port: 7600
      auth: "${RANDAL_API_TOKEN}"
memory:
  store: meilisearch
  url: http://127.0.0.1:7700
  apiKey: "${MEILI_MASTER_KEY}"
  index: memory-cloud-agent

Note: The embedded Meilisearch binds to 127.0.0.1:7700 inside the container. For an external Meilisearch instance, set RANDAL_SKIP_MEILISEARCH=true and point the URL to your Meilisearch service.

Managed mode on Railway is optional. If you want the deployed runtime to start from the active published workspace artifact instead of a repo-local config file, provide:

| Variable | Value |
| --- | --- |
| RANDAL_MANAGED_CONTROL_PLANE_URL | Control-plane base URL. |
| RANDAL_MANAGED_WORKSPACE_ID | Managed workspace ID. |
| RANDAL_MANAGED_VERSION_ID | Optional fixed version pin. |
| RANDAL_MANAGED_ARTIFACT_URL | Optional explicit export URL override. |
| RANDAL_CONTROL_PLANE_STATE_DIR | Persistent directory for managed workspace state, ideally on a Railway volume. |

6. Deploy

# Via Railway CLI
railway up

# Or push to your connected Git repo for auto-deploy
git push origin main

Voice Deployment Guidance

Shared-instance voice is supported, but the safest production posture is split deployment:

  1. Admin/browser voice instance: private or tightly restricted, authenticated browser token issuance, full admin voice access.
  2. External/PSTN voice instance: public-facing, explicit external grants, no ambient admin trust unless intentionally configured.

Use one shared instance only if you accept the larger blast radius of putting a single gateway in front of both admin and external voice paths.


📦 Importing Randal into an Existing Project

You can add a Randal agent to an existing project by extending the official Docker image. This is ideal for adding an AI agent alongside your own codebase, knowledge base, or application.

How It Works

  1. Your Dockerfile extends ghcr.io/drewbietron/randal:latest (includes Bun, Meilisearch, OpenCode, Randal)
  2. You copy your randal.config.yaml into the image
  3. You ship whatever files your agent needs (codebase, knowledge, data)
  4. The official entrypoint handles Meilisearch startup and randal serve
  5. For custom pre-start logic, add a pre-start.sh hook

1. Project Structure

your-project/
  randal.config.yaml    # agent configuration
  Dockerfile            # extends the official Randal image
  knowledge/            # optional: files your agent needs
  pre-start.sh          # optional: custom startup logic

2. Dockerfile

FROM ghcr.io/drewbietron/randal:latest

# Copy your config
COPY randal.config.yaml /app/randal.config.yaml

# Ship whatever your agent needs
COPY knowledge/ /app/knowledge/

# Optional: custom pre-start logic (e.g., DB sync)
# COPY pre-start.sh /app/pre-start.sh

The official image handles everything else — Meilisearch starts automatically, Randal serves on port 7600.

3. Pre-Start Hook

If you need custom logic before Randal starts (database sync, file setup, etc.), create a pre-start.sh. The Randal entrypoint sources this automatically:

#!/bin/bash
# pre-start.sh — runs before Randal starts

echo "Pulling data from my database..."
bun /app/scripts/sync-data.mjs || echo "Sync failed, continuing"

4. Security

The Docker container is the isolation boundary. Recommended config for imported usage:

sandbox:
  enforcement: env-scrub

runner:
  workdir: /app/workspace
  allowedWorkdirs:
    - /app/workspace

credentials:
  allow: [ANTHROPIC_API_KEY]  # only what the agent needs

See SECURITY.md for the full security model.

5. External Meilisearch (Optional)

By default, the official image runs an embedded Meilisearch instance at 127.0.0.1:7700. If you want to use an external Meilisearch instance instead:

  1. Set RANDAL_SKIP_MEILISEARCH=true to skip the embedded instance
  2. Point memory.url in your config to the external instance
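
A sketch of the two changes together; the URL is a placeholder for your own Meilisearch service:

```yaml
# Set in the container environment (not in YAML): RANDAL_SKIP_MEILISEARCH=true

memory:
  store: meilisearch
  url: http://meili.internal.example:7700  # placeholder for your external instance
  apiKey: "${MEILI_MASTER_KEY}"
```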

6. Programmatic Usage (Advanced)

For full programmatic control, override the CMD to run your own entry point:

FROM ghcr.io/drewbietron/randal:latest
COPY randal.config.yaml /app/randal.config.yaml
COPY index.ts /app/index.ts
CMD ["bun", "run", "/app/index.ts"]

And in index.ts:

import { createRandal } from "@randal/harness";

const randal = await createRandal({
  configPath: "./randal.config.yaml",
});

See examples/imported-service/ for a complete working example.


🔍 Meilisearch Setup

Meilisearch is the sole memory backend — required for all memory operations, cross-agent sharing, and auto-injection. It is auto-installed on first randal serve.

📦 Local Install

# macOS
brew install meilisearch

# Linux (binary)
curl -L https://install.meilisearch.com | sh

# Docker
docker run -d -p 7700:7700 \
  -e MEILI_MASTER_KEY='your-master-key' \
  -v $(pwd)/meili-data:/meili_data \
  getmeili/meilisearch:latest

⚙️ Configuration

Randal auto-configures Meilisearch indexes on first connect. It sets:

  • Searchable attributes: content, category, type, source
  • Filterable attributes: type, category, source, file, timestamp
  • Sortable attributes: timestamp

No manual index setup is needed. Just provide the URL and API key in your Randal config:

memory:
  store: meilisearch
  url: http://localhost:7701
  apiKey: "${MEILI_MASTER_KEY}"

🤝 Cross-Agent Shared Index

For multiple agents in a posse to share learnings:

# Agent A config
memory:
  store: meilisearch
  url: http://localhost:7701
  apiKey: "${MEILI_MASTER_KEY}"
  index: memory-agent-a
  sharing:
    publishTo: shared
    readFrom: [shared]

# Agent B config
memory:
  store: meilisearch
  url: http://localhost:7701
  apiKey: "${MEILI_MASTER_KEY}"
  index: memory-agent-b
  sharing:
    publishTo: shared
    readFrom: [shared]

Both agents publish to and read from the shared index while maintaining their own private indexes. The shared index is auto-created on first write.

🔒 Production Considerations

  • Set a strong MEILI_MASTER_KEY (used as the admin API key).
  • Use persistent storage (--db-path or a Docker volume).
  • Meilisearch is single-node; for HA, use Meilisearch Cloud.
  • Memory is append-only by design. Indexes grow over time. Monitor disk usage.
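
On a Mac Mini deployment using the paths from this guide, a quick way to spot-check index growth:

```shell
# Report the on-disk size of the Meilisearch data directory.
du -sh ~/.randal/meili-data
```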

💬 Discord Setup

  1. Go to discord.com/developers/applications and create a new application.
  2. Navigate to Bot settings. Click Reset Token to generate a bot token. Copy it.
  3. Under Privileged Gateway Intents, enable Message Content Intent.
  4. Navigate to OAuth2 > URL Generator. Select scopes: bot. Select permissions: Send Messages, Read Message History, View Channels.
  5. Copy the generated URL and open it in a browser to invite the bot to your server.
  6. Add the token to your .env and config:
# .env
DISCORD_BOT_TOKEN=your-bot-token-here

# randal.config.yaml
gateway:
  channels:
    - type: discord
      token: "${DISCORD_BOT_TOKEN}"
      allowFrom: ["your-discord-user-id"]

Discord works on all platforms — local Mac, Railway, Docker, anywhere Randal runs.

For the full Discord feature reference (slash commands, buttons, threads, per-server config), see the Discord Integration Guide.

Channel Commands Reference

| Command | Example | Description |
| --- | --- | --- |
| run: <prompt> | run: refactor auth | Start a new job |
| status | status | Show all active jobs |
| status: <id> | status: abc1 | Show a specific job |
| stop | stop | Stop the most recent job |
| stop: <id> | stop: abc1 | Stop a specific job |
| context: <text> | context: focus on tests | Inject context |
| jobs | jobs | List all jobs |
| memory: <query> | memory: auth patterns | Search memory |
| resume: <id> | resume: abc1 | Resume a failed job |
| help | help | Show commands |

Unrecognized messages are treated as implicit run: commands.


🔑 Environment Variables Reference

VariableUsed ByDescription
RANDAL_API_TOKEN📡 Gateway HTTP authBearer token for API authentication.
MEILI_MASTER_KEY🧠 Memory (Meilisearch)Meilisearch admin API key.
ANTHROPIC_API_KEY🤖 Agent (Claude/OpenCode)Anthropic API key for model access.
OPENROUTER_API_KEY🤖 Agent / EmbedderOpenRouter API key (if using OpenRouter models).
OPENAI_API_KEY🔌 EmbedderOpenAI API key (if using OpenAI embeddings).
DISCORD_BOT_TOKEN💬 Discord channelDiscord bot token for the Discord adapter.