
Remote Deployment

msgvault supports a remote-first workflow where you configure a remote instance using a local browser session, then deploy and sync on headless hardware. This works with any always-on host: a NAS (a good choice for RAID fault tolerance), a cloud VM, a Raspberry Pi, or any Linux server with Docker.

The flow is built on three capabilities:

  • the msgvault setup interactive wizard, which can generate a deployment bundle
  • msgvault export-token, which uploads OAuth tokens over the API
  • a [remote] config section for running local CLI commands against a remote server

Setup Flow Overview

  1. Configure OAuth credentials and choose the remote target in msgvault setup
  2. Copy the generated nas-bundle directory to your remote host
  3. Run docker-compose up -d on the remote host
  4. Add a Gmail account locally and export the token to the remote host
  5. Run the initial full sync on the remote host
  6. Use the msgvault API or CLI in remote mode

Docker Image

The image is published to GitHub Container Registry:

docker pull ghcr.io/wesm/msgvault:latest
Tag           Description
latest        Latest stable release from the main branch
v1.2.3        Specific version
1.2           Latest patch of a minor version
1             Latest minor/patch of a major version
sha-abc1234   Specific commit (for debugging)

Architectures: linux/amd64 (Intel/AMD NAS, standard servers) and linux/arm64 (Raspberry Pi 4/5, newer NAS). Docker selects the correct one automatically.

1) Interactive Setup and Bundle Generation

Run the wizard once after installing msgvault:

msgvault setup

If you already have OAuth configured, you can skip that step during the flow. If you choose to configure a remote server, the wizard:

  • prompts for remote hostname/IP and port
  • generates a random API key
  • creates <MSGVAULT_HOME>/nas-bundle
  • writes a server-ready config.toml
  • copies client_secret.json into the bundle
  • writes a docker-compose.yml for deployment
  • prints the command for uploading the OAuth token once the account is added
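The generated key is a 32-byte random value rendered as hex (matching the `<32-byte-hex-key>` placeholder in the config below). A minimal sketch of equivalent key generation, using Python's secrets module rather than msgvault's actual implementation:

```python
import secrets

def generate_api_key() -> str:
    """Generate a 32-byte random key, hex-encoded (64 characters)."""
    return secrets.token_hex(32)

print(generate_api_key())
```

Any source of 32 bytes of cryptographic randomness works equally well; the important property is that the key is unguessable, since it gates all API access.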

Bundle Contents

From the local machine, the wizard creates:

ls -la ~/.msgvault/nas-bundle
  • config.toml — preconfigured server config for the remote container
  • client_secret.json — copied OAuth credentials
  • docker-compose.yml — ready-to-run Compose service

Example config.toml Generated by Setup

[server]
bind_addr = "0.0.0.0"
api_port = 8080
api_key = "<32-byte-hex-key>"

[oauth]
client_secrets = "/data/client_secret.json"

[sync]
rate_limit_qps = 5

# Accounts will be added automatically when you export tokens.
# [[accounts]] can be added manually if needed.

Example docker-compose.yml Generated by Setup

services:
  msgvault:
    image: ghcr.io/wesm/msgvault:latest
    container_name: msgvault
    user: root
    restart: unless-stopped
    ports:
      - "8080:8080"
    volumes:
      - ./:/data
    environment:
      - TZ=America/Los_Angeles
      - MSGVAULT_HOME=/data
    command: ["serve"]
    healthcheck:
      test: ["CMD", "wget", "-qO/dev/null", "http://localhost:8080/health"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 10s

2) Deploy to Remote Host

Copy the bundle and start services via SSH:

# Copy the generated bundle
scp -r ~/.msgvault/nas-bundle user@remote-host:/opt/msgvault
# Start services on the remote host
ssh user@remote-host "cd /opt/msgvault && docker-compose up -d"

Verify the service:

curl http://remote-host:8080/health

See Platform Notes for Synology, QNAP, and Raspberry Pi-specific paths.

3) Provision Gmail Tokens

For each mailbox:

  1. Add the account locally (requires a browser):

     msgvault add-account you@gmail.com

  2. Export the token to the remote endpoint:

     msgvault export-token you@gmail.com \
       --to http://nas-ip:8080 --api-key YOUR_API_KEY --allow-insecure

The command uploads the token to POST /api/v1/auth/token/{email} and registers the account via POST /api/v1/accounts with:

  • the default sync schedule (0 2 * * *)
  • the account enabled

If you did not configure remote details during setup, you can also set:

export MSGVAULT_REMOTE_URL=http://nas-ip:8080
export MSGVAULT_REMOTE_API_KEY=YOUR_API_KEY
msgvault export-token you@gmail.com --allow-insecure

4) Run Initial Full Sync

The scheduler and sync API run incremental syncs only, and incremental sync requires a completed full sync. Run the initial full sync inside the container:

# Required — scheduled sync will not work without this
docker exec msgvault msgvault sync-full you@gmail.com
# Optional: test with a small batch first
docker exec msgvault msgvault sync-full you@gmail.com --limit 100

After the full sync completes, scheduled syncs run automatically on the cron schedule registered during token export (0 2 * * * by default). You can also trigger a manual sync via the API:

# Trigger incremental sync via API (only works after full sync)
curl -X POST -H "X-API-Key: YOUR_API_KEY" http://remote-host:8080/api/v1/sync/you@gmail.com
# Check schedule status
curl -H "X-API-Key: YOUR_API_KEY" http://remote-host:8080/api/v1/scheduler/status

5) Verify Setup

# Check token was saved
docker exec msgvault ls -la /data/tokens/
# Check daemon logs
docker logs msgvault
# Verify scheduled sync is registered
curl -H "X-API-Key: YOUR_API_KEY" http://remote-host:8080/api/v1/scheduler/status

After setup, your data directory contains:

/opt/msgvault/            # (or wherever you deployed the bundle)
├── config.toml           # Server configuration
├── client_secret.json    # Google OAuth credentials
├── docker-compose.yml    # Compose service definition
├── msgvault.db           # SQLite database (created on first run)
├── tokens/               # OAuth tokens (one per account)
│   └── you@gmail.com.json
├── attachments/          # Content-addressed attachment storage
└── analytics/            # Parquet cache for fast queries

Using the Local CLI Against Remote

When your local machine config has:

[remote]
url = "http://remote-host:8080"
api_key = "YOUR_API_KEY"
allow_insecure = true

search, stats, list-accounts, and show-message automatically query the remote API. Use --local if you explicitly want to query local SQLite instead.

Platform Notes

Synology DSM

  1. Install Container Manager (Docker) from Package Center
  2. Create a shared folder for data (e.g., /volume1/docker/msgvault)
  3. Use Container Manager UI or SSH to run docker-compose

Synology uses ACLs that can override standard Unix permissions. The generated bundle already includes user: root in docker-compose.yml to handle this. If you’re writing your own Compose file, add user: root to the service.

Via SSH:

cd /volume1/docker/msgvault
docker-compose up -d

QNAP

  1. Install Container Station from App Center
  2. Create a folder for data (e.g., /share/Container/msgvault)
  3. Use Container Station or SSH to run docker-compose

Raspberry Pi

Works on Pi 4 and Pi 5 with a 64-bit OS:

# Verify 64-bit OS
uname -m # Should show aarch64
# Standard docker-compose setup
docker-compose up -d

Initial sync of large mailboxes will be slower on Pi hardware. Use --limit to test with a small batch first.

Security Notes

  • Use Tailscale. The recommended way to access your NAS remotely is via Tailscale. It encrypts all traffic and avoids the need for TLS certificates, port forwarding, or reverse proxies. Use your Tailscale hostname (e.g., http://nas.tail12345.ts.net:8080) with --allow-insecure.
  • API key protects all API access. The server requires api_key for non-loopback addresses. Anyone with the key can read your entire archive, so treat it like a password.
  • Don’t expose port 8080 to the internet. msgvault is designed for trusted networks. If you need internet access, use Tailscale rather than opening ports on your router.
  • The generated bundle sets user: root in Docker Compose, which works around common NAS ACL quirks (for example Synology). On a standard Linux server you can change this to a non-root user.
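For a standard Linux server, a sketch of the non-root override in docker-compose.yml. The 1000:1000 UID/GID is an assumption; match it to whichever user owns the data directory:

```yaml
services:
  msgvault:
    # Run as an unprivileged user instead of root.
    # The UID:GID must own ./ (mounted at /data) or writes will fail.
    user: "1000:1000"
```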

Container Management

# View logs
docker logs msgvault
docker logs -f msgvault # Follow
# Run msgvault commands inside the container
docker exec msgvault msgvault stats
docker exec -it msgvault msgvault tui # Interactive TUI
# Restart
docker-compose restart
# Update to latest image
docker-compose pull
docker-compose up -d
# Stop
docker-compose down

Health Checks

The container includes a health check that polls /health every 30 seconds.

docker inspect --format='{{.State.Health.Status}}' msgvault
# Returns: healthy, unhealthy, or starting

Backups

Back up the data directory regularly:

# Stop the container for a consistent backup
docker-compose stop
tar -czf ~/msgvault-backup-$(date +%Y%m%d).tar.gz -C /opt/msgvault .
docker-compose start

Critical files:

  • msgvault.db — email metadata and bodies
  • tokens/ — OAuth tokens (re-auth required if lost)
  • config.toml — configuration
  • attachments/ — email attachments (large, optional if you can re-sync)

Cron Schedule Reference

The schedule field in [[accounts]] uses standard cron format (5 fields):

┌───────────── minute (0-59)
│ ┌─────────── hour (0-23)
│ │ ┌───────── day of month (1-31)
│ │ │ ┌─────── month (1-12)
│ │ │ │ ┌───── day of week (0-6, 0=Sunday)
│ │ │ │ │
* * * * *

Schedule        Description
0 2 * * *       Daily at 2:00 AM
0 */6 * * *     Every 6 hours
*/30 * * * *    Every 30 minutes
0 8,18 * * *    Twice daily at 8 AM and 6 PM
0 2 * * 0       Weekly on Sunday at 2 AM
0 2 1 * *       Monthly on the 1st at 2 AM
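As noted in the generated config.toml, [[accounts]] entries can also be written by hand. A sketch of one such entry; field names other than schedule are assumptions about the schema, not documented keys:

```toml
# Hypothetical manual entry — verify field names against your generated config
[[accounts]]
email = "you@gmail.com"       # assumed field name
enabled = true                # assumed field name
schedule = "0 */6 * * *"      # every 6 hours
```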

Troubleshooting

Export fails with HTTPS required

msgvault export-token requires HTTPS by default. If your endpoint is http://, add --allow-insecure.

401/authorization errors from export

Check that X-API-Key matches the server’s [server] api_key and that /api/v1/auth/token/{email} is reachable.

Sync fails with “no history ID” or “run full sync first”

The scheduler and sync API only run incremental syncs. You must run a full sync first:

docker exec msgvault msgvault sync-full you@gmail.com

Account not syncing after import

Verify the account exists in the server config:

curl -H "X-API-Key: YOUR_API_KEY" http://remote-host:8080/api/v1/accounts

If missing, re-run export-token, which also posts account metadata.