# Backups & Restore
Regular database backups are essential for any production deployment. Drizzle migrations are forward-only; there is no automatic rollback, so backups are your safety net.
## Quick Backup

```sh
docker compose -f compose.prod.yaml exec -T db \
  pg_dump -U ${POSTGRES_USER:-archvault} ${POSTGRES_DB:-archvault} \
  > backup-$(date +%Y%m%d-%H%M%S).sql
```

If PostgreSQL is running on a separate server:
```sh
pg_dump -h db.example.com -U archvault archvault > backup-$(date +%Y%m%d-%H%M%S).sql
```

## Quick Restore
```sh
docker compose -f compose.prod.yaml exec -T db \
  psql -U ${POSTGRES_USER:-archvault} ${POSTGRES_DB:-archvault} \
  < backup-20260317-143000.sql
```

Against a remote server:

```sh
psql -h db.example.com -U archvault archvault < backup-20260317-143000.sql
```

## Backup Formats
`pg_dump` supports multiple output formats; the two covered here are plain SQL and the custom format. Choose based on your needs:
### Plain SQL

Human-readable SQL dump. Easy to inspect and edit.

```sh
# Backup
pg_dump -U archvault archvault > backup.sql

# Restore
psql -U archvault archvault < backup.sql
```

- **Pros:** readable, portable, easy to grep/edit
- **Cons:** larger file size, slower restore on large databases
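Because plain dumps are just text, ordinary text tools work on them directly. A quick sketch of the "easy to grep" point (the inline dump below is a stand-in for real `pg_dump` output; a real dump works the same):

```shell
# Create a stand-in plain SQL dump (real pg_dump output works the same way).
cat > backup.sql <<'EOF'
CREATE TABLE users (id serial PRIMARY KEY, email text);
CREATE TABLE posts (id serial PRIMARY KEY, user_id integer, body text);
COPY users (id, email) FROM stdin;
1	alice@example.com
\.
EOF

# List every table the dump would create, with line numbers.
grep -n '^CREATE TABLE' backup.sql

rm backup.sql
```

The same approach works for spot-editing a dump before restore, e.g. deleting a single `COPY` block you do not want to load.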
### Custom Format

Compressed binary format with parallel restore support.

```sh
# Backup (compressed)
pg_dump -U archvault -Fc archvault > backup.dump

# Restore
pg_restore -U archvault -d archvault --clean --if-exists backup.dump
```

- **Pros:** smaller files (~3-5x compression), faster parallel restore, selective table restore
- **Cons:** not human-readable
## Automated Backups
Set up a cron job for daily backups with 7-day retention:

1. Create a backup script, `backup-archvault.sh`:

   ```sh
   #!/bin/sh
   set -e

   BACKUP_DIR="/opt/archvault/backups"
   COMPOSE_FILE="/opt/archvault/compose.prod.yaml"
   RETENTION_DAYS=7

   mkdir -p "$BACKUP_DIR"

   # Create backup
   docker compose -f "$COMPOSE_FILE" exec -T db \
     pg_dump -U archvault -Fc archvault \
     > "$BACKUP_DIR/archvault-$(date +%Y%m%d-%H%M%S).dump"

   # Remove backups older than retention period
   find "$BACKUP_DIR" -name "archvault-*.dump" -mtime +$RETENTION_DAYS -delete

   echo "Backup complete. $(ls -1 "$BACKUP_DIR"/archvault-*.dump | wc -l) backups retained."
   ```

2. Make it executable:

   ```sh
   chmod +x backup-archvault.sh
   ```

3. Add a cron job (daily at 3 AM):

   ```sh
   crontab -e
   ```

   ```
   0 3 * * * /opt/archvault/backup-archvault.sh >> /var/log/archvault-backup.log 2>&1
   ```
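One subtlety in the retention step: `find -mtime +7` matches files whose age, truncated to whole days, exceeds 7, i.e. files at least 8 full days old. You can exercise the rule locally without Docker by backdating dummy files. A sketch, assuming GNU `touch -d`:

```shell
# Sketch: test the cleanup rule in a throwaway directory (no Docker needed).
# The find expression matches the one in the backup script above.
set -e
BACKUP_DIR=$(mktemp -d)
RETENTION_DAYS=7

touch "$BACKUP_DIR/archvault-fresh.dump"
touch -d '10 days ago' "$BACKUP_DIR/archvault-stale.dump"

find "$BACKUP_DIR" -name 'archvault-*.dump' -mtime +$RETENTION_DAYS -delete

ls "$BACKUP_DIR"   # only archvault-fresh.dump remains

rm -rf "$BACKUP_DIR"
```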
## Verifying Backups

An untested backup is not a backup. Periodically verify by restoring to a throwaway container:
```sh
# Start a temporary PostgreSQL container
docker run --rm -d \
  --name archvault-backup-test \
  -e POSTGRES_USER=archvault \
  -e POSTGRES_PASSWORD=testpass \
  -e POSTGRES_DB=archvault \
  postgres:18-alpine

# Wait for it to be ready
sleep 3

# Restore the backup
docker exec -i archvault-backup-test \
  pg_restore -U archvault -d archvault --clean --if-exists \
  < /opt/archvault/backups/archvault-20260317-030000.dump

# Verify the tables are present
docker exec archvault-backup-test \
  psql -U archvault -d archvault -c "\dt" | tail -3

# Clean up
docker stop archvault-backup-test
```

## Offsite Storage
For disaster recovery, copy backups to cloud storage:
```sh
# AWS S3
aws s3 cp backup.dump s3://my-bucket/archvault/backup-$(date +%Y%m%d).dump

# Google Cloud Storage
gsutil cp backup.dump gs://my-bucket/archvault/backup-$(date +%Y%m%d).dump

# Backblaze B2
b2 upload-file my-bucket backup.dump archvault/backup-$(date +%Y%m%d).dump
```

Add the upload command to the end of your automated backup script for hands-off offsite copies.
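It is also worth recording a checksum next to each dump before uploading, so the offsite copy can be verified after download. This catches truncation or corruption in transit, not logical problems; only a restore test proves the backup works. A minimal sketch (the `printf` line stands in for a real dump file):

```shell
# At backup time: record a checksum alongside the dump.
printf 'stand-in dump contents\n' > backup.dump   # placeholder for a real dump
sha256sum backup.dump > backup.dump.sha256

# After downloading the offsite copy: verify it matches.
sha256sum -c backup.dump.sha256   # prints "backup.dump: OK" on success

rm backup.dump backup.dump.sha256
```

Upload the `.sha256` file together with the dump so the check can run on the restore side.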