job-config.json
{
  "config": {
    "env": {
      "AWS_ACCESS_KEY_ID": "${AWS_ACCESS_KEY_ID}",
      "AWS_SECRET_ACCESS_KEY": "${AWS_SECRET_ACCESS_KEY}",
      "DATABASE_URL": "${DATABASE_URL}",
      "S3_BUCKET": "${S3_BUCKET}"
    },
    "script": "#!/bin/bash\nset -e\nDUMP_FILE=\"/tmp/backup-$(date +%Y%m%d-%H%M%S).sql\"\necho \"Starting PostgreSQL backup...\"\npg_dump \"$DATABASE_URL\" > \"$DUMP_FILE\"\necho \"Uploading to S3...\"\naws s3 cp \"$DUMP_FILE\" \"s3://$S3_BUCKET/backups/\"\nrm \"$DUMP_FILE\"\necho \"Backup completed successfully!\"",
    "timeout": 300
  },
  "type": "shell"
}
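The script writes each dump to a timestamped path under /tmp, so successive runs never overwrite each other. The naming line can be tried on its own:

```shell
# Same naming scheme as the script in the config above:
# /tmp/backup-YYYYMMDD-HHMMSS.sql
DUMP_FILE="/tmp/backup-$(date +%Y%m%d-%H%M%S).sql"
echo "$DUMP_FILE"
```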

Import via API

curl -X POST http://localhost:8080/api/jobs \
  -H "Content-Type: application/json" \
  -d '{"config":{"env":{"AWS_ACCESS_KEY_ID":"${AWS_ACCESS_KEY_ID}","AWS_SECRET_ACCESS_KEY":"${AWS_SECRET_ACCESS_KEY}","DATABASE_URL":"${DATABASE_URL}","S3_BUCKET":"${S3_BUCKET}"},"script":"#!/bin/bash\nset -e\nDUMP_FILE=\"/tmp/backup-$(date +%Y%m%d-%H%M%S).sql\"\necho \"Starting PostgreSQL backup...\"\npg_dump \"$DATABASE_URL\" > \"$DUMP_FILE\"\necho \"Uploading to S3...\"\naws s3 cp \"$DUMP_FILE\" \"s3://$S3_BUCKET/backups/\"\nrm \"$DUMP_FILE\"\necho \"Backup completed successfully!\"","timeout":300},"type":"shell"}'

Or copy the config above and create the job manually via the web UI.
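Rather than pasting the JSON inline, you can also save the config as job-config.json and post it from the file with curl's `-d @file` form. The sketch below writes the config inline only so it is self-contained, validates it with python3 (an assumption about available tooling), and shows the POST, which requires the scheduler to be running on localhost:8080 as in the example above:

```shell
# Write the catalog entry to a file.
cat > job-config.json <<'EOF'
{
  "config": {
    "env": {
      "AWS_ACCESS_KEY_ID": "${AWS_ACCESS_KEY_ID}",
      "AWS_SECRET_ACCESS_KEY": "${AWS_SECRET_ACCESS_KEY}",
      "DATABASE_URL": "${DATABASE_URL}",
      "S3_BUCKET": "${S3_BUCKET}"
    },
    "script": "#!/bin/bash\nset -e\nDUMP_FILE=\"/tmp/backup-$(date +%Y%m%d-%H%M%S).sql\"\necho \"Starting PostgreSQL backup...\"\npg_dump \"$DATABASE_URL\" > \"$DUMP_FILE\"\necho \"Uploading to S3...\"\naws s3 cp \"$DUMP_FILE\" \"s3://$S3_BUCKET/backups/\"\nrm \"$DUMP_FILE\"\necho \"Backup completed successfully!\"",
    "timeout": 300
  },
  "type": "shell"
}
EOF

# Catch JSON mistakes before they reach the API.
python3 -m json.tool job-config.json > /dev/null && echo "config is valid JSON"

# Post it (requires the scheduler running on localhost:8080):
#   curl -X POST http://localhost:8080/api/jobs \
#     -H "Content-Type: application/json" -d @job-config.json
```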

PostgreSQL Backup to S3

Dumps a PostgreSQL database with pg_dump and uploads the result to an S3 bucket with the AWS CLI. Well suited to scheduled database backups with automatic cloud storage.
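Restoring is roughly the reverse of the job above: fetch the dump from S3, then feed it to psql (plain-SQL dumps from pg_dump are restored with psql, not pg_restore). The bucket name and object key below are illustrative assumptions, not part of this catalog entry; the live commands are left commented since they require the AWS CLI, psql, and real credentials:

```shell
#!/bin/bash
set -e
# Hypothetical values -- substitute your own bucket and dump key.
S3_BUCKET="my-backups"
KEY="backups/backup-20250115-020000.sql"
RESTORE_FILE="/tmp/restore.sql"

# The actual transfer and restore (requires the AWS CLI and psql):
#   aws s3 cp "s3://$S3_BUCKET/$KEY" "$RESTORE_FILE"
#   psql "$DATABASE_URL" -f "$RESTORE_FILE"
echo "restore s3://$S3_BUCKET/$KEY via $RESTORE_FILE"
```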

Job Type: SHELL
Category: Backup & Recovery
Tags: #postgres #s3 #backup #aws #database
Added on January 15, 2025