Author Topic: Backup CWP to an Amazon S3 bucket  (Read 6357 times)


Backup CWP to an Amazon S3 bucket
« on: September 07, 2021, 06:23:45 PM »
Hey folks, sharing this here if anyone needs it!

I host CWP on Amazon EC2, needed a better backup solution because:
  • standard EC2 (EBS) volumes are expensive.
  • the cheaper (per GB) volume types for backups start at 500GB minimum, so they end up expensive anyway.
  • CWP's native backups suck.
  • CWP backups to FTP have serious versioning issues: the daily backup simply overwrites the previous day's.

To do this, you'll need to create Amazon security credentials here: https://console.aws.amazon.com/iam/home?#/security_credentials
Go to the section titled "Access keys (access key ID and secret access key)".
Create yourself an 'access key'.

Install Amazon client on your CWP server:
Code: [Select]
# sudo yum install -y python-pip
Code: [Select]
# sudo pip install awscli
Code: [Select]
# sudo pip install futures
Code: [Select]
# pip uninstall -y rsa
Code: [Select]
# pip install -v rsa==4.0
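Before moving on, it's worth confirming the client actually landed on your PATH. A quick sanity check (not part of the original steps, just a standard CLI check):

```shell
# Sanity check: is the aws binary on PATH, and does it run?
if command -v aws >/dev/null 2>&1; then
    aws --version
else
    echo "aws cli not found - check the pip install output" >&2
fi
```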

Set up the Amazon connection to S3 and create a bucket
Code: [Select]
# aws configure
In here, add your access keys and AWS region like so:
Code: [Select]
AWS Access Key ID [None]: ACCESSKEYID
AWS Secret Access Key [None]: ACCESSKEY
Default region name [None]: us-west-2
Default output format [None]: json

Once set up, create your S3 bucket (replace 'servername-backups' with your desired bucket name; note that S3 bucket names must be DNS-compliant, so use hyphens rather than underscores):
Code: [Select]
# aws s3 mb s3://servername-backups
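Since the daily FTP backup overwriting the previous day's was one of the reasons for moving to S3, you can optionally enable bucket versioning so overwritten objects keep their old copies. A sketch (the bucket name is a placeholder):

```shell
# Optional: keep old copies of any overwritten backup objects
# (skip quietly if the CLI isn't installed yet)
command -v aws >/dev/null 2>&1 || { echo "aws cli required" >&2; exit 0; }

aws s3api put-bucket-versioning \
    --bucket servername-backups \
    --versioning-configuration Status=Enabled
</imports>
```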
Create a backup script (if you don't have nano, use vi instead or install nano with 'yum install nano'):
Code: [Select]
# nano backups-s3.sh
Add the backup script content, sample of mine here:
https://gist.github.com/FreshLondon/0e2bd776ba69ab542b1afef0ecdd0db9
Don't forget to add your S3 bucket name at the top of the file, and change the /home/ directory path if yours isn't standard.
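If you just want the shape of such a script before opening the gist, here's a rough sketch of the per-account loop. The bucket name, paths, and /tmp staging location are assumptions; the real script in the gist also handles databases and vmail separately:

```shell
#!/bin/bash
# Rough sketch of a per-account backup loop to S3.
# Bucket name, /home layout, and /tmp staging path are assumptions.
BUCKET="servername-backups"     # your S3 bucket name
HOMEDIR="${HOMEDIR:-/home}"     # change if your home path isn't standard
STAMP=$(date +%F)               # e.g. 2021-09-07

# Exit quietly if the AWS CLI isn't installed yet
command -v aws >/dev/null 2>&1 || { echo "aws cli required" >&2; exit 0; }

for userdir in "$HOMEDIR"/*/; do
    [ -d "$userdir" ] || continue
    user=$(basename "$userdir")
    archive="/tmp/${user}-${STAMP}.tar.gz"

    # Archive one account, upload it, then delete the local copy
    # immediately to preserve disk space on the server
    tar -czf "$archive" -C "$HOMEDIR" "$user" || continue
    aws s3 cp "$archive" "s3://${BUCKET}/${STAMP}/${user}.tar.gz" \
        || echo "upload failed for $user" >&2
    rm -f "$archive"
done
```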

Save the file, then make that script executable:
Code: [Select]
# chmod +x backups-s3.sh
Run the script:
Code: [Select]
# ./backups-s3.sh
Add a cron job for the script to run every night:
Open crontab:
Code: [Select]
# crontab -e
Add a new line at the end of the file:
Code: [Select]
0 3 * * * /path/to/backups-s3.sh > /dev/null 2>&1
The above runs your backup script every day at 3am (according to the server's clock).
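If you'd rather have S3 prune old backups than delete them from the server, a lifecycle rule can expire objects automatically. A sketch, where the bucket name and the 30-day retention are assumptions, adjust to taste:

```shell
# Optional: have S3 delete backups older than 30 days automatically
# (skip quietly if the CLI isn't installed)
command -v aws >/dev/null 2>&1 || { echo "aws cli required" >&2; exit 0; }

aws s3api put-bucket-lifecycle-configuration \
    --bucket servername-backups \
    --lifecycle-configuration '{
        "Rules": [{
            "ID": "expire-old-backups",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            "Expiration": {"Days": 30}
        }]
    }'
```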


Hope it helps!
Thanks to cinique for the original script format, modified for Amazon S3 use (and reduced a lot of the process to preserve disk space on the server doing the backup).

Re: Backup CWP to an Amazon S3 bucket
« Reply #1 on: November 20, 2021, 07:13:03 PM »
Hey, thank you for the great post. Can you help with which folder/server I should put the backups-s3.sh file in? Should the rest of the commands be executed on the server too?

Re: Backup CWP to an Amazon S3 bucket
« Reply #2 on: May 23, 2022, 06:54:13 PM »
Hello everyone..

Thanks for this S3 backup script.
But something weird is happening...
The backup apparently works fine, but it only does 2 sites and then it stops.
I can never get a full backup of all sites/users.

Could it be that the second site is too big (more than 5GB) and the tar.gz never finishes the job?

Thanks for any help
Fabian

Re: Backup CWP to an Amazon S3 bucket
« Reply #3 on: April 17, 2023, 02:27:35 AM »
Can you help with which folder/server I should put the backups-s3.sh file in?

Anywhere, as long as the cron task runs the script from the correct location for where it resides.
(For example, put the .sh script in root (/) and run it from there if that's easiest.)

Should the rest of the commands be executed on the server too?

Yes, add and run this script on the same server that you are backing up.

The backup apparently works fine, but it only does 2 sites and then it stops.
I can never get a full backup of all sites/users.

Could it be that the second site is too big (more than 5GB) and the tar.gz never finishes the job?
Not sure, did you run out of disk space? One of the common issues when running any backup is the size of the archive before it is uploaded and deleted. The above script splits files/database/vmail separately, which helps a bit, but to play it safe keep at least the size of your largest user account free on top of your current disk usage. Minimum.
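To check whether disk space is the culprit, compare free space on the staging partition against your biggest accounts. (The /tmp staging path is an assumption; adjust to wherever your script builds its archives.)

```shell
# Free space on the partition where archives are staged before upload
df -h /tmp

# The five largest home directories, to compare against the free space above
du -sh /home/* 2>/dev/null | sort -rh | head -n 5
```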

Sorry for the late responses; I now use AAPanel as I no longer host vmail for clients. I completely ditched the idea and went for a barebones control panel without all of the crazy CWP config pages and bugs.
I love CWP, it's just not yet developed and tested enough for long-term production use, albeit a wonderful solution  :)
« Last Edit: April 17, 2023, 02:31:37 AM by FreshLondon »

Re: Backup CWP to an Amazon S3 bucket
« Reply #4 on: April 18, 2023, 09:46:28 AM »
Thank you for the script. I want to use a Hetzner Storage Box instead of AWS. Can you please modify the rsync part and the delete-older-backups part for that use case?