Poll: Useful For You?

Yes: 2 (100%)
No: 0 (0%)
I'll stay with CWP provided scripts: 0 (0%)

Total Members Voted: 2

Voting closed: May 17, 2020, 11:57:14 AM

Author Topic: Custom Backup  (Read 40417 times)


Custom Backup
« on: March 15, 2020, 03:12:38 PM »
Run this as a nightly cron task, calling it for example custom-backup.sh
  • Ensure that the /backup/custom and /home/tmp_bak directories exist (change the paths to suit).
  • Adjust the retention value to suit.
  • Optional: create a /home/username/backup_exclude.conf to ignore cache, tmp, etc. List one entry per line (see the sketch below).
Caveats: currently untested for mail forwarders or subdomains.
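For reference, a minimal sketch of that setup; the script location, schedule, log path and exclude patterns below are only examples, so adjust them to your own server:
Code: [Select]
# one-off: create the working and destination directories used by the script
mkdir -p /backup/custom /home/tmp_bak

# nightly run via /etc/crontab (time, script path and log file are examples)
30 2 * * * root /root/custom-backup.sh >> /var/log/custom-backup.log 2>&1

# /home/username/backup_exclude.conf - one pattern per line, e.g.
tmp/
cache/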

This will create local backups only, which you can then FTP/rsync (whatever you prefer) to a remote location with a separate cron task running, say, an hour later.
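A sketch of what that second cron task might look like, assuming rsync over SSH with key authentication already set up; the remote host, user and destination path are placeholders:
Code: [Select]
# /etc/crontab - runs an hour after the backup job (example schedule)
30 3 * * * root rsync -a /backup/custom/ backupuser@backup.example.com:/srv/cwp-backups/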
Backups are stored in three directories: home_dir, mysql and vmail, so you can restore just a portion of a backup if need be. Just untar to a temporary location and grab the files that you need.
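For example, pulling a single database dump out of an archive could look like this (the archive name, username and database name are placeholders, and the default tmp_dir is assumed):
Code: [Select]
# unpack to a temporary location; tar strips the leading '/', so member paths
# mirror the staging layout and start with home/tmp_bak/
mkdir -p /root/restore_tmp
tar -xjf /backup/custom/username-202003150300.tar.bz2 -C /root/restore_tmp
# then restore just the piece you need, e.g. one database
mysql username_db < /root/restore_tmp/home/tmp_bak/username/mysql/username_db.sql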

Code: [Select]
#!/usr/bin/bash
# Staging area, final archive location and retention (days) - adjust to suit
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=2
# -------------------
# Loop over CWP accounts that have backups enabled
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do

echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
ionice -c 3 nice -n +19 rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
 do
     echo Dumping $databasename
     mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
               2> ${tmp_dir}${username}/mysql/errors.txt

 done
if [ -d /var/vmail/${domain} ]; then
 mkdir -p ${tmp_dir}${username}/vmail/
 echo Copying email
 ionice -c 3 nice -n +19 cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
if [ -f /home/${username}/backup_exclude.conf ]; then
 ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 --exclude-from=/home/${username}/backup_exclude.conf ${tmp_dir}${username}
else
 ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
fi
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}

done
echo Custom Backup Job Finished


Use at your own risk!
I currently use this to supplement the CWP (new) backup function.
« Last Edit: March 15, 2020, 03:26:03 PM by ejsolutions »

Re: Custom Backup
« Reply #1 on: May 23, 2020, 08:00:06 AM »
Thank you!! This script finally gives CWP the ability to back up databases. With both built-in options I couldn't do this at all!

Re: Custom Backup
« Reply #2 on: October 01, 2020, 08:26:10 AM »
Thanks for sharing this script.
What about moving the "if backup_exclude.conf" check earlier, to speed up the process and limit disk usage?
(The backup_exclude.conf entries have to be relative to the user's home directory, so to exclude the folder /home/myuser/public_html/var/cache I'd write public_html/var/cache/.)
Code: [Select]
#!/usr/bin/bash
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=2
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do

echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice -n +19 rsync -a --links --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir
else
ionice -c 3 nice -n +19 rsync -a --links /home/${username}/ ${tmp_dir}${username}/home_dir
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
 do
     echo Dumping $databasename
     mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
               2> ${tmp_dir}${username}/mysql/errors.txt

 done
if [ -d /var/vmail/${domain} ]; then
 mkdir -p ${tmp_dir}${username}/vmail/
 echo Copying email
 ionice -c 3 nice -n +19 cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}

done
echo Custom Backup Job Finished

Re: Custom Backup
« Reply #3 on: October 01, 2020, 09:02:25 AM »
@Teo
Good point. That's the advantage of sharing the script. :)
I've also made some recent tweaks but want to go a bit further with them.

Re: Custom Backup
« Reply #4 on: October 01, 2020, 10:55:00 AM »
Note:
Code: [Select]
rsync -a --links
AFAIK, -a (archive) implies -l (links). I'd be extremely careful when messing about with symlinks, though, and just be content with -a. Generally speaking, don't use the --links, -l or -L options.
 

Re: Custom Backup
« Reply #5 on: November 03, 2020, 02:47:19 PM »
Hi
When I try to run your backup script I get the following error:

custom-backup.sh: line 40: syntax error: unexpected end of file

Can you please help?

Re: Custom Backup
« Reply #6 on: November 03, 2020, 03:43:10 PM »
Quote
Hi
When I try to run your backup script I get the following error:

custom-backup.sh: line 40: syntax error: unexpected end of file

Can you please help?
Assuming that you mean the first script posted.
A 20 second Google and I think you may be a Windoze luser..
Quote
I think file.sh has CRLF line terminators.
Run
dos2unix file.sh
and the problem will be fixed.
You can install dos2unix in Ubuntu with this:
sudo apt-get install dos2unix
Substitute your own file name for file.sh (obviously?) and replace "sudo apt-get install" with yum (maybe less obvious).
Better still, create the file directly on your server using vi (perhaps nano is inserting CR/LF, which I doubt - I rarely use it).
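On a CentOS/CWP box the equivalent would be roughly:
Code: [Select]
# confirm the problem: affected files are reported "with CRLF line terminators"
file custom-backup.sh
# install dos2unix from the standard repo and convert the script in place
yum install -y dos2unix
dos2unix custom-backup.sh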

Re: Custom Backup
« Reply #7 on: November 04, 2020, 11:35:06 AM »
Quote
A 20 second Google and I think you may be a Windoze luser..

You're so right... Unfortunately I'm a windoze luser hahaha
I preferred to create the file with the vi editor on my server ;)

Thank you very much!

Re: Custom Backup
« Reply #8 on: November 04, 2020, 12:21:13 PM »
Quote
You're so right... Unfortunately I'm a windoze luser hahaha
I preferred to create the file with vi editor on my server ;)
Thank you very much!
All is not lost if you used vi  8)
You're welcome.

(dos2unix took me back 30+years !)
« Last Edit: November 04, 2020, 12:23:56 PM by cynique »

Re: Custom Backup
« Reply #9 on: November 16, 2020, 04:04:50 PM »
Quote
I currently use this to supplement the CWP (new) backup function.

Wonderful script, thank you!
Testing atm from the CLI; looks great so far. Could really use a limiter on the bzip process. Although it's only using one vCPU during compression, it does bounce up to 100%, which would be nice to avoid on a live server :)

Oh and just FYI chaps, if you're adding this then write it in vi/vim/nano from the CLI and not from the 'file manager'; it messes with the encoding.

Re: Custom Backup
« Reply #10 on: November 16, 2020, 04:10:51 PM »
Whilst running, keep an eye on "top -c -d5"
Code: [Select]
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
The combination of ionice & nice is supposed to limit the disc & CPU loading, respectively.

Try removing the '+' before 19.
« Last Edit: November 16, 2020, 04:13:29 PM by cynique »

Re: Custom Backup
« Reply #11 on: November 16, 2020, 06:03:05 PM »
Latest version.

Code: [Select]
#!/usr/bin/bash
# CWP Custom backup
# Version 2.1 ejsolutions/Teo/cynique
# Set the following 3 items to suit
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=1
# Optional though advisable
# Create a backup_exclude.conf file in each user home directory,
# listing directories to exclude, for example
# backupcwp/
# tmp/
# cache/
# -------------------
if [ ! -d ${tmp_dir} ]; then
 mkdir -p ${tmp_dir}
fi
if [ ! -d ${backup_dir} ]; then
 mkdir -p ${backup_dir}
fi
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do

echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice rsync -a --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir
else
ionice -c 3 nice rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
 do
     echo Dumping $databasename
     nice -n 5 mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
               2> ${tmp_dir}${username}/mysql/errors.txt && sync && \
nice gzip ${tmp_dir}${username}/mysql/"$databasename.sql"
 done
if [ -d /var/vmail/${domain} ]; then
 mkdir -p ${tmp_dir}${username}/vmail/
 echo Copying email
 ionice -c 3 nice cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
ionice -c 3 nice -n 15 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username} && \
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}

done
echo Custom Backup Job Finished


Notes:
The nice values have been chosen deliberately, so the script doesn't compete with clamd and so its steps run in the intended sequence.
As with all backups, this script should be run at quiet times on your server.
« Last Edit: November 16, 2020, 06:23:10 PM by cynique »

Re: Custom Backup
« Reply #12 on: November 17, 2020, 07:48:51 PM »
Quote
Latest version.

Thank you! Testing now :)

One thing that would be SUPER cool is putting each day's backups into its own folder, 20201118 for example.
I do understand that if a backup runs past midnight it would spill into a new folder, but the idea is still super cool and useful (I plan on keeping 14+ days of retention). Would it be possible to add folders?  :D

Or even go further and split the backup?
backups/date/username/mail.zip
backups/date/username/public_html.zip
backups/date/username/databases.zip
« Last Edit: November 17, 2020, 07:51:27 PM by FreshLondon »

Re: Custom Backup
« Reply #13 on: November 17, 2020, 08:48:04 PM »
Actually, I think I'll do a spot of reconfiguring, to more closely mimic a WHM/cPanel backup, so yes backups grouped by date. This will help with my multiple backups to remote locations - the key reason to have backups consolidated (larger block size).

I'll consider a parameter, to allow keeping the backups split.

Ideally backups should run between the hours of 02:00 and say, 06:30 (accounting for runtime), so date isn't an issue.
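A rough sketch of how the archive and cleanup lines of the current script could be adapted to date-grouped directories; this is untested and just an assumption of how it might look:
Code: [Select]
# store each run under a dated subdirectory, e.g. /backup/custom/20201118/
day_dir=${backup_dir}$(date +"%Y%m%d")/
mkdir -p ${day_dir}
mv ${tmp_dir}${username}.tar.bz2 ${day_dir}${username}-$(date +"%Y%m%d%H%M").tar.bz2
# retention then prunes whole day directories rather than individual archives
/usr/bin/find ${backup_dir} -mindepth 1 -maxdepth 1 -type d -mtime +${retention} -exec rm -rf {} + > /dev/null 2>&1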

Re: Custom Backup
« Reply #14 on: November 18, 2020, 12:25:05 AM »
If it's of interest to you, here's the result of your most recent snippet:

Works really well, and although it's very CPU heavy it doesn't clash with ClamAV/Varnish, as those are mostly just hogging RAM :)

Quote
Actually, I think I'll do a spot of reconfiguring, to more closely mimic a WHM/cPanel backup, so yes, backups grouped by date. This will help with my multiple backups to remote locations - the key reason to have backups consolidated (larger block size).
Amazing, really excited!

Quote
I'll consider a parameter, to allow keeping the backups split.
Yes, and if you segment the tasks, imagine transferring one segment (say mail.zip) to the backup location while the next task (say mysql.zip) is already being prepared... ooooo the possibilities!  ;D
« Last Edit: November 18, 2020, 12:34:10 AM by FreshLondon »