Control Web Panel
WebPanel => Backup => Topic started by: ejsolutions on March 15, 2020, 03:12:38 PM
-
Run this as a nightly cron task, calling it for example custom-backup.sh
- Ensure that you have /backup/custom and /home/tmp_bak directories (change to suit).
- Adjust the retention value to suit.
- Optional: create a /home/username/backup_exclude.conf to ignore cache, tmp etc. Just list each on a different line.
Caveats: currently untested for mail forwarders or subdomains.
This will create local backups only which you can then FTP/rsync (whatever) to a remote location, with a different cron task running, say an hour later.
Backups are stored in three directories: home_dir, mysql and vmail - this way you can restore only a portion of a backup if needs be. Just unzip/untar to a temporary location and grab the files that you need.
#!/usr/bin/bash
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=2
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do
echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
ionice -c 3 nice -n +19 rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
do
echo Dumping $databasename
mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
2> ${tmp_dir}${username}/mysql/errors.txt
done
if [ -d /var/vmail/${domain} ]; then
mkdir -p ${tmp_dir}${username}/vmail/
echo Copying email
ionice -c 3 nice -n +19 cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 --exclude-from=/home/${username}/backup_exclude.conf ${tmp_dir}${username}
else
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
fi
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}
done
echo Custom Backup Job Finished
Use at your own risk!
I currently use this to supplement the CWP (new) backup function.
-
Thank you!! This script finally gives CWP the ability to back up databases. With both built-in options I couldn't do this at all!
-
Thanks for sharing this script.
What about moving the “if backup_exclude.conf” check earlier, to speed up the process and limit the disk usage?
(the backup_exclude.conf lines have to be relative to user home, so if I need to exclude the folder /home/myuser/public_html/var/cache, I’ll write public_html/var/cache/)
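For the layout described above, a backup_exclude.conf could look like this (the cache path is from the post; the other entries are illustrative):

```
public_html/var/cache/
tmp/
logs/
```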
#!/usr/bin/bash
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=2
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do
echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice -n +19 rsync -a --links --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir
else
ionice -c 3 nice -n +19 rsync -a --links /home/${username}/ ${tmp_dir}${username}/home_dir
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
do
echo Dumping $databasename
mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
2> ${tmp_dir}${username}/mysql/errors.txt
done
if [ -d /var/vmail/${domain} ]; then
mkdir -p ${tmp_dir}${username}/vmail/
echo Copying email
ionice -c 3 nice -n +19 cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}
done
echo Custom Backup Job Finished
-
@Teo
Good point. The advantage of sharing the script. :)
I've also made some recent tweaks but want to go a bit further with them.
-
Note:
rsync -a --links
AFAIK, -a (archive) implies -l (links). I'd be extremely careful when messing about with symlinks, though, and just be content with -a. Generally speaking, don't use the --links, -l or -L options.
-
Hi
When I try to run your backup script I got the following error:
custom-backup.sh: line 40: syntax error: unexpected end of file
Can you please help?
-
Hi
When I try to run your backup script I got the following error:
custom-backup.sh: line 40: syntax error: unexpected end of file
Can you please help?
Assuming that you mean the first script posted.
A 20 second Google and I think you may be a Windoze luser..
I think file.sh has CRLF line terminators.
run
dos2unix file.sh
then the problem will be fixed.
You can install dos2unix in ubuntu with this:
sudo apt-get install dos2unix
Substitute your own file name for file.sh (obviously?) and, on CentOS, "sudo apt-get install" with "yum install" (maybe less obvious).
Better still, create the file directly on your server using vi (perhaps nano is inserting CRLF, which I doubt - I rarely use it).
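If installing dos2unix isn't an option, sed can strip the carriage returns too (GNU sed assumed for -i; the demo file path is made up):

```shell
# Simulate a script saved with Windows CRLF line endings
printf 'echo hello\r\n' > /tmp/crlf-demo.sh

# Drop the trailing CR from every line, in place
sed -i 's/\r$//' /tmp/crlf-demo.sh
```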
-
A 20 second Google and I think you may be a Windoze luser..
You're so right... Unfortunately I'm a windoze luser hahaha
I preferred to create the file with vi editor on my server ;)
Thank you very much!
-
You're so right... Unfortunately I'm a windoze luser hahaha
I preferred to create the file with vi editor on my server ;)
Thank you very much!
All is not lost if you used vi 8)
You're welcome.
(dos2unix took me back 30+years !)
-
I currently use this to supplement the CWP (new) backup function.
Wonderful script, thank you!
Testing atm from CLI, looks great so far. Could really use a limiter on the bzip process. Although it's only using one vCPU during the zip process, it does bounce up to 100%, which would be nice to avoid on a live server :)
Oh and just FYI chaps, if you're adding this then write it in VI/VIM/nano from CLI and not from the 'file manager', it messes with encoding.
-
Whilst running, keep an eye on "top -c -d5"
ionice -c 3 nice -n +19 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username}
The combination of ionice & nice is supposed to limit the disc & CPU loading, respectively.
Try removing the '+' before 19.
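A handy way to see what nice is doing: run with no command, it just prints the current niceness, so nesting it shows the effect (the tar line is commented out as it's only there to mirror the script):

```shell
nice                 # prints the shell's current niceness, usually 0
nice -n 19 nice      # child gets 19 added to a base of 0: prints 19

# As used in the script: idle I/O class plus lowest CPU priority
# ionice -c 3 nice -n 19 tar -cjf archive.tar.bz2 somedir
```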
-
Latest version.
#!/usr/bin/bash
# CWP Custom backup
# Version 2.1 ejsolutions/Teo/cynique
# Set the following 3 items to suit
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=1
# Optional though advisable
# Create a backup_exclude.conf file in each user home directory,
# listing directories to exclude, for example
# backupcwp/
# tmp/
# cache/
# -------------------
if [ ! -d ${tmp_dir} ]; then
mkdir -p ${tmp_dir}
fi
if [ ! -d ${backup_dir} ]; then
mkdir -p ${backup_dir}
fi
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do
echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice rsync -a --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir
else
ionice -c 3 nice rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
do
echo Dumping $databasename
nice -n 5 mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
2> ${tmp_dir}${username}/mysql/errors.txt && sync && \
nice gzip ${tmp_dir}${username}/mysql/"$databasename.sql"
done
if [ -d /var/vmail/${domain} ]; then
mkdir -p ${tmp_dir}${username}/vmail/
echo Copying email
ionice -c 3 nice cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
echo Consolidating files
ionice -c 3 nice -n 15 tar -cjf ${tmp_dir}${username}.tar.bz2 ${tmp_dir}${username} && \
mv ${tmp_dir}${username}.tar.bz2 ${backup_dir}${username}-$(date -d "today" +"%Y%m%d%H%M").tar.bz2
echo Cleaning up
/usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}
done
echo Custom Backup Job Finished
Notes:
The nice values have been chosen deliberately, so that they don't coincide with clamd and they help the stages of the script run in order.
As with all backups, this script should be run at quiet times on your server.
-
Latest version.
Thank you! Testing now :)
One thing that would be SUPER cool is putting each day's backups into its own folder, 20201118 for example.
I do understand that if a backup runs past midnight it would go into a new folder, but the idea is still super cool and useful (I plan on keeping 14+ days of retention). Would it be possible to add folders? :D
or even go further and split the backup?
backups/date/username/mail.zip
backups/date/username/public_html.zip
backups/date/username/databases.zip
-
Actually, I think I'll do a spot of reconfiguring, to more closely mimic a WHM/cPanel backup, so yes backups grouped by date. This will help with my multiple backups to remote locations - the key reason to have backups consolidated (larger block size).
I'll consider a parameter, to allow keeping the backups split.
Ideally backups should run between the hours of 02:00 and say, 06:30 (accounting for runtime), so date isn't an issue.
-
If its of interest to you, the result of your most recent snippet:
(https://i.imgur.com/5Y3uCTk.png)
Works really well, and although it's very CPU heavy this doesn't clash with ClamAV/Varnish, as those are mostly just hogging RAM :)
Actually, I think I'll do a spot of reconfiguring, to more closely mimic a WHM/cPanel backup, so yes backups grouped by date. This will help with my multiple backups to remote locations - the key reason to have backups consolidated (larger block size).
Amazing, really excited!
I'll consider a parameter, to allow keeping the backups split.
Yes, and if you segment the tasks imagine transferring one segment (say Mail.zip) to the backups location while the next task (say MySQL.zip) could be already preparing.. ooooo the possibilities! ;D
-
That's a small screenshot!
Edit: Phew, that's better. :p
30 minutes to do 5GB ain't too bad but that's with restricting I/O & CPU throughput. Remember to exclude superfluous directories (not folders, please) such as cache and the other listed ones - limit it to the necessary only. Some Wordpress (and others) plugins allow lusers to backup from within their applications. It's pointless doing a backup of a backup: not the same as taking copies of a backup remotely.
(You may have gathered this subject area was one of my corporate consultancy roles.)
-
Major revamp, with provision for extending to system files.
Use in a cron task as normal, or add a split parameter if you prefer to keep individual parts.
examples:
12 1 * * * /root/custom-backup.sh > /dev/null 2>&1
16 3 * * * /root/custom-backup.sh split > /dev/null 2>&1
#!/usr/bin/bash
# CWP Custom backup
# Version 3.1 ejsolutions/Teo/cynique
# Use split as a parameter to retain split backups
# Set the following 3 items to suit
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=1
# Optional though advisable
# Create a backup_exclude.conf file in each user home directory,
# listing directories to exclude, for example
# backupcwp/
# tmp/
# cache/
# -------------------
if [ ! -d ${tmp_dir} ]; then
mkdir -p ${tmp_dir}
fi
if [ ! -d ${backup_dir} ]; then
mkdir -p ${backup_dir}
fi
now_dir=${backup_dir}$(date -d "today" +"%Y%m%d%H%M")
echo "Timestamped location: " ${now_dir}
mkdir -p ${now_dir}/accounts
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do
echo Custom backup task starting for $username at $domain
mkdir -p ${tmp_dir}${username}/home_dir
echo Copying home directory
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice rsync -a --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir
else
ionice -c 3 nice rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
do
echo Dumping $databasename
nice -n 5 mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
2> ${tmp_dir}${username}/mysql/errors.txt && sync && \
nice gzip ${tmp_dir}${username}/mysql/"$databasename.sql"
done
if [ -d /var/vmail/${domain} ]; then
mkdir -p ${tmp_dir}${username}/vmail/
echo Copying email
ionice -c 3 nice cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
if [ "split" = "$1" ]; then
mkdir -p ${now_dir}/accounts/${username}/
for i in home_dir mysql vmail
do
echo "Compressing " $i
ionice -c 3 nice -n 15 tar -cjf ${now_dir}/accounts/${username}/$i.tar.bz2 ${tmp_dir}${username}/$i 2>/dev/null
done
else
echo Consolidating files
ionice -c 3 nice -n 15 tar -cjf ${now_dir}/accounts/${username}.tar.bz2 ${tmp_dir}${username}
fi
echo Cleaning up
# /usr/bin/find ${backup_dir} -name "*.bz2" -mtime +${retention} -delete > /dev/null 2>&1
/usr/bin/find ${backup_dir} -mtime +${retention} -delete > /dev/null 2>&1
rm -Rf ${tmp_dir}${username}
done
echo Custom Backup Job Finished
-
BTW, for large backup sets, zstd is probably the way to go, rather than bz2. It's fairly new on the scene however, so application support is limited. Pigz (as used by WHM/cPanel) might also help and is more established. This opens a real can of worms though, as can be seen if you search for a performance comparison of these. ;)
I like to stick to the basics where possible and well established methods. KISS philosophy, again.
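For reference, swapping tar's compressor is a one-flag change via -I / --use-compress-program (the pigz and zstd lines are commented out since neither ships by default on CWP; paths are throwaway examples):

```shell
# Small demo tree to archive
mkdir -p /tmp/cmpdemo && echo hello > /tmp/cmpdemo/file

# bzip2, as the script uses
tar -C /tmp -cjf /tmp/cmpdemo.tar.bz2 cmpdemo

# Parallel gzip or zstd, if installed:
# tar -C /tmp -I pigz -cf /tmp/cmpdemo.tar.gz cmpdemo
# tar -C /tmp -I 'zstd -T0' -cf /tmp/cmpdemo.tar.zst cmpdemo

tar -tjf /tmp/cmpdemo.tar.bz2   # list the archive contents
```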
-
That's a small screenshot!
Eek, yea without manual constraints it took my desktop resolution and ran with it ::)
30 minutes to do 5GB ain't too bad but that's with restricting I/O & CPU throughput.
Agreed, I'm just a WordPress developer (SCSS/PHP) hosting a bunch of client sites. If this scales up in the near future I would strongly consider upgrading the current EC2 rig; it's currently 2 vCPUs and 4GB RAM.
Remember to exclude superfluous directories (not folders, please) such as cache and the other listed ones - limit it to the necessary only.
I meant to ask you about this! Could we add those as a wildcard within the variables at the top of the script, with additional ones in the user's config file (that you've already set)? For example:
#exclude these folders in all user directories:
/backupcwp/
/tmp/
/cache/
/cwp_stats/
Would be great, especially for anyone with a whole bunch of user accounts :)
Some Wordpress (and others) plugins allow lusers to backup from within their applications. It's pointless doing a backup of a backup: not the same as taking copies of a backup remotely.
Already on it, strongly advised all users that we didn't get a 500GB external drive for nothing but some still ignore it.. ;D
(You may have gathered this subject area was one of my corporate consultancy roles.)
Could have guessed, yea ;)
-
BTW, for large backup sets, zstd is probably the way to go, rather than bz2. It's fairly new on the scene however, so application support is limited.
Looks really good, but yea maybe wait a while until its more established ;)
Pigz (as used by WHM/cPanel) might also help and is more established. This opens a real can of worms though, as can be seen if you search for a performance comparison of these. ;)
I like to stick to the basics where possible and well established methods. KISS philosophy, again.
Having seen the stats of Pigz it looks great, was there a reason you opted for bz2 aside from old habits? :)
-
Major revamp, with provision for extending to system files.
Super excited, nice one!
Saving this for later today, have the original CWP backups running atm. I have to test a solution before moving over to it permanently, live clients with real websites.. etc :)
-
Could we add those as a wildcard within the variables above this script..
simple script for general deployment:
#!/usr/bin/bash
# Copy the exclude list into every user home directory
for i in /home/*/
do
cp /root/backup_exclude.conf "$i"
done
..was there a reason you opted for bz2 aside from old habits?
Got it in a oner. ;)
Got a multitude of other things to do, rather than explore a particular area and my burnt-out brain can only cope with so much at once.
*** I'll reiterate, from the opening post. This custom backup is used to supplement, not replace the CWP provided backup routines. It doesn't (yet?) cater for IP allocation, subdomains, email forwarders, packages and other functions. It is intended as a "failsafe" backup that can be used to restore most functionality of a user website onto any control panel, not just CWP. Additionally, it correctly uses retention terminology, (currently) unlike the CWP one. ;) ***
-
cp /root/backup_exclude.conf /home/$i
This looks amazing, thank you! I will do this once I get my backup_exclude.conf correct :)
Speaking of which, here it is:
#listing directories to exclude from custom backup:
backupcwp/
tmp/
cache/
cwp_stats/
ai1wm-backups/
wpvividbackups/
The last two entries live in the public_html/wp-content/ folder. This is relevant for all /wp-content/ folders, not just in public_html (imagine a subdomain or addon domain, for example).
Can we wildcard these, or is it already done?
For example, does mentioning:
wpvividbackups/
also include:
public_html/wp-content/ai1wm-backups/
public_html/wp-content/domains/domainone.com/ai1wm-backups/
public_html/wp-content/domains/domaintwo.com/ai1wm-backups/
or do these have to be set individually?
Would be great to wildcard the folder name, let me know if it's possible!
*** I'll reiterate, from the opening post. This custom backup is used to supplement, not replace the CWP provided backup routines. It doesn't (yet?) cater for IP allocation, subdomains, email forwarders, packages and other functions. It is intended as a "failsafe" backup that can be used to restore most functionality of a user website onto any control panel, not just CWP. Additionally, it correctly uses retention terminology, (currently) unlike the CWP one. ;) ***
Agreed, and in time I will uncheck the options this backup covers but leave the CWP backup on for the options it doesn't (like the ones you listed). It is a VERY good start, though.
On another note, I hope you don't mind my suggestions. Just trying to give useful input/feedback, what's been achieved already is immense :D
-
IIRC it's an implicit pattern search, for exclusion, however I may be proven wrong - rsync is relatively new to me, in the grand scheme of things.
https://www.howtogeek.com/168009/how-to-exclude-files-from-rsync/
Which implies *backups/ could/should be added in your use case. Generally, backup exclusion/inclusion is an iterative process until a compromise is reached between protection and speed/space.
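A throwaway check of that pattern behaviour: an exclude with no leading slash matches at any depth, so *backups/ catches the nested ai1wm-backups directories (all paths here are made up for the demo):

```shell
# A fake user tree with a directory we want skipped
mkdir -p /tmp/exsrc/public_html/wp-content/ai1wm-backups
mkdir -p /tmp/exsrc/public_html/wp-content/themes

# One wildcard pattern, matched at every level of the tree
echo '*backups/' > /tmp/exclude.conf

rsync -a --exclude-from=/tmp/exclude.conf /tmp/exsrc/ /tmp/exdst/
test -d /tmp/exdst/public_html/wp-content/themes && echo "themes copied"
test ! -d /tmp/exdst/public_html/wp-content/ai1wm-backups && echo "backups skipped"
```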
I've been running with this for well over a year, so not really a "start". It saved my bacon once, when a VPS went down and I temporarily commissioned the (small spec.) backup VPS. I shared the script, to give others the benefit, especially as the CWP one has various (continuing/changing) issues and the developers haven't taken my advice onboard. It gets frustrating!
As you'll notice, the original script I used has somewhat morphed but it'll never have all the gizmos in it - there's commercial backup software for that (though even biggies from, for example R1Soft are bug-ridden/ill-conceived).
That being said, thanks for stirring me into the revamp - much easier to manage my multiple encrypted nextcloud remote backups now. :) 8)
Now, back to looking at a PHP e-commerce migration script, plus a newly created VPN. :'(
-
Which implies *backups/ could/should be added in your use case. Generally, backup exclusions/inclusions is an iterative process until a compromise is made between protection and speed/space.
I'll test and let you know, thanks! :)
That being said, thanks for stirring me into the revamp - much easier to manage my multiple encrypted nextcloud remote backups now. :) 8)
I know the feeling; this is like a theme I built a year ago. People come back and ask for edits I'd never imagined, pushing it to be better each time.
Now, back to looking at a PHP e-commerce migration script, plus a newly created VPN. :'(
Epic, now that is something I can do! All this Linux stuff is a little bit beyond me, but I'm learning. It's a steep road ;D
-
Lemme know how you get on with V3.1 - just curiosity.
-
Major revamp, with provision for extending to system files.
Use in a cron task as normal, or add a split parameter if you prefer to keep individual parts.
Just started a test, OMG ITS SPLITTING IT!!! ;D
One thing I've noticed though:
(https://i.imgur.com/pIJ7GzB.png)
Any way we can remove the HH:MM from the date string for efficiency? :)
-
Lemme know how you get on with V3.1 - just curiosity.
It split, and it worked really well. :) I don't know what I was expecting, maybe a bit of fear whenever running a new script, but it worked great!
I assumed (after seeing a folder named yyyymmddhhmm) that it would create loads of folders, but no, that's just the time the backup was initiated. There is just one folder with several /home/ accounts inside!
Things to note:
YES, the backup exclusions work no matter what directory (really impressed) ;D
inside the backup created folder:
202011181436/accounts/yogacaveinitiati/
was a file for the home directory.
Unzipping this created a series of folders (inside the above mentioned path):
home/tmp_bak/yogacaveinitiati/home_dir/
not an issue, reeeeally not an issue.. but if you wanted this to be super perfect, maybe just trim it down to /home_dir/ or something similar?
Epic job my man, epic job!
-
Side note, not important: a client started sending 30k emails while I was testing the backup.
QUICKLY KILL PROCESS, KILL PROCESS, KILL PROCESS.
Jeez, my pants need changing ;D
-
PMSL at the emails - spammer! ban, ban, ban >:(
HH:MM remains so that folks can run more than once a day, particularly during testing. Of course, doesn't affect directory sorts etc. ;)
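The two format strings side by side, for anyone weighing up the trade-off:

```shell
date +"%Y%m%d%H%M"   # e.g. 202011181436 - a new folder per run
date +"%Y%m%d"       # e.g. 20201118 - reruns the same day share one folder
```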
Yeah, during a test (and later needed) restore I spotted the subdir creation. As you said, ain't major but a bit unpolished. :-[
Maybe a v3.2 will address this - who knows?
(Side note: my replacement VPN setup went peachy.. time to firewall it.)
-
PMSL at the emails - spammer! ban, ban, ban >:(
Nah they were legit emails (sending to Amazon SES and then out to the world) hence why I rushed to cancel the backup process.. just in case!
HH:MM remains so that folks can run more than once a day, particularly during testing. Of course, doesn't affect directory sorts etc. ;)
Got it, yea me likey :)
Yeah, during a test (and later needed) restore I spotted the subdir creation. As you said, ain't major but a bit unpolished. :-[
Maybe a v3.2 will address this - who knows?
EPIC NEWS!
OK, so one more thing. I have moved the tmp folder to the storage drive. It's not 'ideal' but it helps reduce the load on the main drive when large accounts are zipped (main drive storage is much more expensive, and once I purchase allocated space I can't just 'downsize' it again, so I keep the main drive at about 80% capacity).
I'm a bit of an idiot and didn't separate the /home/ folder onto its own drive so that's always going to be at the back of my mind. Something for a rainy day!
I'd also mention that each account goes over its quota while the backup is running. I assume this is because (for example) 8/10GB used quickly becomes 12-16/10GB while the task is running.
On the downside, could this affect a user's ability to receive mail or publish content on their site?
Might be a small flaw, albeit only for a short time. Something to ponder :)
-
Got the archived directory structure better but the split version is causing weird problems, so shelved for now (or until inspiration happens).
Tried out xz for compression, and whilst it reduces backup size by a useful ~12% it also consumes more runtime CPU. Additionally, the CWP file manager doesn't recognise it as being able to be "descompressed" (yes, another spelling mistake). I'll steer clear of pigz/zstd because they aren't available by default on CWP.
Optimal I/O comes when home, tmp_bak and backup are all on different storage - only likely on larger systems. Certainly, keeping root, home and backup on different partitions allows for better filesystem choices (ext2 for backup) and allocation of quotas. With anything above, say, 10GB of storage, /home should never be in the same partition as root; that's a basic noob/Windoze-blinkered mistake.
During testing, with partitioned drives of course, I could see no temporary files being created during the compression process. However, this was with archives only up to about 120MB. I imagine some 'streaming' to disc does happen with large data sets, and changing directory to begin at another location might help, though that might negate trimming of the archive file structure. When a tar is created from the current directory, files can easily be stored with relative paths.
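The relative-path trick described above, sketched with throwaway paths - tar's -C changes directory before archiving, which is what trims /home/tmp_bak/... out of the stored paths:

```shell
mkdir -p /tmp/tardemo/home_dir && touch /tmp/tardemo/home_dir/file

# -C enters /tmp/tardemo first, so stored names begin at home_dir/
# rather than tmp/tardemo/home_dir/
tar -C /tmp/tardemo -cjf /tmp/tardemo.tar.bz2 home_dir

tar -tjf /tmp/tardemo.tar.bz2   # list what was actually stored
```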
(I don't trust CWP to run my main/larger client sites, with only one live e-commerce site on CWP. I have 3 Pro licences! I still put up with punitive WHM/cPanel costs for most and the unintuitive DirectAdmin for some sites.)
-
Got the archived directory structure better but the split version is causing weird problems, so shelved for now (or until inspiration happens).
What problems did you face, could you describe? I understand this is super heavy on CPU which is a bit distressing but..
the CWP file manager doesn't recognise it as being able to be "descompressed"
Gets me every time.. ;D
-
After much gnashing of teeth and far too much time spent on it, I present below the latest iteration. I know there's a smattering of probably superfluous code (sync/wait) but this is what I got to work.
That's it for now though as I must concentrate on other things, when it comes to coding etc.
As always, use entirely at your own risk and to supplement the inbuilt CWP backup.
#!/usr/bin/bash
# CWP Custom backup
# Version 4.0 ejsolutions/Teo/cynique
# Use split as a parameter to retain split backups
# Set the following 3 items to suit
tmp_dir=/home/tmp_bak/
backup_dir=/backup/custom/
retention=1
# Optional though advisable
# Create a backup_exclude.conf file in each user home directory,
# listing directories to exclude, for example
# backupcwp/
# tmp/
# cache/
# -------------------
if [ ! -d ${tmp_dir} ]; then
mkdir -p ${tmp_dir}
fi
if [ ! -d ${backup_dir} ]; then
mkdir -p ${backup_dir}
fi
now_dir=${backup_dir}$(date -d "today" +"%Y%m%d%H%M")
echo "Timestamped location: " ${now_dir}
# -------------------
mysql root_cwp -B -N -s -e "SELECT username,domain FROM user WHERE backup='on'" | while read -r username domain
do
echo Custom backup task starting for $username at $domain
mkdir -p ${now_dir}/accounts/
echo Copying home directory
mkdir -p ${tmp_dir}${username}/home_dir
sync
if [ -f /home/${username}/backup_exclude.conf ]; then
ionice -c 3 nice rsync -a --exclude-from=/home/${username}/backup_exclude.conf /home/${username}/ ${tmp_dir}${username}/home_dir \
&& sync
wait $!
else
ionice -c 3 nice rsync -a /home/${username}/ ${tmp_dir}${username}/home_dir && sync
wait $!
fi
echo Backing up databases
mkdir -p ${tmp_dir}${username}/mysql/
mysql --defaults-extra-file=/root/.my.cnf -e "show databases LIKE '${username}%';" | grep -v Database | while read databasename
do
echo Dumping $databasename
nice -n 5 mysqldump --defaults-extra-file=/root/.my.cnf "$databasename" > ${tmp_dir}${username}/mysql/"$databasename.sql" \
&& sync && \
nice gzip ${tmp_dir}${username}/mysql/"$databasename.sql"
done
if [ -d /var/vmail/${domain} ]; then
mkdir -p ${tmp_dir}${username}/vmail/
echo Copying email
ionice -c 3 nice cp -fR /var/vmail/${domain} ${tmp_dir}${username}/vmail/
fi
if [ "split" = "$1" ]; then
mkdir -p ${now_dir}/accounts/${username}
sync
for i in home_dir mysql vmail
do
echo "Compressing " $i
ionice -c 3 nice -n 15 tar -C ${tmp_dir}${username} -cjf ${now_dir}/accounts/${username}/$i.tar.bz2 $i \
&& sync
wait $!
# Remove # from line below if experiencing file/buffer cache problems
# sleep 1m
done
sync
else
echo Consolidating files
ionice -c 3 nice -n 15 tar -C ${tmp_dir} -cjf ${now_dir}/accounts/${username}.tar.bz2 ${username} && sync
wait $!
fi
sync
wait $!
echo Cleaning up ${tmp_dir}${username}
rm -Rf ${tmp_dir}${username}
echo /---------------------------------/
done
sync
/usr/bin/find ${backup_dir} -mtime +${retention} -delete > /dev/null 2>&1
echo Custom Backup Job Finished
-
Cynique, may I ask how you'd go about restoring a mailbox from the vmail files in the backup?
I've tried and it seems that these files are neither importable, nor can they be directly dragged into a vmail folder to recreate the mailbox.
Stuck a bit.. just testing on one mail account to see if this is a viable solution but not yet managing to restore from files and folders.
Thought it might help :)
-
..how you'd go about restoring a mailbox from the vmail files in the backup?
I've tried and it seems that these files are neither importable, nor can they be directly dragged into a vmail folder to recreate the mailbox.
IIRC, I created the email account first in admin, then just copied the relevant vmail user directory. IF I get some time/want a diversion, I'll do a test on my failover server.
-
Didn't manage it in the end but Igor from CWP support sorted it out.
It would be cool if there was a guide on how to do this somewhere, have mentioned it.. you never know! ;D
-
The way you did it helped me a lot
I could easily back up with this method
Thank you very much for your help ;)
-
...
Thank you very much for your help ;)
Thanks for your feedback; it encourages more development, when I find time.
Now off to build yet another CWP VPS.
-
Hello,
Anyone still using this script? I am getting strange errors with CWP backups and the support team has had no update for weeks. I want to use an alternative, as the CWP team can take ages to fix backup issues.
-
I want to use some alternative as CWP team can take ages to fix the issues in backup.
Try this post; I used these backups for a couple of years and restored from them on occasion - they worked well enough for me!
https://forum.centos-webpanel.com/index.php?topic=11268.msg38452#msg38452
-
Works Great...Thank You ;D