I had the chance to work as a systems administrator at an e-commerce company that leases/purchases dedicated Linux servers from a hosting provider.
The advantage of a dedicated server over shared hosting is that you have full control of the box. You can do whatever you want with it, run your own services, and of course you get full root access to the machine.
At this company we host 200+ websites, some of them on our own boxes running Red Hat Enterprise Linux. Part of administering these boxes is keeping full and incremental backups of our dynamic website files and databases. Scheduled shell scripts handle this on these machines, and luckily I was able to find a very useful backup script (linked further down).
The backup solution presented in that script basically meets what I needed.
What you need:
1. NcFTP client
The built-in ftp client in Linux doesn't seem up to the job. Download and install the NcFTP client from http://www.ncftp.com/
2. mysqldump
This binary is included once you install the mysql and mysql-server packages.
3. tar command
tar is what compresses and archives your files so they are smaller when you upload them to the destination machine.
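To see how the three tools fit together, here is a minimal sketch. The database name, FTP host, and credentials are hypothetical, and the mysqldump/ncftpput calls are commented out so the sketch runs without a live MySQL or FTP server; the demo paths under /tmp stand in for your real document root.

```shell
# Demo paths -- on a real box DIRS would be /var/www/html or your document root.
DIRS="/tmp/demo-www"
BACKUP="/tmp/demo-backup"
NOW=$(date +"%d-%m-%Y")

mkdir -p "$DIRS" "$BACKUP"
echo "hello" > "$DIRS/index.html"    # demo content

# Dump a database (uncomment on a real box; name and credentials are made up):
# mysqldump -u backupuser -p'secret' shopdb | gzip -9 > "$BACKUP/shopdb.$NOW.gz"

# Archive the web files plus the dumps into one compressed tarball:
tar -zcf "/tmp/backup.$NOW.tar.gz" "$DIRS" "$BACKUP" 2>/dev/null

# Push it to the FTP backup host with NcFTP's ncftpput
# (uncomment and fill in your own backup server):
# ncftpput -u ftpuser -p 'secret' backup.example.com /incoming "/tmp/backup.$NOW.tar.gz"
```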
What to do:
1. Define which directories you want to back up.
- For web files, the default is /var/www/html. It really depends on how you set up the document root in your httpd.conf.
These directories go into the DIRS variable.
2. Define a temporary folder for your backups (the BACKUP variable).
3. Define a timestamp.
4. Set the INCFILE and DAY variables.
5. Create a file for your shell script, then paste in the script found at http://www.cyberciti.biz/tips/how-to-backup-mysql-databases-web-server-files-to-a-ftp-server-automatically.html
Due to copyright, please get the script from the site itself and modify it for your own specifications and needs.
Copyright (c) 2005-2006 nixCraft, created by Vivek of cyberciti.biz
6. Set a schedule via cron; say you want the script to run daily at 12:01 AM:
1 0 * * * /home/flt/backup.sh > /home/flt/log/backup.log
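Putting steps 1 to 4 together, the top of backup.sh looks something like the sketch below. The variable names match the ones listed above (DIRS, BACKUP, INCFILE, DAY), but the paths are demo values under /tmp so the sketch is self-contained; on a real box you would point DIRS at /var/www/html and keep INCFILE somewhere persistent. The full/incremental switch on the day of the week follows the same idea as the original script.

```shell
# Skeleton patterned on the variables described in the steps above.
# Paths are demo values -- substitute your own before using this.
DIRS="/tmp/demo-www"                   # step 1: directories to back up
BACKUP="/tmp/demo-backup"              # step 2: temporary backup folder
NOW=$(date +"%d-%m-%Y")                # step 3: timestamp for file names
INCFILE="/tmp/tar-inc-backup.dat"      # step 4: marker file for incrementals
DAY=$(date +"%a")                      # step 4: day of week, e.g. Sun
FULLBACKUP="Sun"                       # full backup on this day, else incremental

mkdir -p "$DIRS" "$BACKUP"
echo "demo" > "$DIRS/page.html"        # demo content

if [ "$DAY" = "$FULLBACKUP" ] || [ ! -f "$INCFILE" ]; then
    # Full backup: archive everything and reset the incremental marker.
    # ISO date format so GNU tar's -N (--newer) can parse it later.
    date +"%Y-%m-%d" > "$INCFILE"
    tar -zcf "$BACKUP/full-$NOW.tar.gz" "$DIRS" 2>/dev/null
else
    # Incremental: only files newer than the date recorded in INCFILE.
    tar -zcf "$BACKUP/inc-$NOW.tar.gz" -N "$(cat "$INCFILE")" "$DIRS" 2>/dev/null
fi

ls "$BACKUP"
```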
If you want a generated script, the author also provides a link on his site to generate an FTP backup script.
If you want more details or clarifications, you can find the author's full post at the URL above.
Kudos to him; I have backup scripts patterned on his running on five of my Linux boxes.
Backing up files will consume a lot of your Internet bandwidth. If you don't have a dedicated line for this traffic, you can try to:
a. Set up QoS to limit FTP traffic, or
b. Set a download limit on your FTP (backup) server.
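For option (b), if your backup box happens to run vsftpd, a per-connection rate cap is a single directive in vsftpd.conf. The values below are just an illustration (vsftpd takes them in bytes per second):

```
# /etc/vsftpd/vsftpd.conf -- illustrative values, in bytes per second
# Cap each local (logged-in) user at roughly 1 MB/s:
local_max_rate=1048576
# Cap anonymous sessions at roughly 512 KB/s:
anon_max_rate=524288
```

Other FTP daemons have equivalent knobs; check your server's documentation for the exact directive.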
I am a fan of the site and posted this here since it's very useful if you want an alternative way of doing this job. Of course, there are lots of other ways to do it, including third-party programs with GUI or web interfaces for backing up files and databases on Linux.