Category Archives: Linux

Posts involving Linux.

How to Backup your Linux Server with Google Drive (Step 3 of 3) [Remote Backup Script & Grive Script]

This is the last post in this how-to.  Hopefully you have already read and completed the following posts:

Part 1 – Local backup script

Part 2 – Installing Grive

Now we create another script.  This script may differ depending on whether or not you decided to go with a separate backup server.  I am going to paste the entire script into this post; anything after a # is a comment.

Create this script in /etc/cron.hourly.  Make sure to remove the .sh extension and make the file executable.

#!/bin/bash
#Rsync multiple servers then Google Drive Sync
#Manipulate this script to match your server names, usernames, and associated passwords. 
#SSH will also need to trust each remote host, so make sure to test each of the sshpass lines below in an SSH session.
#Another option is simply to have previously SSH'd from this server to each of the remote servers.

#EXAMPLEWEBserver1 - [websites and databases]
sshpass -p "PASSWORD" rsync -avs --delete [email protected]:/var/backups/files /root/GoogleDrive/Backups/EXAMPLEWEBserver1

#EXAMPLEWEBserver2 - [websites and databases]
sshpass -p "PASSWORD" rsync -avs --delete [email protected]:/var/backups/files /root/GoogleDrive/Backups/EXAMPLEWEBserver2

#EXAMPLENAMEserver1 - [main bind files]
sshpass -p "PASSWORD" rsync -avs [email protected]:/etc/named.conf /root/GoogleDrive/Backups/EXAMPLENAMEserver1
sshpass -p "PASSWORD" rsync -avs --delete [email protected]:/var/backups /root/GoogleDrive/Backups/EXAMPLENAMEserver1

# Directory to backup
BACKUPDIR=/root/GoogleDrive/Backups

# Google Drive directory
GDRIVEDIR=/root/GoogleDrive

# Directory target in remote
TARGETDIR=/Backups

# =====================================================================
# It is unlikely you will need to edit anything below this line.
# Config END

# Create the backup dir if it does not exist
echo "Creating ${GDRIVEDIR}/${TARGETDIR} if needed"
if [ ! -d "${GDRIVEDIR}/${TARGETDIR}" ]; then mkdir -p "${GDRIVEDIR}/${TARGETDIR}"; fi

# Moving to the Grive dir (bail out if it is missing, so grive never runs in the wrong place)
echo "Entering ${GDRIVEDIR}"
cd "${GDRIVEDIR}" || exit 1

# Initial sync
echo Initial Google Drive Sync
grive

# Copying new content
# NOTE: with the paths defined above, BACKUPDIR already sits inside the
# Grive directory, so this copy is effectively a no-op; it only matters
# if you point BACKUPDIR somewhere outside ${GDRIVEDIR}.
echo "Copying from ${BACKUPDIR}/* to ${GDRIVEDIR}/${TARGETDIR}/"
cp -R ${BACKUPDIR}/* ${GDRIVEDIR}/${TARGETDIR}/

# Showing files copied
echo Files to sync
find ${GDRIVEDIR}/${TARGETDIR}/

# Final sync
echo Final Google Drive Sync
grive
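Once the script is written, installing it and checking that cron will pick it up looks something like this (grive-sync is a hypothetical file name; note there is no .sh extension):

mv grive-sync /etc/cron.hourly/grive-sync
chmod +x /etc/cron.hourly/grive-sync
# run-parts is what cron uses to execute these scripts;
# --test lists what it would run without actually running anything
run-parts --test /etc/cron.hourly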

As I mentioned in the comments of the above script, you will need to run each individual sshpass line once manually so that the backup server trusts each remote host's SSH key.
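If you would rather script that trust step, ssh-keyscan can pre-seed the backup server's known_hosts file.  A sketch, using the placeholder hostnames from the script above:

# append each remote host's public key to root's known_hosts
ssh-keyscan -H EXAMPLEWEBserver1.yourdomain.com >> /root/.ssh/known_hosts
ssh-keyscan -H EXAMPLEWEBserver2.yourdomain.com >> /root/.ssh/known_hosts
ssh-keyscan -H EXAMPLENAMEserver1.yourdomain.com >> /root/.ssh/known_hosts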


A simpler approach is just to initiate an SSH connection to each remote server from the backup server and accept the host key when prompted:

root@backup:~# ssh EXAMPLEWEBserver1.yourdomain.com
The authenticity of host 'EXAMPLEWEBserver1.yourdomain.com (127.0.0.1)' can't be established.
ECDSA key fingerprint is 03:3e:79:2c:3e:bb:ea:8a:fa:39:30:86:1a:d1:9f:24.
Are you sure you want to continue connecting (yes/no)?
root@backup:~#

Each of those rsync lines pulls files down from a remote server onto a local drive so they can be synced with Google Drive.  If a line includes --delete, rsync will also remove any local files that have since been deleted on the remote side.  This keeps files past the five-day retention period automatically cleaned up; otherwise you would end up syncing every file your server ever puts in its backup directory.

If this script is running properly, that is all you need to do.  Backups should be showing up in your Google Drive regularly!

**With the current setup, files will not be automatically cleaned up from the trash in Google Drive.  I believe this can be fixed, so I am currently working on that.**


How to Backup your Linux Server with Google Drive (Step 2 of 3) [Installing Grive]

Welcome back!  Hopefully you have already read my previous post.  Now that you have local backups running automatically on your Linux server, we need to get those files synced to Google Drive.  I am going to explain the step where I pull all my backup files to a dedicated backup server.  If this is unnecessary for your needs, feel free to skip ahead to installing Grive on your Linux server.

On my dedicated backup VPS I have a script that runs hourly to pull down backups and sync them with Google Drive.  It is nice to have this server separated from the other web servers in my environment because I want to follow best practice.  Right now I only manage a Linux server for a friend of mine, but I would like to be in a position to support others without additional configuration.  When you install Grive you will understand why I did it this way: if you install Grive on a per-server basis, each server will have a copy of all backups.  With one dedicated server, you limit the number of copies.  This improves security and is an overall cleaner setup.

So let's install Grive:

My backup server runs 64-bit Ubuntu 12.04.  At this point, 64-bit may be the only supported architecture.

If you are using Ubuntu 12.04 you will need to run this command first (this step can likely be skipped on newer versions of Ubuntu Server):

sudo apt-get update && sudo apt-get install python-software-properties

Once that is installed you will be able to add this repository and install Grive:

sudo add-apt-repository ppa:nilarimogard/webupd8
sudo apt-get update
sudo apt-get install grive

So now you have Grive installed.  We need to create a directory for Grive to work in.

mkdir ~/google_drive
cd ~/google_drive
grive -a

The command you just ran will eventually provide a URL for authenticating with your Google account.  Copy that URL into your browser, then allow Grive to access your account.

Grive should begin to sync the files in your Google Drive account.  Now, you understand why you may not want this on your web server.

So now you have Grive installed, authenticated, and syncing with your Google Drive account.  The last step is configuring another script to automate syncs and pull backups down from your various servers.

Please check that out in my next post.


How to Backup your Linux Server with Google Drive (Step 1 of 3) [Local Backup Script]

I have a couple of Ubuntu and CentOS VPSes and I needed a solution for backing them up to a remote location.  I wanted this backup location to be something I could trust, but also inexpensive.  My main concern was keeping the backups separated from my VPS provider (not that they have been anything less than adequate).  There is a saying, and I believe it goes "Don't put all of your eggs in one basket."  I believe I first heard that saying in the context of investing, and I try to live by it whenever feasible.  So I really had two choices: back up to my home server (and eat up my bandwidth at times) or attempt to back up to Google Drive.  I went with the latter.

Google Drive is an optimal choice because it is included with my Business Apps account.  Even a free Gmail account includes 15 GB of Drive space.  The problem is that Drive has no official Linux client.  However, there is an open-source solution.  There always is; that's why you have a Linux server, right?

So, how did I accomplish this?

Well, I went about this using some pretty simple steps.

  1. I have multiple servers with multiple sites and databases on each server.  I use a simple bash script, run every night by a cron job, to keep a local copy of both my databases and my web content.  These files are automatically removed once they are more than 5 days old.  They are also compressed into a tar.gz file, which cuts down on storage size as well as transfer time.
  2. Because I manage multiple servers and I don’t want all of them to have full access to my Google Drive data, I have a VPS dedicated to backups.  This server has the open source version of Google Drive installed as well as some other scripts targeted at syncing backup directories.  These data syncs happens via SSH and RSync.  The other responsibility of this server is to trust the other servers.  This way none of the servers I am backing up have access to any of the data but their own.
  3. Then the data is synced hourly with Google Drive.  Now, it may seem strange to run this sync hourly, but I will tell you why.  Google Drive for Linux is not officially supported and is a bit buggy.  I have seen it fail a few times, and it constantly reports an error even though it works fine.  I have found that running the script again after a failure eventually fixes it, usually on the first retry (see the retry sketch after this list).  Plus, if there is no new data to sync, the script finishes in seconds.  The error I am seeing could be a configuration issue on my end, but I don't care all that much because it has not failed to back up a single night.
  4. I have had this in place for over a month now, and every time I check, my Google Drive is in sync with my backup server.  It is a thing of beauty.  Not only are my backups stored in the cloud, I also have the option to sync those backups locally with my machine.  I especially like this because what if I screw up and forget to pay my Google Apps bill?  At least I will have the latest backup stored on my laptop (along with everything else on Google Drive).  Thankfully I am set up with autopay, so I don't think that will be an issue.  Another great feature is that I can share backup directories.  Now that these files are in my Google Drive, I just right-click and share a directory with the customer the data belongs to.  In my case I just run a server for a very close friend of mine, but we both always have access to local or cloud backups at any time.
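Since grive occasionally fails and a re-run usually fixes it, a small retry wrapper can paper over the flakiness.  This is only a sketch of the idea (the directory and retry count are assumptions), not part of my actual script:

#!/bin/bash
# Hypothetical retry wrapper: re-run grive up to 3 times if it exits non-zero.
cd /root/GoogleDrive || exit 1
for attempt in 1 2 3; do
    grive && break
    echo "grive failed (attempt ${attempt}), retrying..." >&2
    sleep 30
done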

Having backups of your data is extremely important.  You never know what may happen to your server.  This is peace of mind that you just can't afford to live without.  I feel confident that I could spin up another VPS and have my sites back up and running before DNS would propagate (my TTL is 1 hour).  Google also keeps deleted files in the trash for 25 days.  This means that even if I delete a line of code and don't realize the bad effect it has on my site, I can still find and restore what I removed up to a month back.

Okay, so let's get started:

First you need to have a file or directory of files to back up.  So let's create your local backup files.  I will paste my backup script below and explain it line by line.  At some point I borrowed this script; if I can find the original creator, I will link to their site.

#!/bin/sh

THESITE1="website_name"
THEDB1="website_db_name"

THEDBUSER="put_your_db_username_here"
THEDBPW="**************"
THEDATE=$(date +%d%m%y%H%M)

mysqldump -u $THEDBUSER -p${THEDBPW} $THEDB1 | gzip > /var/backups/files/dbbackup_${THEDB1}_${THEDATE}.bak.gz

tar cf /var/backups/files/sitebackup_${THESITE1}_${THEDATE}.tar -C / var/www/$THESITE1
gzip /var/backups/files/sitebackup_${THESITE1}_${THEDATE}.tar

find /var/backups/files/site* -mtime +5 -exec rm {} \;
find /var/backups/files/db* -mtime +5 -exec rm {} \;

Line 1 is the shebang, so that's nothing to worry about.  It just tells the system which shell should run the script.

Line 3 just sets the website name, which is used below to name the backup files.  The same goes for line 4, but for the database.

Lines 6 and 7 set the MySQL database username and password.  Line 8 you can leave alone; it just defines the date format used in the file names.

Line 10 is the command that actually dumps the database.  Here you will want to make sure the target directory (/var/backups/files) exists and is where you want everything to end up.

Lines 12 and 13 back up and compress the site data.  You will also want to make sure the website location and backup location are correct.

Lastly, lines 15 and 16 delete backup files that are more than 5 days old.  You can certainly alter that number to keep more or fewer, depending on your needs.


This script can handle multiple databases and websites.  You just need to create a separate set of variables for each site, database, user, and password.  Then duplicate the commands below those variables, swapping in the new variable names, so each site has its own set of instructions.
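For example, adding a second site might look like this (a sketch; the variable values are placeholders you would fill in):

THESITE2="second_site_name"
THEDB2="second_site_db_name"
THEDBUSER2="second_db_username"
THEDBPW2="**************"

mysqldump -u $THEDBUSER2 -p${THEDBPW2} $THEDB2 | gzip > /var/backups/files/dbbackup_${THEDB2}_${THEDATE}.bak.gz

tar cf /var/backups/files/sitebackup_${THESITE2}_${THEDATE}.tar -C / var/www/$THESITE2
gzip /var/backups/files/sitebackup_${THESITE2}_${THEDATE}.tar

The cleanup lines at the end already match any site* and db* files, so they do not need to be duplicated.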

Now you need to test this script.  Save it as backup.sh and make sure to make it executable.

Run this script in SSH and test to see if everything was created properly.  If you have any errors along the way feel free to post a comment below.  I will try to answer as soon as I can.
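Something like this, assuming you saved it as backup.sh in your current directory:

chmod +x backup.sh
sudo ./backup.sh
# confirm the database dump and the site archive were created
ls -lh /var/backups/files/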

If this script ran and you now see your backup files where they are supposed to be, everything worked!  A big tip: you should actually attempt to restore from a backup eventually.  You never want to assume everything is fine.  I try to pull down a backup at least once a month and test to make sure it's not corrupt.
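Short of a full restore, a quick integrity check looks something like this (the file names are just examples following the naming pattern above):

# verify that the gzip streams are intact
gzip -t /var/backups/files/*.gz

# list the contents of a site archive without extracting it
tar tzf /var/backups/files/sitebackup_website_name_0101240101.tar.gz

# load a database dump into a scratch database to prove it restores
gunzip -c /var/backups/files/dbbackup_website_db_name_0101240101.bak.gz | mysql -u put_your_db_username_here -p scratch_db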

Now we want to set up a cron job.  I would recommend just putting this newly created file in your daily cron directory (/etc/cron.daily/).  Note that run-parts, which executes the scripts in that directory, skips file names containing a dot, so remove the .sh extension and make sure the file is executable.  Let this run for a few days, and keep up with it.  Maybe even try to restore a database and a few files.  Once you are sure it is working as you want, you will be ready for the next step.
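Installing it looks something like this (note the installed name has no dot in it):

sudo cp backup.sh /etc/cron.daily/backup
sudo chmod +x /etc/cron.daily/backup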

As for me, I need to get some rest, so I am going to write that bit in Sunday’s blog post.

Thanks for reading!