Backups.. (Again!)

For a little while now, I’ve been working on a new backup script, as switching from Windows to a full Linux environment rendered my last script unusable.

So, here is the full script. It works well, but I’m still making tweaks to it.

One issue I have is that certain users don’t exist on certain machines, so my development branch doesn’t quite work yet. I’ve managed to loop through one array containing user names and another containing IP addresses, but I’ve not yet worked out a way to filter those arrays to remove the IPs and user names that don’t exist on the target. (A rough sketch of one possible approach is shown just below.)
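For what it’s worth, here is the sort of thing I’m considering (a sketch only, not part of the script yet; the hosts and users below are placeholders, and it assumes passwordless SSH is already set up):

#!/bin/bash
#Sketch: build a list of host/user pairs that actually exist before backing up.
#The addresses and names below are placeholders, not my real setup.
HOSTS=("192.168.0.2" "192.168.0.3")
USERS=("william" "michael")

VALID_PAIRS=()
for host in "${HOSTS[@]}"; do
    #Skip hosts that don't answer a ping
    ping -c1 -q "$host" >/dev/null 2>&1 || continue
    for user in "${USERS[@]}"; do
        #Keep the pair only if the user's home directory exists on the target.
        #BatchMode stops ssh hanging on a password prompt if the key isn't accepted.
        if ssh -o BatchMode=yes "$user@$host" "test -d /home/$user" >/dev/null 2>&1; then
            VALID_PAIRS+=("$user@$host")
        fi
    done
done

for pair in "${VALID_PAIRS[@]}"; do
    echo "Would back up: $pair"
done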

But basically, this script connects to the machines using rsync over SSH, copies the files to the backup location, and optionally uploads them to Amazon S3.

So, without further ado, here is a wall of code..

#!/bin/bash

#Main script file for backups.

#See Bitbucket Repo for further information https://bitbucket.org/thompsonmichael/backup-sys

#Michael Thompson 2018

# mikethompson@gmx.co.uk (GPG Key-ID: 062C03D9)

#Version 0.0.1

#VARIABLES

BCK_DEST=/mnt/Logical_Data

EXCLUDE_FILE=/home/michael/Script/rsync_exclude

S3_BUCKET=s3://

RSYNC_CMD_STD="azh --progress"

RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"

S3_CMD="-rHv --skip-existing --acl-private --continue-put --storage-class=STANDARD_IA --no-delete-removed --exclude-from=s3_exclude"

S3_EXTRA=$2

LOG_FILE="/home/michael/Script/log_file.log"

REM_HOST="192.168.0.2"

BLUE="\e[1;34m"

RED="\e[1;31m"

NORMAL_COL="\e[0m"

if ! [ -z "$2" ];

then

if ! [ "$2" = "-clean" ];

then

echo "Running Custom S3 Command"

S3_CMD=$S3_CMD" "$S3_EXTRA

fi

fi

echo Backing up systems

echo ______________

echo -e ${BLUE} S3 Bucket Configured: $RED $S3_BUCKET${NORMAL_COL}

echo -e ${BLUE}S3 Command is: $RED $S3_CMD${NORMAL_COL}

echo -e ${BLUE}Exclude File Path: $RED $EXCLUDE_FILE${NORMAL_COL}

echo -e ${BLUE}Running on: $RED $HOSTNAME${NORMAL_COL}

echo -e ${BLUE}Destination is: $RED $BCK_DEST${NORMAL_COL}

if [ -z "$1" ];

then

echo -e ${BLUE}Command line passed: Empty ${NORMAL_COL}

else

echo -e ${BLUE}Command line passed: $1 ${NORMAL_COL}

fi

echo

echo -----------------------------------------------------------------------

echo

#error function. pass as func error,code,message

function_error () {

echo -e ${RED}"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

echo -e "CRITICAL ERROR!"

echo -e "Error occured in: " $1 "Error returned was: " $2

if [ -z "$3" ];

then

echo -e "Unknown Error, cannot advise. Check FAQ"

fi

echo -e "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"${NORMAL_COL}

#Signal failure to the caller; the callers decide whether to stop the script.

return 1

}

#ENSURE DRIVE IS MOUNTED

if mountpoint -q $BCK_DEST

then

echo -e ${BLUE}"Backup Location is mounted " $BCK_DEST ${NORMAL_COL}

else

function_error "Backup Location not mounted" $? "Mount location and restart"

exit

fi

if [ -z "$1" ];

then

#--------------------------------------------------------------------------------------

#Copy To Local Storage


echo -e ${BLUE}Backing Up Dads${NORMAL_COL}

wget -q --tries=10 --timeout=20 --spider http://google.com

WGET_RET=$?

if [[ ${WGET_RET} -eq 0 ]]; then

echo -e "Internet Connected"

else

function_error "Internet Connection Down" "INT_DOWN ${WGET_RET}" "Check Internet Connection"

exit

fi

ping -c1 -q ${REM_HOST} >/dev/null 2>&1

RET=$?

if [ ${RET} -eq 0 ]; then

echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}

echo -e ${BLUE}Ping Command Returned: ${RET} ${NORMAL_COL}

echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( ${REM_HOST} )"${NORMAL_COL}

rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads

RSYNC_RET=$?

echo -e ${BLUE}"Command Returned (RSync):" ${RSYNC_RET} ${NORMAL_COL}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER

RSYNC_RET=$?

echo -e ${BLUE}"Command Returned (RSync):" ${RSYNC_RET} ${NORMAL_COL}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

else

echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}

echo -e ${RED}"Ping Command Returned (Ping):" $? ${NORMAL_COL}

echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}

fi

echo -e ${BLUE}Backing up $HOSTNAME${NORMAL_COL}

echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( $HOSTNAME )"${NORMAL_COL}

rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian

RSYNC_RET=$?

echo -e ${BLUE}"Command Returned (RSync):" ${RSYNC_RET}${NORMAL_COL}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

fi

if [ "$1" = "-clean" ];

then

ping -c1 -q ${REM_HOST} >/dev/null 2>&1

RET=$?

if [ ${RET} -eq 0 ]; then

echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}

echo -e ${BLUE}"Ping Command Returned (Ping):" $? ${NORMAL_COL}

echo -e ${BLUE}"Host is Alive" ${NORMAL_COL}

echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( ${REM_HOST} )" ${NORMAL_COL}

rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads

RSYNC_RET=$?

echo -e ${BLUE}"Command Returned (RSync):" ${RSYNC_RET} ${NORMAL_COL}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER

RSYNC_RET=$?

echo "Command Returned (RSync):" ${RSYNC_RET}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

else

echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}

echo -e ${RED}"Ping Command Returned:" $? ${NORMAL_COL}

echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}

fi

echo -e ${BLUE}Backing up $HOSTNAME ${NORMAL_COL}

rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian

RSYNC_RET=$?

echo -e ${BLUE}"Command Returned (RSync):" ${RSYNC_RET} ${NORMAL_COL}

if ! [ "${RSYNC_RET}" = "0" ];

then

function_error "RSYNC" ${RSYNC_RET} "Check RSYNC Command Line, and RSYNC Dirs"

fi

#-----------------------------------------------------------------------------------------------------------------------------------

# -s3clean has been added as a command line option; it must be passed as the second argument alongside -clean

# and will cause an S3 clean event to be processed.

# -clean on its own performs only a standard archive clean. S3 is not routinely cleaned unless -s3clean is explicitly passed.

if [ "$2" = "-s3clean" ];

then

#Call Clean_S3

source s3_cmd.sc

fi

#-----------------------------------------------------------------------------------------------------------------------------------

fi

if [ "$1" = "-s3" ];

then

echo S3 destination is: $S3_BUCKET

echo Amazon Upload Proceeding...

echo Uploading $BCK_DEST/Backup/Dads/william/Pictures/

s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Pictures/ $S3_BUCKET/Dads/Pictures/

S3_RET=$?

echo "Command Returned (S3CMD):" ${S3_RET}

if ! [ "${S3_RET}" = "0" ];

then

function_error "S3CMD" ${S3_RET} "Check S3CMD Command Line, and Dirs"

fi

echo Uploading $BCK_DEST/Backup/Dads/william/Documents/

s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Documents/ $S3_BUCKET/Dads/Documents/

S3_RET=$?

echo "Command Returned (S3CMD):" ${S3_RET}

if ! [ "${S3_RET}" = "0" ];

then

function_error "S3CMD" ${S3_RET} "Check S3CMD Command Line, and Dirs"

fi

echo Uploading $BCK_DEST/Backup/Dads/william/Videos/

s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Videos/ $S3_BUCKET/Dads/Videos/

S3_RET=$?

echo "Command Returned (S3CMD):" ${S3_RET}

if ! [ "${S3_RET}" = "0" ];

then

function_error "S3CMD" ${S3_RET} "Check S3CMD Command Line, and Dirs"

fi

echo Uploading $BCK_DEST/Backup/Michael-Debian/

s3cmd sync $S3_CMD $BCK_DEST/Backup/Michael-Debian/ $S3_BUCKET/Michael-Debian/

S3_RET=$?

echo "Command Returned (S3CMD):" ${S3_RET}

if ! [ "${S3_RET}" = "0" ];

then

function_error "S3CMD" ${S3_RET} "Check S3CMD Command Line, and Dirs"

fi

exit

fi

#EOF

Backup, It’s Important!

Recently I switched my last computer over to Linux, and this meant I had to rethink my backup strategy. I like to copy all important files over to a central area, maybe even to a remote NAS box. On Windows this was done via a simple batch file that each user ran whenever they felt like it, copying their files to a shared drive on the network. On Linux, I can utilise the power of rsync and do it all from a local terminal. So I cooked up the following bash script.
This has a few dependencies; it requires the following:

  1. s3cmd installed and configured to the S3 account
  2. rsync installed on all machines. (Not installed as standard on Debian systems.)
  3. An available drive or NAS box attached.
  4. If you want passwordless connections to the other boxes, you will need to set up key-based SSH, which I’ll not cover in detail here (there are countless tutorials on it), but the basic commands are sketched just below this list.
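For reference, and only as a sketch since I’m not covering it properly, the key setup usually boils down to the two commands below; the user and host are placeholders.

#Generate a key pair on the machine that runs the backups (accept the defaults)
ssh-keygen -t ed25519
#Copy the public key to each user/host the script connects to (placeholder names)
ssh-copy-id william@192.168.0.2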

It can be run from a simple cron job (an example entry is shown after the list below), or manually when required. The following command line options can be used:

  1. -clean
    1. The -clean option will cause rsync to delete from the archive any files that are no longer on the source. It will also remove any files that were previously copied but are now listed in the exclusion file.
  2. -s3
    1. Will cause an upload event and synchronise the Amazon S3 archive with the local copy. This will not, however, trigger a local backup.
  3. Empty command line
    1. Will simply back up files from the sources to the target machine; this will not clean the archives, nor will it trigger an S3 upload.
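As a rough example of the cron side (the schedule, script name and paths here are placeholders, adjust to suit), the crontab entries might look something like this:

#Nightly standard backup at 2am (script path is a placeholder)
0 2 * * * /home/michael/Script/backup.sh >> /home/michael/Script/cron.log 2>&1
#Weekly cleaning pass on Sunday at 3am
0 3 * * 0 /home/michael/Script/backup.sh -clean >> /home/michael/Script/cron.log 2>&1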

So, let’s break down the source and have a look.

if mountpoint -q $BCK_DEST
then
echo "Backup Location is mounted"

if [ -z "$1" ];
then
echo rsync command is: -$RSYNC_CMD_STD
#--------------------------------------------------------------------------------------
#Copy To Local Storage
echo Backing Up
rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2

#--------------------------------------------------------------------------------------
fi

This first section is the overall backup, responsible for copying the source to the target. We initially connect rsync to the target via SSH and copy the entire home folder, minus the excluded files and directories. It’s only run if the command line is empty (the default run, if you like). The variable $1 is the first command line option passed to the script. One of the important things we do here is ensure that the drive is actually mounted in the system; if it’s not mounted, everything else will fail. (An example of what the exclude file might look like is shown below.)
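The exclude file itself is just a plain list of rsync patterns, one per line. Mine isn’t reproduced here, but as an illustration (these entries are examples, not my actual list), a typical rsync_exclude might contain:

#Example rsync exclude patterns (illustrative only)
.cache/
.local/share/Trash/
Downloads/
*.tmp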

<br>if [ "$1" = "-clean" ];<br>then<br>echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN<br>echo Backing Up<br>rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1<br>echo Backing up $HOSTNAME<br>rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2<br>fi<br>

This section is the clean-up. It runs the standard backup along with the rsync options "--delete-after --delete-excluded", which clean up the archives.

<br>if [ "$1" = "-s3" ];<br>then<br>echo S3 destination is: $S3_BUCKET<br>echo Amazon Upload Proceding<br>s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1<br>exit<br>fi</p><p>

This section is the Amazon upload. It’s not quite complete, and there are more directories to add to the upload. However, uploading to Amazon is not the quickest thing in the world, so I’ll add the other important directories as and when the existing uploads complete. (A rough sketch of how the repeated s3cmd calls could eventually be collapsed into a loop follows.)
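As the list grows, one option (a sketch only, not something that’s in the script yet) would be to loop over an array of directory names rather than repeating the s3cmd line:

#Sketch only: loop over a list of subdirectories instead of repeating s3cmd sync.
#The names in the array are examples, not the final list.
S3_DIRS=("1" "2")

for dir in "${S3_DIRS[@]}"; do
    echo "Uploading $BCK_DEST/$dir/"
    s3cmd sync $S3_CMD "$BCK_DEST/$dir/" "$S3_BUCKET/$dir/"
done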

And here it is, in its entire Bash-like glory.

#!/bin/bash

#VARIABLES
BCK_DEST=/mnt/sdc1
EXCLUDE_FILE=rsync_exclude
S3_BUCKET="S3 Bucket"
RSYNC_CMD_STD=avzh
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
S3_CMD="-rH --skip-existing --delete-removed --acl-private"

echo backing up systems
echo ______________
echo Exclude File Path: $EXCLUDE_FILE
echo running on: $HOSTNAME
echo destination is: $BCK_DEST
echo Command line passed: $1

#ENSURE DRIVE IS MOUNTED
if mountpoint -q $BCK_DEST
then
echo "Backup Location is mounted"

if [ -z "$1" ];
then
echo rsync command is: -$RSYNC_CMD_STD
#--------------------------------------------------------------------------------------
#Copy To Local Storage
echo Backing Up
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2

#--------------------------------------------------------------------------------------
fi

if [ "$1" = "-clean" ];
then
echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN
echo Backing Up 1
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
fi

if [ "$1" = "-s3" ];
then
echo S3 destination is: $S3_BUCKET
echo Amazon Upload Proceeding
s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1
exit
fi

else
echo "Backup Location is not mounted"
exit
fi

Hope you find this useful, and if you have any ideas how to improve it, let me know.