Update to the Backup

A quick update to the backup script. Instead of uploading a set list of folders from the backup directories to S3, the script now uploads everything in the backup location that isn't specifically excluded by the exclude file. This cuts a good few lines out of the script.

A quick note on the exclude file: you must exclude both the directory and its contents, as S3 has no concept of folders.

So, to exclude the folder /back/folder and its contents, your exclude file must contain /back/folder and /back/folder/*
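As a sketch, an s3_exclude file covering that case would contain both patterns, one per line:

```
/back/folder
/back/folder/*
```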


	if [ "$1" = "-s3" ];
		then
			#This code is still subject to testing. It shouldn't require a re-upload of data.
			#However, this uploads the WHOLE backup to the cloud. It does not filter out folders at this time.
			#Use the S3 exclude file for this purpose.
			shopt -s dotglob
			shopt -s nullglob
			array=("$BCK_DEST"/Backup/*)
			for dir in "${array[@]}"
				do
					FOL=$(basename "$dir")
					echo "$FOL"
					echo Uploading "$dir"
					s3cmd sync $S3_CMD "$dir" "$S3_BUCKET/$FOL/"
					RET=$?
					echo .......
				if ! [ "$RET" = "0" ];
					then
					function_error "S3CMD" "$RET" "Check S3CMD Command Line, and Dirs"
				fi
				done

	fi

Backups.. (Again!)

For a little while now, I’ve been working on a new backup script, as switching from Windows to a full Linux environment rendered my last script unusable.

So, here is the full script. It works well, but I’m still making tweaks to it.

One issue I have is that certain users don’t exist on certain machines, so my development branch doesn’t quite work. I’ve managed to loop through an array containing user names, and another looping through IP addresses; however, I’ve not yet worked out a way to filter those arrays to remove IP addresses and user names that don’t exist on the target.
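As a sketch of one way to do that filtering: keep only the user names the machine actually knows about. The user list here is hypothetical, and on a real run the `id` check would be wrapped in `ssh` to the target host rather than run locally.

```shell
#!/bin/bash
# Hypothetical candidate users; "no_such_user_xyz" stands in for a user
# that doesn't exist on this machine.
USERS=(root no_such_user_xyz)
VALID_USERS=()
for u in "${USERS[@]}"; do
    # 'id -u' exits non-zero for an unknown user; against a remote target
    # this would be something like: ssh "$REM_HOST" id -u "$u"
    if id -u "$u" >/dev/null 2>&1; then
        VALID_USERS+=("$u")
    fi
done
echo "${VALID_USERS[@]}"
```

The same pattern would filter the IP address array, swapping the `id` check for a `ping -c1` test against each address.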

But basically, this script connects to machines using rsync over SSH, copies the files to the backup location, and optionally uploads them to Amazon S3.

So, without further ado, here is a wall of code..

#!/bin/bash

#Main Script file for backups.
#See Bitbucket Repo for further information https://bitbucket.org/thompsonmichael/backup-sys
#Michael Thompson 2018
# mikethompson@gmx.co.uk (GPG Key-ID: 062C03D9)
#Version 0.0.1

#VARIABLES
BCK_DEST=/mnt/Logical_Data
EXCLUDE_FILE=/home/michael/Script/rsync_exclude
S3_BUCKET=s3://
RSYNC_CMD_STD="azh --progress"
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
S3_CMD="-rHv --skip-existing --acl-private --continue-put --storage-class=STANDARD_IA --no-delete-removed --exclude-from=s3_exclude"
S3_EXTRA=$2
LOG_FILE="/home/michael/Script/log_file.log"
REM_HOST="192.168.0.2"
BLUE="\e[1;34m"
RED="\e[1;31m"
NORMAL_COL="\e[0m"

if ! [ -z "$2" ];
then
	if ! [ "$2" = "-clean" ];
	then
		echo "Running Custom S3 Command"
		S3_CMD="$S3_CMD $S3_EXTRA"
	fi
fi

echo Backing up systems
echo ______________
echo -e ${BLUE}S3 Bucket Configured: $RED $S3_BUCKET${NORMAL_COL}
echo -e ${BLUE}S3 Command is: $RED $S3_CMD${NORMAL_COL}
echo -e ${BLUE}Exclude File Path: $RED $EXCLUDE_FILE${NORMAL_COL}
echo -e ${BLUE}Running on: $RED $HOSTNAME${NORMAL_COL}
echo -e ${BLUE}Destination is: $RED $BCK_DEST${NORMAL_COL}

if [ -z "$1" ];
then
	echo -e ${BLUE}Command line passed: Empty ${NORMAL_COL}
else
	echo -e ${BLUE}Command line passed: $1 ${NORMAL_COL}
fi

echo
echo -----------------------------------------------------------------------
echo

#error function. pass as func error,code,message
function_error () {
	echo -e ${RED}"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
	echo -e "CRITICAL ERROR!"
	echo -e "Error occurred in: " $1 "Error returned was: " $2
	if [ -z "$3" ];
	then
		echo -e "Unknown Error, cannot advise. Check FAQ"
	else
		echo -e "$3"
	fi
	echo -e "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"${NORMAL_COL}
	return 1
}

#ENSURE DRIVE IS MOUNTED
if mountpoint -q $BCK_DEST
then
	echo -e ${BLUE}"Backup Location is mounted " $BCK_DEST ${NORMAL_COL}
else
	function_error "Backup Location not mounted" $? "Mount location and restart"
	exit
fi

if [ -z "$1" ];
then
	#--------------------------------------------------------------------------------------
	#Copy To Local Storage
	echo Command Returned: $?
	echo -e ${BLUE}Backing Up Dads${NORMAL_COL}

	wget -q --tries=10 --timeout=20 --spider http://google.com
	if [[ $? -eq 0 ]]; then
		echo -e "Internet Connected"
	else
		function_error "Internet Connection Down" "INT_DOWN $?" "Check Internet Connection"
		exit
	fi

	ping -c1 ${REM_HOST} -q >/dev/null 2>&1
	RET=$?
	if [ ${RET} -eq 0 ]; then
		echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}
		echo -e ${BLUE}Ping Command Returned: ${RET} ${NORMAL_COL}
		echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( ${REM_HOST} )"${NORMAL_COL}

		rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads
		RET=$?
		echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
		if ! [ "$RET" = "0" ];
		then
			function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
		fi

		rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER
		RET=$?
		echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
		if ! [ "$RET" = "0" ];
		then
			function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
		fi
	else
		echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}
		echo -e ${RED}"Ping Command Returned (Ping):" $RET ${NORMAL_COL}
		echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}
	fi

	echo -e ${BLUE}Backing up $HOSTNAME${NORMAL_COL}
	echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( $HOSTNAME )"${NORMAL_COL}

	rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian
	RET=$?
	echo -e ${BLUE}"Command Returned (RSync):" $RET${NORMAL_COL}
	if ! [ "$RET" = "0" ];
	then
		function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
	fi
fi

if [ "$1" = "-clean" ];
then
	ping -c1 ${REM_HOST} -q >/dev/null 2>&1
	RET=$?
	if [ ${RET} -eq 0 ]; then
		echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}
		echo -e ${BLUE}"Ping Command Returned (Ping):" $RET ${NORMAL_COL}
		echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD$RSYNC_CMD_CLEAN ( ${REM_HOST} )" ${NORMAL_COL}

		rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads
		RET=$?
		echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
		if ! [ "$RET" = "0" ];
		then
			function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
		fi

		rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER
		RET=$?
		echo "Command Returned (RSync):" $RET
		if ! [ "$RET" = "0" ];
		then
			function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
		fi
	else
		echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}
		echo -e ${RED}"Ping Command Returned:" $RET ${NORMAL_COL}
		echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}
	fi

	echo -e ${BLUE}Backing up $HOSTNAME ${NORMAL_COL}

	rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian
	RET=$?
	echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
	if ! [ "$RET" = "0" ];
	then
		function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
	fi

	#-----------------------------------------------------------------------------------------------------------------------------------
	# -s3clean has been added as a command line option, and must be passed as a second option alongside -clean.
	# It will cause an S3 clean event to be processed.
	# -clean on its own runs only a standard archive clean. S3 is not routinely cleaned unless -s3clean is explicitly passed.
	if [ "$2" = "-s3clean" ];
	then
		#Call Clean_S3
		source s3_cmd.sc
	fi
	#-----------------------------------------------------------------------------------------------------------------------------------
fi

if [ "$1" = "-s3" ];
then
	echo S3 destination is: $S3_BUCKET
	echo Amazon Upload Proceeding...

	echo Uploading $BCK_DEST/Backup/Dads/william/Pictures/
	s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Pictures/ $S3_BUCKET/Dads/Pictures/
	RET=$?
	echo "Command Returned (S3CMD):" $RET
	if ! [ "$RET" = "0" ];
	then
		function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
	fi

	echo Uploading $BCK_DEST/Backup/Dads/william/Documents/
	s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Documents/ $S3_BUCKET/Dads/Documents/
	RET=$?
	echo "Command Returned (S3CMD):" $RET
	if ! [ "$RET" = "0" ];
	then
		function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
	fi

	echo Uploading $BCK_DEST/Backup/Dads/william/Videos/
	s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Videos/ $S3_BUCKET/Dads/Videos/
	RET=$?
	echo "Command Returned (S3CMD):" $RET
	if ! [ "$RET" = "0" ];
	then
		function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
	fi

	echo Uploading $BCK_DEST/Backup/Michael-Debian/
	s3cmd sync $S3_CMD $BCK_DEST/Backup/Michael-Debian/ $S3_BUCKET/Michael-Debian/
	RET=$?
	echo "Command Returned (S3CMD):" $RET
	if ! [ "$RET" = "0" ];
	then
		function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
	fi

	exit
fi

#EOF

Blocking China

Tired of seeing Chinese IP addresses in my server logs trying SSH access, I decided to completely block them off.

Easy way to do this? Yup.


#!/bin/sh

DIR="/home/michael/System Scripts/Sineo_IPTABLES_Block"
cd "$DIR"
wget http://www.okean.com/antispam/iptables/rc.firewall.sinokorea
sed -i 's/INPUT/SINEO/g' "$DIR/rc.firewall.sinokorea"
sed -i '/iptables -A SINEO -m state --state ESTABLISHED,RELATED -j ACCEPT/d' "$DIR/rc.firewall.sinokorea"
chmod +x "$DIR/rc.firewall.sinokorea"
sudo "$DIR/rc.firewall.sinokorea"

This will download a formatted list of Chinese and Korean IP addresses, change the iptables rules to point to a new chain rather than the INPUT chain, remove the now-pointless ESTABLISHED rule, and install the chain. Obviously, change the file paths to suit your structure.

It will, however, require root to install the chain.
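To make the sed step concrete, here is what the substitution does to a single rule of the kind the downloaded file contains (the subnet below is made up for illustration): every INPUT reference is rewritten to target the dedicated SINEO chain instead.

```shell
#!/bin/sh
# A sample rule of the sort the okean list contains (subnet is made up)
RULE='iptables -A INPUT -s 1.0.1.0/24 -j DROP'
OUT=$(echo "$RULE" | sed 's/INPUT/SINEO/g')
echo "$OUT"
# → iptables -A SINEO -s 1.0.1.0/24 -j DROP
```

Keeping the blocks in their own chain means they can be flushed or replaced without touching the rest of your INPUT rules.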

Backup, It’s Important!

Recently I switched my last computer over to Linux, and this meant I had to rethink my backup strategy. I like to copy all important files over to a central area, maybe even to a remote NAS box. On Windows this was done via a simple batch file each user ran when they felt like it, copying everything to a shared drive on the network. On Linux, I can utilise the power of rsync, and do it all from a local terminal. So I cooked up the following bash script.
This has a few dependencies; it requires the following:

  1. s3cmd installed and configured to the S3 account
  2. rsync installed on all machines. (Not installed as standard on Debian systems.)
  3. An available drive or NAS box attached.
  4. If passwordless operation is required for connecting to the other boxes, you will need to set up passwordless SSH, which I’ll not cover here; there are countless other tutorials covering this.

It can be run from a simple cron job, or manually when required. There are command line options that can be passed:

  1. -clean
    1. The -clean option causes rsync to delete from the archive any files that are no longer on the source. It will also remove any files that were previously copied but are now excluded by the exclusion file.
  2. -s3
    1. Causes an upload event and synchronises the Amazon S3 archive with the local one. This will not, however, trigger a local update event.
  3. Empty command line
    1. Simply backs up files from the sources to the target machine; this will not clean the archives, nor will it trigger an S3 upload.
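The options above map naturally onto cron. A hypothetical crontab, assuming the script is saved as /home/michael/Script/backup.sh: a plain backup every night, a -clean run weekly, and an -s3 upload monthly.

```
# m h dom mon dow  command
0 1 * * *  /home/michael/Script/backup.sh           # nightly plain backup
0 2 * * 0  /home/michael/Script/backup.sh -clean    # weekly archive clean
0 3 1 * *  /home/michael/Script/backup.sh -s3       # monthly S3 upload
```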

So, let’s break down the source and have a look.


if mountpoint -q $BCK_DEST
then
echo "Backup Location is mounted"

if [ -z "$1" ];
then
echo rsync command is: -$RSYNC_CMD_STD
#--------------------------------------------------------------------------------------
#Copy To Local Storage
echo Backing Up
rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2

#--------------------------------------------------------------------------------------
fi

This first section is the overall backup, responsible for copying the source to the target. We initially connect rsync to the target via SSH, and copy the entire home folder, minus the excluded files and directories. It only runs when the command line is empty: the default run, if you like. The variable $1 is the first command line option passed to the script. One of the important things we do here is ensure that the drive is actually mounted in the system; if it is not mounted, everything else will fail.
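The dispatch on $1 reduces to a small sketch (the echo strings here are placeholders for the real backup work):

```shell
#!/bin/bash
# Minimal sketch of the option dispatch: an empty first argument
# selects the default backup run.
dispatch() {
    if [ -z "$1" ]; then
        echo "default backup"
    elif [ "$1" = "-clean" ]; then
        echo "clean backup"
    elif [ "$1" = "-s3" ]; then
        echo "s3 upload"
    fi
}

dispatch            # → default backup
dispatch -clean     # → clean backup
```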

if [ "$1" = "-clean" ];
then
echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN
echo Backing Up
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
fi

This section is the clean up. It runs the standard backup, along with the rsync options "--delete-after --delete-excluded", which clean up the archives.
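Since the clean options are appended to the standard flag string, the final command line composes like this (echoed rather than run, so it is safe to try):

```shell
#!/bin/bash
# RSYNC_CMD_CLEAN carries a leading space, so plain concatenation after
# the short-option cluster yields a valid rsync command line.
RSYNC_CMD_STD=avzh
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
CMD="rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN"
echo "$CMD"
# → rsync -avzh --delete-after --delete-excluded
```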

if [ "$1" = "-s3" ];
then
echo S3 destination is: $S3_BUCKET
echo Amazon Upload Proceeding
s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1
exit
fi

This section is the Amazon upload. It’s not quite complete, and there are more directories to add to the upload. However, uploading to Amazon is not the quickest thing in the world, and I’ll add the other important directories as and when the uploads complete.

And here it is, in its entire bash like glory.

#!/bin/bash

#VARIABLES
BCK_DEST=/mnt/sdc1
EXCLUDE_FILE=rsync_exclude
S3_BUCKET="S3 Bucket"
RSYNC_CMD_STD=avzh
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
S3_CMD="-rH --skip-existing --delete-removed --acl-private"

echo Backing up systems
echo ______________
echo Exclude File Path: $EXCLUDE_FILE
echo running on: $HOSTNAME
echo destination is: $BCK_DEST
echo Command line passed: $1

#ENSURE DRIVE IS MOUNTED
if mountpoint -q $BCK_DEST
then
echo "Backup Location is mounted"

if [ -z "$1" ];
then
echo rsync command is: -$RSYNC_CMD_STD
#--------------------------------------------------------------------------------------
#Copy To Local Storage
echo Backing Up
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2

#--------------------------------------------------------------------------------------
fi

if [ "$1" = "-clean" ];
then
echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN
echo Backing Up 1
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
fi

if [ "$1" = "-s3" ];
then
echo S3 destination is: $S3_BUCKET
echo Amazon Upload Proceeding
s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1
exit
fi

else
echo "Backup Location is not mounted"
exit
fi

Hope you find this useful, and if you have any ideas how to improve it, let me know.