Another backup post

So I decided to redo my backup scripts, as my existing script file had become huge, complicated and difficult to follow.

It was a good way for me to learn Bash scripting, however. The idea this time was to vastly simplify the whole process. After thinking over the best approach, I decided to have each machine independently back up to a network drive, rather than having a single machine do the grunt work running one script.

So each computer on the network has the drive mounted, and the script file placed into the main user’s crontab. I could have used the root crontab to copy the whole /home/ directory of course, but each machine only has one real user, so I opted for this instead.
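
A crontab entry along these lines kicks the script off each night; the schedule and the script name here are illustrative, not lifted from my actual setup:

#m h dom mon dow  command
0 2 * * * /mnt/dlink_nfs/backup-script/backup.sh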

The file that runs from crontab is very simple:

source /mnt/dlink_nfs/backup-script/var-dec
rsync -va --delete-after --delete-excluded --exclude-from="$FOLDER_NFS/backup-script/exclude.lst" "/home/$USER" "$FOLDER_NFS/backup-test/$DIRNAME"

And that’s it. It just uses rsync to copy the home directory to the network drive. The referenced source file is just shared variable declarations.
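
For illustration, var-dec might look something like this; the values are made up, but the names match the variables the scripts use (DIRNAME just needs to be unique per machine):

#var-dec - shared variable declarations (illustrative values)
FOLDER_NFS=/mnt/dlink_nfs            #root of the mounted network share
DIRNAME=$(hostname)                  #per-machine folder under backup-test
s3_bucket=s3://my-backup-bucket      #example bucket, used by the Pi script below
s3_cmd="sync -rH --skip-existing"    #example s3cmd subcommand and options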

Now, that’s not quite enough for me to be happy with a backup system, so I use a Raspberry Pi to run a second set of scripts from its crontab. Those files are responsible for uploading to Amazon S3, and also for copying to a secondary NAS.


#!/bin/bash

source /mnt/dlink_nfs/backup-script/var-dec
LOG=/home/pi/uploads3.log

echo "Script Started: $(date)" >> "$LOG"

#Exit if another instance of this script is already running.
if pidof -x "$(basename "$0")" -o $$ >/dev/null; then
    echo "Another instance of this script is already running"
    echo "Script already running, exiting" >> "$LOG"
    echo "-----------------------" >> "$LOG"
    exit 1
fi

if [[ $1 == 'clean' ]]
then
    echo "clean command passed" >> "$LOG"
    rsync -vruO --delete-after "$FOLDER_NFS/backup-test" /mnt/samba/
    echo "Clean completed $(date)" >> "$LOG"
    exit 0
else
    #Mirror the backup to the secondary NAS, but only if the share is mounted.
    if mountpoint -q /mnt/samba
    then
        echo "Samba share mounted, starting rsync" >> "$LOG"
        rsync -vruO "$FOLDER_NFS/backup-test" /mnt/samba/
    fi

    cd "$FOLDER_NFS/backup-test/" || exit 1
    echo "Starting S3 uploads" >> "$LOG"

    #Build an array of the top-level backup directories, including hidden ones.
    shopt -s dotglob
    shopt -s nullglob
    array=(*/)

    for dir in "${array[@]}"
    do
        echo "Currently running S3 on $dir" >> "$LOG"
        dir=${dir%/}
        #Cap each upload at 30 minutes so one huge directory can't stall the run.
        timeout 30m s3cmd $s3_cmd "$dir" "$s3_bucket"
        echo "Completed uploading $dir" >> "$LOG"
    done

    echo "Finished Script: $(date)" >> "$LOG"
    echo "--------------------" >> "$LOG"
fi

And that file, basically, ensures the script isn’t already running, copies the backup to another NAS, then iterates through each directory, uploading it to S3. I use timeout to limit each upload to 30 minutes to prevent overruns. Once the initial upload has completed, this limit can be removed.
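
On the Pi side, the crontab needs just two entries, something like the following; again, the times and the script name are illustrative:

#Nightly upload run, plus a weekly clean pass on Sundays (example schedule)
0 3 * * * /home/pi/uploads3.sh
0 4 * * 0 /home/pi/uploads3.sh clean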

You can view the most up-to-date code in the git repository on my GitHub: https://github.com/mikethompson/new-backup

TwitterBot

I’d thought about writing a script for a Twitter bot, and the only thing I could think of to post was PiHole stats from my PiHole server.

In order to create a Twitter bot, you have to register as a developer, but that’s easy enough. Next up was finding an interface that could be scripted from Bash, as I didn’t fancy trying Python, and I found the excellent Twurl package.
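
As a taste of where this is heading, the whole bot boils down to something like the sketch below. It’s untested as written here: it assumes the older Pi-hole api.php endpoint and the jq package, and Twurl must already be authorised against your developer account.

#Untested sketch: tweet today's Pi-hole block count
STATS=$(curl -s http://pi.hole/admin/api.php)
BLOCKED=$(echo "$STATS" | jq -r '.ads_blocked_today')
twurl -d "status=Pi-hole blocked $BLOCKED queries today" /1.1/statuses/update.json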

Here’s a link to my repo where I’ve stored the code: https://github.com/mikethompson/PiHoleStats

I’ll update this with a step-by-step guide when I have more time.

Update to the Backup

A quick update to the backup script. Instead of keeping a set list of folders to upload to S3 in the backup directories, the script now uploads everything in the backup location that’s not specifically excluded by the exclude file. This cuts a good few lines out of the script.

A quick note on the exclude file: you must exclude both the directory and its contents, as S3 has no concept of folders.

So, to exclude the folder /back/folder and its contents, your exclude file must contain both /back/folder and /back/folder/*.
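
In other words, the relevant lines of the exclude file are literally:

/back/folder
/back/folder/*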


	if [ "$1" = "-s3" ]; 
		then 
			#This code is still subject to Testing. It shouldnt require a re upload of data.
#However, this uploads the WHOLE backup to the cloud. It does not filter out folders at this time.
#Use the S3 Exclude file for this purpose.
			shopt -s dotglob
			shopt -s nullglob
			array=($BCK_DEST/Backup/*)
			for dir in "${array[@]}"
				do
					echo $(basename "${dir%.*}")
					echo Uploading $dir
					FOL=$(basename "${dir%.*}")
					s3cmd sync $S3_CMD $dir $S3_BUCKET/$FOL/
					echo .......
				if ! [ "$?" = "0" ];
					then
					function_error "S3CMD" $? "Check S3CMD Command Line, and Dirs"
				fi
				done

	fi

Backups.. (Again!)

For a little while now, I’ve been working on a new backup script, as switching from Windows to a full Linux environment rendered my last script unusable.

So, here is the full script. It works well, but I’m still making tweaks to it.

One issue I have is that certain users don’t exist on certain machines, so my development branch doesn’t quite work. I’ve managed to loop through one array containing user names and another containing IP addresses, but I’ve not yet worked out a way to filter those arrays to remove the IPs and user names that don’t exist on the target.
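
One approach I’ve been toying with is to probe each combination before running the backup. This is an untested sketch with placeholder names, not the development branch itself:

#Untested sketch: filter out hosts that are down and users that can't log in
HOSTS=("192.168.0.2" "192.168.0.3")
USERS=("william" "michael")

for host in "${HOSTS[@]}"; do
    #Skip hosts that don't answer a single ping within 2 seconds
    ping -c1 -W2 "$host" >/dev/null 2>&1 || continue
    for user in "${USERS[@]}"; do
        #Skip users that can't be reached over passwordless SSH on this host
        ssh -o BatchMode=yes "$user@$host" true >/dev/null 2>&1 || continue
        echo "Would back up $user@$host"
        #rsync for this pair would go here
    done
done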

But basically, this script connects to machines using rsync over SSH, copies the files to the backup location, and optionally uploads them to Amazon S3.

So, without further ado, here is a wall of code..

#!/bin/bash
#Main script file for backups.
#See Bitbucket repo for further information https://bitbucket.org/thompsonmichael/backup-sys
#Michael Thompson 2018
#mikethompson@gmx.co.uk (GPG Key-ID: 062C03D9)
#Version 0.0.1

#VARIABLES
BCK_DEST=/mnt/Logical_Data
EXCLUDE_FILE=/home/michael/Script/rsync_exclude
S3_BUCKET=s3://
RSYNC_CMD_STD="azh --progress"
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
S3_CMD="-rHv --skip-existing --acl-private --continue-put --storage-class=STANDARD_IA --no-delete-removed --exclude-from=s3_exclude"
S3_EXTRA=$2
LOG_FILE="/home/michael/Script/log_file.log"
REM_HOST="192.168.0.2"
BLUE="\e[1;34m"
RED="\e[1;31m"
NORMAL_COL="\e[0m"

#Append any extra S3 options passed as the second argument (unless it is -clean).
if ! [ -z "$2" ]; then
    if ! [ "$2" = "-clean" ]; then
        echo "Running Custom S3 Command"
        S3_CMD="$S3_CMD $S3_EXTRA"
    fi
fi

echo Backing up systems
echo ______________
echo -e ${BLUE}S3 Bucket Configured: $RED $S3_BUCKET${NORMAL_COL}
echo -e ${BLUE}S3 Command is: $RED $S3_CMD${NORMAL_COL}
echo -e ${BLUE}Exclude File Path: $RED $EXCLUDE_FILE${NORMAL_COL}
echo -e ${BLUE}Running on: $RED $HOSTNAME${NORMAL_COL}
echo -e ${BLUE}Destination is: $RED $BCK_DEST${NORMAL_COL}
if [ -z "$1" ]; then
    echo -e ${BLUE}Command line passed: Empty ${NORMAL_COL}
else
    echo -e ${BLUE}Command line passed: $1 ${NORMAL_COL}
fi
echo
echo -----------------------------------------------------------------------
echo

#Error function. Call as: function_error <where> <code> <message>
function_error () {
    echo -e ${RED}"xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
    echo -e "CRITICAL ERROR!"
    echo -e "Error occurred in: $1 Error returned was: $2"
    if [ -z "$3" ]; then
        echo -e "Unknown Error, cannot advise. Check FAQ"
    else
        echo -e "$3"
    fi
    echo -e "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"${NORMAL_COL}
    return 1
}

#ENSURE DRIVE IS MOUNTED
if mountpoint -q $BCK_DEST
then
    echo -e ${BLUE}"Backup Location is mounted" $BCK_DEST ${NORMAL_COL}
else
    function_error "Backup Location not mounted" $? "Mount location and restart"
    exit
fi

if [ -z "$1" ]; then
#--------------------------------------------------------------------------------------
#Copy To Local Storage
    echo -e ${BLUE}Backing Up Dads${NORMAL_COL}
    #Check the internet connection before doing anything.
    wget -q --tries=10 --timeout=20 --spider http://google.com
    if [[ $? -eq 0 ]]; then
        echo -e "Internet Connected"
    else
        function_error "Internet Connection Down" "INT_DOWN $?" "Check Internet Connection"
        exit
    fi

    ping -c1 ${REM_HOST} -q >/dev/null 2>&1
    RET=$?
    if [ ${RET} -eq 0 ]; then
        echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}
        echo -e ${BLUE}Ping Command Returned: ${RET} ${NORMAL_COL}
        echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( ${REM_HOST} )"${NORMAL_COL}
        rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads
        RET=$?    #Capture the exit code before echo resets $?
        echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
        if ! [ "$RET" = "0" ]; then
            function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
        fi
        rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER
        RET=$?
        echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
        if ! [ "$RET" = "0" ]; then
            function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
        fi
    else
        echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}
        echo -e ${RED}"Ping Command Returned (Ping):" $RET ${NORMAL_COL}
        echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}
    fi

    echo -e ${BLUE}Backing up $HOSTNAME${NORMAL_COL}
    echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD ( $HOSTNAME )"${NORMAL_COL}
    rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian
    RET=$?
    echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
    if ! [ "$RET" = "0" ]; then
        function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
    fi
fi

if [ "$1" = "-clean" ]; then
    ping -c1 ${REM_HOST} -q >/dev/null 2>&1
    RET=$?
    if [ ${RET} -eq 0 ]; then
        echo -e ${BLUE}"Host Is Alive"${NORMAL_COL}
        echo -e ${BLUE}"Ping Command Returned (Ping):" ${RET} ${NORMAL_COL}
        echo -e ${BLUE}rsync command is: "-$RSYNC_CMD_STD$RSYNC_CMD_CLEAN ( ${REM_HOST} )" ${NORMAL_COL}
        rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE william@${REM_HOST}:/home/william $BCK_DEST/Backup/Dads
        RET=$?
        echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
        if ! [ "$RET" = "0" ]; then
            function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
        fi
        rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN -e ssh --exclude-from $EXCLUDE_FILE $USER@${REM_HOST}:/home/$USER $BCK_DEST/Backup/$USER
        RET=$?
        echo "Command Returned (RSync):" $RET
        if ! [ "$RET" = "0" ]; then
            function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
        fi
    else
        echo -e ${RED}"Host ${REM_HOST} failed ping monitoring on `date`"${NORMAL_COL}
        echo -e ${RED}"Ping Command Returned:" $RET ${NORMAL_COL}
        echo -e ${RED}"${REM_HOST} is Dead"${NORMAL_COL}
    fi

    echo -e ${BLUE}Backing up $HOSTNAME ${NORMAL_COL}
    rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE /home/michael $BCK_DEST/Backup/Michael-Debian
    RET=$?
    echo -e ${BLUE}"Command Returned (RSync):" $RET ${NORMAL_COL}
    if ! [ "$RET" = "0" ]; then
        function_error "RSYNC" $RET "Check RSYNC Command Line, and RSYNC Dirs"
    fi

    #-----------------------------------------------------------------------------------------------------------------------------------
    #-s3clean has been added as a command line option, and must be passed as a second argument to -clean.
    #It will cause an S3 clean event to be processed.
    #-clean on its own runs only a standard archive clean. S3 is not routinely cleaned, unless explicitly requested with -s3clean.
    if [ "$2" = "-s3clean" ]; then
        #Call Clean_S3
        source s3_cmd.sc
    fi
    #-----------------------------------------------------------------------------------------------------------------------------------
fi

if [ "$1" = "-s3" ]; then
    echo S3 destination is: $S3_BUCKET
    echo Amazon Upload Proceeding...

    echo Uploading $BCK_DEST/Backup/Dads/william/Pictures/
    s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Pictures/ $S3_BUCKET/Dads/Pictures/
    RET=$?
    echo "Command Returned (S3CMD):" $RET
    if ! [ "$RET" = "0" ]; then
        function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
    fi

    echo Uploading $BCK_DEST/Backup/Dads/william/Documents/
    s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Documents/ $S3_BUCKET/Dads/Documents/
    RET=$?
    echo "Command Returned (S3CMD):" $RET
    if ! [ "$RET" = "0" ]; then
        function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
    fi

    echo Uploading $BCK_DEST/Backup/Dads/william/Videos/
    s3cmd sync $S3_CMD $BCK_DEST/Backup/Dads/william/Videos/ $S3_BUCKET/Dads/Videos/
    RET=$?
    echo "Command Returned (S3CMD):" $RET
    if ! [ "$RET" = "0" ]; then
        function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
    fi

    echo Uploading $BCK_DEST/Backup/Michael-Debian/
    s3cmd sync $S3_CMD $BCK_DEST/Backup/Michael-Debian/ $S3_BUCKET/Michael-Debian/
    RET=$?
    echo "Command Returned (S3CMD):" $RET
    if ! [ "$RET" = "0" ]; then
        function_error "S3CMD" $RET "Check S3CMD Command Line, and Dirs"
    fi
    exit
fi
#EOF

Blocking China

Tired of seeing Chinese IP addresses in my server logs trying SSH access, I decided to block them off completely.

Is there an easy way to do this? Yup.


#!/bin/sh

#Working directory for the block list (change to suit your structure)
DIR="/home/michael/System Scripts/Sineo_IPTABLES_Block"
FILE="$DIR/rc.firewall.sinokorea"

cd "$DIR" || exit 1

#Fetch the ready-made iptables rules for Chinese and Korean IP ranges
wget -O "$FILE" http://www.okean.com/antispam/iptables/rc.firewall.sinokorea

#Point the rules at a dedicated SINEO chain rather than INPUT
sed -i 's/INPUT/SINEO/g' "$FILE"

#Remove the now-pointless ESTABLISHED,RELATED accept rule
sed -i '/iptables -A SINEO -m state --state ESTABLISHED,RELATED -j ACCEPT/d' "$FILE"

chmod +x "$FILE"

sudo "$FILE"

This downloads a formatted list of Chinese and Korean IP addresses, changes the iptables rules to append to a new SINEO chain rather than the INPUT chain, removes the now-pointless ESTABLISHED,RELATED rule, and installs the chain. Obviously, change the file paths to suit your structure.

It will, however, require root to install the chain.
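
One thing to verify before relying on this: as far as I can tell the downloaded rules only populate the chain, so if the SINEO chain doesn’t already exist you’d need to create it and hook it into INPUT once. Check the downloaded file yourself, as this is an assumption on my part:

#One-time setup, assuming the downloaded script doesn't create the chain itself
sudo iptables -N SINEO
sudo iptables -I INPUT -j SINEO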

Backup, It’s Important!

Recently I switched my last computer over to Linux, and this meant I had to rethink my backup strategy. I like to copy all important files over to a central area, maybe even to a remote NAS box. On Windows this was done via a simple batch file that each user ran whenever they felt like it, copying their files to a shared drive on the network. On Linux, I can utilise the power of rsync and do it all from a local terminal. So I cooked up the following bash script.
This script has a few dependencies; it requires the following:

  1. s3cmd installed and configured to the S3 account
  2. rsync installed on all machines. (Not installed as standard on Debian systems.)
  3. An available drive or NAS box attached.
  4. If passwordless operation is required when connecting to the other boxes, you will need to set up passwordless SSH, which I’ll not cover in detail here (there are countless tutorials for it), but see the example just after this list.
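
For reference, the key setup boils down to a couple of commands; the user and host below are examples:

#Generate a key pair on the machine running the backups (accept the defaults)
ssh-keygen -t ed25519
#Copy the public key to each machine to be backed up
ssh-copy-id michael@192.168.0.2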

It can be run from a simple cron job, or manually when required. There are a few command line options (example invocations follow the list):

  1. -clean
    1. The -clean option causes rsync to delete from the archive any files that are no longer on the source. It will also remove any files that were previously copied but are now listed in the exclusion file.
  2. -s3
    1. Causes an upload event and synchronises the Amazon S3 archive with the local copy. This will not, however, trigger a local update event.
  3. Empty command line
    1. Simply backs up files from the sources to the target machine; this will not clean the archives, nor will it trigger an S3 upload.
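
In practice, that means invocations like these (assuming you’ve saved the script as backup.sh, which is a name I’ve made up here):

./backup.sh            #plain backup: copy the sources to the target, nothing else
./backup.sh -clean     #backup plus --delete-after --delete-excluded on the archive
./backup.sh -s3        #synchronise the local archive up to Amazon S3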

So, let’s break down the source and have a look.


if mountpoint -q $BCK_DEST
then
    echo "Backup Location is mounted"

    if [ -z "$1" ];
    then
        echo rsync command is: -$RSYNC_CMD_STD
        #--------------------------------------------------------------------------------------
        #Copy To Local Storage
        echo Backing Up
        rsync -$RSYNC_CMD_STD -e ssh --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
        echo Backing up $HOSTNAME
        rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
        #--------------------------------------------------------------------------------------
    fi

This first section is the overall backup, responsible for copying the source to the target. We connect rsync to the target over SSH and copy the entire home folder, minus the excluded files and directories. It’s only run should the command line be empty: the default run, if you like. The variable $1 is the first command line option passed to the script, and REMOTE SYSTEM is a placeholder for the real user@host:path, which I’ve left out here. One of the important things we do first is ensure that the drive is actually mounted in the system; if it is not mounted, everything else will fail.

if [ "$1" = "-clean" ];
then
echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN
echo Backing Up
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
echo Backing up $HOSTNAME
rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
fi

This section is the clean-up. It runs the standard backup along with the rsync options "--delete-after --delete-excluded", which clean up the archives.

if [ "$1" = "-s3" ];
then
echo S3 destination is: $S3_BUCKET
echo Amazon Upload Proceding
s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1
exit
fi

This section is the Amazon upload. It’s not quite complete, and there are more directories to add to the upload. However, uploading to Amazon is not the quickest thing in the world, and I’ll add the other important directories as and when the uploads complete.

And here it is, in all its Bash-like glory.

#!/bin/bash

#VARIABLES
BCK_DEST=/mnt/sdc1
EXCLUDE_FILE=rsync_exclude
S3_BUCKET="S3 Bucket"
RSYNC_CMD_STD=avzh
RSYNC_CMD_CLEAN=" --delete-after --delete-excluded"
S3_CMD="-rH --skip-existing --delete-removed --acl-private"

echo Backing up systems
echo ______________
echo Exclude File Path: $EXCLUDE_FILE
echo Running on: $HOSTNAME
echo Destination is: $BCK_DEST
echo Command line passed: $1

#ENSURE DRIVE IS MOUNTED
if mountpoint -q $BCK_DEST
then
    echo "Backup Location is mounted"

    if [ -z "$1" ];
    then
        echo rsync command is: -$RSYNC_CMD_STD
        #--------------------------------------------------------------------------------------
        #Copy To Local Storage
        echo Backing Up
        rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
        echo Backing up $HOSTNAME
        rsync -$RSYNC_CMD_STD --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
        #--------------------------------------------------------------------------------------
    fi

    if [ "$1" = "-clean" ];
    then
        echo RSync Clean Command is: -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN
        echo Backing Up 1
        rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/1
        echo Backing up $HOSTNAME
        rsync -$RSYNC_CMD_STD$RSYNC_CMD_CLEAN --exclude-from $EXCLUDE_FILE REMOTE SYSTEM $BCK_DEST/2
    fi

    if [ "$1" = "-s3" ];
    then
        echo S3 destination is: $S3_BUCKET
        echo Amazon Upload Proceeding
        s3cmd sync $S3_CMD $BCK_DEST/1 $S3_BUCKET/1
        exit
    fi

else
    echo "Backup Location is not mounted"
    exit
fi

Hope you find this useful, and if you have any ideas on how to improve it, let me know.