A quick update to the backup script. Instead of keeping a fixed list of folders in the backup directory to upload to S3, the script now uploads everything in the backup location that isn't specifically excluded by the exclude file. This trims a good few lines from the script.
A quick note on the exclude file: you must exclude both the directory and its contents, as S3 has no concept of folders.
So, to exclude the folder /back/folder and its contents, your exclude file must contain both /back/folder and /back/folder/*.
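For reference, here's a minimal sketch of what that could look like, assuming the patterns are handed to s3cmd via its --exclude-from option (presumably what the $S3_CMD variable in the script carries). The file name and path here are just examples:

# Hypothetical exclude file, one glob pattern per line.
# Both the folder and its contents must be listed.
cat > /back/s3-exclude.txt <<'EOF'
/back/folder
/back/folder/*
EOF

# Example of how the flag might be wired into the script's $S3_CMD variable.
S3_CMD="--exclude-from /back/s3-exclude.txt"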
if [ "$1" = "-s3" ]; then #This code is still subject to Testing. It shouldnt require a re upload of data. #However, this uploads the WHOLE backup to the cloud. It does not filter out folders at this time. #Use the S3 Exclude file for this purpose. shopt -s dotglob shopt -s nullglob array=($BCK_DEST/Backup/*) for dir in "${array[@]}" do echo $(basename "${dir%.*}") echo Uploading $dir FOL=$(basename "${dir%.*}") s3cmd sync $S3_CMD $dir $S3_BUCKET/$FOL/ echo ....... if ! [ "$?" = "0" ]; then function_error "S3CMD" $? "Check S3CMD Command Line, and Dirs" fi done fi