About
smalltarbkp is, as the name suggests, a small tar backup script. It is written in bash shell script, designed for simplicity and purposely for Linux systems, and it is easy to use and easy to modify for those who want to improve it, change it or embed their own custom controls or reports.
The script smalltarbkp does not keep any configuration files, which makes it possible to host multiple copies of the script, each with its own name and purpose, keeping all updates, configuration and details within itself.
The flexibility of the script is not just that it is easy to use and easy to modify for those with scripting knowledge; it can also be used as-is, with many options for different types of backups, retentions, folders, systems and target locations, even the MEGA.nz cloud, all driven by a few simple interactive Q/A questions.
Installation
Before we can use the smalltarbkp script to back up our data we need to install it, and like the rest of the script the installation is just as simple.
You can install it with a few simple commands and make sure everything is working correctly before adding it to the crontab scheduler.
Step 1
Create a directory where you will keep the smalltarbkp script.
$ mkdir ~/scripts
$ cd ~/scripts
Step 2
Download the latest version of the script and save it into your newly created folder.
$ wget -qO smalltarbkp http://bit.ly/2KprwV0
Step 3
Before you can run the script you need to make it executable, and it is also worth creating an alias so the script can be called from anywhere.
$ chmod 755 smalltarbkp
$ echo "alias smalltarbkp='$HOME/scripts/smalltarbkp'" >> ~/.bashrc
$ source ~/.bashrc
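If you want to confirm that the alias works before continuing, you can run the built-in help from the freshly sourced shell; this is only a suggested sanity check, not a required step.
$ smalltarbkp -help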
Step 4
We are now going to configure our first smalltarbkp backup.
$ smalltarbkp -configure
The name of the script is not important; you can call it whatever you want, as long as it makes sense to you. Depending on the configuration required (target location, local, cloud, retention, etc.) you may keep multiple copies of the same script with different names.
Step 5 (Cloud Target)
If you are planning to use the MEGA.nz cloud as a target you must have an account set up with https://mega.nz before running the setup, as you will need your user name and password ready.
In addition you will also need to go to Packages and download the installation package for your own Linux distribution:
(megacmd-{Ubuntu|Debian|Raspbian})
The reason I chose MEGA.nz is that by default you get a nice amount of FREE storage; the initial FREE quota / capacity is 50GB, which is enough to store most of your very important files and folders, and additional storage can be purchased via MEGA.nz itself.
Before installing “megacmd” make sure you have the additional required libraries also installed.
$ sudo apt-get install apt-transport-https libcrypto++9
$ sudo dpkg --install megacmd-Raspbian_9.0_armhf.deb
$ sudo dnf install megacmd-Fedora_29.x86_64.rpm
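Once the package is installed it is worth checking that the MEGAcmd client is available on your PATH before running -configure. This is only a suggested sanity check using the standard MEGAcmd tools, not something the script itself requires.
$ mega-version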
Configure
$ smalltarbkp -configure
As previously mentioned, -configure is an interactive Q/A menu that updates the script's own variables based on your answers.
With this method no additional configuration files are required and everything required for the script to function is stored internally, keeping it clean and organised.
Multiple copies of the script may be run on a single system, each with its own name and its own set of details and variables based on your answers at the -configure stage.
Not all questions will be asked during the configuration stage; it all depends on the answers you give, but for now let's assume your answer is YES to all questions. Let us go through them ONE by ONE.
This is an example of the questions in action:
Q/A – Maximum size of each TAR file in MEGABYTES (100):
This question sets the maximum size a tar file can reach before the backup is split into the next file. It matters because certain file systems have limitations on file size, uploading a large file to the cloud carries an additional RISK (in case of a failure the whole file needs to be uploaded again), and smaller files are also easier and quicker to encrypt and decrypt.
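For those curious how a size cap like this is typically achieved with plain tar, the usual approach is to pipe the archive through split. The sketch below only illustrates the technique and is not the script's exact internals; the 100 MB value and file names are examples.
# Stream a gzipped tar archive and cut it into 100 MB pieces
# named backup.tar.gz-00, backup.tar.gz-01, and so on
$ tar -czf - /path/to/data | split -d -b 100m - backup.tar.gz-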
Q/A – Target location NFS Mount | USB | Folder (/backups):
This is the ROOT path where all your backup images / tar files will be stored. I, for instance, have an NFS mount point onto additional storage where I keep my data. If you are planning to upload your backup to the MEGA.nz cloud you still need an initial storage location, whether it is for temporarily storing your tar files before they are encrypted and uploaded or for long-term keeping.
I recommend having sufficient local storage to run the backup; NFS is a great alternative but very slow for the encryption and decryption processing, as it does everything over the local network.
Q/A – How many backup copies to keep before deleting (31):
Generally I would say keep at least 31 copies if you have the space, BUT that depends on the frequency of your backups. For example, if you run your backups only once per day I would keep 31 backup copies, but if you run them twice per day I would keep 62 copies. If you only run the backup once per week I would keep only 4 or 5 copies. All of this is your preferred choice of backup retention; remember this is the number of copies rather than the number of days.
Q/A – How often would you like to run a full backup (7):
Again this will all depend on your own preference. You may wish to make every backup a full backup, in which case you can set this to 1. Alternatively you can run the backup with the -full flag to force a full backup every time.
Example: $ smalltarbkp [-path <path> -name <name>] [-full]
However, if a full backup is large but the amount of data change is low on your system it is recommended to run a lower number of full backups.
For example: run the backup daily with 31 copies and a full backup only once per week, so every 7th copy will be a new full backup.
The difference between a full backup and a non-full backup depends largely on the change rate of the files and folders you are backing up; in my case a full backup can be 1.5 gigabytes while a non-full (incremental) backup can be as little as a few hundred kilobytes or a few megabytes.
The backup script uses tar's snapshot file, a list of all files already backed up, so the incremental backup won't back up the same file twice. When a full backup is due the script deletes this snapshot file, forcing a full scan and a full backup of all the files again; that is how this script creates incrementals.
It is important that you know how often you want a full backup, as that will dictate the size of each backup and the total amount of space required for all backup storage.
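For reference, this snapshot mechanism is what GNU tar calls a listed-incremental backup. The sketch below is only an illustration of the idea, with made-up file names, not the script's exact internals.
# First run: the snapshot file does not exist yet, so everything is archived (a full backup)
$ tar -czf full.tar.gz --listed-incremental=backup.snar /path/to/data
# Later runs: only files changed since the snapshot are archived (an incremental backup)
$ tar -czf incr-01.tar.gz --listed-incremental=backup.snar /path/to/data
# Deleting the snapshot file forces the next run to be a full backup again
$ rm backup.snar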
Q/A – Do you want to Encrypt the backup files? YES/NO
This is really a simple question – do you want to encrypt your backup files Yes or No?
I strongly recommend you encrypt your files if you are planning to upload them to the cloud. If you are running local backups only, and only you have access to the system and files, then keep them non-encrypted, as that makes restores / recoveries a lot quicker because you don't need to decrypt before searching for a file or restoring a file or folder. If you are planning to upload to the cloud it is strongly recommended to accept the inconvenience and encrypt the files.
Q/A – Enter Preferred Encryption Password (Example: Passw0rd):
This question will only appear if you answered YES to the previous question to encrypt your backups. The password itself is your choice; as an example I include an UPPER case letter, lower case letters and a number, but this is entirely up to you.
Q/A – Sub-Folder Name for all backups (No Spaces/Slashes):
There is no right or wrong answer to this question as long as you do not include spaces or slashes (“ \ “ or “ / ”) in its name as that will potentially confuse the script and cause issues.
This sub-folder will be stored under the original ROOT location and it exists because you may be running multiple scripts, or even the same script for multiple sources. In my case I have the main script writing into a BKP sub-folder and the testing script into a TEST folder, so that my real backups don't mix with the testing backups.
It could be the type of backup (INC / FULL), the name of the source system or even the name of a PET; as I mentioned before there is no right or wrong, the only rule is to avoid using spaces or slashes.
Q/A – Backup exclude list file location (none):
Not yet implemented, scheduled for a future release.
Q/A – Do you want to backup to MEGA.nz (YES|NO):
The question is simple – do you want to send your backups to the MEGA.nz cloud yes or no?
This is entirely up to you, but if you choose to configure your backups to upload to the MEGA.nz cloud you must have an account set up with https://mega.nz before running the setup. Your username and password will be required by the script.
Additionally you will also need to go to https://mega.nz/cmd and download the installation package for your own Linux distribution:
(megacmd-{Ubuntu|Debian|Raspbian})
Q/A – What is your “MEGA.nz” capacity? Default (50):
If you have purchased additional capacity you can and should enter it here, as this is how I calculate your free & available storage in the report; if you are using the FREE account the default amount is 50.
Q/A – What’s is your “MEGA.nz” User Email:
Enter the email address you have setup on mega.nz. Remember an account needs to be setup before you get to this stage as the script will test the account to see if it is valid.
Q/A – What’s is your “MEGA.nz” User Password:
When you set up your account with MEGA.nz you were required to create a password; enter the same password you use to log in to mega.nz here. It is required to upload your backups to the cloud if you choose to do so.
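As a rough illustration of the kind of check the script can perform with these credentials (the exact commands it runs internally may differ), MEGAcmd lets you verify a login straight from the shell:
# Log in with the same email and password you entered above, confirm the session, then log out
$ mega-login your@email.address YourPassword
$ mega-whoami
$ mega-logout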
Q/A – Keep local backup copy aswell as MEGA.nz (YES|NO):
It was previously mentioned that you needed a location to store your backups, whether it was a NFS Mount, USB stick or local folder. This can be used to store your backups or just as a temporary location until you upload your images to the cloud.
This question determines whether you want to keep the local backup after it has been successfully uploaded to the cloud, or whether you want it to be deleted.
You can keep a local copy and also a copy on the cloud, but if you have limited space locally, because you are backing up a Raspberry Pi for example, you should choose "NO". This will automatically delete your local images once they have been successfully uploaded to the Cloud.
It’s your choice and it all depends on local or cloud space constraints, only you can make that decision.
Help Menu
The script contains built-in help documentation.
$ smalltarbkp -help
This is the simplest and most useful command of all: it displays the help and the options that can be passed on the command line, and it will also notify you if you have not yet run SET-UP / CONFIGURED.
The usage will always display the correct name of the script itself, so whatever you decide to call this particular script, running it with -help will display the actual script name.
$ smalltarbkp -upgrade
This will download the latest version from online and update your existing script with the new code.
$ smalltarbkp -configure
This is an interactive Q/A menu that updates the script's own variables based on your answers.
$ smalltarbkp -purge-backup [-cloud | -local]
This allows you to delete unwanted backups, expire old images or clear up space for new backups if space runs short.
$ smalltarbkp -images [ -cloud | -local ] [ -retrieve | -report ]
This will allow you to either list or retrieve backup images locally or on MEGA.nz if used.
$ smalltarbkp -path [path_name] -name [bkp_name]
The basics of the script: to run a backup you must always include the source path and a name.
$ smalltarbkp -path [path_name] -name [bkp_name] -verbose
The same as above but includes a more verbose output that can be re-directed to a log file
$ smalltarbkp -path [path_name] -name [bkp_name] -full
This option allows you to override the incremental and force a full backup
$ smalltarbkp -path [path_name] -name [bkp_name] -full -verbose
The same as above: it overrides the incremental and forces a full backup, but includes a more verbose output that can be re-directed to a log file.
Upgrading
$ smalltarbkp -upgrade
This is one of my favourite options and while it works well, and I have done all sorts of testing I can think of, it is important to know that it has only been tested on a script without user modifications.
While making user modifications to the script is encouraged, the upgrade may not work, as I can't predict user modifications. The upgrade option has been coded to protect only part of the script, mainly the part where your personal configuration is kept.
When the script is being upgraded the only part of the script that stays untouched is between:
# KEEP_START TO UPDATE VARIABLES – RECOMMEND
and
# KEEP_END STATIC SCRIPT VARIABLES – DO NOT MODIFY
That section is right at the top of the script and can easily be identified; anything outside of those KEEP_START / KEEP_END markers gets replaced by the downloaded version.
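To give an idea of how a marker-based upgrade like this can work, here is a simplified sketch (not the script's actual upgrade code, and the file names such as smalltarbkp.new and the .tmp files are just examples) that keeps the block between the markers and takes everything else from the freshly downloaded version:
# Save the personal configuration block from the current script
$ sed -n '/# KEEP_START/,/# KEEP_END/p' smalltarbkp > keep_block.tmp
# Take everything before and after that block from the new download
$ sed -n '1,/# KEEP_START/{/# KEEP_START/!p;}' smalltarbkp.new > head.tmp
$ sed -n '/# KEEP_END/,${/# KEEP_END/!p;}' smalltarbkp.new > tail.tmp
# Reassemble: new head + old configuration block + new tail
$ cat head.tmp keep_block.tmp tail.tmp > smalltarbkp.upgraded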
There is also a possibility that if your manual edits between those markers contain any special characters, the upgrade may fail.
Should you prefer you can manually download a new version of the script and re-configure all your settings and variables.
Personally I recommend that if you are planning to use the -upgrade option do not manually edit the script.
If you want to run the upgrade this is what you can expect to see.
The upgrade option also creates a copy of the original script before upgrading, in case something goes wrong or you do not like the outcome of the new updated script.
Include List example
$ cat include_list.txt
/home/ricardo/apps/KeePassXC
/home/ricardo/.vuescan
/home/ricardo/.local/share/applications
/home/ricardo/.mysql/workbench
/home/ricardo/.config/my-weather-indicator
/home/ricardo/.local/share/data/Nextcloud
/home/ricardo/.var/app/de.haeckerfelix.gradio
/home/ricardo/.ssh
/home/ricardo/.scmdl
/home/ricardo/.local/share/remmina
/home/ricardo/.conky
/home/ricardo/.areca
/home/ricardo/Programming/commands.txt
/home/ricardo/.bashrc
/home/ricardo/.conkyrc
/home/ricardo/smalltarbkp
Running Backups by specifying a PATH
$ smalltarbkp -path ~/Documents/ -name "docs"
$ smalltarbkp -path ~/Documents/ -name "docs" -full
$ smalltarbkp -path ~/Documents/ -name "docs" -full -verbose
Running Backups by specifying an Include List
$ smalltarbkp -include ~/include_list.txt -name "personal"
$ smalltarbkp -include ~/include_list.txt -name "personal" -verbose
Purge Backups
$ smalltarbkp -purge-backup [-local | -cloud]
There will be times when you wish to delete backup images because the retention is too long or you are simply running out of space locally or on your cloud storage; whatever the reason, this option will allow you to clear out unwanted files.
When you run the script you select either local or cloud to start with, and then a list of available images will be presented.
You start by selecting the unique Image ID, as that will consolidate all the backups associated with that image.
It will then list all the images so you can make sure you selected the correct one, and prompt you to confirm that you want to delete the listed images.
Here is an example of the deletion of image ID 1511823601
If you wish to delete ALL backup IMAGES you can use the wild card [ * ] instead of an image ID, but personally I would not do that unless it is exactly what you want to do.
Report Images
$ smalltarbkp -images [ -cloud | -local ] -report
This is the simple way to list or report all the backup images available online or on your local system. The option will display everything, wherever it is stored: locally, on NFS or in the MEGA.nz cloud.
This depends on where you have configured your script's target location.
Example a list of all available images on the cloud
Example a list of all available images locally or NFS
Retrieve Images
$ smalltarbkp -images [ -cloud | -local ] -retrieve
Retrieving images is an easy way to recall images from your backup, wherever they are stored and however they are encrypted. Before you can perform a restore of certain files, or even an entire directory, you need to have the images available locally and decrypted so that you can scan through them.
This syntax does just that for you, with a couple of questions such as which image you want and where you want to put it so you can perform the restore. Note that if the images are encrypted it will also decrypt the retrieved local image and, once it is retrieved and decrypted, prompt you with the available commands for restoration.
In this example you are retrieving image ID 1511823601 from MEGA.nz and decrypting it.
Example when retrieving encrypted images from the cloud
Example when retrieving encrypted images stored locally or NFS
In the above example you are retrieving images that are stored locally BUT are NOT encrypted, so no decryption is required and you can run a search / list / restore from where they are without having to retrieve them.
Decrypt
All backups can be encrypted using standard openssl, or they can be left as simple tar files without encryption. If you chose to encrypt your backups and you want to decrypt the backup files manually, you can do so with a simple command:
$ openssl enc -d -aes-256-cbc -in FILE_NAME.enc -out FILE_NAME -pass pass:*******
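For completeness, the matching manual encryption step, assuming the same cipher and password style the script uses for the decryption shown above, would be:
$ openssl enc -aes-256-cbc -in FILE_NAME -out FILE_NAME.enc -pass pass:*******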
File Restore
Assuming you know the location of the backup images and they are decrypted and stored locally you can run simple commands to list files, restore, search for a file or restore an individual file.
Additional restore capability will be shown further down in this document using the previously mentioned command:
$ smalltarbkp -images -local -retrieve
• List all files
$ cat BACKUP_FILE.tar.gz-* | tar tz
• Restore all files
$ cat BACKUP_FILE.tar.gz-* | tar xz
• Search for a file
$ cat BACKUP_FILE.tar.gz-* | tar tz | grep PATTERN
Example:
• Restore a single file
$ cat BACKUP_FILE.tar.gz-* | tar xz FULL_FILE_PATH
This example shows restoring the smalltarbkp.poc file from the bkp-Scripts-1510411650.tar.gz-00 image but the full path needs to be provided as identified in the search above.
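Putting the pieces together with the image from this example, a search-then-restore session could look something like the sketch below; the path shown is only a placeholder, you would use whatever path your own search returns.
# Find the exact path of the file inside the image
$ cat bkp-Scripts-1511823601.tar.gz-* | tar tz | grep smalltarbkp.poc
# Restore it using the path reported by the search (placeholder path shown)
$ cat bkp-Scripts-1511823601.tar.gz-* | tar xz home/user/scripts/smalltarbkp.poc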
Scheduling Backups
Scheduling is performed using the standard crontab function typical to UNIX/Linux systems.
$ crontab -e
Let me start by demonstrating how I have configured several copies of the script, each with its own settings, running at different times and on different schedules.
Every 8 hours (Example).
In this example we are backing up every eight hours, so the script has been configured with the following settings:
* Retention = 21 (3 x daily for 7 days retention)
* Full = 3 (one full backup daily)
Every Eight hours running at 00:00 / 08:00 / 16:00
00 */8 * * * /scripts/smalltarbkp.eight -path /home/user/scripts -name "Scripts" >/dev/null 2>&1
Every Eight hours running at 00:30/ 08:30 / 16:30
30 */8 * * * /scripts/smalltarbkp.eight -path /home/user/poc -name "Poc" >/dev/null 2>&1
The reason there are two lines is because I have the same requirements for Scripts & Poc and so I am using the same script name and settings, but a different path.
Every 12 hours (Example).
This example shows backing up every 12 hours, so the script is configured with the following settings:
* Retention = 28 (2 x daily for 14 days retention)
* Full = 7 (one full backup weekly)
Every 12 hours running at 00:00 / 12:00
00 */12 * * * /scripts/smalltarbkp.two -path /home/user/docs -name "Docs" >/dev/null 2>&1
As the Docs folder is the only backup I need with those specific settings, I have a single entry in cron for it, backing up every 12 hours.
Daily Backups (Example).
In this case I have a number of backups that need to run daily, with the same retention period of 31 days and a full backup only every 10 days.
* Retention = 31 (1 x daily)
* Full = 10 (one full backup every 10 days)
Runs everyday at 02:00 AM
0 2 * * * /scripts/smalltarbkp.poc -path /home/ricardo/Folder0 -name "Folder0" >/dev/null 2>&1
Runs everyday at 03:00 AM
0 3 * * * /scripts/smalltarbkp.poc -path /home/ricardo/Folder1 -name "Folder1" >/dev/null 2>&1
Runs everyday at 04:00 AM
0 4 * * * /scripts/smalltarbkp.poc -path /home/ricardo/Folder2 -name "Folder2" >/dev/null 2>&1
Runs everyday at 05:00 AM
0 5 * * * /scripts/smalltarbkp.poc -path /home/ricardo/Folder3 -name "Folder3" >/dev/null 2>&1
Monthly Backup (Example)
I have a requirement to backup my pictures & music on a monthly basis as it does not change much and I want a long term retention.
* Retention = 12 (1 backup every month)
* Full = irrelevant as you force a full backup in cron
Run on the 1st of each month at midnight
0 0 1 * * /scripts/smalltarbkp.month -path /Music -name "Music" -full >/dev/null 2>&1
Run on the 2nd of each month starting at 01:30 AM
30 1 2 * * /scripts/smalltarbkp.month -path /Photos -name "Photos" -full >/dev/null 2>&1
As you can see I've created four copies of the script, each depending on my retention and full backup frequency, and obviously if you also have different target locations you can create even more.
For backups that have the same retention and full backup frequency you can simply add an additional entry in crontab with the path and name.
You can download a PDF version of this document here: smalltarbkp
You can find additional information about crontab here.
Big thank you to my friend Jock Stewart for taking the time and reviewing the process, script and document.
With that in mind feel free to contact me on Twitter @gcclinux at any time.