Backup MIAB (Mail-in-a-Box) through rsync to an OpenMediaVault Server

Mail-in-a-Box (MIAB) has a built-in backup feature. It stores full and incremental backups on the mail server itself, and it can also push the backup to another device through rsync. In my situation I am saving the data to an OpenMediaVault NAS.

Here is how I did it (quick and dirty, because I expect you to know your way around already).

In short:
rsync over port 5678 to back up your data to the OpenMediaVault NAS

  1. Make sure you have a hostname available that rsync can connect to; the hostname must point to the IP where the OMV (OpenMediaVault) machine is reachable.
  2. rsync over SSH is used.
  3. If you do not want to use port 22 with rsync, you need to modify /root/mailinabox/management/backup.py line 19: change -p 22 to -p 5678 (see the sketch after this list).
  4. Enable the Rsync Server in the GUI (Graphical User Interface) of OpenMediaVault.
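
To make that change in one go, something like this should work (a sketch: it replaces every '-p 22' in the file, so check with grep first that it only matches the SSH options; a MIAB update may also overwrite this file, so you may need to reapply it):

sed -i 's/-p 22/-p 5678/g' /root/mailinabox/management/backup.py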

Please note that it is not possible to use the ~/.ssh/config file, where you could otherwise set the port as well. The reason is that the verification process in backup.py needs an explicit -p setting, which is not overridden by the config file.

  • The SSH standard port is 22; this is what we will change (e.g. because port 22 is already forwarded to another server).
  • In your router, go to the port forwarding section and forward external port 5678 to port 22 on the device running OpenMediaVault.
  • MIAB and rsync need the full path where the backup will be stored. In my situation: /media/a925efd7-ada5-48b5-80e6-383cc6274bcd/Backup (the folder must be available and writable).
  • Make sure the user can log in with SSH and can access OpenMediaVault.
  • MIAB provides a public key for the password-less login that rsync needs. This key must be available in OpenMediaVault: you can put the public key in ~/.ssh/authorized_keys, or under /var/lib/openmediavault/ssh/authorized_keys where you create a file with the name of the user (see the sketch after this list).
  • Within MIAB you can run the following from /root/mailinabox/: sudo management/backup.py --verify
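
Getting that key onto the OMV side could look like this (a sketch: the file name id_rsa_miab.pub and the user name backupuser are assumptions, substitute your own):

# on the MIAB server: show the public key MIAB generated for backups
cat /root/.ssh/id_rsa_miab.pub

# on the OpenMediaVault server: append it for your backup user (hypothetical name)
mkdir -p /home/backupuser/.ssh
echo 'ssh-rsa AAAA...' >> /home/backupuser/.ssh/authorized_keys   # paste the full key here
chmod 700 /home/backupuser/.ssh
chmod 600 /home/backupuser/.ssh/authorized_keys
chown -R backupuser:backupuser /home/backupuser/.ssh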

To test if your public key is accepted, SSH from MIAB with the following command:

ssh -p 5678 -i /root/.ssh/id_rsa_miab user@domain.name

If this gives you a direct login to your OpenMediaVault NAS, you can use rsync ;)

Missing something? Reply and ask

 

RFC: Plesk backup to Amazon S3 – it works!

First, thanks to Uli for the first step towards some ideas, and to Armijn for good guidelines on how to use some variables together with commands, because I am a newbie at bash scripting. But with lots of testing tonight I was able to back up the content of several 'clients' to buckets and objects on Amazon S3. This is probably a good first step towards a simple script for backing up your server's content to Amazon S3.

Needed: Python, S3cmd, data and the correct settings

Note: this script is not finished yet, as I am executing it manually, and the removal of data from Amazon S3 is not well tested (it works, but I need to keep an eye on it while the job is running).

#!/bin/sh

###################################
###                             ###
### BACKUP FOR
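
A minimal sketch of the idea, assuming s3cmd is already configured (s3cmd --configure) and that the client content lives under /var/www/vhosts — both assumptions, adjust for your own setup:

#!/bin/sh
# sketch: upload each client's content to its own prefix in one S3 bucket
# BUCKET is a hypothetical name; the vhosts path layout is an assumption
BUCKET='s3://my-plesk-backup'
DATE=`date "+%d-%m-%y"`
for CLIENT in /var/www/vhosts/*
do
NAME=`basename "$CLIENT"`
# --recursive walks the directory, each client ends up under its own prefix
s3cmd put --recursive "$CLIENT/" "$BUCKET/$NAME/$DATE/"
done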

MediaGiant News: Save your settings before firmware upgrade

Using a Conceptronic MediaGiant and you want to save your settings before a firmware upgrade? Here is how you can do it.

A forced firmware upgrade deletes all settings, but with a script it is possible to save the information (and possibly it will allow you to do more things).

1. Copy the shooting.sh file to a USB flash drive (use the link to save the file).
2. Insert the USB flash drive into the MediaGiant.
3. Go to the Copy page and focus on the shooting.sh file.
4. On your remote, push the GOTO button and then push the REPEAT button (you will see the invalid icon twice); the configuration is being backed up.
5. The device will reboot; after the reboot you can do the firmware upgrade.
6. When the firmware upgrade is finished, go into the Copy page and focus on shooting.sh again.
7. On your remote control, push the GOTO button and then push the REPEAT button; the configuration is restored and the device will reboot.

Please note: the script creates a 'VenusSetup.dat' file on your HDD, which you have to REMOVE after your settings are restored. If it still exists the next time, the device will use that file for your settings, even if you have made changes afterwards.

Backup data from Server A to Server B using FTP

Message to self: to have all data from Servage retrieved directly onto the new server, the following command can be used. You need SSH on your new server.

lftp -u '[username],[password]' ftp.servage.net -e 'mirror /path /destination/path' (where /destination/path is the local path on the new server)

This command needs to be entered in an SSH session. As we speak, this command has already been downloading my data for about 12 hours now..
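
If the connection drops along the way, lftp's mirror can resume with the -c (continue) flag; a variant of the same command, with quit added so the session closes when it finishes:

lftp -u '[username],[password]' ftp.servage.net -e 'mirror -c /path /destination/path; quit'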

Not Good: My own FTP script

My backup FTP script is NOT GOOD.. it exhausted the memory on the webserver, causing the server to 'crash' / not react for 4 hours. Even after connecting my monitor to the server: I could type my username, but logging in: nope..

So message to self: how the hell can I FTP 40GB of data via a shell script without exhausting my memory .. <sigh> …

After rebooting (fingers crossed, as always), the server was up again after 5 minutes.

My Simple FTP script

#!/bin/bash
cd /home/backup/server
# only start the transfer if the data folder actually exists
if [ -e data ]
then
cd /home/backup/server/data/
HOST='ftp-server-address'
USER='username'
PASSWD='password'
# feed the FTP commands to ncftp through a here-document;
# the backticks are expanded by the shell, so the remote folder gets today's date
ncftp -u "$USER" -p "$PASSWD" "$HOST" << EOT
binary
cd /Volume_1
mkdir `date "+%d-%m-%y"`
cd `date "+%d-%m-%y"`
mput *
bye
EOT
fi
cd /home/backup/server
# remove the local data folder (no A=B check yet, see below)
rm -rf data

With some help from Matthijs and of course myself, and a little 'maggi', I created a bash shell script to back up data (a backup of the server) to one of my other FTP server(s).

It checks if the data folder is available, and if so an FTP session is started and logged on, a folder with today's date is made, and the data is put in that folder. After that the data folder, including its content, is removed. Yes, I know there is no validation that A = B; I have no idea how to accomplish that, but this works in my test environment. This script is saving me about 4 hours of data transfer: FTP is showing me a nice 10 MB/s, where Samba / NFS performance is not satisfying at all.. (I believe the total amount of data I have to back up every time is about 40GB...)

Hints and tips are welcome to have an A=B comparison before removal of the data folder ;-)
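
One rough way to do that comparison, sketched with ncftpls from the same NcFTP suite (it only compares file counts, not contents; the host, user and password placeholders match the script above):

#!/bin/bash
# sketch: only delete the local data folder if the remote file count matches
HOST='ftp-server-address'
USER='username'
PASSWD='password'
DIR=`date "+%d-%m-%y"`
LOCAL_COUNT=`ls /home/backup/server/data/ | wc -l`
# ncftpls lists the remote directory; count the entries it returns
REMOTE_COUNT=`ncftpls -u "$USER" -p "$PASSWD" "ftp://$HOST/Volume_1/$DIR/" | wc -l`
if [ "$LOCAL_COUNT" -eq "$REMOTE_COUNT" ]
then
rm -rf /home/backup/server/data
else
echo "count mismatch: $LOCAL_COUNT local vs $REMOTE_COUNT remote, keeping the data folder"
fi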