Testing the real-time backup function .. I still need to find out how that works ..
Today I tried the backup software CloudBerry Backup. What is so special about this backup tool?
For months I have been using CloudBerry Explorer for S3 (I have to shame myself for not having registered the software yet).
First, thanks to Uli for the first step towards some ideas, and to Armijn for giving good guidelines on how to use some variables together with commands, because I am a newbie at bin/bash scripting. But with lots of testing tonight I was able to back up several 'clients'' content to buckets and objects on Amazon S3. This is probably a good first step towards a simple script for backing up your server's content to Amazon S3.
Needed: Python, s3cmd, data and the correct settings.
Note: this script is not finished yet, as I am executing it manually, and the removal of data from Amazon S3 is not well tested (it works, but I need to check it while the job is running).
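The removal step I'm still testing looks roughly like this; a sketch, not the finished script. The bucket name and the 14-day retention are assumptions, and the date comparison only works because `s3cmd ls` prints ISO-formatted dates:

```shell
#!/bin/bash
# Sketch: delete backups on S3 older than a retention window.
# Bucket name and 14-day retention are hypothetical, not the real settings.
BUCKET="s3://my-backup-bucket"
CUTOFF=$(date -d "14 days ago" "+%Y-%m-%d")

# s3cmd ls prints lines like: "2009-05-07 12:00  1234  s3://bucket/obj"
s3cmd ls -r "$BUCKET/" | while read -r d t size obj; do
  # Plain string comparison works because the dates are ISO formatted
  if [ "$d" \< "$CUTOFF" ]; then
    s3cmd del "$obj"
  fi
done
```

I would run this with `echo s3cmd del` first to see what it would remove before letting it delete anything.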
### BACKUP FOR
I am a newbie at bin/bash scripting, so help is appreciated.
I am using s3cmd with Amazon's S3 service.
What I want is to copy some backup files per day/week/month.
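A minimal sketch of that day/week/month idea, assuming s3cmd is already configured (`s3cmd --configure`); the bucket name and file name are placeholders, not my real setup:

```shell
#!/bin/bash
# Sketch: sort each backup run into a day/week/month prefix on S3.
# Bucket and file names are hypothetical examples.
BUCKET="s3://my-backup-bucket"
DAY=$(date "+%d-%m-%y")        # e.g. 07-05-09
WEEK=$(date "+week-%V-%y")     # ISO week number
MONTH=$(date "+%m-%y")

# Daily copy
s3cmd put backup.tar.gz "$BUCKET/daily/$DAY/backup.tar.gz"

# On Sundays also keep a weekly copy
if [ "$(date +%u)" = "7" ]; then
  s3cmd put backup.tar.gz "$BUCKET/weekly/$WEEK/backup.tar.gz"
fi

# On the 1st of the month also keep a monthly copy
if [ "$(date +%d)" = "01" ]; then
  s3cmd put backup.tar.gz "$BUCKET/monthly/$MONTH/backup.tar.gz"
fi
```

Run from cron once a day, the if-statements take care of the weekly and monthly copies by themselves.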
Using a Conceptronic MediaGiant and want to save your settings before a firmware upgrade? Here is how you can do it.
A forced firmware upgrade deletes all settings, but with a script it is possible to save the information (and possibly it will allow you to do more things).
1. Copy the shooting.sh file to a USB flash drive (use the link to save the file)
2. Insert the USB flash drive into the MediaGiant
3. Go to the Copy page and focus on the shooting.sh file
4. Press the GOTO button on your remote, then press the REPEAT button (you will see the invalid icon twice); the configuration is being backed up
5. The device will reboot; after the reboot you can do the firmware upgrade
6. When the firmware upgrade is finished, go to the Copy page and focus on shooting.sh again
7. Press the GOTO button on your remote control, then press the REPEAT button; the configuration is restored and the device will reboot
Please note: the script creates a 'VenusSetup.dat' file on your HDD, which you have to REMOVE after your settings are restored. If it still exists, the next time it will use that file for your settings, even if you have made changes afterwards.
rsync -avz -e ssh [username]@www.domain.com:/path/to/get/data/from/ /path/to/mirror-location/
I'm not really happy, as it's still a little unknown to me how to do this unattended (storing authorized keys is a little abracadabra for me) ..
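For the unattended part, the usual recipe is a passphrase-less key pair plus `ssh-copy-id`; a sketch, where the key file name is my own placeholder and `[username]`/host are the same placeholders as in the rsync command above:

```shell
#!/bin/bash
# One-time setup on the machine that runs the backup.
# 1. Generate a key pair without a passphrase (key file name is hypothetical)
ssh-keygen -t rsa -N "" -f ~/.ssh/backup_key

# 2. Append the public key to the remote account's authorized_keys
ssh-copy-id -i ~/.ssh/backup_key.pub [username]@www.domain.com

# 3. rsync can now run from cron without prompting for a password
rsync -avz -e "ssh -i ~/.ssh/backup_key" \
  [username]@www.domain.com:/path/to/get/data/from/ /path/to/mirror-location/
```

The empty `-N ""` passphrase is what makes it unattended, so the private key file should be readable by the backup user only.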
Message to self: to retrieve all data from Servage directly to the new server, the following command can be used. You need SSH on your new server.
lftp -u '[username],[password]' ftp.servage.net -e "mirror /path /destination/path" (run on the new server)
This command needs to be entered in an SSH session. As we speak, this command has already been downloading my data for about 12 hours ..
My backup FTP script is NOT GOOD .. it exhausted the memory on the webserver, causing the server to 'crash' / not react for 4 hours. Even with my monitor connected to the server I could type my username, but logging in: nope ..
So, message to self: how the hell can I FTP 40GB of data via a shell script without exhausting my memory .. <sigh> ..
Rebooted (fingers always crossed), and after 5 minutes the server was up again.
if [ -e data ]
then
  ncftp -u $USER -p $PASSWD $HOST << EOT
mkdir `date "+%d-%m-%y"`
cd `date "+%d-%m-%y"`
put -R data
quit
EOT
  rm -rf data
fi
With some help from Matthijs and of course myself, and a little 'maggi', I created a bin/bash shell script to back up data (a backup of the server) to one of my other FTP server(s).
It checks if the data folder is available; if so, an FTP session is started and logged in, a folder with today's date is created, and the data is put into that folder. After that the data folder, including its content, is removed. Yes, I know there is no validation that A = B; I have no idea how to accomplish that, but this works in my test environment. This script saves me about 4 hours of data transfer: FTP shows me a nice 10 MB/s, where Samba / NFS performance does not satisfy me at all (I believe the total amount of data I have to back up every time is about 40GB) ..
Hints and tips are welcome for an A = B comparison before removal of the data folder ;-)
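One rough way to get that A = B check before the `rm -rf`: compare the local byte count with what the FTP server reports. A sketch, assuming `ncftpls` (which ships with ncftp) is available, the same `$USER`/`$PASSWD`/`$HOST` variables as the script above, and that the data folder was uploaded flat into the dated folder:

```shell
#!/bin/bash
# Sketch: compare local file bytes against the remote FTP listing
# before deleting the local data folder. Flat upload assumed; a
# size match is not a checksum, but it catches truncated transfers.
REMOTE_DIR="ftp://$HOST/$(date "+%d-%m-%y")/"

LOCAL_BYTES=$(find data -type f -printf '%s\n' | awk '{s+=$1} END {print s+0}')
# Column 5 of a long listing is the file size
REMOTE_BYTES=$(ncftpls -u "$USER" -p "$PASSWD" -l "$REMOTE_DIR" \
  | awk '{s+=$5} END {print s+0}')

if [ "$LOCAL_BYTES" -eq "$REMOTE_BYTES" ]; then
  echo "A = B ($LOCAL_BYTES bytes), safe to remove data/"
  rm -rf data
else
  echo "Mismatch: local $LOCAL_BYTES vs remote $REMOTE_BYTES, keeping data/"
fi
```

A byte-for-byte comparison (re-downloading and diffing, or comparing checksums) would be stronger, but a size match is cheap and already better than deleting blind.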
General information for users of AroundMyroom, Mixfans.org, Mixfreaks, tapas-recepten, kikkuh.nl, judithsgarden.eu and more.
- BlueQuartz updates: major update today with 90 updates for BQ
- Change of POP3 server: BQ upgraded to Dovecot. Had to enable POPS to have it function properly
- Removed dust from the server
- Backup running [takes about 7 hours]
- Changed permalinks of AroundMyroom from the old style to the new playground style
- Changed the permanent folder pr0n to erotica
- Installed WordPress and changed styles