For a while now I have been using a script that uses the Bierdopje API to automatically download subtitles for TV series.
The script is great, but it lacks a way to tell you that a new subtitle (or subtitles) has been downloaded.
With the help of Hens Zimmerman, a script was created that tweets the downloaded subtitle to a Twitter account.

Download the file here.

First, thanks to Uli for the first step towards some ideas, and to Armijn for giving good guidelines on how to use some variables together with commands,
because I am a newbie at bin/bash scripting. But with lots of testing tonight I was able to back up several 'clients'' content to buckets and objects on Amazon S3. This is probably a good first step towards a simple script for backing up your server's content to Amazon S3.

Needed: Python, s3cmd, data, and the correct settings.

Note: this script is not finished yet, as I am still executing it manually, and the removal of data from Amazon S3 is not well tested (it works, but I need to verify it while the job is running).
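As a rough illustration of the approach, here is a minimal sketch of backing up one client to a date-stamped S3 prefix with s3cmd. This is not the finished script from this post: the client name, local path and bucket name are all hypothetical, and s3cmd must already be configured (run `s3cmd --configure` once, which creates `~/.s3cfg`).

```shell
#!/bin/sh
# Hedged sketch: back up one client's content to Amazon S3 with s3cmd.
# CLIENT, SRC and the bucket name are placeholders, not real settings.

CLIENT="client1"
SRC="/home/backup/${CLIENT}/"
STAMP=$(date "+%d-%m-%y")
TARGET="s3://my-backup-bucket/${CLIENT}/${STAMP}/"

# upload the whole directory tree into a date-stamped prefix;
# the guard keeps the sketch harmless when s3cmd or the data is absent
if command -v s3cmd >/dev/null 2>&1 && [ -d "$SRC" ]; then
    s3cmd put --recursive "$SRC" "$TARGET"
fi

# removal of old backups (test carefully, deletion is permanent):
# s3cmd del --recursive "s3://my-backup-bucket/${CLIENT}/<old-date>/"
```

The date-stamped prefix means each run lands in its own "folder", which makes cleaning up old backups a matter of deleting one prefix.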



Using a Conceptronic MediaGiant and want to save your settings before a firmware upgrade? Here is how you can do it.

A forced firmware upgrade deletes all settings, but with a script it is possible to save them (and it may allow you to do more things).

1. Copy the shooting.sh file to a USB flash drive (use the link to save the file)
2. Insert the USB flash drive into the MediaGiant
3. Go to the Copy page and highlight the shooting.sh file
4. Press the GOTO button on your remote, then press the REPEAT button (you will see the invalid icon twice); the configuration is being backed up
5. The device will reboot; after the reboot you can perform the firmware upgrade
6. When the firmware upgrade is finished, go to the Copy page and highlight shooting.sh again
7. Press the GOTO button on your remote, then press the REPEAT button; the configuration is restored and the device will reboot

Please note: the script creates a 'VenusSetup.dat' file on your HDD, which you have to REMOVE after your settings are restored. If it still exists the next time, the script will use that file for your settings, even if you have made changes afterwards.

My backup FTP script is NOT GOOD: it exhausted the memory on the webserver, causing the server to 'crash' / not respond for 4 hours. Even with a monitor connected to the server I could type my username, but logging in: nope.

So, note to self: how the hell can I FTP 40GB of data via a shell script without exhausting my memory... <sigh>...

After rebooting (fingers crossed, as always) the server was up again within 5 minutes.

#!/bin/sh
# back up the data folder to the FTP server into a date-stamped directory
cd /home/backup/server
if [ -e data ]; then
    cd /home/backup/server/data/
    # ncftp takes its commands from the here-document below
    ncftp -u "$USER" -p "$PASSWD" "$HOST" << EOT
cd /Volume_1
mkdir `date "+%d-%m-%y"`
cd `date "+%d-%m-%y"`
mput *
quit
EOT
    # remove the local copy once the transfer has finished
    cd /home/backup/server
    rm -rf data
fi

With some help from Matthijs, and of course myself and a little 'maggi', I created a bin/bash shell script to back up data (a backup of the server) to one of my other FTP servers.

It checks whether the data folder is available; if so, an FTP session is started, it logs on, a folder with today's date is created, and the data is put into that folder. After that the data folder, including its contents, is removed. Yes, I know there is no validation that A = B; I have no idea how to accomplish that, but it works in my test environment. This script saves me about 4 hours of data transfer: FTP gives me a nice 10MB/s, where Samba / NFS performance does not satisfy me at all. (I believe the total amount of data I have to back up every time is about 40GB.)

Hints and tips are welcome for an A = B comparison before the data folder is removed ;-)
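One possible approach, sketched under some assumptions: compare the local file count against the remote listing before deleting anything. This uses `ncftpls` from the same ncftp suite and the same `$USER`/`$PASSWD`/`$HOST` variables as the script above; the paths are placeholders, and a file count is only a rough check (sizes or checksums would be stricter).

```shell
#!/bin/sh
# Hedged sketch of an A = B check before removing the local data folder.

counts_match() {
    # succeed only when both counts are equal
    [ "$1" -eq "$2" ]
}

LOCAL_DIR="/home/backup/server/data"
REMOTE_DIR="/Volume_1/$(date "+%d-%m-%y")"

# only run the real comparison when the tool and the data are present
if [ -d "$LOCAL_DIR" ] && command -v ncftpls >/dev/null 2>&1; then
    local_count=$(find "$LOCAL_DIR" -type f | wc -l)
    remote_count=$(ncftpls -u "$USER" -p "$PASSWD" "ftp://$HOST$REMOTE_DIR/" | wc -l)
    if counts_match "$local_count" "$remote_count"; then
        echo "counts match ($local_count files), removing local copy"
        rm -rf "$LOCAL_DIR"
    else
        echo "mismatch: local=$local_count remote=$remote_count, keeping data" >&2
        exit 1
    fi
fi
```

On a mismatch the script keeps the local data and exits non-zero, so a cron mail or wrapper can flag the failed run instead of silently losing the backup.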

I got a special request from a user:

Allow a user to upload files and see the files, but do not allow them to download any of the stored files. (So you can share one account with many people, but they are not allowed to download those files with that same account.)

The master user of that vhost is allowed to control the whole upload folder.

It took me a good dinner to find an appropriate proftpd.conf setting. I performed some tests and it is working now.
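For reference, a minimal sketch of the kind of proftpd.conf fragment that achieves this; the directory path and user names are placeholders, not my actual settings. In ProFTPD, the READ command group covers downloads (RETR), while WRITE and DIRS cover uploading and directory listing:

```
<Directory /home/vhost/upload>
  # uploading and directory listing stay allowed for everyone
  <Limit WRITE DIRS>
    AllowAll
  </Limit>
  # downloading (RETR) is denied for the shared account,
  # but the master user keeps full access
  <Limit READ>
    AllowUser masteruser
    DenyUser shareduser
  </Limit>
</Directory>
```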

You can find the script here.