Back up your Toodledo data with wget
TheGriff_2

Posted: Jun 17, 2009
Score: 1
With the recent XML export enhancements, I wanted a way to schedule nightly backups of my Toodledo data. In addition, I thought it'd be handy if this backup went straight to my Dropbox account.

The program wget is a commonly installed software package on most Linux distributions. It is a tool "for retrieving files using HTTP, HTTPS and FTP" that runs from the Linux shell. More information about wget usage, as well as links to Windows and Mac downloads, is available from Lifehacker: http://lifehacker.com/software/downloads/geek-to-live--mastering-wget-161202.php

Once a script (or DOS batch file) is built using the proper wget command structure, it is very simple to have it run nightly via Linux cron or as a Windows Scheduled Task.
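For example, a hypothetical crontab entry (the script name and path are just placeholders) that would run such a script every night at 4am:

<code>
# run the Toodledo backup script every night at 4am
0 4 * * * /home/xxxx/toodledo-backup.sh
</code>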

The basic script commands are shown below. (Thanks to Matt - https://www.toodledo.com/forums/profile.php?user=258454 - for the commands!) Replace the email address and password below with your own information.

wget -O /dev/null --quiet --no-check-certificate --save-cookies cookies.txt --post-data 'email=[email protected]&pass=xxxxxxxx' https://www.toodledo.com/signin.php

wget --quiet --load-cookies cookies.txt http://www.toodledo.com/xml.php

rm cookies.txt


The first line logs the script/system in to the Toodledo website and saves the cookie generated by Toodledo. The second line performs the actual download of the file and the third line removes the cookie. For Windows you would replace the rm command with del.
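For Windows, an untested sketch of the equivalent batch commands might look like this (note the double quotes around the --post-data argument, since the Windows shell does not treat single quotes as quoting, and NUL in place of /dev/null):

<code>
rem sign in and save the session cookie (replace the email and password)
wget -O NUL --quiet --no-check-certificate --save-cookies cookies.txt --post-data "email=[email protected]&pass=xxxxxxxx" https://www.toodledo.com/signin.php
rem download the XML export using the saved cookie
wget --quiet --load-cookies cookies.txt http://www.toodledo.com/xml.php
rem remove the cookie file
del cookies.txt
</code>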

Note that you will want to run the script from the directory where the download should be stored. I keep my script in my Linux home folder and have added the following line to move and rename the downloaded file into my Dropbox directory.

mv /home/xxxx/xml.php /home/xxxx/Dropbox/toodledobackup.xml

For Windows replace mv with move.

I also use Jungledisk with the Amazon S3 service to back up my important files each night, including the Dropbox folder. I'm now triply covered, with my data backed up to my laptop's Dropbox folder, Dropbox itself, and Amazon S3; this is in addition to Toodledo's twice-daily backups!

I admit this backup strategy is a bit extreme for simple task data; however, being a self-acknowledged lover of systems, I just had to get it working.

I suppose the script could be modified to keep multiple copies of your data with a time-stamp, but that's too extreme, even for me.
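(For anyone who does want that, a minimal sketch would be to stamp the file name with the date while moving it, so each night's copy is kept:)

<code>
mv /home/xxxx/xml.php /home/xxxx/Dropbox/toodledobackup-$(date +%Y%m%d).xml
</code>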


ylan

Posted: Jun 17, 2009
Score: 0
Excellent! I was about to write a script myself to do exactly this. Glad I checked the forums!

Thanks for sharing.
Jake

Toodledo Founder
Posted: Jun 17, 2009
Score: 1
For people who use this script, please be reasonable with your backup frequency. Once a day should be more than sufficient. Any more frequently will cause unnecessary stress on our servers.
benny

Posted: Jun 19, 2009
Score: 0
I think you'll need to change the single quotes to double quotes for Windoze as well - but I could be wrong.

I'm glad TD posted that once a day is OK. I have had this set up for a little while - it grabs my tasks at 2am, compresses the XML, names it with a time/date stamp and keeps the last 7 of them. It dumps them into a directory that Crashplan watches, and they then get distributed to all my Crashplan backups - local and remote.
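Roughly, a sketch of that rotation (the directory and file names here are just placeholders):

<code>
#!/bin/bash
# compress the downloaded export and stamp it with the date/time
BACKUPS=/home/xxxx/toodledo-backups
TS=$(date +%Y%m%d%H%M%S)
gzip -c xml.php > $BACKUPS/toodledo-$TS.xml.gz
# keep only the 7 newest backups; rm -f exits quietly if there is nothing to remove
ls -t $BACKUPS/toodledo-*.xml.gz | tail -n +8 | xargs rm -f
</code>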

See, you're not the only one who is anal about their data :) I've got a backup strategy for home that's definitely overkill, but it's actually been simplified since I started using Crashplan - really good stuff. Free now, too.

EDIT: Meant to mention to TD that if you see ANY issue with doing this once a day, please let us know. I thought that was reasonable as well, but would of course listen if any issues arose from backing them up every night.


maguilar.gtd

Posted: Jul 10, 2009
Score: 0
Great example, and if you want to back up your Notebook as well, this is the line to append to the backup script:

wget --quiet --load-cookies cookies.txt http://www.toodledo.com/csv_notes.php
Warren

Posted: Aug 10, 2009
Score: 1
Just what I was looking for. I took TheGriff's wget tips and wrapped them up into a script that I can run from crontab on my OS X machine. Small changes: I added the full path to wget and save directly to output files instead of moving after download. It also does basic error checking for zero-length failed downloads. I run this once a day at 4am via crontab in an attempt to download during off-peak hours.

<code>
#!/bin/bash
#
# backup toodledo.com account via cron
#
# set your login and password
export MYUSER='[email protected]'
export MYPASS='PASSWORD'
#
# directory to save files
export MYBACK=~/Backups/toodledo
#
export MYTEMP=/tmp
export MYCOOK=$MYTEMP/wgetcookies.txt
#
# MYWGET is path and params to pass to wget
# --no-check-certificate is required to connect over https
export MYWGET="/usr/local/bin/wget --quiet --no-check-certificate --save-cookies $MYCOOK --load-cookies $MYCOOK "
export MYLOG=$MYBACK/toodledo-backup.log
export TS=`date '+%Y%m%d%H%M%S'`

echo "$TS toodledo backup init" >> $MYLOG
cd $MYTEMP >> $MYLOG 2>&1

$MYWGET -O /dev/null --post-data "email=$MYUSER&pass=$MYPASS" https://www.toodledo.com/signin.php >> $MYLOG 2>&1

$MYWGET -O $MYBACK/$TS.toodledo-tasks.xml https://www.toodledo.com/xml.php >> $MYLOG 2>&1
$MYWGET -O $MYBACK/$TS.toodledo-notes.csv https://www.toodledo.com/csv_notes.php >> $MYLOG 2>&1
$MYWGET -O $MYBACK/$TS.toodledo-activity.html https://www.toodledo.com/activity.php >> $MYLOG 2>&1

#
# check for empty zero length files, write to log and email $MYUSER on errors
#
for FILE in \
    $MYBACK/$TS.toodledo-tasks.xml \
    $MYBACK/$TS.toodledo-notes.csv \
    $MYBACK/$TS.toodledo-activity.html
do
    [[ -s $FILE ]] || echo "$TS FAIL empty file $FILE" >> $MYLOG 2>&1
    [[ -s $FILE ]] || echo "$TS FAIL empty file $FILE" | mail -s "toodledo backup error" $MYUSER >> $MYLOG 2>&1
done

rm $MYCOOK >> $MYLOG 2>&1

</code>

