Backing up my google calendars via shell and python scripts, scheduled with cron

So recently my calendar got messed up a little (duplicate entries) after my mobile phone (HTC Touch Pro running Windows Mobile 6.1) reset; syncing again after the reset is what messed it up. I realize (only now!) that I need to keep backups of my calendars and contacts.

I found the following script online (it requires python and the gdata module):

<pre class="src src-python">#! /usr/bin/python

## This script backs up my google contacts
## http://www.joet3ch.com/2008/09/25/backup-google-contacts/

import gdata.contacts.service

gd_client = gdata.contacts.service.ContactsService()
gd_client.ClientLogin('uname@domain.com', 'password')
query = gdata.contacts.service.ContactsQuery()
query.max_results = 2000  # change for max contacts returned
feed = gd_client.GetContactsFeed(query.ToUri())
print feed
</pre>

and adapted the following script to back up my calendars and contacts:

<pre class="src src-sh">#! /bin/bash

## This script backs up my calendars -- will use with cron to back up daily.
## The following is adapted from http://permanentinkdesign.com/articles/backing-up-a-google-calendar/

INCREMENT=$(date +%Y%m%d%H%M)
DIR="$HOME/Documents/Backup_Google"
USERNAME="uname@domain.com"
PASSWORD="mypassword"

## Vinh's calendar
curl -s url-to-private-ics -o "$DIR/Vinh_$INCREMENT.ics"
## TNTTSP's calendar
curl -s url-to-private-ics -o "$DIR/TNTTSP_$INCREMENT.ics"
## Birthdays
curl -s url-to-private-ics -o "$DIR/Birthdays_$INCREMENT.ics"

## Now back up Google Contacts
## http://www.joet3ch.com/2008/09/25/backup-google-contacts/
$HOME/Documents/bin/Backup_Google_Contacts.py > "$DIR/VinhContacts_$INCREMENT.xml"

## Now back up Google Reader subscriptions
## http://www.clausconrad.com/blog/backup-google-reader-subscription-list-to-opml-automatically
$HOME/Documents/bin/gr-opml-backup.py "$USERNAME" "$PASSWORD" > "$DIR/Vinh_GoogleReaderSubs_$INCREMENT.opml"

## Delete backups that are more than 14 days old
find "$DIR" -mtime +14 -exec rm -f {} \;
</pre>

Every time the script runs, it backs up my 3 calendars, my contacts, and my Google Reader subscriptions. The very last line deletes backup files that are more than 14 days old.
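The pruning rule is easy to sanity-check on its own. A throwaway sketch (the /tmp directory and file names are just for the demo):

```shell
# one fresh file and one backdated file in a scratch directory
mkdir -p /tmp/demo_backup
touch /tmp/demo_backup/new.ics
touch -t 202001010000 /tmp/demo_backup/old.ics   # pretend it's years old
# the same pruning command as in the backup script
find /tmp/demo_backup -mtime +14 -exec rm -f {} \;
ls /tmp/demo_backup   # only new.ics is left
```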

Now, to set up my crontab to run this every day at 10pm, I type “crontab -e” in the shell. When vi (or your default visual editor) opens up, I put

<pre class="src src-sh">00 22 * * * $HOME/Documents/bin/Backup_Google.sh</pre>
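For reference (and as a reminder to myself), the five cron fields are minute, hour, day of month, month, and day of week. A couple of hypothetical variations on the entry above:

```shell
# min  hour  dom  mon  dow  command
00     22    *    *    *    $HOME/Documents/bin/Backup_Google.sh   # every day at 10pm
30     03    *    *    0    $HOME/Documents/bin/Backup_Google.sh   # Sundays at 3:30am instead
```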

More info about cron on the Mac is here.

Sometimes, you might not want to schedule a recurring script. To run a script once, we can rely on the at command. More information here and here. However, the at command didn’t work on my Macbook when I tried it a while back.

UPDATE: The script above (Backup_Google.sh) has been updated to include the Google Reader subscription backup, thanks to this site.

Making my personal website and course websites: iWeb + rsync

So I’ve been using iWeb on my macbook to create my personal webpage and potential course websites. I use it because I don’t really know HTML, and I don’t think I NEED to learn it right now; point and click is fine with me for the time being. Actually, I would prefer to create the pages in Google Sites and export them to my professional-life host, i.e., the uci-ics domain. However, this option isn’t quite available from Google yet.

My main webpage is http://www.ics.uci.edu/~vqnguyen/, and from there I can host my personal homepage and course websites. However, when I use iWeb to publish multiple sites to the same destination via the SFTP option, things get funny because iWeb puts a default index.html file in each directory, and this file redirects you to a page. As I upload multiple sites to that one root domain, the redirection gets a little fuzzy. I worked around this by uploading the course websites first, then my personal site (root directory) last. Then, with every update, I just use “Publish Site Changes.” But what if I want to add some more pages? I didn’t like this, and I finally did something about it.

Got my information from UCI’s EEE help on iweb.

Now, what I do is this:

  1. ICS servers: websites are in ~/public_html/
  2. Created ~/public_html and ~/iWebSites on my macbook.
  3. Publish my sites to a local folder, ~/iWebSites, instead of using sftp, one directory for each site.
  4. After every update and publishing to my local folder, I run the following script (supposing I have two sites: one personal and one for a class website):

<pre class="src src-sh">#! /bin/bash

rsync --progress -av ~/iWebSites/Vinh_Q._Nguyen/ ~/public_html/
rsync --progress -av ~/iWebSites/stat8 ~/public_html/
rsync --progress -av -e ssh ~/public_html/ vqnguyen@chi2.ics.uci.edu:~/public_html/
</pre>

Now things work great! Good thing I have passwordless ssh!

Next thing to try is generating HTML with org-mode (emacs), which I found out about through Michael Zeller’s comment here (he makes his website with it).

Parallel computing in R: snowfall/snow

I finally found the time to try parallel computing in R using snowfall/snow, thanks to this article in the first issue of the R Journal (the replacement for R News). I hadn’t tried parallel computing before because I didn’t have a good toy example, and it seemed like a steep learning curve. Snow and snowfall are perfect for ‘embarrassingly parallel’ jobs, e.g., a simulation study, the bootstrap, or cross-validation. I do simulation studies a lot, e.g., assessing the properties of a statistical methodology, so implementing parallel computing will be very useful.

I got the toy example to work, but it ran in parallel on a single computer with multiple cores. Thanks to Michael Zeller, I also got it to work across multiple machines. If you use multiple nodes, make sure passwordless ssh is enabled.

Credit for getting snowfall to work on the BDUC servers (uci-nacs) goes to Harry Mangalam.

Here is a script, with a few examples:

<pre class="src src-r">## Example 1 - Multi-core on a single computer
sink('SnowFallExample.Rout', split=TRUE)
.Platform
.Machine
R.version
Sys.info()

library(snowfall)
# 1. Initialisation of snowfall.
# (if used with sfCluster, just call sfInit())
sfInit(parallel=TRUE, cpus=2)

# 2. Loading data.
require(mvna)
data(sir.adm)

# 3. Wrapper, which can be parallelised.
wrapper <- function(idx) {
  # Output progress in worker logfile
  cat("Current index: ", idx, "\n")
  index <- sample(1:nrow(sir.adm), replace=TRUE)
  temp <- sir.adm[index, ]
  fit <- crr(temp$time, temp$status, temp$pneu)
  return(fit$coef)
}

# 4. Exporting needed data and loading required
# packages on workers.
sfExport("sir.adm")
sfLibrary(cmprsk)

# 5. Start network random number generator
# (as "sample" is using random numbers).
sfClusterSetupRNG()

# 6. Distribute calculation
start <- Sys.time(); result <- sfLapply(1:1000, wrapper); Sys.time() - start

# Result is always in list form.
mean(unlist(result))

# 7. Stop snowfall
sfStop()


## Example 2 - Multiple nodes on a cluster (namely, the family-guy cluster at uci-ics)
sink('SnowFallExample.Rout', split=TRUE)
.Platform
.Machine
R.version
Sys.info()

library(snowfall)
# 1. Initialisation of snowfall.
# (if used with sfCluster, just call sfInit())
sfInit(socketHosts=rep(c('peter-griffin.ics.uci.edu', 'stewie-griffin.ics.uci.edu',
                         'neil-goldman.ics.uci.edu', 'mort-goldman.ics.uci.edu',
                         'lois-griffin.ics.uci.edu'), each=2),
       cpus=10, type='SOCK', parallel=TRUE)

## Steps 2-7 (data, wrapper, sfExport/sfLibrary, RNG setup, sfLapply, sfStop)
## are identical to Example 1.


## Example 3 - Multiple nodes on a cluster (namely, the BDUC servers of uci-ics)
## ssh to bduc, then ssh to one of their claws (the head node is 32-bit whereas
## the other ones are 64-bit). Put something like
##   export LD_LIBRARY_PATH=/home/vqnguyen/lib:/usr/local/lib:/usr/lib:/lib:/sge62/lib/lx24-x86
## in .bashrc, or
##   Sys.setenv(LD_LIBRARY_PATH="/home/vqnguyen/lib:/usr/local/lib:/usr/lib:/lib:/sge62/lib/lx24-x86")
## in an R session. Note: modify the path to your home directory. You might have
## to install required packages elsewhere, like ~/Rlib, and use .libPaths() to
## add the library path; put this in .Rprofile.
sink('SnowFallExample.Rout', split=TRUE)
.Platform
.Machine
R.version
Sys.info()

# 1. Initialisation of snowfall.
# (if used with sfCluster, just call sfInit())
library(snowfall)
sfInit(socketHosts=rep(c('claw1', 'claw2'), each=4), cpus=8, type='SOCK', parallel=TRUE)

## Steps 2-7 are identical to Example 1.
</pre>

This is a good general reference for snowfall. Next thing to try is getting rpvm (PVM) to work with snowfall!

emacs-style web browser: Conkeror

I remember back when I used Kubuntu, I loved Konqueror because I could hit Ctrl and all the links would be highlighted. I just needed to type the letters corresponding to a link and Konqueror would take me to it. I loved it because I didn’t have to use my mouse or touchpad. Recently, I discovered that Firefox has a similar feature, and it made me love Firefox even more.

Even more recently, I was introduced to Conkeror, an emacs-like web browser. Everything is keyboard-based! It looks cool, but I don’t know how compatible it will be with, say, a download manager. Everything else should work since it is based on the same web engine that runs Firefox! Installation instructions for Mac OS X can be found here.

passwordless ssh

I can’t believe I never set this up. I ssh to a few of my school’s servers all the time and always have to enter my password to authenticate. I just found out I could authenticate with a key file instead of entering my password. I found the information here and here. To set up, execute:

# for no passphrase, use
ssh-keygen -b 1024 -N ""
# safer to use with passphrase, and make it long (24+ characters)!
ssh-keygen -t rsa -b 4096
ssh-copy-id login@server
# you'll have to enter your password one last time to get it there.

On Mac OS X, I did not have ssh-copy-id, but from this link I found a shell script for it here.
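For reference, what ssh-copy-id does can be done by hand. A sketch (the key path is a scratch location for the demo, and login@server is a placeholder, so the remote step is left as a comment):

```shell
# generate a 4096-bit RSA key non-interactively into a scratch file
rm -f /tmp/demo_id_rsa /tmp/demo_id_rsa.pub
ssh-keygen -t rsa -b 4096 -N "" -f /tmp/demo_id_rsa -q
# manual equivalent of ssh-copy-id (needs a real server, so commented out):
# cat /tmp/demo_id_rsa.pub | ssh login@server \
#   'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'
```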

export google calendar to excel file or csv: gcal2excel

I can’t believe Google Calendar doesn’t let me export my calendar to a spreadsheet. It only allows export to .ics (iCal) or XML. THAT SUCKS.

After many hours of searching, this is what I came up with:

  1. gcal2excel (requires the JDK). I couldn’t get it to work at first because I kept entering “Vinh Nguyen” for the calendar ID (on a Mac, it didn’t display the calendar ID; it just showed calendar …). To get it to work, enter the calendar ID, which you can find under Settings in Google Calendar.
  2. Import your calendar into MS Outlook and export to CSV.
  3. This can also be achieved with Mozilla Sunbird. However, in one of my calendars I type using Unicode (for Vietnamese), and the export gave me weird results.

Firefox: find as you type

Back in the days when I used Kubuntu, I used Konqueror because if I hit the Ctrl key, all the links would light up with a yellow tag by their side with either a letter or a number (or some character). If I typed that character, Konqueror would go to that link’s page. I love this feature because I can surf the web efficiently with the keyboard (the mouse/trackpad is slow!).

Anywho, I always wanted Firefox to have this feature, and I just discovered that it has a similar one. Check this page for a description. To turn on the feature, go to Preferences > Advanced and check “Search for text when I start typing.” Firefox just got better.

The following is taken from the previous link to remind myself of some features.

  • Type several characters into the active browser window to navigate to any link with that text in it
  • If you repeat the same character, it will start to cycle through all the links that begin with that character. However, if it can find a match with the exact string you’ve typed, such as “oo” in “woods” it will go there first. Typing “o” more times will then cycle through the links that start with “o”.
  • Use the backspace key to undo the last character typed
  • Type a ‘ before your string to search only links. Type / before your string to search all text.
  • You can use the text search field to get to buttons, text boxes and other form controls. Just search for the text right before it, and then press Tab to move on from there.
  • To cancel a find, change focus or scroll, press Escape, or wait for the timeout
  • Press Accel+G or F3 to use “find next”. Press Accel+Shift+G or Shift+F3 to find previous, with the current string you’ve typed. This respects the current “linksonly” setting. Note: ‘accel’ means Ctrl on Windows, and Cmd on Mac. On Unix, ‘accel’ usually means Ctrl, but it can be set to Alt.
  • Works with any Gecko HTML content window – embedded clients, IM conversation window, help, etc.
  • Works with IME for input of Chinese, Japanese, Korean, etc.
  • When focused on a link, the following keys will work:
    • Enter – activate the link
    • Shift+Enter – save the page that the link points to
    • Ctrl+Enter (Cmd+Enter on Mac) – open the link in a new window
    • Insert – open the link in a new foreground or background tab, depending on the “Load links in the background” pref. under Preferences – Navigator – Tabbed Browsing.
    • Shift+Insert – same as Insert, but does the opposite of the foreground/background pref

Open remote file while in emacs ansi-term buffer/window: ansi-term + tramp

In emacs, I can edit files remotely using tramp. While ssh’d to a remote server in ansi-term, I can open remote files in emacs as if that remote location were my working directory. This is taken from here. Put the following in the remote server’s .bashrc file:

## Emacs: ansi-term + tramp integration
## in ansi-term, ssh to this remote computer, can do C-x C-f and find file in REMOTE working directory
## http://www.enigmacurry.com/2008/12/26/emacs-ansi-term-tricks/
#Emacs ansi-term directory tracking
# track directory, username, and cwd for remote logons
if [ "$TERM" = "eterm-color" ]; then
    function eterm-set-cwd {
        $@
        echo -e "\033AnSiTc" $(pwd)
    }

    # set hostname, user, and cwd
    function eterm-reset {
        echo -e "\033AnSiTu" $(whoami)
        echo -e "\033AnSiTc" $(pwd)
        echo -e "\033AnSiTh" $(hostname)
    }

    for temp in cd pushd popd; do
        alias $temp="eterm-set-cwd $temp"
    done

    # set hostname, user, and cwd now
    eterm-reset
fi

For SunOS servers, /usr/ucb is not in the default PATH, so whoami is not found; I need to put /usr/ucb in PATH in my .bashrc file. Credit belongs to this thread. Now, while ssh’d to a remote server in ansi-term, try C-x C-f and see that the working directory on the remote server is available by default.
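The PATH fix for those SunOS boxes is a one-liner in .bashrc (assuming, as above, that whoami lives in /usr/ucb):

```shell
# SunOS: /usr/ucb (home of whoami) is not in the default PATH
export PATH=/usr/ucb:$PATH
```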

edit files remotely: emacs + tramp

Suppose I want to edit a file on a remote server. I don’t want to download/ftp the file to my computer, edit it, and send it back. In emacs, I can edit it remotely using tramp via the ssh or rcp protocol. Put the following in the .emacs file after installing tramp:

<pre class="src src-emacs-lisp">;; tramp stuff
;; http://ubuntuforums.org/showthread.php?t=760038
(require 'tramp)
(setq tramp-default-method "ssh")
</pre>

Read a remote file with C-x C-f /user@your.host.com:/path/to/file. Note that we need the ‘/’ before the username. This is a good reference for tramp.