Making my personal website and course websites: iWeb + rsync

So I’ve been using iWeb on my MacBook to create my personal webpage and potential course websites. I use it because I don’t really know HTML, and I don’t think I need to learn it right now; point-and-click is fine with me for the time being. Actually, I would prefer to create the pages in Google Sites and export them to my professional host, i.e., the uci-ics domain, but Google doesn’t quite offer that option yet.

My main webpage is http://www.ics.uci.edu/~vqnguyen/, and from there I can host my personal homepage and course websites. However, when I use iWeb to publish multiple sites to the same destination via the SFTP option, things get funny: iWeb puts a default index.html file in each directory, and this file redirects you to a page. As I upload multiple sites to that one root domain, the redirection gets a little fuzzy. I worked around this by uploading the course websites first and my personal site (root directory) last; then, with every update, I just use “Publish Site Changes.” But what if I want to add more pages? I didn’t like this, and I finally did something about it.

I got my information from UCI’s EEE help page on iWeb.

Now, what I do is this:

  1. On the ICS servers, websites live in ~/public_html/.
  2. I created ~/public_html and ~/iWebSites on my MacBook.
  3. I publish my sites to the local folder ~/iWebSites instead of using SFTP, one directory per site.
  4. After every update and publish to the local folder, I run the following script (supposing I have two sites, one personal and one for a class website):
   <pre class="src src-sh"><span style="color: #ff4500;">#</span><span style="color: #ff4500;">! /bin/</span><span style="color: #00ffff;">bash</span>

rsync -progress -av ~/iWebSites/Vinh_Q._Nguyen/ ~/public_html/ rsync -progress -av ~/iWebsites/stat8 ~/public_html/ rsync -progress -av -e ssh ~/public_html/ vqnguyen@chi2.ics.uci.edu:~/public_html/
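
For example, assuming the script above is saved as ~/bin/publish-sites.sh (the name and location here are just my example), running it after each publish is simply:

# make the script executable once, then run it after every "publish to local folder"
chmod +x ~/bin/publish-sites.sh
~/bin/publish-sites.sh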

Now things work great! Good thing I have passwordless ssh!

The next thing to try is generating HTML with org-mode (emacs), which I found out about through Michael Zeller’s comment here (he makes his website with it).

Parallel computing in R: snowfall/snow

I finally found the time to try parallel computing in R using snowfall/snow, thanks to this article in the first issue of The R Journal (the replacement for R News). I hadn’t tried parallel computing before because I didn’t have a good toy example and it seemed like a steep learning curve. snow and snowfall are perfect for ‘embarrassingly parallel’ jobs, e.g., a simulation study, the bootstrap, or cross-validation. I run simulation studies a lot, e.g., to assess the properties of a statistical method, so parallel computing will be very useful.

I got the toy example to work, but only in parallel on a single computer with multiple cores. Thanks to Michael Zeller, I got it to work across multiple machines as well. If you use multiple nodes, make sure passwordless ssh is enabled.
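
A quick shell check before calling sfInit (just a sketch; the hostnames are the ones used in Example 2 below):

# BatchMode=yes makes ssh fail instead of prompting, so any node that still
# asks for a password gets reported
for host in peter-griffin.ics.uci.edu stewie-griffin.ics.uci.edu lois-griffin.ics.uci.edu; do
  ssh -o BatchMode=yes "$host" hostname || echo "passwordless ssh not set up for $host"
done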

Credit for getting snowfall to work on the BDUC servers (uci-nacs) goes to Harry Mangalam.

Here is a script, with a few examples:

 <pre class="src src-sh"><span style="color: #ff4500;">## </span><span style="color: #ff4500;">Example 1 - Multi-core on a single computer</span>

sink(‘SnowFallExample.Rout’, split=TRUE) .Platform .Machine R.version Sys.info()

library(snowfall) # 1. Initialisation of snowfall. # (if used with sfCluster, just call sfInit()) sfInit(parallel=TRUE, cpus=2)

# 2. Loading data. require(mvna) data(sir.adm) # 3. Wrapper, which can be parallelised. wrapper <- function(idx) { # Output progress in worker logfile cat( “Current index: “, idx, “\n” ) index <- sample(1:nrow(sir.adm), replace=TRUE) temp <- sir.adm[index, ] fit <- crr(temp$time, temp$status, temp$pneu) return(fit$coef) } # 4. Exporting needed data and loading required # packages on workers. sfExport(“sir.adm”) sfLibrary(cmprsk)

# 5. Start network random number generator # (as “sample” is using random numbers). sfClusterSetupRNG() # 6. Distribute calculation

start <- Sys.time(); result <- sfLapply(1:1000, wrapper) ; Sys.time()-start # Result is always in list form. mean(unlist(result)) # 7. Stop snowfall sfStop()

## Example 2 - Multiple nodes on a cluster (namely, the family-guy cluster at uci-ics)
sink('SnowFallExample.Rout', split=TRUE)
.Platform
.Machine
R.version
Sys.info()

library(snowfall)
# 1. Initialisation of snowfall.
#    (if used with sfCluster, just call sfInit())
sfInit(socketHosts=rep(c('peter-griffin.ics.uci.edu', 'stewie-griffin.ics.uci.edu',
                         'neil-goldman.ics.uci.edu', 'mort-goldman.ics.uci.edu',
                         'lois-griffin.ics.uci.edu'), each=2),
       cpus=10, type='SOCK', parallel=TRUE)

# 2. Loading data.
require(mvna)
data(sir.adm)

# 3. Wrapper, which can be parallelised.
wrapper <- function(idx) {
  # Output progress in worker logfile
  cat("Current index: ", idx, "\n")
  index <- sample(1:nrow(sir.adm), replace=TRUE)
  temp <- sir.adm[index, ]
  fit <- crr(temp$time, temp$status, temp$pneu)
  return(fit$coef)
}

# 4. Exporting needed data and loading required
#    packages on workers.
sfExport("sir.adm")
sfLibrary(cmprsk)

# 5. Start network random number generator
#    (as "sample" is using random numbers).
sfClusterSetupRNG()

# 6. Distribute calculation
start <- Sys.time()
result <- sfLapply(1:1000, wrapper)
Sys.time() - start
# Result is always in list form.
mean(unlist(result))

# 7. Stop snowfall
sfStop()

## Example 3 - Multiple nodes on a cluster (namely, the BDUC servers of uci-ics)
## ssh to bduc, then ssh to one of the claws (the head node is 32-bit whereas the other ones are 64-bit)
## put something like
##   export LD_LIBRARY_PATH=/home/vqnguyen/lib:/usr/local/lib:/usr/lib:/lib:/sge62/lib/lx24-x86
## in .bashrc, or
##   Sys.setenv(LD_LIBRARY_PATH="/home/vqnguyen/lib:/usr/local/lib:/usr/lib:/lib:/sge62/lib/lx24-x86")
## in an R session. Note: modify the path to your home directory.
## You might have to install required packages elsewhere, like ~/Rlib, and use .libPaths()
## to add the library path; put this in .Rprofile.
sink('SnowFallExample.Rout', split=TRUE)
.Platform
.Machine
R.version
Sys.info()

# 1. Initialisation of snowfall.
#    (if used with sfCluster, just call sfInit())
library(snowfall)
sfInit(socketHosts=rep(c('claw1', 'claw2'), each=4), cpus=8, type='SOCK', parallel=TRUE)

# 2. Loading data.
require(mvna)
data(sir.adm)

# 3. Wrapper, which can be parallelised.
wrapper <- function(idx) {
  # Output progress in worker logfile
  cat("Current index: ", idx, "\n")
  index <- sample(1:nrow(sir.adm), replace=TRUE)
  temp <- sir.adm[index, ]
  fit <- crr(temp$time, temp$status, temp$pneu)
  return(fit$coef)
}

# 4. Exporting needed data and loading required
#    packages on workers.
sfExport("sir.adm")
sfLibrary(cmprsk)

# 5. Start network random number generator
#    (as "sample" is using random numbers).
sfClusterSetupRNG()

# 6. Distribute calculation
start <- Sys.time()
result <- sfLapply(1:1000, wrapper)
Sys.time() - start
# Result is always in list form.
mean(unlist(result))

# 7. Stop snowfall
sfStop()

This is a good general reference for snowfall. The next thing to try is getting rpvm (PVM) to work with snowfall!

emacs-style web browser: Conkeror

I remember back when I used Kubuntu, I loved Konqueror because I could hit Ctrl and all the links would be highlighted; I just had to type the letters corresponding to a link and Konqueror would take me there. I loved it because I didn’t have to use my mouse or touchpad. Recently, I discovered that Firefox has a similar feature, and it made me love Firefox even more.

Even more recently, I was introduced to Conkeror, an emacs-like web browser. Everything is keyboard-based! It looks cool, but I don’t know how compatible it will be with, say, a download manager. Everything else should work since it is based on the same web engine that runs Firefox! Installation instructions for Mac OS X can be found here.

passwordless ssh

I can’t believe I never set this up. I ssh to a few of my school servers all the time, and I always have to enter my password to authenticate. I just found out I could authenticate with a key file instead of typing my password. I found the information here and here. To set it up, execute:

# for no passphrase, use
ssh-keygen -b 1024 -N ""
# safer to use with passphrase, and make it long (24+ characters)!
ssh-keygen -t rsa -b 4096
ssh-copy-id login@server
# you'll have to enter your password one last time to get it there.

My Mac OS X install did not come with ssh-copy-id, but from this link I found a shell script for it here.
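
If ssh-copy-id isn’t available, the same thing can be done by hand; the sketch below is roughly what it does, assuming the default public key at ~/.ssh/id_rsa.pub:

# append the local public key to the remote authorized_keys file
# (and make sure the permissions are what sshd expects)
cat ~/.ssh/id_rsa.pub | ssh login@server \
  "mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys"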

export google calendar to excel file or csv: gcal2excel

I can’t believe Google Calendar doesn’t let me export my calendar to a spreadsheet. It only allows export to .ics (iCal) or XML. THAT SUCKS.

After many hours of search, this is what I came up with:

  1. gcal2excel (requires the JDK). I couldn’t get it to work at first because I kept entering “Vinh Nguyen” for the calendar id (on a Mac, it didn’t display the calendar id; it just showed “calendar …”). To get it to work, enter the calendar id, which you can find under Settings in Google Calendar.
  2. Import your calendar into MS Outlook and export to csv.
  3. This can also be achieved with Mozilla Sunbird. However, one of my calendars is typed in Unicode (for Vietnamese), and the export gave me weird characters.

Firefox: find as you type

Back in the day when I used Kubuntu, I used Konqueror because when I hit the Ctrl key, all the links would light up with a yellow tag beside them containing a letter or a number (or some other character). If I typed that character, Konqueror would go to that link’s page. I loved this feature because I could surf the web efficiently with the keyboard (the mouse / trackpad is slow!).

Anywho, I always wanted Firefox to have this feature, and I just discovered that it has a similar one. Check this page for a description. To turn it on, go to Preferences > Advanced and check “Search for text when I start typing”. Firefox just got better.

The following is taken from the previous link to remind myself of some features.

  • Type several characters into the active browser window to navigate to any link with that text in it
  • If you repeat the same character, it will start to cycle through all the links that begin with that character. However, if it can find a match with the exact string you’ve typed, such as “oo” in “woods” it will go there first. Typing “o” more times will then cycle through the links that start with “o”.
  • Use the backspace key to undo the last character typed
  • Type a ‘ before your string to search only links. Type / before your string to search all text.
  • You can use the text search field to get to buttons, text boxes and other form controls. Just search for the text right before it, and then press Tab to move from there.
  • To cancel a find, change focus or scroll, press Escape, or wait for the timeout
  • Press Accel+G or F3 to use “find next”. Press Accel+Shift+G or Shift+F3 to find previous, with the current string you’ve typed. This respects the current “linksonly” setting. Note: ‘accel’ means Ctrl on Windows, and Cmd on Mac. On Unix, ‘accel’ usually means Ctrl, but it can be set to Alt.
  • Works with any Gecko HTML content window – embedded clients, IM conversation window, help, etc.
  • works with IME for input of Chinese, Japanese, Korean, etc.
  • When focused on a link, the following keys will work:
    • Enter – activate the link
    • Shift+Enter – save the page that the link points to
    • Ctrl+Enter (Cmd+Enter on Mac) – open the link in a new window
    • Insert – open the link in a new foreground or background tab, depending on the “Load links in the background” pref. under Preferences – Navigator – Tabbed Browsing.
    • Shift+Insert – same as Insert, but does the opposite of the foreground/background pref

Open remote file while in emacs ansi-term buffer/window: ansi-term + tramp

In emacs, I can edit files remotely using tramp. While ssh’d to a remote server in ansi-term at a specific location, I can open remote files in emacs as if that remote location were my working directory. This is taken from here. Put the following in the remote server’s .bashrc file:

## Emacs: ansi-term + tramp integration
## in ansi-term, ssh to this remote computer; then C-x C-f finds files in the REMOTE working directory
## http://www.enigmacurry.com/2008/12/26/emacs-ansi-term-tricks/
# Emacs ansi-term directory tracking:
# track directory, username, and cwd for remote logons
if [ "$TERM" = "eterm-color" ]; then
  # run the given command, then report the new working directory to emacs
  function eterm-set-cwd {
    $@
    echo -e "\033AnSiTc" $(pwd)
  }

  # set hostname, user, and cwd
  function eterm-reset {
    echo -e "\033AnSiTu" $(whoami)
    echo -e "\033AnSiTc" $(pwd)
    echo -e "\033AnSiTh" $(hostname)
  }

  # wrap the directory-changing commands so emacs stays in sync
  for temp in cd pushd popd; do
    alias $temp="eterm-set-cwd $temp"
  done

  # set hostname, user, and cwd now
  eterm-reset
fi

On SunOS servers, /usr/ucb is not in the default PATH, so whoami is not found; I need to add /usr/ucb to PATH in my .bashrc file. Credit belongs to this thread. Now, while ssh’d to a remote server in ansi-term, try C-x C-f and see the remote working directory offered by default.
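
Concretely, something like this in the remote .bashrc works (a sketch, assuming whoami lives in /usr/ucb as on a stock SunOS install):

# SunOS keeps whoami in /usr/ucb, which isn't on the default PATH
if [ "$(uname)" = "SunOS" ]; then
  PATH=/usr/ucb:$PATH
  export PATH
fi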

edit files remotely: emacs + tramp

Suppose I want to edit a file remotely. I don’t want to download/ftp the file to my computer, edit it, and send it back to the remote server. In emacs, I can edit it remotely using tramp via the ssh or rcp protocol. Put the following in the .emacs file after installing tramp.

<pre class="src src-sh">;; tramp stuff

;; http://ubuntuforums.org/showthread.php?t=760038 (require ‘tramp) (setq tramp-default-method “ssh”)

Read a remote file with C-x C-f /user@your.host.com:/path/to/file. Note that the leading ‘/’ before the username is needed. This is a good reference for tramp.

Run screen in emacs with ansi-term (combine this with emacs + ess + remote R)

This is actually an update to this post, but since I discovered a few more things, I’ll write a new post. To run screen within a shell buffer in emacs, I tried M-x shell and fired up screen (ditto with M-x term). It gave me this error: Clear screen capability required. I found the solution here: do M-x ansi-term (using /bin/bash when asked, of course), and screen now works inside emacs. Combine this with running a remote R session in emacs and there you have it, the perfect workflow for developing and running computationally intensive R code! I can use screen so my R sessions aren’t interrupted, and I can use ESS to send code to an R session/buffer. I have to say, this WILL be the way I use R for any computationally intensive project!
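
In shell terms, the workflow looks roughly like this (a sketch; the hostname is the one from the rsync post above, and the session name is made up):

# inside emacs: M-x ansi-term, choose /bin/bash when asked, then:
ssh vqnguyen@chi2.ics.uci.edu   # log in to the remote server
screen -S rwork                 # start a named screen session; reattach later with: screen -r rwork
R                               # start R inside screen so the session survives disconnects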

UPDATE

So screen doesn’t work in emacs after I ssh to a remote server inside ansi-term; I get the error: Cannot find terminfo entry for 'eterm-color'. To fix this, I put the following in my remote server’s .bashrc file (info from here):

if [ "$TERM" = "eterm-color" ] ; then
 TERM="vt100"
fi

UPDATE AGAIN (better solution)

This page (Remote Term Type section) shows how to fix the eterm-color issue. Do make sure you create the .terminfo folder if it’s not there:

$ ssh username@remoteserver mkdir -p ~/.terminfo/e   # create the folder first
$ scp /usr/share/emacs/22.1/etc/e/eterm-color username@remoteserver:~/.terminfo/e/eterm-color
$ scp /usr/share/emacs/22.1/etc/e/eterm-color.ti username@remoteserver:~/.terminfo/e/eterm-color.ti

UPDATE AGAIN 2

So copying the files into .terminfo didn’t fix everything. I guess SunOS servers don’t look in my home directory for those files. We could probably copy things into /usr/share/lib/terminfo/?/* (more information in man terminfo), but I don’t have access to that location on some of my servers, so I’ll fall back on the old trick (changing TERM). This time, change it to xterm (this doesn’t give me funny characters in emacs ansi-term); I found this at the bottom of this page. Put the following in the remote .bashrc file:

TEMP=`uname`
if [ "$TEMP" = "SunOS" ]; then
  if [ "$TERM" = "eterm-color" ]; then
    TERM="xterm"
  fi
fi

I hope there aren’t any more issues. What the previous trick does is check whether the system is SunOS and, if so, use xterm. I got the shell scripting information from here and the uname command info from here.

FINAL UPDATE

To get eterm-color to work on SunOS, I put the following in my .bashrc file:

##following to get eterm-color working in SunOS
TERMINFO=${HOME}/.terminfo
export TERMINFO

I guess I had been doing this, but I never exported TERMINFO; I didn’t know that was needed. Make sure the eterm-color files are copied over (see above). Now everything should work, hopefully flawlessly. To summarize: copy the eterm-color files into ~/.terminfo, and put the TERMINFO lines in ~/.bashrc.

Now screen works in emacs. An issue that arose from this method is that when screen runs inside emacs, I can’t execute ess-remote anymore because I can’t press M-x. In ansi-term, C-c C-j makes the buffer behave like emacs (the cursor can go anywhere), and C-c C-k makes it behave like a terminal (the default); this is documented here and here. Pressing C-c C-j lets me press M-x again. However, ess-remote still doesn’t work.

I guess when I use screen, I’m forced to copy and paste code. If I really must use screen with ESS, I do the regular M-x shell. After logging into the remote server, I execute TERM='vt100' in the shell. Now run screen -> R -> ess-remote. I can send code with keypresses now, but screen steals some of my emacs key bindings. To fix this, put the following,

escape ^Oo

in my remote ~/.screenrc file to switch the default command key from C-a to C-o so it doesn’t conflict with my emacs key bindings (documented here).

More information on ansi-term (like remapping C-x to C-c) can be found here.

This was a long post. Summary:

  1. ansi-term in emacs behaves VERY much like a real terminal; I can run vi, emacs, etc., inside of it. ‘term’ and ‘shell’ behave more like regular emacs buffers.
  2. I can change things by setting the TERM environment variable.
  3. Change the screen command key in the remote .screenrc file.

NEED TO DO: get ess-remote to work with ansi-term and screen in emacs!

UPDATE2: It seems the best way to do things so far is: ansi-term -> ssh to the remote server -> screen -> R, then switch to line mode (C-c C-j) and copy and paste code from there. To get screen commands to work (like detach, etc.), switch back to char mode (C-c C-k). Remember, I now use C-o instead of C-a (defined in .screenrc); this only works in a regular terminal or in emacs with ansi-term, not with ‘shell’ in emacs using the hack I mentioned above.