Using Google Adwords to max out Dropbox referrals

I stumbled upon Vladik Rikhter’s post via Hacker News on how to use Google Adwords to max out your Dropbox referrals for less than $10. Basically, you set up an ad campaign that leads people to sign up for a Dropbox account via your referral link, giving them 2.25 GB (250 MB extra for using a referral link) of free storage while you gain 250 MB per referral (or 500 MB if your account is linked to your school email). My initial attempt did not yield any clicks, probably because many people were following this method after the publicity generated by Lifehacker’s post (see this too). Two weeks later I still didn’t have any referrals. After about 3 months of waiting, I raised my max bid to $0.10/click and got 5 referrals in a 24-hour period. I then raised my max bid to $0.20/click and completed my quota in a day (I’m at 18.3 GB right now with my student email linked to my account). My total cost was $5.34 since I had many referrals prior to the campaign (I was at 9.3 GB). FYI, the default auto bid was around $0.14/click, so that amount should give you enough referrals in a 24-hour period.

Note to self: use Google Adwords when I need people to click on a link (e.g., for referrals).

Real-time file synchronization like Dropbox via Unison

Dropbox is a very nice tool for real-time synchronization. It works very well to keep files from multiple devices (computers, phones, etc.) in sync. I use it mainly as a cloud-based backup for some of my files. However, it’s been in the headlines recently due to security and privacy concerns, leading to calls for encrypting your files prior to syncing with Dropbox.

I’ve always contemplated running my own Dropbox-like service to have yet another safe backup of my files. Besides knowing exactly where my data are stored, I would have (in theory) an unlimited amount of space. This post and this post outline solutions based on open source tools such as OpenSSH (for encrypted file transfer), lsyncd (for monitoring files), and Unison (an rsync-like tool). I’ve attempted this setup but failed to get things working with lsyncd (see the extensive discussion with the author in the comments).

I stumbled upon this post that outlines a solution based on the bleeding-edge version of Unison, which includes the -repeat watch option for monitoring files for changes. However, the author’s solution was for Mac OS X. I played around with the new Unison and arrived at a solution I am pretty satisfied with for my Ubuntu machines (easily extended to Mac and Windows, I’m sure). I will outline my setup in this post. Note that I have password-less ssh set up so that I can ssh into my server without typing in the password. Also, I am using Unison version 2.44.2, which I downloaded via svn around 7/16/2011.
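If you don’t have password-less ssh set up yet, something along these lines should do it (username and server.ip are placeholders for your own credentials):

ssh-keygen -t rsa               ## accept the defaults
ssh-copy-id username@server.ip  ## appends your public key to the server's authorized_keys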

Installing Unison

The same version of Unison must be installed on both the client and the server. Both my client and server run Ubuntu (11.04 on the client, 10.04 on the server). On the client, the folder I would like to sync is /home/vinh/Documents; the server’s destination is /home/vinh/Backup/Documents.

sudo apt-get install ocaml python-pyinotify
## install the .deb file from http://packages.ubuntu.com/search?keywords=python-pyinotify via `dpkg -i` if python-pyinotify is not in your repository
svn checkout https://webdav.seas.upenn.edu/svn/unison
cd unison/trunk ## the checkout creates a top-level unison/ directory
make NATIVE=true UISTYLE=text
## `make install` installs into $HOME/bin/
sudo cp src/unison /usr/local/bin/
sudo cp src/fsmonitor.py /usr/local/bin/
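
To double-check that the two builds match, run unison with the -version flag on both machines and compare:

unison -version  ## should print the same version on client and server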

Everything that follows is done on the client computer.

Scripts

unisonNetworkOnPortForward:

#! /bin/bash

## http://ubuntuforums.org/showpost.php?p=6679437&postcount=4
## can't have extension in filename http://www.duncanelliot.com/blog/?p=28

# ssh username@server.ip -f -N -L 9922:server.ip:22 ## minimal
sudo -u local.username ssh username@server.ip -Y -C -f -N -L 9922:server.ip:22

## multiple instances can run in case of disconnect and reconnect

This script forwards my local port 9922 to the server’s port 22 via ssh. That way, I can ssh username@localhost -p 9922 if I want to connect to the server. I do this so that file synchronization can resume after a disconnect and reconnect (changed files do not get synced after a reconnect if I connect to the remote server directly).

Run sudo cp unisonNetworkOnPortForward /etc/network/if-up.d/ on Debian or Ubuntu. By doing this, the script will be executed whenever the computer connects to a network (this will be different for non-Debian-based distros). Note that multiple instances of this port forwarding will be present if the network is disconnected and reconnected multiple times. This makes things a little ugly, but I haven’t really noticed any problems. Also note that the script name cannot have a file extension or things will not work. If the duplicate tunnels bother you, a guard like the sketch below might help.
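Here is an untested sketch of such a guard (it assumes the nc binary is available); placed at the top of unisonNetworkOnPortForward, it launches a new tunnel only when nothing is listening on port 9922 yet:

## skip launching a new tunnel if one is already listening on port 9922
if nc -z localhost 9922; then
    exit 0
fi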

unisonMonitor.sh:

#! /bin/bash

## in /etc/rc.local, add:
## sudo -u local.username /path/to/unisonMonitor.sh &

unison default ~/Documents ssh://username@localhost:9922//home/vinh/Backup/Documents -repeat watch -times -logfile /tmp/unison.log
# -times: sync timestamps
# -repeat watch: real-time synchronization via pyinotify

Add to /etc/rc.local before the last line:

sudo -u local.username /path/to/unisonMonitor.sh &

This turns on unison sync at startup (unison will keep trying to connect to the server if it is disconnected). Again, this implementation is different for non-Debian-based distros. The resulting rc.local should look roughly like the sketch below.
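For reference, the end of /etc/rc.local should end up looking roughly like this (the final exit 0 must stay last):

## ... other startup commands ...
sudo -u local.username /path/to/unisonMonitor.sh &
exit 0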

unisonSync.sh:

#! /bin/bash

unison -batch -times ~/Documents ssh://username@localhost:9922//home/vinh/Backup/Documents -logfile /tmp/unison.log

Run unisonSync.sh when you want to manually sync the two folders. I add the following line to cron (crontab -e) to have a manual sync every day at 12:30pm:

30 12 * * * /path/to/unisonSync.sh

I set up this cron job because unisonMonitor.sh will only sync files that have changed while the unison process is running. This daily backup makes sure all my files are in sync at least once a day.

unisonKill.sh:

#! /bin/bash

## pgrep -x matches the process name exactly, so this script does not
## kill itself the way a plain `ps aux | grep unison` pipeline can
pgrep -x unison | xargs -r kill -9
pgrep -f fsmonitor.py | xargs -r kill -9

I run this script on the client or server when I want to clean up unison processes. One current drawback of unison’s monitor feature is that the unison -server and fsmonitor.py processes on the server are not killed when the unison process stops on the client side. After multiple connects, this will leave a lot of unison processes running on the server. Although I haven’t seen any issues with this, the unisonKill.sh script should make cleaning up the processes easier.

Start the service

Once these scripts are in their correct locations, first run unisonSync.sh to perform the initial sync. Then restart the computer. You should see a unison and an fsmonitor.py process when executing ps aux | grep unison on the client and server. Also, you should see an ssh process corresponding to the port forwarding when executing ps aux | grep ssh. Run touch foo.txt in the directory that you are watching and see if it appears on the server. Remove it and see if it gets deleted. Good luck!
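
To put those checks together in one place (the bracketed grep patterns are just a trick to keep grep from matching itself):

## on the client: confirm the tunnel and the sync processes are up
ps aux | grep '[s]sh'     ## look for the -L 9922:server.ip:22 tunnel
ps aux | grep '[u]nison'  ## unison and fsmonitor.py should both show up
## in the watched directory: the file should appear on, then disappear
## from, the server within a few seconds
touch ~/Documents/foo.txt
rm ~/Documents/foo.txt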

What are some drawbacks of this setup compared to Dropbox? Well, I can’t revert to files from a previous date, and I don’t have a dedicated Android app to access the files with. To solve the former, you can set up another cron job that syncs to a different location on your server every few days, giving you access to files that are a few days old; see the sketch below. To solve the latter, I’m sure there are Android apps that let you access files via the sftp protocol.
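
Here is a rough, untested take on such a snapshot job (unisonSnapshot.sh and the Documents.snapshot naming are made up for illustration):

#! /bin/bash
## unisonSnapshot.sh: copy the server-side folder to a dated directory
## so that a few-days-old copy of every file stays available
ssh -p 9922 username@localhost 'cp -a ~/Backup/Documents ~/Backup/Documents.snapshot.$(date +%F)'

Schedule it every few days via crontab -e, e.g., at 1am on every third day:

0 1 */3 * * /path/to/unisonSnapshot.sh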

Multiple Dropbox instances on UNIX systems (Linux, Mac OS X)

I have one main Dropbox account that I use and try to grow via referrals. I use it mainly for files I feel are important (backup!) and need accessible everywhere. Sometimes I need to collaborate with others or have additional files accessible everywhere (for a short period of time). I don’t want these files filling up my main account, so I have side accounts (2 GB each). To get multiple instances on Mac OS X, I followed these instructions. For other Unix-like machines, follow these instructions (which include a link for Windows too). NOTE: Don’t follow the Linux instructions in the first link.

UPDATE: For Linux, use the dropbox command from $HOME/.dropbox-dist/ (see this post). Thus, for my second Dropbox account, I do

mkdir $HOME/.dropbox-alt

## content of dropbox2.sh:
#!/bin/bash
HOME=/home/username/.dropbox-alt /home/username/.dropbox-dist/dropbox start -i
## end of dropbox2.sh

chmod 775 dropbox2.sh

Add /path/to/dropbox2.sh in System > Preferences > Startup Applications.

Online storage: ADrive

So I mentioned before using Dropbox to sync my files, and I use Google Sites to host my website, etc. However, both have limits on files uploaded and stored. What if I have large media files? I can upload videos to YouTube, but they convert the video to Flash, and these days they remove the audio if you have any copyrighted sound like music. Crap.

I ran across MS SkyDrive, which gives 25 GB. I was surprised at this: an MS product that Google doesn’t have an answer to? I decided to google online storage, and I ran into this site and this site. After some research, ADrive seems to be the best out there, with 50 GB of free storage and a 2 GB per-file limit. That’s plenty for me and my youth group. Now I will use ADrive to store large files… until Google’s G Drive comes out.

UPDATE: Based on this post, it seems like the public link of each file gets changed every 14 days if you click on “share this file.” This means I can’t put the link on my Google site. Ehh, I guess I can only use it to store important files, not necessarily share them.

No attachments on Blogger yet

So Google didn’t add a feature where we can attach files to blog posts (other than image files). I wanted to do it with Google Docs, but couldn’t because I’d have to share files with people by inviting them. Since Google’s G Drive isn’t out yet, I’m going to put attachments in my public folder in Dropbox. This is actually a good, long-term solution.

NOTE to self: always put “Here is the attached file.” and link the file this way. This way I can easily search for posts that have attachments.